CN106125926B - An information processing method and electronic device - Google Patents
- Publication number
- CN106125926B CN106125926B CN201610460487.0A CN201610460487A CN106125926B CN 106125926 B CN106125926 B CN 106125926B CN 201610460487 A CN201610460487 A CN 201610460487A CN 106125926 B CN106125926 B CN 106125926B
- Authority
- CN
- China
- Prior art keywords
- moving objects
- motion information
- motion
- eye
- degree
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Abstract
The invention discloses an information processing method and an electronic device, comprising: obtaining eye motion information of a user's eyes; obtaining M pieces of object motion information for M moving objects on a display unit, where M is an integer greater than or equal to 1; and determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information. The technical solution provided by the invention solves the prior-art technical problem that eye tracking has low precision.
Description
Technical field
The present invention relates to the field of electronic technology, and in particular to an information processing method and an electronic device.
Background art
With the development of computer technology and growing consumer demand, more and more electronic products have come to market, and the techniques for controlling them, such as control by mouse or trackpad, have multiplied as well. To make electronic products more convenient to use, eye-movement control has appeared in the prior art.
Currently, eye-movement control in the prior art requires the user to first calibrate against the display screen; without calibration, tracking accuracy is low.
It can be seen that eye tracking in the prior art suffers from the technical problem of low accuracy.
Summary of the invention
Embodiments of the present invention provide an information processing method and an electronic device, which solve the prior-art technical problem that eye tracking has low accuracy, thereby achieving the technical effect of improving the accuracy of eye tracking.
In one aspect, an embodiment of the present application provides an information processing method, comprising the following steps:
Obtaining eye motion information of a user's eyes;
Obtaining M pieces of object motion information for M moving objects on a display unit, where M is an integer greater than or equal to 1;
Determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
Optionally, obtaining the eye motion information of the user's eyes comprises:
Obtaining a first motion direction of the user's eyes;
Correspondingly, obtaining the M pieces of object motion information for the M moving objects on the display unit comprises:
Obtaining M motion directions of the M moving objects on the display unit.
Optionally, determining the first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information comprises:
Calculating the angle between the first motion direction and each of the M motion directions, obtaining M angles in total;
Determining, from the M angles, a first angle that is smaller than a preset angle;
Taking the moving object corresponding to the first angle as the first moving object.
Optionally, obtaining the eye motion information of the user's eyes comprises:
Obtaining a first motion trajectory of the user's eyes;
Correspondingly, obtaining the M pieces of object motion information for the M moving objects on the display unit comprises:
Obtaining M motion trajectories of the M moving objects on the display unit.
Optionally, determining the first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information comprises:
Calculating the degree of association between the first motion trajectory and each of the M motion trajectories, obtaining M degrees of association in total;
Determining, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
Taking the moving object corresponding to the first degree of association as the first moving object.
Optionally, when the first moving object is determined from the M moving objects, the method further comprises:
Moving a cursor to a position corresponding to the first moving object.
Optionally, after the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information, the method further comprises:
Obtaining second eye motion information of the user's eyes and/or head motion information of the user's head;
Performing a corresponding operation on the first moving object according to the second eye motion information and/or the head motion information.
In another aspect, an embodiment of the present application further provides an electronic device, comprising:
A housing;
A display unit;
An eye-movement monitoring component, arranged in the housing and connected to the display unit, for obtaining eye motion information of a user's eyes;
A processing unit, arranged in the housing and connected to the eye-movement monitoring component, for obtaining M pieces of object motion information for M moving objects on the display unit, M being an integer greater than or equal to 1, and determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
Optionally, the eye-movement monitoring component is configured to:
Obtain a first motion direction of the user's eyes;
Correspondingly, the processing unit is configured to:
Obtain M motion directions of the M moving objects on the display unit.
Optionally, the processing unit is configured to:
Calculate the angle between the first motion direction and each of the M motion directions, obtaining M angles in total;
Determine, from the M angles, a first angle that is smaller than a preset angle;
Take the moving object corresponding to the first angle as the first moving object.
Optionally, the eye-movement monitoring component is configured to:
Obtain a first motion trajectory of the user's eyes;
Correspondingly, the processing unit is configured to:
Obtain M motion trajectories of the M moving objects on the display unit.
Optionally, the processing unit is configured to:
Calculate the degree of association between the first motion trajectory and each of the M motion trajectories, obtaining M degrees of association in total;
Determine, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
Take the moving object corresponding to the first degree of association as the first moving object.
Optionally, when the first moving object is determined from the M moving objects, the processing unit is further configured to:
Move a cursor to a position corresponding to the first moving object.
Optionally, after the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information, the electronic device further comprises:
A sensing device, configured to:
Obtain second eye motion information of the user's eyes and/or head motion information of the user's head;
The processing unit is further configured to:
Perform a corresponding operation on the first moving object according to the second eye motion information and/or the head motion information.
Through one or more of the above embodiments of the present invention, at least the following technical effects can be achieved:
First, the technical solution in the embodiments of the present application obtains eye motion information of a user's eyes; obtains M pieces of object motion information for M moving objects on a display unit, M being an integer greater than or equal to 1; and determines a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information. That is, unlike prior-art eye-movement control, the display screen does not first have to be calibrated; in the prior art, skipping calibration degrades tracking accuracy, whereas in this technical solution the first moving object is determined from the motion information of the eyes and the M pieces of object motion information of the M moving objects on the display unit, so that a corresponding operation can be performed on that moving object and tracking accuracy is ensured even without calibration. The prior-art technical problem that eye tracking has low accuracy is thereby effectively solved, achieving the technical effect of improving eye-tracking accuracy.
Second, in the technical solution of the embodiments of the present application, when the first moving object is determined from the M moving objects, a cursor is moved to the position corresponding to the first moving object. This technical solution gives the user the experience of the cursor following the eye movement, further achieving the technical effect of improving user experience.
Third, in the technical solution of the embodiments of the present application, second eye motion information of the user's eyes and/or head motion information of the user's head is obtained, and a corresponding operation is performed on the first moving object according to the second eye motion information and/or the head motion information. That is, in this technical solution, after the first moving object is determined, the corresponding operation is performed on it only on the basis of the eye motion information of the user's eyes and/or the head motion information of the user's head, so as to avoid misoperation, thereby achieving the technical effect of improving user experience.
Brief description of the drawings
Fig. 1 is a flowchart of a specific implementation of an information processing method provided by Embodiment 1 of the present application;
Fig. 2 is a schematic diagram of an eye motion trajectory and the motion trajectories of moving objects on a display unit in an information processing method provided by Embodiment 1 of the present application;
Fig. 3 is a schematic structural diagram of an electronic device provided by Embodiment 2 of the present application;
Fig. 4 is a schematic structural diagram of an electronic device provided by Embodiment 2 of the present application, embodied as a head-mounted device.
Specific embodiment
The technical solutions provided by the embodiments of the present application solve the prior-art technical problem that eye tracking has low accuracy, thereby achieving the technical effect of improving the accuracy of eye tracking.
To solve the above technical problem, the general idea of the technical solution in the embodiments of the present application is as follows:
Obtain eye motion information of a user's eyes;
Obtain M pieces of object motion information for M moving objects on a display unit, where M is an integer greater than or equal to 1;
Determine a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
In the above technical solution, eye motion information of a user's eyes is obtained; M pieces of object motion information for M moving objects on a display unit are obtained, M being an integer greater than or equal to 1; and a first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information. Unlike prior-art eye-movement control, the display screen does not first have to be calibrated; in the prior art, skipping calibration degrades tracking accuracy, whereas in this technical solution the first moving object is determined from the motion information of the eyes and the M pieces of object motion information of the M moving objects on the display unit, so that a corresponding operation can be performed on that moving object and tracking accuracy is ensured even without calibration. The prior-art technical problem that eye tracking has low accuracy is thereby effectively solved, achieving the technical effect of improving eye-tracking accuracy.
To describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings in the following description show only embodiments of the present application; those of ordinary skill in the art can obtain other drawings from the provided drawings without creative effort.
Embodiment 1
Please refer to Fig. 1, which shows an information processing method provided by Embodiment 1 of the present application, comprising:
S101: obtaining eye motion information of a user's eyes;
S102: obtaining M pieces of object motion information for M moving objects on a display unit, where M is an integer greater than or equal to 1;
S103: determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
The information processing method provided by this embodiment of the application can be applied to an electronic device such as a head-mounted device with a camera, a tablet computer, a smartphone, or a laptop computer, or to other electronic devices, which are not enumerated one by one here.
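Steps S101 to S103 can be summarized as the following sketch. All names here are hypothetical; the patent does not prescribe any API, and the match-scoring function stands in for either of the two concrete cases developed below (direction angles or trajectory association).

```python
def select_moving_object(eye_motion, object_motions, match_score, threshold):
    """Pick the on-screen moving object whose motion best matches the eyes.

    eye_motion     -- motion information of the user's eyes (S101)
    object_motions -- list of M pieces of object motion information (S102)
    match_score    -- callable scoring how well two motions agree
    threshold      -- minimum score required for a match
    Returns the index of the first moving object, or None (S103).
    """
    best_index, best_score = None, threshold
    for i, obj_motion in enumerate(object_motions):
        score = match_score(eye_motion, obj_motion)
        if score >= best_score:
            best_index, best_score = i, score
    return best_index
```

With an exact-match scorer, for example, `select_moving_object(5, [1, 5, 9], lambda a, b: 1.0 if a == b else 0.0, 0.5)` returns index 1.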
In this embodiment of the application, step S101 is performed first: obtaining the eye motion information of the user's eyes.
In this embodiment, the eye-movement information of the eyes can be obtained by an eye-tracking device in the electronic device. Alternatively, a face image can be captured by a camera on the electronic device, an eye image detected from it by a face-image recognition model, and the eye image then processed (grayscale conversion, edge extraction, and the like) to obtain the eye-movement information. In this embodiment, the eye-movement information includes but is not limited to: fixation point, fixation count, saccade distance, or pupil size. Specifically, there are three basic modes of eye movement: fixation, saccade, and smooth pursuit.
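As a rough illustration of the grayscale and edge processing just mentioned, the following NumPy-only sketch (helper names are assumptions; a real implementation would use a dedicated vision library) converts an RGB eye image to grayscale and marks edge pixels by gradient magnitude:

```python
import numpy as np

def to_gray(rgb):
    """Luminance-weighted grayscale conversion (H x W x 3 -> H x W)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def edge_map(gray, thresh=30.0):
    """Crude edge extraction: mark pixels whose gradient magnitude
    exceeds a threshold, e.g. around the pupil contour."""
    gy, gx = np.gradient(gray.astype(float))
    return np.hypot(gx, gy) > thresh
```

For instance, a dark frame with a bright square yields edge pixels only along the square's border.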
The motion information of the eyes is then obtained from the acquired eye-movement information. In this embodiment of the application, there are two specific cases for the motion information of the eyes, and each case is described in detail below.
In the first case, the specific implementation of step S101 comprises the following step:
Obtaining a first motion direction of the user's eyes;
Correspondingly, the specific implementation of step S102, obtaining M pieces of object motion information for M moving objects on the display unit, M being an integer greater than or equal to 1, comprises the following step:
Obtaining M motion directions of the M moving objects on the display unit.
In a specific implementation, the first motion direction of the user's eyes is obtained from the user's eye-movement information. For example, relative to the display unit: moving rightward from the left edge of the display unit; or moving downward along the right edge of the display unit; or moving downward from the top edge of the display unit; and so on.
Correspondingly, in this embodiment of the application, while the eye motion direction of the user's eyes is obtained, the M motion directions of the M moving objects on the display unit are also obtained.
In this embodiment of the application, the M moving objects may be, for example: 3 flying birds shown on the display unit, where the user obtains the details of a bird by tracking it; or, in a game in which a frog eats flies, 4 flies shown on the display unit, where the frog eats the fly the user fixates on; or, when the user queries a route in Baidu Map, different target positions shown on the display unit, where the user fixates on target positions, say target position A and target position B, and the navigation route from A to B is determined automatically; and so on.
In a specific implementation, suppose M is 2. The motion direction of the first moving object may be, for example, rightward from the left edge of the display unit, and the motion direction of the second moving object may be 30 degrees east of north relative to the center coordinate of the display unit; or the motion direction of the first moving object may be 30 degrees west of south relative to the center coordinate of the display unit, and the motion direction of the second moving object may be downward from the top edge of the display unit; or they may be other directions, which are not specifically limited in this embodiment of the application.
In this embodiment of the application, when the eye motion information of the eyes is specifically a motion direction, the specific implementation of step S103, determining the first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information, comprises the following steps:
Calculating the angle between the first motion direction and each of the M motion directions, obtaining M angles in total;
Determining, from the M angles, a first angle that is smaller than a preset angle;
Taking the moving object corresponding to the first angle as the first moving object.
In this embodiment of the application, take M = 2 as an example: the motion direction of the first moving object is rightward from the left edge of the display unit; the motion direction of the second moving object is 30 degrees east of north relative to the center coordinate of the display unit; and the first motion direction of the eyes is rightward from the left edge of the display unit. The preset angle may be 5 degrees, 10 degrees, 15 degrees, or some other angle, which those of ordinary skill in the art can set according to actual needs; it is not specifically limited in this embodiment of the application.
In a specific implementation, suppose the preset angle is 10 degrees. The first angle, between the first motion direction and the motion direction of the first moving object, is 0 degrees; the second angle, between the first motion direction and the motion direction of the second moving object, is 30 degrees. Since the first angle is smaller than the preset angle of 10 degrees and the second angle is larger than it, it is determined that the moving object corresponding to the first angle is the first moving object.
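The angle comparison above can be sketched as follows. The helper names and the 2-D vector representation of a direction are assumptions; the second object's direction is simply chosen 30 degrees away from the eye direction so that the two angles are 0 and 30 degrees, as in the worked example.

```python
import math

def angle_between(d1, d2):
    """Angle in degrees between two 2-D direction vectors."""
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    norm = math.hypot(*d1) * math.hypot(*d2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def pick_by_direction(eye_dir, object_dirs, preset_angle=10.0):
    """Return the index of the moving object whose motion direction is
    within the preset angle of the eye motion direction, or None."""
    for i, d in enumerate(object_dirs):
        if angle_between(eye_dir, d) < preset_angle:
            return i
    return None

# Worked example: the eyes and object 1 both move rightward (angle 0),
# while object 2 moves in a direction 30 degrees away.
eye = (1.0, 0.0)
objs = [(1.0, 0.0),
        (math.cos(math.radians(30)), math.sin(math.radians(30)))]
```

With the preset angle of 10 degrees from the text, `pick_by_direction(eye, objs)` selects object 1 (index 0), because 0 degrees is below the preset while 30 degrees is not.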
In the second case, the specific implementation of step S101 comprises the following step:
Obtaining a first motion trajectory of the user's eyes;
Correspondingly, the specific implementation of step S102, obtaining M pieces of object motion information for M moving objects on the display unit, M being an integer greater than or equal to 1, comprises the following step:
Obtaining M motion trajectories of the M moving objects on the display unit.
In a specific implementation, the eye-tracking device in the electronic device can obtain the motion direction of the eyes, changes in their movement posture, motion trends, and the like, and from these determine the motion trajectory of the eyes. For example, relative to the display screen of the electronic device: moving 5 centimeters rightward from the left edge of the display screen and then 5 centimeters upward; or moving 5 centimeters downward from the top edge of the display screen and then 5 centimeters rightward; or other motion trajectories, which are not enumerated one by one here.
Correspondingly, in this embodiment of the application, while the eye motion trajectory of the user's eyes is obtained, the M motion trajectories of the M moving objects on the display unit are also obtained. In this embodiment, the M moving objects may be, for example: 3 flying birds shown on the display unit; or, in a game in which a frog eats flies, 4 flies shown on the display unit; or other display objects shown on the display unit, which are not specifically limited in this embodiment of the application.
In a specific implementation, suppose M is 2. The motion trajectory of the first moving object is 5 centimeters rightward from the left edge of the display screen, then 5 centimeters downward; the motion trajectory of the second moving object is 5 centimeters downward from the top edge of the display screen, then 5 centimeters rightward; or they may be other motion trajectories, which are not enumerated one by one here.
Correspondingly, in this embodiment of the application, when the motion information of the eyes is specifically a motion trajectory of the eyes, the specific implementation of step S103, determining the first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information, comprises the following steps:
Calculating the degree of association between the first motion trajectory and each of the M motion trajectories, obtaining M degrees of association in total;
Determining, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
Taking the moving object corresponding to the first degree of association as the first moving object.
In this embodiment of the application, take M = 2 as an example: the motion trajectory of the first moving object is 5 centimeters rightward from the left edge of the display screen, then 5 centimeters downward; the motion trajectory of the second moving object is 5 centimeters downward from the top edge of the display screen, then 5 centimeters rightward; and the first motion trajectory is 5 centimeters rightward from the left edge of the display screen, then 5 centimeters downward. The preset degree of association may be 50%, 70%, 80%, or some other degree of association, which those of ordinary skill in the art can set according to actual needs; it is not specifically limited in this embodiment of the application.
In a specific implementation, to calculate the degree of association between the first motion trajectory and the M motion trajectories, the feature points of the first motion trajectory and of each object's motion trajectory can be extracted and then compared. In this embodiment, a feature point is a point that reflects the shape of a motion trajectory, such as the start point, turning points, peak points, and end point of the trajectory.
In a specific implementation, 4 feature points of the first motion trajectory, 4 feature points of the first object's motion trajectory, and 4 feature points of the second object's motion trajectory are extracted, and it is then determined whether the extracted feature points match. In this embodiment, determining whether two feature points match means determining whether the positional deviation between them is within a preset range.
In a specific implementation, suppose the preset degree of association is 80%. The 4 feature points of the first motion trajectory match the 4 feature points of the first object's motion trajectory exactly, so the first degree of association, between the first motion trajectory and the first object's trajectory, is 100%. Only 2 of the 4 feature points of the first motion trajectory match the 4 feature points of the second object's motion trajectory, so the second degree of association, between the first motion trajectory and the second object's trajectory, is 50%. Since the first degree of association is greater than the preset 80% and the second degree of association is less than it, it is determined that the moving object corresponding to the first degree of association is the first moving object. A specific schematic diagram is shown in Fig. 2.
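The feature-point comparison above can be sketched as follows. The point-list representation (start, turning point, end, in screen centimeters) and the helper names are assumptions; three feature points are used instead of the four in the text, purely for brevity.

```python
def degree_of_association(points_a, points_b, max_deviation=1.0):
    """Fraction of corresponding feature points whose positional
    deviation stays within the preset range."""
    matched = sum(
        1
        for (xa, ya), (xb, yb) in zip(points_a, points_b)
        if abs(xa - xb) <= max_deviation and abs(ya - yb) <= max_deviation
    )
    return matched / len(points_a)

def pick_by_trajectory(eye_points, object_points_list, preset=0.8):
    """Return the index of the object whose trajectory's degree of
    association with the eye trajectory exceeds the preset, or None."""
    for i, pts in enumerate(object_points_list):
        if degree_of_association(eye_points, pts) > preset:
            return i
    return None

# Worked example (screen cm): the eye trajectory moves 5 cm right from
# the left edge, then 5 cm down, exactly as object 1 does; object 2
# moves 5 cm down from the top edge, then 5 cm right.
eye_traj = [(0, 0), (5, 0), (5, 5)]   # start, turning point, end
obj1 = [(0, 0), (5, 0), (5, 5)]       # identical: association 1.0
obj2 = [(2, -3), (2, 2), (7, 2)]      # all points deviate: association 0.0
```

With the preset degree of association of 80% from the text, `pick_by_trajectory(eye_traj, [obj1, obj2])` selects object 1 (index 0).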
In this embodiment of the application, after the first moving object is determined from the M moving objects, the method further comprises:
Moving a cursor to a position corresponding to the first moving object.
Further, to make operating the electronic terminal more user-friendly and realize eye-movement control of the electronic device, when the first moving object is determined, the cursor on the display unit is also moved to the position corresponding to the first moving object, so that the user feels the cursor following the eye movement, thus providing a better experience.
Further, in this embodiment of the application, after the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information, the method further comprises:
Obtaining second eye motion information of the user's eyes and/or head motion information of the user's head;
Performing a corresponding operation on the first moving object according to the second eye motion information and/or the head motion information.
In this embodiment of the application, eye actions of the user, such as blinking, and/or head actions, such as shaking or nodding the head, can be recognized by a gravity sensor or a camera. These actions correspond to different instructions, and different instructions implement different operations. Specifically, for example, when the determined moving object is the first moving object, flying bird A: if the user's head action is detected to be a nod, it indicates that the user wants to obtain the details of bird A, so an instruction is generated and the details of bird A are shown on the display unit; or, if the user's head action is recognized as turning the head to the left, bird A is shot at. In this embodiment, head actions and their corresponding operations can also be compiled into a table; when a head action is recognized, the table is looked up to find the corresponding operation.
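The look-up table described here might be sketched as a simple mapping; the action names and operations below are illustrative only, not prescribed by the patent.

```python
# Head actions mapped to operations; entries are hypothetical examples.
HEAD_ACTION_TABLE = {
    "nod": "show_details",        # e.g. display the details of bird A
    "turn_left": "shoot",         # e.g. shoot at bird A
    "shake": "cancel_selection",
}

def operation_for(head_action):
    """Look up the operation for a recognized head action; unknown
    actions map to no operation (None)."""
    return HEAD_ACTION_TABLE.get(head_action)
```

An unrecognized action, such as a wink, simply yields no operation, which is one way the table helps avoid misoperation.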
In this embodiment of the application, after the corresponding moving object is determined, a confirming action of the user's eyes or head is still required before the corresponding operation is performed on the moving object, which avoids misoperation by the user and thereby achieves the technical effect of improving user experience.
Embodiment 2
Based on the same inventive concept, an embodiment of the present application further provides an electronic device. Please refer to Fig. 3; the device comprises:
A housing 30;
A display unit 31;
An eye-movement monitoring component 32, arranged in the housing 30 and connected to the display unit 31, for obtaining eye motion information of a user's eyes;
A processing unit 33, arranged in the housing 30 and connected to the eye-movement monitoring component, for obtaining M pieces of object motion information for M moving objects on the display unit, M being an integer greater than or equal to 1, and determining a first moving object from the M moving objects based on the eye motion information and the M pieces of object motion information.
In this embodiment of the application, the electronic device may specifically be a head-mounted device, a smartphone, a tablet computer, a laptop computer, or another electronic device, which is not specifically limited in this embodiment of the application.
In this embodiment, the eye-movement monitoring component may specifically be a camera, and the processing unit may specifically be one central processing unit CPU1, two central processing units CPU1 and CPU2, or another number of central processing units, which those of ordinary skill in the art can determine according to actual needs; it is not specifically limited in this embodiment of the application.
Optionally, the eye-movement monitoring component 32 is configured to:
Obtain a first motion direction of the user's eyes;
Correspondingly, the processing unit 33 is configured to:
Obtain M motion directions of the M moving objects on the display unit.
Optionally, the processing unit 33 is configured to:
Calculate the angle between the first motion direction and each of the M motion directions, obtaining M angles in total;
Determine, from the M angles, a first angle that is smaller than a preset angle;
Take the moving object corresponding to the first angle as the first moving object.
Optionally, the eye-movement monitoring component 32 is configured to:
Obtain a first motion trajectory of the user's eyes;
Correspondingly, the processing unit 33 is configured to:
Obtain M motion trajectories of the M moving objects on the display unit.
Optionally, the processing unit 33 is configured to:
Calculate the degree of association between the first motion trajectory and each of the M motion trajectories, obtaining M degrees of association in total;
Determine, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
Take the moving object corresponding to the first degree of association as the first moving object.
Optionally, when the first moving object is determined from the M moving objects, the processing unit 33 is further configured to:
Move a cursor to a position corresponding to the first moving object.
Optionally, after the first moving object is determined from the M moving objects based on the eye motion information and the M pieces of object motion information, the electronic device further comprises:
A sensing device 34, configured to:
Obtain second eye motion information of the user's eyes and/or head motion information of the user's head;
The processing unit 33 is further configured to:
Perform a corresponding operation on the first moving object according to the second eye motion information and/or the head motion information.
In this embodiment of the application, take a head-mounted device as a specific example of the electronic device. Please refer to Fig. 4. The head-mounted device includes a structural member 400, comprising a nose support 401 and ear mounts 402, for wearing the electronic terminal on the user's body. The eye-movement monitoring component 32 can be located on the structural member 400, for example at the nose support 401, or anywhere on the display unit 31 from which the user's eye movement can be monitored. With the electronic device worn on the user's body, the eye-movement monitoring component 32 monitors the user's eye movement. Further, the sensing device 34 for recognizing the user's eye actions and/or head actions may also be located on the structural member 400; the sensing device may specifically be a gravity sensor or a camera, which those of ordinary skill in the art can set according to actual needs; it is not specifically limited in this embodiment of the application. Therefore, in this embodiment of the application, when the user wears the electronic device, the electronic device can easily be controlled by eye movement and/or head actions, with high control efficiency and little risk of misoperation.
Through one or more of the above embodiments of the present invention, at least the following technical effects may be achieved:
One, according to the technical solution in the embodiment of the present application, the eye motion information of the eyes of the user is obtained; M pieces of object motion information of M Moving Objects on the display unit are obtained, where M is an integer greater than or equal to 1; and the first Moving Object is determined from the M Moving Objects based on the eye motion information and the M pieces of object motion information. That is, unlike eye movement control in the prior art, there is no need to first calibrate the display screen, where any calibration error would affect tracking accuracy. In this technical solution, the first Moving Object is determined according to the motion information of the eyes and the M pieces of object motion information of the M Moving Objects on the display unit, and the corresponding operation is then performed on that Moving Object, so that no calibration is needed while tracking accuracy is still ensured. Thus, the technical problem of low accuracy in prior-art ocular pursuit technology can be effectively solved, thereby achieving the technical effect of improving ocular pursuit accuracy.
Two, according to the technical solution in the embodiment of the present application, when the first Moving Object is determined from the M Moving Objects, the cursor is moved to a position corresponding to the first Moving Object. That is, this technical solution gives the user the experience of the cursor moving with the eye movement, thereby further achieving the technical effect of improving user experience.
Three, according to the technical solution in the embodiment of the present application, second eye motion information of the eyes of the user and/or head movement information of the head is obtained, and a corresponding operation is performed on the first Moving Object according to the second eye motion information and/or the head movement information. That is, in this technical solution, after the first Moving Object is determined, the corresponding operation is performed on it through the eye motion information of the eyes of the user and/or the head movement information of the head, so as to avoid misoperation, thereby achieving the technical effect of improving user experience.
It should be understood by those skilled in the art that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, and the like) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be realized by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can guide a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, and the instruction device realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing; thus the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Specifically, the computer program instructions corresponding to the information processing method in the embodiment of the present application may be stored on a storage medium such as a CD, a hard disk, or a USB flash disk. When the computer program instructions corresponding to the information processing method in the storage medium are read or executed by an electronic equipment, the following steps are included:
Obtain the eye motion information of the eyes of the user;
Obtain M pieces of object motion information of M Moving Objects on the display unit, where M is an integer greater than or equal to 1;
Determine the first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information.
Optionally, the computer instructions stored in the storage medium corresponding to the step of obtaining the eye motion information of the eyes of the user specifically include, when being executed, the following step:
Obtain a first motion direction of the eyes of the user;
Correspondingly, the computer instructions stored in the storage medium corresponding to the step of obtaining the M pieces of object motion information of the M Moving Objects on the display unit specifically include, when being executed, the following step:
Obtain M motion directions of the M Moving Objects on the display unit.
Optionally, the computer instructions stored in the storage medium corresponding to the step of determining the first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information specifically include, when being executed, the following steps:
Calculate the angle between the first motion direction and each of the M motion directions, obtaining M angles in total;
Determine, from the M angles, a first angle that is smaller than a preset angle;
Take the Moving Object corresponding to the first angle as the first Moving Object.
Optionally, the computer instructions stored in the storage medium corresponding to the step of obtaining the eye motion information of the eyes of the user specifically include, when being executed, the following step:
Obtain a first motion profile of the eyes of the user;
Correspondingly, the computer instructions stored in the storage medium corresponding to the step of obtaining the M pieces of object motion information of the M Moving Objects on the display unit specifically include, when being executed, the following step:
Obtain M motion profiles of the M Moving Objects on the display unit.
Optionally, the computer instructions stored in the storage medium corresponding to the step of determining the first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information specifically include, when being executed, the following steps:
Calculate the degree of association between the first motion profile and each of the M motion profiles, obtaining M degrees of association in total;
Determine, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
Take the Moving Object corresponding to the first degree of association as the first Moving Object.
Optionally, further computer instructions are stored in the storage medium, and the further computer instructions are executed when the computer instructions corresponding to the step of determining the first Moving Object from the M Moving Objects are executed; when being executed, the further computer instructions specifically include the following step:
Move the cursor to a position corresponding to the first Moving Object.
Optionally, further computer instructions are stored in the storage medium, and the further computer instructions are executed after the computer instructions corresponding to the step of determining the first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information are executed; when being executed, the further computer instructions specifically include the following steps:
Obtain second eye motion information of the eyes of the user and/or head movement information of the head;
Perform a corresponding operation on the first Moving Object according to the second eye motion information and/or the head movement information.
In order to make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only a part of the embodiments of the present invention, rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the present invention.
Although preferred embodiments of the present invention have been described, once a person skilled in the art knows the basic creative concept, additional changes and modifications may be made to these embodiments. Therefore, the following claims are intended to be interpreted as including the preferred embodiments and all changes and modifications that fall within the scope of the present invention.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. In this way, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include these modifications and variations.
Claims (10)
1. An information processing method, comprising:
obtaining eye motion information of eyes of a user;
obtaining M pieces of object motion information of M Moving Objects on a display unit, M being an integer greater than or equal to 1;
determining a first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information;
wherein the obtaining eye motion information of the eyes of the user comprises: obtaining a first motion direction of the eyes of the user;
correspondingly, the obtaining M pieces of object motion information of the M Moving Objects on the display unit comprises: obtaining M motion directions of the M Moving Objects on the display unit;
the determining the first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information comprises:
calculating the angle between the first motion direction and each of the M motion directions, obtaining M angles in total;
determining, from the M angles, a first angle that is smaller than a preset angle;
taking the Moving Object corresponding to the first angle as the first Moving Object.
2. The method as described in claim 1, wherein the obtaining eye motion information of the eyes of the user comprises:
obtaining a first motion profile of the eyes of the user;
correspondingly, the obtaining M pieces of object motion information of the M Moving Objects on the display unit comprises:
obtaining M motion profiles of the M Moving Objects on the display unit.
3. The method as described in claim 2, wherein the determining the first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information comprises:
calculating the degree of association between the first motion profile and each of the M motion profiles, obtaining M degrees of association in total;
determining, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
taking the Moving Object corresponding to the first degree of association as the first Moving Object.
4. The method as described in any one of claims 1-3, wherein, when the first Moving Object is determined from the M Moving Objects, the method further comprises:
moving the cursor to a position corresponding to the first Moving Object.
5. The method as described in claim 4, wherein, after the determining the first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information, the method further comprises:
obtaining second eye motion information of the eyes of the user and/or head movement information of a head;
performing a corresponding operation on the first Moving Object according to the second eye motion information and/or the head movement information.
6. An electronic equipment, comprising:
a shell;
a display unit;
an eye movement monitoring component, arranged in the shell and connected with the display unit, for obtaining eye motion information of eyes of a user;
a processing unit, arranged in the shell and connected with the eye movement monitoring component, for obtaining M pieces of object motion information of M Moving Objects on the display unit, M being an integer greater than or equal to 1, and determining a first Moving Object from the M Moving Objects based on the eye motion information and the M pieces of object motion information;
wherein the eye movement monitoring component is used to:
obtain a first motion direction of the eyes of the user;
correspondingly, the processing unit is used to:
obtain M motion directions of the M Moving Objects on the display unit;
the processing unit is used to:
calculate the angle between the first motion direction and each of the M motion directions, obtaining M angles in total;
determine, from the M angles, a first angle that is smaller than a preset angle;
take the Moving Object corresponding to the first angle as the first Moving Object.
7. The electronic equipment as described in claim 6, wherein the eye movement monitoring component is used to:
obtain a first motion profile of the eyes of the user;
correspondingly, the processing unit is used to:
obtain M motion profiles of the M Moving Objects on the display unit.
8. The electronic equipment as described in claim 7, wherein the processing unit is used to:
calculate the degree of association between the first motion profile and each of the M motion profiles, obtaining M degrees of association in total;
determine, from the M degrees of association, a first degree of association that is greater than a preset degree of association;
take the Moving Object corresponding to the first degree of association as the first Moving Object.
9. The electronic equipment as described in any one of claims 6-8, wherein, when the first Moving Object is determined from the M Moving Objects, the processing unit is further used to:
move the cursor to a position corresponding to the first Moving Object.
10. The electronic equipment as described in claim 9, wherein, after the first Moving Object is determined from the M Moving Objects based on the eye motion information and the M pieces of object motion information, the electronic equipment further comprises:
a sensing device, used to:
obtain second eye motion information of the eyes of the user and/or head movement information of a head;
wherein the processing unit is further used to:
perform a corresponding operation on the first Moving Object according to the second eye motion information and/or the head movement information.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610460487.0A CN106125926B (en) | 2016-06-22 | 2016-06-22 | A kind of information processing method and electronic equipment |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN106125926A CN106125926A (en) | 2016-11-16 |
| CN106125926B true CN106125926B (en) | 2019-10-29 |
Family
ID=57269144
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201610460487.0A Active CN106125926B (en) | 2016-06-22 | 2016-06-22 | A kind of information processing method and electronic equipment |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN106125926B (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104750232A (en) * | 2013-12-28 | 2015-07-01 | 华为技术有限公司 | Eye tracking method and eye tracking device |
| CN105247447A (en) * | 2013-02-14 | 2016-01-13 | 眼球控制技术有限公司 | Systems and methods of eye tracking calibration |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2010071928A1 (en) * | 2008-12-22 | 2010-07-01 | Seeing Machines Limited | Automatic calibration of a gaze direction algorithm from user behaviour |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7283506B2 (en) | Information processing device, information processing method, and information processing program | |
| US11625841B2 (en) | Localization and tracking method and platform, head-mounted display system, and computer-readable storage medium | |
| US11947729B2 (en) | Gesture recognition method and device, gesture control method and device and virtual reality apparatus | |
| US11749025B2 (en) | Eye pose identification using eye features | |
| JP6979475B2 (en) | Head-mounted display tracking | |
| Memo et al. | Head-mounted gesture controlled interface for human-computer interaction | |
| US10684469B2 (en) | Detecting and mitigating motion sickness in augmented and virtual reality systems | |
| US10481689B1 (en) | Motion capture glove | |
| EP3063602B1 (en) | Gaze-assisted touchscreen inputs | |
| CN106529409B (en) | A Method for Measuring Eye Gaze Angle Based on Head Posture | |
| US20190026589A1 (en) | Information processing device, information processing method, and program | |
| US10127713B2 (en) | Method and system for providing a virtual space | |
| CN108592865A (en) | Geometric measurement method and its device, AR equipment based on AR equipment | |
| US9547412B1 (en) | User interface configuration to avoid undesired movement effects | |
| US20170289518A1 (en) | Apparatus for replaying content using gaze recognition and method thereof | |
| US10488949B2 (en) | Visual-field information collection method and system for executing the visual-field information collection method | |
| US10775883B2 (en) | Information processing method, information processing apparatus and user equipment | |
| EP3582068A1 (en) | Information processing device, information processing method, and program | |
| CN106125926B (en) | A kind of information processing method and electronic equipment | |
| JP2015118577A5 (en) | ||
| KR101741149B1 (en) | Method and device for controlling a virtual camera's orientation | |
| CN119563153A (en) | Improved interaction accuracy for gaze-enabled AR objects while in motion | |
| US10268265B2 (en) | Information processing method, information processing apparatus and user equipment | |
| US11507185B1 (en) | Electrooculography-based eye tracking using normalized electrode input | |
| JP2021063922A (en) | Information processing device, information processing method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||