CN112162627A - Eyeball tracking method combined with head movement detection and related device - Google Patents
- Publication number: CN112162627A (application CN202010882804.4A)
- Authority: CN (China)
- Prior art keywords: head movement, terminal, user, head, fixation point
- Prior art date
- Legal status: Withdrawn (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Embodiments of the present application disclose an eyeball tracking method combined with head movement detection, and a related device, for improving the accuracy with which a terminal tracks and determines a user's eyeball fixation point. The method in the embodiments comprises the following steps: when the terminal detects that a user is gazing at the screen, the terminal extracts eyeball feature information of the user through an eyeball tracking technology and calculates fixation point coordinates; the terminal displays a visual feedback identifier on the screen, wherein the visual feedback identifier is used to prompt the user about the area where the fixation point is located; when the terminal detects that the fixation point is not in a target area, the terminal sets a head movement range with a preset value as its radius; the terminal detects head movement information of the user within the head movement range; and the terminal determines the head movement condition according to the head movement information and corrects the fixation point coordinates and the visual feedback identifier according to the head movement condition.
Description
Technical Field
The present disclosure relates to the field of data processing, and more particularly to an eyeball tracking method combined with head movement detection, and a related apparatus.
Background
With the continuing popularization of smart devices, a new interaction mode has emerged in the field of human-computer interaction alongside traditional touch and voice control: eyeball tracking control. Eyeball tracking control means that the terminal estimates the direction and position at which the user is gazing by tracking the user's eyeballs, so that the user can control the terminal, for example by turning pages or controlling the screen-lock time through eyeball tracking.
In the prior art, eyeball tracking is mainly implemented by capturing the position of the eyeball with a camera and recording video; information such as the user's fixation position and the eyeball's movement path is then determined through image analysis of the video and converted into control instructions that execute the corresponding operations.
In actual use, however, owing to the limitations of ambient light and the large individual differences in users' eye habits, the terminal tracks and determines the user's eyeballs with low accuracy, which degrades the user experience.
Disclosure of Invention
Embodiments of the present application provide an eyeball tracking method combined with head movement detection, and a related device, which can improve the accuracy with which a terminal tracks and determines a user's eyeball fixation point and thereby improve the user experience.
A first aspect of the embodiments of the present application provides an eyeball tracking method combined with head movement detection, comprising:
when a terminal detects that a user is gazing at a screen, the terminal extracts eyeball feature information of the user through an eyeball tracking technology and calculates fixation point coordinates, wherein the eyeball feature information comprises pupil feature information and iris feature information, and the fixation point coordinates reflect the user's fixation point on the screen;
the terminal displays a visual feedback identifier on the screen, wherein the visual feedback identifier is used to prompt the user about the area where the fixation point is located;
when the terminal detects that the fixation point is not in a target area, the terminal sets a head movement range with a preset value as its radius, wherein the target area is the area to be operated in the screen, the head movement range is used to limit the head movement amplitude detected by the terminal, and head movement detected by the terminal within the head movement range constitutes valid head movement information;
the terminal detects head movement information of the user within the head movement range, wherein the head movement information comprises yaw angle and pitch angle values of the head in three-dimensional space;
and the terminal determines the head movement condition according to the head movement information and corrects the fixation point coordinates and the visual feedback identifier according to the head movement condition.
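The correction flow of the first aspect can be sketched as follows. This is an illustrative assumption, not the patent's actual implementation: the function name, the rectangle representation of the target area, the pixel units, and the radius value are all hypothetical.

```python
def correct_fixation(gaze_xy, target_area, head_move, radius=50.0):
    """Sketch of the claimed flow: keep the fixation point if it is
    already inside the target area; otherwise apply the head movement
    as a correction, but only when the movement stays within the
    preset radius (movement beyond it is not valid head movement
    information). All units and values are illustrative."""
    x0, y0, x1, y1 = target_area        # area to be operated (assumed rectangular)
    gx, gy = gaze_xy
    if x0 <= gx <= x1 and y0 <= gy <= y1:
        return (gx, gy)                  # already inside, no correction needed
    dx, dy = head_move                   # displacement derived from yaw/pitch
    if (dx * dx + dy * dy) ** 0.5 > radius:
        return (gx, gy)                  # outside the head movement range: ignore
    return (gx + dx, gy + dy)            # corrected fixation point
```

For example, a fixation point at (30, 30) with the target area in the upper-left corner is nudged toward it by a small head movement, while an overly large movement is discarded.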
Optionally, the detecting, by the terminal, of head movement information of the user within the head movement range comprises: the terminal acquiring the coordinates of a plurality of key points of the head;
the terminal tracking the coordinate changes of the plurality of key points within the head movement range;
and the terminal obtaining the head movement information according to the coordinate changes of the plurality of key points, wherein the head movement information comprises yaw angle and pitch angle values of the head in three-dimensional space.
Optionally, when the terminal detects that the fixation point is not in the target area, the method further comprises:
generating prompt information, wherein the prompt information is used to prompt the user to move the head so that the terminal corrects the fixation point coordinates into the target area.
Optionally, the preset value is determined by the device's resolution precision, the device's inherent precision, and the size of the user's eyeball.
Optionally, the displaying, by the terminal, of a visual feedback identifier on the screen comprises:
the terminal highlighting the area where the fixation point is located on the screen;
or
the terminal indicating the fixation point coordinates on the screen with a mouse icon.
A second aspect of the embodiments of the present application provides a terminal, comprising:
a first detection unit, configured to detect whether a user is gazing at a screen;
an extraction and calculation unit, configured to, when the first detection unit detects that the user is gazing at the screen, extract eyeball feature information of the user through an eyeball tracking technology and calculate fixation point coordinates, wherein the eyeball feature information comprises pupil feature information and iris feature information, and the fixation point coordinates reflect the user's fixation point on the screen;
a display unit, configured to display a visual feedback identifier on the screen, wherein the visual feedback identifier is used to prompt the user about the area where the fixation point is located;
a second detection unit, configured to detect whether the fixation point is located in a target area, wherein the target area is the area to be operated in the screen;
a setting unit, configured to set a head movement range with a preset value as its radius when the second detection unit detects that the fixation point is not within the target area, wherein the head movement range is used to limit the head movement amplitude detected by the terminal, and head movement detected by the terminal within the head movement range constitutes valid head movement information;
a third detection unit, configured to detect head movement information of the user within the head movement range, wherein the head movement information comprises yaw angle and pitch angle values of the head in three-dimensional space;
a judging unit, configured to determine the head movement condition according to the head movement information;
and a correction unit, configured to correct the fixation point coordinates and the visual feedback identifier according to the head movement condition.
Optionally, the third detection unit comprises:
a first acquisition module, configured to acquire the coordinates of a plurality of key points of the head;
a tracking module, configured to track the coordinate changes of the plurality of key points within the head movement range;
and a second acquisition module, configured to obtain the head movement information according to the changes in the coordinates of the plurality of key points, wherein the head movement information comprises yaw angle and pitch angle values of the head in three-dimensional space.
Optionally, the terminal further comprises:
a generating unit, configured to generate prompt information when the second detection unit detects that the fixation point is not in the target area, wherein the prompt information is used to prompt the user to move the head so that the terminal corrects the fixation point coordinates and the fixation point falls within the target area.
A third aspect of the embodiments of the present application provides a terminal, comprising:
the device comprises a processor, a memory, an input and output unit and a bus;
the processor is connected to the memory, the input/output unit, and the bus;
the processor specifically performs the following operations:
when a terminal detects that a user is gazing at a screen, the terminal extracts eyeball feature information of the user through an eyeball tracking technology and calculates fixation point coordinates, wherein the eyeball feature information comprises pupil feature information and iris feature information, and the fixation point coordinates reflect the user's fixation point on the screen;
the terminal displays a visual feedback identifier on the screen, wherein the visual feedback identifier is used to prompt the user about the area where the fixation point is located;
when the terminal detects that the fixation point is not in a target area, the terminal sets a head movement range with a preset value as its radius, wherein the target area is the area to be operated in the screen, the head movement range is used to limit the head movement amplitude detected by the terminal, and head movement detected by the terminal within the head movement range constitutes valid head movement information;
the terminal detects head movement information of the user within the head movement range, wherein the head movement information comprises yaw angle and pitch angle values of the head in three-dimensional space;
and the terminal determines the head movement condition according to the head movement information and corrects the fixation point coordinates and the visual feedback identifier according to the head movement condition.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium having stored thereon a computer program for implementing the method according to any one of the implementations of the first aspect.
As can be seen from the above technical solutions, the embodiments of the present application have the following advantage:
when the terminal tracks the eyeballs to determine the fixation point, it determines the final fixation point in combination with head movement detection, which improves the accuracy with which the terminal tracks and determines the user's eyeball fixation point and improves the user experience.
Drawings
FIG. 1 is a flowchart illustrating an eye tracking method incorporating head movement detection according to an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating an eye tracking method incorporating head movement detection according to another embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an embodiment of a terminal in the embodiment of the present application;
fig. 4 is a schematic structural diagram of another embodiment of a terminal in the embodiment of the present application;
fig. 5 is a schematic structural diagram of another embodiment of a terminal in the embodiment of the present application.
Detailed Description
Embodiments of the present application provide an eyeball tracking method combined with head movement detection, and a related device, which can improve the accuracy with which a terminal tracks and determines a user's eyeball fixation point and improve the user experience.
Referring to fig. 1, an embodiment of the eyeball tracking method combined with head movement detection of the present application comprises:
101. When a terminal detects that a user is gazing at a screen, the terminal extracts eyeball feature information of the user through an eyeball tracking technology and calculates fixation point coordinates, wherein the eyeball feature information comprises pupil feature information and iris feature information, and the fixation point coordinates reflect the user's fixation point on the screen;
Eyeball tracking technology determines the moving direction and resting position of a user's line of sight by collecting the user's eyeball feature information; common implementations track feature changes around the eyeball, changes in the iris, and so on. The key to eyeball tracking is calculating the user's fixation point, that is, the point on the screen at which the user's eyes are gazing. Once the fixation point is obtained, the terminal can predict the user's state and needs and respond to them, achieving the goal of controlling the device with the eyes. However, individual differences between users are large, and different users have different eye-movement habits, so the terminal is prone to errors when determining a user's eyeball fixation point.
102. The terminal displays a visual feedback identifier on the screen, wherein the visual feedback identifier is used to prompt the user about the area where the fixation point is located;
For the user's convenience, the terminal can feed back the fixation point position it calculated through the eyeball tracking technology via the visual feedback identifier, so that the user knows whether the fixation point position currently calculated by the terminal is wrong and whether it matches the position the user intends to reach.
103. When the terminal detects that the fixation point is not in a target area, the terminal sets a head movement range with a preset value as its radius, wherein the target area is the area to be operated in the screen, the head movement range is used to limit the head movement amplitude detected by the terminal, and head movement detected by the terminal within the head movement range constitutes valid head movement information;
In the prior art, a camera is generally used for eyeball tracking, and the camera's collection range is limited: if the camera is located at the top of the screen while the user's fixation point is at the bottom of the screen, the camera cannot effectively collect the user's eye movement information, so accurate control cannot be achieved. When the fixation point is located in the expected area, the terminal can collect the user's eye movement information more effectively and achieve more accurate control. The target area refers to the area to be operated in the screen, that is, the expected area; therefore, when the terminal detects that the fixation point is not in the target area, the fixation point position is corrected.
In this embodiment, the user can correct the fixation point position through head movement. First, the terminal needs to set a head movement range with a preset value as its radius, because when the user's head moves too far, the eyes may leave the screen, in which case eyeball tracking and determination of the fixation point position should be performed again from the start rather than merely corrected. A head movement range should therefore be set to limit the head movement amplitude of the user detected by the terminal, and head movement detected by the terminal within that range constitutes valid head movement information.
104. The terminal detects head movement information of the user within the head movement range, wherein the head movement information comprises yaw angle and pitch angle values of the head in three-dimensional space;
It should be noted that, in this embodiment, the terminal may extract the yaw angle and pitch angle values of the head in three-dimensional space by capturing video of the head movement, and then determine the head's motion from this information. The pitch angle is the angle by which the head "pitches" relative to the XOY plane in a three-dimensional right-handed coordinate system; it is positive when the head's x-axis is above the XOY plane and negative otherwise. The yaw angle is the angle between the head's actual deflection and a predetermined forward direction, taking the direction of gravity as the axis.
105. The terminal determines the head movement condition according to the head movement information and corrects the fixation point coordinates and the visual feedback identifier according to the head movement condition.
The head's motion can be determined by comparing the measured yaw and pitch angle values in three-dimensional space with preset yaw and pitch angle values. For example: when the head's pitch angle value is smaller than a first preset pitch angle value, the corresponding head movement is determined to be lowering the head; when the pitch angle value is larger than a second preset pitch angle value, the movement is determined to be raising the head; and when the head's yaw angle value is larger than a second preset yaw angle value, the movement is determined to be turning the head to the left.
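The threshold comparison described above can be sketched as follows. The angle values are illustrative assumptions (the patent leaves the preset values unspecified), and the right-turn case is added by symmetry with the left-turn case:

```python
PITCH_DOWN_MAX = -10.0   # first preset pitch angle value (degrees, assumed)
PITCH_UP_MIN = 10.0      # second preset pitch angle value (assumed)
YAW_LEFT_MIN = 10.0      # second preset yaw angle value (assumed)

def head_movement(pitch, yaw):
    """Classify the head movement by comparing the measured pitch
    and yaw angles against the preset values described in the text."""
    if pitch < PITCH_DOWN_MAX:
        return "head down"
    if pitch > PITCH_UP_MIN:
        return "head up"
    if yaw > YAW_LEFT_MIN:
        return "turn left"
    if yaw < -YAW_LEFT_MIN:
        return "turn right"   # mirrored case, added by symmetry
    return "still"
```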
Further, the terminal can correct the fixation point position according to the head movement. For example, if the target area is at the upper left but the fixation point coordinates given by the eyeball tracking technology are at the lower left, the user can move the head slightly toward the upper-left area, and the fixation point position is corrected toward the upper left until it reaches the desired area.
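The correction in that example might be realized as a small shift of the fixation point in the direction of the head movement; the step size and the screen-coordinate convention (y grows downward) are assumptions of this sketch, not the patent's specification:

```python
def nudge_fixation(fixation, movement, step=40.0):
    """Shift the fixation point one step in the direction the user
    moved their head; an unrecognized movement state leaves the
    point unchanged. Step size in pixels is an assumed value."""
    shifts = {
        "head up": (0.0, -step),    # screen y decreases upward
        "head down": (0.0, step),
        "turn left": (-step, 0.0),
        "turn right": (step, 0.0),
    }
    dx, dy = shifts.get(movement, (0.0, 0.0))
    return (fixation[0] + dx, fixation[1] + dy)
```

A repeated small nudge per detected movement, rather than one large jump, matches the "move the head slightly" behavior the paragraph describes.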
In this embodiment, when the terminal performs eyeball tracking to determine the fixation point, it also determines the final fixation point in combination with head movement detection, which improves the accuracy with which the terminal tracks and determines the user's eyeball fixation point and improves the user experience.
Referring to fig. 2, another embodiment of the eyeball tracking method combined with head movement detection of the present application comprises:
201. When a terminal detects that a user is gazing at a screen, the terminal extracts eyeball feature information of the user through an eyeball tracking technology and calculates fixation point coordinates, wherein the eyeball feature information comprises pupil feature information and iris feature information, and the fixation point coordinates reflect the user's fixation point on the screen;
step 201 in this embodiment is similar to step 101 in the previous embodiment, and is not described here again.
202. The terminal displays a visual feedback identifier on the screen, wherein the visual feedback identifier is used to prompt the user about the area where the fixation point is located;
The terminal can feed back the fixation point position it calculated through the eyeball tracking technology to the user via the visual feedback identifier.
Specifically, the terminal may highlight the position of the fixation point on the screen to distinguish the fixation point's region from other regions, or it may indicate the fixation point position with a mouse-like icon that follows the fixation point as it changes. In this way the user can tell whether the fixation point position currently calculated by the terminal is wrong and whether it matches the position the user intends to reach.
203. When the terminal detects that the fixation point is not in a target area, the terminal sets a head movement range with a preset value as its radius, wherein the target area is the area to be operated in the screen, the head movement range is used to limit the head movement amplitude detected by the terminal, and head movement detected by the terminal within the head movement range constitutes valid head movement information;
It should be noted that the preset value in this embodiment is determined by the device's resolution precision, the device's inherent precision, and the size of the user's eyeball. The device's resolution precision is related to the pixel characteristics of the device's camera and is generally 1 to 6 mm; the device's inherent precision refers to the technical precision of the device's eyeball tracking technology and generally takes a value of 0.2 to 0.7; the user's eyeball size is the eyeball size collected by the camera. Empirically, the preset value can be calculated by multiplying the device's inherent precision by a fixed factor, dividing the user's eyeball size by the same fixed factor, and then taking the product of these two results and the resolution precision.
The rest of step 203 in this embodiment is similar to step 103 in the previous embodiment, and is not described here again.
204. When the terminal detects that the fixation point is not in the target area, the terminal generates prompt information, wherein the prompt information is used to prompt the user to move the head so that the terminal corrects the fixation point coordinates into the target area;
When the terminal detects that the fixation point position is not in the target area, that is, the desired area, the terminal may generate prompt information to prompt the user to move the head to correct the fixation point position. A specific moving direction may be given, for example by generating the prompt "please raise your head to correct the fixation point position". In this way the user can quickly grasp the method and key points of correcting the fixation point position, improving the user experience.
It should be noted that step 204 is optional; in actual operation, the user may choose to turn off the prompt function after becoming familiar with the correction procedure.
205. The terminal acquires the coordinates of a plurality of key points of the head;
The terminal needs to determine the user's head movement by capturing head motion information so as to correct the fixation point position, and it first needs to acquire the coordinates of a plurality of key points of the head; these may be the 68 facial key points used in face recognition technology. Preferably, the terminal acquires the coordinates of 5 key points (the left eye center, the right eye center, the nose tip, and the left and right mouth corners) for subsequent tracking and calculation, which keeps the computation simple and efficient.
206. The terminal tracks the coordinate changes of the plurality of key points within the head movement range;
After acquiring the plurality of key points of the head, the terminal needs to track the coordinate changes of these key points within the head movement range in order to calculate the required head movement information.
207. The terminal obtains the head movement information according to the coordinate changes of the plurality of key points, wherein the head movement information comprises yaw angle and pitch angle values of the head in three-dimensional space;
The terminal can calculate the head's yaw angle and pitch angle values in three-dimensional space from the changes in the coordinate values, and then determine the head's motion from these values, thereby achieving the goal of correcting the fixation point.
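A rough planar approximation of this step, using three of the five key points, is sketched below. A production system would instead solve a full 3-D head pose (for example, PnP against a generic face model), so treat the geometry here as illustrative only:

```python
import math

def head_angles(left_eye, right_eye, nose_tip):
    """Estimate yaw from how far the nose tip sits off-centre between
    the eyes, and pitch from how far it sits below the eye line, both
    normalized by the inter-eye distance. A crude stand-in for a real
    3-D pose solver."""
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    mid_y = (left_eye[1] + right_eye[1]) / 2.0
    eye_dist = right_eye[0] - left_eye[0]
    yaw = math.degrees(math.atan2(nose_tip[0] - mid_x, eye_dist))
    pitch = math.degrees(math.atan2(nose_tip[1] - mid_y, eye_dist))
    return yaw, pitch
```

Tracking how these two angles change frame over frame, rather than their absolute values, gives the head movement information that step 207 feeds into the threshold comparison.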
208. The terminal determines the head movement condition according to the head movement information and corrects the fixation point coordinates and the visual feedback identifier according to the head movement condition.
Step 208 in this embodiment is similar to step 105 in the previous embodiments, and is not described herein again.
Referring to fig. 3, an embodiment of a terminal in the present application includes:
a first detection unit 301, configured to detect whether a user is gazing at a screen;
an extraction and calculation unit 302, configured to, when the first detection unit 301 detects that the user is gazing at the screen, extract eyeball feature information of the user through an eyeball tracking technology and calculate fixation point coordinates, wherein the eyeball feature information comprises pupil feature information and iris feature information, and the fixation point coordinates reflect the user's fixation point on the screen;
a display unit 303, configured to display a visual feedback identifier on the screen, wherein the visual feedback identifier is used to prompt the user about the area where the fixation point is located;
a second detection unit 304, configured to detect whether the fixation point is located in a target area, wherein the target area is the area to be operated in the screen;
a setting unit 305, configured to set a head movement range with a preset value as its radius when the second detection unit 304 detects that the fixation point is not within the target area, wherein the head movement range is used to limit the head movement amplitude detected by the terminal, and head movement detected by the terminal within the head movement range constitutes valid head movement information;
a third detection unit 306, configured to detect head movement information of the user within the head movement range, wherein the head movement information comprises yaw angle and pitch angle values of the head in three-dimensional space;
a judging unit 307, configured to determine the head movement condition according to the head movement information;
a correction unit 308, configured to correct the fixation point coordinates and the visual feedback identifier according to the head movement condition.
In this embodiment, when the terminal performs eyeball tracking through the extraction and calculation unit 302 to determine the fixation point, it also determines the final fixation point in combination with the head movement detection performed by the third detection unit 306, which improves the accuracy with which the terminal tracks and determines the user's eyeball fixation point and improves the user experience.
Referring to fig. 4, a terminal in the present application is described in detail below, and another embodiment of the terminal in the present application includes:
a first detection unit 401, where the first detection unit 401 is configured to detect whether a user is gazing at a screen;
an extraction and calculation unit 402, configured to, when the first detection unit 401 detects that the user is gazing at the screen, extract eyeball feature information of the user through an eyeball tracking technology and calculate a fixation point coordinate, where the eyeball feature information includes pupil feature information and iris feature information, and the fixation point coordinate reflects the fixation point of the user on the screen;
a display unit 403, where the display unit 403 is configured to display a visual feedback identifier on the screen, where the visual feedback identifier is used to prompt the user of the area where the fixation point is located;
a second detecting unit 404, where the second detecting unit 404 is configured to detect whether the fixation point is located in a target area, where the target area is an area to be operated in the screen;
a setting unit 405, where the setting unit 405 is configured to set a head movement range with a preset value as a radius when the second detecting unit 404 detects that the fixation point is not within the target area, where the head movement range is used to limit the head movement amplitude detected by the terminal, and the head movement detected by the terminal within the head movement range is effective head movement information;
a third detecting unit 406, where the third detecting unit 406 is configured to detect head movement information of the user in the head movement range, where the head movement information includes a yaw angle value and a pitch angle value of the head in three-dimensional space;
a judging unit 407, where the judging unit 407 is configured to judge a head movement condition according to the head movement information;
a correcting unit 408, where the correcting unit 408 is configured to correct the fixation point coordinate and the visual feedback identifier according to the head movement condition.
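The cooperation of units 404 through 408 can be illustrated with a minimal sketch. Everything here is an assumption for illustration only: the function names, the axis-aligned target area, and in particular the linear angle-to-pixel correction model are not specified by the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned screen region, standing in for the target area to be operated."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def correct_gaze_point(gaze, target: Rect, head_yaw_deg: float,
                       head_pitch_deg: float, px_per_degree: float = 35.0):
    """If the fixation point misses the target area, shift it according to the
    detected head yaw/pitch (a hypothetical linear angle-to-pixel mapping)."""
    gx, gy = gaze
    if target.contains(gx, gy):
        return gaze  # already in the target area, no correction needed
    # Yaw moves the point horizontally, pitch vertically (signs assumed)
    return (gx + head_yaw_deg * px_per_degree,
            gy - head_pitch_deg * px_per_degree)
```

In this sketch the corrected coordinate would then be redrawn as the visual feedback identifier, mirroring the role of the correcting unit 408.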
In this embodiment, the third detecting unit 406 further includes:
a first obtaining module 4061, where the first obtaining module 4061 is configured to obtain coordinates of a plurality of key points of the head;
a tracking module 4062, where the tracking module 4062 is configured to track changes in the coordinates of the plurality of key points within the head movement range;
a second obtaining module 4063, where the second obtaining module 4063 is configured to obtain the head motion information according to the change of the coordinates of the plurality of key points, and the head motion information includes a yaw angle value and a pitch angle value of the head in a three-dimensional space.
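A minimal sketch of how modules 4061 through 4063 might derive yaw and pitch angle values from tracked key points. The three-landmark geometry (nose tip and two eye corners) and the small-angle model are assumptions for illustration; the disclosure does not specify which key points are used or how the angles are computed.

```python
import math


def head_angles(nose, left_eye, right_eye):
    """Estimate (yaw, pitch) in degrees from three tracked key points.
    Assumed model: yaw shifts the nose horizontally off the eye midpoint,
    pitch shifts it vertically; the eye spacing sets the scale."""
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    mid_y = (left_eye[1] + right_eye[1]) / 2.0
    eye_dist = math.dist(left_eye, right_eye)
    # atan2(displacement, eye distance) gives a bounded angle estimate
    yaw = math.degrees(math.atan2(nose[0] - mid_x, eye_dist))
    pitch = math.degrees(math.atan2(nose[1] - mid_y, eye_dist))
    return yaw, pitch
```

A production implementation would more likely fit a full 3D head pose from many landmarks; this sketch only shows how coordinate changes of a few key points can be turned into the yaw and pitch values the third detecting unit 406 reports.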
In this embodiment, the terminal may further include:
a generating unit 409, where the generating unit 409 is configured to generate prompt information when the second detecting unit 404 detects that the fixation point is not within the target area, where the prompt information is used to prompt the user to move the head so that the terminal corrects the fixation point coordinate and the fixation point falls within the target area.
In this embodiment, the functions of each unit and each module correspond to the steps in the embodiment shown in fig. 2, and are not described herein again.
Referring to fig. 5, another embodiment of a terminal according to the present application includes:
a processor 501, a memory 502, an input/output unit 503, and a bus 504;
the processor 501 is connected to the memory 502, the input/output unit 503, and the bus 504;
the processor 501 specifically executes the following operations:
when detecting that a user is watching a screen, extracting eyeball characteristic information of the user through an eyeball tracking technology, and calculating to obtain a fixation point coordinate, wherein the eyeball characteristic information comprises pupil characteristic information and iris characteristic information, and the fixation point coordinate reflects a fixation point of the user on the screen;
displaying a visual feedback identifier on the screen, wherein the visual feedback identifier is used for prompting a user of the area where the point of regard is located;
when the fixation point is detected not to be in a target area, setting a head movement range by taking a preset value as a radius, wherein the target area is an area to be operated in the screen, the head movement range is used for limiting the head movement amplitude detected by a terminal, and the head movement detected by the terminal in the head movement range is effective head movement information;
detecting head movement information of a user in the head movement range, wherein the head movement information comprises a yaw angle value and a pitch angle value of the head in a three-dimensional space;
and judging the head movement condition according to the head movement information, and correcting the fixation point coordinate and the visual feedback identifier according to the head movement condition.
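The "effective head movement information" condition described above — only head movement detected within the preset range counts — can be sketched as a simple filter. The sampling representation (2D key-point positions) and the Euclidean-distance test are assumptions for illustration:

```python
import math


def effective_head_movement(samples, origin, radius):
    """Keep only key-point samples within the preset radius around the
    starting position; movement outside the head movement range is
    discarded rather than used for fixation-point correction."""
    return [p for p in samples if math.dist(origin, p) <= radius]
```

The surviving samples would then feed the judging step, which derives the head movement condition (yaw and pitch) used to correct the fixation point coordinate.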
In this embodiment, the functions of the processor 501 correspond to the steps in the embodiments shown in fig. 1 to fig. 2, and are not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part thereof that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Claims (10)
1. An eye tracking method in conjunction with head motion detection, comprising:
when a terminal detects that a user is watching a screen, the terminal extracts eyeball characteristic information of the user through an eyeball tracking technology and calculates to obtain a fixation point coordinate, wherein the eyeball characteristic information comprises pupil characteristic information and iris characteristic information, and the fixation point coordinate reflects a fixation point of the user on the screen;
the terminal displays a visual feedback identifier on the screen, wherein the visual feedback identifier is used for prompting a user of the area where the point of regard is located;
when the terminal detects that the fixation point is not in a target area, the terminal sets a head movement range by taking a preset value as a radius, the target area is an area to be operated in the screen, the head movement range is used for limiting the head movement amplitude detected by the terminal, and the head movement detected by the terminal in the head movement range is effective head movement information;
the terminal detects head movement information of a user in the head movement range, wherein the head movement information comprises a yaw angle value and a pitch angle value of the head in a three-dimensional space;
and the terminal judges the head movement condition according to the head movement information and corrects the fixation point coordinate and the visual feedback identifier according to the head movement condition.
2. The method of claim 1, wherein the terminal detecting the head movement information of the user in the head movement range comprises:
the terminal acquires coordinates of a plurality of key points of the head;
the terminal tracks the coordinate changes of the key points in the head motion range;
and the terminal acquires the head movement information according to the coordinate change of the key points, wherein the head movement information comprises a yaw angle value and a pitch angle value of the head in a three-dimensional space.
3. The method of claim 1, wherein when the terminal detects that the point of regard is not within a target area, the method further comprises:
and generating prompt information, wherein the prompt information is used for prompting the user to move the head so that the terminal corrects the fixation point coordinate into the target area.
4. The method according to any one of claims 1 to 3, wherein the preset value is determined by equipment resolution precision, equipment inherent precision and user eyeball size.
5. The method according to any one of claims 1 to 3, wherein the terminal displaying a visual feedback indicator on the screen comprises:
the terminal highlights the area where the fixation point is located on the screen;
or,
the terminal indicates the coordinate of the fixation point on the screen by using a mouse icon.
6. A terminal, comprising:
a first detection unit for detecting whether a user is gazing at a screen;
an extraction and calculation unit, used for extracting eyeball characteristic information of the user through an eyeball tracking technology when the first detection unit detects that the user is gazing at the screen, and calculating a fixation point coordinate, wherein the eyeball characteristic information comprises pupil characteristic information and iris characteristic information, and the fixation point coordinate reflects the fixation point of the user on the screen;
the display unit is used for displaying a visual feedback identifier on the screen, and the visual feedback identifier is used for prompting a user of the area where the point of regard is located;
the second detection unit is used for detecting whether the fixation point is positioned in a target area, and the target area is an area to be operated in the screen;
a setting unit, configured to set a head movement range with a preset value as a radius when the second detecting unit detects that the fixation point is not within the target area, where the head movement range is used to limit the head movement amplitude detected by a terminal, and the head movement detected by the terminal within the head movement range is effective head movement information;
the third detection unit is used for detecting head movement information of the user in the head movement range, and the head movement information comprises a yaw angle value and a pitch angle value of the head in a three-dimensional space;
the judging unit is used for judging the head movement condition according to the head movement information;
the correction unit is used for correcting the fixation point coordinates and the visual feedback marks according to the head movement condition.
7. The terminal of claim 6, wherein the third detecting unit comprises:
the first acquisition module is used for acquiring a plurality of key point coordinates of the head;
a tracking module for tracking the coordinate changes of the plurality of key points within the head range of motion;
and the second acquisition module is used for acquiring the head movement information according to the change of the coordinates of the key points, and the head movement information comprises a yaw angle value and a pitch angle value of the head in a three-dimensional space.
8. The terminal of claim 6, further comprising:
and the generating unit is used for generating prompt information when the second detecting unit detects that the fixation point is not in the target area, wherein the prompt information is used for prompting a user to move the head so that the terminal corrects the fixation point coordinate and the fixation point falls into the target area.
9. A terminal, comprising:
the device comprises a processor, a memory, an input and output unit and a bus;
the processor is connected with the memory, the input and output unit and the bus;
the processor specifically performs the following operations:
when a terminal detects that a user is watching a screen, the terminal extracts eyeball characteristic information of the user through an eyeball tracking technology and calculates to obtain a fixation point coordinate, wherein the eyeball characteristic information comprises pupil characteristic information and iris characteristic information, and the fixation point coordinate reflects a fixation point of the user on the screen;
the terminal displays a visual feedback identifier on the screen, wherein the visual feedback identifier is used for prompting a user of the area where the point of regard is located;
when the terminal detects that the fixation point is not in a target area, the terminal sets a head movement range by taking a preset value as a radius, the target area is an area to be operated in the screen, the head movement range is used for limiting the head movement amplitude detected by the terminal, and the head movement detected by the terminal in the head movement range is effective head movement information;
the terminal detects head movement information of a user in the head movement range, wherein the head movement information comprises a yaw angle value and a pitch angle value of the head in a three-dimensional space;
and the terminal judges the head movement condition according to the head movement information and corrects the fixation point coordinate and the visual feedback identifier according to the head movement condition.
10. A computer-readable storage medium, on which a computer program is stored, the computer program being for implementing the method of any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010882804.4A CN112162627A (en) | 2020-08-28 | 2020-08-28 | Eyeball tracking method combined with head movement detection and related device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010882804.4A CN112162627A (en) | 2020-08-28 | 2020-08-28 | Eyeball tracking method combined with head movement detection and related device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112162627A true CN112162627A (en) | 2021-01-01 |
Family
ID=73860436
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010882804.4A Withdrawn CN112162627A (en) | 2020-08-28 | 2020-08-28 | Eyeball tracking method combined with head movement detection and related device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112162627A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113111745A (en) * | 2021-03-30 | 2021-07-13 | 四川大学 | Eye movement identification method based on product attention of openposition |
CN114201054A (en) * | 2022-02-18 | 2022-03-18 | 深圳佑驾创新科技有限公司 | Method for realizing non-contact human-computer interaction based on head posture |
CN114569056A (en) * | 2022-01-28 | 2022-06-03 | 首都医科大学附属北京天坛医院 | Eyeball detection and vision simulation device and eyeball detection and vision simulation method |
CN115228081A (en) * | 2022-06-24 | 2022-10-25 | 珠海金山数字网络科技有限公司 | Virtual scene switching method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112162627A (en) | Eyeball tracking method combined with head movement detection and related device | |
JP6598617B2 (en) | Information processing apparatus, information processing method, and program | |
US9489574B2 (en) | Apparatus and method for enhancing user recognition | |
US11893161B2 (en) | Gesture recognition based on user proximity to a camera | |
US20030076299A1 (en) | Pointing direction calibration in video conferencing and other camera-based system applications | |
JP2013250882A5 (en) | ||
CN109375765B (en) | Eyeball tracking interaction method and device | |
KR101631011B1 (en) | Gesture recognition apparatus and control method of gesture recognition apparatus | |
CN113936324A (en) | Gaze detection method, control method of electronic equipment, and related equipment | |
CN112666705A (en) | Eye movement tracking device and eye movement tracking method | |
CN109634431B (en) | Medium-free floating projection visual tracking interaction system | |
EP3127586B1 (en) | Interactive system, remote controller and operating method thereof | |
CN107422844B (en) | Information processing method and electronic equipment | |
CN110520822B (en) | Control device, information processing system, control method, and program | |
CN113903078A (en) | Human eye gaze detection method, control method and related equipment | |
CN109144262B (en) | Human-computer interaction method, device, equipment and storage medium based on eye movement | |
CN114092985A (en) | A terminal control method, device, terminal and storage medium | |
JPH09167049A (en) | Line of sight input device for console | |
CN109389082B (en) | Line-of-sight collection method, device, system, and computer-readable storage medium | |
KR20130051319A (en) | Apparatus for signal input and method thereof | |
CN108108709B (en) | Identification method and device and computer storage medium | |
CN109960412B (en) | Method for adjusting gazing area based on touch control and terminal equipment | |
CN109917923B (en) | Method for adjusting gazing area based on free motion and terminal equipment | |
WO2018076609A1 (en) | Terminal and method for operating terminal | |
CN114740966A (en) | Multi-modal image display control method and system and computer equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | ||
Application publication date: 20210101 |