CN113495613B - Eyeball tracking calibration method and device - Google Patents

Eyeball tracking calibration method and device

Info

Publication number: CN113495613B (application CN202010191024.5A)
Authority: CN (China)
Prior art keywords: user, calibration, target, calibration coefficient, personal
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN113495613A
Inventors: 张朕, 路伟成
Current assignee: Beijing 7Invensun Technology Co Ltd (also the original assignee; the listed assignees may be inaccurate)
Application filed by Beijing 7Invensun Technology Co Ltd
Priority applications: CN202010191024.5A; PCT/CN2021/079596 (published as WO2021185110A1); JP2022555765A (published as JP2023517380A)
Publication of application CN113495613A; application granted; publication of grant CN113495613B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses an eyeball tracking calibration method and device for completing calibration within a scene that uses line-of-sight positioning interaction, without a separate calibration step. The method comprises the following steps: in one interactive operation, if the user selects a non-line-of-sight positioning interaction mode, starting a background calibration flow; in the background calibration flow, acquiring eye feature information of the user; acquiring the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates; and calculating the current personal calibration coefficient of the user from the acquired eye feature information and the calibration point coordinates. Thus, in the embodiments of the invention, whenever the user positions with a non-line-of-sight positioning interaction mode, a background calibration flow starts at the same time, and the personal calibration coefficient is computed using the position coordinates obtained by that mode as the calibration point coordinates. The background calibration flow is hidden from the user and does not require exiting the current scene.

Description

Eyeball tracking calibration method and device
Technical Field
The invention relates to the technical field of eyeball tracking, in particular to an eyeball tracking calibration method and device.
Background
With the development of science and technology, eyeball tracking technology is increasingly widely applied. For example, it enables positioning interaction through the line of sight in terminal devices related to gaze positioning (gaze positioning devices for short), such as Virtual Reality (VR) headsets, Augmented Reality (AR) devices, and eye-controlled tablet computers.
Because the physiological structure of each user's eyes differs somewhat, in the prior art the user usually has to enter a calibration step before using a gaze positioning device with an eye tracking function: the device acquires the user's eye feature information while the user fixates on one or more calibration points, and the user's personal calibration coefficient is then calculated from the calibration point coordinates and the eye feature information. After calibration, a point-set picture is displayed so that the user can evaluate the calibration effect; if the user subjectively judges that the effect cannot meet the requirements of gaze positioning, recalibration is needed.
After calibration is completed, the user can enter a scene that uses gaze positioning. If the user finds during use that gaze positioning is inaccurate, or the relative position of the head-mounted display and the eyes changes (for example, because the position of the head-mounted display is adjusted), the user has to exit the scene and re-enter the calibration step, which is inconvenient.
Disclosure of Invention
In view of the above problems, the present invention provides an eyeball tracking calibration method and device to complete calibration within a scene that uses line-of-sight positioning interaction, without a separate calibration step.
In order to achieve the above object, the present invention provides the following technical solutions:
an eye tracking calibration method, comprising:
in one interactive operation, if the interactive mode selected by the user is a non-line-of-sight positioning interactive mode, starting a background calibration flow;
in the background calibration flow, acquiring eye feature information of the user;
acquiring the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates;
and calculating the current personal calibration coefficient of the user according to the acquired eye feature information and the calibration point coordinates.
Optionally, before starting the background calibration flow, the method further includes: acquiring eye feature information of the user; calculating the gaze point coordinates of the user according to the acquired eye feature information and a target calibration coefficient, the target calibration coefficient comprising: a system default calibration coefficient or a personal calibration coefficient associated with a user identification of the user; displaying the interaction effect of the functional area to which the gaze point coordinates belong; and, if a non-line-of-sight positioning interaction device is used and positions to another functional area, determining that the interaction mode adopted in the interactive operation is the non-line-of-sight positioning mode.
Optionally, the user identification of the user is a target user identification, and before the gaze point coordinates of the user are calculated using the target calibration coefficient, the method further includes: if the target user identification is associated with a personal calibration coefficient, determining the personal calibration coefficient associated with the target user identification as the target calibration coefficient; if the target user identification is not associated with a personal calibration coefficient, determining the system default calibration coefficient as the target calibration coefficient.
Optionally, after the current personal calibration coefficient is calculated, the method further includes: if the target user identification is associated with a personal calibration coefficient, updating the personal calibration coefficient associated with the target user identification to the current personal calibration coefficient; and if the target user identification is not associated with a personal calibration coefficient, associating the current personal calibration coefficient with the target user identification.
Optionally, the user identification is bound to a biometric feature, and before the gaze point coordinates of the user are calculated using the target calibration coefficient, the method further includes: acquiring the biometric feature of the user; matching the acquired biometric feature against the biometric features of established user identifications; if the matching succeeds, determining the successfully matched user identification as the user identification of the user; if the matching fails, establishing a user identification for the user and binding the established user identification to the acquired biometric feature.
An eye tracking calibration device, comprising:
a background calibration starting unit for: in one interactive operation, if the interactive mode selected by the user is a non-line-of-sight positioning interactive mode, starting a background calibration flow;
the acquisition unit is used for: in the background calibration flow, acquiring eye feature information of the user;
a calibration unit for:
acquiring the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates;
and calculating the current personal calibration coefficient of the user according to the acquired eye feature information and the calibration point coordinates.
Optionally, the device further comprises: a computing unit, configured to acquire eye feature information of the user before the background calibration flow is started, and to calculate the gaze point coordinates of the user according to the acquired eye feature information and a target calibration coefficient, the target calibration coefficient comprising: a system default calibration coefficient or a personal calibration coefficient associated with a user identification of the user; a display unit, configured to display the interaction effect of the functional area to which the gaze point coordinates belong; and a monitoring unit, configured to determine that the interaction mode adopted in the interactive operation is the non-line-of-sight positioning mode if a non-line-of-sight positioning interaction device is used and positions to another functional area.
Optionally, the user identification of the user is a target user identification, and before the gaze point coordinates of the user are calculated using the target calibration coefficient, the computing unit is further configured to: if the target user identification is associated with a personal calibration coefficient, determine the personal calibration coefficient associated with the target user identification as the target calibration coefficient; if the target user identification is not associated with a personal calibration coefficient, determine the system default calibration coefficient as the target calibration coefficient.
Optionally, after the current personal calibration coefficient is calculated, the calibration unit is further configured to: if the target user identification is associated with a personal calibration coefficient, update the personal calibration coefficient associated with the target user identification with the current personal calibration coefficient; and if the target user identification is not associated with a personal calibration coefficient, associate the current personal calibration coefficient with the target user identification.
Optionally, the user identification is bound to a biometric feature, and the device further comprises a biometric identification unit configured to: acquire the biometric feature of the user before the gaze point coordinates of the user are calculated using the target calibration coefficient; match the acquired biometric feature against the biometric features of established user identifications; if the matching succeeds, determine the successfully matched user identification as the user identification of the user; and if the matching fails, establish a user identification for the user and bind the established user identification to the acquired biometric feature.
An eye tracking calibration device, comprising: a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the eye tracking calibration method described in any one of the preceding items.
A storage medium having stored therein computer executable instructions which, when loaded and executed by a processor, implement the eye tracking calibration method steps of any one of the preceding claims.
Thus, in the embodiments of the invention, in a scene where eye tracking is used for interaction, if the user selects a non-line-of-sight positioning interaction mode for positioning, a background calibration flow starts at the same time; in that flow, the personal calibration coefficient is calculated using the position coordinates obtained by the non-line-of-sight positioning as the calibration point coordinates. The background calibration flow is hidden from and imperceptible to the user, who does not need to exit the current scene to recalibrate as in the prior art; this saves calibration time and improves the user experience.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only embodiments of the present invention, and a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a flowchart of an exemplary method for calibrating eye tracking according to an embodiment of the present invention;
FIG. 2a is a schematic diagram of obtaining gaze point coordinates by a line-of-sight positioning interaction manner according to an embodiment of the present invention;
FIG. 2b is a schematic diagram of enlarging the functional area to which the gaze point coordinates belong according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating another exemplary method for calibrating eye tracking according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating another exemplary method for calibrating eye tracking according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating another exemplary method for calibrating eye tracking according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating an exemplary method for calibrating eye tracking according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating an exemplary configuration of an eye tracking calibration apparatus according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating another exemplary configuration of an eye tracking calibration apparatus according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of an eye tracking calibration apparatus according to an embodiment of the present invention.
Detailed Description
The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the protection scope of the present invention.
The terms "first", "second", and the like in the description, the claims, and the drawings are used to distinguish different objects, not necessarily to describe a particular sequence or chronological order. Furthermore, the terms "comprise" and "have", and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements, but may include steps or elements not expressly listed.
The eye tracking calibration method and device provided in the embodiments of the present invention are applied in the field of eye tracking. Eye tracking, also called gaze tracking, is a technique for estimating the gaze direction and/or gaze point of the eyes by measuring eye movement.
The eye tracking calibration device according to the embodiments of the present invention can be deployed in terminal devices related to gaze positioning (gaze positioning devices for short), such as VR systems, AR devices, and eye-controlled tablet computers.
A VR system generally comprises several devices, for example: an all-in-one VR headset plus a handle/remote controller; a PC plus a VR head-mounted display plus a handle/remote controller/mouse (VR software installed on the PC communicates with the headset); or a smart mobile terminal plus a head-mounted display (VR software installed on the terminal communicates with the headset).
The eye tracking calibration device can be deployed in an eye-controlled tablet computer or the like, which may further be provided with an infrared light source and cameras (such as an ambient camera and an infrared camera) for capturing eye images of the user. An eye image here is an image that includes the eyes, such as a frontal or side view of the head, or an image containing only the eyes.
Alternatively, the eye tracking calibration device may be deployed on the VR/AR head-mounted display described above. The head-mounted display is also provided with a gaze tracking device (or eye tracking device). Besides an eye tracking device that captures eye images with a camera and an infrared light source, other kinds can be used. For example, a MEMS device, which may specifically include a MEMS infrared scanning mirror, an infrared light source, and an infrared receiver, can detect eye movement from the captured eye image. As another example, the gaze tracking device may be a capacitive sensor that detects eye movement through the capacitance between the eye and a capacitive pad. As yet another example, the gaze tracking device may be a myoelectric detector that detects eye movement by placing electrodes at the bridge of the nose, the forehead, the ears, or the earlobes and reading the pattern of myoelectric signals.
Based on the common aspects described above, the embodiments of the present application are described in further detail below.
Embodiment One
The existing calibration approach has the following problem: if, in a scene using line-of-sight positioning interaction, the user finds the gaze positioning inaccurate, or the relative position of the head-mounted display and the eyes changes (for example because the position of the display is adjusted), the user must exit the scene to recalibrate.
To solve this problem, the first embodiment of the present application provides an eye tracking calibration method that completes calibration within the scene using line-of-sight positioning interaction, with no separate calibration step required.
Referring to fig. 1, the eye tracking calibration method includes:
s0: the eye tracking function is started.
In one example, the eye tracking function of the gaze location device is on by default.
Of course, the eye tracking function may be activated according to the user's operation.
S1: in one interactive operation, if the interactive mode selected by the user is a non-line-of-sight positioning interactive mode, starting a background calibration flow.
One interactive operation may refer to the following: the user inputs information and operation commands through the input/output system, the system receives and processes them, and the processing result is displayed through the input/output system.
The user can then input further information and operation commands based on the processing result.
Taking a VR scene as an example, the gaze point coordinates are generally obtained by default using the line-of-sight positioning interaction mode (see FIG. 2a).
The functional area to which the gaze point coordinates belong is then enlarged (see FIG. 2b), and interaction effects such as color change and brightening are displayed. Functional areas here include, but are not limited to: icon controls, spatial regions, virtual objects (e.g., in a VR game, gaze positioning may be used to walk to a designated spatial region or to select a virtual object to capture), and so on.
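Purely as an illustration (the patent itself contains no code), the following Python sketch shows one way a computed gaze point could be resolved to the functional area that contains it before the interaction effect is applied; the flat-rectangle representation and all names are assumptions, not part of the invention.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]

@dataclass
class FunctionalArea:
    name: str                                # e.g. an icon control or spatial region
    rect: Tuple[float, float, float, float]  # (x, y, width, height)

    def contains(self, point: Point) -> bool:
        x, y, w, h = self.rect
        px, py = point
        return x <= px <= x + w and y <= py <= y + h

def area_under_gaze(areas: List[FunctionalArea], gaze_xy: Point) -> Optional[FunctionalArea]:
    """Return the functional area containing the gaze point, if any."""
    for area in areas:
        if area.contains(gaze_xy):
            return area  # this area is then enlarged, recolored, or brightened
    return None
```

A real system would more likely query the scene graph of the VR/AR runtime rather than flat rectangles; the sketch only makes the "gaze point belongs to a functional area" relation concrete.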
A mouse, remote controller, handle, or PC keyboard can also be used for human-computer interaction. In contrast to the line-of-sight positioning interaction mode, these interaction modes can be called non-line-of-sight positioning interaction modes, and the mouse, remote controller, handle, keyboard, etc. can be called non-line-of-sight positioning interaction devices.
If the user uses a non-line-of-sight positioning interaction mode in one interactive operation, the background calibration flow is entered; this flow is hidden from the user and imperceptible.
S2: in the background calibration process, eye feature information of a user is acquired.
In one example, the eye feature information may include pupil position coordinates, pupil shape, iris position, iris shape, eyelid position, eye corner positions, position coordinates of a corneal reflection spot, and the like.
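For illustration only, the eye feature information listed above could be grouped in a structure such as the following Python sketch; every field name is an assumption, since the patent does not prescribe a data layout.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Point = Tuple[float, float]

@dataclass
class EyeFeatures:
    pupil_center: Point                   # pupil position coordinates
    pupil_axes: Point                     # pupil shape: (major, minor) axis lengths
    iris_center: Optional[Point] = None   # iris position
    eyelid_y: Optional[float] = None      # eyelid position
    left_corner: Optional[Point] = None   # left eye corner position
    right_corner: Optional[Point] = None  # right eye corner position
    glint: Optional[Point] = None         # corneal reflection spot coordinates
```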
Specifically, if the eye tracking calibration device is deployed in a mobile terminal device, the eye image may be captured by a camera of the mobile terminal device and processed to obtain the eye feature information.
If the eye tracking calibration device is deployed on a VR/AR head-mounted display, the eye image may be captured by the gaze tracking device on the VR/AR head-mounted display and processed to obtain the eye feature information.
S3: the position coordinates obtained by the non-line-of-sight positioning interaction mode are acquired as the calibration point coordinates.
Taking the handle as an example, if the user moves the cursor to a certain position with the handle and clicks to confirm, the position coordinates obtained at that moment are the calibration point coordinates.
Unlike the prior art, embodiments of the present invention neither display fixed calibration points nor require the user to fixate on them. They rely instead on the fact that the user's eyes follow the hands when using a non-line-of-sight positioning interaction device such as a handle or mouse, so the position coordinates obtained by such positioning are where the user is gazing.
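The following sketch illustrates this idea, under the assumption (not stated in the patent) that a handful of confirmed clicks are buffered before a fit is attempted; all names and the threshold are illustrative.

```python
from typing import Any, List, Tuple

Point = Tuple[float, float]

class BackgroundCalibrationBuffer:
    """Collects (eye features, clicked position) pairs during the hidden flow."""

    def __init__(self, min_samples: int = 5) -> None:
        self.samples: List[Tuple[Any, Point]] = []
        self.min_samples = min_samples  # assumed threshold before fitting

    def on_device_confirm(self, eye_features: Any, cursor_xy: Point) -> None:
        # Rationale from the patent: the eyes follow the hand, so the
        # confirmed cursor position approximates the current gaze point
        # and can serve as a calibration point coordinate.
        self.samples.append((eye_features, cursor_xy))

    def ready(self) -> bool:
        return len(self.samples) >= self.min_samples
```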
S4: the current personal calibration coefficient of the user is calculated from the acquired eye feature information and the calibration point coordinates.
The personal calibration coefficients are parameters used in the gaze estimation algorithm to calculate the final gaze result; they are related to the user's pupil radius, corneal curvature, the angle between the visual axis and the optical axis, and so on.
The calculation principle of the personal calibration coefficient is as follows:
according to the acquired eye feature information and the calibration point coordinates, the individual calibration coefficients can be reversely solved by using a sight line estimation algorithm. After the current personal calibration coefficient is obtained, the last personal calibration coefficient can be covered or the default calibration coefficient can be replaced, so that the eyeball tracking precision is more accurate.
Thus, in this embodiment, in a scene where eye tracking is used for interaction, if the user selects a non-line-of-sight positioning interaction mode for positioning, the background calibration flow starts at the same time, and the personal calibration coefficient is calculated in that flow using the position coordinates obtained by the non-line-of-sight positioning as the calibration point coordinates. The background calibration flow can continuously update the user's personal calibration coefficient, making eye tracking more accurate. Moreover, the calibration is hidden from and imperceptible to the user, who does not need to exit the current scene to recalibrate as in the prior art, which saves calibration time and improves the user experience.
Embodiment Two
Typically, eye tracking techniques provide a default calibration coefficient, that is, a calibration coefficient that is reasonably accurate for most users.
Of course, because of individual differences such as eye radius, positioning with the default calibration coefficient may still be inaccurate for a particular user; this is precisely why calibration is performed to obtain a personal calibration coefficient.
A target calibration coefficient can be defined, comprising: the system default calibration coefficient or the personal calibration coefficient associated with the user's user identification.
Initially, the target calibration coefficient is the system default calibration coefficient; after at least one background calibration, it is updated to the latest personal calibration coefficient.
This embodiment describes an exemplary eye tracking calibration flow based on the target calibration coefficient, in a scenario where line-of-sight positioning interaction is used by default. Referring to FIG. 3, the flow may include:
S31: acquire eye feature information of the user, and calculate the gaze point coordinates of the user from the acquired eye feature information and the target calibration coefficient.
For the acquisition method, see step S2 of the first embodiment.
There are various eye tracking algorithms for calculating gaze point coordinates; for example, an algorithm based on pupil position and morphology calculates gaze information from the pupil's principal axis direction and pupil position.
As another example, a feature vector fitting method works as follows:
extract the pupil center position and the center positions of the left and right eye corners;
subtract the left eye corner center position from the pupil center position to obtain feature vector A;
subtract the right eye corner center position from the pupil center position to obtain feature vector B;
construct a mapping function (a set of polynomial equations) between vector A, vector B, and the gaze point;
based on the given feature vectors A and B and the known gaze information, perform polynomial fitting to obtain the unknown coefficients in the mapping function (solving the unknown coefficients is the calibration process);
after the unknown coefficients are obtained, input the currently extracted feature vectors into the mapping function to obtain the current gaze point (the tracking process).
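A minimal Python sketch of this feature vector fitting method is given below, assuming a first-order (affine) polynomial mapping; the patent does not fix the polynomial order, so this choice, like all names, is an assumption.

```python
import numpy as np

def feature_vectors(pupil, left_corner, right_corner):
    """Vectors A and B: pupil center minus each eye corner center."""
    a = np.asarray(pupil, float) - np.asarray(left_corner, float)
    b = np.asarray(pupil, float) - np.asarray(right_corner, float)
    return np.concatenate([a, b])  # (ax, ay, bx, by)

def fit_calibration(features, gaze_points):
    """Calibration: solve the unknown mapping coefficients by least squares.

    features    -- list of (ax, ay, bx, by) vectors, one per calibration point
    gaze_points -- list of the known (x, y) calibration point coordinates
    """
    X = np.hstack([np.asarray(features, float),
                   np.ones((len(features), 1))])    # affine bias column
    Y = np.asarray(gaze_points, float)
    coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)  # (5, 2) coefficient matrix
    return coeffs

def estimate_gaze(coeffs, feature_vec):
    """Tracking: apply the fitted mapping to the current feature vector."""
    x = np.concatenate([np.asarray(feature_vec, float), [1.0]])
    return x @ coeffs  # predicted (x, y) gaze point
```

With higher-order terms added to X, the same least-squares structure accommodates richer mappings; fit_calibration corresponds to the calibration process and estimate_gaze to the tracking process described above.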
Specifically, the feature data may include: pupil position, pupil shape, iris position, iris shape, eyelid position, corner of the eye position, spot position, etc.
Different eye tracking algorithms may require different kinds of feature data. For example, the pupil position and morphology algorithm mentioned above needs the pupil's principal axis direction, the pupil position, and the lengths of the pupil's major and minor axes.
As another example, the feature vector fitting method above needs the pupil center position and the center positions of the left and right eye corners.
Since eye tracking algorithms come in a wide variety, they are not described in detail here.
It should be noted that if the user is a new user, or the user identification of the user is not associated with a personal calibration coefficient, the target calibration coefficient is the system default calibration coefficient.
S32: display the interaction effect of the functional area to which the gaze point coordinates belong.
Specifically, the functional area to which the gaze point coordinates belong can be enlarged (see FIG. 2b), and interaction effects such as color change and brightening can be displayed. The functional area here may be an icon control, a spatial region, a virtual object, etc.
S31 to S32 constitute one interactive operation.
If the user considers that the functional area displaying the interaction effect is the expected one, the user can then confirm entering it by continuing to gaze, pressing a key, and so on.
If the user considers that the functional area displaying the interaction effect is not the expected one (indicating that the accuracy of the gaze point calculation currently cannot meet the use requirement), the user can move the focus on the screen to reposition, using a non-line-of-sight positioning interaction device such as a mouse, remote controller, handle, or even a PC keyboard.
S33: judge whether the background calibration start condition is met; if so, go to S34, otherwise go to S31.
The background calibration start condition may include: a non-line-of-sight positioning interaction device is detected being used and positioning to another functional area.
When background calibration is started, it is determined that the interaction mode adopted in this interactive operation is the non-line-of-sight positioning mode; the background calibration flow is then executed, and the latest personal calibration coefficient is assigned to the target calibration coefficient.
If the background calibration start condition is not met, the current gaze positioning accuracy meets the user's needs and the target calibration coefficient does not need to change; the flow returns to step S31.
S34: start the background calibration flow, and calculate the current personal calibration coefficient of the user from the acquired eye feature information and the calibration point coordinates.
For details, see S2 to S4 above; they are not repeated here.
S35: update the target calibration coefficient with the calculated personal calibration coefficient, and return to S31.
After positioning with the non-line-of-sight positioning interaction device, the line-of-sight positioning interaction mode is resumed.
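Tying the steps together, the following sketch shows one possible shape of the S31 to S35 loop of FIG. 3; every callable is an assumed stand-in wired in by the caller, and the sample threshold is illustrative rather than taken from the patent.

```python
from typing import Any, Callable, List, Optional, Tuple

Point = Tuple[float, float]
Sample = Tuple[Any, Point]  # (eye features, calibration point coordinates)

def interaction_loop(target_coeffs: Any,
                     acquire_eye_features: Callable[[], Any],
                     estimate_gaze: Callable[[Any, Any], Point],
                     show_effect: Callable[[Point], None],
                     poll_device: Callable[[], Optional[Point]],
                     fit: Callable[[List[Sample]], Any],
                     min_samples: int = 5) -> None:
    """One possible shape of the FIG. 3 loop; every callable is a stand-in."""
    samples: List[Sample] = []
    while True:
        feats = acquire_eye_features()                 # S31: acquire eye features
        gaze_xy = estimate_gaze(target_coeffs, feats)  # S31: gaze point from target coeffs
        show_effect(gaze_xy)                           # S32: highlight the functional area
        click_xy = poll_device()                       # S33: start condition: a non-gaze
        if click_xy is not None:                       # device repositioned the focus
            samples.append((feats, click_xy))          # background calibration point
            if len(samples) >= min_samples:            # S34: compute personal coefficients
                target_coeffs = fit(samples)           # S35: update target coefficients
                samples.clear()
```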
In this embodiment, in each interactive operation based on line-of-sight positioning, the gaze point coordinates are calculated using the default calibration coefficient or the user's associated personal calibration coefficient, and the functional area to which the gaze point coordinates belong is highlighted.
If the highlighted functional area is the one the user expects, the user can confirm entering it by continuing to gaze, pressing a key, and so on.
If the highlighted functional area is not the one the user expects, the user can reposition with a non-line-of-sight positioning interaction device such as a handle, and background calibration starts at the same time. Thus, while completing the human-computer interaction, this embodiment also determines whether to start background calibration.
Embodiment Three
Embodiment two mentioned the user identification of the user. For ease of reference, the user identification of the current user may be called the target user identification.
Before calculating the gaze point coordinates of the user using the target calibration coefficients, the method may further comprise the steps of:
if the target user identification is associated with the personal calibration coefficient, determining the personal calibration coefficient associated with the target user identification as the target calibration coefficient;
if the target user identification is not associated with a personal calibration coefficient, the system default calibration coefficient is determined as the target calibration coefficient.
In addition, after the current personal calibration coefficient is calculated, the following operations can be performed:
if the target user identification is associated with the personal calibration coefficient, updating the personal calibration coefficient associated with the target user identification into the current personal calibration coefficient;
And if the target user identification is not associated with the personal calibration coefficient, associating the current personal calibration coefficient with the target user identification.
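These association rules amount to a simple keyed store with the system default as fallback, as the following hedged Python sketch shows; the storage model and all names are assumptions.

```python
from typing import Any, Dict

SYSTEM_DEFAULT_COEFFS: Any = "system-default"  # placeholder for real coefficient values

class CalibrationStore:
    """Personal calibration coefficients keyed by user identification."""

    def __init__(self) -> None:
        self._by_user: Dict[str, Any] = {}

    def target_coefficients(self, user_id: str) -> Any:
        # Personal coefficients if the target user identification has any
        # associated; otherwise the system default calibration coefficient.
        return self._by_user.get(user_id, SYSTEM_DEFAULT_COEFFS)

    def save_current(self, user_id: str, coeffs: Any) -> None:
        # Covers both rules above: update if already associated,
        # associate if not.
        self._by_user[user_id] = coeffs
```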
The third embodiment describes how to perform eye tracking calibration when the target user identification is initially not associated with a personal calibration coefficient. Referring to FIG. 4, the flow may include:
S41: the target user identification is not associated with a personal calibration coefficient, so the system default calibration coefficient is determined as the target calibration coefficient.
S42-S45 are similar to S31-S34 described above and are not described in detail herein.
S46: associate the calculated current personal calibration coefficient with the target user identification.
S47: update the target calibration coefficient with the calculated personal calibration coefficient, and return to S41.
After positioning with the non-line-of-sight positioning interaction device, the line-of-sight positioning interaction mode is resumed.
This embodiment achieves the following: when the user is not associated with a personal calibration coefficient, the default calibration coefficient is determined as the target calibration coefficient for calculating the gaze point coordinates, and the functional area to which the gaze point coordinates belong is highlighted. If the highlighted functional area is not the one the user expects, the user can reposition with a non-line-of-sight positioning interaction device such as a handle, and background calibration starts at the same time.
The personal calibration coefficient obtained by background calibration is associated with the target user identification and used in the next interactive operation. The next time the gaze positioning mode is used, the gaze point coordinates can therefore be calculated with the most recently computed personal calibration coefficient, realizing the transition from the default calibration coefficient to a more accurate personal one and providing the user with a more personalized and accurate gaze positioning service.
Embodiment Four
The fourth embodiment describes the eye tracking calibration flow when the target user identification already has an associated personal calibration coefficient. Referring to FIG. 5, the flow may include:
S51: the target user identification has an associated personal calibration coefficient, which is determined as the target calibration coefficient.
S52-S55 are similar to S31-S34 described above and are not described in detail herein.
S56: update the personal calibration coefficient associated with the target user identification to the current personal calibration coefficient.
S57: update the target calibration coefficient with the calculated personal calibration coefficient, and return to S51.
After positioning with the non-line-of-sight positioning interaction device, the line-of-sight positioning interaction mode is resumed.
This embodiment achieves the following: when the user already has an associated personal calibration coefficient, determining it as the target calibration coefficient provides the user with a more personalized and accurate gaze positioning service.
In addition, if background calibration is started, the newly calculated personal calibration coefficient is associated with the target user identification and used in the next interactive operation. The personal calibration coefficient is thus updated adaptively, further improving the personalization and accuracy of the gaze positioning service.
Embodiment Five
By default, the users of a gaze positioning device may be treated as the same person. Alternatively, considering that several people may use the same device, different users can be distinguished by techniques such as biometric identification.
In a specific implementation, a user identification can be bound to a biometric feature: different biometric features correspond to different user identifications, and different user identifications characterize different users, thereby distinguishing them.
Referring to FIG. 6, an eye tracking calibration method based on biometric features includes the following steps:
S60: acquire the biometric feature of the current user.
The biometric feature here may include, for example: an iris, a fingerprint, a voiceprint, or even facial features.
S61: match the acquired biometric feature against the biometric features of the established user identifications.
S62: if the matching succeeds, determine the successfully matched user identification as the user identification of the current user (the target user identification).
For example, if the successfully matched user identification is 000010, then "000010" is determined as the user identification of the current user.
S63: if the matching fails, establish a user identification (the target user identification) for the user, and bind the established user identification to the acquired biometric feature.
The user identification successfully matched in step S62, or established in step S63, is the target user identification referred to in the foregoing embodiments.
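As a sketch of S60 to S63 only, the following assumes biometric features are embedding vectors compared by Euclidean distance; the threshold, the identification format (modeled on the "000010" example above), and all names are illustrative, not the patent's.

```python
import numpy as np

class UserRegistry:
    """Binds user identifications to enrolled biometric features."""

    def __init__(self, match_threshold: float = 0.3) -> None:
        self._features = {}  # user identification -> enrolled feature vector
        self._next_id = 1
        self._threshold = match_threshold

    def identify(self, feature: np.ndarray) -> str:
        best_id, best_dist = None, float("inf")
        for uid, enrolled in self._features.items():          # S61: match against the
            dist = float(np.linalg.norm(feature - enrolled))  # established identifications
            if dist < best_dist:
                best_id, best_dist = uid, dist
        if best_id is not None and best_dist <= self._threshold:
            return best_id               # S62: matching succeeded
        uid = f"{self._next_id:06d}"     # S63: establish a new user identification
        self._next_id += 1               # and bind it to the acquired feature
        self._features[uid] = feature
        return uid
```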
S64: judge whether the target user identification is associated with a personal calibration coefficient; if so, go to S65, otherwise go to S66.
S65: determine the personal calibration coefficient associated with the target user identification as the target calibration coefficient.
S66: determine the system default calibration coefficient as the target calibration coefficient.
S67: calculate the gaze point coordinates of the user using the target calibration coefficient.
Specifically, the eye feature information of the user is first acquired, and the gaze point coordinates of the user are calculated from the acquired eye feature information and the target calibration coefficient.
For details, see the foregoing description; they are not repeated here.
S68-S610 are similar to S32-S34 described above and are not described in detail herein.
S611: if the target user identification is associated with a personal calibration coefficient, update the personal calibration coefficient associated with the target user identification to the current personal calibration coefficient.
S612: if the target user identification is not associated with a personal calibration coefficient, associate the current personal calibration coefficient with the target user identification.
S613: update the target calibration coefficient with the calculated personal calibration coefficient, and return to S67.
In this embodiment, after the biometric feature of the user (such as the iris) is extracted, it is matched against the biometric features of the established user identifications to determine whether the user is a new user. If it matches an established user identification, the successfully matched identification is determined as the user identification of the current user, and in the interactive operation the gaze point coordinates are calculated using the personal calibration coefficient of that identification.
If the user is a new user, a user identification is established for the new user, and in the first interactive operation the gaze point coordinates are calculated using the system default calibration coefficient.
Using biometric features to distinguish users helps provide personalized, more accurate gaze positioning services for different users. Consider the following scenario:
Assume a mobile device is used by several family members. Member A performs interactive operations, and a personal calibration coefficient is calculated. Member B then uses the same device; without distinguishing users, member A's personal calibration coefficient would be used directly, and positioning may be inaccurate. Moreover, if background calibration is triggered, the newly calculated personal calibration coefficient actually belongs to member B.
If member A then uses the device again, still without distinction, the gaze point coordinates would be calculated with member B's personal calibration coefficient, and positioning accuracy would again suffer.
If different users are distinguished by biometric identification, their personal calibration coefficients are not mixed up, and personalized gaze positioning services can be provided for each of them.
Embodiment Six
This embodiment provides an eye tracking calibration device; referring to FIG. 7, the device includes:
a background calibration starting unit 1, configured to start the background calibration flow of the calibration unit 3 if, in one interactive operation, the interaction mode selected by the user is a non-line-of-sight positioning interaction mode;
the acquisition unit 2 is used for acquiring the eye feature information of the user in a background calibration process;
in the background calibration flow, the calibration unit 3 is configured to:
acquiring the position coordinates obtained by the non-line-of-sight positioning interaction mode as calibration point coordinates;
and calculating to obtain the current personal calibration coefficient of the user according to the acquired eye feature information and the calibration point coordinates.
The eye tracking calibration device according to this embodiment can be deployed in terminal devices related to gaze positioning (gaze positioning devices for short), such as VR systems, AR devices, and eye-controlled tablet computers, to execute the eye tracking calibration method.
The specific content and the beneficial effects are shown in the foregoing description, and are not repeated here.
In other embodiments, referring to FIG. 8, the eye tracking calibration device in all the above embodiments may further include:
the calculating unit 4 is used for acquiring eye feature information of the user before starting a background calibration process, and calculating to obtain the gaze point coordinate of the user according to the acquired eye feature information and the target calibration coefficient;
wherein the target calibration coefficient comprises: the system default calibration coefficient or a personal calibration coefficient associated with the user identification of the user;
The display unit 5 is used for displaying the interaction effect of the functional area to which the gaze point coordinates belong;
and a monitoring unit 6, configured to determine that the interaction mode adopted in the interactive operation is the non-line-of-sight positioning mode if a non-line-of-sight positioning interaction device is used and positions to another functional area.
The specific content and the beneficial effects are shown in the foregoing description, and are not repeated here.
The user identification of the user may be referred to as a target user identification.
In other embodiments of the present invention, after calculating the current personal calibration coefficient, the calibration unit 3 in all the above embodiments may be further configured to:
if the target user identification is associated with the personal calibration coefficient, updating the personal calibration coefficient associated with the target user identification by using the current personal calibration coefficient;
and if the target user identification is not associated with the personal calibration coefficient, associating the current personal calibration coefficient with the target user identification.
The specific content and the beneficial effects are shown in the foregoing description, and are not repeated here.
In other embodiments of the present invention, the calculating unit 4 in all the above embodiments may be further configured to:
if the target user identification is associated with the personal calibration coefficient, determining the personal calibration coefficient associated with the target user identification as the target calibration coefficient;
if the target user identification is not associated with a personal calibration coefficient, the system default calibration coefficient is determined as the target calibration coefficient.
The specific content and the beneficial effects are shown in the foregoing description, and are not repeated here.
The user identification may be bound to a biometric feature. In other embodiments of the present invention, referring to FIG. 9, the apparatus may further include:
a biometric identification unit 7 for:
acquire the biometric feature of the user before the gaze point coordinates of the user are calculated using the target calibration coefficient;
match the acquired biometric feature against the biometric features of the established user identifications;
if the matching succeeds, determine the successfully matched user identification as the user identification of the user;
if the matching fails, establish a user identification for the user, and bind the established user identification to the acquired biometric feature.
The specific content and the beneficial effects are shown in the foregoing description, and are not repeated here.
Embodiment Seven
The embodiment of the present application provides a storage medium having stored therein computer-executable instructions that, when loaded and executed by a processor, implement the steps of the eye tracking calibration method of any one of the first to sixth embodiments.
Embodiment Eight
An embodiment of the present application provides a processor, where the processor is configured to execute a program, and the program executes the steps of the eye tracking calibration method according to any one of the first to sixth embodiments.
Embodiment Nine
This embodiment of the application provides an eye tracking calibration device, comprising a processor, a memory, and a program stored in the memory and runnable on the processor; when the processor executes the program, the steps of the eye tracking calibration method of any one of the first to sixth embodiments can be performed.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.

Claims (8)

1. An eye tracking calibration method, comprising:
in one interactive operation, acquiring eye feature information of a user;
calculating gaze point coordinates of the user according to the acquired eye feature information and a target calibration coefficient; the target calibration coefficient comprises: a system default calibration coefficient or a personal calibration coefficient associated with a user identification of the user;
displaying an interaction effect of a functional area to which the gaze point coordinates belong, wherein the interaction effect comprises enlargement, color change, and brightening;
if a non-line-of-sight positioning interaction device is used and positions to another functional area, determining that the interaction mode adopted by the interactive operation is a non-line-of-sight positioning mode;
if the interaction mode selected by the user is a non-line-of-sight positioning interaction mode, starting a background calibration flow;
in the background calibration flow, acquiring eye feature information of the user;
acquiring position coordinates obtained by positioning in the non-line-of-sight positioning interaction mode as calibration point coordinates;
and calculating the current personal calibration coefficient of the user according to the acquired eye feature information and the calibration point coordinates.
2. The method of claim 1, wherein,
the user identification of the user is a target user identification;
before the target calibration coefficient is used for calculating the gaze point coordinates of the user, the method further comprises the following steps:
if the target user identification is associated with the personal calibration coefficient, determining the personal calibration coefficient associated with the target user identification as the target calibration coefficient;
and if the target user identification is not associated with the personal calibration coefficient, determining the default calibration coefficient of the system as the target calibration coefficient.
3. The method of claim 2, further comprising, after calculating the current personal calibration coefficient:
if the target user identifier is associated with the personal calibration coefficient, updating the personal calibration coefficient associated with the target user identifier to the current personal calibration coefficient;
and if the target user identifier is not associated with the personal calibration coefficient, associating the current personal calibration coefficient with the target user identifier.
4. The method of any one of claims 1 to 3, wherein
the user identification is bound to a biometric feature;
before the target calibration coefficient is used for calculating the gaze point coordinates of the user, the method further comprises the following steps:
acquiring a biometric feature of the user;
matching the acquired biometric feature with biometric features of established user identifications;
if the matching is successful, determining the successfully matched user identification as the user identification of the user;
if the matching fails, establishing a user identification for the user, and binding the established user identification with the acquired biometric feature.
5. An eye tracking calibration device, comprising:
a background calibration starting unit for: in one interactive operation, if the interaction mode selected by the user is a non-line-of-sight positioning interaction mode, starting a background calibration flow;
an acquisition unit for: in the background calibration flow, acquiring eye feature information of the user;
a calibration unit for:
acquiring position coordinates obtained by positioning in the non-line-of-sight positioning interaction mode as calibration point coordinates;
calculating the current personal calibration coefficient of the user according to the acquired eye feature information and the calibration point coordinates;
a calculation unit for:
before starting the background calibration flow, acquiring eye feature information of a user;
calculating the gaze point coordinates of the user according to the acquired eye feature information and a target calibration coefficient, wherein the target calibration coefficient comprises: a system default calibration coefficient or a personal calibration coefficient associated with a user identification of the user;
a display unit for displaying an interaction effect of the functional area to which the gaze point coordinates belong, wherein the interaction effect comprises expansion, color change, and brightening;
and a monitoring unit for determining, if a non-line-of-sight positioning interactive device is used and is positioned to another functional area, that the interaction mode adopted by the interactive operation is the non-line-of-sight positioning mode.
6. The apparatus of claim 5, wherein,
the user identification of the user is a target user identification;
before the target calibration coefficient is used for calculating the gaze point coordinates of the user, the calculation unit is further configured to:
if the target user identification is associated with the personal calibration coefficient, determining the personal calibration coefficient associated with the target user identification as the target calibration coefficient;
and if the target user identification is not associated with the personal calibration coefficient, determining the default calibration coefficient of the system as the target calibration coefficient.
7. The apparatus of claim 6, wherein after calculating the current personal calibration coefficients, the calibration unit is further to:
if the target user identifier is associated with the personal calibration coefficient, updating the personal calibration coefficient associated with the target user identifier to the current personal calibration coefficient;
and if the target user identifier is not associated with the personal calibration coefficient, associating the current personal calibration coefficient with the target user identifier.
8. The apparatus of any one of claims 5 to 7, wherein
the user identification is bound to a biometric feature;
the apparatus further comprises:
a biometric identification unit for:
acquiring a biometric feature of the user before the target calibration coefficient is used for calculating the gaze point coordinates of the user;
matching the acquired biometric feature with biometric features of established user identifications;
if the matching is successful, determining the successfully matched user identification as the user identification of the user;
if the matching fails, establishing a user identification for the user, and binding the established user identification with the acquired biometric feature.
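For illustration only (this sketch is not part of the claims): the following minimal Python example shows one way the claimed background-calibration flow could be realized. It assumes eye feature information is a two-dimensional feature vector (for example, a pupil-glint offset) and that a calibration coefficient is an affine mapping fitted by least squares; the model choice and all names (estimate_gaze, fit_personal_coefficients, CalibrationStore, on_non_gaze_click) are hypothetical and not taken from the patent.

import numpy as np

# Hypothetical system default calibration coefficient (identity affine map).
DEFAULT_COEFFS = np.array([[1.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0]])

def estimate_gaze(eye_feature, coeffs):
    # Map a 2-D eye feature to gaze point coordinates: g = C @ [f; 1].
    f = np.append(np.asarray(eye_feature, dtype=float), 1.0)
    return coeffs @ f

def fit_personal_coefficients(features, calibration_points):
    # Least-squares fit of the affine map from (feature, point) pairs
    # collected while the user positions with a non-line-of-sight device.
    F = np.hstack([np.asarray(features, dtype=float),
                   np.ones((len(features), 1))])
    P = np.asarray(calibration_points, dtype=float)
    X, *_ = np.linalg.lstsq(F, P, rcond=None)  # solve F @ X ~= P
    return X.T

class CalibrationStore:
    # Associates personal calibration coefficients with user identifications.
    def __init__(self):
        self._by_user = {}

    def target_coeffs(self, user_id):
        # Personal coefficient if one is associated, else the system default.
        return self._by_user.get(user_id, DEFAULT_COEFFS)

    def update(self, user_id, coeffs):
        # Create or overwrite the personal coefficient for this user.
        self._by_user[user_id] = coeffs

store = CalibrationStore()
_features, _points = [], []

def on_non_gaze_click(user_id, eye_feature, click_xy):
    # Background calibration: each position selected with a non-line-of-sight
    # device serves as a calibration point, paired with the eye feature
    # captured at that moment; the user never sees a calibration screen.
    _features.append(eye_feature)
    _points.append(click_xy)
    if len(_features) >= 3:  # an affine map needs at least 3 point pairs
        store.update(user_id, fit_personal_coefficients(_features, _points))

# Gaze estimation then uses whichever coefficient is associated with the user:
gaze_xy = estimate_gaze([0.12, -0.05], store.target_coeffs("user-1"))

The device claims would wrap the same logic in units, and the biometric matching of claims 4 and 8 amounts to resolving user_id before calling target_coeffs, with the fallback to the system default coefficient corresponding to claims 2 and 6.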
CN202010191024.5A 2020-03-18 2020-03-18 Eyeball tracking calibration method and device Active CN113495613B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010191024.5A CN113495613B (en) 2020-03-18 2020-03-18 Eyeball tracking calibration method and device
PCT/CN2021/079596 WO2021185110A1 (en) 2020-03-18 2021-03-08 Method and device for eye tracking calibration
JP2022555765A JP2023517380A (en) 2020-03-18 2021-03-08 Eye tracking calibration method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010191024.5A CN113495613B (en) 2020-03-18 2020-03-18 Eyeball tracking calibration method and device

Publications (2)

Publication Number Publication Date
CN113495613A CN113495613A (en) 2021-10-12
CN113495613B true CN113495613B (en) 2023-11-21

Family

ID=77769930

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010191024.5A Active CN113495613B (en) 2020-03-18 2020-03-18 Eyeball tracking calibration method and device

Country Status (3)

Country Link
JP (1) JP2023517380A (en)
CN (1) CN113495613B (en)
WO (1) WO2021185110A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114047822B (en) * 2021-11-24 2023-12-19 京东方科技集团股份有限公司 Near-to-eye display method and system
CN116311396B (en) * 2022-08-18 2023-12-12 荣耀终端有限公司 Method and device for fingerprint identification
CN116700500B (en) * 2023-08-07 2024-05-24 江西科技学院 Multi-scene VR interaction method, system and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045374A (en) * 2014-04-22 2015-11-11 联想(新加坡)私人有限公司 Automatic gaze calibration
CN106406509A (en) * 2016-05-16 2017-02-15 上海青研科技有限公司 Head-mounted eye control virtual reality device
CN109410285A (en) * 2018-11-06 2019-03-01 北京七鑫易维信息技术有限公司 A kind of calibration method, device, terminal device and storage medium
CN110427101A (en) * 2019-07-08 2019-11-08 北京七鑫易维信息技术有限公司 Calibration method, device, equipment and the storage medium of eyeball tracking

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3624023B2 (en) * 1995-07-14 2005-02-23 キヤノン株式会社 Line-of-sight detection apparatus and optical apparatus comprising line-of-sight detection apparatus
JP5664064B2 (en) * 2010-09-22 2015-02-04 富士通株式会社 Gaze detection device and correction coefficient calculation program
US9189095B2 (en) * 2013-06-06 2015-11-17 Microsoft Technology Licensing, Llc Calibrating eye tracking system by touch input
JP2015046111A (en) * 2013-08-29 2015-03-12 株式会社Jvcケンウッド Viewpoint detection device and viewpoint detection method
US9760772B2 (en) * 2014-03-20 2017-09-12 Lc Technologies, Inc. Eye image stimuli for eyegaze calibration procedures
JP2017134558A (en) * 2016-01-27 2017-08-03 ソニー株式会社 Information processor, information processing method, and computer-readable recording medium recorded with program
CN105843397A (en) * 2016-04-12 2016-08-10 公安部上海消防研究所 Virtual reality interactive system based on pupil tracking technology
CN109976535B (en) * 2019-05-05 2022-12-02 北京七鑫易维信息技术有限公司 Calibration method, device, equipment and storage medium

Also Published As

Publication number Publication date
JP2023517380A (en) 2023-04-25
CN113495613A (en) 2021-10-12
WO2021185110A1 (en) 2021-09-23

Similar Documents

Publication Publication Date Title
CN110460837B (en) Electronic device with foveal display and gaze prediction
CN109086726B (en) Local image identification method and system based on AR intelligent glasses
US9791927B2 (en) Systems and methods of eye tracking calibration
EP3427185B1 (en) Blue light adjustment for biometric security
CN109154983B (en) Head mounted display system configured to exchange biometric information
US9953214B2 (en) Real time eye tracking for human computer interaction
JP7676400B2 (en) SYSTEM AND METHOD FOR OPERATING A HEAD MOUNTED DISPLAY SYSTEM BASED ON USER IDENTIFICATION - Patent application
CN113495613B (en) Eyeball tracking calibration method and device
CN109410285B (en) Calibration method, calibration device, terminal equipment and storage medium
CN109032351B (en) Fixation point function determination method, fixation point determination device and terminal equipment
CN112667069A (en) Method for automatically identifying at least one user of an eye tracking device and eye tracking device
US20180004287A1 (en) Method for providing user interface through head mounted display using eye recognition and bio-signal, apparatus using same, and computer readable recording medium
US20140085189A1 (en) Line-of-sight detection apparatus, line-of-sight detection method, and program therefor
CN109254662A (en) Mobile device operation method, apparatus, computer equipment and storage medium
Sun et al. Real-time gaze estimation with online calibration
CN120085754A (en) Eye tracking device and method thereof
JPWO2018220963A1 (en) Information processing apparatus, information processing method, and program
CN106462230A (en) Method and system for operating a display apparatus
De Buyser et al. Exploring the potential of combining smart glasses and consumer-grade EEG/EMG headsets for controlling IoT appliances in the smart home
CN110174937A (en) Watch the implementation method and device of information control operation attentively
CN113491502A (en) Eyeball tracking calibration inspection method, device, equipment and storage medium
CN109976528A (en) A kind of method and terminal device based on the dynamic adjustment watching area of head
CN115997159B (en) Method for determining the position of the center of rotation of an eye using a mobile device
KR102731936B1 (en) Method and device to determine trigger intent of user
CN110399930B (en) Data processing method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant