
CN102043952A - Eye-gaze tracking method based on double light sources - Google Patents

Eye-gaze tracking method based on double light sources

Info

Publication number
CN102043952A
CN102043952A
Authority
CN
China
Prior art keywords
point
image
screen
gaze point
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010618752
Other languages
Chinese (zh)
Other versions
CN102043952B (en
Inventor
孙建德
杨彩霞
刘琚
张杰
杨晓晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201010618752A priority Critical patent/CN102043952B/en
Publication of CN102043952A publication Critical patent/CN102043952A/en
Application granted granted Critical
Publication of CN102043952B publication Critical patent/CN102043952B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Eye Examination Apparatus (AREA)

Abstract


The invention discloses a gaze tracking method based on dual light sources. The specific steps are as follows: (1) Image preprocessing: using the grayscale difference between the pupil and the reflection points, the captured face image is preprocessed to extract the two reflection points and the pupil region in the eye image, and the coordinates of the two reflection points and of the pupil center are computed in the image coordinate system. (2) Gaze point estimation: the triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the gaze point and the two infrared light sources on the screen are treated as a pair of approximately similar triangles, which determines the approximate position of the gaze point on the screen. (3) Gaze point correction yields the final gaze point position. The method does not require measuring the distance between the user and the screen, has low computational complexity, is robust to head movement, is natural and comfortable for the user, is easy to implement, and its estimation error is within the tolerance of practical gaze tracking applications.

Description

A gaze tracking method based on dual light sources
Technical field
The present invention relates to a gaze tracking method based on dual light sources, and belongs to the technical field of video and multimedia signal processing.
Background art
With the rapid development of computer science and industry, the number of home computer users has surged, and intelligent human-computer interaction is drawing growing attention. As an important channel through which humans acquire external information, gaze can also realize its great potential in intelligent human-computer interaction and become an effective tool for intelligent interaction. For example, we can express our subjective intent through gaze and thereby directly control things in the external world.
T. Hutchinson et al. first proposed the idea of interacting with a computer through gaze in 1989. Its attractive application prospects have drawn numerous researchers at home and abroad to study gaze tracking from aspects such as image preprocessing, mapping algorithms, and calibration. Today, gaze tracking technology is increasingly mature and its applications continue to gain acceptance: from assistance for the disabled, evaluation of appearance designs, and game development to 3D free-viewpoint video, it brings convenience to our lives and effectively promotes the development of intelligent science and technology.
Gaze tracking methods include electro-oculography (EOG), the iris-sclera boundary, corneal reflection, and the pupil-corneal reflection vector, among others. Of these, the pupil-corneal reflection vector method is favored for its higher precision and user comfort. In 2001, D. H. Yoo et al. proposed a gaze tracking system using four light sources based on the cross-ratio invariance algorithm, which can estimate gaze fairly accurately. However, constrained by hardware integration, execution speed, and application prospects, researchers increasingly prefer to estimate gaze with fewer light sources. At present, at least two infrared light sources are needed to achieve reasonably accurate gaze estimation under free head movement, and existing dual-light-source gaze tracking methods are computationally complex; even when head movement is allowed, the permitted range of movement is severely restricted. Finding a simple and practical gaze tracking mapping method therefore has important theoretical significance and great practical value.
Summary of the invention
To address the computational complexity of existing dual-light-source gaze tracking methods and their strong restriction on head movement, the invention provides a dual-light-source gaze tracking method with low computational complexity, little restriction on head movement, and strong practicality.
According to the gaze characteristics of the human eye and the imaging principle of the camera, as shown in Fig. 4, the lines connecting each light source L1, L2 on the screen with its corneal reflection point V01, V02, and the line connecting the gaze point Q on the screen with the pupil center P0, all meet at the corneal curvature center C of the eye. Moreover, by the principle of spherical refraction, the reflection points V01, V02 of the two infrared sources on the cornea and the pupil center P0 lie in the same radial directions as their refracted images V1, V2, and P; therefore V1, V2, and P can replace V01, V02, and P0 respectively without affecting the estimated gaze point.
During construction of the mapping, because the eyeball is a sphere, the reflection points V1, V2 formed by the two infrared sources and the refracted pupil center differ slightly in their Z coordinates. Analysis shows that the Z coordinates of these three points can be taken as approximately equal; the error this introduces into the gaze estimate is at most 0.9887 cm along the X axis and at most 0.7852 cm along the Y axis, and the error exhibits a regional distribution. A secondary calibration can therefore be combined to reduce the error caused by the approximation, so that the finally estimated gaze point lies within the tolerance acceptable in practical applications.
The specific steps of the dual-light-source gaze tracking method of the present invention are as follows:
(1) Image preprocessing: using the grayscale difference between the pupil and the reflection points, preprocess the captured face image, extract the two reflection points and the pupil region in the eye image, and compute the coordinates of the two reflection points and the pupil center in the image coordinate system, whose origin is the upper-left corner of the image, with the X axis horizontal and the Y axis vertical;
(2) Gaze point estimation: treat the triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the gaze point and the two infrared light sources on the screen as approximately similar, and thereby determine the approximate position of the gaze point on the screen;
(3) Gaze point correction: use a secondary calibration to correct the inherent deviation between the visual axis and the optical axis of the eye and the estimation error caused by the approximation, and obtain the final gaze point position.
The specific implementation of step (2), gaze point estimation, is:
A. From the gaze characteristics of the human eye and the imaging principle of the camera, derive the approximate similarity between the triangle formed by the pupil center and the two reflection points and the triangle formed by the gaze point and the two infrared light sources on the screen;
B. Estimate the approximate position of the gaze point on the screen from the triangle similarity.
The specific implementation of step (3), gaze point correction, is:
A. Region division: divide the whole screen into five regions according to the number and distribution of the calibration points;
B. One-point calibration: perform a one-point calibration using the calibration point at the center of the screen;
C. Region location: determine the region containing the gaze point from the position of the gaze point after the one-point calibration and its positional relation to each calibration point;
D. Secondary calibration: according to the region containing the gaze point, apply a secondary calibration to the once-calibrated gaze point using the calibration point of that region, obtaining the final gaze point position.
The present invention treats the triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the gaze point and the two infrared sources on the screen as approximately similar to determine the approximate gaze position on the screen, then applies a secondary calibration to correct the inherent deviation between the visual axis and the optical axis of the eye and the estimation error introduced by the approximation, yielding a fairly accurate gaze estimate. The method needs no measurement of the distance from the head to the screen, has low computational complexity, is robust to head movement, is natural and comfortable in use, is easy to implement, and is convenient to integrate into actual products.
Description of drawings
Fig. 1 is the hardware diagram of a system implementing the method of the invention.
Fig. 2 is the flowchart of the method of the invention.
Fig. 3 is a schematic diagram of the image preprocessing procedure.
Fig. 4 is a schematic diagram of camera imaging and human gaze.
Fig. 5 is a schematic diagram of the approximate coordinate substitution in the method.
Fig. 6 is a schematic diagram of the two approximately similar triangles.
Fig. 7 is a schematic diagram of the calibration point distribution and screen region division.
Fig. 8 is a schematic diagram of the gaze estimation results.
Fig. 9 is a schematic diagram of the gaze estimation accuracy analysis.
Embodiment
The hardware system implementing the dual-light-source gaze tracking method of the present invention is shown in Fig. 1; it comprises a grayscale camera, two infrared light sources, and a personal computer. The personal computer uses a 2.60 GHz Pentium Dual Core processor, the screen measures 34 × 27 cm (width × height), and a grayscale camera with a resolution of 694 × 1040 is mounted below the screen and kept stationary throughout the experiment. A 1-watt infrared light source is mounted at each of the lower-left and lower-right corners of the display (L1 and L2 respectively). The subject sits 60-70 cm from the screen, and the head may move within a range of 20 × 20 × 10 cm (width × length × depth).
Fig. 2 shows the flowchart of the dual-light-source gaze tracking method of the present invention. Following this flow, the concrete implementation steps are as follows:
1. Determine the coordinates of the reflection point centers and the pupil center in the image coordinate system by image preprocessing, as shown in Fig. 3. Using the grayscale difference between the pupil and the corneal reflection points, extract the pupil and reflection point regions and obtain the coordinates of their centers in the image coordinate system: p_i(p_ix, p_iy), v_1i(v_1ix, v_1iy), v_2i(v_2ix, v_2iy). The image coordinate system takes the upper-left corner of the image as the origin, with the X axis horizontal and the Y axis vertical.
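The preprocessing step above can be sketched as follows. This is a minimal NumPy-only illustration assuming a cropped grayscale eye image in which the glints are the brightest pixels and the pupil the darkest; the fixed thresholds and the left/right median split of the glint pixels are simplifying assumptions, not the patent's actual segmentation procedure.

```python
import numpy as np

def extract_centers(eye_img, glint_thresh=220, pupil_thresh=40):
    """Locate the two corneal glints (bright) and the pupil (dark) by
    thresholding, and return each region's centroid in image coordinates
    (origin at the top-left corner, X horizontal, Y vertical).
    Thresholds are illustrative; a real system would choose them adaptively."""
    ys, xs = np.nonzero(eye_img >= glint_thresh)   # bright reflection pixels
    mid = np.median(xs)                            # split glints into left/right clusters
    g1 = (xs[xs <= mid].mean(), ys[xs <= mid].mean())
    g2 = (xs[xs > mid].mean(), ys[xs > mid].mean())
    py, px = np.nonzero(eye_img <= pupil_thresh)   # dark pupil pixels
    pupil = (px.mean(), py.mean())
    return pupil, g1, g2
```

On a synthetic eye image (uniform background, one dark square, two bright dots) the function recovers the centroids p_i, v_1i, v_2i used in the equations that follow.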
2. Gaze point estimation. The triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the gaze point and the two infrared sources on the screen are treated as approximately similar to determine the approximate gaze position on the screen. The specific steps are:
(1) From the gaze characteristics of the human eye and the camera imaging principle shown in Fig. 4, derive the approximate similarity between the triangle formed by the pupil center and the two reflection points and the triangle formed by the gaze point and the two infrared sources on the screen, as follows:
(a) According to the camera imaging principle, convert the coordinates of the corneal reflection points and the pupil center in the face image to world coordinates, using the following relations:

p_i(p_ix, p_iy) → p(p_x, p_y, p_z) = (pixelpitch_c·(p_ix − c_center) + O_x, pixelpitch_r·(p_iy − r_center) + O_y, −λ + O_z)
v_1i(v_1ix, v_1iy) → v_1(v_1x, v_1y, v_1z) = (pixelpitch_c·(v_1ix − c_center) + O_x, pixelpitch_r·(v_1iy − r_center) + O_y, −λ + O_z)
v_2i(v_2ix, v_2iy) → v_2(v_2x, v_2y, v_2z) = (pixelpitch_c·(v_2ix − c_center) + O_x, pixelpitch_r·(v_2iy − r_center) + O_y, −λ + O_z)

Here the world coordinate system takes the lower-left corner of the screen as the origin, the axis parallel to the horizontal direction of the screen as the x axis, the axis parallel to the vertical direction of the screen as the y axis, and the axis perpendicular to the screen, pointing toward the user, as the z axis. pixelpitch_c, pixelpitch_r, c_center, and r_center are intrinsic camera parameters; O_x, O_y, O_z are the world coordinates of the camera's optical node; and λ is a constant related to the camera focal length and object distance. From these relations it can be seen that the plane through p(p_x, p_y, p_z), v_1(v_1x, v_1y, v_1z), and v_2(v_2x, v_2y, v_2z) is parallel to the screen, and Δpv_1v_2 is similar to Δp_iv_1iv_2i.
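The conversion is a per-axis affine map from pixel indices to a point on the camera's image plane in world coordinates. A sketch follows; the parameter values in the test are made up for illustration, since the real intrinsics come from camera calibration.

```python
def image_to_world(pt, pixelpitch_c, pixelpitch_r, c_center, r_center, O, lam):
    """Map an image point (column, row) to world coordinates: scale the pixel
    offset from the principal point (c_center, r_center) by the pixel pitch,
    translate by the optical node O, and place the point at depth -lam from O."""
    x = pixelpitch_c * (pt[0] - c_center) + O[0]
    y = pixelpitch_r * (pt[1] - r_center) + O[1]
    z = -lam + O[2]
    return (x, y, z)
```

For example, the principal point itself maps to (O_x, O_y, O_z − λ), directly below the optical node on the image plane.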
(b) By the camera imaging principle, the pupil center P(P_x, P_y, P_z) lies on line pO, and the reflection points V_1(V_1x, V_1y, V_1z), V_2(V_2x, V_2y, V_2z) of the infrared sources on the cornea lie on lines v_1O and v_2O respectively. Strictly speaking, P_z, V_1z, and V_2z are not equal, but they differ little, so we may take P_z = V_1z = V_2z; that is, points P and V_2 are replaced by P″, the intersection of line OP with the plane Z = V_2z, and V_2″, the intersection of line OV_2 with the plane Z = V_2z, as shown in Fig. 6, and ΔP″V_1V_2″ is similar to Δpv_1v_2. If Q is the intersection with the screen of the line through the corneal curvature center C and P″, then Q is the approximate gaze estimate, and ΔQL_1L_2 is similar to ΔP″V_1V_2″; hence ΔQL_1L_2 is similar to Δpv_1v_2 and to Δp_iv_1iv_2i.
(2) By measurement, the coordinates of the two infrared sources are L_1(0, 0) and L_2(34, 0); L_1 and L_2 lie on the same horizontal line. Since ΔQL_1L_2 is similar to Δp_iv_1iv_2i, as shown in Fig. 5, the properties of similar triangles give the approximate on-screen gaze coordinates Q(Q_x, Q_y) from the following equations.
Q_x / (L_2x − L_1x) = (v_1ix − p_ix) / (v_1ix − v_2ix)
Q_y / (L_2x − L_1x) = (v_1iy − p_iy) / (v_1ix − v_2ix)

⇒ Q_x = (v_1ix − p_ix) / (v_1ix − v_2ix) · (L_2x − L_1x)
  Q_y = (v_1iy − p_iy) / (v_1ix − v_2ix) · (L_2x − L_1x)

where L_1x and L_2x are the X coordinates of the infrared sources L_1 and L_2.
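The estimation equations translate directly into code. A minimal sketch, using the 34 cm source separation from the hardware described above; inputs are the image coordinates produced by the preprocessing step:

```python
def estimate_gaze(p, v1, v2, L1x=0.0, L2x=34.0):
    """Similar-triangle gaze estimate: the offsets of the pupil centre from
    glint v1 are scaled by the ratio of the on-screen source separation
    (L2x - L1x) to the glint separation in the image, giving Q = (Q_x, Q_y)."""
    scale = (L2x - L1x) / (v1[0] - v2[0])
    return ((v1[0] - p[0]) * scale, (v1[1] - p[1]) * scale)
```

With the pupil centre midway between the two glints, the estimate lands midway between the two light sources, i.e. at Q_x = 17 cm.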
3. Because of the inherent deviation between the visual axis and the optical axis of the eye, the gaze position computed in step 2 must be corrected by calibration before it accurately reflects where the eye is looking. A two-stage (secondary) calibration is adopted, which corrects the visual-axis/optical-axis deviation and also reduces the error produced by the approximate substitution in the algorithm. The concrete steps are as follows:
(1) Region division. According to the number and distribution of the calibration points in the system, divide the whole screen into five regions A_0, A_1, A_2, A_3, and A_4, as shown in Fig. 7.
(2) One-point calibration. Perform a one-point calibration using the calibration point at the center of the screen, as follows:
Let the true coordinates of calibration point a be (x_real_a, y_real_a) and the coordinates estimated by the algorithm be (x_gaze_a, y_gaze_a). Then the differences between the estimated and true values of point a along the X and Y axes, x_error_a and y_error_a, are:

x_error_a = x_gaze_a − x_real_a
y_error_a = y_gaze_a − y_real_a

Suppose the gaze points Q_i (i = 1, 2, 3, …) estimated by the dual-light-source algorithm and the remaining four calibration points b, c, d, e have coordinates (x_gaze_Q_i, y_gaze_Q_i), (x_gaze_b, y_gaze_b), (x_gaze_c, y_gaze_c), (x_gaze_d, y_gaze_d), and (x_gaze_e, y_gaze_e). After the one-point calibration with the center calibration point a, the coordinates (x_gaze1_Q_i, y_gaze1_Q_i), (x_gaze1_b, y_gaze1_b), (x_gaze1_c, y_gaze1_c), (x_gaze1_d, y_gaze1_d), and (x_gaze1_e, y_gaze1_e) can be expressed as:

x_gaze1_Q_i = x_gaze_Q_i + w_ix·x_error_a,  y_gaze1_Q_i = y_gaze_Q_i + w_iy·y_error_a
x_gaze1_b = x_gaze_b + w_bx·x_error_a,  y_gaze1_b = y_gaze_b + w_by·y_error_a
x_gaze1_c = x_gaze_c + w_cx·x_error_a,  y_gaze1_c = y_gaze_c + w_cy·y_error_a
x_gaze1_d = x_gaze_d + w_dx·x_error_a,  y_gaze1_d = y_gaze_d + w_dy·y_error_a
x_gaze1_e = x_gaze_e + w_ex·x_error_a,  y_gaze1_e = y_gaze_e + w_ey·y_error_a

where the weight coefficients w_ix, w_bx, w_cx, w_dx, w_ex, w_iy, w_by, w_cy, w_dy, w_ey depend on the distance along the X or Y axis between the corresponding point and the center calibration point.
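The one-point calibration for a single point can be sketched as below. The weight pair w is a placeholder, since the text only states that the weights depend on the point's distance from the center calibration point a.

```python
def one_point_calibrate(gaze, gaze_a, real_a, w=(1.0, 1.0)):
    """Shift an estimated point by the error observed at the centre
    calibration point a, scaled per axis by the weights w:
    x_gaze1 = x_gaze + w_x * x_error_a, with x_error_a = x_gaze_a - x_real_a."""
    x_err = gaze_a[0] - real_a[0]
    y_err = gaze_a[1] - real_a[1]
    return (gaze[0] + w[0] * x_err, gaze[1] + w[1] * y_err)
```

With unit weights the full error at point a is simply added to the estimate, matching the equations above term for term.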
(3) Region location. Determine the region containing the gaze point from the position of the once-calibrated gaze point and its positional relation to each calibration point.
(4) Secondary calibration. According to the region containing the gaze point, apply the secondary calibration to the once-calibrated gaze point using the calibration point of that region, obtaining the final gaze position, whose coordinates can be expressed as:
x_gaze2_Q_i = x_gaze1_Q_i,  y_gaze2_Q_i = y_gaze1_Q_i  (in region A_0)
x_gaze2_Q_i = x_gaze1_Q_i + x_error_b,  y_gaze2_Q_i = y_gaze1_Q_i + y_error_b  (in region A_1)
x_gaze2_Q_i = x_gaze1_Q_i + x_error_c,  y_gaze2_Q_i = y_gaze1_Q_i + y_error_c  (in region A_2)
x_gaze2_Q_i = x_gaze1_Q_i + x_error_d,  y_gaze2_Q_i = y_gaze1_Q_i + y_error_d  (in region A_3)
x_gaze2_Q_i = x_gaze1_Q_i + x_error_e,  y_gaze2_Q_i = y_gaze1_Q_i + y_error_e  (in region A_4)
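Region location and secondary calibration can be sketched together. The exact region layout of Fig. 7 is not reproduced in the text, so locate_region assumes a central box A_0 plus four quadrants A_1..A_4; secondary_calibrate then applies the per-region error exactly as in the equations above.

```python
def locate_region(q, width=34.0, height=27.0, box=0.25):
    """Assumed layout: a central region A0 (a box covering the middle
    fraction `box` of each screen dimension) and four quadrant regions A1..A4."""
    cx, cy = width / 2, height / 2
    if abs(q[0] - cx) <= box * width / 2 and abs(q[1] - cy) <= box * height / 2:
        return 0
    if q[0] < cx:
        return 1 if q[1] < cy else 3
    return 2 if q[1] < cy else 4

def secondary_calibrate(q1, region, region_errors):
    """Final position: unchanged in A0; in A1..A4, add the (x_error, y_error)
    recorded at that region's calibration point (b, c, d, or e)."""
    if region == 0:
        return q1
    ex, ey = region_errors[region]
    return (q1[0] + ex, q1[1] + ey)
```

A once-calibrated point near the screen center passes through unchanged; a point in a quadrant is shifted by that quadrant's recorded calibration error.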
This dual-light-source gaze tracking method, based on spatially similar triangles and secondary calibration, requires no measurement of the distance between the head and the screen, and its computation is simple. Although approximation error exists in the algorithm, it cannot exceed its maxima of 0.9887 cm on the X axis and 0.7852 cm on the Y axis, and, owing to the habitual character of human gaze, the error of the gaze estimates exhibits a regional distribution, as shown in Fig. 8. Fig. 9 gives the accuracy analysis of the gaze estimates. The estimation errors of the gaze points used for testing are shown in the following table:
Gaze point  X_error (cm)  Y_error (cm)
1 0.5050 0.3143
2 0.5132 0.8293
3 0.5299 0.3748
4 0.6994 0.0992
5 0.9113 0.2203
6 0.5349 0.3853
7 0.2769 0.0475
8 1.0562 0.1781
9 0.7784 0.4744
10 0.3104 0.6536
11 0.2331 0.3049
12 1.1230 0.1768
13 1.0073 0.2070
14 0.7716 0.5990
15 0.9398 0.6581
16 0.9398 0.5310
Average 0.6956 0.3784
The average errors on the X and Y axes are 0.6956 cm and 0.3784 cm respectively, so the method of the present invention is fully usable in practical applications.

Claims (3)

1. A gaze tracking method based on dual light sources, characterized by comprising the following steps:
(1) Image preprocessing: using the grayscale difference between the pupil and the reflection points, preprocess the captured face image, extract the two reflection points and the pupil region in the eye image, and compute the coordinates of the two reflection points and the pupil center in the image coordinate system, whose origin is the upper-left corner of the image, with the X axis horizontal and the Y axis vertical;
(2) Gaze point estimation: determine the approximate position of the gaze point on the screen by treating the triangle formed by the pupil center and the two reflection points in the image and the triangle formed by the gaze point and the two infrared light sources on the screen as a pair of approximately similar triangles;
(3) Gaze point correction: use a secondary calibration to correct the inherent deviation between the visual axis and the optical axis of the eye and the estimation error caused by the approximation, obtaining the final gaze point position.
2. The gaze tracking method based on dual light sources according to claim 1, characterized in that step (2), gaze point estimation, is implemented as:
A. From the gaze characteristics of the human eye and the imaging principle of the camera, derive the approximate similarity between the triangle formed by the pupil center and the two reflection points and the triangle formed by the gaze point and the two infrared light sources on the screen;
B. Estimate the approximate position of the gaze point on the screen from the triangle similarity.
3. The gaze tracking method based on dual light sources according to claim 1, characterized in that step (3), gaze point correction, is implemented as:
A. Region division: divide the whole screen into five regions according to the number and distribution of the calibration points;
B. One-point calibration: perform a one-point calibration using the calibration point at the center of the screen;
C. Region location: determine the region containing the gaze point from the position of the gaze point after the one-point calibration and its positional relation to each calibration point;
D. Secondary calibration: according to the region containing the gaze point, apply a secondary calibration to the once-calibrated gaze point using the calibration points of that region, obtaining the final gaze point position.
CN201010618752A 2010-12-31 2010-12-31 An Eye Tracking Method Based on Dual Light Sources Expired - Fee Related CN102043952B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010618752A CN102043952B (en) 2010-12-31 2010-12-31 An Eye Tracking Method Based on Dual Light Sources

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201010618752A CN102043952B (en) 2010-12-31 2010-12-31 An Eye Tracking Method Based on Dual Light Sources

Publications (2)

Publication Number Publication Date
CN102043952A true CN102043952A (en) 2011-05-04
CN102043952B CN102043952B (en) 2012-09-19

Family

ID=43910080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010618752A Expired - Fee Related CN102043952B (en) 2010-12-31 2010-12-31 An Eye Tracking Method Based on Dual Light Sources

Country Status (1)

Country Link
CN (1) CN102043952B (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103677270A (en) * 2013-12-13 2014-03-26 电子科技大学 Human-computer interaction method based on eye movement tracking
CN103793719A (en) * 2014-01-26 2014-05-14 深圳大学 Monocular distance-measuring method and system based on human eye positioning
CN103885589A (en) * 2014-03-06 2014-06-25 华为技术有限公司 Eye tracking method and device
CN104244807A (en) * 2012-07-31 2014-12-24 独立行政法人科学技术振兴机构 Point of gaze detection device, point of gaze detection method, individual parameter computation device, individual parameter computation method, program, and computer-readable recording medium
CN104598019A (en) * 2013-10-28 2015-05-06 欧姆龙株式会社 Screen operation apparatus and screen operation method
CN104644120A (en) * 2013-11-15 2015-05-27 现代自动车株式会社 Gaze detecting apparatus and method
CN104915013A (en) * 2015-07-03 2015-09-16 孙建德 Eye tracking and calibrating method based on usage history
CN105678209A (en) * 2014-12-08 2016-06-15 现代自动车株式会社 Method for detecting face direction of a person
CN106547341A (en) * 2015-09-21 2017-03-29 现代自动车株式会社 The method of gaze tracker and its tracing fixation
CN107003521A (en) * 2014-09-22 2017-08-01 脸谱公司 The display visibility assembled based on eyes
CN107515474A (en) * 2017-09-22 2017-12-26 宁波维真显示科技股份有限公司 Autostereoscopic display method, apparatus and stereoscopic display device
CN108140244A (en) * 2015-12-01 2018-06-08 Jvc 建伍株式会社 Sight line detector and method for detecting sight line
CN108196676A (en) * 2018-01-02 2018-06-22 联想(北京)有限公司 Track and identify method and system
CN108427503A (en) * 2018-03-26 2018-08-21 京东方科技集团股份有限公司 Human eye method for tracing and human eye follow-up mechanism
CN108697389A (en) * 2015-11-18 2018-10-23 阿斯泰克有限公司 System and method for supporting neural state assessment and neural rehabilitation, especially cognition and/or laloplegia
CN109144267A (en) * 2018-09-03 2019-01-04 中国农业大学 Man-machine interaction method and device
CN110908511A (en) * 2019-11-08 2020-03-24 Oppo广东移动通信有限公司 Method and related device for triggering recalibration
CN112132755A (en) * 2019-06-25 2020-12-25 京东方科技集团股份有限公司 Method, apparatus, system and computer readable medium for correcting and demarcating pupil position
CN113589532A (en) * 2021-07-30 2021-11-02 歌尔光学科技有限公司 Display calibration method and device of head-mounted equipment, head-mounted equipment and storage medium
CN114706484A (en) * 2022-04-18 2022-07-05 Oppo广东移动通信有限公司 Sight line coordinate determination method and device, computer readable medium and electronic equipment
CN117648037A (en) * 2024-01-29 2024-03-05 北京未尔锐创科技有限公司 Target sight tracking method and system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201307266Y (en) * 2008-06-25 2009-09-09 韩旭 Binocular sightline tracking device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201307266Y (en) * 2008-06-25 2009-09-09 韩旭 Binocular sightline tracking device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Xiaohui Yang et al., "A gaze tracking scheme for eye-based intelligent control", Intelligent Control and Automation (WCICA), 2010 8th World Congress on, 2010-07-09, pp. 50-55, cited against claims 1-3. *
黄莹, 王志良, 戚颖, "基于双光源的实时视线追踪系统" (A real-time gaze tracking system based on dual light sources), 中国工程科学 (Engineering Sciences), 2008-12-31, pp. 86-90, cited against claims 1-3. *

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104244807A (en) * 2012-07-31 2014-12-24 独立行政法人科学技术振兴机构 Point of gaze detection device, point of gaze detection method, individual parameter computation device, individual parameter computation method, program, and computer-readable recording medium
US9262680B2 (en) 2012-07-31 2016-02-16 Japan Science And Technology Agency Point-of-gaze detection device, point-of-gaze detecting method, personal parameter calculating device, personal parameter calculating method, program, and computer-readable storage medium
CN104244807B (en) * 2012-07-31 2016-10-19 国立研究开发法人科学技术振兴机构 Gaze point detection device and gaze point detection method
CN104598019A (en) * 2013-10-28 2015-05-06 欧姆龙株式会社 Screen operation apparatus and screen operation method
CN104644120A (en) * 2013-11-15 2015-05-27 现代自动车株式会社 Gaze detecting apparatus and method
CN103677270A (en) * 2013-12-13 2014-03-26 电子科技大学 Human-computer interaction method based on eye movement tracking
CN103677270B (en) * 2013-12-13 2016-08-17 电子科技大学 A kind of man-machine interaction method based on eye-tracking
CN103793719A (en) * 2014-01-26 2014-05-14 深圳大学 Monocular distance-measuring method and system based on human eye positioning
CN103885589A (en) * 2014-03-06 2014-06-25 华为技术有限公司 Eye tracking method and device
CN103885589B (en) * 2014-03-06 2017-01-25 华为技术有限公司 Eye movement tracking method and device
CN107003521A (en) * 2014-09-22 2017-08-01 脸谱公司 The display visibility assembled based on eyes
CN107003521B (en) * 2014-09-22 2020-06-05 脸谱科技有限责任公司 Display visibility based on eye convergence
CN105678209A (en) * 2014-12-08 2016-06-15 现代自动车株式会社 Method for detecting face direction of a person
CN105678209B (en) * 2014-12-08 2020-06-30 现代自动车株式会社 Method for detecting face direction of person
CN104915013B (en) * 2015-07-03 2018-05-11 山东管理学院 A kind of eye tracking calibrating method based on usage history
CN104915013A (en) * 2015-07-03 2015-09-16 孙建德 Eye tracking and calibrating method based on usage history
CN106547341A (en) * 2015-09-21 2017-03-29 现代自动车株式会社 The method of gaze tracker and its tracing fixation
CN106547341B (en) * 2015-09-21 2023-12-08 现代自动车株式会社 Gaze tracker and method for tracking gaze thereof
CN108697389A (en) * 2015-11-18 2018-10-23 阿斯泰克有限公司 System and method for supporting neural state assessment and neural rehabilitation, especially cognition and/or laloplegia
CN108140244A (en) * 2015-12-01 2018-06-08 Jvc 建伍株式会社 Sight line detector and method for detecting sight line
CN108140244B (en) * 2015-12-01 2021-09-24 Jvc 建伍株式会社 Sight line detection device and sight line detection method
CN107515474A (en) * 2017-09-22 2017-12-26 宁波维真显示科技股份有限公司 Autostereoscopic display method, apparatus and stereoscopic display device
CN108196676B (en) * 2018-01-02 2021-04-13 联想(北京)有限公司 Tracking identification method and system
CN108196676A (en) * 2018-01-02 2018-06-22 联想(北京)有限公司 Tracking identification method and system
CN108427503B (en) * 2018-03-26 2021-03-16 京东方科技集团股份有限公司 Human eye tracking method and human eye tracking device
CN108427503A (en) * 2018-03-26 2018-08-21 京东方科技集团股份有限公司 Human eye tracking method and human eye tracking device
CN109144267A (en) * 2018-09-03 2019-01-04 中国农业大学 Man-machine interaction method and device
CN112132755A (en) * 2019-06-25 2020-12-25 京东方科技集团股份有限公司 Method, apparatus, system and computer readable medium for correcting and demarcating pupil position
CN112132755B (en) * 2019-06-25 2024-09-20 京东方科技集团股份有限公司 Method, device, system and computer-readable medium for correcting and calibrating pupil position
CN110908511B (en) * 2019-11-08 2022-03-15 Oppo广东移动通信有限公司 Method for triggering recalibration and related device
CN110908511A (en) * 2019-11-08 2020-03-24 Oppo广东移动通信有限公司 Method and related device for triggering recalibration
CN113589532A (en) * 2021-07-30 2021-11-02 歌尔光学科技有限公司 Display calibration method and device of head-mounted equipment, head-mounted equipment and storage medium
CN114706484A (en) * 2022-04-18 2022-07-05 Oppo广东移动通信有限公司 Sight line coordinate determination method and device, computer readable medium and electronic equipment
CN117648037A (en) * 2024-01-29 2024-03-05 北京未尔锐创科技有限公司 Target sight tracking method and system
CN117648037B (en) * 2024-01-29 2024-04-19 北京未尔锐创科技有限公司 Target sight tracking method and system

Also Published As

Publication number Publication date
CN102043952B (en) 2012-09-19

Similar Documents

Publication Publication Date Title
CN102043952A (en) Eye-gaze tracking method based on double light sources
US10958898B2 (en) Image creation device, method for image creation, image creation program, method for designing eyeglass lens and method for manufacturing eyeglass lens
CN101901485B (en) 3D free head moving type gaze tracking system
CN105708467B (en) Human body actual range measures and the method for customizing of spectacle frame
Lai et al. Hybrid method for 3-D gaze tracking using glint and contour features
Plopski et al. Corneal-imaging calibration for optical see-through head-mounted displays
US12154383B2 (en) Methods, devices and systems for determining eye parameters
CN103761519B (en) Non-contact sight-line tracking method based on self-adaptive calibration
Coutinho et al. Improving head movement tolerance of cross-ratio based eye trackers
JP7659148B2 (en) Eye tracking device and method
CN102125422A (en) Pupil center-corneal reflection (PCCR) based sight line evaluation method in sight line tracking system
US10620454B2 (en) System and method of obtaining fit and fabrication measurements for eyeglasses using simultaneous localization and mapping of camera images
CN102930252A (en) Sight tracking method based on neural network head movement compensation
CN102456137A (en) Sight line tracking preprocessing method based on near-infrared reflection point characteristic
Bakker et al. Accurate gaze direction measurements with free head movement for strabismus angle estimation
US20240061278A1 (en) Frame adjustment system
CN112329699A (en) Method for positioning human eye fixation point with pixel-level precision
Nagamatsu et al. Calibration-free gaze tracking using a binocular 3D eye model
Wibirama et al. 3D gaze tracking on stereoscopic display using optimized geometric method
Lu et al. Neural 3D gaze: 3D pupil localization and gaze tracking based on anatomical eye model and neural refraction correction
CN113613547A (en) Apparatus and method for evaluating performance of a vision device for a vision task
Pansing et al. Optimization of illumination schemes in a head-mounted display integrated with eye tracking capabilities
Fang et al. Automatic head and facial feature extraction based on geometry variations
Villanueva et al. Geometry issues of gaze estimation
Lai et al. 3-d gaze tracking using pupil contour features

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120919

Termination date: 20211231