
US9704037B2 - Method for detecting face direction of a person - Google Patents


Info

Publication number
US9704037B2
Authority
US
United States
Prior art keywords
light
reflection points
person
glasses
detecting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/834,391
Other versions
US20160162735A1 (en)
Inventor
Il Yong YOON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hyundai Motor Co
Original Assignee
Hyundai Motor Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hyundai Motor Co filed Critical Hyundai Motor Co
Assigned to HYUNDAI MOTOR COMPANY. Assignment of assignors interest (see document for details). Assignors: YOON, IL YOUNG
Publication of US20160162735A1 publication Critical patent/US20160162735A1/en
Application granted granted Critical
Publication of US9704037B2 publication Critical patent/US9704037B2/en

Classifications

    • G06V 40/161 - Human faces, e.g. facial parts, sketches or expressions: detection; localisation; normalisation
    • G06K 9/00604
    • G06T 7/20 - Image analysis: analysis of motion
    • G06T 7/215 - Analysis of motion: motion-based segmentation
    • G06T 7/223 - Analysis of motion using block-matching
    • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06V 20/597 - Context or environment of the image inside of a vehicle: recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06V 40/171 - Human faces: local features and components; facial parts; occluding parts, e.g. glasses; geometrical relationships
    • G06V 40/19 - Eye characteristics, e.g. of the iris: sensors therefor
    • G06T 2207/30201 - Subject of image: face (human being; person)


Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Eye Examination Apparatus (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

A method for detecting face direction of a person includes receiving a face image of the person. The method further includes determining whether the person is wearing glasses, based on the face image. The method also includes determining whether the number of reflection points of light in a glasses region of the face image is four or more at the time of detecting the glasses region. The method also includes aligning the reflection points of light in order of size, upon determining that the number of reflection points of light is four or more. The method also includes detecting two virtual images of the light, based on the aligning. The method also includes detecting a face direction vector based on the two virtual images of the light.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application is based on and claims the benefit of priority to Korean Patent Application No. 10-2014-0175095, filed on Dec. 8, 2014 in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
TECHNICAL FIELD
The present disclosure relates to a method for detecting face direction of a person, and more particularly, to a method for detecting face direction by estimating the person's (e.g., a vehicle driver's) line of sight through detecting a direction in which glasses worn by the person are directed.
BACKGROUND
In general, driver surveillance systems (DSM) and line of sight tracking techniques have been developed for use in advanced driver assistance systems (ADAS).
Although line of sight tracking has been developed as an advanced form of driver state surveillance, several issues may cause the tracking to fail. For example, if the person whose line of sight is being tracked (e.g., a driver) is wearing glasses, the glass lenses may reflect the light, and the reflected light can cover the cornea reflection point in the person's eye. This makes it difficult to distinguish the cornea reflection point from the lens reflections. As a result, the reliability of the line of sight tracking, and with it marketability and safety, may be degraded.
SUMMARY
The present disclosure has been made to solve the above-mentioned problems occurring in the prior art while advantages achieved by the prior art are maintained intact.
An aspect of the present disclosure provides a method for detecting face direction of a person by estimating the person's line of sight through detecting a direction in which glasses worn by the person are directed.
According to an exemplary embodiment of the present disclosure, a method for detecting face direction includes receiving a face image of a person. The method further includes determining whether the person is wearing glasses, based on the face image. The method further includes determining whether the number of reflection points of light in a glasses region of the face image is four or more at the time of detecting the glasses region. The method further includes aligning the reflection points of light in order of size, upon determining that the number of reflection points of light is four or more. The method further includes detecting two virtual images of the light, based on the aligning. The method further includes detecting a face direction vector based on the two virtual images of the light.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other objects, features and advantages of the present disclosure will be more apparent from the following detailed description taken in conjunction with the accompanying drawings.
FIG. 1 is a flow chart showing a process for detecting face direction of a person according to the present disclosure.
FIG. 2 is a diagram showing a double reflection by myopic glasses.
FIG. 3 is a diagram showing a classified state of reflection points of glass lenses.
FIG. 4 is a diagram showing relationship between a center of a lens of glasses and a center of an eyeball rotation.
FIG. 5 is a diagram showing relationship between an optical axis due to a tilt of a frame of glasses and the center of the eyeball rotation.
FIG. 6 is a diagram showing an example of camera and light arrangement in the process for detecting a face direction according to the present disclosure.
DETAILED DESCRIPTION
Exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.
A process for detecting face direction according to the present disclosure is shown in FIG. 1. At step S10 a face image of a person is received as input. At step S20 it is determined whether the person is wearing glasses. At step S30 the number of reflection points of light in the glasses region is determined. At step S40 the reflection points of light are aligned in order of size. At step S50 two virtual images of the light are detected, and at step S60 a face direction vector is detected.
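The control flow of FIG. 1 can be illustrated with a short Python sketch. This is a minimal illustration only, not the patented implementation; every helper function named here (capture_face_image, detect_glasses_region, crop_region, line_of_sight_detected, apply_additional_light, compute_face_direction) is a hypothetical placeholder:

    # Control-flow sketch of FIG. 1 (steps S10-S60); all helpers are
    # hypothetical placeholders, not functions defined by the patent.
    def detect_face_direction(camera, lights):
        while True:
            frame = capture_face_image(camera)             # S10: input face image
            glasses = detect_glasses_region(frame)         # S20: glasses worn?
            if glasses is None:
                continue                                   # no glasses region: new image
            spots = find_reflection_points(crop_region(frame, glasses))
            if len(spots) < 4:                             # S30: four or more points needed
                if line_of_sight_detected(frame):          # S31: gaze tracking still works
                    continue
                apply_additional_light(lights)             # S32: add a light source,
                continue                                   # then re-capture the image
            spots.sort(key=lambda s: s[0], reverse=True)   # S40: align by size
            inner = [pos for _, pos in spots[-2:]]         # S50: two inner-surface virtual
                                                           # images (assumed: the smallest)
            return compute_face_direction(inner)           # S60: face direction vector

One possible realization of find_reflection_points is sketched after the discussion of FIG. 2 and FIG. 3 below.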
As shown in FIG. 1, step S10 is the operation of inputting the face image of a person (e.g., a driver); the face image is input through a system such as, for example, a driver state surveillance system.
At step S20, it is determined, using the face image received at step S10, whether or not the person is wearing glasses so as to enable the person's line of sight to be detected.
Upon detecting a glasses region at step S20, at step S30 it is determined whether four or more reflection points are present in the glasses region.
As shown in FIG. 3, the reflection points in the glasses region include reflection points of light by an inner surface of the glass lenses and reflection points of light by an outer surface thereof. The light may be infrared light provided by the driver state surveillance system.
As shown in FIG. 2, a lens 201 has two reflection surfaces, wherein an outer surface 203 has a convex lens shape for both myopia and hyperopia and an inner surface 205 has a concave lens shape for myopia and a convex lens shape for hyperopia.
In the case of a myopic lens, since both surfaces of the lens act as convex mirrors in terms of reflection, each forms a reduced virtual image of the light. However, since the outer surface has a larger radius of curvature, its magnification is greater and the larger of the two images is formed on it.
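This size relationship can be checked with elementary mirror optics; the following is a simplified model added for clarity, not taken from the patent. Treating a lens surface as a convex mirror with radius of curvature R (focal length magnitude R/2) reflecting a light source at distance d, the lateral magnification of the virtual image is

    \[
      |m| = \left|\frac{f}{f - d}\right| \approx \frac{|f|}{d} = \frac{R}{2d},
      \qquad d \gg R,
    \]

so for a distant source the reflection size grows with R: the outer surface, having the larger radius of curvature, produces the larger reflection point, in agreement with FIG. 2.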
That is, among the light reflection points, a large light reflection point is caused by the outer surface and a small light reflection point is caused by the inner surface.
In addition, since the glass lenses may not have a curvature larger than that of a cornea, the light reflection points by the glass lenses are always larger than those by the cornea.
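As a concrete illustration of the reflection-point extraction assumed in the sketch above, bright specular spots and their sizes can be pulled from an infrared frame with standard OpenCV calls; this is a generic sketch, and the threshold and minimum-area values are arbitrary assumptions rather than parameters from the patent:

    import cv2

    def find_reflection_points(ir_frame, thresh=230, min_area=2):
        """Return (area, (x, y)) of bright specular spots, largest first.

        ir_frame: single-channel 8-bit infrared image (e.g., the glasses region).
        thresh and min_area are tuning assumptions, not values from the patent.
        """
        _, binary = cv2.threshold(ir_frame, thresh, 255, cv2.THRESH_BINARY)
        n, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
        spots = [(int(stats[i, cv2.CC_STAT_AREA]), tuple(centroids[i]))
                 for i in range(1, n)                      # label 0 is the background
                 if stats[i, cv2.CC_STAT_AREA] >= min_area]
        spots.sort(key=lambda s: s[0], reverse=True)       # largest (outer-surface) first
        return spots

Sorting by area directly implements the size ordering that step S40 relies on.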
In the case in which two light sources produce four reflection points on the two surfaces of one lens, the straight line connecting the curvature centers of the two surfaces represents the face direction.
Meanwhile, in terms of lens optics, the straight line (i.e., the optical axis) connecting the center of the lens and the curvature center of each surface of the lens passes through the rotation center of the eyeball.
In this case, a face direction vector is calculated by the same method used to calculate a line of sight vector from a cornea glint.
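Given the two curvature centers in 3-D (reconstructed from the camera and light-source geometry, exactly as in glint-based gaze tracking; that reconstruction step is assumed rather than shown here), the face direction vector is the normalized line through them. A minimal numpy sketch, with the sign convention chosen arbitrarily for illustration:

    import numpy as np

    def face_direction_vector(c_outer, c_inner):
        """Unit vector along the lens optical axis, pointing from the
        inner-surface curvature center toward the outer-surface one.
        Both inputs are assumed 3-D points in camera coordinates."""
        v = np.asarray(c_outer, dtype=float) - np.asarray(c_inner, dtype=float)
        return v / np.linalg.norm(v)

    # Example with made-up coordinates (meters, camera frame):
    print(face_direction_vector([0.010, 0.000, 0.550], [0.012, 0.002, 0.580]))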
In addition, when the glasses region is not detected at step S20, the process may continue from step S10.
In the case in which at step S30 it is determined that the number of reflection points in the glasses region is less than four, at step S31 it is determined whether the line of sight detection has failed.
In this case, when the line of sight detection has failed, at step S32 an additional light is applied so as to increase the number of reflection points of light.
In addition, if the additional light is applied at step S32, the process may continue from step S10 so as to input the face image captured under the additional light.
As shown in FIG. 6, in the case in which the number of reflection points of the light in the glass lenses is less than four and the reflection points of the cornea and the pupil cannot be found, it is possible to artificially increase the number of reflection points of the light by applying additional light. In addition, the light may be applied at a different position so as to form four reflection points of the light.
Referring back to FIG. 1, when the line of sight detection at step S31 is successful, the process may continue from step S30 so as to once again determine whether the number of reflection points in the glasses region is four or more at the time of detecting the glasses region.
In this case, at step S30, the reflection points in the glasses region include a reflection point of light by an inner surface of the glass lens and a reflection point of a light by an outer surface thereof, as previously discussed with regards to FIG. 2.
When it is confirmed at step S30 that the number of reflection points in the glasses region is four or more, at step S40, the reflection points of the light are aligned in order of size.
At step S50, after the reflection points of the light are aligned at step S40, two virtual images of the light among the four reflection points of the light are detected.
At step S60, the face direction vector is detected by the two virtual images of the light detected at step S50.
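Steps S40 and S50 thus reduce to a simple selection rule over the four glints. A sketch under the assumption, based on FIG. 2 and FIG. 3, that the two largest spots come from the outer surface and the two smallest from the inner surface:

    def split_virtual_images(spots):
        """spots: list of (area, (x, y)) pairs sorted largest-first (step S40).
        Returns (outer_pair, inner_pair): the two outer-surface reflections
        and the two inner-surface virtual images used at step S50. The
        largest-two / smallest-two pairing is an assumption from FIG. 2/3."""
        if len(spots) < 4:
            raise ValueError("need at least four reflection points (step S30)")
        outer = [pos for _, pos in spots[:2]]     # large spots: outer surface
        inner = [pos for _, pos in spots[-2:]]    # small spots: inner surface
        return outer, inner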
As shown in FIG. 4, the optical axis (OA) of a glasses lens is designed so as to pass through the rotation center (CR) of the eyeball. In addition, as shown in FIG. 5, the glasses frame has a downward tilt of 6 to 10 degrees; even in this case, however, the optical axis of the lens still effectively passes through the rotation center of the eyeball.
As described above, the method for detecting face direction of a person according to the present disclosure obtains the face direction by using the reflection points of the light on the glass lenses in reverse: the direction in which the glass lenses face is taken as the face direction. Consequently, marketability and safety may be increased, since detecting the direction in which the glass lenses face improves the probability of estimating the person's line of sight.
In the past, when four reflection points of the light appeared on a glass lens, the line of sight detection or the detection of the cornea reflection points failed in most cases. According to the present disclosure, however, the face direction vector is detected and can be used to increase the accuracy of the estimate when the line of sight detection has failed, and distraction monitoring may be performed from a change in face direction alone. As a result, it is possible to cope with an emergency situation even in the case in which the line of sight detection fails.
As described above, although the present disclosure has been described with reference to exemplary embodiments and the accompanying drawings, it would be appreciated by those skilled in the art that the present disclosure is not limited thereto but various modifications and alterations might be made without departing from the scope defined in the following claims.

Claims (8)

What is claimed is:
1. A method for detecting face direction, the method comprising:
receiving a face image of a person;
determining whether the person is wearing glasses, based on the face image;
determining whether a number of reflection points of light in a glasses region of the face image is four or more at a time of detecting the glasses region;
aligning reflection points of light in order of size, upon determining that the number of reflection points of light is four or more;
detecting two virtual images of the light, based on the aligning; and
detecting a face direction vector based on the two virtual images of the light.
2. The method according to claim 1, wherein determining whether the person is wearing glasses comprises detecting a glasses region in the face image, and
when the glasses region is not detected, receiving a second face image of the person.
3. The method according to claim 1, further comprising:
upon determining that the number of reflection points of light in the glasses is less than four, detecting a line of sight of the person.
4. The method according to claim 3, further comprising:
determining a failure of the line of sight detection; and
applying additional light so as to increase the reflection points of the light.
5. The method according to claim 4, further comprising:
upon applying the additional light, receiving a third face image of the person.
6. The method according to claim 3, further comprising:
determining a success of the line of sight detection; and
determining whether the number of reflection points of light in the glasses region of the face image is four or more at the time of detecting the glasses region.
7. The method according to claim 1, wherein the reflection points are reflection points of light by an inner surface of a lens of the glasses and reflection points of light by an outer surface thereof.
8. The method according to claim 1, wherein the face image is received through a driver state surveillance system.
US14/834,391 2014-12-08 2015-08-24 Method for detecting face direction of a person Active 2035-09-02 US9704037B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2014-0175095 2014-12-08
KR1020140175095A KR101619661B1 (en) 2014-12-08 2014-12-08 Detection method of face direction of driver

Publications (2)

Publication Number Publication Date
US20160162735A1 (en) 2016-06-09
US9704037B2 (en) 2017-07-11

Family

ID=56021268

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/834,391 Active 2035-09-02 US9704037B2 (en) 2014-12-08 2015-08-24 Method for detecting face direction of a person

Country Status (3)

Country Link
US (1) US9704037B2 (en)
KR (1) KR101619661B1 (en)
CN (1) CN105678209B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220252924A1 (en) * 2021-02-11 2022-08-11 Seeing Machines Limited Cabin monitoring with electrically switched polarization

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2672502C1 (en) * 2014-12-10 2018-11-15 Телефонактиеболагет Лм Эрикссон (Пабл) Device and method for forming cornea image
US10621418B2 (en) * 2016-02-18 2020-04-14 Mitsubishi Electric Corporation Passenger detection device, passenger detection system, and passenger detection method
KR102371591B1 (en) * 2016-10-06 2022-03-07 현대자동차주식회사 Apparatus and method for determining condition of driver
CN112401887B (en) * 2020-11-10 2023-12-12 恒大新能源汽车投资控股集团有限公司 Driver attention monitoring method and device and electronic equipment
US20240005387A1 (en) * 2022-07-01 2024-01-04 Warby Parker Inc. Systems and methods for spectacle removal and virtual try-on

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0757072A (en) 1993-08-11 1995-03-03 Nissan Motor Co Ltd Driver's state detector
JP2005242428A (en) 2004-02-24 2005-09-08 Nissan Motor Co Ltd Driver face imaging device
JP2005296382A (en) 2004-04-13 2005-10-27 Honda Motor Co Ltd Visual line detector
JP2012019931A (en) 2010-07-14 2012-02-02 Honda Motor Co Ltd Eyeball imaging apparatus
KR20130054830A (en) 2011-11-17 2013-05-27 현대자동차주식회사 Method and system for alarming dangerous situation at road with driver state monitoring and traffic report
US8457367B1 (en) 2012-06-26 2013-06-04 Google Inc. Facial recognition
KR20130094939A (en) 2012-02-17 2013-08-27 주식회사 샤플라이 Prevention reflection and hotspot device by illumination for face recognition
US20130222642A1 (en) 2012-02-24 2013-08-29 Denso Corporation Imaging control device and program

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090154782A1 (en) * 2007-12-17 2009-06-18 Three Palm Software Dual-magnify-glass visualization for soft-copy mammography viewing
CN101344919B (en) * 2008-08-05 2012-08-22 华南理工大学 Sight tracing method and disabled assisting system using the same
CN102043952B (en) * 2010-12-31 2012-09-19 山东大学 A Eye Tracking Method Based on Dual Light Sources
US20120257035A1 (en) * 2011-04-08 2012-10-11 Sony Computer Entertainment Inc. Systems and methods for providing feedback by tracking user gaze and gestures
WO2012147027A1 (en) * 2011-04-28 2012-11-01 Koninklijke Philips Electronics N.V. Face location detection
US20140341441A1 (en) * 2013-05-20 2014-11-20 Motorola Mobility Llc Wearable device user authentication
CN103500061B (en) * 2013-09-26 2017-11-07 三星电子(中国)研发中心 Control the method and apparatus of display

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0757072A (en) 1993-08-11 1995-03-03 Nissan Motor Co Ltd Driver's state detector
JP2005242428A (en) 2004-02-24 2005-09-08 Nissan Motor Co Ltd Driver face imaging device
JP2005296382A (en) 2004-04-13 2005-10-27 Honda Motor Co Ltd Visual line detector
JP2012019931A (en) 2010-07-14 2012-02-02 Honda Motor Co Ltd Eyeball imaging apparatus
KR20130054830A (en) 2011-11-17 2013-05-27 현대자동차주식회사 Method and system for alarming dangerous situation at road with driver state monitoring and traffic report
KR20130094939A (en) 2012-02-17 2013-08-27 주식회사 샤플라이 Prevention reflection and hotspot device by illumination for face recognition
US20130222642A1 (en) 2012-02-24 2013-08-29 Denso Corporation Imaging control device and program
KR20130097671A (en) 2012-02-24 2013-09-03 가부시키가이샤 덴소 Imaging control device and program
US8988597B2 (en) * 2012-02-24 2015-03-24 Denso Corporation Imaging control device and program for controlling facial image taking apparatus which radiates infrared light toward the face
US8457367B1 (en) 2012-06-26 2013-06-04 Google Inc. Facial recognition
KR20140001164A (en) 2012-06-26 2014-01-06 구글 인코포레이티드 Facial recognition

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220252924A1 (en) * 2021-02-11 2022-08-11 Seeing Machines Limited Cabin monitoring with electrically switched polarization

Also Published As

Publication number Publication date
CN105678209B (en) 2020-06-30
KR101619661B1 (en) 2016-05-10
US20160162735A1 (en) 2016-06-09
CN105678209A (en) 2016-06-15

Similar Documents

Publication Publication Date Title
US9704037B2 (en) Method for detecting face direction of a person
US10395510B2 (en) Reminding method and reminding device
US10278576B2 (en) Behind-eye monitoring using natural reflection of lenses
US9916690B2 (en) Correction of displayed images for users with vision abnormalities
US9612454B2 (en) Lens and method for correcting vision of a user
JP4811259B2 (en) Gaze direction estimation apparatus and gaze direction estimation method
US9274351B2 (en) Method for optimizing the postural prism of an ophthalmic lens
EP1868139B1 (en) Spectacles detection method
JP2009254525A (en) Pupil detecting method and apparatus
US10496166B2 (en) Eye tracking device and eye tracking method
WO2015014059A1 (en) Imaging apparatus and imaging method
KR101628493B1 (en) Apparatus and method for tracking gaze of glasses wearer
US20190082170A1 (en) Goggle type display device, eye gaze detection method, and eye gaze detection system
WO2013052132A3 (en) Image-based head position tracking method and system
JP2008210239A (en) Line-of-sight estimation device
CN106843474B (en) Mobile terminal display processing method and system
US20150309567A1 (en) Device and method for tracking gaze
WO2018145460A1 (en) Smart user-experience device and smart helmet
US10955915B2 (en) Gaze tracking via tracing of light paths
JP2009240551A (en) Sight line detector
US8128462B2 (en) Method of centering a non-edged ophthalmic lens in which the centering point of the lens is offset relative to its geometric center
US12248147B2 (en) Information processing device and information processing method
JP2010134489A (en) Visual line detection device and method, and program
KR20160061691A (en) Gaze Tracker and Method for Detecting Pupil thereof
JPWO2017179280A1 (en) Gaze measurement apparatus and gaze measurement method

Legal Events

Date Code Title Description
AS Assignment

Owner name: HYUNDAI MOTOR COMPANY, KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YOON, IL YOUNG;REEL/FRAME:036406/0543

Effective date: 20150520

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8