EP4405783A1 - An image processing system and method thereof - Google Patents
- Publication number
- EP4405783A1 (application number EP22769591.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- state
- illumination module
- captured
- operating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/803—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/149—Instrument input by detecting viewing direction not otherwise provided for
Definitions
- the advantage of the above aspect of this disclosure is to process images captured by the imaging module, by feature matching and image alignment, to produce full resolution resultant images.
- this aspect is advantageous for sensing fast-moving objects captured.
- Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: the eye tracking is selected from the group consisting of: estimation of line of sight of a user, for purposes of virtual reality (VR) applications; estimation of eye position of a user, for purposes of augmented reality (AR) applications; and estimation of a state of a user, for purposes of driver monitoring applications.
- VR virtual reality
- AR augmented reality
- the advantage of the above aspect of this disclosure is to yield resultant images suitable for eye tracking, for purposes of virtual reality (VR) applications; augmented reality (AR) applications and driver monitoring applications.
- VR virtual reality
- AR augmented reality
- RGB red, blue, green
- IR infrared
- the advantage of the above aspect of this disclosure is to obtain one or more resultant images containing image information in the RGB and IR wavelengths, by controlling a status of the illumination module, to process at least two images consecutively, in at least two different types of wavelength, and more in particular within a range of 380nm to 800nm.
- An advantage of obtaining a full resolution resultant image containing image information in the RGB and IR wavelengths, or within a range of 380nm to 1100nm, is to perform further eye tracking analysis in the aforesaid wavelength range.
- Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: subtracting, by way of the image processing unit, a visible light wavelength from the one or more resultant images obtained.
- the advantage of the above aspect of this disclosure is to apply an image subtracting method to remove the visible light wavelength, such that the one or more resultant images contain image information within a range of selected wavelengths without visible light.
- An advantage of this aspect of this disclosure is to yield suppression of undesirable noise signals.
- an image processing system for processing images for an eye tracking function, the system comprising: an illumination module operable to operate between an ON state and an OFF state; an imaging module operable to capture at least one image when the illumination module operates in the ON state, and to consecutively capture at least one image when the illumination module operates in the OFF state; and an image processing unit operable to process information of the at least one image captured; characterized in that: the image processing unit is further operable to obtain one or more resultant images in a selected group of wavelengths, such that the one or more resultant images are suitable for eye tracking.
- An advantage of the above described aspect of this disclosure is that it yields an image processing system for processing images suitable for eye tracking, of which image information obtained is selected from a range of wavelengths, such that specific details of image information may be analysed for eye tracking.
- the imaging module is selected from the group consisting of: an image sensor operable to capture images in a combination of RGB wavelength and IR wavelength; and a global shutter sensor.
- the advantage of the above aspect of this disclosure is to yield an imaging processing system using only a single image sensor configuration suitable for capturing full resolution images in multiple wavelengths.
- this yields an imaging processing system which requires minimal hardware.
- the image information of the one or more resultant images comprises at least one range of wavelengths selected from the group consisting of:
- NIR near-infrared
- the illumination module is selected from the group consisting of:
- VCSEL vertical cavity surface emitting laser
- the advantage of the above aspect of this disclosure is to yield a computer program product to cause the image processing system to execute the steps of the method as disclosed herein for processing images captured.
- the objective of this disclosure is solved by a computer-readable medium having stored thereon the computer program product as described above or as described above as being preferred.
- the advantage of the above aspect of this disclosure is to yield a computer-readable medium for storing the computer program product, to cause the image processing system to execute the steps of the method as disclosed herein for processing images captured.
- FIG. 1 shows a system block diagram in accordance with a preferred embodiment.
- FIG. 2 shows exemplary image frame intervals of images captured in accordance with a preferred embodiment.
- FIG. 3 shows a flowchart 300 for processing images for eye tracking in a preferred embodiment.
- FIG. 4a-c shows an exemplary frame by frame image subtraction process in a preferred embodiment.
- like reference signs refer to like components in several perspective views and/or configurations.
- “first”, “second”, “third” and the like used in the context of this disclosure may refer to modification of different elements in accordance with various exemplary embodiments, but not limited thereto.
- the expressions may be used to distinguish one element from another element, regardless of sequence or importance.
- “a first image” and “a second image” may indicate different images regardless of order or importance.
- a first image may be referred to as the second image and vice versa without departing from the scope of this disclosure.
- FIG. 1 shows a system block diagram 100 in accordance with a preferred embodiment.
- system 100 is an image processing system for eye tracking functions.
- system 100 includes a control module 102 for executing image processing functions and an imaging module 104 for capturing images.
- the imaging module 104 further includes an image sensor 106 with a lens or imaging optics 110 for receiving light rays and an illumination module 108 having illumination optics 112.
- the imaging module 104 may include circuitry, for example a driver circuit and/or a digital-to-analog converter (DAC) circuit, for driving the imaging module 104.
- the control module 102 and imaging module 104 may be in electrical communication.
- the control module 102 is a system-on-chip (SoC) operable to control the entire image processing system 100 and execute algorithms.
- the image sensor 106 may be a global shutter type image sensor.
- Examples of imaging optics 110 may include lenses and/or optical filters suitable for working with selected wavelengths operable by the image sensor 106.
- the image processing system 100 further includes the illumination module 108. Suitable types of illumination module 108 include light emitting diodes (LED) and vertical cavity surface emitting lasers (VCSEL).
- the image processing system 100 may further include illumination optics 112. Examples of illumination optics 112 include a diffusor or a reflector.
- the image sensor 106 of the imaging module is operable to capture a first image when the illumination module 108 is operating in an ON state, and consecutively capture at least one second image when the illumination module 108 is operating in an OFF state.
- the image sensor 106 will receive sensor signals within a certain wavelength at a predetermined time interval. This process of capturing a first image when the illumination module 108 is operating in an ON state, and consecutively capturing at least one second image when the illumination module 108 is operating in an OFF state, may be adjusted or designed according to exposure time and gain settings based on image processing requirements.
- the focus shall relate to eye tracking functions.
- FIG. 2 shows exemplary image frame intervals of images captured, or a switching logic mode 200 in accordance with an exemplary embodiment.
- the aforesaid switching logic mode can be achieved by switching signals generated by either a controller or using logic circuit chips.
- a first image captured contains a combination of selected wavelengths, or at least two selected wavelengths.
- a second image consecutively captured contains only RGB wavelengths.
- a main advantage of this switching logic mode 200 is that, within an image frame rate interval, two distinct resultant images may be processed by the image processing system 100 in a single image frame interval 202, each resultant image comprising image information of a selected range of wavelength.
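The alternating ON/OFF capture behind this switching logic mode can be sketched as follows. This is a hedged illustration only: `capture_frame_pair`, the illumination setter and the simulated sensor are hypothetical stand-ins for the imaging module 104 and its driver circuit, not interfaces defined in the disclosure.

```python
import numpy as np

def capture_frame_pair(capture, set_illumination):
    """Capture two consecutive images within one frame interval:
    one with NIR illumination ON, one with it OFF."""
    set_illumination(True)   # ON state: sensor sees visible + NIR light
    img_on = capture()
    set_illumination(False)  # OFF state: sensor sees visible light only
    img_off = capture()
    return img_on, img_off

# Simulated hardware for illustration only.
_state = {"on": False}

def _set_illumination(on):
    _state["on"] = on

def _capture():
    rgb = np.full((4, 4), 100, dtype=np.uint16)  # ambient visible light
    nir = np.full((4, 4), 50, dtype=np.uint16)   # active NIR illumination
    return rgb + (nir if _state["on"] else 0)

on_img, off_img = capture_frame_pair(_capture, _set_illumination)
print(on_img[0, 0], off_img[0, 0])  # 150 100
```

Each (ON, OFF) pair then feeds the pixelwise processing described later, so two resultant images of distinct wavelength content come out of a single frame interval.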
- the image sensor 106 is operable to detect a combination of at least two types of wavelengths.
- Suitable examples of the image sensor 106 operable to capture sensing signals in dual wavelength may be an image sensor operable to sense red, blue, green (RGB) and infrared (IR) wavelengths.
- the illumination module 108 is a near infrared (NIR) light source.
- NIR near infrared
- FIG. 3 shows a flowchart 300 for processing images for eye tracking in a preferred embodiment.
- the control module 102 executes a command causing the illumination module 108 to operate in an ON mode, and the imaging module 104 to capture a first image.
- when the illumination module 108 is operating in the ON mode, the image information captured in the first image comprises image information in dual wavelengths.
- the first image may comprise image information in a range of 400nm to 700nm, or a visible light range, and may further comprise image information in a range of 700nm to 1100nm in a NIR range.
- the control module 102 is operable to determine a pixel value (P1) of the first image, which may include both visible light optical power and NIR light optical power.
- the control module 102 consecutively executes a command causing the illumination module 108 to operate in an OFF mode, and the imaging module 104 to consecutively capture at least one second image.
- the image information captured in the at least one second image comprises image information in a single wavelength or in a range of 400nm to 700nm, or a visible light range.
- the control module 102 may determine a pixel value (P2) of the second image, which will include visible light optical power only.
- control module 102 may further process the at least one second image captured when the illumination module 108 is operating in an OFF mode, to obtain a high-resolution colour image containing image information in 380 nm to 800 nm in a red, blue, green (RGB) range only.
- a suitable image processing step may be a demosaicing algorithm, which reconstructs a full colour image from colour filter array (or colour filter mosaic) data.
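For illustration only, the simplest form of demosaicing, a nearest-neighbour reconstruction of an RGGB colour filter array, can be sketched as below. The RGGB layout, the helper name and the averaging of the two green samples are assumptions; production pipelines use interpolating (e.g. bilinear or edge-aware) demosaicing instead.

```python
import numpy as np

def demosaic_rggb_nearest(cfa):
    """cfa: (2H, 2W) mosaic with an R G / G B pattern per 2x2 cell.
    Returns a (2H, 2W, 3) RGB image by replicating each cell's estimate."""
    r = cfa[0::2, 0::2]
    # Average the two green samples of each 2x2 cell.
    g = (cfa[0::2, 1::2].astype(np.uint32) + cfa[1::2, 0::2]) // 2
    b = cfa[1::2, 1::2]
    # Replicate each 2x2 cell's estimate back to full resolution.
    up = lambda p: np.repeat(np.repeat(p, 2, axis=0), 2, axis=1)
    return np.stack([up(r), up(g), up(b)], axis=-1).astype(cfa.dtype)

cfa = np.array([[200, 120],
                [ 80,  40]], dtype=np.uint16)  # one RGGB cell
rgb = demosaic_rggb_nearest(cfa)
print(rgb[0, 0])  # [200 100  40]
```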
- control module 102 may further process the first image and the at least one second image captured, by subtracting the second pixel value (P2) determined from the first pixel value (P1) determined, such that a resultant image comprising image information in a range of 700nm to 1100nm in a near-infrared (NIR) range is yielded.
- NIR near-infrared
- the resultant image produced is a full resolution NIR image.
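As a minimal sketch of this subtraction (assuming 16-bit pixel data and a saturating clip at zero, neither of which is specified in the disclosure), the OFF-state image is subtracted pixelwise from the ON-state image:

```python
import numpy as np

def extract_nir(img_on, img_off):
    """Return the NIR component: image(ON) - image(OFF), clipped at 0
    so that noise in the two frames cannot produce negative pixels."""
    diff = img_on.astype(np.int32) - img_off.astype(np.int32)
    return np.clip(diff, 0, None).astype(np.uint16)

img_on = np.array([[120, 130], [140, 150]], dtype=np.uint16)   # RGB + NIR
img_off = np.array([[100, 100], [100, 160]], dtype=np.uint16)  # RGB only
nir = extract_nir(img_on, img_off)
print(nir)  # [[20 30] [40 0]]
```

Because every pixel of both frames participates, the result keeps the sensor's full resolution, which is the point of the single-sensor ON/OFF scheme.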
- the advantage of processing full resolution NIR image for eye tracking is of importance in the field of machine vision applications, in particular where eye tracking is applicable.
- applicable eye tracking functions include estimation of line of sight of a user, for purposes of virtual reality (VR) applications; estimation of eye position of a user, for purposes of augmented reality (AR) applications; and estimation of a state of a user, for purposes of driver monitoring applications.
- the aforesaid configuration addresses some of the problems in eye tracking image processing systems, i.e., the lack of high-quality images to accurately estimate the position of the eyes.
- RGB imaging pixels and NIR imaging pixels share the same exposure time during image capturing, but with different quantum efficiency (QE) and irradiance at the pixel surface.
- QE quantum efficiency
- the brightness difference between visible light optical power and NIR light optical power can lead to poor image quality.
- the brightness of the NIR illumination module needs to be adjusted. This can be achieved by using a driver chip with an analog dimming function.
- analog dimming is not controllable via direct inter-chip communication.
- a control module such as a controllable DAC chip or a pulse width modulation (PWM) based resistor-capacitor (RC) circuit, RC network or RC filter, for generating an analog signal may be necessary to control this analog dimming function.
- PWM pulse width modulation
- control module 102 of this disclosure executes calculation of a ratio between an average value of the first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state and an average value of the second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state, and controls an analog dimming function of the imaging module 104 in response to the ratio calculated.
- the calculated ratio is used as additional feedback to control the analog dimming settings for the subsequent image frame.
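A hedged sketch of that feedback loop follows; the target ratio, the 0..1 drive-level range and the function name are illustrative assumptions, not values from the disclosure. The previous frame pair's P1/P2 ratio is compared against a target, and the drive level (e.g. a DAC code or PWM duty cycle) for the next frame is scaled toward it.

```python
import numpy as np

def next_dimming_level(img_on, img_off, level, target_ratio=2.0):
    """Scale the illumination drive level for the next frame so that the
    average ON/OFF intensity ratio approaches target_ratio."""
    p1 = float(img_on.mean())       # average ON-state pixel intensity
    p2 = float(img_off.mean())      # average OFF-state pixel intensity
    ratio = p1 / max(p2, 1e-6)      # guard against division by zero
    new_level = level * (target_ratio / max(ratio, 1e-6))
    return min(max(new_level, 0.0), 1.0), ratio  # clamp to [0, 1]

img_on = np.full((4, 4), 150.0)
img_off = np.full((4, 4), 100.0)
level, ratio = next_dimming_level(img_on, img_off, level=0.5)
print(round(ratio, 2), round(level, 3))  # 1.5 0.667
```

Here the measured ratio (1.5) is below the target (2.0), so the drive level is raised for the subsequent frame, the frame-to-frame feedback behaviour the text describes.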
- an NIR illumination module may not be operable to supply sufficient brightness to meet an expected ratio to yield a full resolution resultant image.
- as a mitigation solution, an optical neutral density (ND) filter (not shown) may be combined with an NIR long-pass filter to reduce the ambient visible lighting brightness while maintaining the NIR light brightness.
- referring to FIG. 4a-c, the principles of generating full resolution NIR images at step 308 can be achieved using pixelwise processing.
- an image represented by FIG. 4a, containing image information in RGB wavelength and IR wavelength, is the minuend, while a corresponding image represented by FIG. 4b, containing image information in RGB wavelength only, is the subtrahend.
- the respective images may be processed by the image processing system 100 at steps 302 and step 308.
- the image processing system is operable to consecutively capture images in different wavelengths, thus processing at least two images of different distinct wavelengths within an image frame interval.
- the image processing method and system as disclosed herein yield full resolution images under different ambient lighting conditions, to achieve accurate eye tracking.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Combustion & Propulsion (AREA)
- Chemical & Material Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Ophthalmology & Optometry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computing Systems (AREA)
- Optics & Photonics (AREA)
- Studio Devices (AREA)
- Eye Examination Apparatus (AREA)
Abstract
A method of processing images for eye tracking is disclosed. The method comprises capturing at least one image, by way of an imaging module, when an illumination module is operating in an ON state, and consecutively capturing at least one image, by way of an imaging module, when the illumination module is operating in an OFF state. The method includes receiving and processing, by way of an image processing unit, the at least one image captured when the illumination module is operating in the ON state; and the at least one image captured when the illumination module is operating in the OFF state, such that one or more resultant images suitable for eye tracking are obtained. The one or more resultant image(s) comprise(s) image information of at least one eye and of a selected range of wavelength. A system and a computer program product are also disclosed.
Description
AN IMAGE PROCESSING SYSTEM AND METHOD THEREOF
TECHNICAL FIELD
This disclosure relates to image processing, and more in particular, a method, system and device for processing images for eye tracking commonly used in motor vehicle applications such as driver monitoring.
BACKGROUND
Driver monitoring systems (DMS) have been used in the automotive industry for determining the status of operators for some years. A DMS uses identification of a driver’s facial characteristics, for example eye movement or head position, to determine the status of the operator.
Increasingly, other types of automotive monitoring systems such as cabin monitoring systems (CMS) are necessitated; a CMS monitors an entire passenger cabin to determine, for example, the number of passengers onboard, intruders while the vehicle is parked and/or potential attacks such as robbery. Such applications require image analyses to identify different types of objects within the passenger cabin.
Due to the nature of lighting conditions within a passenger cabin of motor vehicles, it is always challenging to obtain clear, full resolution images for such analysis. Further, different types of objects within a passenger cabin have different texture and depth, which increases the complexity of identifying different types of objects within a passenger cabin.
There is therefore a need to provide a method and system for processing images suitable for eye tracking, that overcomes, or at least ameliorates, the problems described above. Furthermore, other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background of the disclosure.
SUMMARY
A purpose of this disclosure is to ameliorate the problem of obtaining full resolution images for eye tracking, by providing the subject-matter of the independent claims.
Further, a purpose of this disclosure is to ameliorate the problem of identifying different types of objects captured in a monitoring system, by providing the subject-matter of the independent claims.
The objective of this disclosure is solved by a method of processing images for eye tracking, the method comprising: capturing at least one image, by way of an imaging module, when an illumination module is operating in an ON state; and consecutively capturing at least one image, by way of an imaging module, when the illumination module is operating in an OFF state; characterized by: receiving and processing, by way of an image processing unit, the at least one image captured when the illumination module is operating in the ON state; and the at least one image captured when the illumination module is operating in the OFF state; such that one or more resultant images suitable for eye tracking are obtained wherein the one or more resultant images comprises image information
- of at least one eye and
of a selected range of wavelength.
An advantage of the above described aspect of this disclosure is that it yields a method of processing images suitable for eye tracking, of which image information obtained is selected from a range of wavelengths, such that specific details of image information may be analysed for eye tracking.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: the image information of the one or more resultant images comprises at least one range of wavelengths selected from the group consisting of:
• 380nm to 800nm in a red, blue, green (RGB) range;
• 700nm to 1100nm in a near-infrared (NIR) range; and
• 400nm to 700nm in a visible light range.
The advantage of the above aspect of this disclosure is to select image information falling within a selected scope of range of wavelengths, such that the one or more resultant images processed will only contain image information between 380nm to 1100nm, and preferably between (1) 380nm to 800nm in a RGB range; (2) 700nm to 1100nm in a NIR range; and (3) 400nm to 700nm in a visible light range.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: determining, by way of the image processing unit, a first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state; and determining, by way of the image processing unit, a second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state.
The advantage of the above aspect of this disclosure is to alternately capture images with controllable amplitude, such that image information is consecutively captured in different selected wavelengths.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: calculating a ratio between an average value of the first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state; and an average value of the second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state; and controlling an analog dimming function of the imaging module in response to the ratio calculated.
The advantage of the above aspect of this disclosure is to control analog dimming in response to a calculated ratio, where the ratio is between an average value of a first pixel value of the first image captured and an average value of a second pixel value of the second image consecutively captured. An advantage of obtaining this calculated ratio is that the brightness of the lighting module can be adjusted according to the calculated ratio, as a form of feedback information between a previous frame and a subsequent frame, to achieve precise analog dimming control.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: obtaining the one or more resultant images comprising an image in a near-infrared (NIR) wavelength by subtracting the second pixel intensity value (P2), determined according to the at least one image captured when the illumination module is operating in the OFF state, from the first pixel intensity value (P1), determined according to the at least one image captured when the illumination module is operating in the ON state.
The advantage of the above aspect of this disclosure is to obtain one or more resultant images containing image information in the NIR wavelength, or within a wavelength range of 700nm to 1100nm, by applying an image subtraction method. An advantage of obtaining a full resolution resultant image containing image information in the NIR wavelength, or 700nm to 1100nm, is to perform further eye tracking analysis in the aforesaid wavelength range.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: aligning a first image frame captured and a second image frame captured consecutively by: determining, by way of the image processing unit, at least one identical feature between the first image frame captured and the second image frame captured consecutively; and matching the at least one identical feature determined between the first image frame captured and the second image frame captured consecutively, prior to obtaining the one or more resultant images.
The advantage of the above aspect of this disclosure is to process images captured by the imaging module, by feature matching and image alignment, to produce full resolution resultant images. In particular, this aspect is advantageous for sensing fast-moving objects.
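The disclosure aligns consecutive frames by matching identical features. As an illustrative sketch only — not the claimed feature-matching method — translation-only alignment between two consecutive frames can be estimated with phase correlation in NumPy; the function name, epsilon and integer-shift assumption are choices made for this example:

```python
import numpy as np

def align_translation(frame_a, frame_b):
    """Estimate the integer (dy, dx) shift of frame_b relative to frame_a
    by phase correlation, then shift frame_b back into register with frame_a."""
    Fa = np.fft.fft2(frame_a)
    Fb = np.fft.fft2(frame_b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12            # phase-only (normalised) spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # peaks past the midpoint correspond to negative shifts (circular FFT)
    if dy > frame_a.shape[0] // 2:
        dy -= frame_a.shape[0]
    if dx > frame_a.shape[1] // 2:
        dx -= frame_a.shape[1]
    aligned = np.roll(frame_b, (dy, dx), axis=(0, 1))
    return (dy, dx), aligned
```

For rotation or perspective motion between frames, a full feature-matching pipeline (keypoint detection, matching, and homography estimation) would replace this translation-only estimate.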
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: the eye tracking is selected from the group consisting of: estimation of line of sight of a user, for purposes of virtual reality (VR) applications; estimation of eye position of a user, for purposes of augmented reality (AR) applications; and estimation of a state of a user, for purposes of driver monitoring applications.
The advantage of the above aspect of this disclosure is to yield resultant images suitable for eye tracking, for purposes of virtual reality (VR) applications; augmented reality (AR) applications and driver monitoring applications.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: obtaining the one or more resultant images comprising an image in a red, blue, green (RGB) wavelength and an infrared (IR) wavelength by: capturing the at least one image, by way of the imaging module, when the illumination module is operating in the ON state; and consecutively obtaining the one or more resultant images comprising an image in a red, blue, green (RGB) wavelength and a visible light wavelength by capturing the at least one image, by way of the imaging module, when the illumination module is operating in the OFF state.
The advantage of the above aspect of this disclosure is to obtain one or more resultant images containing image information in the RGB and IR wavelengths, by controlling a status of the illumination module, to process at least two images consecutively, in at least two different types of wavelength, and more in particular within a range of 380nm to 800nm. An advantage of obtaining a full resolution resultant image containing image information in the RGB and IR wavelengths, or within a range of 380nm to 1100nm, is to perform further eye tracking analysis in the aforesaid wavelength range.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: subtracting, by way of the image processing unit, a visible light wavelength from the one or more resultant images obtained.
The advantage of the above aspect of this disclosure is to apply an image subtraction method to remove the visible light wavelength, such that the one or more resultant images contain image information within a range of selected wavelengths without visible light. A further advantage of this aspect of this disclosure is the suppression of undesirable noise signals.
The objective of this disclosure is solved by an image processing system for processing images for an eye tracking function, the system comprising: an illumination module operable to operate between an ON state and an OFF state; an imaging module operable to capture at least one image when the illumination module operates in the ON state; and to consecutively capture at least one image when the illumination module operates in the OFF state; and an image processing unit operable to process information of the at least one image captured; characterized in that: the image processing unit is further operable to obtain one or more resultant images in a selected group of wavelengths, such that the one or more resultant images are suitable for eye tracking.
The above-described aspect of this disclosure yields an image processing system for processing images suitable for eye tracking, in which the image information obtained is selected from a range of wavelengths, such that specific details of the image information may be analysed for eye tracking.
Preferred is an image processing system as described above or as described above as being preferred, in which: the imaging module is selected from the group consisting of: an image sensor operable to capture images in a combination of RGB wavelength and IR wavelength; and a global shutter sensor.
The advantage of the above aspect of this disclosure is to yield an image processing system using only a single image sensor configuration suitable for capturing full resolution images in multiple wavelengths. Advantageously, this yields an image processing system which requires minimal hardware.
Preferred is an image processing system as described above or as described above as being preferred, in which: the image information of the one or more resultant images comprises at least one range of wavelengths of the group comprising:
• 380 nm to 800 nm in a red, blue, green (RGB) range;
• 700nm to 1100nm in a near-infrared (NIR) range; and
• 400nm to 700nm in a visible light range.
The advantage of the above aspect of this disclosure is to yield multiple images, each image containing information in a different wavelength, which may be processed by the image processing system disclosed herein.
Preferred is an image processing system as described above or as described above as being preferred, in which: the illumination module is selected from the group consisting of:
• light-emitting diode (LED); and
• vertical cavity surface emitting laser (VCSEL).
The advantage of the above aspect of this disclosure yields different types of illumination module suitable for use in the image processing system disclosed herein.
The objective of this disclosure is solved by a computer program product comprising instructions to cause the image processing system as described above or as described above as being preferred to execute the steps of the method as described above or as described above as being preferred.
The advantage of the above aspect of this disclosure is to yield a computer program product to cause the image processing system to execute the steps of the method as disclosed herein for processing images captured.
The objective of this disclosure is solved by a computer-readable medium having stored thereon the computer program product as described above or as described above as being preferred.
The advantage of the above aspect of this disclosure is to yield a computer-readable medium for storing the computer program product, to cause the image processing system to execute the steps of the method as disclosed herein for processing images captured.
BRIEF DESCRIPTION OF DRAWINGS
Other objects and aspects of this disclosure will become apparent from the following description of embodiments with reference to the accompanying drawings in which:
FIG. 1 shows a system block diagram in accordance with a preferred embodiment.
FIG. 2 shows exemplary image frame intervals of images captured in accordance with a preferred embodiment.
FIG. 3 shows a flowchart 300 for processing images for eye tracking in a preferred embodiment.
FIG. 4a-c show an exemplary frame-by-frame image subtraction process in a preferred embodiment. In various embodiments described by reference to the above figures, like reference signs refer to like components in several perspective views and/or configurations.
DETAILED DESCRIPTION OF EMBODIMENTS
The following detailed description is merely exemplary in nature and is not intended to limit the disclosure or the application and uses of the disclosure. Furthermore, there is no intention to be bound by any theory presented in the preceding background of the disclosure or the following detailed description. It is the intent of this disclosure to present an image processing method and system which yields full resolution images for eye tracking purposes.
Hereinafter, the terms “first”, “second”, “third” and the like used in the context of this disclosure may refer to modifications of different elements in accordance with various exemplary embodiments, but are not limited thereto. The expressions may be used to distinguish one element from another element, regardless of sequence or importance. By way of an example, “a first image” and “a second image” may indicate different images regardless of order or importance. On a similar note, a first image may be referred to as the second image and vice versa without departing from the scope of this disclosure.
FIG. 1 shows a system block diagram 100 in accordance with a preferred embodiment. In particular, system 100 shows an image processing system for eye tracking functions. In an embodiment, system 100 includes a control module 102 for executing image processing functions and an imaging module 104 for executing imaging functions. The imaging module 104 further includes an image sensor 106 with a lens or imaging optics 110 for receiving light rays, and an illumination module 108 having illumination optics 112. The imaging module 104 may include circuitry, for example a driver circuit and/or a digital-to-analog converter (DAC), for driving the imaging module 104. The control module 102 and the imaging module 104 may be in electrical communication.
As shown in FIG. 1, the control module 102 is a system-on-chip (SoC) operable to control the entire image processing system 100 and execute algorithms. The image sensor 106 may be a global shutter type image sensor. Examples of imaging optics 110 may include lenses and/or optical filters suitable for working with the selected wavelengths operable by the image sensor 106. As shown in FIG. 1, the image processing system 100 further includes the illumination module 108. Suitable types of illumination module 108 include light emitting diodes (LED) and vertical cavity surface emitting lasers (VCSEL). The image processing system 100 may further include illumination optics 112. Examples of illumination optics 112 include a diffusor or a reflector. In an embodiment, the image sensor 106 of the imaging module is operable to capture a first image when the illumination module 108 is operating in an ON state, and to consecutively capture at least one second image when the illumination module 108 is operating in an OFF state. By alternately switching the illumination module 108 between an ON mode and an OFF mode, the image sensor 106 will receive sensor signals within a certain wavelength at a predetermined time interval. This process of capturing a first image when the illumination module 108 is operating in an ON state, and consecutively capturing at least one second image when the illumination module 108 is operating in an OFF state, may be adjusted or designed according to exposure time and gain settings based on image processing requirements. In this disclosure, the focus shall relate to eye tracking functions.
FIG. 2 shows exemplary image frame intervals of images captured, or a switching logic mode 200 in accordance with an exemplary embodiment. The aforesaid switching logic mode can be achieved by switching signals generated by either a controller or using logic circuit chips.
As shown in FIG. 2, when the illumination module 108 of the image processing system 100 is operating in the ON mode, a first image captured contains a combination of selected wavelengths, or at least two selected wavelengths. When the illumination module 108 is operating in the OFF mode, a second image consecutively captured contains only RGB wavelengths. A main advantage of this switching logic mode 200 is that two distinct resultant images, each comprising image information of a selected range of wavelength, may be processed by the image processing system 100 within a single image frame interval 202.
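The switching logic above can be sketched as a simple capture schedule. This sketch assumes one image frame interval holds exactly one ON-state capture (RGB+IR) followed by one OFF-state capture (RGB only); the frame indices and labels are illustrative, not taken from the disclosure:

```python
def switching_schedule(n_intervals):
    """Build a capture schedule mirroring switching logic mode 200:
    each image frame interval holds two consecutive captures, one with
    the illumination module ON and one with it OFF."""
    schedule = []
    for k in range(n_intervals):
        schedule.append((2 * k, "ON", "RGB+IR"))    # first capture of interval k
        schedule.append((2 * k + 1, "OFF", "RGB"))  # consecutive capture of interval k
    return schedule
```

In hardware, the same alternation would be produced by switching signals from a controller or logic circuit chips, as the description notes.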
In this exemplary embodiment, the image sensor 106 is operable to detect a combination of at least two types of wavelengths. A suitable example of an image sensor 106 operable to capture sensing signals in dual wavelengths is an image sensor operable to sense red, blue, green (RGB) and infrared (IR) wavelengths. In this exemplary embodiment, the illumination module 108 is a near infrared (NIR) light source. An advantage of this embodiment is to produce high resolution images captured under dimly lit ambient conditions. An exemplary scenario would be capturing images for the eye tracking function of an operator sitting within the interior of a motor vehicle operating at night.
FIG. 3 shows a flowchart 300 for processing images for eye tracking in a preferred embodiment. At step 302, the control module 102 executes a command causing the illumination module 108 to operate in the ON mode, and the imaging module 104 to capture a first image. In this embodiment, because the illumination module 108 is operating in the ON mode, the image information captured in the first image comprises image information in dual wavelengths.
The first image may comprise image information in a range of 400nm to 700nm, or a visible light range, and may further comprise image information in a range of 700nm to 1100nm, in a NIR range. Accordingly, the control module 102 is operable to determine a pixel value (P1) of the first image, which may include both visible light optical power and NIR light optical power.
At the next step 304, the control module 102 consecutively executes a command causing the illumination module 108 to operate in the OFF mode, and the imaging module 104 to consecutively capture at least one second image. When the illumination module 108 is operating in the OFF mode, the image information captured in the at least one second image comprises image information in a single wavelength range, i.e. a range of 400nm to 700nm, or a visible light range. Accordingly, the control module 102 may determine a pixel value (P2) of the second image, which will include visible light optical power only.
Optionally, at step 306, the control module 102 may further process the at least one second image captured when the illumination module 108 is operating in the OFF mode, to obtain a high-resolution colour image containing image information in 380 nm to 800 nm, in a red, blue, green (RGB) range only. An example of a suitable image processing step may be a demosaicing algorithm operating on the colour filter array or colour filter mosaic.
At step 308, the control module 102 may further process the first image and the at least one second image captured, by subtracting the second pixel value (P2) determined from the first pixel value (P1) determined, such that a resultant image comprising image information in a range of 700nm to 1100nm, in a near-infrared (NIR) range, is yielded. Advantageously, the resultant image produced is a full resolution NIR image. The ability to process a full resolution NIR image for eye tracking is of importance in the field of machine vision applications, in particular where eye tracking is applicable. Applicable eye tracking functions include, by way of example: estimation of line of sight of a user, for purposes of virtual reality (VR) applications; estimation of eye position of a user, for purposes of augmented reality (AR) applications; and estimation of a state of a user, for purposes of driver monitoring applications.
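The subtraction at step 308 can be sketched in a few lines of NumPy. This sketch assumes 8-bit frames and follows the convention of the pixelwise formulas accompanying FIG. 4 (ON-state frame minus OFF-state frame); clipping negative differences to zero is an added assumption to handle sensor noise, not a step stated in the disclosure:

```python
import numpy as np

def extract_nir(frame_on, frame_off):
    """Subtract the OFF-state capture (visible light only) from the
    ON-state capture (visible + NIR) to recover a full-resolution NIR
    image; negative differences caused by noise are clipped to zero."""
    diff = frame_on.astype(np.int32) - frame_off.astype(np.int32)  # widen to avoid uint8 wraparound
    return np.clip(diff, 0, 255).astype(np.uint8)
```

Widening to a signed type before subtracting matters: subtracting `uint8` arrays directly would wrap around instead of producing negative values.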
For clarity and brevity, the principles of the image subtraction process in this disclosure are explained in detail below. As mentioned above, one of the advantages of using a single image sensor 106 capturing RGB sensing signals and IR sensing signals with an NIR illumination module 108 configuration is to capture images for the eye tracking function under dimly lit conditions.
The aforesaid configuration addresses some of the problems in eye tracking image processing systems, i.e., the lack of high-quality images to accurately estimate the position of the eyes.
Under well-lit conditions, for example during daytime, visible light imaging pixels and NIR imaging pixels share the same exposure time during image capturing, but with different quantum efficiency (QE) and irradiance at the pixel surface. The brightness difference between visible light optical power and NIR light optical power can lead to poor image quality. To counter the effects leading to poor image quality, the brightness of the NIR illumination module needs to be adjusted. This can be achieved by using a driver chip with an analog dimming function. However, analog dimming is not controllable via direct inter-chip communication. A control module, such as a controllable DAC chip or a pulse width modulation (PWM) based resistor-capacitor (RC) circuit, RC network or RC filter, for generating an analog signal may be necessary to control this analog dimming function.
In contrast, the control module 102 of this disclosure calculates a ratio between an average value of the first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state and an average value of the second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state, and controls an analog dimming function of the imaging module 104 in response to the ratio calculated. The calculated ratio is used as additional feedback to control the analog dimming settings for the subsequent image frame.
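This ratio-based feedback can be sketched as a simple proportional controller, assuming an 8-bit DAC code drives the NIR illumination brightness; the target ratio, proportional gain and DAC range are illustrative assumptions, not values given in the disclosure:

```python
import numpy as np

def dimming_feedback(p1_frame, p2_frame, current_dac,
                     target_ratio=2.0, gain=0.2, dac_max=255):
    """Compute the mean-intensity ratio between the ON-state frame (P1)
    and the OFF-state frame (P2), then nudge the illumination DAC code
    toward a target ratio for the subsequent frame."""
    ratio = float(np.mean(p1_frame)) / max(float(np.mean(p2_frame)), 1e-6)
    error = target_ratio - ratio            # below target -> brighten the NIR source
    new_dac = int(round(current_dac + gain * error * dac_max))
    return ratio, int(np.clip(new_dac, 0, dac_max))
```

In a real system, the updated DAC code would be written to the driver chip before the next ON-state capture, closing the feedback loop between a previous frame and a subsequent frame.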
Under extreme ambient lighting conditions, an NIR illumination module may not be operable to supply sufficient brightness to meet an expected ratio to yield a full resolution resultant image. Under such circumstances, an optical neutral density (ND) filter (not shown) may be combined with an NIR long pass filter to reduce the ambient visible lighting brightness while maintaining the NIR light brightness, as a mitigation solution.
Turning now to FIG. 4a-c, the generation of full resolution NIR images at step 308 can be achieved using pixelwise processing. As shown in FIG. 4a, an image containing image information in the RGB wavelength and IR wavelength is the minuend, while a corresponding image represented by FIG. 4b, containing image information in the RGB wavelength only, is the subtrahend. The respective images may be processed by the image processing system 100 at steps 302 and 308.
Applying the image subtraction algorithm, the pixelwise post-processing formulas can be defined as follows:
● P3(IR)(1,1) = P1(B + IR)(1,1) − P2(B)(1,1)
● P3(IR)(1,2) = P1(G + IR)(1,2) − P2(G)(1,2)
● P3(IR)(1,3) = P1(R + IR)(1,3) − P2(R)(1,3)
● …
● P3(IR)(2,2) = P1(IR)(2,2) − P2(IR)(2,2)
wherein
P = pixel intensity value, e.g., P1 = first pixel intensity value
R = red color value
B = blue color value
G = green color value
IR = infrared value
Thus, it can be seen that an image processing method and system having the advantage of yielding full resolution images in selected wavelengths has been provided. More advantageously, the image processing system is operable to consecutively capture images in different wavelengths, thus processing at least two images of distinct wavelengths within an image frame interval. By determining the pixel intensity value of each image pixel, the image processing method and system as disclosed herein yield full resolution images under different ambient lighting conditions, to achieve accurate eye tracking.
While exemplary embodiments have been presented in the foregoing detailed description of the disclosure, it should be appreciated that a vast number of variations exist. It should further be appreciated that the exemplary embodiments are only examples, and are not intended to limit the scope, applicability, operation, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the disclosure, it being understood that various changes may be made in the function and arrangement of elements and method of operation
described in the exemplary embodiment without departing from the scope of the disclosure as set forth in the appended claims. Reference Signs
Claims
1. A method of processing images for eye tracking, the method comprising: capturing at least one image, by way of an imaging module, when an illumination module is operating in an ON state; and consecutively capturing at least one image, by way of an imaging module, when the illumination module is operating in an OFF state; characterized by: receiving and processing, by way of an image processing unit, the at least one image captured when the illumination module is operating in the ON state; and the at least one image captured when the illumination module is operating in the OFF state; such that one or more resultant images suitable for eye tracking are obtained wherein the one or more resultant images comprises image information
- of at least one eye captured in at least two selected ranges of wavelength, and the method further comprises: subtracting a first pixel value from a first image and a second pixel value from a second image for yielding at least one full resolution resultant image comprising image information in at least one selected range of wavelengths.
2. The method of claim 1, wherein the image information of the one or more resultant images comprises at least one range of wavelengths of the group consisting of:
• 380 nm to 800 nm in a red, blue, green (RGB) range;
• 700nm to 1100nm in a near-infrared (NIR) range; and
• 400nm to 700nm in a visible light range.
3. The method of claims 1 - 2, further comprising: determining, by way of the image processing unit, a first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state; and determining, by way of the image processing unit, a second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state.
4. The method of claim 3, further comprising: calculating a ratio between an average value of the first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state; and an average value of the second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state; and controlling an analog dimming function of the imaging module in response to the ratio calculated.
5. The method of claims 1 - 4, further comprising: obtaining the one or more resultant images comprising an image in a near-infrared (NIR) wavelength by subtracting the second pixel intensity value (P2), determined according to the at least one image captured when the illumination module is operating in the OFF state, from the first pixel intensity value (P1), determined according to the at least one image captured when the illumination module is operating in the ON state.
6. The method according to any one of the preceding claims, further comprising aligning a first image frame captured and a second image frame captured consecutively by: determining, by way of the image processing unit, at least one identical feature between the first image frame captured and the second image frame captured consecutively; and matching the at least one identical feature determined between the first image frame captured and the second image frame captured consecutively, prior to obtaining the one or more resultant images.
7. The method according to any one of the preceding claims, wherein the eye tracking is selected from the group consisting of: estimation of line of sight of a user, for purposes of virtual reality (VR) applications; estimation of eye position of a user, for purposes of augmented reality (AR) applications; and estimation of a state of a user, for purposes of driver monitoring applications.
8. The method according to any one of the preceding claims, further comprising: obtaining the one or more resultant images comprising an image in a red, blue, green (RGB) wavelength and an infrared (IR) wavelength by: capturing the at least one image, by way of the imaging module, when the illumination module is operating in the ON state; and consecutively obtaining the one or more resultant images comprising an image in a red, blue, green (RGB) wavelength and a visible light wavelength by capturing the at least one image, by way of the imaging module, when the illumination module is operating in the OFF state.
9. The method according to any one of the preceding claims, further comprising: subtracting, by way of the image processing unit, a visible light wavelength from the one or more resultant images obtained.
10. An image processing system for processing images for eye tracking function, the system comprising: an illumination module operable to operate between an ON state and an OFF state; an imaging module operable to capture at least one image when the illumination module operates in the ON state; and to consecutively capture at least one image when the illumination module operates in the OFF state; and a processing unit operable to process information of the at least one image captured; characterized in that: the image processing unit is further operable to receive and process the at least one image captured when the illumination module is operating in the ON state; and the at least one image captured when the illumination module is operating in the OFF state; such that one or more resultant images suitable for eye tracking are obtained wherein the one or more resultant images comprises image information
- of at least one eye captured in
- at least two selected ranges of wavelength.
11. The system of claim 10, wherein the imaging module is selected from the group consisting of: an image sensor operable to capture images in a combination of RGB wavelength and IR wavelength; and a global shutter sensor.
12. The system of claims 10-11, wherein the image information of the one or more resultant images comprises at least one range of wavelengths of the group comprising:
• 380 nm to 800 nm in a red, blue, green (RGB) range;
• 700nm to 1100nm in a near-infrared (NIR) range; and
• 400nm to 700nm in a visible light range.
13. The system of claims 10-12, wherein the illumination module is selected from the group consisting of:
• light-emitting diode (LED); and
• vertical cavity surface emitting laser (VCSEL).
14. A computer program product comprising instructions to cause the image processing system of claims 10 - 13 to execute the steps of the method of claims 1 to 9.
15. A computer-readable medium having stored thereon the computer program product of claim 14.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2113545.4A GB2611289A (en) | 2021-09-23 | 2021-09-23 | An image processing system and method thereof |
PCT/EP2022/073654 WO2023046406A1 (en) | 2021-09-23 | 2022-08-25 | An image processing system and method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4405783A1 true EP4405783A1 (en) | 2024-07-31 |
Family
ID=78399700
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP22769591.3A Pending EP4405783A1 (en) | 2021-09-23 | 2022-08-25 | An image processing system and method thereof |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4405783A1 (en) |
GB (1) | GB2611289A (en) |
WO (1) | WO2023046406A1 (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7522344B1 (en) * | 2005-12-14 | 2009-04-21 | University Of Central Florida Research Foundation, Inc. | Projection-based head-mounted display with eye-tracking capabilities |
EP3187100A4 (en) * | 2014-08-29 | 2018-05-09 | Alps Electric Co., Ltd. | Line-of-sight detection device |
WO2017134918A1 (en) * | 2016-02-01 | 2017-08-10 | アルプス電気株式会社 | Line-of-sight detection device |
US10594974B2 (en) * | 2016-04-07 | 2020-03-17 | Tobii Ab | Image sensor for vision based on human computer interaction |
US20220377223A1 (en) * | 2019-11-07 | 2022-11-24 | Seeing Machines Limited | High performance bright pupil eye tracking |
-
2021
- 2021-09-23 GB GB2113545.4A patent/GB2611289A/en active Pending
-
2022
- 2022-08-25 EP EP22769591.3A patent/EP4405783A1/en active Pending
- 2022-08-25 WO PCT/EP2022/073654 patent/WO2023046406A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
GB202113545D0 (en) | 2021-11-10 |
WO2023046406A1 (en) | 2023-03-30 |
GB2611289A (en) | 2023-04-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20240423 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAV | Request for validation of the European patent (deleted) | |
| DAX | Request for extension of the European patent (deleted) | |