
CN113678138A - Biometric imaging apparatus and biometric imaging method for capturing image data of a body part of a person capable of improving image data quality - Google Patents


Info

Publication number
CN113678138A
Authority
CN
China
Prior art keywords
image data
body part
biometric imaging
capturing
imaging device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080026328.4A
Other languages
Chinese (zh)
Inventor
J·贝格维斯特
A·赫伯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kunkang Innovation Laboratory Co ltd
Original Assignee
Kunkang Innovation Laboratory Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kunkang Innovation Laboratory Co ltd
Publication of CN113678138A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
        • G06V10/00 Arrangements for image or video recognition or understanding
            • G06V10/10 Image acquisition
            • G06V10/12 Details of acquisition arrangements; Constructional details thereof
            • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
            • G06V10/143 Sensing or illuminating at different wavelengths
        • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
                • G06V40/107 Static hand or arm
                • G06V40/11 Hand-related biometrics; Hand pose recognition
                • G06V40/14 Vascular patterns
            • G06V40/60 Static or dynamic means for assisting the user to position a body part for biometric acquisition
                • G06V40/67 Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Image Input (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A biometric imaging device (10) for capturing image data of a body part (4) of a person, the biometric imaging device (10) comprising at least one of a visible light sensor (101) for capturing image data of the body part in the visible spectrum and a near infrared light sensor (102) for capturing image data of the body part in the near infrared spectrum. The biometric imaging device (10) includes a time-of-flight camera (103) configured to capture three-dimensional image data of the body part (4) of the person. The biometric imaging device (10) is configured to perform an imaging procedure comprising the steps of: capturing three-dimensional image data of a current body part pose; determining a difference between a desired body part pose and the current body part pose based on the three-dimensional image data; providing user guidance to the person based on the determined difference, enabling the person to adjust the body part pose in the direction of the desired pose; and capturing at least one of image data in the visible spectrum and image data in the near infrared spectrum.

Description

Biometric imaging apparatus and biometric imaging method for capturing image data of a body part of a person capable of improving image data quality
Technical Field
The present disclosure relates to a biometric imaging device and a biometric imaging method for capturing image data of a body part of a person. In particular, the present disclosure relates to a biometric imaging apparatus and a biometric imaging method for capturing image data of a body part of a person, which can improve the quality of the image data.
Background
Biometric authentication devices, which include biometric imaging devices for capturing image data of a body part of a person, are widely used for person authentication, for example in the context of access control to resources (e.g., buildings, rooms, computers, smartphones, electronic banking accounts, voting systems, school or university examinations, border crossings, corporate registries, etc.).
In some embodiments, the biometric imaging device is configured to capture image data of a body part of a person (e.g., the person's hand) such that individual, distinctive biometric features can be determined from the captured image data. The image data of the person's hand or body part may be captured using a near infrared light sensor (e.g., 700 nm to 900 nm), a visible light sensor (e.g., 400 nm to 600 nm), or a combination thereof. The biometric features determined from the image data may relate to the vein pattern, palm print, life line, etc. of the hand. Image data captured in near infrared light enables the determination of features relating to the vein pattern of the hand. Image data captured in visible light enables the determination of features relating to the palm print and life line of the hand.
Person authentication is based on pre-stored biometric features which are registered under the control of a qualified and trustworthy authority. For example, the authority verifies the identity of a person based on an identification document (e.g., a passport). Various image data of the person's hand are captured, and biometric features of the person's hand or body part are determined from the captured image data. The determined biometric features are stored in a database as pre-stored biometric features. In some embodiments, the pre-stored biometric features may partially or completely comprise the captured image data. The determined biometric features of the hand or body part, the captured image data of the hand, or a combination thereof may relate to a vein pattern, a palm print, a life line, and the like.
Later, when person authentication is required, the biometric authentication device captures image data of the person's hand or body part. Biometric features of the hand or body part are determined and compared with the pre-stored biometric features and/or pre-stored image data. If a match is found among the pre-stored biometric features, the person is authenticated; otherwise authentication is denied.
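The compare-against-enrolled-templates step can be illustrated with a minimal sketch. The patent does not specify a matching algorithm; the fixed-length feature vectors, the Euclidean distance measure, and the threshold below are illustrative assumptions only:

```python
def matches(probe, enrolled, max_dist=0.4):
    """Naive matcher: accept if the Euclidean distance between two
    fixed-length biometric feature vectors is below a threshold.
    Real systems use far more robust template matching."""
    d = sum((a - b) ** 2 for a, b in zip(probe, enrolled)) ** 0.5
    return d <= max_dist

def authenticate(probe, database):
    """Authentication passes if any pre-stored template matches."""
    return any(matches(probe, t) for t in database)
```

For example, `authenticate([0.0, 0.0], [[1.0, 0.0], [0.05, 0.05]])` accepts because the second stored template lies within the distance threshold.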
To achieve repeatable results and sufficient authentication accuracy, biometric imaging devices must capture high quality image data of a person's body part or hand in the visible and near infrared spectrum. In particular, illumination with visible light and near infrared light is required to have high homogeneity and uniform intensity. Capturing image data of a body part of a person requires that high quality image data be provided under a wide variety of environmental conditions. For example, in a retrofit installation, the previously installed illumination may not be optimally designed for the biometric imaging device. To achieve the desired ease of use, biometric imaging devices are typically mounted so that a person can move the body part into a comfortable position. For example, a person may wish to hold a flat hand in a pose that is horizontal relative to the ground, or in a pose inclined by up to 45° relative to the ground. However, backlight may severely impact the quality of image data captured with a biometric imaging device that has been installed to provide the desired level of ease of use. Other applications of biometric imaging devices relate to laptops or smartphones (use in daytime, sunny, rainy, or night-time conditions). The captured image data must also have high quality under such widely varying environmental conditions. While it may suffice to determine biometric features related only to a sub-area of the palm of the hand, capturing high quality image data becomes particularly difficult when biometric features of the whole hand and fingers are to be determined, due to the absence of backlight coverage. An important quality criterion of the captured image data relates to the so-called "region of interest", which must be clearly identifiable.
This leads to the requirement that the image data must have a high contrast at the edge of the hand in order to unambiguously and reproducibly determine the contour of the hand.
US2005286744A1 discloses capturing images of the palm of a hand. A front guide for supporting the wrist is provided. The front guide naturally leads the palm into the image capturing area of the sensor unit, so that the palm can be positioned correctly.
US2006023919A1 discloses providing guidance so that image capture of biometric information is performed properly. The image capturing apparatus performs a plurality of image capturing operations (including distance measurement) at short intervals, and a guidance screen is displayed according to the analysis result. The guidance includes messages such as "please place your hand on the authentication device again", "your hand is too far away", "open your palm", and "please hold your hand parallel to the device". The guidance is disclosed in combination with mechanical guides.
Disclosure of Invention
It is an object of the present invention to provide a biometric imaging apparatus and a biometric imaging method which do not have at least some of the disadvantages of the prior art. In particular, it is an object of the present invention to provide a biometric imaging apparatus and a biometric imaging method capable of improving the quality of image data. In particular, it is an object of the present invention to provide a biometric imaging apparatus and a biometric imaging method which are capable of improving image data quality under difficult backlight conditions and without any mechanical assistance means (e.g. robot guidance).
At least one object of the invention is achieved by a biometric imaging apparatus and a biometric imaging method as defined in the appended independent claims. Further embodiments of the invention are set forth in the dependent claims.
At least one object of the invention is achieved by a biometric imaging device for capturing image data of a body part of a person, comprising at least one of a visible light sensor for capturing image data of the body part in the visible spectrum and a near infrared light sensor for capturing image data of the body part in the near infrared spectrum. The biometric imaging device includes a time-of-flight camera configured to capture three-dimensional image data of the body part of the person. The biometric imaging device is configured to perform an imaging procedure comprising the steps of: capturing three-dimensional image data of a current body part pose; determining a difference between the desired body part pose and the current body part pose based on the three-dimensional image data; providing user guidance to the person based on the determined difference, enabling the person to adjust the body part posture in the direction of the desired posture; and capturing at least one of image data in the visible spectrum and image data in the near infrared spectrum. The desired body part posture may take into account difficult backlight conditions. The desired body part posture may be determined dynamically, for example from the captured three-dimensional image data. The desired body part posture may take into account factors such as the position of the light source, the brightness of the light source, and so on. Without further assistance such as mechanical guidance, the user is able to adjust the body part posture toward the desired body part posture, and the quality of the captured image data is improved.
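One iteration of the imaging procedure above can be sketched as follows. The `Pose` representation (distance and tilt only), the tolerance values, and the instruction strings are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    distance_mm: float   # distance from the sensor to the palm centre
    tilt_deg: float      # inclination of the palm relative to the sensor plane

def pose_difference(current: Pose, desired: Pose) -> Pose:
    """Difference the user still has to correct to reach the desired pose."""
    return Pose(desired.distance_mm - current.distance_mm,
                desired.tilt_deg - current.tilt_deg)

def guidance_text(diff: Pose, dist_tol=10.0, tilt_tol=5.0) -> str:
    """Map the pose difference to a simple user instruction; within
    tolerance, the 2D (visible/NIR) capture can be triggered."""
    if abs(diff.distance_mm) > dist_tol:
        return "move closer" if diff.distance_mm < 0 else "move further away"
    if abs(diff.tilt_deg) > tilt_tol:
        return "tilt your hand"
    return "hold still"
```

For example, a hand measured at 320 mm when 250 mm is desired yields a negative distance difference and the instruction "move closer".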
The capture of image data in the visible spectrum and/or in the near infrared spectrum may be optimized based on the three-dimensional image data captured by the time-of-flight camera. For example, the optimization may relate to the focal distance and/or depth of field of the visible light sensor, or to the focal distance and/or depth of field of the near infrared light sensor. For example, whether image data is captured on the palm side or the back side of the hand, the optimization may allow wrinkles or skin folds to be captured at a detailed resolution, for example resolving the life line on the palm side of the hand in more detail. The optimization based on the three-dimensional image data may involve determining a desired body part posture at a predetermined distance from the visible light sensor and/or the near infrared light sensor, thereby taking the focal distance and/or the depth of field into account. Thus, no expensive lenses, such as sufficiently fast electrically adaptive liquid lenses, are required. Furthermore, no computationally complex real-time frequency analysis of the image data in the visible or near infrared spectrum is needed. Furthermore, the three-dimensional image data captured by the time-of-flight camera enables accurate measurement of the absolute dimensions of a body part (e.g., a hand).
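Two of the optimizations described above can be sketched as follows: measuring absolute size from the ToF depth map, and checking that the hand sits inside a fixed depth of field. The pinhole-camera model and the focus and depth-of-field values are illustrative assumptions:

```python
def absolute_width_mm(width_px: float, depth_mm: float, focal_px: float) -> float:
    """Absolute object size from the ToF depth map via the pinhole model:
    real size = pixel extent * depth / focal length (focal length in pixels)."""
    return width_px * depth_mm / focal_px

def within_depth_of_field(depth_mm: float, focus_mm: float = 250.0,
                          dof_mm: float = 40.0) -> bool:
    """True if the hand lies inside the fixed depth of field, so a sharp
    image can be captured without an adaptive (e.g. liquid) lens."""
    return abs(depth_mm - focus_mm) <= dof_mm / 2
```

A hand spanning 200 pixels at 250 mm depth with an assumed 500-pixel focal length measures 100 mm across; guiding the user to the 250 mm plane keeps it inside the sketched depth of field.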
In an embodiment, the current and desired body part postures relate to one or more of relative distance, relative orientation and gesture. In some embodiments, the relative distance and/or relative orientation is defined with respect to the visible light sensor and/or with respect to the near infrared light sensor.
In an embodiment, the user guidance relates to adjusting one or more of a relative distance, a relative orientation, and a gesture of the current body part posture. The relative distance may be related to a distance between the biometric imaging device and the body part. The relative orientation may be related to a relative orientation between the biometric imaging device and the body part pose. The relative distance and/or orientation may depend on environmental properties of the biometric imaging device, such as the backlight, the reflective surface, and the like. The gesture may be associated with a movement of a body part, such as an extension of the fingers of a hand.
In an embodiment, the user guidance comprises one or more of visual guidance displayed on a display and acoustic guidance played back on a speaker. Spoken acoustic guidance may be less preferred, since the person's language may not be known. The acoustic guidance may comprise generic signals, such as warning signals, information signals, and the like. The visual guidance may comprise a representation of the body part, which may include emphasis, such as a colored portion.
In an embodiment, the user guidance comprises displaying on the display a representation of the desired body part posture and a representation of the current body part posture. This enables the person to adjust the body part posture more accurately.
In an embodiment, the user guidance comprises displaying on the display a representation of a level indicating the difference between the current body part posture and the desired body part posture. This enables the person to adjust the body part posture more accurately.
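The displayed level can be derived from the pose difference, for example as a normalized value between 0 and 1 driving an on-screen bar. The normalization bounds below are assumptions for illustration:

```python
def alignment_level(dist_diff_mm: float, tilt_diff_deg: float,
                    max_mm: float = 100.0, max_deg: float = 45.0) -> float:
    """0.0 = far from the desired pose, 1.0 = fully aligned.
    The worst of the normalized distance and tilt errors drives the level."""
    worst = max(abs(dist_diff_mm) / max_mm, abs(tilt_diff_deg) / max_deg)
    return max(0.0, 1.0 - min(worst, 1.0))
```

As the person corrects the hand posture, the level rises monotonically toward 1.0, giving continuous feedback rather than a binary accept/reject.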
In an embodiment, the biometric imaging device is further configured to repeat one or more steps of the imaging procedure more than once. This enables the person to adjust the body part posture gradually, with improved accuracy.
In an embodiment, the biometric imaging device is further configured to maintain a delay of less than 100 milliseconds (preferably less than 10 milliseconds) between determining the difference and capturing at least one of the image data in the visible spectrum and the image data in the near infrared spectrum if the determined difference is within a predetermined range. Image data relating to the biometric features is thus captured while the body part is still in the desired posture.
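The latency bound can be enforced with a freshness check on the pose decision, as in this simplified sketch; `capture_fn` stands in for the actual visible/NIR sensor trigger, and the 100 ms budget matches the embodiment above:

```python
import time

MAX_LATENCY_S = 0.1  # 100 ms budget between pose decision and 2D capture

def capture_if_aligned(diff_within_range: bool, decided_at: float, capture_fn):
    """Trigger the visible/NIR capture only while the pose decision is
    still fresh; a stale decision is discarded because the hand may have
    moved out of the desired posture in the meantime."""
    if not diff_within_range:
        return None
    if time.monotonic() - decided_at > MAX_LATENCY_S:
        return None
    return capture_fn()
```

Using a monotonic clock avoids spurious stale/fresh results when the wall clock is adjusted.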
In an embodiment, the biometric imaging device is further configured to determine a region of interest of the body part based on at least one of the three-dimensional image data, the image data in the visible spectrum, and the image data in the near infrared spectrum, and to adjust the desired body part posture according to the region of interest and capture at least one of the image data in the visible spectrum and the image data in the near infrared spectrum. For example, in the case of a female palm, the image data in the near infrared spectrum may not include a sufficiently distinctive vein pattern; in this case, the desired posture may change to the back of the hand. For example, a rectangular region of interest covering part of the palm may not include sufficient biometric features; in this case, the desired body part posture and/or the captured image data may change so as to capture image data of a larger area of the palm (e.g., also including all fingers).
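The two fallbacks described in this embodiment can be sketched as a simple decision rule; the feature counts and the threshold are illustrative assumptions, as the patent does not quantify "sufficient biometric features":

```python
def choose_capture_target(palm_vein_features: int, palm_roi_features: int,
                          min_features: int = 30) -> str:
    """Pick what to capture next: fall back to the back of the hand when
    the palm's NIR vein pattern is too weak, or widen the region of
    interest to the whole hand (including fingers) when a partial-palm
    ROI yields too few biometric features."""
    if palm_vein_features < min_features:
        return "back_of_hand"
    if palm_roi_features < min_features:
        return "whole_hand"
    return "palm_roi"
```

The chosen target then determines the next desired body part posture presented to the user.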
At least one object of the present invention is also achieved by a biometric imaging method for capturing image data of a body part of a person, wherein at least one of a visible light sensor for capturing image data of the body part in the visible spectrum and a near infrared light sensor for capturing image data of the body part in the near infrared spectrum is provided. The method comprises the following steps: providing a time-of-flight camera configured to capture three-dimensional image data of the body part of the person; and performing an imaging procedure comprising the steps of: capturing three-dimensional image data of a current body part pose; determining a difference between the desired body part pose and the current body part pose based on the three-dimensional image data; providing user guidance to the person based on the determined difference, enabling the person to adjust the body part posture in the direction of the desired posture; and capturing at least one of image data in the visible spectrum and image data in the near infrared spectrum.
In an embodiment, the user guidance relates to adjusting one or more of a relative distance, a relative orientation, and a gesture of the current body part posture.
In an embodiment, the user guidance comprises displaying on the display a representation of the desired body part posture and a representation of the current body part posture.
In an embodiment, the biometric imaging method further comprises: one or more steps in the imaging procedure are repeated more than once.
In an embodiment, the biometric imaging method further comprises maintaining a delay of less than 100 milliseconds (preferably less than 10 milliseconds) between determining the difference and capturing at least one of the image data in the visible spectrum and the image data in the near infrared spectrum if the determined difference is within a predetermined range.
In an embodiment, the biometric imaging method further comprises determining a region of interest of the body part based on at least one of the three-dimensional image data, the image data in the visible spectrum, and the image data in the near infrared spectrum, and adjusting the desired body part pose according to the region of interest and capturing at least one of the image data in the visible spectrum and the image data in the near infrared spectrum.
Drawings
The invention will be described in more detail below with reference to an embodiment shown in the drawings. The figures show:
FIG. 1 schematically illustrates a left hand palm of a first person;
FIG. 2 schematically illustrates a right hand palm of a second person;
FIG. 3 schematically illustrates the venous network on the back of the right hand 3 of a third person;
FIG. 4 schematically illustrates a human hand and a biometric authentication device;
FIG. 5 schematically illustrates a time-of-flight camera;
FIG. 6 schematically illustrates a biometric imaging device installed in a building;
FIGS. 7a, 7b, 7c, 7d schematically illustrate user guidance relating to an imaging procedure performed by a biometric imaging device; and
FIG. 8 schematically illustrates a method of capturing image data of a user's hand.
Detailed Description
Fig. 1 schematically shows the palm of a left hand 1 of a first person. The left hand 1 has a thumb t, index finger i, middle finger m, ring finger r and little finger l. Fig. 2 schematically shows the palm of the right hand 2 of the second person. The right hand 2 has a thumb t, index finger i, middle finger m, ring finger r and little finger l.
Fig. 1 and 2 schematically show images of the palms of the left and right hands 1, 2 captured with a visible light sensor (e.g., 400nm to 600 nm). The hands 1, 2 have a palm print P or lifeline recognizable under visible light. Additionally or alternatively, the vein pattern of the hands 1, 2 may be determined from image data captured in near infrared light (e.g., 700nm to 900 nm). Fig. 1 and 2 do not show the vein pattern.
As shown in fig. 1 and 2, the palm prints P or life lines of the two hands 1, 2 comprise individual biometric features, such as specific lengths, positions, curvatures, etc. By comparison with pre-stored biometric features from the body parts of enrolled persons, person-specific authentication can be performed, in particular in combination with biometric features determined from the respective vein patterns. Furthermore, person authentication may also be based on biometric features of the back of the hand determined from image data captured in visible light, near infrared light, or a combination thereof. However, it is presently not known whether biometric features of the back of the hand determined from image data captured with a visible light sensor are sufficient for person authentication. When relying on the back of the hand, image data captured with a near infrared light sensor is presently considered necessary for sufficient person authentication.
Fig. 3 schematically shows the venous network on the back of the right hand 3 of a third person. The right hand 3 has a thumb t, index finger i, middle finger m, ring finger r and little finger l. As shown in fig. 3, the back of the hand 3 includes veins, comprising a dorsal venous network 31 and dorsal metacarpal veins 32. The vein pattern may be determined from image data captured using a near infrared light sensor, and the respective biometric features may be determined from the image data captured in near infrared light.
Fig. 4 schematically illustrates a biometric imaging device 80, which may be part of or provide a biometric authentication device. The biometric imaging device 80 comprises a biometric sensor 10 and a processing unit 20. The biometric imaging device 80 may be connected to the user display 40, for example, for providing user guidance. As shown in fig. 4, the processing unit 20 may be attached to the biometric sensor 10. The processing unit 20 may be remotely located within a computing infrastructure, such as a mainframe, server, cloud, etc. Processing unit 20 may include one or more processors and may store computer instructions that may be executed by the one or more processors to enable the functions described in this disclosure. The user display 40 may be fixedly mounted near the biometric sensor 10. The user display 40 may be associated with a user device (e.g., a notebook, a smart phone, a smart watch, etc.), wherein the processing unit 20 may communicate with the user display 40 via a wireless connection (e.g., bluetooth). The biometric imaging device 80 may be included in a user device (e.g., a notebook, a smart phone, a smart watch, etc.). As shown in FIG. 4, a current pose 401 of the user's hand and a desired pose 402 of the user's hand may be displayed on the display 40.
The biometric sensor 10 is capable of capturing image data of the person's hand 4. The biometric sensor 10 includes a visible light sensor 101 for capturing image data in the visible spectrum, a near infrared light sensor 102 for capturing image data in the near infrared spectrum, and a time-of-flight camera 103 for capturing three-dimensional image data. One or more of the visible light sensor 101, the near infrared light sensor 102, and the time-of-flight camera 103 may be combined in a single sensor. Furthermore, the biometric sensor 10 comprises light sources 104. Fig. 4 shows eight light sources 104 arranged in a circle around the sensors 101, 102 and the time-of-flight camera 103. The light sources 104 may include a different number of light sources and/or may be arranged differently. The light sources 104 may include customized lenses to achieve a uniform light distribution. The light sources 104 may include one or more light sources that provide illumination in the visible spectrum and allow image data to be captured in the visible spectrum with the visible light sensor 101. The light sources 104 may include one or more light sources that provide near infrared illumination and allow image data to be captured in near infrared light with the near infrared light sensor 102. Calibration may be provided, in particular with respect to the geometrical positions of the visible light sensor 101, the near infrared light sensor 102 and the time-of-flight camera 103, for example the translational displacements between the visible light sensor 101, the near infrared light sensor 102 and the time-of-flight camera 103. Furthermore, a calibration of the scaling factor of the image data captured by the time-of-flight camera 103 may be provided, for example to obtain the absolute size of objects in the captured image data.
Calibration may be performed within the biometric sensor 10, by post-processing in a dedicated computer such as the processing unit 20, or by a combination thereof. Calibration may ensure that objects in the image data captured by the visible light sensor 101, the near infrared light sensor 102, and the time-of-flight camera 103 are aligned with each other.
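The geometric part of such a calibration can be sketched as a per-sensor coordinate mapping with a scale factor and a translational displacement. The function name and parameters are hypothetical, and rotation between the sensors is assumed negligible:

```python
def tof_to_rgb(x: float, y: float, dx: float, dy: float, scale: float):
    """Map a pixel coordinate from the time-of-flight image into the
    visible light image: apply the calibrated scale factor, then the
    calibrated translational displacement between the two sensors."""
    return (x * scale + dx, y * scale + dy)
```

In practice, `dx`, `dy`, and `scale` would be determined once during calibration and then applied to every captured frame so that the three sensors' images stay aligned.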
The visible light sensor 101 may comprise a chip sensitive to visible light, providing 2D (two-dimensional) image data from the visible light intensity distribution generated by a 3D (three-dimensional) scene. The near infrared light sensor 102 may comprise a chip sensitive to near infrared light, providing 2D image data from the near infrared light intensity distribution generated by a 3D scene. The visible light sensor 101 and the near infrared light sensor 102 may include lenses, buffers, controllers, processing electronics, and the like. The visible light sensor 101 and the near infrared light sensor 102 may be commercially available sensors, such as the e2v semiconductors SAS EV76C570 CMOS image sensor, equipped with blocking filters (wavelengths below 500 nm for the visible light sensor 101 and above 700 nm for the near infrared light sensor 102), or, for example, the OmniVision OV4686 RGB-Ir sensor, in which the visible light sensor 101 and the near infrared light sensor 102 are combined in one chip and the sensor comprises an RGB-Ir filter. The light sources 104 may comprise visible and/or near infrared light generators, such as LEDs (light emitting diodes). The light sources 104 may be commercially available light sources, for example the high power LED SMB1N series from Roithner Laser Technik GmbH, Vienna.
Fig. 5 schematically shows the time-of-flight camera 103. The time-of-flight camera 103 includes a sequence controller 1031, a modulation controller 1032, a pixel matrix 1033, an A/D converter 1034 (A/D: analog-to-digital), an LED or VCSEL 1035 (LED: light emitting diode; VCSEL: vertical cavity surface emitting laser), and a lens 1036. The sequence controller 1031 controls the modulation controller 1032 and the A/D converter 1034. The modulation controller 1032 controls the LED or VCSEL 1035 and the pixel matrix 1033. The pixel matrix 1033 provides signals to the A/D converter 1034. The sequence controller 1031 interacts with a host controller 1037, for example via an I2C bus (I2C: inter-integrated circuit serial data bus). The LED or VCSEL 1035 illuminates the 3D scene 1038. After the time of flight, the lens 1036 receives the light reflected by the 3D scene 1038. The A/D converter 1034 provides raw 3D (three-dimensional) image data to the host controller 1037, for example via MIPI CSI-2 or PIF (MIPI: Mobile Industry Processor Interface; CSI: Camera Serial Interface; PIF: parallel interface). The host controller performs the depth map calculation and provides an amplitude image 103a and a depth image 103d of the 3D scene 1038. As shown in fig. 5, since for example a wall behind the person is arranged at a specific distance from the time-of-flight camera 103, the background of the amplitude image 103a includes the shading of the wall behind the person, while the background of the depth image 103d has a single value (e.g., black).
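The depth map calculation in the host controller rests on the standard continuous-wave time-of-flight relation between the measured phase shift of the modulated illumination and distance. This is the textbook formula, not quoted from the patent:

```python
import math

C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def phase_to_depth_m(phase_rad: float, f_mod_hz: float) -> float:
    """Continuous-wave ToF: d = c * phi / (4 * pi * f_mod).
    The factor 4*pi (rather than 2*pi) accounts for the round trip
    of the light from the emitter to the scene and back."""
    return C_M_PER_S * phase_rad / (4.0 * math.pi * f_mod_hz)
```

At a typical modulation frequency of 20 MHz, a phase shift of pi corresponds to roughly 3.75 m, half the unambiguous range; per-pixel phases yield the depth image 103d, while the reflected signal strength yields the amplitude image 103a.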
The time-of-flight camera 103 may be, for example, an Infineon REAL3™ sensor, and may include the following specifications: direct measurement of depth and amplitude in each pixel; highest accuracy; lean computing load; actively modulated infrared light and patented Suppression of Background Illumination (SBI) circuitry in each pixel; full operation in any lighting condition (darkness and bright sunlight); monocular system architecture without a mechanical baseline; minimal size and high design flexibility; no close-range operation limitation; no special requirements on mechanical stability; no mechanical alignment and no angle correction; no risk of recalibration or decalibration due to drops, vibration or thermal bending; simple and very fast one-time calibration for the entire lifetime; cost-effective manufacturing.
Fig. 6 schematically shows a biometric imaging device 80 installed in a building 6 having an entrance 61. Access to the building 6 is controlled by a biometric authentication device comprising the biometric imaging device 80. As shown in fig. 6, the biometric imaging device 80 is mounted adjacent to the entrance 61. The biometric imaging device 80 is configured to capture image data from a body part (e.g., the hand 4) of a person requesting access to the building 6. Backlight, for example from a light source 62 installed to provide comfortable lighting conditions, may severely degrade the quality of the image data captured by the biometric imaging device 80.
To capture image data with improved quality, the biometric imaging device 80 is configured to perform an imaging procedure comprising the steps of: capturing three-dimensional image data of the current pose of the hand 4; determining a difference between a desired pose of the hand 4 and the current pose of the hand 4 based on the three-dimensional image data; providing user guidance 401, 402 to the person based on the determined difference to enable the person to adjust the pose of the hand 4 in the direction of the desired pose; and capturing at least one of image data of the hand 4 in the visible spectrum and image data of the hand 4 in the infrared spectrum.
Figs. 7a, 7b, 7c, 7d schematically show user guidance relating to the imaging procedure performed by the biometric imaging device 80. The user guidance is displayed on the display 40. The display 40 may be fixedly mounted in a position close to the biometric imaging device 80. The display 40 may also be associated with a user device (e.g., a tablet, smartphone, etc.).
As shown in fig. 7a, a representation of the desired pose 402 of the hand 4 is displayed. The display of the representation of the desired pose 402 is based on the three-dimensional image data captured with the time-of-flight camera 103 and may be calibrated to the hand 4. Calibration to the hand 4 is important for properly guiding hands of different sizes, such as a man's hand, a woman's hand or a child's hand. As further shown in fig. 7a, a representation of the current pose 401 of the hand 4 is displayed. The display of the representation of the current pose 401 is likewise based on the three-dimensional image data captured with the time-of-flight camera 103.
In the example according to fig. 7a, the current pose of the hand 4 is too far away from the biometric imaging device 80, which is indicated by displaying the representation of the current pose 401 in a smaller size than the representation of the desired pose 402. Thus, the person can adjust the pose of the hand 4 in the direction of the desired pose, i.e., move the hand closer to the biometric imaging device 80.
In the example according to fig. 7b, the current pose of the hand 4 is still too far away from the biometric imaging device 80, which is indicated by displaying the representation of the current pose 401 in a smaller size than the representation of the desired pose 402. However, compared with the example according to fig. 7a, the distance has decreased, and more information may be displayed, such as a spirit level 403, to provide guidance for further adjusting the pose of the hand 4. The level 403, displayed on the back of the hand in the representation of the current pose 401, provides user guidance regarding the difference between the current inclination of the hand and the desired inclination of the hand. Thus, the person is able to adjust the pose of the hand 4 in the direction of the desired pose, i.e. move the hand even closer to the biometric imaging device 80 and adjust the inclination of the hand.
In the example according to fig. 7c, the current pose of the hand 4 is approximately at the desired distance from the biometric imaging device 80, which is indicated by displaying the representation of the current pose 401 in approximately the same size as the representation of the desired pose 402. However, the posture of the hand 4 does not yet have the correct inclination, which is indicated by the level 403 being displayed together with the representation of the current posture 401 of the hand. Thus, the person is able to adjust the posture of the hand 4 in the direction of the desired posture, i.e. further adjust the inclination of the hand.
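The size and level indications of figs. 7a to 7c can be sketched as a small decision function. The document does not disclose concrete thresholds or scaling rules, so the function name, the tolerances and the rule that the level appears once the hand is reasonably close are illustrative assumptions:

```python
def render_guidance(current_mm, desired_mm, tilt_deg,
                    tilt_tol_deg=5.0, dist_tol_mm=10.0):
    """Derive the on-screen guidance state from the measured hand pose.

    Returns the scale at which the current-pose outline (401) is drawn
    relative to the desired-pose outline (402), whether the spirit
    level (403) should be shown, and whether the pose is aligned.
    All tolerances are assumed values, not taken from the source.
    """
    # A hand farther away than desired is drawn smaller (figs. 7a, 7b).
    scale = desired_mm / current_mm
    # Assumption: the level appears once the hand is reasonably close (fig. 7b).
    show_level = abs(current_mm - desired_mm) <= 3 * dist_tol_mm
    # Aligned when both distance and inclination are within tolerance (fig. 7d).
    aligned = (abs(current_mm - desired_mm) <= dist_tol_mm
               and abs(tilt_deg) <= tilt_tol_deg)
    return scale, show_level, aligned
```

For example, a hand at 800 mm with a desired distance of 400 mm yields a half-size outline and no level yet, while a hand at 405 mm with a 2° tilt shows the level and counts as aligned.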
In the example according to fig. 7d, the difference between the current pose of the hand 4 and the desired pose of the hand 4 is sufficiently small. The biometric imaging device 80 captures at least one of image data in the visible spectrum and image data in the infrared spectrum. Since the pose of the hand 4 coincides with the desired pose, which may in particular take into account backlight conditions, best-focus conditions, etc., the captured image data has improved quality.
Fig. 8 schematically shows a method of capturing image data of a hand 4 of a person. The method includes the following imaging procedure. In step S1, three-dimensional image data of the current pose of the hand 4 is captured using the time-of-flight camera 103. In step S2, a difference between the desired pose of the hand 4 and the current pose of the hand 4 is determined based on the three-dimensional image data. In step S3, user guidance 401, 402 is provided to the person based on the determined difference, enabling the person to adjust the pose of the hand 4 in the direction of the desired pose. In step S4, at least one of image data of the hand 4 in the visible spectrum and image data of the hand 4 in the infrared spectrum is captured. For example, steps S1 to S3 are repeated until it is determined in step S2 that the difference is smaller than a predetermined threshold, whereupon step S2 is followed by step S4.
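The loop of steps S1 to S4 can be sketched as follows. The capture, difference and guidance operations are passed in as callables because their internals are hardware-specific; the timeout is an added assumption (the document does not state what happens if the pose never matches):

```python
import time

def imaging_procedure(tof_capture, compute_difference, show_guidance,
                      capture_image, threshold, timeout_s=30.0):
    """Sketch of the imaging procedure of fig. 8: steps S1-S3 repeat
    until the difference determined in S2 falls below the threshold,
    whereupon S4 captures the visible and/or infrared image data."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        pose_3d = tof_capture()                   # S1: 3-D data of current pose
        difference = compute_difference(pose_3d)  # S2: compare with desired pose
        if difference < threshold:
            return capture_image()                # S4: capture with improved quality
        show_guidance(difference)                 # S3: guide user toward desired pose
    return None  # pose never matched within the timeout (assumed behavior)
```

With stubbed callables that report a shrinking difference, the loop guides the user twice and then captures.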
REFERENCE SIGNS LIST
1, 2, 3 hands of a first, second and third person
t, i, m, r, l thumb, index finger, middle finger, ring finger, little finger
P palm print or life line
31, 32 dorsal venous network of the hand, dorsal metacarpal veins
4 hand of a person
10 biometric sensor
101 visible light sensor
102 near-infrared light sensor
103 time-of-flight camera
104 light source
20 processing unit
80 biometric imaging device
40 display
401, 402 current pose of a user's hand, desired pose of a user's hand
403 spirit level

Claims (15)

1. A biometric imaging device (10) for capturing image data of a body part (4) of a person, the biometric imaging device (10) comprising at least one of a visible light sensor (101) for capturing image data of the body part in the visible spectrum and a near infrared light sensor (102) for capturing image data of the body part in the near infrared spectrum, wherein:
a biometric imaging device (10) comprising a time-of-flight camera (103), the time-of-flight camera (103) being configured to capture three-dimensional image data of a body part (4) of a person; and is
A biometric imaging device (10) is configured to perform an imaging procedure comprising the steps of:
capturing three-dimensional image data of a current body part pose;
determining a difference between the desired body part pose and the current body part pose based on the three-dimensional image data;
providing user guidance (401, 402) to the person based on the determined difference to enable the person to adjust the body part posture in the direction of the desired posture; and
at least one of image data in the visible spectrum and image data in the infrared spectrum is captured.
2. Biometric imaging device (10) according to the preceding claim, wherein the current and desired body part postures are related to one or more of relative distance, relative orientation and gesture.
3. The biometric imaging device (10) according to any one of the preceding claims, wherein the user guidance (401, 402) relates to adjusting one or more of a relative distance, a relative orientation, and a gesture of the current body part pose.
4. Biometric imaging device (10) according to any of the preceding claims, wherein the user guidance (401, 402) comprises one or more of visual guidance displayed on the display (40) and acoustic guidance played back on a speaker.
5. Biometric imaging device (10) according to any one of the preceding claims, wherein the user guidance (401, 402) comprises a representation of the desired body part posture (402) and a representation of the current body part posture (401) displayed on the display (40).
6. Biometric imaging device (10) according to any of the preceding claims, wherein the user guidance (402) comprises a representation of a level (403) displayed on the display (40), the level (403) indicating a difference between a current body part posture and a desired body part posture.
7. The biometric imaging device (10) according to any one of the preceding claims, further configured to repeat one or more steps of an imaging procedure more than once.
8. The biometric imaging device (10) according to any one of the preceding claims, further configured to maintain a delay of less than 100 milliseconds, preferably less than 10 milliseconds, between determining the difference and capturing at least one of image data in the visible spectrum and image data in the infrared spectrum if the determined difference is within a predetermined range.
9. The biometric imaging device (10) according to any one of the preceding claims, further configured to determine a region of interest of the body part based on at least one of the three-dimensional image data, the image data in the visible spectrum and the image data in the near infrared spectrum, and to adjust at least one of the desired body part pose according to the region of interest and to capture at least one of the image data in the visible spectrum and the image data in the infrared spectrum.
10. A biometric imaging method for capturing image data of a body part (4) of a person, wherein at least one of a visible light sensor (101) for capturing image data of the body part in the visible spectrum and a near infrared light sensor (102) for capturing image data of the body part in the near infrared spectrum is provided, the method comprising:
providing a time-of-flight camera (103), the time-of-flight camera (103) being configured for capturing three-dimensional image data of a body part (4) of a person; and is
Performing an imaging procedure comprising the steps of:
capturing three-dimensional image data of a current body part pose;
determining a difference between the desired body part pose and the current body part pose based on the three-dimensional image data;
providing user guidance to the person based on the determined difference, enabling the person to adjust the body part posture in a direction of a desired posture; and
at least one of image data in the visible spectrum and image data in the infrared spectrum is captured.
11. A method of biometric imaging as in claim 10, wherein the user guidance (401, 402) relates to adjusting one or more of a relative distance, a relative orientation and a gesture of the current body part pose.
12. A biometric imaging method as in claim 10 or 11, wherein the user guidance (401, 402) comprises displaying on the display (40) a representation of the desired body part posture (402) and a representation of the current body part posture (401).
13. The biometric imaging method recited in any one of claims 10 to 12 further comprising repeating one or more steps of an imaging procedure more than once.
14. The biometric imaging method according to any one of claims 10 to 13, further comprising maintaining a delay of less than 100 milliseconds, preferably less than 10 milliseconds, between determining the difference and capturing at least one of image data in the visible spectrum and image data in the infrared spectrum if the determined difference is within a predetermined range.
15. The biometric imaging method as in any one of claims 10 to 14, further comprising determining a region of interest of the body part based on at least one of the three dimensional image data, the image data in the visible spectrum, and the image data in the near infrared spectrum, and adjusting at least one of the desired body part pose according to the region of interest and capturing at least one of the image data in the visible spectrum and the image data in the infrared spectrum.
CN202080026328.4A 2019-04-10 2020-04-06 Biometric imaging apparatus and biometric imaging method for capturing image data of a body part of a person capable of improving image data quality Pending CN113678138A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CH00486/19A CH716053A1 (en) 2019-04-10 2019-04-10 Biometric imaging device and biometric imaging method for acquiring image data of a body part of a person with user guidance.
CH00486/19 2019-04-10
PCT/EP2020/059718 WO2020207947A1 (en) 2019-04-10 2020-04-06 Biometrics imaging device and biometrics imaging method for capturing image data of a body part of a person which enable improved image data quality

Publications (1)

Publication Number Publication Date
CN113678138A true CN113678138A (en) 2021-11-19

Family

ID=67909234

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080026328.4A Pending CN113678138A (en) 2019-04-10 2020-04-06 Biometric imaging apparatus and biometric imaging method for capturing image data of a body part of a person capable of improving image data quality

Country Status (6)

Country Link
US (1) US11900731B2 (en)
EP (1) EP3953858A1 (en)
CN (1) CN113678138A (en)
AU (1) AU2020271607A1 (en)
CH (1) CH716053A1 (en)
WO (1) WO2020207947A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CH716052A1 (en) 2019-04-10 2020-10-15 Smart Secure Id Ag Biometric authentication device and biometric authentication method for authenticating a person with reduced computing complexity.
EP4002166B1 (en) 2020-11-11 2024-06-12 Qamcom Innovation Labs AB Method and system for biometric authentication for large numbers of enrolled persons

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103929566A (en) * 2013-01-15 2014-07-16 富士通株式会社 Biometric information image capture device and biometric authentication device
US20140286528A1 (en) * 2013-03-19 2014-09-25 Fujitsu Limited Biometric information input apparatus and biometric information input method
US20140369558A1 (en) * 2012-01-17 2014-12-18 David Holz Systems and methods for machine control
CN105144238A (en) * 2013-02-25 2015-12-09 联邦科学和工业研究组织 3d imaging method and system
CN107003738A (en) * 2014-12-03 2017-08-01 微软技术许可有限责任公司 Fixation object application launcher

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4546169B2 (en) 2004-06-28 2010-09-15 富士通株式会社 An imaging device for palm authentication
JP4515850B2 (en) 2004-07-30 2010-08-04 富士通株式会社 Biometric device guidance screen control method, biometric device and program thereof
US20060018523A1 (en) 2004-07-23 2006-01-26 Sanyo Electric Co., Ltd. Enrollment apparatus and enrollment method, and authentication apparatus and authentication method
JP4786483B2 (en) 2006-09-14 2011-10-05 富士通株式会社 Biometric guidance control method for biometric authentication device and biometric authentication device
JP5810581B2 (en) 2011-03-29 2015-11-11 富士通株式会社 Biological information processing apparatus, biological information processing method, and biological information processing program
JP6089610B2 (en) * 2012-11-13 2017-03-08 富士通株式会社 Biometric authentication apparatus, biometric authentication method, and biometric authentication computer program
US9848113B2 (en) * 2014-02-21 2017-12-19 Samsung Electronics Co., Ltd. Multi-band biometric camera system having iris color recognition
EP3576016A4 (en) * 2018-04-12 2020-03-18 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Face recognition method and apparatus, and mobile terminal and storage medium
CH716052A1 (en) 2019-04-10 2020-10-15 Smart Secure Id Ag Biometric authentication device and biometric authentication method for authenticating a person with reduced computing complexity.


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JAN SVOBODA et al.: "Contactless biometric hand geometry recognition using a low-cost 3D camera", IEEE, 31 December 2015, pages 1-4
JAVIER MOLINA et al.: "Real-time user independent hand gesture recognition from time-of-flight camera video using static and dynamic models", Springer, 6 August 2011, pages 1-8
S. SAMOIL et al.: "Multispectral Hand Biometrics", IEEE, 31 December 2014

Also Published As

Publication number Publication date
CH716053A1 (en) 2020-10-15
US20220207922A1 (en) 2022-06-30
US11900731B2 (en) 2024-02-13
WO2020207947A1 (en) 2020-10-15
EP3953858A1 (en) 2022-02-16
AU2020271607A1 (en) 2021-10-28


Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40055582

Country of ref document: HK

SE01 Entry into force of request for substantive examination