US7760916B2 - Registration apparatus, collation apparatus, image correction method and program - Google Patents
- Publication number: US7760916B2
- Authority: US (United States)
- Prior art keywords: image, finger, identification, inclination, pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
- G06V40/1312—Sensors therefor direct reading, e.g. contactless acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/243—Aligning, centring, orientation detection or correction of the image by compensating for image skew or non-uniform image deformations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
Definitions
- The present invention contains subject matter related to Japanese Patent Application JP2005-316896 filed in the Japanese Patent Office on Oct. 31, 2005, the entire contents of which are incorporated herein by reference.
- the present invention relates to a registration apparatus, a collation apparatus, an image correction method and a program that can suitably be used in applications for biometric authentications.
- Blood vessels are among the objects used for biometric authentication.
- Authentication apparatus of this type include an image pickup camera, and a finger is typically rigidly secured in position parallel to the imaging plane of the image pickup camera so that the camera can shoot the blood vessels in the finger.
- The authentication apparatus then registers the image of the blood vessels obtained by the shooting operation in a memory or the like, or compares it with the image of blood vessels already registered in a memory or the like, to identify the person to be identified.
- One of the typical respects in which the condition of a finger differs between the time of registration and the time of authentication is the angle between the imaging plane of the image pickup camera and the finger placed over it, that is, the inclination of the finger relative to the imaging plane.
- When the finger is inclined, its profile in the picked-up blood vessel image is long at the part located close to the image pickup camera and short at the part located remote from it. In short, the picked-up image of the finger is distorted.
- When the distortion of the finger in an image differs between the time of registration and the time of authentication, a finger of a registered (authorized) person may be determined to be that of an unauthorized (unregistered) person, and a finger of an unauthorized (unregistered) person may be determined to be that of a registered (authorized) person. In short, the accuracy of authentication is poor.
- The authentication apparatus may also erroneously correct the image, judging a distortion of projection that is in fact due to the finger's own profile to be attributable to an inclination of the finger relative to the camera that picked up the image. The problem of poor authentication accuracy then remains unresolved.
- According to an aspect of the present invention, there is provided a registration apparatus including: corresponding pixel detection means for detecting two or more pixels in an object of identification shown in a first image, obtained by shooting the object of identification at a site of a living body from a first position, and the corresponding two or more pixels in the object of identification shown in a second image, obtained by shooting the object of identification from a second position that is different from the first position and located substantially on a plane including the first position; inclination detection means for detecting the extent of inclination of the site of the living body relative to a reference plane on the basis of the difference of each corresponding pair of pixels; correction means for correcting the distortion of the first image or the second image according to the extent of inclination; and registration means for registering the object of identification shown in the first image or the second image corrected for the distortion in a recording medium.
- With a registration apparatus as defined above, it is possible to know the real angle of inclination of the finger whose image is picked up. In other words, there does not arise a situation where the image is erroneously corrected by judging the distortion of projection to be attributable not to the profile of the finger itself but to the inclination of the finger relative to the camera that picked up the image. Therefore, with a registration apparatus according to this aspect of the present invention, it is possible to register an object of identification, which may be a finger, after selectively removing the distortion of projection attributable to the change in the condition of placement of the finger, without relying on the profile of the finger. It is then possible to improve the accuracy of authentication at the time of collation.
- According to another aspect of the present invention, there is provided a collation apparatus including: corresponding pixel detection means for detecting two or more pixels in an object of identification shown in a first image, obtained by shooting the object of identification at a site of a living body from a first position, and the corresponding two or more pixels in the object of identification shown in a second image, obtained by shooting the object of identification from a second position that is different from the first position and located substantially on a plane including the first position; inclination detection means for detecting the extent of inclination of the site of the living body relative to a reference plane on the basis of the difference of each corresponding pair of pixels; correction means for correcting the distortion of the first image or the second image according to the extent of inclination; and collation means for collating, using the object of identification shown in the first image or the second image corrected for the distortion as the object of collation.
- With a collation apparatus as defined above, it is possible to know the real angle of inclination of the finger whose image is picked up. In other words, there does not arise a situation where the image is erroneously corrected by judging the distortion of projection to be attributable not to the profile of the finger itself but to the inclination of the finger relative to the camera that picked up the image. Therefore, with a collation apparatus according to this aspect of the present invention, it is possible to collate, using the object of identification, which may be a finger, after selectively removing the distortion of projection attributable to the change in the condition of placement of the finger, without relying on the profile of the finger. It is then possible to improve the accuracy of authentication.
- According to still another aspect of the present invention, there is provided an image correction method including: a first step of acquiring a first image obtained by shooting an object of identification at a site of a living body from a first position and a second image obtained by shooting the object of identification from a second position that is different from the first position and located substantially on a plane including the first position; a second step of detecting two or more pixels in the object of identification shown in the first image and the corresponding two or more pixels in the object of identification shown in the second image; a third step of detecting the extent of inclination of the site of the living body relative to a reference plane on the basis of the difference of each corresponding pair of pixels; and a fourth step of correcting the distortion of the first image or the second image according to the extent of inclination.
- With an image correction method as defined above, it is possible to know the real angle of inclination of the finger whose image is picked up. In other words, there does not arise a situation where the image is erroneously corrected by judging the distortion of projection to be attributable not to the profile of the finger itself but to the inclination of the finger relative to the camera that picked up the image. Therefore, with an image correction method according to this aspect of the present invention, it is possible to have biometric authentication of an object of identification, which may be a finger, performed after selectively removing the distortion of projection attributable to the change in the condition of placement of the finger, without relying on the profile of the finger. It is then possible to improve the accuracy of authentication.
- With a program according to the present invention as defined above, it is likewise possible to know the real angle of inclination of the finger whose image is picked up, so there does not arise a situation where the image is erroneously corrected by judging the distortion of projection to be attributable not to the profile of the finger itself but to the inclination of the finger relative to the camera that picked up the image. Therefore, with such a program, it is possible to have biometric authentication of an object of identification, which may be a finger, performed after selectively removing the distortion of projection attributable to the change in the condition of placement of the finger, without relying on the profile of the finger. It is then possible to improve the accuracy of authentication.
- According to the present invention, it is thus possible to realize a registration apparatus, a collation apparatus, an image correction method and a program with an improved accuracy of authentication. Two or more pixels in an object of identification shown in a first image, obtained by shooting the object of identification at a site of a living body from a first position, and the corresponding two or more pixels in the object of identification shown in a second image, obtained by shooting the object from a second position that is different from the first position and located substantially on a plane including the first position, are detected; the extent of inclination of the site of the living body relative to a reference plane is detected on the basis of the difference of each corresponding pair of pixels; and the distortion of the first image or the second image is corrected according to the extent of inclination. Biometric authentication of the object of identification, which may be a finger, can therefore be performed after selectively removing the distortion of projection attributable to the change in the condition of placement of the finger, without relying on the profile of the finger, and the accuracy of authentication is thereby improved.
- FIG. 1 is a schematic perspective view of an authentication apparatus, showing an appearance thereof;
- FIG. 2 is a schematic cross sectional view of the authentication apparatus taken along line A-A′ in FIG. 1 ;
- FIG. 3 is a schematic block diagram of the authentication apparatus of FIG. 1 , showing the circuit configuration thereof;
- FIGS. 4A and 4B are schematic illustrations of detection of corresponding pixels;
- FIG. 5 is a schematic illustration of the differences of corresponding pixels;
- FIG. 6 is a schematic illustration of a stereoscopic viewing technique;
- FIG. 7 is a schematic illustration of the relationship (1) between a moving finger, the focal point and parallax;
- FIG. 8 is a schematic illustration of an inclination of a finger relative to a reference plane;
- FIG. 9 is a schematic illustration of the relationship (2) between a moving finger, the focal point and parallax;
- FIGS. 10A and 10B are schematic illustrations of detection of inclination and rotation of a finger;
- FIG. 11 is a flowchart of a distortion correction process sequence; and
- FIGS. 12A and 12B are schematic illustrations of detection of corresponding pixels by another embodiment.
- FIG. 1 is a schematic perspective view of an authentication apparatus realized by applying an embodiment of the present invention, showing an appearance thereof.
- FIG. 2 is a schematic cross sectional view of the authentication apparatus taken along line A-A′ in FIG. 1 .
- In FIG. 1 , reference numeral 1 generally denotes the authentication apparatus realized by applying this embodiment.
- An image pickup camera 4 and a near infrared (IR) source (not shown), which emits into the image pickup space of the image pickup camera 4 near infrared rays that are specifically absorbed by hemoglobin of living bodies, are provided at respective predetermined positions on the top surface 2 A of a substantially box-shaped cabinet 2 .
- near infrared rays emitted onto the finger FG are absorbed by hemoglobin of the blood vessel tissues found in the finger FG and reflected and scattered by the tissues other than the blood vessel tissues of the finger FG.
- near infrared rays projected onto the profile and some of the blood vessels of the finger FG enter the image pickup camera 4 by way of the finger FG.
- The image pickup camera 4 is adapted to lead the finger-profile/blood-vessels-projected rays that enter it to an image pickup element 4 b by way of an optical system 4 a and output a video signal representing images of the profile and some of the blood vessels of the finger FG formed in the image pickup element 4 b (to be referred to as finger-profile/blood-vessels video signal hereinafter) to a signal processing circuit mounted in the authentication apparatus 1 .
- the authentication apparatus 1 can pick up images of some of the blood vessels found in the finger FG.
- A thin member that the finger tip touches (to be referred to as finger tip touching section hereinafter) 3 is arranged perpendicular to the top surface 2 A, and a linear member operating as an index for moving the finger FG on a same plane in the image pickup space of the image pickup camera 4 (to be referred to as finger movement index section hereinafter) 3 a is arranged on the finger tip touching surface of the finger tip touching section 3 .
- the finger FG is adapted to move in a predetermined direction (as indicated by a dotted chain line in FIG. 2 ) along the finger movement index section 3 a.
- With the authentication apparatus 1 , it is possible to successively pick up images of some of the blood vessels found in the finger FG as it moves on a same plane.
- the authentication apparatus 1 can faithfully pick up images of blood vessels at an end of (a finger of) a living body where veins and arteries coexist in a mixed state.
- FIG. 3 is a schematic block diagram of the authentication apparatus of FIG. 1 , showing the circuit configuration thereof.
- the authentication apparatus 1 includes an operation section 11 , an IR source drive section 12 , a camera drive section 13 , a flash memory 14 and an interface 15 for exchanging data with the outside of the apparatus (to be referred to as external interface hereinafter), which are connected to a control section 10 by way of transmission lines.
- The control section 10 is a computer that includes a Central Processing Unit (CPU) for controlling the entire authentication apparatus 1 , a Read Only Memory (ROM) for storing various programs and preset information, and a Random Access Memory (RAM) that serves as a work memory of the CPU.
- the control section 10 is adapted to receive an execution command COM 1 for an operation in a mode of registering blood vessels of a person (to be referred to as blood vessel registration mode hereinafter) or an execution command COM 2 for an operation in a mode of judging identity of a registered person (to be referred to as authentication mode hereinafter) from the operation section 11 according to a user operation.
- the control section 10 decides the mode of operation according to an execution command COM 1 or an execution command COM 2 and executes a registration process or an authentication process, using a program that corresponds to the decision and appropriately controlling the IR source drive section 12 , the camera drive section 13 , the flash memory 14 and the external interface 15 .
- When the control section 10 decides to operate in a blood vessel registration mode, it shifts the operation mode to the blood vessel registration mode and acquires video data, including those of some of the blood vessels of the finger FG that is moving in the image pickup space, while the finger tip is held in contact with the finger movement index section 3 a ( FIGS. 1 and 2 ).
- The control section 10 drives the near IR source (not shown) by way of the IR source drive section 12 to turn it on. As a result, near IR rays are emitted onto the finger FG that is moving in the image pickup space, and the finger-profile/blood-vessels-projected rays obtained by way of the finger FG enter the image pickup element 4 b ( FIG. 2 ) by way of the optical system 4 a ( FIG. 2 ) of the image pickup camera 4 . The finger-profile/blood-vessels-projected rays that enter the image pickup element 4 b are then subjected to photoelectric conversion.
- The control section 10 adjusts the lens position of the focusing lens of the optical system 4 a through the camera drive section 13 so as to focus on the finger located in the image pickup space, or on some of the blood vessels in the finger, according to the contrast of the finger-profile/blood-vessels video signal output from the image pickup element 4 b as a result of the photoelectric conversion.
- The control section 10 is adapted to detect the inclination of the finger on the basis of the finger-profile/blood-vessels video data DA 1 through DAm and executes a process for correcting the distortion of projection (to be referred to as distortion correction process hereinafter) on, for example, the finger-profile/blood-vessels video data DA 1 out of the data DA 1 through DAm, depending on the inclination of the finger.
- The control section 10 registers the finger-profile/blood-vessels video data that have been subjected to the distortion correction process, as data DIS for identifying the living body whose finger FG is arranged and moving in the image pickup space (to be referred to as identification information hereinafter), by storing them in the flash memory 14 .
- In this way, the control section 10 operates in the blood vessel registration mode.
- a distortion correction process is executed as in the case of the blood vessel registration mode.
- The control section 10 is adapted to collate the finger-profile/blood-vessels video data that have been subjected to a distortion correction process with the identification information DIS registered in the flash memory 14 and judges whether the user, or the object of the image pickup operation of the image pickup camera 4 , is the registered (authorized) person. Then, the control section 10 transfers the outcome of the judgment as judgment data JD to the outside by way of the external interface 15 .
- In this way, the control section 10 operates in the authentication mode.
- The distortion correction process that the control section 10 executes will now be described in detail. Since the distortion correction process executed in the blood vessel registration mode and that executed in the authentication mode are similar to each other, the process in the blood vessel registration mode will be described here for convenience.
- The control section 10 selects the finger-profile/blood-vessels image IM 1 ( FIG. 4A ) of the finger-profile/blood-vessels video data DA 1 it acquired first and the finger-profile/blood-vessels image IM 2 ( FIG. 4B ) of the finger-profile/blood-vessels video data DAm it acquired last out of the finger-profile/blood-vessels video data DA 1 through DAm it acquired from the image pickup camera 4 as the objects to be processed.
- The control section 10 detects two or more characteristic pixels from the pixels showing the finger profile and the blood vessels in the finger-profile/blood-vessels image IM 1 ( FIG. 4A ) and the corresponding two or more characteristic pixels showing the finger profile and the blood vessels in the finger-profile/blood-vessels image IM 2 ( FIG. 4B ).
- The control section 10 executes a process for emphasizing the profile on the finger-profile/blood-vessels video data DA 1 and the finger-profile/blood-vessels video data DAm and collates the finger-profile/blood-vessels images IM 1 and IM 2 obtained as a result of those processes. The control section 10 then detects, for example, the corresponding pixels PX 1 and PX 1 ′, PX 2 and PX 2 ′, PX 3 and PX 3 ′, PX 4 and PX 4 ′ and PX 5 and PX 5 ′ at the finger tip and the first and second joints of the finger out of the group of pixels showing the finger profile.
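The profile-emphasis-and-collation step can be sketched, under simplifying assumptions, as a one-dimensional sum-of-absolute-differences search: for a row of edge-emphasized intensities from IM 1, find the horizontal offset that best aligns it with the corresponding row of IM 2. The function name and the SAD criterion are illustrative, not taken from the patent.

```python
def best_match_offset(row_a, row_b, max_offset):
    """Return the horizontal offset that best aligns row_a with row_b,
    using a mean sum-of-absolute-differences (SAD) cost over the overlap."""
    best_offset, best_cost = 0, float("inf")
    for off in range(-max_offset, max_offset + 1):
        cost, count = 0.0, 0
        for i, a in enumerate(row_a):
            j = i + off
            if 0 <= j < len(row_b):
                cost += abs(a - row_b[j])
                count += 1
        if count and cost / count < best_cost:
            best_cost, best_offset = cost / count, off
    return best_offset
```

For instance, a bright profile pixel at column 2 of one row and at column 3 of the other yields an offset of one pixel, which is the corresponding pixel difference for that pair.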
- the control section 10 detects the actual inclination of the finger on the basis of the positional differences (to be referred to as corresponding pixel differences hereinafter) M 1 , M 2 , M 3 , M 4 , M 5 between the corresponding pixels PX 1 and PX 1 ′, PX 2 and PX 2 ′, PX 3 and PX 3 ′, PX 4 and PX 4 ′ and PX 5 and PX 5 ′.
- a stereoscopic viewing technique is used for detecting the corresponding pixel differences.
- In the stereoscopic viewing technique, images of an object of shooting OJ are picked up from two positions that allow the optical axes of the lenses of the image pickup cameras to be held in parallel with each other, as shown in FIG. 6 , and the distance to the object can be determined from the parallax between the two images.
- the authentication apparatus 1 does not have two cameras mounted therein, a relationship of similarity as shown in FIG. 7 holds true because the finger FG that is the object of shooting OJ is moving ( FIG. 2 , etc.). Note that the parts in FIG. 7 that correspond to those in FIG. 6 are denoted respectively by the same reference symbols.
- the distance between the two cameras in FIG. 6 corresponds to the distance DS 1 from the position of placement P 1 of the finger FG that corresponds to the finger-profile/blood-vessels image IM 1 ( FIG. 4A ) to the position of placement P 2 of the finger FG that corresponds to the finger-profile/blood-vessels image IM 2 ( FIG. 4B ).
- the distance DS 2 between the position of the object of shooting OJM 1 and the virtual position of the object of shooting OJM 1 ′ corresponds to the corresponding pixel differences M ( FIG. 5 : M 1 , M 2 , M 3 , M 4 , M 5 ), or the parallax.
- the relationship among the positions of placement P 1 ′ and P 2 ′ at respective given clock times, the focal point f 1 and the parallax Z′ (as indicated by the thick solid lines in FIG. 9 ) and the relationship among the positions of placement P 1 ′′ and P 2 ′′ at respective given clock times, the focal point f 1 and the parallax Z′′ (as indicated by the thin solid lines in FIG. 9 ) are similar relative to each other as shown in FIG. 9 .
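The similarity relationships of FIGS. 6 through 9 amount to the standard stereo geometry: for parallel optical axes, the depth Z of a point follows from the focal length f, the baseline B (here the finger's movement distance DS 1 ) and the parallax d. A minimal sketch, with illustrative parameter names:

```python
def depth_from_parallax(focal_length_px, baseline, parallax_px):
    """Standard stereo relation Z = f * B / d: a larger parallax means a
    point closer to the camera, a smaller parallax a more distant one."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    return focal_length_px * baseline / parallax_px
```

This is why the corresponding pixel differences M 1 through M 5 carry depth, and hence inclination, information: pixel pairs on the near side of a tilted finger show larger parallaxes than pairs on the far side.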
- The control section 10 determines the ratios of the corresponding pixel differences M 1 through M 5 (M 1 /M 2 , M 1 /M 3 , M 1 /M 4 , M 1 /M 5 , M 2 /M 3 , M 2 /M 4 , M 2 /M 5 , M 3 /M 4 , M 3 /M 5 and M 4 /M 5 ) and, as shown in FIGS. 10A and 10B , detects the extent of inclination AN 1 ( FIG. 10A ) of the finger FG relative to the reference plane located vis-à-vis the pad of the finger FG (to be referred to as finger inclination angle hereinafter) and the extent of rotation AN 2 ( FIG. 10B ) of the finger FG (to be referred to as finger rotary angle hereinafter).
- the reference plane is the imaging plane of the image pickup camera 4 or the top surface 2 A of the cabinet 2 .
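One way to see how the parallaxes yield an angle (a sketch under assumed geometry, not the patent's exact computation): convert each parallax M i to a depth via Z = f·B/M, then take the slope of depth along the finger as the inclination angle AN 1. Equal parallaxes, i.e. all ratios equal to 1, give a zero angle, meaning the finger pad is parallel to the reference plane.

```python
import math

def finger_inclination_deg(parallaxes_px, point_spacing_px, focal_px, baseline):
    """Estimate the tilt of a finger from the parallaxes of points spaced
    along its length: depth Z_i = f * B / M_i, angle = atan(rise / run)."""
    depths = [focal_px * baseline / m for m in parallaxes_px]
    mean_z = sum(depths) / len(depths)
    # approximate metric spacing of the sample points at the mean depth
    run = (len(depths) - 1) * point_spacing_px * mean_z / focal_px
    rise = depths[-1] - depths[0]
    return math.degrees(math.atan2(rise, run))
```

With equal parallaxes the rise is zero and the angle vanishes; decreasing parallaxes toward the finger tip give a positive tilt, with the sign convention chosen here for illustration.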
- The control section 10 corrects the distortion of the finger-profile/blood-vessels image IM 1 (or the finger-profile/blood-vessels image IM 2 ) according to the outcome of the detection so as to turn it into an image of the finger FG as it would be picked up when the surface of the finger pad is parallel to the reference plane.
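The correction can be pictured as a per-row magnification fix: a row of the finger imaged at depth Z appears with magnification proportional to 1/Z, so each row is resampled by the scale factor it would need at a common reference depth. The nearest-neighbour resampler below is an illustrative sketch; a real system would use a full perspective warp with interpolation.

```python
def unwarp_rows(image, row_scales):
    """Resample each row of an image about its centre by a per-row scale,
    using nearest-neighbour lookup; pixels mapped from outside become 0."""
    height, width = len(image), len(image[0])
    out = []
    for y in range(height):
        s = row_scales[y]
        centre = width / 2.0
        row = []
        for x in range(width):
            src = int((x - centre) / s + centre + 0.5)  # round half up
            row.append(image[y][src] if 0 <= src < width else 0)
        out.append(row)
    return out
```

A scale of 1.0 leaves a row unchanged; a scale above 1.0 stretches the row about its centre, which is the correction needed for a row that was imaged farther from the camera than the reference depth.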
- The control section 10 can thus execute a distortion correction process on the basis of the finger-profile/blood-vessels image IM 1 ( FIG. 4A ) and the finger-profile/blood-vessels image IM 2 ( FIG. 4B ) acquired from the image pickup camera 4 .
- the above-described distortion correction process of the control section 10 is executed by following the distortion correction process sequence RT illustrated in FIG. 11 .
- the control section 10 starts following the distortion correction process sequence at Step SP 0 and, in the next step, or Step SP 1 , selects finger-profile/blood-vessels image data, e.g., the finger-profile/blood-vessels image data DA 1 and DAm ( FIG. 4 ), out of the finger-profile/blood-vessels image data DA 1 through DAm obtained as a result of the operation of shooting the finger FG that is moving in the image pickup space, while the finger tip is held in contact with the movement index section 3 a ( FIGS. 1 and 2 ).
- In Step SP 2 , the control section 10 detects the corresponding pixels PX 1 and PX 1 ′, PX 2 and PX 2 ′, PX 3 and PX 3 ′, PX 4 and PX 4 ′ and PX 5 and PX 5 ′ ( FIG. 5 ) from the finger-profile/blood-vessels images IM 1 and IM 2 of the selected finger-profile/blood-vessels video data DA 1 and DAm.
- In Step SP 3 , the control section 10 determines the differences M 1 through M 5 ( FIG. 5 ) of the corresponding pixels, which correspond to the parallax, from the corresponding pixels PX and PX′ and, in the next step, or Step SP 4 , determines the ratios of the corresponding pixel differences M 1 through M 5 (M 1 /M 2 , M 1 /M 3 , M 1 /M 4 , M 1 /M 5 , M 2 /M 3 , M 2 /M 4 , M 2 /M 5 , M 3 /M 4 , M 3 /M 5 and M 4 /M 5 ) before it proceeds to Step SP 5 .
- In Step SP 5 , the control section 10 judges whether any two or more of the values of the ratios of the corresponding pixel differences M 1 through M 5 as determined in Step SP 4 are contradictory. If so, the control section 10 returns to Step SP 1 , selects two finger-profile/blood-vessels video data that are different from the finger-profile/blood-vessels video data DA 1 and DAm, and repeats the above-described process.
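The consistency test of Step SP 5 can be sketched as follows: along a rigid, roughly planar finger, depth changes smoothly, so the ratios of adjacent corresponding pixel differences should be near-constant; a wildly deviating ratio suggests a mismatched pixel pair, and the frame pair should be re-selected. The tolerance value below is an illustrative assumption.

```python
def ratios_contradictory(diffs, tol=0.2):
    """Flag a set of corresponding pixel differences as contradictory when
    the ratios of adjacent differences disagree by more than `tol`."""
    ratios = [diffs[i + 1] / diffs[i] for i in range(len(diffs) - 1)]
    return max(ratios) - min(ratios) > tol
```

A smoothly varying set such as 10, 11, 12.1 passes (adjacent ratios both 1.1), while 10, 20, 10 fails, since no single rigid tilt can produce adjacent ratios of 2.0 and 0.5.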
- Otherwise, the control section 10 proceeds to Step SP 6 , where it detects the finger inclination angle AN 1 ( FIG. 10A ) and the finger rotary angle AN 2 ( FIG. 10B ), and then to Step SP 7 , where it corrects the distortion of the finger-profile/blood-vessels image IM 1 (or the finger-profile/blood-vessels image IM 2 ) according to the outcome of the detection so as to turn the image into an image of the finger FG picked up when the surface of the finger pad is parallel to the reference plane. Then, the control section 10 proceeds to Step SP 8 to end the distortion correction process sequence RT.
- The control section 10 can thus execute the distortion correction process by following the distortion correction process sequence RT.
- As described above, the authentication apparatus 1 detects two or more corresponding pixels PX 1 and PX 1 ′, PX 2 and PX 2 ′, PX 3 and PX 3 ′, PX 4 and PX 4 ′ and PX 5 and PX 5 ′ out of the pixels showing the profile of the finger from the finger-profile/blood-vessels image IM 1 ( FIG. 4A ) and the finger-profile/blood-vessels image IM 2 ( FIG. 4B ), obtained as a result of shooting the finger FG, and some of the blood vessels found in it, with the finger placed respectively at the first position and the second position located on a same plane.
- the authentication apparatus 1 determines the corresponding pixel differences M 1 through M 5 ( FIG. 5 ) from the corresponding pixels PX and PX′ and detects the finger inclination angle AN 1 ( FIG. 10A ) relative to the reference plane on the basis of the ratios of the corresponding pixel differences M 1 through M 5 . Then, the authentication apparatus 1 corrects the distortion of the finger-profile/blood-vessels image IM 1 according to the finger inclination angle AN 1 .
- The authentication apparatus 1 can detect a finger inclination angle AN 1 in which the actual inclination of the finger FG is reflected, because the inclination is detected according to the ratios of parallax. Therefore, with the authentication apparatus 1 , there does not arise a situation where the image is erroneously corrected by judging the distortion of projection to be attributable not to the profile of the finger itself but to the inclination of the finger relative to the camera that picked up the image. Thus, with the authentication apparatus 1 , it is possible to have authentication of a finger performed after selectively removing the distortion of projection attributable to the condition of placement of the finger FG, without relying on the profile of the finger FG. The accuracy of authentication is thereby improved.
- the authentication apparatus 1 of this embodiment is adapted to detect not only the finger inclination angle AN 1 relative to a reference plane but also the finger rotary angle AN 2 on the basis of the ratios of the corresponding pixel differences M 1 through M 5 and correct the finger-profile/blood-vessels image IM 1 according to the finger inclination angle AN 1 and the finger rotary angle AN 2 .
- The authentication apparatus 1 can therefore far more accurately eliminate the distortion of projection attributable to the change in the condition of placement of the finger FG.
- The accuracy of authentication is thereby further improved.
- In the above description, the finger FG that moves substantially on a same plane is shot by means of a single image pickup camera 4 , and the finger-profile/blood-vessels image IM 1 ( FIG. 4A ) and the finger-profile/blood-vessels image IM 2 ( FIG. 4B ), which correspond respectively to the first position and the second position different from the first position, are selectively used.
- the present invention is by no means limited thereto.
- two image pickup cameras having a same configuration may be arranged respectively at the first position and the second position with the optical axes thereof running in parallel with each other or intersecting each other and a pair of images acquired as a result of shooting the finger FG by means of the image pickup cameras may be used.
- Such an arrangement provides advantages similar to those described above by referring to the above-described embodiment.
- a single image pickup camera may be slidably placed at a first position and a second position so as to make the optical axes run in parallel with each other or intersect each other at the first and second positions and a pair of images acquired as a result of shooting the finger FG by means of the image pickup camera may be used.
- Such an arrangement also provides advantages similar to those described above by referring to the above-described embodiment.
- While a finger is used as the site of a living body to be shot by such an image pickup unit in the above-described embodiment, the present invention is by no means limited to a finger.
- a palm, a toe, an arm, an eye or some other site of a living body may alternatively be used for the purpose of the present invention.
- While blood vessels are used as the object of identification in the above-described embodiment, the present invention is by no means limited thereto; some other object of identification such as fingerprints, the profile of a part of a living body or nerves may alternatively be used for the purpose of the present invention.
- a marker substance that produces a specific effect on nerves may be injected into the body of the person to be identified, and the marker substance may be shot by an image pickup camera so that the nerves operate as the object of identification, just like a finger in the above-described embodiment.
- While, in the above-described embodiment, the corresponding pixels PX 1 and PX 1 ′, PX 2 and PX 2 ′, PX 3 and PX 3 ′, PX 4 and PX 4 ′ and PX 5 and PX 5 ′ are detected by emphasizing the profile of the finger in the finger-profile/blood-vessels image IM 1 and the finger-profile/blood-vessels image IM 2 and subsequently collating them, so as to detect two or more corresponding pixels of the object of identification shown in the first image and the second image, the present invention is by no means limited thereto, and some other detection technique may alternatively be applied for the purpose of the present invention.
- an image processing technique called minutia extraction may be used to extract the junctions of the blood vessels in the finger-profile/blood-vessels images IM 1 and IM 2 and collate them in order to detect the corresponding pixels PX and PX ′.
- with this technique, it is possible to computationally determine the length of the perpendicular PD from a position on the finger to be identified to a reference plane. Then, the distortion of the finger-profile/blood-vessels image IM 1 can be corrected more directly, so that the accuracy of authentication is further improved as compared with the technique of computationally determining the length of the perpendicular PD from the finger profile to a reference plane.
- each area AR is handled as a plane to detect the inclination of the finger, so that it is possible to detect the inclination of the finger accurately even if the finger-profile/blood-vessels images IM 1 and IM 2 have only a few characteristic pixels or contain noise corresponding to shadows due to bad shooting conditions. Then, the accuracy of authentication is further improved.
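Handling an area AR as a plane can be illustrated with a least-squares plane fit, which averages out the contribution of noisy points. The sketch below assumes the stereo-matched pixels of an area have already been converted into 3-D coordinates; the function names are hypothetical and the code is not the patented procedure.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of a plane z = a*x + b*y + c to the 3-D points
    recovered inside one area AR.  Returns (a, b, c); the inclination of
    the area follows from the plane normal."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs  # a, b, c

def inclination_from_plane(a, b):
    """Angle, in degrees, between the fitted plane and the reference
    plane z = const (whose normal is the z axis)."""
    normal = np.array([-a, -b, 1.0])
    cos_t = normal[2] / np.linalg.norm(normal)
    return float(np.degrees(np.arccos(cos_t)))
```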
- the present invention is by no means limited thereto, and the angle of inclination of a finger relative to a reference plane may alternatively be detected by determining the distances to the reference plane from the positions on the actual finger FG that correspond respectively to the corresponding pixels PX and PX ′ ( FIG. 5 : PX 1 and PX 1 ′, PX 2 and PX 2 ′, PX 3 and PX 3 ′, PX 4 and PX 4 ′ and PX 5 and PX 5 ′).
- the distance from a position on the actual finger FG to a reference plane can be computationally determined by means of the above-described formula (1). More specifically, it is possible to computationally determine the distance DS 1 ( FIG. 7 ) between the position of placement P 1 of the finger FG corresponding to the finger-profile/blood-vessels image IM 1 ( FIG. 4A ) and the position of placement P 2 of the finger FG corresponding to the finger-profile/blood-vessels image IM 2 ( FIG. 4B ), typically on the basis of the number of images generated between the finger-profile/blood-vessels image IM 1 and the finger-profile/blood-vessels image IM 2 and the rate at which the image pickup camera 4 generates finger-profile/blood-vessels video signals.
- the distance DS 2 ( FIG. 7 ) from the focal point F 1 to the imaging plane may be held in the ROM in advance as a preset value for the distance from the position of the focusing lens to the image pickup element of the image pickup camera 4 .
- the distance DS 2 may vary with focusing in reality, but the variance in the distance DS 2 is allowable as error. More specifically, the average of three distances is held as the preset value: the distance from the focusing lens to the imaging plane when the focusing lens is closest to the object of shooting, the distance when the focusing lens is remotest from the object of shooting, and the distance when the focusing lens is at an intermediary position. Then, the plane of the focusing lens at the averaged position operates as the reference plane.
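The preset value is thus a simple three-way average of the distances just described. A minimal sketch (function and parameter names hypothetical):

```python
def preset_ds2(nearest_mm, remotest_mm, intermediate_mm):
    """Preset value for DS2: the average of the lens-to-imaging-plane
    distance at the closest, remotest and intermediary focusing
    positions."""
    return (nearest_mm + remotest_mm + intermediate_mm) / 3.0
```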
- the control section 10 substitutes “H”, “f” and “Z” in the formula (1) respectively with the computationally determined distance between the position of placement P 1 and the position of placement P 2 , the distance DS 2 between the focal point F 1 and the imaging plane that is held as a predefined value, and the corresponding pixel differences M that correspond to the parallax, so as to determine the actual position on the finger FG that corresponds to the corresponding pixels PX and PX ′ and the distance DS 4 ( FIG. 7 ) between the finger FG and the reference plane.
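Formula (1), PD = H·f/Z, can be evaluated directly once the three substitutions above have been made. The sketch below assumes H, f and Z have already been expressed in consistent units (in practice the pixel difference Z would be converted to a physical length using the pixel pitch of the image pickup element); the function name is hypothetical.

```python
def perpendicular_length(h, f, z):
    """Formula (1) of the description: PD = H * f / Z, where H is the
    distance between placements P1 and P2, f is the preset distance DS2
    from the focal point to the imaging plane, and Z is the
    corresponding pixel difference (parallax)."""
    return h * f / z
```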
- the finger diameter may be extracted from the picked-up image in a registration mode, in a condition where the finger is placed immediately above the image pickup camera 4 , and registered in the flash memory 14 as a reference value for the size of the finger, together with the identification information DIS. Then, in an authentication mode, the finger inclination angle AN 1 ( FIG. 10A ) may be detected on the basis of the registered finger diameter and the finger diameter at the corresponding site in the finger-profile/blood-vessels images IM 1 and IM 2 .
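One plausible reading of this diameter-ratio test is sketched below. The use of an inverse cosine is an assumption of this illustration, not a statement of the described procedure: it presumes that inclining the finger foreshortens its apparent diameter by cos(AN 1). The function name is hypothetical.

```python
import math

def inclination_from_diameter(registered_diameter_px, observed_diameter_px):
    """Estimate the finger inclination angle AN1 (degrees) from the
    ratio of the observed diameter to the registered reference
    diameter, assuming foreshortening by cos(AN1)."""
    ratio = min(observed_diameter_px / registered_diameter_px, 1.0)
    return math.degrees(math.acos(ratio))
```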
- Such an arrangement provides advantages similar to those of the above-described embodiment.
- While the finger inclination angle AN 1 ( FIG. 10A ) and the finger rotary angle AN 2 ( FIG. 10B ) are detected for the purpose of detecting the extent of inclination of a site of a living body relative to a reference plane in the above-described embodiment, the present invention is by no means limited thereto and, alternatively, only the finger inclination angle AN 1 ( FIG. 10A ) may be detected.
- in that case, the distortion would be corrected, according only to the finger inclination angle AN 1 , so as to produce an image as if picked up in a condition where the reference plane and the finger pad are in parallel with each other.
- the present invention is by no means limited thereto, and any of various different alternative arrangements may be applied. Additionally, the finger tip touching section 3 and the finger movement index section 3 a may be replaced by an auditory or visual unit for notifying the user of the condition of the finger FG placed in the image pickup space.
- the finger inclination angle AN 1 ( FIG. 10A ) and the finger rotary angle AN 2 ( FIG. 10B ) are detected each time the blood vessel video data DA 1 through DAm (or the finger-profile/blood-vessels video data DB 1 through DBn) are acquired as a result of successively shooting the finger by means of the image pickup camera 4 , by executing a distortion correction process using the current image and the image immediately preceding it.
- each time the finger inclination angle AN 1 and the finger rotary angle AN 2 are detected, the volume of the audio output is shifted as a function of the difference between the detected finger inclination angle AN 1 and finger rotary angle AN 2 and the finger inclination angle and finger rotary angle that are selected as reference.
- alternatively, the difference between the detected finger inclination angle AN 1 and finger rotary angle AN 2 and the finger inclination angle and finger rotary angle that are selected as reference may be displayed on a real-time basis.
- the present invention is by no means limited thereto and may be applied in various different modes.
- the present invention may be applied to an apparatus having a single feature.
- the present invention can be utilized in the field for biometrics authentication.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Collating Specific Patterns (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
Description
PD=H·f/Z (1)
Claims (10)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005316896A JP4465619B2 (en) | 2005-10-31 | 2005-10-31 | Registration device, verification device, image correction method, and program |
JPP2005-316896 | 2005-10-31 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070122014A1 US20070122014A1 (en) | 2007-05-31 |
US7760916B2 true US7760916B2 (en) | 2010-07-20 |
Family
ID=38121510
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/551,393 Expired - Fee Related US7760916B2 (en) | 2005-10-31 | 2006-10-20 | Registration apparatus, collation apparatus, image correction method and program |
Country Status (2)
Country | Link |
---|---|
US (1) | US7760916B2 (en) |
JP (1) | JP4465619B2 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100328304A1 (en) * | 2006-12-14 | 2010-12-30 | Imagnosis Inc. | Display direction correcting device, correcting method, and correction program for medical 3d image |
US9210404B2 (en) | 2012-12-14 | 2015-12-08 | Microsoft Technology Licensing, Llc | Calibration and registration of camera arrays using a single circular grid optical target |
US9210417B2 (en) | 2013-07-17 | 2015-12-08 | Microsoft Technology Licensing, Llc | Real-time registration of a stereo depth camera array |
US10339662B2 (en) | 2016-05-23 | 2019-07-02 | Microsoft Technology Licensing, Llc | Registering cameras with virtual fiducials |
US10326979B2 (en) | 2016-05-23 | 2019-06-18 | Microsoft Technology Licensing, Llc | Imaging system comprising real-time image registration |
JP2018049391A (en) * | 2016-09-20 | 2018-03-29 | 富士通株式会社 | Biological image processing apparatus, biological image processing method, and biological image processing program |
JP7387596B2 (en) * | 2017-07-20 | 2023-11-28 | ラーバ アイディー プロプライアタリー リミティド | safety tag |
- 2005-10-31 JP JP2005316896A patent/JP4465619B2/en not_active Expired - Fee Related
- 2006-10-20 US US11/551,393 patent/US7760916B2/en not_active Expired - Fee Related
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62210590A (en) | 1986-03-12 | 1987-09-16 | Mitsubishi Electric Corp | Fingerprint collating device |
JP2000022869A (en) | 1998-07-02 | 2000-01-21 | Hitachi Ltd | Non-contact image reader |
US20020028004A1 (en) * | 2000-09-06 | 2002-03-07 | Naoto Miura | Personal identification device and method |
JP2002083298A (en) | 2000-09-06 | 2002-03-22 | Hitachi Ltd | Personal authentication device and method |
US6813010B2 (en) * | 2000-09-20 | 2004-11-02 | Hitachi, Ltd | Personal identification system |
JP2002288670A (en) | 2001-03-22 | 2002-10-04 | Honda Motor Co Ltd | Personal authentication device using face image |
JP2003263639A (en) | 2002-03-08 | 2003-09-19 | Koji Fukami | Face image recognition apparatus and method |
JP2005043286A (en) | 2003-07-24 | 2005-02-17 | Topcon Corp | Electron beam measurement and observation apparatus and electron beam measurement and observation method |
US20050129325A1 (en) * | 2003-11-27 | 2005-06-16 | Sony Corporation | Image processing apparatus and method |
Non-Patent Citations (2)
Title |
---|
English translation of Japanese Patent Office, Office Action issued for Patent Application JP2007-316896, Aug. 20, 2009, pp. 1-4. * |
Japanese Patent Office, Office Action issued in Patent Application JP2007-316896, on Aug. 20, 2009. |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090010337A1 (en) * | 2006-10-31 | 2009-01-08 | Sony Computer Entertainment Inc. | Picture decoding using same-picture reference for pixel reconstruction |
US20090010338A1 (en) * | 2006-10-31 | 2009-01-08 | Sony Computer Entertainment Inc. | Picture encoding using same-picture reference for pixel reconstruction |
US8213518B1 (en) | 2006-10-31 | 2012-07-03 | Sony Computer Entertainment Inc. | Multi-threaded streaming data decoding |
US8218641B2 (en) * | 2006-10-31 | 2012-07-10 | Sony Computer Entertainment Inc. | Picture encoding using same-picture reference for pixel reconstruction |
US8218640B2 (en) * | 2006-10-31 | 2012-07-10 | Sony Computer Entertainment Inc. | Picture decoding using same-picture reference for pixel reconstruction |
US20080130969A1 (en) * | 2006-11-02 | 2008-06-05 | Tomoyuki Asano | Imaging Apparatus |
US8229183B2 (en) * | 2006-11-02 | 2012-07-24 | Sony Corporation | Imaging apparatus |
US20190042721A1 (en) * | 2008-08-04 | 2019-02-07 | Sony Corporation | Biometrics authentication system |
US10956547B2 (en) * | 2008-08-04 | 2021-03-23 | Sony Corporation | Biometrics authentication system |
US10419760B2 (en) | 2014-09-29 | 2019-09-17 | Sony Interactive Entertainment Inc. | Picture quality oriented rate control for low-latency streaming applications |
US11006112B2 (en) | 2014-09-29 | 2021-05-11 | Sony Interactive Entertainment Inc. | Picture quality oriented rate control for low-latency streaming applications |
US11509896B2 (en) | 2014-09-29 | 2022-11-22 | Sony Interactive Entertainment Inc. | Picture quality oriented rate control for low-latency streaming applications |
Also Published As
Publication number | Publication date |
---|---|
JP4465619B2 (en) | 2010-05-19 |
JP2007122608A (en) | 2007-05-17 |
US20070122014A1 (en) | 2007-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7760916B2 (en) | Registration apparatus, collation apparatus, image correction method and program | |
KR100919338B1 (en) | The system for managing entrance/leaving room | |
US9754148B2 (en) | Image correction apparatus and image correction method | |
US10909363B2 (en) | Image acquisition system for off-axis eye images | |
US7983453B2 (en) | Verification apparatus, verification method and program | |
EP2068270B1 (en) | Authentication apparatus and authentication method | |
JP2008211514A (en) | Image discriminating device and method, and program | |
JP6846330B2 (en) | Biometric device and biometric system | |
KR102151474B1 (en) | Non contract fingerprint recognition method using smart terminal | |
KR102316587B1 (en) | Method for biometric recognition from irises | |
JP2004178606A (en) | Personal authentication device and method | |
JP2011100317A (en) | Personal authentication method and personal authentication system using vein pattern during bending and stretching motion of finger | |
JP5182341B2 (en) | Personal authentication apparatus and method | |
JP2004102993A (en) | Personal authentication device and method | |
KR100937800B1 (en) | Personal authentication system and device | |
JP2008206536A (en) | Personal authentication system using retina image | |
JP2014167799A (en) | Personal authentication device, and blood vessel image photographing apparatus | |
JP2011018344A (en) | Personal authentication system and device | |
JP2004171577A (en) | Personal authentication device and method | |
JP6082766B2 (en) | Blood vessel imaging device | |
JP4603610B2 (en) | Entrance / exit management system | |
KR20200053792A (en) | Payment method and system using bio credit card | |
JP2009087363A (en) | Personal authentication apparatus and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SATO, HIDEO;REEL/FRAME:018417/0835 Effective date: 20061010 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552) Year of fee payment: 8 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20220720 |