CN109726611B - Biological feature recognition method and device, readable storage medium and electronic equipment - Google Patents
- Publication number
- CN109726611B CN109726611B CN201711023405.7A CN201711023405A CN109726611B CN 109726611 B CN109726611 B CN 109726611B CN 201711023405 A CN201711023405 A CN 201711023405A CN 109726611 B CN109726611 B CN 109726611B
- Authority
- CN
- China
- Prior art keywords
- depth information
- module
- biometric
- sub
- detection light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Telephone Function (AREA)
Abstract
The disclosure relates to a biometric recognition method and apparatus, a readable storage medium, and an electronic device. The method includes: receiving a user trigger instruction for a camera module; in response to the user trigger instruction, determining depth information of a biometric feature of a photographed object; and constructing a three-dimensional image of the biometric feature according to the depth information. With the method and apparatus, a corresponding three-dimensional image can be constructed based on the acquired depth information of the biometric feature, so that the electronic device can perform identity recognition according to the constructed three-dimensional image, improving the security of the electronic device.
Description
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a biometric feature recognition method and apparatus, a readable storage medium, and an electronic device.
Background
Currently, an electronic device may acquire an image of a photographed object through a camera module, so that the photographed object is stored in the electronic device in the form of two-dimensional image information. A user may view the image on the electronic device, or on another terminal device connected to the electronic device.
Disclosure of Invention
The present disclosure provides a biometric feature recognition method and apparatus, a readable storage medium, and an electronic device to solve the deficiencies in the related art.
According to a first aspect of embodiments of the present disclosure, there is provided a biometric identification method, including:
receiving a user trigger instruction for the camera module;
in response to the user trigger instruction, determining depth information of biological features of a photographed object;
and constructing a three-dimensional image of the biological feature according to the depth information.
Optionally, the determining depth information of the biological feature of the photographed object includes:
emitting detection light to the biometric feature;
receiving reflected light of the detection light reflected by the biometric feature of the photographed object;
and acquiring the depth information according to the detection light and the reflection light.
Optionally, the obtaining the depth information according to the detection light and the reflection light includes:
and acquiring the depth information according to the phase difference and the time difference between the detection light and the reflection light.
Optionally, the obtaining the depth information according to the phase difference and the time difference between the detection light and the reflection light includes:
acquiring sub-depth information multiple times according to the phase difference and the time difference between the detection light and the reflection light;
and performing a mean filtering calculation or a median filtering calculation on the acquired sub-depth information to obtain the depth information.
Optionally, the receiving a user trigger instruction for the camera module includes:
receiving a first trigger instruction for a physical key on the electronic device; or,
receiving a second trigger instruction for a virtual key on the function page shown by the electronic device.
Optionally, after constructing the three-dimensional image of the biometric feature according to the depth information, the method further includes:
performing biometric recognition based on the three-dimensional image and a predefined image, and completing a preset function operation based on the recognition result.
According to a second aspect of the embodiments of the present disclosure, there is provided a biometric apparatus including:
the receiving module is configured to receive a user trigger instruction for the camera module;
a determining module configured to determine depth information of a biometric feature of a photographed object in response to the user trigger instruction received by the receiving module;
an imaging module configured to construct a three-dimensional image of the biometric feature from the depth information determined by the determination module.
Optionally, the determining module includes:
an emission sub-module configured to emit detection light to the biometric feature;
the receiving sub-module is configured to receive reflected light of the detection light emitted by the emitting sub-module after the detection light is reflected by the biometric feature of the photographed object;
and the acquisition sub-module is configured to acquire the depth information according to the detection light rays emitted by the emission sub-module and the reflection light rays received by the receiving sub-module.
Optionally, the acquisition sub-module includes:
an acquisition unit configured to acquire the depth information according to a phase difference and a time difference between the detection light and the reflection light.
Optionally, the acquisition unit includes:
a first acquisition subunit configured to acquire sub-depth information a plurality of times according to a phase difference and a time difference between the detection light and the reflection light;
and the second acquisition subunit is configured to perform a mean filtering algorithm on the sub-depth information acquired by the first acquisition subunit to acquire the depth information.
Optionally, the receiving module includes:
the first receiving submodule is configured to receive a first trigger instruction for a physical key on the electronic device; or,
the second receiving submodule is configured to receive a second trigger instruction for a virtual key on the function page shown by the electronic device.
Optionally, the apparatus further includes:
and the identification module is configured to perform biological feature identification on the basis of the three-dimensional image and the predefined image and complete preset function operation on the basis of an identification result.
According to a third aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method as described in any one of the above embodiments.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the steps of the method according to any of the embodiments described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the embodiment, the corresponding three-dimensional image can be constructed based on the acquired depth information of the biological features, so that the identity recognition of the electronic equipment can be facilitated according to the constructed three-dimensional image, and the safety performance of the electronic equipment can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow diagram illustrating a biometric identification method according to an exemplary embodiment.
FIG. 2 is a flow diagram illustrating another biometric identification method according to an example embodiment.
Fig. 3 is a diagram of a first application scenario of another biometric identification method according to an exemplary embodiment.
Fig. 4 is a diagram of a second application scenario of another biometric identification method according to an exemplary embodiment.
Fig. 5 is a schematic diagram illustrating a manner of obtaining depth information according to an example embodiment.
Fig. 6-11 are block diagrams illustrating a biometric identification device according to an exemplary embodiment.
Fig. 12 is a block diagram illustrating an apparatus for biometric identification according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
Fig. 1 is a flowchart illustrating a biometric identification method according to an exemplary embodiment. The method is applied to a terminal and, as shown in fig. 1, may include the following steps:
in step 101, a user trigger instruction for the camera module is received.
In this embodiment, the trigger instruction may be a first trigger instruction for a preset physical key on the electronic device, for example, the "home key" or "home key + power key"; alternatively, the trigger instruction may be a second trigger instruction for a virtual key on a function interface, where the function interface may include a payment interface, an unlocking interface, or a shooting interface. This is not limited in the present disclosure.
In step 102, in response to the user trigger instruction, depth information of a biometric feature of a photographed object is determined.
In this embodiment, in response to the trigger instruction described in the above embodiments, the depth information of the biometric feature of the photographed object is determined. For example, a boundary region between the biometric region and other regions may first be determined from the grayscale information of the image formed by the photographed object on the electronic device, and the portion surrounded by the boundary region may be determined as the biometric region.
Further, the corresponding biometric position can be determined according to the biometric region, so that detection light is emitted toward the biometric feature on the photographed object, reflected light of the detection light reflected by the biometric feature of the photographed object is received, and the depth information of the biometric feature on the photographed object can be acquired according to the detection light and the reflected light.
Specifically, the time difference between the detection light and the reflected light is determined according to the phase difference between them, the sub-depth information of the biometric feature for that detection is calculated according to the time difference, and the depth information is obtained by performing a mean filtering calculation or a median filtering calculation on the sub-depth information acquired multiple times.
In step 103, a three-dimensional image of the biometric feature is constructed from the depth information.
In this embodiment, the acquired depth information may be combined with the plane information of the biometric feature acquired by the camera module to construct a three-dimensional image corresponding to the biometric feature.
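As a rough illustration of this step, the following sketch back-projects a per-pixel depth map into a colored three-dimensional point cloud under an assumed pinhole camera model; the intrinsic parameters (fx, fy, cx, cy) and the pairing of each depth sample with an RGB pixel of the plane information are illustrative assumptions, not details given by this disclosure.

```python
import numpy as np

def depth_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project an HxW depth map (meters) into an Nx6 array of
    (X, Y, Z, R, G, B) points using an assumed pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx  # pinhole back-projection
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colors = rgb.reshape(-1, 3)
    valid = points[:, 2] > 0  # keep only pixels with a depth reading
    return np.hstack([points[valid], colors[valid]])

# Toy usage with synthetic data: a flat surface 0.5 m from the camera
depth = np.full((4, 4), 0.5)
rgb = np.zeros((4, 4, 3), dtype=np.uint8)
cloud = depth_to_point_cloud(depth, rgb, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 6)
```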
Based on the three-dimensional image of the biometric feature obtained in the above embodiments, biometric recognition may be performed against a preset image in the electronic device, and a preset function operation may be completed based on the recognition result; for example, the electronic device may be unlocked when the three-dimensional image is successfully matched with the preset image in the electronic device.
According to this embodiment, a corresponding three-dimensional image can be constructed based on the acquired depth information of the biometric feature, so that the electronic device can perform identity recognition according to the constructed three-dimensional image, and the security of the electronic device can be improved.
In order to describe the technical solution of the present disclosure in detail, a specific implementation flow of the biometric identification method will be described below by taking a mobile phone as an example. As shown in fig. 2, the biometric identification method may include the steps of:
in step 201, a preset physical button is clicked.
In this embodiment, as shown in fig. 3, when the mobile phone 100 is in the screen-locked state and the "home" key is triggered, the mobile phone 100 may start the camera module and construct a three-dimensional image according to the depth information; of course, in other embodiments, whether to start the three-dimensional image construction function may be determined according to the user's requirement. The "home" key is merely an example; the trigger may also be a preset virtual key such as "pay", "confirm", or "login".
In step 202, grayscale information of an image of the photographed object formed on the mobile phone 100 is acquired.
In step 203, a biometric region and other regions distinguished from the biometric region are determined based on the grayscale information.
In this embodiment, due to differences in color and brightness at each position on the photographed object, the grayscale values of the regions of the image formed by the object on the mobile phone 100 differ. The boundary regions of the image can therefore be obtained based on the grayscale discontinuity presented at image edges, dividing the image into the biometric region and the other regions of the photographed object.
For example, as shown in fig. 4, the photographed object may include a face portion 10 and a background portion 20. Because the physical distance between the face portion 10 and the mobile phone 100 is short while the physical distance between the background portion 20 and the mobile phone 100 is relatively long, the two portions are imaged differently. Therefore, when the amount of change of the grayscale value in a neighboring area exceeds a preset threshold, the boundary area 30 between the background portion 20 and the face portion 10 is determined; the first portion surrounded by the boundary area 30 (i.e., the portion enclosed by the boundary of the face portion 10, which may include a portion enclosed jointly by the boundary and the edge of the entire image) is determined to be the face portion, and the other portion of the image different from the first portion is determined to be the background portion.
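A minimal sketch of this kind of grayscale segmentation, assuming a simple gradient-magnitude threshold stands in for whatever edge criterion the handset actually applies; the threshold value and the toy image are arbitrary examples.

```python
import numpy as np

def find_boundary(gray, threshold=30.0):
    """Mark pixels where the grayscale value changes abruptly between
    neighbors, i.e. candidate boundary pixels between the biometric
    region and the background."""
    gy, gx = np.gradient(gray.astype(float))
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold  # boolean boundary mask

# Toy image: a bright "face" square on a dark background
gray = np.zeros((8, 8))
gray[2:6, 2:6] = 200
boundary = find_boundary(gray)
print(boundary.astype(int))
```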
In step 204, detection light is emitted to the biometric feature on the photographed object corresponding to the biometric region.
In step 205, reflected light is received.
In this embodiment, the detection light may be infrared light, so that the detection light can be reflected to obtain reflected light after encountering the photographed object, and the mobile phone 100 can determine the physical distance between the current camera module and the biometric feature of the photographed object according to the parameters of the detection light and the reflected light.
As shown in fig. 5, when the object 40 is photographed, light signals may be sequentially emitted to each point of the object 40 according to a preset sequence to measure depth information of the corresponding point, for example, the depth information may be sequentially measured from left to right and from top to bottom in the figure. Meanwhile, multiple measurements can be performed on the object 40 to obtain multiple sets of depth information about the object 40, and then weighted average calculation is performed on the depth information to obtain final depth information; the weight can be flexibly set according to the actual situation, which is not limited by the present disclosure. By measuring the object 40 for a plurality of times according to the preset sequence, the accuracy of measuring depth information and the three-dimensional effect of the image can be improved, and thus the accuracy of subsequently identifying the main body and the background part is further improved.
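The following sketch mimics that scheme: depth is measured point by point in raster order (left to right, top to bottom), the pass is repeated several times, and the passes are combined by a weighted average. The `measure` callable and the uniform default weights are placeholders, since the disclosure leaves both the sensor interface and the weights ("flexibly set according to the actual situation") unspecified.

```python
import numpy as np

def scan_object(measure, rows, cols, passes=3, weights=None):
    """Measure depth at each grid point in raster order (left to right,
    top to bottom), repeat for several passes, and return the weighted
    average across passes."""
    if weights is None:
        weights = [1.0 / passes] * passes  # plain average by default
    depth = np.zeros((rows, cols))
    for p in range(passes):
        for r in range(rows):
            for c in range(cols):  # raster order within one pass
                depth[r, c] += weights[p] * measure(r, c)
    return depth

# Toy sensor: a surface 0.6 m away, plus a little measurement noise
rng = np.random.default_rng(0)
measure = lambda r, c: 0.6 + rng.normal(0, 0.01)
print(scan_object(measure, rows=2, cols=3).round(3))
```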
In step 206, a time difference is calculated from the phase difference between the detection light and the reflected light.
In step 207, the corresponding sub-depth information is calculated according to the time difference.
In this embodiment, the sub-depth information may be the physical distance between the mobile phone 100 and the biometric feature of the photographed object measured at that time. Specifically, the physical distance may be calculated from physical parameters of the emitted detection light and the received reflected light.
For example, the distance may be calculated according to the following functional relationship:
Δτ = Δφ/(2πf), d = c·Δτ/2 = c·Δφ/(4πf);
where d is the physical distance; c is the speed of light; π is the circular constant; f is the emission frequency of the detection light; Δφ is the phase difference; and Δτ is the time difference.
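Read as a continuous-wave time-of-flight relation, the reconstructed formula above can be exercised directly; the π/2 phase shift and 20 MHz modulation frequency below are made-up example values.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_diff_rad, emit_freq_hz):
    """Distance from the phase shift of a modulated light signal:
    time difference dt = dphi / (2*pi*f), distance d = c*dt / 2."""
    time_diff = phase_diff_rad / (2 * math.pi * emit_freq_hz)
    return C * time_diff / 2

# Example: a phase shift of pi/2 at a 20 MHz modulation frequency
d = tof_distance(math.pi / 2, 20e6)
print(f"{d:.3f} m")  # ~1.874 m
```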
In step 208, it is determined whether the number of pieces of sub-depth information reaches a preset threshold.
In this embodiment, when the number of pieces of sub-depth information reaches the preset threshold, step 209 is performed; when the number of pieces of sub-depth information does not reach the preset threshold, the process returns to step 204.
In step 209, the depth information is calculated using a mean filtering algorithm.
In this embodiment, the mean filtering calculation is performed based on the sub-depth information obtained through multiple detections, which is beneficial to improving the accuracy of the obtained depth information.
For example, if the sub-depth information acquired three times for one of the measurement points on the object 40 is a, b, and c, respectively, with a > b > c, then with a median filtering calculation the depth information of the measurement point is b, and with a mean filtering calculation it is (a + b + c)/3. Of course, in some other embodiments, a median filtering calculation may first be performed within any preset region, a mean filtering calculation may then be performed on that region, and the depth information obtained after the mean filtering calculation may be used as the depth information of each point in the preset region.
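The a, b, c example translates directly into code; this small sketch is just the arithmetic of the two filters and assumes nothing device-specific.

```python
import statistics

def fuse_sub_depths(samples, mode="mean"):
    """Collapse repeated sub-depth readings for one measurement point
    into a single depth value by mean or median filtering."""
    if mode == "mean":
        return statistics.fmean(samples)
    if mode == "median":
        return statistics.median(samples)
    raise ValueError(f"unknown mode: {mode}")

a, b, c = 0.62, 0.60, 0.57  # three sub-depth readings, a > b > c
print(fuse_sub_depths([a, b, c], "median"))  # 0.6, i.e. b
print(fuse_sub_depths([a, b, c], "mean"))    # (a + b + c) / 3
```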
In step 2010, a three-dimensional image of the biometric feature is constructed from the depth information.
In step 2011, the three-dimensional image is matched with a preset image to determine whether the matching is successful.
In this embodiment, when the three-dimensional image is successfully matched with the preset image, step 2012 is executed; when the three-dimensional image fails to match the preset image, the process returns to step 204 to reacquire the depth information of the corresponding biometric feature and reconstruct the three-dimensional image.
Of course, in some other embodiments, the process may also return to step 2010 to construct the three-dimensional image again, so as to save processing resources.
In step 2012, handset 100 is unlocked.
In this embodiment, after the mobile phone is unlocked based on the recognition result, the user can perform the next operation step on the mobile phone. Of course, only unlocking the mobile phone is taken as an example for description here, and the method can also be applied to various application scenarios such as payment and login.
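Taken together, the branching among steps 204 to 2012 amounts to a retry loop; the sketch below organizes it that way, with the similarity function, the 0.9 threshold, and the attempt budget as stand-ins for criteria the disclosure does not specify.

```python
def biometric_unlock(acquire_depth, build_3d, similarity, preset,
                     threshold=0.9, max_attempts=3):
    """Retry loop for steps 204-2012: re-acquire depth and rebuild the
    three-dimensional image until it matches the preset image or the
    attempt budget runs out."""
    for _ in range(max_attempts):
        depth = acquire_depth()                        # steps 204-209
        image_3d = build_3d(depth)                     # step 2010
        if similarity(image_3d, preset) >= threshold:  # step 2011
            return True                                # step 2012: unlock
    return False

# Toy usage with trivially matching stand-ins
ok = biometric_unlock(acquire_depth=lambda: [0.6],
                      build_3d=lambda d: tuple(d),
                      similarity=lambda a, b: 1.0 if a == b else 0.0,
                      preset=(0.6,))
print(ok)  # True
```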
Corresponding to the foregoing embodiments of the biometric method, the present disclosure also provides embodiments of a biometric apparatus.
Fig. 6 is a block diagram illustrating a biometric identification device according to an example embodiment. Referring to fig. 6, the apparatus 200 includes a receiving module 601, a determining module 602, and an imaging module 603.
The receiving module 601 is configured to receive a user trigger instruction for the camera module;
a determining module 602 configured to determine depth information of a biometric feature of a photographed object in response to a user trigger instruction received by the receiving module 601;
an imaging module 603 configured to construct a three-dimensional image of the biometric feature from the depth information determined by the determination module 602.
As shown in fig. 7, fig. 7 is a block diagram of another biometric apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 6, and the determining module 602 may include: a transmit sub-module 6021, a receive sub-module 6022, and an acquisition sub-module 6023, wherein:
an emission sub-module 6021 configured to emit detection light to the biometric feature;
a receiving sub-module 6022 configured to receive reflected light of the detection light emitted by the emitting sub-module 6021 after the detection light is reflected by the biometric feature of the photographed object;
an obtaining sub-module 6023 configured to obtain the depth information from the detection light rays emitted by the emitting sub-module 6021 and the reflection light rays received by the receiving sub-module 6022.
As shown in fig. 8, fig. 8 is a block diagram of another biometric apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 7, and the acquisition sub-module 6023 may include an acquisition unit 6023A, in which:
an obtaining unit 6023A configured to obtain the depth information according to a phase difference and a time difference between the detected light ray and the reflected light ray.
The acquisition unit 6023A may further include a first acquisition subunit and a second acquisition subunit, wherein the first acquisition subunit may be configured to acquire the sub-depth information a plurality of times according to a phase difference and a time difference between the detected light and the reflected light; the second obtaining subunit may be configured to perform a mean filtering algorithm on the sub-depth information obtained by the first obtaining subunit to obtain the depth information.
As shown in fig. 9, fig. 9 is a block diagram of another biometric apparatus according to an exemplary embodiment, which is based on any one of the foregoing embodiments shown in fig. 6-8, and the receiving module 601 may include a first receiving submodule 6011, wherein:
a first receiving submodule 6011 configured to receive a first trigger instruction for a physical key on an electronic device; or,
alternatively, as shown in fig. 10, the receiving module 601 includes a second receiving submodule 6012,
the second receiving submodule 6012 is configured to receive a second trigger instruction for a virtual key on a function page shown by the electronic device.
As shown in fig. 11, fig. 11 is a block diagram of another biometric apparatus according to an exemplary embodiment, which may further include, on the basis of the embodiment shown in any one of the foregoing fig. 6 to 9:
and the identification module 604 is configured to perform biological feature identification based on the three-dimensional image and the predefined image, and complete preset function operation based on the identification result.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present disclosure also provides a biometric device, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: receiving a user trigger instruction aiming at the camera module; in response to the user trigger instruction, determining depth information of biological features of a photographed object; and constructing a three-dimensional image of the biological feature according to the depth information.
Accordingly, the present disclosure also provides a terminal comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured for execution by the one or more processors to include instructions for: receiving a user trigger instruction aiming at the camera module; in response to the user trigger instruction, determining depth information of biological features of a photographed object; and constructing a three-dimensional image of the biological feature according to the depth information.
Fig. 12 is a block diagram illustrating an apparatus for biometric identification according to an example embodiment. For example, the apparatus 1200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 12, the apparatus 1200 may include one or more of the following components: processing component 1202, memory 1204, power component 1206, multimedia component 1208, audio component 1210, input/output (I/O) interface 1212, sensor component 1214, and communications component 1216.
The processing component 1202 generally controls overall operation of the apparatus 1200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 1202 may include one or more processors 1220 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 1202 can include one or more modules that facilitate interaction between the processing component 1202 and other components. For example, the processing component 1202 can include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.
The memory 1204 is configured to store various types of data to support operation at the apparatus 1200. Examples of such data include instructions for any application or method operating on the device 1200, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1204 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A power supply component 1206 provides power to the various components of the device 1200. Power components 1206 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for apparatus 1200.
The multimedia components 1208 include a screen that provides an output interface between the device 1200 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1208 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 1200 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1214 includes one or more sensors for providing various aspects of state assessment for the apparatus 1200. For example, the sensor assembly 1214 may detect an open/closed state of the apparatus 1200, the relative positioning of the components, such as a display and keypad of the apparatus 1200, the sensor assembly 1214 may also detect a change in the position of the apparatus 1200 or a component of the apparatus 1200, the presence or absence of user contact with the apparatus 1200, orientation or acceleration/deceleration of the apparatus 1200, and a change in the temperature of the apparatus 1200. The sensor assembly 1214 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 1214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communications component 1216 is configured to facilitate communications between the apparatus 1200 and other devices in a wired or wireless manner. The apparatus 1200 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1216 receives the broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1216 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as memory 1204 comprising instructions, executable by processor 1220 of apparatus 1200 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (14)
1. A biometric identification method, comprising:
receiving a user trigger instruction for the camera module;
in response to the user trigger instruction, determining a biometric feature region in an image formed by the photographed object on the electronic device, determining a corresponding biometric feature position in the real world according to the biometric feature region, and determining depth information of the biometric feature of the photographed object for the biometric feature position;
and constructing a three-dimensional image of the biological feature according to the depth information.
2. The biometric identification method according to claim 1, wherein the determining depth information of the biometric characteristic of the photographed object includes:
emitting detection light to the biometric feature;
receiving reflected light of the detection light reflected by the biometric feature of the photographed object;
and acquiring the depth information according to the detection light and the reflection light.
3. The biometric method according to claim 2, wherein the obtaining the depth information based on the detection light and the reflection light comprises:
and acquiring the depth information according to the phase difference and the time difference between the detection light and the reflection light.
4. The biometric method according to claim 3, wherein the obtaining the depth information based on the phase difference and the time difference between the detection light and the reflection light comprises:
acquiring sub-depth information multiple times according to the phase difference and the time difference between the detection light and the reflection light;
and performing a mean filtering calculation or a median filtering calculation on the acquired sub-depth information to obtain the depth information.
5. The biometric identification method according to claim 1, wherein the receiving of the user trigger instruction for the camera module comprises:
receiving a first trigger instruction for a physical key on the electronic device; or,
receiving a second trigger instruction for a virtual key on the function page shown by the electronic device.
6. The biometric identification method according to claim 1, further comprising, after constructing the three-dimensional image of the biometric feature from the depth information:
and carrying out biological feature recognition based on the three-dimensional image and the predefined image, and finishing preset function operation based on a recognition result.
7. A biometric identification device, comprising:
the receiving module is configured to receive a user trigger instruction for the camera module;
the determining module is configured to, in response to the user trigger instruction received by the receiving module, determine a biometric feature region in an image formed by the photographed object on an electronic device, determine a corresponding biometric feature position in the real world according to the biometric feature region, and determine depth information of the biometric feature of the photographed object for the biometric feature position;
an imaging module configured to construct a three-dimensional image of the biometric feature from the depth information determined by the determination module.
8. The biometric recognition apparatus of claim 7, wherein the determining module comprises:
an emission sub-module configured to emit detection light to the biometric feature;
the receiving sub-module is configured to receive reflected light of the detection light emitted by the emitting sub-module after the detection light is reflected by the biometric feature of the photographed object;
and the acquisition sub-module is configured to acquire the depth information according to the detection light rays emitted by the emission sub-module and the reflection light rays received by the receiving sub-module.
9. The biometric recognition apparatus of claim 8, wherein the acquisition sub-module comprises:
an acquisition unit configured to acquire the depth information according to a phase difference and a time difference between the detection light and the reflection light.
10. The biometric apparatus according to claim 9, wherein the acquisition unit includes:
a first acquisition subunit configured to acquire sub-depth information a plurality of times according to a phase difference and a time difference between the detection light and the reflection light;
and the second acquisition subunit is configured to perform a mean filtering algorithm on the sub-depth information acquired by the first acquisition subunit to acquire the depth information.
11. The biometric identification device of claim 7, wherein the receiving module comprises:
the first receiving submodule is configured to receive a first trigger instruction for a physical key on the electronic device; or,
the second receiving submodule is configured to receive a second trigger instruction for a virtual key on the function page shown by the electronic device.
12. The biometric recognition device of claim 7, further comprising:
and the identification module is configured to perform biological feature identification on the basis of the three-dimensional image and the predefined image and complete preset function operation on the basis of an identification result.
13. A computer-readable storage medium having stored thereon computer instructions, which, when executed by a processor, carry out the steps of the method according to any one of claims 1-6.
14. An electronic device, characterized in that the electronic device comprises:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the steps of the method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711023405.7A CN109726611B (en) | 2017-10-27 | 2017-10-27 | Biological feature recognition method and device, readable storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109726611A CN109726611A (en) | 2019-05-07 |
CN109726611B true CN109726611B (en) | 2021-07-23 |
Family
ID=66291686
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711023405.7A Active CN109726611B (en) | 2017-10-27 | 2017-10-27 | Biological feature recognition method and device, readable storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109726611B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110944112A (en) * | 2019-11-22 | 2020-03-31 | 维沃移动通信有限公司 | Image processing method and electronic equipment |
CN114584697B (en) * | 2020-11-16 | 2024-06-25 | 中国航发商用航空发动机有限责任公司 | Residue detection device and method |
- 2017-10-27: application CN201711023405.7A filed in China; granted as patent CN109726611B, status active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1575524A (en) * | 2001-08-23 | 2005-02-02 | 华盛顿州大学 | Image acquisition with depth enhancement |
CN101866056A (en) * | 2010-05-28 | 2010-10-20 | 中国科学院合肥物质科学研究院 | Three-dimensional imaging method and system based on LED array common lens TOF depth measurement |
CN102073050A (en) * | 2010-12-17 | 2011-05-25 | 清华大学 | Depth-camera based three-dimensional scene depth measurement device |
CN104516560A (en) * | 2013-09-27 | 2015-04-15 | 联想(北京)有限公司 | Identification method, identification device and electronic equipment |
CN104008366A (en) * | 2014-04-17 | 2014-08-27 | 深圳市唯特视科技有限公司 | 3D intelligent recognition method and system for biology |
CN106485118A (en) * | 2016-09-19 | 2017-03-08 | 信利光电股份有限公司 | Electronic equipment and its identifying system, decryption method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9953506B2 (en) | Alarming method and device | |
US10452890B2 (en) | Fingerprint template input method, device and medium | |
CN105631803B (en) | The method and apparatus of filter processing | |
CN108010060B (en) | Target detection method and device | |
CN107944367B (en) | Face key point detection method and device | |
CN106557759B (en) | Signpost information acquisition method and device | |
US10242678B2 (en) | Friend addition using voiceprint analysis method, device and medium | |
US10379602B2 (en) | Method and device for switching environment picture | |
US20170339287A1 (en) | Image transmission method and apparatus | |
US20170371506A1 (en) | Method, device, and computer-readable medium for message generation | |
CN109726614A (en) | 3D stereoscopic imaging method and device, readable storage medium storing program for executing, electronic equipment | |
CN105426485A (en) | Image combination method and device, intelligent terminal and server | |
CN110059547B (en) | Target detection method and device | |
CN107341000B (en) | Method and device for displaying fingerprint input image and terminal | |
CN106886019B (en) | Distance measurement method and device | |
CN107657608B (en) | Image quality determination method and device and electronic equipment | |
EP3435283A1 (en) | Method and device for optical fingerprint recognition, and computer-readable storage medium | |
CN109726611B (en) | Biological feature recognition method and device, readable storage medium and electronic equipment | |
CN109284591B (en) | Face unlocking method and device | |
WO2023273050A1 (en) | Living body detection method and apparatus, electronic device, and storage medium | |
CN106469446B (en) | Depth image segmentation method and segmentation device | |
CN108154090B (en) | Face recognition method and device | |
CN107730452B (en) | Image splicing method and device | |
CN104954683B (en) | Determine the method and device of photographic device | |
CN109543564A (en) | Based reminding method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |