
CN119418373A - A method for synchronously collecting contact finger images and finger state true values - Google Patents


Info

Publication number
CN119418373A
Authority
CN
China
Prior art keywords
finger
contact
state
images
image sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202411335583.3A
Other languages
Chinese (zh)
Other versions
CN119418373B
Inventor
冯建江
裴浩翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN202411335583.3A priority Critical patent/CN119418373B/en
Publication of CN119418373A publication Critical patent/CN119418373A/en
Application granted granted Critical
Publication of CN119418373B publication Critical patent/CN119418373B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 - Fingerprints or palmprints
    • G06V 40/13 - Sensors therefor
    • G06V 40/1306 - Sensors therefor non-optical, e.g. ultrasonic or capacitive sensing
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 - Fingerprints or palmprints
    • G06V 40/13 - Sensors therefor
    • G06V 40/1312 - Sensors therefor direct reading, e.g. contactless acquisition
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 - Fingerprints or palmprints
    • G06V 40/13 - Sensors therefor
    • G06V 40/1318 - Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Input (AREA)

Abstract


The present application proposes a method for synchronously collecting contact finger images and finger-state ground-truth values, relating to the fields of biometric recognition and human-computer interaction. The method includes: collecting contact finger images with a first device that has a fingerprint or touch sensor, while synchronously collecting a non-contact image sequence of the finger, either by means of specular reflection or with an additional second device that has a camera; computing ground-truth data of the finger state from the non-contact finger image sequence using a variety of visual algorithms; and optimizing contact-based finger-state measurement techniques according to that ground-truth data. While the fingerprint or touch sensor of a consumer electronic device such as a mobile phone collects contact finger images, the method conveniently captures, at low cost, an image sequence of the finger and the adjacent hand region, from which ground-truth data of the finger state is obtained, enabling convenient acquisition of the finger-state data associated with the contact finger images.

Description

Synchronous acquisition method for contact finger image and finger state truth value
Technical Field
The present application relates to the technical fields of biometric recognition and human-computer interaction, and in particular to a method for synchronously acquiring contact finger images and ground-truth values of the finger state.
Background
Fingerprint recognition is a mature biometric technology that has been widely applied to identity recognition. However, a fingerprint image contains not only individual identity information but also information such as the three-dimensional pose of the finger and the force applied by the finger. Information extracted from fingerprint images can therefore be applied in human-computer interaction to develop various interaction modes, and can also enable richer operations in identity-authentication applications. Touch interaction technology has similar characteristics: besides the widely extracted contact and positioning information, a touch image also contains information such as the three-dimensional pose and force of the finger, which can be used to develop further practical applications.
Many finger-state measurement techniques based on fingerprint or touch sensors employ deep networks and therefore rely on collecting a large number of finger samples together with ground-truth state values. However, current techniques for collecting finger-state ground truth, such as optical tracking, are complex and require expensive equipment and cumbersome operating procedures. They are difficult to use for collecting the current user's finger-state ground-truth data, and thus difficult to use for personalized fine-tuning of pre-trained deep networks. How to use a mobile phone equipped with a fingerprint or touch sensor to conveniently and simultaneously collect contact finger images and finger-state ground-truth data, thereby providing abundant data for the design and optimization of application algorithms, has therefore become a problem worth studying.
Disclosure of Invention
The present application aims to solve at least one of the technical problems in the related art to some extent.
Therefore, the application aims to provide a method, a device, an electronic device, and a readable medium for synchronously acquiring contact finger images and ground-truth values of the finger state, which can synchronously acquire a non-contact image sequence of the finger while contact finger images are acquired with a mobile-phone sensor, and obtain ground-truth data of the finger state from the non-contact image sequence, for use in optimizing finger-state measurement techniques based on contact images.
To achieve the above objective, an embodiment of the first aspect of the present application provides a method for synchronously collecting contact finger images and ground-truth values of the finger state, including:
collecting contact finger images through a first device with a fingerprint or touch sensor, and synchronously collecting a non-contact image sequence of the finger based on the specular-reflection principle or an additional second device with a camera;
computing ground-truth data of the finger state from the non-contact finger image sequence using a plurality of visual algorithms; and
optimizing contact-based finger-state measurement techniques according to the ground-truth data of the finger state.
Optionally, collecting contact finger images through the first device with a fingerprint or touch sensor and synchronously collecting a non-contact image sequence of the finger based on the specular-reflection principle includes:
placing the first device front side up on a desktop, or otherwise fixing its position;
fixing a mirror above the front face of the first device, with the mirror surface facing the screen of the first device;
when the user places a finger on the fingerprint or touch sensor of the first device to collect contact finger images, using the front camera of the first device to synchronously collect a non-contact image sequence of the finger via the mirror's reflection.
Optionally, collecting contact finger images through a device with a fingerprint or touch sensor and synchronously collecting a non-contact image sequence of the finger with an additional second device that has a camera includes:
placing the first device front side up on a desktop, or otherwise fixing its position;
fixing the second device above the front face of the first device;
when the user places a finger on the fingerprint or touch sensor of the first device to collect contact finger images, synchronously collecting a non-contact image sequence of the finger with the second device.
Optionally, the method further comprises:
when the non-contact image sequence of the finger is acquired, using a supplementary light source for auxiliary illumination as actual conditions require.
Optionally, the ground-truth data of the finger state includes the finger position, three-dimensional pose, force, and other data associated with the information contained in the contact finger image, where the three-dimensional pose of the finger refers to the three-dimensional pose of the first segment of the finger.
Optionally, computing ground-truth data of the finger state from the non-contact finger image sequence using a plurality of visual algorithms includes:
constructing a visual recognition model with a deep-learning method and recognizing which finger is in contact;
solving for the three-dimensional pose of the finger from the spatial position relationship and the camera parameters; and
estimating the force of the finger from morphological changes of the fingernail with a deep-learning method.
Optionally, the contact-based finger-state measurement technique includes:
measuring the three-dimensional pose, force, and finger position from the contact finger image by a deep-learning method.
To achieve the above object, an embodiment of the second aspect of the present application provides a device for synchronously collecting contact finger images and ground-truth values of the finger state, including:
an acquisition module, configured to collect contact finger images through a first device with a fingerprint or touch sensor and to synchronously collect a non-contact image sequence of the finger based on the specular-reflection principle or an additional second device with a camera;
a calculation module, configured to compute ground-truth data of the finger state from the non-contact finger image sequence using a plurality of visual algorithms; and
an optimization module, configured to optimize contact-based finger-state measurement techniques according to the ground-truth data of the finger state.
To achieve the above object, an embodiment of a third aspect of the present application provides an electronic device, including a processor, and a memory communicatively connected to the processor;
The memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory to implement the method of any one of the first aspects above.
To achieve the above object, an embodiment of a fourth aspect of the present application proposes a computer-readable storage medium having stored therein computer-executable instructions for implementing the method according to any of the above first aspects when being executed by a processor.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
By proposing two types of acquisition schemes, namely one based on the specular-reflection principle and one based on an additional device with a camera, an image sequence of the finger (which may include the adjacent hand region) can be acquired synchronously and without contact while the user collects contact finger images with the contact sensor (a fingerprint sensor or a touch sensor) of a consumer electronic device such as a mobile phone, tablet computer, or smart watch. Various ground-truth data of the finger state can then be obtained from the non-contact images through visual algorithms, and the contact-based finger-state measurement technique can be optimized using that ground-truth data.
Additional aspects and advantages of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flow chart of a method for synchronous acquisition of contact finger images and finger state truth values according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating the collection of finger data using specular reflection, according to an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating the use of an additional capture device to capture finger data according to an embodiment of the present application;
FIG. 4 is a flow chart illustrating the collection of finger data using specular reflection, according to an embodiment of the application;
FIG. 5 is a flow chart illustrating the collection of finger data using an additional capture device in accordance with an embodiment of the present application;
fig. 6 is a block diagram of a device for synchronously acquiring contact finger images and ground-truth values of the finger state according to an embodiment of the application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present application and should not be construed as limiting the application.
The following describes a method and a device for synchronously acquiring a contact finger image and a true finger state value according to an embodiment of the present application with reference to the accompanying drawings.
Fig. 1 is a flowchart of a method for synchronously collecting a contact finger image and a true value of a finger state according to an embodiment of the present application, as shown in fig. 1, the method includes the following steps:
Step 101, acquiring a contact finger image by a first device with a fingerprint or a touch sensor, and synchronously acquiring a non-contact image sequence of the finger based on a specular reflection principle or an additional second device with a shooting function.
The prior art has the problem that current mainstream techniques for collecting finger-state ground truth, such as optical tracking, are complex and require expensive equipment and cumbersome operating procedures. Such techniques are difficult to use for collecting the current user's finger-state ground-truth data, and thus difficult to use for personalized fine-tuning of pre-trained deep networks.
Therefore, to solve the above problems, the present application proposes two types of acquisition schemes: acquiring a non-contact image sequence of the finger either based on the specular-reflection principle or with an additional device that has a camera, and then computing ground-truth data of the finger state with various visual algorithms.
In embodiments of the application, the first device may be a consumer electronic device that includes a fingerprint or touch sensor, such as a mobile phone, tablet computer, or smart watch; the second device may be a mobile phone, camera, smart glasses, another wearable device, or the like with a camera. The application is not limited in this respect.
In a possible embodiment, a schematic diagram of acquiring finger data using specular reflection is shown in fig. 2, and a schematic diagram of acquiring finger data using an additional photographing device is shown in fig. 3. In this embodiment, the first device is a mobile phone with a fingerprint sensor or a touch sensor, and the second device is a mobile phone with a photographing function or other photographing devices.
FIG. 4 is a flow chart illustrating the collection of finger data using specular reflection, including:
In step 201, the first device is placed front side up on a desktop, or its position is otherwise fixed.
If the first device is a mobile phone with a fingerprint or touch sensor, as shown in fig. 2, the phone is placed on a desktop with its front side facing upwards, or its position is fixed in another way.
It will be appreciated that there are a number of ways to fix the position of the handset, and the application is not limited in this regard.
Step 202, a mirror is fixed above the front face of the first device, with the mirror surface facing the screen of the first device.
As shown in fig. 2, the mirror is fixed above the front of the mobile phone, with the mirror surface facing the phone's screen.
It will be appreciated that there are a number of ways to fix the position of the mirror, and the application is not limited in this regard.
In one possible embodiment, a special bracket may be used to secure the mirror to the upper region of the front of the handset.
Step 203, when the user places a finger on the fingerprint or touch sensor of the first device to collect contact finger images, the front camera of the first device is used to synchronously collect a non-contact image sequence of the finger via the mirror's reflection.
As shown in fig. 2, when the user places a finger on the fingerprint or touch sensor of a mobile phone to collect contact finger images, the front camera of the same phone can synchronously collect a non-contact image sequence of the finger via the mirror's reflection.
That is, while the phone's sensor collects the contact finger image, the front camera of the same phone synchronously acquires a non-contact image sequence of the finger.
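Geometrically, this mirror setup is equivalent to adding a second, virtual camera: reflecting the front camera's optical center across the mirror plane yields a virtual viewpoint that observes the finger directly, so ordinary single-camera algorithms can run on the mirrored image sequence. A minimal numpy sketch of that reflection; the positions and mirror geometry below are purely illustrative assumptions, not values from the application:

```python
import numpy as np

def reflect_point(p, plane_point, plane_normal):
    """Reflect a 3D point across the plane given by a point on it and a normal.

    The mirrored image seen by the real camera equals the image a camera
    placed at this reflected ("virtual") position would see directly.
    """
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)                     # ensure unit normal
    p = np.asarray(p, dtype=float)
    d = np.dot(p - np.asarray(plane_point, dtype=float), n)  # signed distance
    return p - 2.0 * d * n

# Illustrative numbers (meters): front camera near the top of the phone,
# mirror plane tilted 45 degrees over the phone's upper edge.
camera = np.array([0.0, 0.10, 0.0])
plane_point = np.array([0.0, 0.12, 0.04])
plane_normal = np.array([0.0, -1.0, 1.0])         # 45-degree mirror

virtual_camera = reflect_point(camera, plane_point, plane_normal)
print(virtual_camera)                             # reflects to (0.0, 0.08, 0.02)
```

Reflecting the virtual camera again returns the real one, which is a quick sanity check on any mirror-calibration code.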
Fig. 5 is a flowchart illustrating the acquisition of finger data using an additional photographing apparatus according to an embodiment of the present application, including:
Step 301, the first device is placed front side up on a desktop, or its position is otherwise fixed.
Similar to the procedure above, the phone is placed front side up on a desktop or its position is fixed in another way.
It will be appreciated that there are a number of ways to fix the position of the handset, and the application is not limited in this regard.
Step 302, fixing the second device on the upper area of the front surface of the first device.
It will be appreciated that there are various ways of fixing the position of a device having a photographing function, and the present application is not limited thereto.
In one possible embodiment, another mobile phone or other device with a camera is fixed above the front face of the phone using a dedicated bracket.
Step 303, when the user places a finger on the fingerprint or touch sensor of the first device to collect contact finger images, a non-contact image sequence of the finger is synchronously collected with the second device.
As shown in fig. 3, when the user places a finger on the fingerprint or touch sensor of a mobile phone to collect contact finger images, the front camera of another mobile phone or other capture device can synchronously collect a non-contact image sequence of the finger.
That is, while the phone's sensor collects the contact finger image, a non-contact image sequence of the finger is synchronously acquired with the camera of another phone or other capture device.
In addition, for the methods shown in figs. 4 and 5, to achieve better image quality, a supplementary light source may be used for auxiliary illumination when acquiring the non-contact image sequence of the finger, as actual conditions require.
Step 102, computing ground-truth data of the finger state from the non-contact finger image sequence using a plurality of visual algorithms.
In embodiments of the application, the ground-truth data of the finger state includes the finger position (e.g., right index finger), three-dimensional pose, force (normal and shear force), and other data related to the information contained in the contact finger image, where the three-dimensional pose of the finger refers to the three-dimensional pose of the first segment of the finger.
It can be appreciated that, using reasonable visual algorithms, ground-truth data of the finger state (or data very close to ground truth) can be derived from the finger image sequence. The application does not specifically limit the visual algorithms adopted: any visual algorithm capable of computing ground-truth data of the finger state may be used.
Several algorithm schemes are listed below for reference, not as limitations of the application:
(1) Construct a visual recognition model with a deep-learning method and recognize which finger is in contact. Alternatively, when the fingerprint is acquired, distinguish fingers by designating the collected finger position in advance.
As one possible implementation, the position and identity of the finger (e.g., right index finger) may be automatically identified by training a deep-learning model, such as a convolutional neural network (CNN) or another deep-learning architecture.
Alternatively, when the fingerprint is acquired, the distinction can be made by designating the collected finger position in advance; for example, the user may specify beforehand which finger is placed on the sensor.
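As a concrete illustration of the recognition path, the sketch below stands in for a trained CNN with a single linear-plus-softmax layer over flattened pixels. The class list, image size, and random weights are all assumptions made for demonstration; only the interface (image in, class probabilities out) reflects the text:

```python
import numpy as np

# Hypothetical label set; a real system would define its own finger classes.
FINGER_CLASSES = ["left_index", "left_thumb", "right_index", "right_thumb"]

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class ToyFingerClassifier:
    """Stand-in for a CNN: one linear layer + softmax over flattened pixels."""

    def __init__(self, image_size=(32, 32), n_classes=len(FINGER_CLASSES), seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.01, (image_size[0] * image_size[1], n_classes))
        self.b = np.zeros(n_classes)

    def predict_proba(self, image):
        x = np.asarray(image, dtype=float).reshape(-1)
        return softmax(x @ self.W + self.b)

    def predict(self, image):
        return FINGER_CLASSES[int(np.argmax(self.predict_proba(image)))]

clf = ToyFingerClassifier()
frame = np.zeros((32, 32))               # placeholder for a camera frame
probs = clf.predict_proba(frame)
print(clf.predict(frame), probs.sum())   # class probabilities sum to 1
```

A real implementation would replace `ToyFingerClassifier` with a trained convolutional network, but the inference interface would look the same.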
(2) Solve for the three-dimensional pose of the finger from the spatial position relationship and the camera parameters.
As one possible implementation, the pose of the finger in three-dimensional space is computed from the spatial geometry and the camera parameters (e.g., focal length and pixel size) through multi-view geometric reconstruction, which may be realized with structured-light, stereo-vision, or multi-view geometry methods.
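A single-view sketch of the pose solve, under the simplifying assumption that keypoint depths are already known (in a full system they would come from stereo, structured light, or the mirror geometry). The pinhole intrinsics and pixel coordinates below are illustrative, not from the application:

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection of pixel (u, v) at a known depth to camera coords."""
    return np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth])

def finger_pitch_yaw(tip_px, joint_px, tip_depth, joint_depth, intrinsics):
    """Pitch and yaw (degrees) of the first finger segment in camera coordinates."""
    fx, fy, cx, cy = intrinsics
    tip = backproject(tip_px[0], tip_px[1], tip_depth, fx, fy, cx, cy)
    joint = backproject(joint_px[0], joint_px[1], joint_depth, fx, fy, cx, cy)
    d = tip - joint
    d = d / np.linalg.norm(d)                 # unit vector along the segment
    pitch = np.degrees(np.arcsin(-d[1]))      # image y points down, so negate
    yaw = np.degrees(np.arctan2(d[0], d[2]))  # angle from the optical (z) axis
    return pitch, yaw

# Illustrative intrinsics (fx, fy, cx, cy) and keypoints:
# a finger lying flat in front of the camera, pointing to the right.
K = (500.0, 500.0, 320.0, 240.0)
pitch, yaw = finger_pitch_yaw((370, 240), (320, 240), 0.20, 0.20, K)
print(pitch, yaw)   # parallel to the image plane: pitch 0, yaw 90
```

With two calibrated views (the real and virtual cameras of the mirror setup), the depths themselves could be triangulated instead of assumed.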
(3) Estimate the force of the finger from morphological changes of the fingernail with a deep-learning method.
As one possible implementation, the morphological changes of the fingernail are analyzed by a deep-learning model to estimate the forces applied by the finger (including normal and shear forces). Such a method may be based on a convolutional neural network or another deep-learning model trained to recognize force-induced changes.
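Since a trained force-regression network cannot be reproduced here, the sketch below substitutes a deliberately simple proxy: press-induced blanching (brightening) of the nail region, mapped to normal force through a made-up linear calibration. Only the pipeline shape (nail region in, force estimate out) reflects the text; the gain and bias values are invented:

```python
import numpy as np

def nail_blanching_score(nail_roi_rest, nail_roi_pressed):
    """Mean brightness increase of the nail region; pressing blanches the nail."""
    return float(np.mean(nail_roi_pressed) - np.mean(nail_roi_rest))

def estimate_normal_force(score, gain=0.05, bias=0.0):
    """Toy linear map from blanching score to normal force in newtons.

    A real system would replace this with a CNN regressor trained against
    force-sensor ground truth; gain and bias here are made-up calibration.
    """
    return max(0.0, gain * score + bias)

rest = np.full((20, 30), 120.0)       # synthetic nail patch, finger at rest
pressed = np.full((20, 30), 150.0)    # brighter (blanched) when pressed
score = nail_blanching_score(rest, pressed)   # 30.0
print(estimate_normal_force(score))           # 1.5 N under the toy calibration
```

Shear force would additionally require tracking the lateral shift of the nail or skin texture, which this scalar proxy cannot capture.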
In embodiments of the application, more accurate data synchronization between the collected contact finger images and the finger-state ground-truth data can further be achieved with methods such as interpolation. Reasonable storage and presentation formats can be designed according to actual requirements to store the contact finger images and the finger-state ground-truth data, facilitating subsequent use of the data.
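The interpolation-based synchronization mentioned above can be sketched with `numpy.interp`, resampling the truth stream onto the contact-frame timestamps. The sample rates and values below are illustrative:

```python
import numpy as np

def sync_truth_to_frames(frame_ts, truth_ts, truth_vals):
    """Linearly interpolate ground-truth samples onto contact-frame timestamps.

    frame_ts:   timestamps of contact finger images (seconds)
    truth_ts:   timestamps of the truth samples from the non-contact pipeline
    truth_vals: truth values at truth_ts (e.g., one pose angle or force channel)
    """
    return np.interp(frame_ts, truth_ts, truth_vals)

# Truth sampled at roughly 30 Hz; contact frames fall at arbitrary times between.
truth_ts = np.array([0.000, 0.033, 0.066, 0.100])
truth_vals = np.array([10.0, 12.0, 16.0, 20.0])   # e.g., pitch in degrees
frame_ts = np.array([0.0165, 0.083])
print(sync_truth_to_frames(frame_ts, truth_ts, truth_vals))
```

Vector-valued truth (full pose, multi-axis force) would be interpolated channel by channel, and timestamps from the two devices would first be aligned to a common clock.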
In addition, to improve acquisition accuracy and convenience, an intuitive user interface and interaction flow can be designed to guide the user through operations such as finger placement. The acquisition system can compute finger-state ground truth in real time to provide live feedback, such as acquisition-state and image-quality prompts, helping the user understand the state of the current operation.
Step 103, the contact-based finger-state measurement technique is optimized according to the ground-truth data of the finger state.
Finally, the finger-state ground-truth data obtained through the above steps is used to optimize the contact-based finger-state measurement technique.
It should be noted that the contact-based finger-state measurement techniques referred to here include, but are not limited to, measuring the three-dimensional pose, force, finger position, and so on from a fingerprint or touch image by a deep-learning method.
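One way to read "optimizing" in step 103 is the personalized fine-tuning mentioned in the background: adapting a pre-trained measurement model's readout on the newly collected (contact image, ground truth) pairs. In the sketch below, a plain linear readout trained by gradient descent on mean squared error stands in for a network's last layer; all shapes, learning rates, and data are synthetic assumptions:

```python
import numpy as np

def fine_tune_last_layer(features, truth, w, lr=0.1, epochs=500):
    """Gradient descent on MSE to adapt a linear readout to a user's own data.

    features: (n, d) embeddings of the user's contact finger images
    truth:    (n,) ground-truth finger-state values from the non-contact pipeline
    w:        (d,) pre-trained readout weights to be personalized
    """
    w = w.copy()
    n = len(truth)
    for _ in range(epochs):
        err = features @ w - truth
        w -= lr * (features.T @ err) / n   # MSE gradient step
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))               # synthetic per-user embeddings
true_w = np.array([0.5, -1.0, 2.0])        # the user's "true" mapping
y = X @ true_w                             # synthetic ground-truth values
w0 = np.zeros(3)                           # stale pre-trained weights
w = fine_tune_last_layer(X, y, w0)
print(np.round(w, 3))                      # approaches true_w
```

The point of the sketch is only the data flow: the conveniently collected ground truth plays the role of `y`, making per-user adaptation possible without an optical-tracking rig.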
To realize the above embodiments, the application further provides a device for synchronously acquiring contact finger images and ground-truth values of the finger state.
Fig. 6 is a block diagram of a device 10 for synchronously acquiring contact finger images and finger-state ground-truth values according to an embodiment of the present application, comprising:
an acquisition module 100, configured to collect contact finger images through a first device with a fingerprint or touch sensor and to synchronously collect a non-contact image sequence of the finger based on the specular-reflection principle or an additional second device with a camera;
a calculation module 200, configured to compute ground-truth data of the finger state from the non-contact finger image sequence using a plurality of visual algorithms; and
an optimization module 300, configured to optimize contact-based finger-state measurement techniques according to the ground-truth data of the finger state.
The specific manner in which the various modules perform their operations in the apparatus of the above embodiments has been described in detail in the method embodiments and will not be repeated here.
In order to realize the embodiment, the application also provides electronic equipment, which comprises a processor and a memory in communication connection with the processor, wherein the memory stores computer-executable instructions, and the processor executes the computer-executable instructions stored in the memory so as to realize the method for executing the embodiment.
In order to implement the above-described embodiments, the present application also proposes a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, are adapted to implement the methods provided by the foregoing embodiments.
In the foregoing description of embodiments, reference has been made to the terms "one embodiment," "some embodiments," "example," "a particular example," or "some examples," etc., meaning that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and additional implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order from that shown or discussed, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction-execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction-execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction-execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical-fiber device, and a portable compact-disc read-only memory (CD-ROM). The computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques known in the art: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with appropriate combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like. While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the application; changes, modifications, substitutions, and variations may be made to the above embodiments by those of ordinary skill in the art within the scope of the application.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present application are achieved, and the present application is not limited herein.
The above embodiments do not limit the scope of the present application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application should be included in the scope of the present application.

Claims (10)

1. A method for synchronously acquiring contact finger images and finger state truth values, comprising:
acquiring contact finger images through a first device having a fingerprint or touch sensor, and synchronously acquiring a non-contact image sequence of the finger based on a specular reflection principle or by means of an additional second device with a photographing function;
calculating truth value data of the finger state from the non-contact acquired finger image sequence using a plurality of vision algorithms;
and optimizing a contact finger state measurement technique according to the truth value data of the finger state.
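As an illustrative sketch of the synchronization in claim 1 (not part of the patented method itself): one practical way to align the two streams is to timestamp every contact capture and every non-contact camera frame, then pair each contact image with the nearest frame in time. The function below is a hypothetical helper under that assumption.

```python
from bisect import bisect_left

def pair_by_timestamp(contact_ts, frame_ts):
    """For each contact-capture timestamp, return the index of the
    nearest non-contact camera frame (frame_ts sorted ascending)."""
    pairs = []
    for t in contact_ts:
        i = bisect_left(frame_ts, t)
        # candidates: the frame just before and just after t; keep the closer
        best = min(
            (j for j in (i - 1, i) if 0 <= j < len(frame_ts)),
            key=lambda j: abs(frame_ts[j] - t),
        )
        pairs.append(best)
    return pairs

# Contact sensor fires at 0.10 s and 0.24 s; camera frames arrive at 30 fps.
frames = [k / 30 for k in range(12)]
print(pair_by_timestamp([0.10, 0.24], frames))  # → [3, 7]
```

The same pairing can be run offline after acquisition, so neither device needs hardware-level triggering.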
2. The method of claim 1, wherein acquiring contact finger images through the first device having a fingerprint or touch sensor and synchronously acquiring a non-contact image sequence of the finger based on the specular reflection principle comprises:
placing the first device front side up on a desktop, or otherwise fixing its position;
fixing a reflector above the front surface of the first device, with the mirror surface of the reflector facing the screen of the first device;
when a user places a finger on the fingerprint or touch sensor of the first device to acquire contact finger images, synchronously acquiring a non-contact image sequence of the finger with the front camera of the first device through the reflection of the mirror surface.
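Geometrically, the mirror setup in claim 2 means the front camera observes the reflection of the finger across the mirror plane, so recovering the true finger geometry reduces to a plane reflection. A minimal sketch, assuming a planar mirror with known unit normal `n` and offset `d` (both hypothetical values here):

```python
import numpy as np

def reflect_point(p, n, d):
    """Reflect point p across the mirror plane {x : n . x = d},
    where n is the unit normal of the mirror surface."""
    n = np.asarray(n, dtype=float)
    p = np.asarray(p, dtype=float)
    return p - 2.0 * (p @ n - d) * n

# Hypothetical mirror lying in the plane z = 0.05 m above the device:
fingertip = np.array([0.02, 0.01, 0.0])  # finger resting on the sensor plane
virtual = reflect_point(fingertip, [0, 0, 1], 0.05)  # mirror image at z = 0.10
```

Applying the reflection twice returns the original point, which is a convenient sanity check when calibrating the mirror plane.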
3. The method of claim 1, wherein acquiring contact finger images through the first device having a fingerprint or touch sensor and synchronously acquiring a non-contact image sequence of the finger by means of the additional second device with a photographing function comprises:
placing the first device front side up on a desktop, or otherwise fixing its position;
fixing the second device above the front surface of the first device;
when the user places a finger on the fingerprint or touch sensor of the first device to acquire contact finger images, synchronously acquiring a non-contact image sequence of the finger with the second device.
4. The method according to claim 2 or 3, further comprising:
when acquiring the non-contact image sequence of the finger, using a supplementary light source for auxiliary illumination as needed.
5. The method of claim 1, wherein the truth value data of the finger state includes the finger position, three-dimensional pose, force, and other data associated with the information contained in the contact finger image, wherein the three-dimensional pose of the finger refers to the three-dimensional pose of the first segment of the finger.
6. The method of claim 5, wherein calculating the truth value data of the finger state from the non-contact acquired finger image sequence using a plurality of vision algorithms comprises:
constructing a visual recognition model using a deep learning method, and recognizing the position of the finger;
resolving the three-dimensional pose of the finger according to the spatial position relationship and the camera parameters;
and estimating the force of the finger from morphological changes of the fingernail region using a deep learning method.
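A hedged sketch of the pose-resolving step in claim 6: under a pinhole camera model with intrinsics (fx, fy, cx, cy), two finger-axis keypoints whose depths are known (e.g., from the fixed acquisition geometry) can be back-projected to 3D, and the yaw/pitch of the first finger segment read off the resulting axis. All numbers and names below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Pinhole-model back-projection of pixel (u, v) at a known depth."""
    return np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth])

def finger_pose(tip_px, base_px, tip_depth, base_depth, intrinsics):
    """Yaw and pitch (degrees) of the finger axis from two keypoints."""
    fx, fy, cx, cy = intrinsics
    tip = backproject(*tip_px, tip_depth, fx, fy, cx, cy)
    base = backproject(*base_px, base_depth, fx, fy, cx, cy)
    axis = tip - base
    axis /= np.linalg.norm(axis)
    yaw = np.degrees(np.arctan2(axis[0], axis[2]))   # rotation about camera y-axis
    pitch = np.degrees(np.arcsin(-axis[1]))          # elevation above the x-z plane
    return yaw, pitch

# Hypothetical intrinsics and detected keypoints:
K = (800.0, 800.0, 320.0, 240.0)
yaw, pitch = finger_pose((400, 240), (320, 240), 0.10, 0.12, K)
```

In a real pipeline the keypoints would come from the deep-learning detector mentioned in the claim, and depths from the calibrated mirror or second-camera geometry.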
7. The method of claim 1, wherein the contact finger state measurement technique comprises:
measuring the three-dimensional pose, force, and position of the finger from the contact finger image by a deep learning method.
8. A system for synchronously acquiring contact finger images and finger state truth values, comprising:
an acquisition module, configured to acquire contact finger images through a first device having a fingerprint or touch sensor, and to synchronously acquire a non-contact image sequence of the finger based on a specular reflection principle or by means of an additional second device with a photographing function;
a calculation module, configured to calculate truth value data of the finger state from the non-contact acquired finger image sequence using a plurality of vision algorithms;
and an optimization module, configured to optimize a contact finger state measurement technique according to the truth value data of the finger state.
9. An electronic device comprising a processor and a memory communicatively coupled to the processor;
The memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the method of any one of claims 1-7.
10. A computer readable storage medium having stored therein computer executable instructions which when executed by a processor are adapted to carry out the method of any one of claims 1-7.
CN202411335583.3A 2024-09-24 2024-09-24 Synchronous acquisition method for contact finger image and finger state truth value Active CN119418373B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411335583.3A CN119418373B (en) 2024-09-24 2024-09-24 Synchronous acquisition method for contact finger image and finger state truth value


Publications (2)

Publication Number Publication Date
CN119418373A true CN119418373A (en) 2025-02-11
CN119418373B CN119418373B (en) 2025-11-11

Family

ID=94462524

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411335583.3A Active CN119418373B (en) 2024-09-24 2024-09-24 Synchronous acquisition method for contact finger image and finger state truth value

Country Status (1)

Country Link
CN (1) CN119418373B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180253583A1 (en) * 2017-03-06 2018-09-06 International Business Machines Corporation Recognizing fingerprints and fingerprint combinations as inputs
CN109154959A (en) * 2017-05-17 2019-01-04 深圳市汇顶科技股份有限公司 Optical fingerprint sensor with non-contact imaging capability
CN110728201A (en) * 2019-09-20 2020-01-24 南京元初科技有限公司 Image processing method and device for fingerprint identification
CN112016525A (en) * 2020-09-30 2020-12-01 墨奇科技(北京)有限公司 Non-contact fingerprint acquisition method and device
CN212809298U (en) * 2020-06-04 2021-03-26 苏州博瑞尔特信息科技有限公司 Attendance device with body temperature measurement function
CN113486825A (en) * 2021-07-12 2021-10-08 上海锐瞻智能科技有限公司 Non-contact fingerprint acquisition device, method, system and medium thereof
CN113569631A (en) * 2021-06-16 2021-10-29 清华大学 Monocular non-contact fingerprint perspective distortion correction method and device
CN113569638A (en) * 2021-06-24 2021-10-29 清华大学 Method and device for estimating three-dimensional gesture of finger by planar fingerprint
CN114202778A (en) * 2021-11-04 2022-03-18 清华大学 Method and system for estimating three-dimensional gesture of finger by planar fingerprint
CN114463789A (en) * 2022-01-19 2022-05-10 北京至简墨奇科技有限公司 Non-contact fingerprint image enhancement method, apparatus, storage medium and program product
CN114677714A (en) * 2022-03-30 2022-06-28 西安交通大学 Matching method and related device of contact fingerprint and non-contact fingerprint
CN115909425A (en) * 2022-12-09 2023-04-04 厦门熵基科技有限公司 A model training method, fingerprint image processing method, device and electronic equipment
CN116229523A (en) * 2022-12-28 2023-06-06 泉州装备制造研究所 Method, device and equipment for fingerprint pose estimation
WO2024025502A1 (en) * 2022-07-28 2024-02-01 Sagiroglu Seref Mobile automated fingerprint identification system and method
CN118366190A (en) * 2024-03-06 2024-07-19 熵基科技股份有限公司 Biological characteristic image acquisition method, device, equipment and storage medium
CN118544365A (en) * 2024-07-26 2024-08-27 珠海市可为精密机械有限公司 Flexible grabbing method for mechanical arm for automatic workpiece conversion
CN118644878A (en) * 2024-06-06 2024-09-13 北京邮电大学 A contactless fingerprint recognition method based on three-dimensional features and graph neural network


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YANG Jinfeng et al., "Design of a synchronous acquisition system for finger multimodal feature images and poses", Journal of Civil Aviation University of China, no. 01, 15 February 2020 (2020-02-15) *
WANG Kejun et al., "Research on non-contact fingerprint image recognition algorithms", Acta Electronica Sinica, no. 11, 15 November 2017 (2017-11-15) *

Also Published As

Publication number Publication date
CN119418373B (en) 2025-11-11

Similar Documents

Publication Publication Date Title
KR102292028B1 (en) Gesture recognition method, device, electronic device, and storage medium
CN108496142B (en) A kind of gesture recognition method and related device
CN106485190B (en) Fingerprint registration method and device
KR101612605B1 (en) Method for extracting face feature and apparatus for perforimg the method
KR101929077B1 (en) Image identificaiton method and image identification device
KR20110063679A (en) Vein pattern recognition based biometric authentication system and method
CN110213491B (en) A focusing method, device and storage medium
CN107610177B (en) The method and apparatus of characteristic point is determined in a kind of synchronous superposition
CN101477631A (en) Method, equipment for extracting target from image and human-machine interaction system
CN111008576B (en) Pedestrian detection and its model training, updating method, device and readable storage medium
CN105302294B (en) A kind of interactive virtual reality apparatus for demonstrating
CN116661604A (en) Man-machine interaction recognition system based on Media Pipe frame acquisition gesture
CN106778670A (en) Gesture identifying device and recognition methods
CN105205482A (en) Quick facial feature recognition and posture estimation method
CN104376323B (en) A kind of method and device for determining target range
CN119418373B (en) Synchronous acquisition method for contact finger image and finger state truth value
CN103177245B (en) gesture recognition method and device
CN111460858B (en) Method and device for determining finger tip point in image, storage medium and electronic equipment
CN110187806B (en) Fingerprint template input method and related device
CN114202778B (en) Method and system for estimating three-dimensional gesture of finger by planar fingerprint
CN110162950A (en) Electronic device and control method thereof
CN113365382A (en) Light control method and device, electronic equipment and storage medium
CN116386103B (en) Eye sight drop point estimation method, system and electronic equipment
CN113435229A (en) Face iris multi-mode recognition method and device, readable storage medium and equipment
CN111281355A (en) A method and device for determining pulse collection position

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant