CN117235698A - Face verification method, device, equipment, medium and product - Google Patents
- Publication number: CN117235698A
- Application number: CN202311234898.4A
- Authority: CN (China)
- Legal status: Pending (an assumption, not a legal conclusion)
Abstract
The application provides a face verification method, apparatus, device, medium, and product, relating to the technical field of information security. The face verification method includes the following steps: after the preliminary face verification result is that verification passes, acquiring image acquisition data and gyroscope acquisition data corresponding to the user terminal, where the acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same and the data arrangement order is the same, the image acquisition data includes a plurality of arranged face images, and the gyroscope acquisition data includes a plurality of arranged gyroscope data; determining a first deflection angle between adjacent face images in the image acquisition data; determining a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data; determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle; and determining a final face verification result according to the reference comparison value and the comparison value to be tested. The face verification method improves the accuracy of face verification.
Description
Technical Field
The present application relates to the field of information security technologies, and in particular, to a face verification method, device, apparatus, medium, and product.
Background
With the development of AI (Artificial Intelligence) technology, AI face-swapping has become a highly influential attack means in information security. In an AI face-swapping attack, face videos or pictures of the victim are obtained through illegal means, and the faces are replaced by an AI model, making it difficult to judge whether the result is real or fake.
Especially in the financial field, due to the popularity of face recognition verification, AI face-swapping may cause errors in face recognition, resulting in security problems such as monetary loss and loss of user accounts.
At present, countermeasures against AI face-swapping mainly focus on preventing attackers from acquiring a user's face videos or pictures, that is, on privacy protection; with this approach it is difficult to determine, during the face verification process itself, whether an AI face swap has occurred.
Therefore, a scheme is currently needed that can identify AI face-swapping during the face verification process and thereby improve face verification accuracy.
Disclosure of Invention
The application provides a face verification method, apparatus, device, medium, and product, which are used to solve the current need for a scheme capable of identifying AI face-swapping in the face verification process and improving face verification accuracy.
The first aspect of the present application provides a face verification method, applied to a server, where the server is in communication connection with a user terminal, and the user terminal is provided with a gyroscope and a camera, and the method includes:
after the preliminary face verification result is that verification passes, acquiring image acquisition data and gyroscope acquisition data corresponding to the user terminal, where the acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same and the data arrangement order is the same; the image acquisition data includes a plurality of arranged face images; and the gyroscope acquisition data includes a plurality of arranged gyroscope data;
determining a first deflection angle between adjacent face images in the image acquisition data;
determining a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data;
determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle;
and determining a final face verification result according to the reference comparison value and the comparison value to be tested.
Further, the method as described above, the acquisition requirements include acquisition time and acquisition frequency;
the acquiring the image acquisition data and the gyroscope acquisition data corresponding to the user terminal comprises the following steps:
receiving video authentication data and gyroscope acquisition data sent by a user terminal; the gyroscope acquisition data are acquired and generated by the gyroscope in the user terminal according to the acquisition time and the acquisition frequency; the video authentication data includes video generation time;
and performing image sampling on the video authentication data according to the video generation time, based on the acquisition time and the acquisition frequency, to generate the image acquisition data.
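The sampling step above amounts to mapping each acquisition timestamp onto the nearest video frame. A minimal Python sketch of that index calculation, assuming the acquisition time, acquisition frequency, and video generation time are expressed on a shared clock in seconds (all names and units here are illustrative, not fixed by the patent):

```python
def frame_indices(video_start_s, acq_start_s, acq_freq_hz, n_samples, fps):
    """Map each acquisition sample time to the nearest video frame index.

    video_start_s: video generation time (seconds on a shared clock; assumed unit)
    acq_start_s:   acquisition start time on the same clock
    acq_freq_hz:   acquisition frequency, shared with the gyroscope
    """
    idxs = []
    for i in range(n_samples):
        t = acq_start_s + i / acq_freq_hz        # i-th sample time
        idxs.append(round((t - video_start_s) * fps))
    return idxs
```

The returned indices could then be used to pull frames from the video (for example with a decoder such as OpenCV), yielding face images aligned one-to-one with the gyroscope samples.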
Further, as described above, the determining the first deflection angle between adjacent face images in the image acquisition data includes:
arranging face images in the image acquisition data according to the acquisition time sequence to generate arranged face images;
determining a first deflection angle between each pair of adjacently arranged face images.
Further, in the method as described above, determining the first deflection angle between each pair of adjacently arranged face images includes:
determining the face position and the face inter-pupillary distance in each face image by using a preset face recognition algorithm;
determining a face region to be calculated according to the face position and the inter-pupillary distance, where the x coordinate of the left boundary of the face region is the x coordinate of the left eye minus 0.5 inter-pupillary distance, the x coordinate of the right boundary is the x coordinate of the right eye plus 0.5 inter-pupillary distance, the y coordinate of the upper boundary is the y coordinate of the horizontal eye line plus 1 inter-pupillary distance, and the y coordinate of the lower boundary is the y coordinate of the horizontal eye line minus 1 inter-pupillary distance;
and performing angle deflection calculation on the face regions corresponding to the two adjacently arranged face images by using a preset image feature algorithm to generate the corresponding first deflection angle.
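The boundary arithmetic above can be sketched as a small helper. This is an illustrative fragment only: the eye coordinates would come from the (unspecified) preset face recognition algorithm, and the inter-pupillary distance is simplified here to the horizontal eye separation.

```python
def face_region(left_eye, right_eye):
    """Compute the face region boundaries described in the text.

    left_eye, right_eye: (x, y) pixel coordinates (assumed input format).
    The text defines the upper boundary as eye level + 1 IPD and the lower
    boundary as eye level - 1 IPD; that convention is kept verbatim here,
    even though image y axes often grow downward.
    """
    ipd = right_eye[0] - left_eye[0]             # simplified inter-pupillary distance
    eye_y = (left_eye[1] + right_eye[1]) / 2     # horizontal eye-line level
    x_left = left_eye[0] - 0.5 * ipd
    x_right = right_eye[0] + 0.5 * ipd
    y_lower = eye_y - 1.0 * ipd
    y_upper = eye_y + 1.0 * ipd
    return x_left, x_right, y_lower, y_upper
```

The resulting rectangle would then be cropped from each frame before the preset image feature algorithm compares adjacent crops.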
Further, the method as described above, the determining the second deflection angle between adjacent gyroscope data in the gyroscope acquisition data includes:
arranging all the gyroscope data in the gyroscope acquisition data according to the acquisition time sequence to generate arranged gyroscope data;
for every two adjacently arranged gyroscope data, determining the difference between the later-arranged gyroscope data and the earlier-arranged gyroscope data as the second deflection angle.
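The differencing step above reduces to sorting by timestamp and subtracting neighbors. A minimal sketch, assuming each gyroscope sample is a (timestamp, angle) pair for a single rotation axis (the patent does not fix the sample format):

```python
def second_deflection_angles(gyro_samples):
    """Sort gyroscope readings by acquisition time and difference neighbors.

    gyro_samples: list of (timestamp, angle) pairs; each second deflection
    angle is the later reading minus the earlier reading, as in the text.
    """
    ordered = sorted(gyro_samples, key=lambda s: s[0])
    return [ordered[i + 1][1] - ordered[i][1] for i in range(len(ordered) - 1)]
```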
Further, the method as described above, wherein determining the reference alignment value and the alignment value to be tested according to the first deflection angle and the second deflection angle includes:
performing, according to the arrangement order of the first deflection angles and the second deflection angles, a one-to-one comparison on all first deflection angles and second deflection angles positioned before a preset quantity threshold to generate a plurality of intermediate comparison results, where the one-to-one comparison compares a first deflection angle and a second deflection angle occupying the same position in the order;
determining the average value of the plurality of intermediate comparison results as the reference comparison value;
and performing a one-to-one comparison on all first deflection angles and second deflection angles positioned after the preset quantity threshold to generate a plurality of comparison values to be tested.
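The split into a reference value and values to be tested can be sketched as follows. The patent leaves the exact one-to-one comparison operation open, so the ratio used here (first angle divided by second angle) is an assumption, and `n_ref` stands for the preset quantity threshold:

```python
def split_comparisons(first_angles, second_angles, n_ref):
    """One-to-one compare paired deflection angles, then split the results.

    The first n_ref comparison results are averaged into the reference
    comparison value; the remaining results are the values to be tested.
    The comparison operation is taken as a ratio here (an assumption).
    """
    comps = [a / b for a, b in zip(first_angles, second_angles)]
    reference = sum(comps[:n_ref]) / n_ref
    return reference, comps[n_ref:]
```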
Further, the method as described above, wherein the determining the final face verification result according to the reference comparison value and the comparison value to be tested includes:
determining the number of normal comparison values according to the comparison values to be tested and the reference comparison value, where a normal comparison value is a comparison value to be tested that lies within a preset normal value range, and the preset normal value range is determined based on the reference comparison value;
if the ratio between the number of normal comparison values and the number of comparison values to be tested is greater than or equal to a preset ratio threshold, determining that the final face verification result is that verification passes;
if the ratio between the number of normal comparison values and the number of comparison values to be tested is smaller than the preset ratio threshold, determining that the final face verification result is that verification fails.
Further, the method as described above, wherein the determining the number corresponding to the normal comparison value according to the comparison value to be tested and the reference comparison value includes:
if a comparison value to be tested lies within the normal value range, determining that the comparison value to be tested is a normal comparison result, where the minimum of the normal value range is the reference comparison value minus a preset first range threshold, and the maximum of the normal value range is the reference comparison value plus a preset second range threshold;
and calculating the total number of normal comparison results as the number of normal comparison values.
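The counting and thresholding described above can be sketched in a few lines; the two range thresholds and the ratio threshold correspond to the preset values in the text, but their concrete magnitudes are left to the implementer:

```python
def final_verdict(to_test, reference, t_low, t_high, ratio_threshold):
    """Pass verification when enough comparison values fall in the normal range.

    Normal range: [reference - t_low, reference + t_high], per the text;
    the verdict is True (pass) when the in-range fraction meets the threshold.
    """
    lo, hi = reference - t_low, reference + t_high
    normal_count = sum(1 for v in to_test if lo <= v <= hi)
    return normal_count / len(to_test) >= ratio_threshold
```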
The second aspect of the present application provides a face verification device, located in a server, where the server is communicatively connected to a user terminal, and the user terminal is provided with a gyroscope and a camera, and the device includes:
the acquisition module is configured to acquire image acquisition data and gyroscope acquisition data corresponding to the user terminal after the preliminary face verification result is that verification passes, where the acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same and the data arrangement order is the same; the image acquisition data includes a plurality of arranged face images; and the gyroscope acquisition data includes a plurality of arranged gyroscope data;
the first determining module is used for determining a first deflection angle between adjacent face images in the image acquisition data;
The second determining module is used for determining a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data;
the third determining module is used for determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle;
and the verification module is used for determining a final face verification result according to the reference comparison value and the comparison value to be tested.
Further, the apparatus as described above, the acquisition requirements include acquisition time and acquisition frequency;
the acquisition module is specifically used for when acquiring image acquisition data and gyroscope acquisition data corresponding to the user terminal:
receiving video authentication data and gyroscope acquisition data sent by a user terminal; the gyroscope acquisition data are acquired and generated by the gyroscope in the user terminal according to the acquisition time and the acquisition frequency; the video authentication data includes video generation time; and carrying out image acquisition on the video authentication data according to the video generation time based on the acquisition time and the acquisition frequency to generate image acquisition data.
Further, in the apparatus as described above, the first determining module is specifically configured to:
arranging the face images in the image acquisition data in order of acquisition time to generate arranged face images; and determining a first deflection angle between each pair of adjacently arranged face images.
Further, in the apparatus as described above, when determining the first deflection angle between each pair of adjacently arranged face images, the first determining module is specifically configured to:
determine the face position and the face inter-pupillary distance in each face image by using a preset face recognition algorithm; determine a face region to be calculated according to the face position and the inter-pupillary distance, where the x coordinate of the left boundary of the face region is the x coordinate of the left eye minus 0.5 inter-pupillary distance, the x coordinate of the right boundary is the x coordinate of the right eye plus 0.5 inter-pupillary distance, the y coordinate of the upper boundary is the y coordinate of the horizontal eye line plus 1 inter-pupillary distance, and the y coordinate of the lower boundary is the y coordinate of the horizontal eye line minus 1 inter-pupillary distance; and perform angle deflection calculation on the face regions corresponding to the two adjacently arranged face images by using a preset image feature algorithm to generate the corresponding first deflection angle.
Further, in the apparatus as described above, the second determining module is specifically configured to:
arranging all the gyroscope data in the gyroscope acquisition data in order of acquisition time to generate arranged gyroscope data; and, for every two adjacently arranged gyroscope data, determining the difference between the later-arranged gyroscope data and the earlier-arranged gyroscope data as the second deflection angle.
Further, in the apparatus as described above, the third determining module is specifically configured to:
performing, according to the arrangement order of the first deflection angles and the second deflection angles, a one-to-one comparison on all first deflection angles and second deflection angles positioned before a preset quantity threshold to generate a plurality of intermediate comparison results, where the one-to-one comparison compares a first deflection angle and a second deflection angle occupying the same position in the order; determining the average value of the plurality of intermediate comparison results as the reference comparison value; and performing a one-to-one comparison on all first deflection angles and second deflection angles positioned after the preset quantity threshold to generate a plurality of comparison values to be tested.
Further, in the apparatus as described above, the verification module is specifically configured to:
determining the number of normal comparison values according to the comparison values to be tested and the reference comparison value, where a normal comparison value is a comparison value to be tested that lies within a preset normal value range, and the preset normal value range is determined based on the reference comparison value; if the ratio between the number of normal comparison values and the number of comparison values to be tested is greater than or equal to a preset ratio threshold, determining that the final face verification result is that verification passes; if the ratio is smaller than the preset ratio threshold, determining that the final face verification result is that verification fails.
Further, in the apparatus as described above, the verification module is specifically configured to, when determining the number corresponding to the normal comparison value according to the comparison value to be tested and the reference comparison value:
if a comparison value to be tested lies within the normal value range, determining that the comparison value to be tested is a normal comparison result, where the minimum of the normal value range is the reference comparison value minus a preset first range threshold, and the maximum of the normal value range is the reference comparison value plus a preset second range threshold; and calculating the total number of normal comparison results as the number of normal comparison values.
A third aspect of the present application provides an electronic apparatus, comprising: a memory and a processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory to implement the face verification method of any one of the first aspects.
A fourth aspect of the present application provides a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, are adapted to carry out the face verification method of any one of the first aspects.
A fifth aspect of the application provides a computer program product comprising a computer program which, when executed by a processor, implements the face verification method of any one of the first aspects.
The application provides a face verification method, apparatus, device, medium, and product, where the method includes: after the preliminary face verification result is that verification passes, acquiring image acquisition data and gyroscope acquisition data corresponding to the user terminal, where the acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same and the data arrangement order is the same, the image acquisition data includes a plurality of arranged face images, and the gyroscope acquisition data includes a plurality of arranged gyroscope data; determining a first deflection angle between adjacent face images in the image acquisition data; determining a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data; determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle; and determining a final face verification result according to the reference comparison value and the comparison value to be tested. According to this face verification method, when the further face verification is performed, the image acquisition data and gyroscope acquisition data corresponding to the user terminal are obtained, and a first deflection angle between adjacent face images in the image acquisition data and a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data are determined. Because the deflection angle of an AI-swapped face when the face deflects differs from the deflection angle of a real person's face, the accuracy of face verification can be improved based on the first deflection angle and the second deflection angle.
Meanwhile, the corresponding reference comparison value and comparison values to be tested can be determined from the first deflection angle and the second deflection angle, and the final face verification result is determined by checking the comparison values to be tested against the reference comparison value, further improving the accuracy of face verification.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a scene diagram of a face verification method in which embodiments of the present application may be implemented;
fig. 2 is a schematic flow chart of a face verification method provided by the present application;
FIG. 3 is a second flow chart of the face verification method according to the present application;
fig. 4 is a schematic structural diagram of a face verification device provided by the application;
fig. 5 is a schematic structural diagram of an electronic device provided by the present application.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the application; rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the appended claims.
In the technical solutions of the embodiments of the present application, the collection, storage, use, processing, transmission, provision, disclosure, and other handling of users' personal information comply with the relevant laws and regulations and do not violate public order and good customs.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region, and provide corresponding operation entries for the user to select authorization or rejection.
It should be noted that the face verification method, device, equipment, medium, and product disclosed herein can be used in the technical field of information security, and can also be used in any field other than information security; their application fields are not limited.
The technical scheme of the application is described in detail below by specific examples. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
For a clear understanding of the technical solutions of the present application, the prior-art solutions are first described in detail. With the development of AI technology, AI face-swapping has become a highly influential attack means in information security. The current main ways of coping with AI face-swapping include the following:
1. Strengthening passwords and other authentication modes: use strong passwords combined with other identity verification methods, such as fingerprint identification and two-step verification, to improve account security.
2. Privacy settings and rights control: check and manage the user's privacy settings in various applications and online services, ensure that only necessary personal information is shared, and limit applications' access rights to the user's camera and photo album.
3. Using face masking techniques: some privacy-preserving tools and applications provide face masking or face obfuscation functionality. These tools can add interference patterns or textures to a face image so that AI face-swapping cannot correctly identify the user's facial features.
In the above manners, AI face-swapping is countered mainly by preventing attackers from acquiring the user's face videos or pictures, from the aspect of privacy protection, and it remains difficult to determine whether an AI face swap has occurred during the face verification process itself.
Therefore, a scheme capable of identifying the AI face change in the face verification process and improving the face verification accuracy is needed at present.
In research, the inventor found that when a real face moves or rotates, the deflection angle of an AI-swapped face is not necessarily the same as that of the real face. Based on this principle, whether an AI face-swapping problem exists can be determined from a first deflection angle between adjacent face images in the image acquisition data and a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data, thereby improving face verification accuracy. Here, the deflection angle derived from the gyroscope data matches that of a real face, since the gyroscope measures the physical motion of the terminal.
Specifically, after the preliminary face verification result is that verification passes, image acquisition data and gyroscope acquisition data corresponding to the user terminal are acquired. The acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same, and the data arrangement order is the same. The image acquisition data includes a plurality of arranged face images. The gyroscope acquisition data includes a plurality of arranged gyroscope data. A first deflection angle between adjacent face images in the image acquisition data is determined. A second deflection angle between adjacent gyroscope data in the gyroscope acquisition data is determined. A reference comparison value and comparison values to be tested are determined according to the first deflection angle and the second deflection angle. A final face verification result is determined according to the reference comparison value and the comparison values to be tested.
According to this face verification method, when face verification is performed, the image acquisition data and gyroscope acquisition data corresponding to the user terminal are obtained, and a first deflection angle between adjacent face images in the image acquisition data and a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data are determined. Because the deflection angle of an AI-swapped face when the face deflects differs from the deflection angle of a real person's face, the accuracy of face verification can be improved based on the first deflection angle and the second deflection angle. Meanwhile, the corresponding reference comparison value and comparison values to be tested can be determined from the first deflection angle and the second deflection angle, and the face verification result is determined by checking the comparison values to be tested against the reference comparison value, further improving the accuracy of face verification.
The inventor proposes the technical scheme of the application based on the creative discovery.
The application scenario of the face verification method provided by the embodiments of the present application is described below. As shown in fig. 1, 1 is a server, 2 is a user terminal, and 3 is a user. The network architecture of the application scenario corresponding to the face verification method includes: the server 1, the user terminal 2, and the user 3. The user terminal 2 can be a mobile phone, a tablet computer, or another device provided with a camera and a gyroscope.
For example, when face verification is required, the server 1 performs primary face verification first, and when the primary face verification result is that the verification passes, the following procedure is performed:
(1) In the face verification stage, the user 3 collects verification-related data through the user terminal 2, moving (offsetting) the user terminal 2 during collection; for example, while the camera collects image verification data of the user 3 during the offset, the gyroscope in the user terminal collects the corresponding gyroscope data. The user terminal 2 then transmits the verification-related data to the server 1. In this embodiment, the verification-related data may be obtained during the preliminary face verification and used directly for the subsequent face verification.
(2) The server 1 processes the verification-related data to obtain the image acquisition data and gyroscope acquisition data corresponding to the user terminal. The acquisition requirements of the image acquisition data and the gyroscope acquisition data are identical, and the data arrangement order is identical. The image acquisition data includes a plurality of arranged face images. The gyroscope acquisition data includes a plurality of arranged gyroscope data.
(3) The server 1 determines a first angle of deflection between adjacent face images in the image acquisition data.
(4) The server 1 determines a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data.
(5) The server 1 determines a reference comparison value and a comparison value to be checked from the first deflection angle and the second deflection angle.
(6) The server 1 determines a final face verification result according to the reference comparison value and the comparison value to be checked. If the final face verification result is determined to be passed, the user 3 may perform subsequent processing through the user terminal, and if the final face verification result is determined to be not passed, face verification may be performed again or other processing may be performed.
Embodiments of the present application will now be described with reference to the accompanying drawings.
Fig. 2 is a schematic flow chart of a face verification method provided by the present application, as shown in fig. 2, in this embodiment, an execution subject of the present application is a face verification device, and the face verification device may be integrated in an electronic device, such as a server. The server is in communication connection with a user terminal, and the user terminal is provided with a gyroscope and a camera, so that the face verification method provided by the embodiment comprises the following steps:
step S101, after the primary face verification result is that the verification is passed, acquiring image acquisition data and gyroscope acquisition data corresponding to the user terminal. The acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same, and the data arrangement sequence is the same. The image verification data includes a plurality of face images arranged. The gyroscope acquisition data includes a plurality of gyroscope data arranged.
In this embodiment, when financial services such as transfers and transactions are performed, a security verification mechanism is triggered, and the user terminal sends a face verification request to the server. Upon receiving the face verification request sent by the user terminal, the server starts preliminary face verification.
If the preliminary face verification result is pass, further face verification starts. At this point, the user realizes data acquisition by shifting the user terminal: the user terminal captures face images through its camera while the gyroscope collects the corresponding gyroscope data. The user may also perform the shifting acquisition action during the preliminary face verification.
Since the user terminal moves rather than remaining stationary during face verification, there is an offset angle between successive face images. In this embodiment, there is likewise an offset angle between the gyroscope data collected by the moving user terminal.
If the user does not shift the user terminal, a situation can arise in which the face images show a certain deviation angle while the gyroscope data show no deviation, and in that case the final face verification result is judged as verification failed.
The image acquisition data may be AI face-changing image data or real face data, and further judgment and verification are required by a subsequent flow.
Step S102, determining a first deflection angle between adjacent face images in the image acquisition data.
In this embodiment, since the face images have a certain arrangement order at acquisition and generation, the first deflection angle between adjacently arranged face images can be determined directly, for example, by calculation with a face recognition algorithm.
Step S103, determining a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data.
In this embodiment, since the gyroscope data have a certain arrangement order at acquisition and generation, the second deflection angle between adjacently arranged gyroscope data can be determined directly, for example, by subtracting one gyroscope datum from the next.
Step S104, determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle.
In this embodiment, the comparison values between the first deflection angles and the second deflection angles form a queue; the reference comparison value may be generated by processing part of this queue, and the remaining comparison values are used as the comparison values to be tested. In this way it can be further determined, during face authentication, whether the relative deflection angle between the face images and the gyroscope data changes abruptly; an abrupt change indicates an AI face swap. The reference comparison value and the comparison value to be tested can thus further increase the accuracy of face verification.
Step S105, determining a final face verification result according to the reference comparison value and the comparison value to be tested.
The deviation between the comparison value to be tested and the reference comparison value is determined; if the deviation is too large, the final face verification result is determined to be verification failed.
The face verification method provided by the embodiment of the application comprises the following steps: and after the primary face verification result is that the verification is passed, acquiring image acquisition data and gyroscope acquisition data corresponding to the user terminal. The acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same, and the data arrangement sequence is the same. The image verification data includes a plurality of face images arranged. The gyroscope acquisition data includes a plurality of gyroscope data arranged. A first angle of deflection between adjacent face images in the image acquisition data is determined. And determining a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data. And determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle. And determining a final face verification result according to the reference comparison value and the comparison value to be tested.
According to the face verification method, when face verification is performed, image acquisition data and gyroscope acquisition data corresponding to the user terminal are obtained, and a first deflection angle between adjacent face images in the image acquisition data and a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data are determined. Because the deflection angle of an AI-swapped face when the face deflects differs from the face deflection angle of a real person, the accuracy of face verification can be improved based on the first deflection angle and the second deflection angle. Meanwhile, the corresponding reference comparison value and comparison value to be tested can be determined from the first deflection angle and the second deflection angle, and the final face verification result is determined by checking the comparison value to be tested against the reference comparison value, further improving the accuracy of face verification.
Fig. 3 is a second flow chart of the face verification method provided by the present application, as shown in fig. 3, and the face verification method provided by the present embodiment is further refined based on the face verification method provided by the previous embodiment of the present application. The face verification method provided by the embodiment includes the following steps.
Step S201, receiving video authentication data and gyroscope acquisition data sent by a user terminal. And the gyroscope acquisition data are generated by acquiring the gyroscope in the user terminal according to the acquisition time and the acquisition frequency. The video authentication data includes a video generation time.
In this embodiment, the video authentication data is face video data collected by the user terminal, and further processing is required. In this embodiment, the server performs further processing, and in other application scenarios, the processing may be performed by the user terminal and then sent to the server. The gyroscope belongs to non-private data and can be acquired without user authorization.
The acquisition time can be the period from the start of face authentication to the end of face data acquisition, and the acquisition frequency can be set according to the practical application, for example, one sample every 40 milliseconds. Typically the entire face verification process lasts at least 3 seconds, so the acquisition time may be 3 seconds long.
The video generation time is generally the same as the start face authentication time, and may have a certain delay.
Step S202, image acquisition is carried out on video authentication data according to video generation time based on acquisition time and acquisition frequency, and image acquisition data are generated.
In this embodiment, if the video generation time is the same as the face authentication start time, image acquisition is performed on the video authentication data by the acquisition frequency from the video generation time to generate image acquisition data. Face images in the image acquisition data may be arranged according to the acquisition time.
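This sampling step can be sketched as follows. The function name and the 30 fps video are assumptions for illustration; the source only gives the 40 ms frequency and 3-second window as examples, and assumes the video generation time coincides with the start of face authentication.

```python
def frame_indices(video_fps, acq_freq_ms, duration_ms):
    """Map each gyroscope acquisition instant to the nearest video frame index,
    assuming sampling starts at the video generation time (t = 0).
    Hypothetical helper, not named in the source."""
    indices, t = [], 0
    while t <= duration_ms:
        indices.append(round(t / 1000.0 * video_fps))
        t += acq_freq_ms
    return indices

# A 30 fps video sampled every 40 ms over a 3-second window yields 76 frames,
# each aligned in time with one gyroscope sample.
print(len(frame_indices(30, 40, 3000)))  # → 76
```

Because both streams are sampled on the same clock, the face images and the gyroscope data end up with the same arrangement order, which the later steps rely on.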
Step S203, arranging the face images in the image acquisition data according to the acquisition time sequence, and generating the arranged face images.
Step S204, determining a first deflection angle between two face images adjacently arranged in each face image after arrangement.
Alternatively, in this embodiment, S204 may specifically be:
and determining the face position and the face interpupillary distance in each face image by adopting a preset face recognition algorithm.
And determining a face region to be calculated according to the face position and the face interpupillary distance. The x coordinate of the left boundary of the face region is the difference between the x coordinate of the left eye of the face and 0.5 pupil distance, the x coordinate of the right boundary of the face region is the sum between the x coordinate of the right eye of the face and 0.5 pupil distance, the y coordinate of the upper boundary of the face region is the sum between the y coordinate corresponding to the horizontal position of the eyes of the face and 1 pupil distance, and the y coordinate of the lower boundary of the face region is the difference between the y coordinate corresponding to the horizontal position of the eyes of the face and 1 pupil distance.
And carrying out angle deflection calculation on face areas corresponding to the two face images which are adjacently arranged by adopting a preset image characteristic algorithm so as to generate a corresponding first deflection angle.
In this embodiment, the face position is a region position formed by the face boundary. The pupil distance is the number of pixels between the two eyes, and the image outside the face region can further be filled with pure black so as to eliminate its influence on the calculation result.
The face position and the face interpupillary distance can be determined with any commonly used face recognition algorithm. The preset image feature algorithm may, for example, be the SURF (Speeded-Up Robust Features) algorithm, which uses scale-invariant image features to calculate the deflection angle between the two images.
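The boundary arithmetic described above can be sketched directly. The function name, the dictionary layout, and the assumption that both eyes lie on the same horizontal line are illustrative choices, not part of the source:

```python
def face_region(left_eye, right_eye):
    """Compute the face region box from the two eye coordinates, following
    the boundary formulas in the text. The pupil distance is the pixel
    distance between the eyes (hypothetical helper)."""
    (xl, y_eyes), (xr, _) = left_eye, right_eye
    d = xr - xl  # face interpupillary distance in pixels
    return {
        "x_left":  xl - 0.5 * d,   # left-eye x minus 0.5 pupil distance
        "x_right": xr + 0.5 * d,   # right-eye x plus 0.5 pupil distance
        "y_upper": y_eyes + 1.0 * d,  # eye line y plus 1 pupil distance
        "y_lower": y_eyes - 1.0 * d,  # eye line y minus 1 pupil distance
    }

# Eyes at (100, 200) and (160, 200): pupil distance 60,
# so the box spans x 70..190 and y 140..260.
print(face_region((100, 200), (160, 200)))
```

The same region is cropped from each face image before the image feature algorithm compares adjacently arranged images.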
Step S205, arranging the data of each gyroscope in the gyroscope acquisition data according to the acquisition time sequence, and generating the arranged data of each gyroscope.
Step S206, for two adjacently arranged gyroscope data among the arranged gyroscope data, determining the difference between the later-arranged gyroscope data and the earlier-arranged gyroscope data as a second deflection angle.
In this embodiment, each gyroscope data includes data in the three spatial axis directions x, y and z, and the second deflection angle is obtained by subtracting the xyz data of two adjacently arranged gyroscope data, that is, the later-arranged data minus the earlier-arranged data.
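A sketch of this subtraction, assuming each gyroscope sample is an (x, y, z) tuple of angles; the tuple representation and the function name are assumptions:

```python
def second_deflection_angles(samples):
    """Per-axis difference between each later-arranged gyroscope sample
    and the sample arranged before it, as described in the text."""
    return [
        tuple(later - earlier for later, earlier in zip(b, a))
        for a, b in zip(samples, samples[1:])
    ]

readings = [(0.00, 0.00, 0.00), (0.02, -0.01, 0.05), (0.05, -0.01, 0.09)]
# Two adjacent pairs yield two second deflection angles, e.g. the second
# pair gives roughly (0.03, 0.0, 0.04) per axis.
print(second_deflection_angles(readings))
```

With samples arranged by acquisition time, this yields one second deflection angle per adjacent pair, matching the number of first deflection angles between adjacent face images.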
Step S207, determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle.
Alternatively, in this embodiment, S207 may specifically be:
According to the arrangement sequence of the first deflection angles and the second deflection angles, a one-to-one comparison is performed on all first deflection angles and second deflection angles positioned before the preset number threshold, generating a plurality of intermediate comparison results. A one-to-one comparison compares the first deflection angle and the second deflection angle of the same order.
An average of the plurality of intermediate comparison results is determined as the reference comparison value.
A one-to-one comparison is performed on all first deflection angles and second deflection angles positioned after the preset number threshold, generating a plurality of comparison values to be tested.
The preset number threshold may be set according to practical applications, for example, may be set to 10, and then the first 10 face images and the first 10 gyroscope data may be compared one by one.
For example, assuming that there are 20 face images and 20 gyroscope data, the first 10 face images are respectively an image a, an image b, and an image c … image j, and the first 10 gyroscope data are respectively data a, data b, and data c … data j. And comparing the deflection angle corresponding to the image a with the deflection angle corresponding to the data a to generate a corresponding intermediate comparison result, and comparing the deflection angle corresponding to the image b with the deflection angle corresponding to the data b to generate a corresponding intermediate comparison result.
The 10 intermediate comparison results are averaged to obtain the reference comparison value.
The comparison values generated by one-to-one comparison between the face images ordered 11-20 and the corresponding gyroscope data are taken as the comparison values to be checked.
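The split into a reference value and values to be checked can be sketched as follows. The ratio used as the comparison value is an assumption (the source does not fix the exact comparison operation), and the function name is hypothetical:

```python
def split_reference_and_test(first_angles, second_angles, n=10):
    """Pair up the first and second deflection angles in arrangement order
    and compare each pair (a simple ratio is assumed as the comparison
    value). The average of the first n results is the reference comparison
    value; the remaining results are the comparison values to be checked."""
    # Zero second deflection angles would need special handling in practice.
    compared = [a / b for a, b in zip(first_angles, second_angles)]
    reference = sum(compared[:n]) / n
    return reference, compared[n:]

first = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.0, 1.1, 0.9, 1.0, 3.0]
second = [1.0] * 12
ref, to_check = split_reference_and_test(first, second)
print(round(ref, 3), to_check)  # reference ≈ 1.0; the 3.0 entry stands out
```

With 20 images and 20 gyroscope samples as in the example above, the first 10 comparisons form the reference and comparisons 11 to 20 become the values to check.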
And step S208, determining a final face verification result according to the reference comparison value and the comparison value to be tested.
In this embodiment, if the errors between all the comparison values to be checked and the reference comparison value are within the preset range, the final face verification result can be considered pass; if only some of the comparison values to be checked fall within the preset range, further judgment can be made according to the number that falls within the error range.
Alternatively, in this embodiment, S208 may specifically be:
and determining the number corresponding to the normal comparison value according to the comparison value to be tested and the reference comparison value. The normal comparison value is the comparison value to be checked in a preset normal value range. The preset normal value range is based on the reference comparison value.
If the ratio of the number corresponding to the normal comparison value to the number corresponding to the comparison value to be tested is greater than or equal to a preset ratio threshold, determining that the final face verification result is verification passing.
If the ratio between the number corresponding to the normal comparison value and the number corresponding to the comparison value to be detected is smaller than a preset ratio threshold, determining that the final face verification result is that verification is failed.
The preset normal value range may be within ±10% of the reference comparison value, and may also be set according to the practical application. The preset ratio threshold may be set to 90%, 95%, and so on.
Optionally, in this embodiment, the process of determining the number corresponding to the normal comparison value according to the comparison value to be tested and the reference comparison value may specifically be:
and if the to-be-checked comparison value is in the normal value range, determining that the to-be-checked comparison value is a normal comparison result. The minimum value of the normal numerical range is the difference between the reference comparison value and the preset first range threshold, and the maximum value of the normal numerical range is the sum of the reference comparison value and the preset second range threshold.
The total number of normal comparison results is calculated, and this total is taken as the number corresponding to the normal comparison value.
Assuming that the preset first range threshold is 0.1 and the preset second range threshold is also 0.1, the preset normal value range is (reference alignment value-0.1, reference alignment value +0.1). If the comparison value to be tested is in (reference comparison value-0.1, reference comparison value +0.1), the comparison value to be tested is determined to be a normal comparison result.
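Putting the range check and the ratio threshold together, a sketch under stated assumptions (the function name and defaults are illustrative; the 0.1 range bounds and 90% threshold follow the examples in the text):

```python
def final_result(to_check, reference, lo=0.1, hi=0.1, ratio_threshold=0.9):
    """Count comparison values inside (reference - lo, reference + hi) and
    pass verification when their proportion among all values to check
    meets the ratio threshold. Hypothetical helper."""
    normal = sum(1 for v in to_check if reference - lo < v < reference + hi)
    return "pass" if normal / len(to_check) >= ratio_threshold else "fail"

# 9 of 10 values lie within (0.9, 1.1): the ratio 0.9 meets the threshold.
values = [1.02, 0.98, 1.05, 0.95, 1.0, 1.01, 0.99, 1.03, 0.97, 1.5]
print(final_result(values, reference=1.0))  # → pass
```

A single outlier therefore does not fail the verification, but a run of mismatched deflections, as produced by an injected AI face-swap video, pushes the ratio below the threshold.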
In this embodiment, an AI face-swapping attack generally hijacks the camera of the user terminal and replaces the video of the face authentication stage with a pseudo-face video produced by AI face swapping in order to pass face authentication.
The method mainly starts the camera during face authentication and simultaneously acquires data from the gyroscope of the user terminal. Gyroscope data precision is very high (common precision reaches 0.01 degree), so even the smallest variation can be recorded. The method then compares whether the tiny real-time gyroscope deflections are consistent with the face deflection angles observed during face authentication. If the deflection coincidence is above 90%, the camera and gyroscope sensor data are considered synchronized, and a real person rather than an AI face swap is considered to be operating.
Fig. 4 is a schematic structural diagram of a face verification apparatus provided by the present application, as shown in fig. 4, in this embodiment, the face verification apparatus 300 may be disposed in an electronic device, such as a server, where the server is connected to a user terminal in a communication manner, and the user terminal is provided with a gyroscope and a camera, and the face verification apparatus 300 includes:
the acquiring module 301 is configured to acquire image acquisition data and gyroscope acquisition data corresponding to the user terminal after the preliminary face verification result is that the verification is passed. The acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same, and the data arrangement sequence is the same. The image verification data includes a plurality of face images arranged. The gyroscope acquisition data includes a plurality of gyroscope data arranged.
A first determining module 302 is configured to determine a first deflection angle between adjacent face images in the image acquisition data.
A second determining module 303 is configured to determine a second deflection angle between adjacent gyroscope data in the gyroscope acquired data.
The third determining module 304 is configured to determine a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle.
And the verification module 305 is configured to determine a final face verification result according to the reference comparison value and the comparison value to be tested.
The face verification device provided in this embodiment may execute the technical scheme of the method embodiment shown in fig. 2, and its implementation principle and technical effects are similar to those of the method embodiment shown in fig. 2, and are not described in detail herein.
The face verification device provided by the present application is further refined on the basis of the face verification device provided in the previous embodiment, and the face verification device 300 includes:
optionally, in this embodiment, the acquisition requirement includes an acquisition time and an acquisition frequency.
The acquiring module 301 is specifically configured to, when acquiring image acquisition data and gyroscope acquisition data corresponding to a user terminal:
and receiving video authentication data and gyroscope acquisition data sent by the user terminal. And the gyroscope acquisition data are generated by acquiring the gyroscope in the user terminal according to the acquisition time and the acquisition frequency. The video authentication data includes a video generation time. And carrying out image acquisition on the video authentication data according to the video generation time based on the acquisition time and the acquisition frequency to generate image acquisition data.
Optionally, in this embodiment, the first determining module 302 is specifically configured to:
and arranging the face images in the image acquisition data according to the acquisition time sequence to generate arranged face images. A first deflection angle between two face images adjacently arranged in each face image after arrangement is determined.
Optionally, in this embodiment, when determining the first deflection angle between two face images adjacently arranged in each face image after arrangement, the first determining module 302 is specifically configured to:
and determining the face position and the face interpupillary distance in each face image by adopting a preset face recognition algorithm. And determining a face region to be calculated according to the face position and the face interpupillary distance. The x coordinate of the left boundary of the face region is the difference between the x coordinate of the left eye of the face and 0.5 pupil distance, the x coordinate of the right boundary of the face region is the sum between the x coordinate of the right eye of the face and 0.5 pupil distance, the y coordinate of the upper boundary of the face region is the sum between the y coordinate corresponding to the horizontal position of the eyes of the face and 1 pupil distance, and the y coordinate of the lower boundary of the face region is the difference between the y coordinate corresponding to the horizontal position of the eyes of the face and 1 pupil distance. And carrying out angle deflection calculation on face areas corresponding to the two face images which are adjacently arranged by adopting a preset image characteristic algorithm so as to generate a corresponding first deflection angle.
Optionally, in this embodiment, the second determining module 303 is specifically configured to:
arranging the data of each gyroscope in the gyroscope acquisition data according to the acquisition time sequence, and generating the arranged data of each gyroscope. For two gyroscope data adjacently arranged in each gyroscope data after arrangement, determining a difference value between the gyroscope data arranged in the rear and the gyroscope data arranged in the front as a second deflection angle.
Optionally, in this embodiment, the third determining module 304 is specifically configured to:
According to the arrangement sequence of the first deflection angles and the second deflection angles, a one-to-one comparison is performed on all first deflection angles and second deflection angles positioned before the preset number threshold, generating a plurality of intermediate comparison results. A one-to-one comparison compares the first deflection angle and the second deflection angle of the same order. An average of the plurality of intermediate comparison results is determined as the reference comparison value. A one-to-one comparison is performed on all first deflection angles and second deflection angles positioned after the preset number threshold, generating a plurality of comparison values to be tested.
Optionally, in this embodiment, the verification module 305 is specifically configured to:
and determining the number corresponding to the normal comparison value according to the comparison value to be tested and the reference comparison value. The normal comparison value is the comparison value to be checked in a preset normal value range. The preset normal value range is based on the reference comparison value. If the ratio of the number corresponding to the normal comparison value to the number corresponding to the comparison value to be tested is greater than or equal to a preset ratio threshold, determining that the final face verification result is verification passing. If the ratio between the number corresponding to the normal comparison value and the number corresponding to the comparison value to be detected is smaller than a preset ratio threshold, determining that the final face verification result is that verification is failed.
Optionally, in this embodiment, when determining the number corresponding to the normal comparison value according to the comparison value to be tested and the reference comparison value, the verification module 305 is specifically configured to:
and if the comparison value to be checked is in the normal value range, determining that the comparison value to be checked is a normal comparison result. The minimum value of the normal numerical range is the difference between the reference comparison value and the preset first range threshold, and the maximum value of the normal numerical range is the sum of the reference comparison value and the preset second range threshold. The total number of normal comparison results is calculated, and this total is taken as the number corresponding to the normal comparison value.
The face verification device provided in this embodiment may execute the technical scheme of the method embodiment shown in fig. 2 to 3, and its implementation principle and technical effects are similar to those of the method embodiment shown in fig. 2 to 3, and are not described in detail herein.
According to embodiments of the present application, the present application also provides an electronic device, a computer-readable storage medium, and a computer program product.
As shown in fig. 5, fig. 5 is a schematic structural diagram of an electronic device provided by the present application. The electronic device is intended to represent various forms of digital computers, such as laptops, desktops, personal digital assistants, blade servers, mainframes, and other appropriate computers. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 5, the electronic device includes: a processor 401 and a memory 402. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executing within the electronic device.
Memory 402 is a non-transitory computer readable storage medium provided by the present application. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the face verification method provided by the application. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to execute the face verification method provided by the present application.
The memory 402, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the face verification method in the embodiment of the present application (e.g., the acquisition module 301, the first determination module 302, the second determination module 303, the third determination module 304, and the verification module 305 shown in fig. 4). The processor 401 executes the various functional applications and data processing of the electronic device, that is, implements the face verification method in the above method embodiments, by running the non-transitory software programs, instructions, and modules stored in the memory 402.
Meanwhile, this embodiment also provides a computer program product which, when executed by a processor of an electronic device, enables the electronic device to perform the face verification method of the above embodiments.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the embodiments of the application following, in general, their principles and including such departures from the present disclosure as come within known or customary practice in the art to which the embodiments of the application pertain.
It is to be understood that the embodiments of the application are not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be made without departing from the scope thereof. The scope of embodiments of the application is limited only by the appended claims.
Claims (12)
1. The face verification method is characterized by being applied to a server, wherein the server is in communication connection with a user terminal, the user terminal is provided with a gyroscope and a camera, and the method comprises the following steps:
after the primary face verification result is that the verification is passed, acquiring image acquisition data and gyroscope acquisition data corresponding to a user terminal according to the face verification request; the acquisition requirements of the image acquisition data and the gyroscope acquisition data are the same, and the data arrangement sequence is the same; the image verification data comprises a plurality of face images which are arranged; the gyroscope acquisition data comprises a plurality of arrayed gyroscope data;
Determining a first deflection angle between adjacent face images in the image acquisition data;
determining a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data;
determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle;
and determining a final face verification result according to the reference comparison value and the comparison value to be tested.
2. The method of claim 1, wherein the acquisition requirements include acquisition time and acquisition frequency;
the acquiring the image acquisition data and the gyroscope acquisition data corresponding to the user terminal comprises the following steps:
receiving video authentication data and gyroscope acquisition data sent by a user terminal; the gyroscope acquisition data are acquired and generated by the gyroscope in the user terminal according to the acquisition time and the acquisition frequency; the video authentication data includes video generation time;
and carrying out image acquisition on the video authentication data according to the video generation time based on the acquisition time and the acquisition frequency to generate image acquisition data.
3. The method of claim 2, wherein determining a first angle of deflection between adjacent face images in the image acquisition data comprises:
Arranging face images in the image acquisition data according to the acquisition time sequence to generate arranged face images;
a first deflection angle between two face images adjacently arranged in each face image after arrangement is determined.
4. The method of claim 3, wherein determining the first deflection angle between each pair of adjacently arranged face images comprises:
determining a face position and a face interpupillary distance in each face image using a preset face recognition algorithm;
determining a face area to be calculated according to the face position and the face interpupillary distance, wherein the x coordinate of the left boundary of the face area is the x coordinate of the left eye minus 0.5 interpupillary distance, the x coordinate of the right boundary is the x coordinate of the right eye plus 0.5 interpupillary distance, the y coordinate of the upper boundary is the y coordinate of the horizontal eye level plus 1 interpupillary distance, and the y coordinate of the lower boundary is the y coordinate of the horizontal eye level minus 1 interpupillary distance; and
performing angle deflection calculation on the face areas of the two adjacently arranged face images using a preset image feature algorithm to generate the corresponding first deflection angle.
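The face-area geometry of claim 4 can be sketched as below. This follows the claim literally, so it assumes a y-up coordinate convention (upper boundary = eye level plus one interpupillary distance); the function name and the dict-based return shape are illustrative assumptions, not part of the patent.

```python
def face_region(left_eye, right_eye):
    """Bounding box per claim 4, given the two pupil centres as (x, y)."""
    lx, ly = left_eye
    rx, ry = right_eye
    # Interpupillary distance (IPD) is the Euclidean distance between pupils.
    ipd = ((rx - lx) ** 2 + (ry - ly) ** 2) ** 0.5
    eye_y = (ly + ry) / 2.0                  # horizontal eye level
    return {
        "left":   lx - 0.5 * ipd,            # left-eye x minus 0.5 IPD
        "right":  rx + 0.5 * ipd,            # right-eye x plus 0.5 IPD
        "top":    eye_y + 1.0 * ipd,         # eye level plus 1 IPD
        "bottom": eye_y - 1.0 * ipd,         # eye level minus 1 IPD
    }
```

Restricting the deflection computation to this pupil-anchored crop keeps the two compared regions roughly aligned even when the face moves inside the frame.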
5. The method of claim 4, wherein determining a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data comprises:
arranging all gyroscope data in the gyroscope acquisition data in acquisition-time order to generate arranged gyroscope data; and
for each pair of adjacently arranged gyroscope data, determining the difference between the later-arranged and the earlier-arranged gyroscope data as the second deflection angle.
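Claim 5 amounts to sorting the samples by timestamp and taking consecutive differences. A minimal sketch, assuming each sample is a `(timestamp, angle)` pair (the tuple layout and function name are assumptions):

```python
def second_deflection_angles(gyro_samples):
    """Claim 5: difference between consecutive time-ordered gyroscope angles."""
    ordered = sorted(gyro_samples, key=lambda s: s[0])   # sort by timestamp
    angles = [angle for _, angle in ordered]
    # Later-arranged value minus earlier-arranged value, per adjacent pair.
    return [angles[i + 1] - angles[i] for i in range(len(angles) - 1)]
```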
6. The method of claim 1, wherein determining a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle comprises:
performing a one-to-one comparison, in arrangement order, of all first deflection angles and second deflection angles ranked before a preset quantity threshold to generate a plurality of intermediate comparison results, wherein the one-to-one comparison compares the first deflection angle and the second deflection angle at the same position in the arrangement;
determining the average of the plurality of intermediate comparison results as the reference comparison value; and
performing a one-to-one comparison of all first deflection angles and second deflection angles ranked after the preset quantity threshold to generate a plurality of comparison values to be tested.
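The split in claim 6 can be sketched as follows. The claim only says the angles are "compared"; taking the ratio of image-derived to gyroscope-derived angle is an assumption here (a consistent ratio suggests both sensors observed the same motion), as are the function name and the `n_ref` parameter.

```python
def split_comparisons(first_angles, second_angles, n_ref):
    """Claim 6: pairwise-compare angles in order; the first n_ref comparisons
    average into a reference value, the remainder become values to be tested."""
    # One-to-one comparison at each position; ratio chosen as the comparison.
    results = [f / s if s != 0 else 0.0
               for f, s in zip(first_angles, second_angles)]
    reference = sum(results[:n_ref]) / n_ref
    to_test = results[n_ref:]
    return reference, to_test
```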
7. The method of claim 6, wherein determining the final face verification result according to the reference comparison value and the comparison values to be tested comprises:
determining the number of normal comparison values according to the comparison values to be tested and the reference comparison value, wherein a normal comparison value is a comparison value to be tested that falls within a preset normal value range, the preset normal value range being based on the reference comparison value;
if the ratio of the number of normal comparison values to the number of comparison values to be tested is greater than or equal to a preset ratio threshold, determining that the final face verification result is that verification passed; and
if the ratio of the number of normal comparison values to the number of comparison values to be tested is smaller than the preset ratio threshold, determining that the final face verification result is that verification failed.
8. The method of claim 7, wherein determining the number of normal comparison values according to the comparison values to be tested and the reference comparison value comprises:
if a comparison value to be tested is within the normal value range, determining it to be a normal comparison value, wherein the minimum of the normal value range is the reference comparison value minus a preset first range threshold, and the maximum is the reference comparison value plus a preset second range threshold; and
counting the total number of normal comparison values.
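Claims 7 and 8 together reduce to a range check plus a ratio test. A minimal sketch, where `lo_delta` and `hi_delta` stand in for the preset first and second range thresholds (all names here are illustrative assumptions):

```python
def final_verdict(to_test, reference, lo_delta, hi_delta, ratio_threshold):
    """Claims 7-8: count values inside [reference - lo_delta, reference + hi_delta]
    and pass iff their share of all tested values meets the ratio threshold."""
    normal = [v for v in to_test
              if reference - lo_delta <= v <= reference + hi_delta]
    return (len(normal) / len(to_test)) >= ratio_threshold
```

Intuitively, if most later comparison values stay close to the reference established early in the capture, the camera and gyroscope agree on the motion and the face is judged live.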
9. A face verification apparatus, located at a server communicatively connected to a user terminal, the user terminal being provided with a gyroscope and a camera, the apparatus comprising:
an acquisition module configured to acquire image acquisition data and gyroscope acquisition data corresponding to the user terminal after a primary face verification result is that verification passed, wherein the image acquisition data and the gyroscope acquisition data share the same acquisition requirements and the same data arrangement order, the image acquisition data comprise a plurality of arranged face images, and the gyroscope acquisition data comprise a plurality of arranged gyroscope data;
a first determining module configured to determine a first deflection angle between adjacent face images in the image acquisition data;
a second determining module configured to determine a second deflection angle between adjacent gyroscope data in the gyroscope acquisition data;
a third determining module configured to determine a reference comparison value and a comparison value to be tested according to the first deflection angle and the second deflection angle; and
a verification module configured to determine a final face verification result according to the reference comparison value and the comparison value to be tested.
10. An electronic device, comprising a memory and a processor, wherein
the memory stores computer-executable instructions; and
the processor executes the computer-executable instructions stored in the memory to implement the face verification method of any one of claims 1 to 8.
11. A computer-readable storage medium having computer-executable instructions stored therein which, when executed by a processor, implement the face verification method of any one of claims 1 to 8.
12. A computer program product comprising a computer program which, when executed by a processor, implements the face verification method of any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311234898.4A CN117235698A (en) | 2023-09-22 | 2023-09-22 | Face verification method, device, equipment, medium and product |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117235698A true CN117235698A (en) | 2023-12-15 |
Family
ID=89085766
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||