
CN110895688A - System and method for identifying user identity - Google Patents

System and method for identifying user identity

Info

Publication number
CN110895688A
CN110895688A (application CN201911358583.4A)
Authority
CN
China
Prior art keywords
server
sensitive information
information
user
state value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911358583.4A
Other languages
Chinese (zh)
Inventor
张迪
张振龙
宋松凯
魏建国
郑辰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Relx Technology Co Ltd
Original Assignee
Shenzhen Relx Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Relx Technology Co Ltd filed Critical Shenzhen Relx Technology Co Ltd
Priority to CN201911358583.4A priority Critical patent/CN110895688A/en
Publication of CN110895688A publication Critical patent/CN110895688A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/178: Human faces, estimating age from face image; using age information for improving recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)

Abstract

The application relates to a system for identifying a user identity. The system includes a first electronic device, a first server, and a second server. The first electronic device is configured to receive non-sensitive information associated with a user and includes an image-capturing apparatus. The first server is configured to determine a strong check state value associated with the non-sensitive information. The second server is configured to communicate with the first server. When the first server determines that the strong check state value is a first state value, the first server generates a mark, the image-capturing apparatus of the first electronic device is started to acquire a first living body image of the user based on the mark and first picture information from a first database, and the second server determines whether the first living body image corresponds to the first picture information.

Description

System and method for identifying user identity
Technical Field
The present application relates generally to a system and a method of identifying a user identity, and more particularly to a system and method of identifying a user identity and a user age.
Background
In modern society, automated facilities need systems and methods for identifying users in order to obtain user-related information, such as sex, age, body type, or other sensitive information, so that the facility can decide, based on that information, whether to provide or withhold a service. For example, a system for identifying a user's identity may capture an image of a biometric feature of the user's body, process the captured image to determine whether the user meets a specific condition, and, if the user meets the condition, transmit an instruction to the automated facility to provide the service.
However, a conventional system for identifying a user's identity may produce non-negligible errors when processing an image of a biometric feature of the user's body. For example, such a system may acquire an image of the user's facial features and process that image to estimate the user's age. Because facial features are not perfectly correlated with actual age, the identification system may make incorrect decisions, and an unqualified user may even illegally use a service or purchase a product. Such a drawback greatly limits the applications of the identification system.
Accordingly, the present disclosure provides a system for identifying a user identity and a method for identifying a user identity that solve the above problems.
Disclosure of Invention
A method for identifying a user identity is provided, which comprises the following steps: receiving, by an electronic device, non-sensitive information associated with a user; determining a strong check state value associated with the non-sensitive information; when the strong check state value is a first state value: generating a mark by the first server; reading first picture information of a first database; starting an application program of the electronic device to acquire a first living body image of the user based on the mark and the first picture information; and judging whether the first living body image corresponds to the first picture information.
A system for identifying a user identity is provided, which includes a first electronic device, a first server, and a second server. The first electronic device is configured to receive non-sensitive information associated with a user and includes an image-capturing apparatus. The first server is configured to determine a strong check state value associated with the non-sensitive information. The second server is configured to communicate with the first server. When the first server determines that the strong check state value is a first state value, the first server generates a mark, the image-capturing apparatus of the first electronic device is started to acquire a first living body image of the user based on the mark and first picture information from a first database, and the second server determines whether the first living body image corresponds to the first picture information.
Drawings
Aspects of the present application are readily understood from the following detailed description when read in conjunction with the accompanying drawings. It should be noted that the various features may not be drawn to scale and that the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
Fig. 1A illustrates a schematic diagram of an identity recognition system according to some embodiments of the present application.
FIG. 1B illustrates a schematic diagram of an identity recognition system, according to some embodiments of the present application.
Fig. 2A-2G illustrate schematic diagrams of a user interface of an electronic device according to some embodiments of the present application.
Fig. 3 illustrates a flow diagram of an identity recognition method according to some embodiments of the present application.
Fig. 4A illustrates a flow diagram of an identity recognition method according to some embodiments of the present application.
FIG. 4B illustrates a schematic diagram of device 40C in FIG. 4A.
Fig. 5 illustrates a flow diagram of an identity recognition method according to some embodiments of the present application.
Fig. 6 illustrates a flow diagram of an identity recognition method according to some embodiments of the present application.
Common reference numerals are used throughout the drawings and the detailed description to refer to the same or like components. The features of the present application will become apparent from the following detailed description when taken in conjunction with the accompanying drawings.
Detailed Description
The following disclosure provides many different embodiments, or examples, for implementing different features of the provided subject matter. Specific examples of components and arrangements are described below. Of course, these are merely examples and are not intended to be limiting. In addition, the present application may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Embodiments of the present application are discussed in detail below. It should be appreciated, however, that the present application provides many applicable concepts that can be embodied in a wide variety of specific contexts. The particular embodiments discussed are merely illustrative and do not limit the scope of the application.
An existing system for identifying a user's identity may make incorrect decisions, and may even allow an unqualified user to illegally use a service or purchase a product. Such a drawback greatly limits the applications of the identification system.
The present disclosure provides a system and method for identifying a user identity. The identification system may include a first electronic device, a first server, and a second server, wherein the first electronic device may include a display for displaying a user interface. Based on information the user inputs on the electronic device, pre-registered sensitive information or picture information associated with the user is acquired from the first database or the second database. The first server determines a status value of the user and, based on that status value, accesses the second server by means of an indication or of information in a specific format. The second server can compare sensitive information or living body image information entered in real time with the pre-registered sensitive information or picture information. By judging the user's status value and then comparing the real-time information with the pre-registered information, the disclosed system and method identify the user's identity more rigorously and effectively avoid identity-recognition errors.
Fig. 1A illustrates a schematic diagram of a system 1A for recognizing a user identity according to some embodiments of the present application. As shown in fig. 1A, the system for identifying a user identity 1A includes an electronic device 10, a server 11, a server 12, a database 13 and a database 14.
The electronic device 10 may be connected to the server 11 via a communication network. In some embodiments, the electronic device 10 and the server 11 may be connected via wired communication. In certain embodiments, the electronic device 10 and the server 11 may be connected via wireless communication technology. The electronic device 10 and the server 11 may be connected to each other via various communication technologies including, but not limited to, Ethernet, Fibre Channel over Ethernet (FCoE), Peripheral Component Interconnect Express (PCIe), Advanced Host Controller Interface (AHCI), Bluetooth, Wi-Fi, and cellular data services such as GSM, CDMA, GPRS, WCDMA, EDGE, CDMA2000, or LTE, or a combination thereof.
The electronic device 10 may have a user interface through which the user inputs information and on which information is displayed. The electronic device 10 may provide the user with an authentication procedure. In some embodiments, the authentication procedure may be an identification procedure of the user. The electronic device 10 may have an image-capturing apparatus, and may execute an application program that performs image acquisition through that apparatus. In particular, the electronic device 10 may execute a living body image acquisition application and perform living body image acquisition via the image-capturing apparatus.
In some embodiments, the living body image may be a human face, a fingerprint, a palm print, or a part of the body with a biometric characteristic, such as the iris or retina of an eye. In some embodiments, the living body image acquisition application of the electronic device 10 may include a Software Development Kit (SDK). The software development kit has a liveness-detection function, which may include the following steps: (1) calling the image-capturing apparatus; (2) starting face recognition and establishing a face-recognition frame; (3) after a face is detected, judging its position; (4) judging whether the position is proper and whether the subject is a living body, which may include checking whether the subject blinks, opens the mouth, shakes the head, or nods; (5) after the subject is judged to be a living body, photographing with the image-capturing apparatus; (6) transmitting the acquired living body image data to the server 11. The steps described above are exemplary only and need not be performed in the order listed. In some embodiments, the electronic device 10 may be a portable device, such as a tablet, cell phone, watch, or other handheld device; in some embodiments, the electronic device 10 may be a stationary device, such as a computer.
The server 11 may be connected to the database 13 via a communication network. The server 11 and the server 12 may be connected via a communication network. The server 11 may include a cache. A cache may store information. The cache of the server 11 may store information entered by a user at the user interface of the electronic device 10. The server 11 may receive image data acquired by the electronic device 10. The cache of the server 11 may store the acquired image data.
The server 12 may include an interface. The server 11 and the server 12 may be communicatively connected via an interface. The server 12 may include a plurality of interfaces. In certain embodiments, server 12 may include an interface 121 and an interface 122. In some embodiments, the server 12 may include more interfaces. In certain embodiments, the server 12 may include fewer interfaces.
The interface 121 may be a get-key interface, and the interface 122 may be an authentication interface. The interface 121 or the interface 122 may be an Application Programming Interface (API). In some embodiments, when the server 11 receives information generated by the electronic device 10, the server 11 transmits the information to the interface 121 of the server 12, and the server 12 generates a key (token) when the information conforms to the specification of the interface. After the server 12 generates the key, the electronic device 10 may invoke an application to perform living body image acquisition on the user via the image-capturing apparatus. The database 13 may return information to the server 11 after the server 12 generates the key. The electronic device 10 can transmit the acquired living body image data to the interface 122 of the server 12 via the server 11. In some embodiments, the server 11 may transmit the pre-registered picture information P1 returned from the database 13 to the interface 122 of the server 12.
The living body image data is compared with the pre-registered picture information P1 in the server 12 to confirm whether the living body image data is identical or corresponds to the pre-registered picture information P1. In some embodiments, the living body image data is compared with the pre-registered picture information P1 at the interface 122 of the server 12. In some embodiments, the server 12 may request the database 14 to provide picture information P2; the living body image data and that picture information are then compared in the server 12 to confirm whether they are identical or correspond to each other, and in some embodiments this comparison likewise takes place at the interface 122 of the server 12. In some embodiments, the data of the whole area or a partial area of the captured living body image is compared with the whole area or a partial area of the picture information P1 (or P2); when the error value between the two is smaller than a threshold T1, or the similarity between the two is greater than a threshold T2, the living body image is determined to be identical or corresponding to the picture information P1 (or P2).
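The threshold rule above can be sketched in a few lines. This is a minimal illustration, not the patent's actual comparison algorithm: the function names are invented, image regions are modeled as flat sequences of pixel values, and mean absolute difference stands in for whatever error metric the real system uses.

```python
def region_error(live, registered):
    """Mean absolute difference between a whole or partial region of the
    live image and the same region of the registered picture, both given
    as equal-length sequences of pixel values (an assumed metric)."""
    if len(live) != len(registered):
        raise ValueError("regions must be the same size")
    return sum(abs(a - b) for a, b in zip(live, registered)) / len(live)

def images_match(live, registered, t1):
    """Deem the live image identical/corresponding to the picture
    information when the error value is below threshold T1; the
    'similarity greater than T2' branch works analogously."""
    return region_error(live, registered) < t1
```

The key design point carried over from the text is that the test is a disjunction: either a small enough error or a large enough similarity suffices.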
In certain embodiments, the database 14 already stores sensitive data of users, non-sensitive data of users, and picture information P2 of the user. In some embodiments, the key must be obtained before a data request is made to the database 14. In some embodiments, a data request to the database 14 requires accessing the interface 122 together with sensitive information. In certain embodiments, the database 14 may be an authoritative data source. In certain embodiments, the database 13 is different from the database 14. The database 14 may be connected only to the server 12 via a communication network; in some embodiments, the database 14 is not connected to the server 11.
In some embodiments, the server 11, the server 12, and the database 13 may establish a cloud system. In some embodiments, the database 14 is not included in the cloud system. In some embodiments, the server 11 may be a service layer (service layer) in the cloud. In some embodiments, the server 12 may be a service layer in the cloud. In some embodiments, the database 13 may be a data source layer. In some embodiments, the database 14 may be a data source layer.
The database 13 may store sensitive information I1 previously registered by the user; in some embodiments, the sensitive information I1 may include a name, an identification number, an age, a date of birth, and the like. The database 13 may also store non-sensitive information previously registered by the user; in some embodiments, the non-sensitive information may include a cell phone number, a personal email account, and the like. The database 13 may store status values of users. In some embodiments, a user's status value may be related to the user's qualification, to whether the user's identity has been authenticated, or to whether the identity has passed the strong check. The database 13 may also store first picture information previously registered by the user.
Before the identity authentication system of the present disclosure is executed, the user may perform a strong check in advance. When the strong check is completed, the database 13 stores the strong check state as the state value S1 (e.g., the state value S1 is "1"); when the strong check is not completed, the database 13 stores the strong check state as the state value S2 (e.g., the state value S2 is "0"). In some embodiments, the strong check may include the following steps: (1) the database 13 stores the picture information P1 and the sensitive information I1 pre-registered by the user; (2) the pre-registered sensitive information I1 is transmitted to the interface 122 of the server 12 via the server 11; (3) the database 14 provides sensitive information I2 and picture information P2; (4) the sensitive information I1 is compared with the sensitive information I2 at the interface 122 (comparison C1); (5) the picture information P1 is compared with the picture information P2 at the interface 122 (comparison C2), wherein in some embodiments the comparison C2 comprises comparing the data of the whole area or a partial area of the captured living body image with the whole area or a partial area of the picture information P1 (or P2), and determining that they are identical or corresponding when the error value between the two is smaller than the threshold T1 or the similarity between the two is greater than the threshold T2; (6) when the comparison C1 finds the information the same and the comparison C2 finds the pictures identical or corresponding, the server 12 writes the strong check state value as the state value S1 and returns it to the database 13; (7) otherwise, the server 12 writes the strong check state value as the state value S2 and returns it to the database 13 through the server 11; (8) the database 13 stores the strong check state value. In some embodiments, the cache of the server 11 may store the state value S1 or the state value S2. The steps described above are exemplary only and need not be performed in the order listed.
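The decision at the end of the strong check, combining comparisons C1 and C2, can be sketched as follows. The function name and parameters are illustrative only; the patent does not specify how I1/I2 or the picture metrics are represented.

```python
STATE_S1 = "1"   # strong check completed
STATE_S2 = "0"   # strong check not completed

def strong_check(i1, i2, error_value, similarity, t1, t2):
    """Comparison C1: the pre-registered sensitive information I1 must
    equal the authoritative copy I2 from database 14.  Comparison C2:
    the pictures count as identical/corresponding when the error value
    is below threshold T1 or the similarity exceeds threshold T2.
    Returns the strong check state value written back to database 13."""
    c1_same = (i1 == i2)
    c2_same = (error_value < t1) or (similarity > t2)
    return STATE_S1 if (c1_same and c2_same) else STATE_S2
```

Note that C1 and C2 are conjoined: failing either comparison yields the state value S2.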
The server 12 may request the database 14 to provide sensitive information I3. The sensitive information I3 is transmitted back to the server 11, which determines whether it is greater than or equal to a threshold T3. When the sensitive information I3 is greater than or equal to the threshold T3, the server 11 writes the user's qualification status value as ES1 (e.g., the qualification status value ES1 is "1") and transmits it to the database 13 for storage. In some embodiments, when the sensitive information I3 is less than the threshold T3, the server 11 writes the user's qualification status value as ES2 (e.g., the qualification status value ES2 is "0") and transmits it to the database 13 for storage; the cache of the server 11 stores the user's qualification status value. In some embodiments, the sensitive information I3 may be the user's age. In some embodiments, the threshold T3 may be a constant value.
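The qualification decision is a single threshold comparison, sketched below with illustrative names. The default T3 = 18 is an assumption for the example; the text only says T3 may be a constant value.

```python
QUAL_ES1 = "1"   # user qualified
QUAL_ES2 = "0"   # user not qualified

def qualification_status(i3, t3=18):
    """Server 11 writes ES1 when the sensitive information I3 (here
    assumed to be the user's age) is greater than or equal to the
    threshold T3, and ES2 otherwise.  t3=18 is an assumed example
    value, not taken from the patent."""
    return QUAL_ES1 if i3 >= t3 else QUAL_ES2
```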
Before the identity authentication system of the present disclosure is executed, the user may also perform sensitive information authentication in advance. When the sensitive information authentication is completed, the database 13 stores the sensitive-information-authentication status as the status value S3 (e.g., the status value S3 is "1"); when it is not completed, the database 13 stores the status as the status value S4 (e.g., the status value S4 is "0"). The pre-registered sensitive information I1 may be transmitted from the database 13 to the interface 121 of the server 12 via the server 11; the server 12 generates a key when the sensitive information conforms to the specification of the interface 121, and after the key is generated the server 12 may request the database 14 to return the sensitive information I2 corresponding to the user's sensitive information I1. Whether the sensitive information I1 and the sensitive information I2 are the same is then determined at the interface 121 of the server 12.
When they are the same, i.e., the sensitive information authentication is completed, the server 12 writes the sensitive-information-authentication status as the status value S3 (e.g., the status value S3 is "1") and transmits it to the server 11. In some embodiments, the cache of the server 11 may store the status value S3. In some embodiments, the status value S3 is transmitted to the database 13 via the server 11 for storage. When the authentication is not completed, the server 12 writes the sensitive-information-authentication status as the status value S4 (e.g., the status value S4 is "0") and transmits it to the server 11. In some embodiments, the cache of the server 11 may store the status value S4. In some embodiments, the status value S4 is transmitted to the database 13 via the server 11 for storage.
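The sensitive-information-authentication flow above, in which the key is generated only when I1 conforms to the interface specification and the comparison against database 14's copy happens afterwards, can be sketched like this. All names are illustrative; the interface specification is modeled merely as a required set of fields.

```python
STATUS_S3 = "1"   # sensitive information authentication completed
STATUS_S4 = "0"   # sensitive information authentication not completed

def authenticate_sensitive(i1, required_fields, fetch_i2):
    """Sketch of the flow with invented names: the key generation is
    gated on I1 conforming to interface 121's specification (modeled
    here as a required field set); only then is the authoritative
    record I2 fetched from database 14 and compared for equality."""
    if set(i1) != set(required_fields):   # spec check gates key generation
        return STATUS_S4
    i2 = fetch_i2()                       # database 14's copy of the record
    return STATUS_S3 if i1 == i2 else STATUS_S4
```

Passing `fetch_i2` as a callable reflects the ordering constraint in the text: database 14 is only consulted after the key exists.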
In some embodiments, a user who has not performed sensitive information authentication has the sensitive-information-authentication status preset to the status value S4. In some embodiments, only users who have completed sensitive information authentication have their strong check state preset to the state value S2. In some embodiments, only users who have completed sensitive information authentication have their qualification preset to the qualification status value ES2.
FIG. 1B illustrates a schematic diagram of a system 1B for identifying a user identity, according to some embodiments of the present application. As shown in fig. 1B, the identification system 1B is similar to the identification system 1A, except that the identification system 1B further includes an electronic device 10', which is connected to the server 11 via a communication network. In some embodiments, the electronic device 10' and the server 11 may be connected via wired communication. In certain embodiments, the electronic device 10' and the server 11 may be connected via wireless communication technology.
In some embodiments, a person facing the user (e.g., a store attendant) may operate the electronic device 10'. That person may operate the electronic device 10' to stop the identity authentication procedure.
Fig. 2A-2G illustrate schematic diagrams of user interfaces 2a-2g of the electronic device 10 according to some embodiments of the present application.
As shown in fig. 2A, the electronic device 10 may include a housing 20, an image-capturing apparatus 21, and a display 22. The image-capturing apparatus 21 is disposed at an edge of the housing, and the display 22 is disposed on a surface of the housing. In some embodiments, the electronic device 10 may include a control module disposed inside the electronic device 10 to control the image-capturing apparatus 21 and the display 22. The electronic device 10 may also include a storage module disposed inside the electronic device 10 and communicatively connected with the control module. The storage module can store information input by the user, the living body image data, and information from the server 11.
The control module may control the display 22 to display the user interfaces 2a-2g (shown in fig. 2A-2G) and may be communicatively coupled to the server 11. As shown in fig. 2A, the display 22 displays a user interface 2a, which includes an indication 2a1 that prompts the user to enter non-sensitive information, an input box 2a2 for user input, and a confirmation icon 2a3. After the user enters non-sensitive information in the input box 2a2, touching the confirmation icon 2a3 causes the display 22 to display the user interface 2b, 2d, or 2e.
As shown in FIG. 2B, the display 22 displays a user interface 2b, which includes an indication 2b1 prompting the user to enter a verification code. The verification code may be received on a mobile device of the user. The user interface 2b includes an input box 2b2 for user input and a confirmation icon 2b3. After the user enters the verification code in the input box 2b2, touching the confirmation icon 2b3 causes the display 22 to display the user interface 2b or 2c.
As shown in fig. 2C, the display 22 displays a user interface 2c, which includes an image display frame 2c1 for synchronously displaying the images captured by the image-capturing apparatus. When the biometric feature of the user is completely displayed within the range defined by the image display frame 2c1, the display 22 displays the user interface 2e. In some embodiments, the biometric feature may be a human face. In some embodiments, when the display 22 displays the user interface 2c, the image-capturing apparatus performs the liveness-detection function.
As shown in fig. 2D, the display 22 displays a user interface 2d, which includes an indication 2d1 prompting the user to input the sensitive information to be checked. The user interface 2d includes input boxes 2d2 and 2d3 for user input and a confirmation icon 2d4. After the user enters the sensitive information to be checked in the input boxes 2d2 and 2d3, touching the confirmation icon 2d4 causes the display 22 to display the user interface 2c or 2e. In some embodiments, the user enters an identification number and a name in the input boxes 2d2 and 2d3, respectively. In some embodiments, the user interface 2d may include additional input boxes for the user to enter additional sensitive or non-sensitive information. In some embodiments, the user interface 2d may integrate the input boxes 2d2 and 2d3 into a single input box.
As shown in FIG. 2E, the display 22 displays a user interface 2e, which displays an icon 2e1 indicating "in progress". In some embodiments, while the display 22 displays the user interface 2e, the living body image data is compared with the picture information P1 of the database 13 at the interface 122 of the server 12. In some embodiments, while the display 22 displays the user interface 2e, the living body image data is compared with the picture information P2 of the database 14 at the interface 122 of the server 12. When the comparison is complete, the display 22 may display a user interface 2f or 2g.
As shown in fig. 2F, the display 22 displays a user interface 2f, which displays an icon 2f1 indicating "authentication completed".
As shown in fig. 2G, the display 22 displays a user interface 2g, which displays an icon 2g1 indicating "authentication failed".
Fig. 3 illustrates a flow diagram of an identity recognition method according to some embodiments of the present application. The flow chart of fig. 3 represents operations performed in sequence in the identification system 1A described in fig. 1A.
In operation 301, the user prepares for authentication, and the display 22 of the electronic device 10 displays the user interface 2a. In some embodiments, the electronic device 10 may be in a standby state, and the user may wake it into the user interface 2a by touching the display 22 of the electronic device 10.
In operation 302, according to the prompt 2a1 of the user interface 2a, the user enters non-sensitive information, which in some embodiments may include the user's phone number, email account, and the like.
In operation 303, the electronic device 10 transmits the non-sensitive information to the database 13 via the server 11, and searches the database 13 for a status value of the user related to the non-sensitive information. In some embodiments, the user's state value may be a strongly checked state value.
In operation 304, when the server 11 confirms that the user's strong check state value is the state value S1 (e.g., the state value S1 is "1"), the method proceeds to operation 305. When the server 11 confirms that the user's strong check state value is the state value S2 (e.g., the state value S2 is "0"), the method proceeds to operation 309.
In operation 305, the database 13 transmits a verification code to the electronic device 10, the electronic device 10 controls the display 22 to change from the user interface 2a to the user interface 2b, and the database 13 transmits the verification code to a personal device of the user via the communication network. The personal device of the user may be a portable device such as a mobile phone, a tablet, or a smart watch. The user inputs the verification code received by the personal device according to the prompt 2b1 of the user interface 2b. When the electronic device 10 determines that the verification code input by the user is the same as the verification code transmitted by the server 11, the server 11 generates an indication. In some embodiments, the indication may be a Universally Unique Identifier (UUID) or a Globally Unique Identifier (GUID). The indication is transmitted via the server 11 to the interface 121 of the server 12, and the server 12 generates the key TK1 when the indication conforms to the specification of the interface 121.
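The verification-code handshake of operation 305 can be sketched as follows. This is an illustrative assumption of how a server might generate and check the one-time code and issue the indication; the 6-digit format and the function names are not from the disclosure:

```python
import secrets
import uuid

def generate_verification_code(n_digits: int = 6) -> str:
    # A numeric one-time code such as the one the database 13 transmits
    # to the user's personal device (6 digits is an assumption).
    return "".join(str(secrets.randbelow(10)) for _ in range(n_digits))

def issue_indication(entered_code: str, sent_code: str):
    # When the entered code matches the transmitted one, the server 11
    # generates an indication, e.g. a UUID; otherwise nothing is issued.
    if entered_code == sent_code:
        return str(uuid.uuid4())
    return None
```

The returned UUID plays the role of the indication that is then checked against the specification of the interface 121.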
In operation 306, after the key TK1 is generated, the display 22 of the electronic device 10 changes from the user interface 2b to the user interface 2c. After the key TK1 is generated, the electronic device 10 executes an image capturing application and performs live image capture of the user through the image capturing device 21. In some embodiments, the server 12 may transmit the key TK1 to the server 11, and the server 11 transmits an instruction to the electronic device 10 to invoke the image capturing application based on the key TK1. The electronic device 10 transmits the acquired live image data IM1 to the server 11. In some embodiments, the cache of the server 11 may store the live image data IM1. The electronic device 10 transmits the acquired live image data IM1 to the interface 122 of the server 12 via the server 11.
In operation 307, the server 11 requests the database 13 to transmit the pre-registered picture information P1 of the user to the interface 122 of the server 12, the live image data IM1 is compared with the picture information P1 at the interface 122, and it is confirmed whether the live image data IM1 is identical to or corresponds to the picture information P1. In some embodiments, the live image data IM1 and the picture information P1 may be human faces. The server 12 retrieves data of a whole area or a partial area from the live image data IM1 and compares it with the whole area or the partial area of the picture information P1; when the error value between the two is smaller than a threshold T2 or the similarity between the two is greater than a threshold T3, the live image is determined to be identical to or corresponding to the picture information, and the method proceeds to operation 308. When the error value between the two is greater than the threshold T2 or the similarity between the two is less than the threshold T3, the method proceeds to operation 311.
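The match rule of operation 307 reduces to a two-threshold test. A minimal sketch, where the threshold values and the function name are illustrative assumptions (the disclosure does not specify concrete values for T2 or T3):

```python
def live_image_matches(error_value: float, similarity: float,
                       threshold_t2: float, threshold_t3: float) -> bool:
    """Operation 307's decision: the live image data is treated as
    identical to or corresponding to the pre-registered picture when
    the error value is below threshold T2 or the similarity is above
    threshold T3."""
    return error_value < threshold_t2 or similarity > threshold_t3
```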
In some comparative embodiments, the identity authentication procedure only compares whether the verification code input by the user on the electronic device is the same as the verification code generated by the server, which may allow a third party who steals the user's mobile device to complete identity authentication. The identity authentication procedure disclosed herein requires at least both the verification code determination and the comparison between the live image data IM1 and the pre-registered picture information P1, so that identity recognition is more rigorous and accurate, effectively preventing theft of users' sensitive information.
In operation 308, when the processing unit determines that the live image data IM1 is identical to or corresponds to the picture information P1, the processing unit transmits confirmation-complete information to the electronic device 10, and the electronic device 10 causes the display 22 to display the user interface 2f according to the confirmation-complete information.
In some embodiments, after operation 308, the electronic device 10 may provide the user with a payment function.
In operation 309, the state value of the sensitive information authentication stored in the database 13 in relation to the non-sensitive information is determined. When the state value of the sensitive information authentication is determined to be the state value S3, the method proceeds to operation 310. When the state value of the sensitive information authentication is determined to be the state value S4, the method proceeds to operation 311. In some embodiments, the cache of the server 11 may store the state value of the sensitive information authentication, and the server 11 may determine the state value S3 or the state value S4 according to the state value of the sensitive information authentication stored in the cache.
In operation 310, the database 13 transmits the sensitive information I1 to the interface 121 of the server 12 via the server 11. The server 12 generates the key TK2 when the sensitive information I1 conforms to the specification of the interface 121. In certain embodiments, the sensitive information I1 may include a plurality of items of sensitive data. In some embodiments, the sensitive information I1 may be a combination of an identification number and a name.
When operation 306 is reached via operation 310, the display 22 of the electronic device 10 changes from the user interface 2b to the user interface 2c upon generation of the key TK2. The electronic device 10 executes the application program and performs live image capture of the user through the image capturing device 21. In some embodiments, the server 12 may transmit the key TK2 to the server 11, and the server 11 transmits an instruction to the electronic device 10 to invoke the image capturing application based on the key TK2. The electronic device 10 transmits the acquired live image data to the server 11. In some embodiments, the cache of the server 11 may store the live image data. The electronic device 10 transmits the acquired live image data IM2 to the interface 122 of the server 12 via the server 11.
When operation 307 is reached through operations 306 and 310, since the database 13 does not store the picture information P1 of the user, the server 12 requests the database 14 to transmit the picture information P2 related to the user. In some embodiments, the server 11 may access the interface 122 of the server 12 by using the live image data IM2 and the sensitive information I1, and the server 12 requests the database 14 to provide the sensitive information I2 and the picture information P2 related to the user. The live image data IM2 is compared with the pre-stored picture information P2 at the interface 122 to determine whether the live image data is identical to or corresponds to the picture information. In certain embodiments, the sensitive information I1 and the sensitive information I2 may be compared at the interface 122. When it is determined that the live image data IM2 is the same as or corresponds to the picture information P2 of the user, the method proceeds to operation 308. Otherwise, the method proceeds to operation 311. In some embodiments, when the live image data IM2 is determined to be the same as or corresponding to the picture information P2 of the user, the server 12 writes the strong check state as the state value S1 (e.g., the state value S1 is "1"). The server 12 may transmit the strong check state value S1 to the server 11, and the cache of the server 11 may store the strong check state value S1. In some embodiments, the server 12 may transmit the state value S1 indicating completion of the strong check to the database 13 via the server 11 for storage. In some embodiments, the default strong check state value in the database 13 may be overwritten with the state value S1.
In operation 307, the server 12 may request the database 14 to provide the sensitive information I3. The sensitive information I3 is transmitted back to the server 11 to determine whether the sensitive information I3 is greater than the threshold T3. When the sensitive information I3 is greater than the threshold T3, the server 11 writes the user qualification as the qualification state value ES1 (e.g., the qualification state value ES1 is "1") and transmits the user qualification state value to the database 13 for storage. In some embodiments, when the sensitive information I3 is less than the threshold T3, the server 11 writes the user qualification as the qualification state value ES2 (e.g., the qualification state value ES2 is "0") and transmits it to the database 13 for storage, and the cache of the server 11 stores the user qualification state value. In some embodiments, the sensitive information I3 may be an age value of the user. When the user qualification is written as the qualification state value ES1, the method proceeds to operation 308. When the user qualification is written as the qualification state value ES2, the method proceeds to operation 311. In some embodiments, the server 12 may also determine whether the sensitive information I3 is greater than the threshold T3.
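The qualification determination can be sketched as a single comparison. The age threshold of 18 and the "1"/"0" encoding of ES1/ES2 follow the examples in the text; the specific threshold value is an illustrative assumption:

```python
def qualification_state(age: int, threshold_t3: int = 18) -> str:
    """Qualification check sketch: sensitive information I3 (here, an
    age value) is compared against threshold T3. Returns "1" for the
    qualification state value ES1, "0" for ES2."""
    return "1" if age > threshold_t3 else "0"
```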
In certain embodiments, when the strong check state is the state value S1 and the user qualification is the qualification state value ES1, the method proceeds to operation 308. In certain embodiments, when the strong check state is the state value S1 and the user qualification is the qualification state value ES2, when the strong check state is the state value S2 and the user qualification is the qualification state value ES1, or when the strong check state is the state value S2 and the user qualification is the qualification state value ES2, the method proceeds to operation 311. The identity verification method of the present disclosure needs to obtain at least two different types of specific state values in operation 307 before proceeding to operation 308 to complete the identity authentication procedure. The confirmation-complete information is transmitted to the electronic device 10, and the electronic device 10 displays the user interface 2f on the display 22 according to the confirmation-complete information. Since the identity authentication procedure can be judged complete only after at least two different types of state values are acquired, the rigor of identity recognition is effectively improved. Convenience is also provided: a channel is offered so that a user who has not completed the strong check in advance, due to forgetting or other factors, can complete the identity verification procedure in a timely manner. In some comparative embodiments, the identity verification procedure only includes comparison of sensitive information, which may result in theft of the user's sensitive information. The identity verification method disclosed herein requires at least the strong check and the user qualification determination, so that identity recognition is more rigorous and accurate, effectively preventing theft of users' sensitive information.
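The combined decision over the two state values is a simple conjunction. A minimal sketch, using the "1"/"0" encodings given as examples in the text (function name is hypothetical):

```python
def decide_next_operation(strong_check: str, qualification: str) -> int:
    """Combined decision: only the combination of strong check state S1
    ("1") and qualification state ES1 ("1") proceeds to operation 308;
    every other combination proceeds to operation 311."""
    if strong_check == "1" and qualification == "1":
        return 308
    return 311
```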
In operation 311, the electronic device 10 transitions the display 22 from the user interface 2a to the user interface 2d, and the user may input the sensitive information to be checked VI1 in the input boxes 2d2 and 2d3 according to the prompt 2d1 of the user interface 2d.
In operation 312, the display 22 of the electronic device 10 transitions from the user interface 2d to the user interface 2e. The electronic device 10 transmits the sensitive information to be checked VI1 to the interface 121 of the server 12 via the server 11, and the server 12 may request the database 14 to transmit the sensitive information I2 to the interface 121. The sensitive information to be checked VI1 is compared with the sensitive information I2 at the interface 121 to determine whether the two are the same. When they are the same, the server 12 writes the sensitive information authentication as the state value S1 and generates the key TK3. The key TK3 may be identical to the key TK2. When the sensitive information to be checked VI1 differs from the sensitive information I2, the method returns to operation 311, and the display 22 of the electronic device 10 changes from the user interface 2e to the user interface 2d, so that the user can input new sensitive information to be checked again.
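Operation 312's comparison and key issuance can be sketched as follows. Using a constant-time comparison and a UUID-style key are our choices for the sketch, not details stated in the disclosure:

```python
import hmac
import uuid

def verify_sensitive_info(vi1: str, i2: str):
    """Compare the sensitive information to be checked VI1 with the
    stored sensitive information I2; on a match, return the
    authentication state value S1 ("1") and a freshly generated key
    standing in for TK3. hmac.compare_digest avoids timing side
    channels when comparing secrets."""
    if hmac.compare_digest(vi1.encode(), i2.encode()):
        return "1", str(uuid.uuid4())   # (state value S1, key TK3)
    return "0", None                    # mismatch: retry via operation 311
```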
In some embodiments, when the database 14 cannot find the sensitive information I2 associated with the sensitive information to be checked VI1, the method returns to operation 311.
In operation 313, the display 22 of the electronic device 10 transitions from the user interface 2b to the user interface 2c upon generation of the key TK3. The electronic device 10 executes the application program to perform live image capture of the user through the image capturing device 21. In some embodiments, the server 12 may transmit the key TK3 to the server 11, and the server 11 transmits an instruction to the electronic device 10 to invoke the image capturing application based on the key TK3. The electronic device 10 transmits the acquired live image data IM3 to the server 11. In some embodiments, the cache of the server 11 may store the live image data IM3. The electronic device 10 transmits the acquired live image data IM3 to the interface 122 of the server 12 via the server 11.
In operation 314, since the database 13 does not store the picture information P1 of the user, the server 12 requests the database 14 to transmit the picture information P2 related to the user. In some embodiments, the server 11 may transmit the live image data IM3 and the sensitive information to be checked VI1 to the interface 122, and the server 12 requests the database 14 to provide the sensitive information I2 and the picture information P2 related to the user. The live image data IM3 is compared with the picture information P2 at the interface 122 to determine whether the live image data IM3 is the same as or corresponds to the picture information P2. In certain embodiments, the sensitive information to be checked VI1 and the sensitive information I2 may be compared at the interface 122. When it is determined that the live image data IM3 is the same as or corresponds to the picture information P2 of the user, the method proceeds to operation 315. Otherwise, the method proceeds to operation 316. In some embodiments, when the live image data IM3 is determined to be identical to or corresponding to the picture information P2 of the user, the server 12 writes the strong check state as the state value S1 (e.g., the state value S1 is "1"). The server 12 may transmit the strong check state value S1 to the server 11, and the cache of the server 11 may store the strong check state value S1. In some embodiments, the server 12 may transmit the state value S1 indicating completion of the strong check to the database 13 via the server 11 for storage. In some embodiments, the default strong check state value in the database 13 may be overwritten with the state value S1.
In operation 314, the server 12 may request the database 14 to provide the sensitive information I3. The sensitive information I3 is transmitted back to the server 11 to determine whether the sensitive information I3 is greater than the threshold T3. When the sensitive information I3 is greater than the threshold T3, the server 11 writes the user qualification as the qualification state value ES1 (e.g., the qualification state value ES1 is "1") and transmits the user qualification state value to the database 13 for storage. In some embodiments, when the sensitive information I3 is less than the threshold T3, the server 11 writes the user qualification as the qualification state value ES2 (e.g., the qualification state value ES2 is "0") and transmits it to the database 13 for storage. The cache of the server 11 may store the user qualification state value. In some embodiments, the sensitive information I3 may be an age value of the user. In some embodiments, the threshold T3 may be a constant value. When the user qualification is written as the qualification state value ES1, the method proceeds to operation 315. When the user qualification is written as the qualification state value ES2, the method proceeds to operation 316. In some embodiments, the server 12 may also determine whether the sensitive information I3 is greater than the threshold T3.
In certain embodiments, when the strong check state is the state value S1 and the user qualification is the qualification state value ES1, the method proceeds to operation 315. In certain embodiments, when the strong check state is the state value S1 and the user qualification is the qualification state value ES2, when the strong check state is the state value S2 and the user qualification is the qualification state value ES1, or when the strong check state is the state value S2 and the user qualification is the qualification state value ES2, the method proceeds to operation 316. The identity verification method of the present disclosure needs to obtain at least two different types of state values in operation 314 before proceeding to operation 315, and further to operation 308, to complete the identity authentication procedure. The confirmation-complete information is transmitted to the electronic device 10, and the electronic device 10 displays the user interface 2f on the display 22 according to the confirmation-complete information. Since the identity authentication procedure can be judged complete only after at least two different types of state values are acquired, the rigor of identity recognition is effectively improved. Convenience is also provided: a channel is offered so that a user who has not completed the strong check in advance, due to forgetting or other factors, can complete the identity verification procedure in a timely manner. In some comparative embodiments, the identity verification procedure only includes comparison of sensitive information, which may result in theft of the user's sensitive information. The identity verification method disclosed herein requires at least the strong check and the user qualification determination, so that identity recognition is more rigorous and accurate, effectively preventing theft of users' sensitive information.
Furthermore, when the user is a new customer and the server 11 does not store the sensitive information of the user, the identity verification method of the present disclosure provides a channel for the user to perform the authentication of the sensitive information to be checked, the strong check, and the qualification determination in real time, thereby simplifying the identity recognition process and attracting new customers to use the service or purchase the product.
In operation 315, the server 11 transmits the live image data IM3 of the user to the database 13 for storage. The server 11 transmits the strong check state value S1 and the qualification state value ES1 of the user to the database 13 for storage.
When operation 308 is reached through operation 315, the server 11 transmits the confirmation-complete information to the electronic device 10, and the electronic device 10 causes the display 22 to display the user interface 2f according to the confirmation-complete information.
In operation 316, the server 12 transmits a termination signal to the electronic device 10 via the server 11, and the electronic device 10 displays the user interface 2g on the display 22 according to the termination signal. In some embodiments, the user interface of the termination mode includes a text prompt indicating that the service or product cannot be provided.
Fig. 4A illustrates a flow diagram of an identity recognition method according to some embodiments of the present application. The identity recognition method 4 includes operations 401 to 422 performed by the device 40A, the server 40B, the device 40C, the server 40D, the database 40E, and the server 40F in the identity recognition system. Fig. 4B illustrates a schematic diagram of the device 40C of FIG. 4A.
The device 40A and the server 40B are connected via a communication network. The device 40A and the server 40B may be connected to each other via various communication technologies including, but not limited to, Ethernet, Fibre Channel over Ethernet (FCoE), Peripheral Component Interconnect Express (PCIe), Advanced Host Controller Interface (AHCI), Bluetooth, WiFi, and cellular data services such as GSM, CDMA, GPRS, WCDMA, EDGE, CDMA2000, or LTE, or a combination thereof. In some embodiments, the device 40A may be a portable device, such as a tablet, a cell phone, a watch, or another handheld device. In some embodiments, the device 40A may be a stationary device, such as a computer.
The server 40B and the device 40C are connected via a communication network. The server 40B may receive instructions or information from the device 40C. The server 40B and the device 40C may be connected to each other via the same kinds of communication technologies as the device 40A and the server 40B. In some embodiments, the server 40B may include an Application Programming Interface (API). In some embodiments, the server 40B may be an Internet socket.
The device 40C and the server 40D are connected via a communication network. The device 40C and the server 40D may be connected to each other via various communication technologies including, but not limited to, Ethernet, Fibre Channel over Ethernet (FCoE), Peripheral Component Interconnect Express (PCIe), Advanced Host Controller Interface (AHCI), Bluetooth, WiFi, and cellular data services such as GSM, CDMA, GPRS, WCDMA, EDGE, CDMA2000, or LTE, or a combination thereof. In certain embodiments, the device 40C may be an electronic device. In some embodiments, the device 40C may be a portable device, such as a tablet, a cell phone, a watch, or another handheld device. In some embodiments, the device 40C may be a stationary device, such as a computer.
Referring to FIG. 4B, the device 40C includes a display 40C1, an image capturing device 40C2, a control module 40C3, and a storage module 40C4. FIG. 4B is merely exemplary in nature and does not mean that the components described above must be arranged as shown in FIG. 4B.
The display 40C1 may be disposed on a surface of the device 40C. The display 40C1 of the device 40C may display different user interface modes. In certain embodiments, the display 40C1 of the device 40C may display the user interfaces 2a-2g as in FIGS. 2A-2G. The display 40C1 of the device 40C may receive information input by the user, and the display 40C1 of the device 40C may display information. The device 40C may provide the user with an identity verification procedure. In some embodiments, the identity verification procedure may be an identity recognition procedure of the user.
A control module 40C3 is disposed within the device 40C and is configured to control the display 40C1 and the image capturing device 40C2. The image capturing device 40C2 is located on a surface of the device 40C and adjacent to the display 40C1. The control module 40C3 of the device 40C may execute applications and perform image capture via the image capturing device 40C2. The control module 40C3 of the device 40C may execute a live image capturing application and perform live image capture via the image capturing device 40C2. In some embodiments, the live image may be of a human face, a fingerprint, a palm print, a portion of an iris of an eye, a retina of an eye, or another human biometric characteristic. In certain embodiments, the live image capturing application of the device 40C may comprise a Software Development Kit (SDK). The software development kit has a living body detection function. The living body detection function may include the following steps: (1) calling the image capturing device; (2) starting face recognition and establishing a face recognition frame; (3) after a face is detected, determining its position; (4) determining whether the position is suitable and whether the subject is a living body, which may include determining whether the subject blinks, opens the mouth, shakes the head, or nods; (5) after the living body is confirmed, taking a photograph with the image capturing device; (6) transmitting the acquired live image data to the server 40D. The steps described above are exemplary only and need not be performed in the stated order.
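The six living body detection steps above can be sketched as a small pipeline. The camera call, face detector, position check, and liveness test are passed in as callables; all of them are hypothetical stand-ins for the SDK's internals, which the disclosure does not specify:

```python
def run_liveness_check(detect_face, position_ok, is_live, capture):
    """Sketch of the living body detection flow: find a face, check its
    position, run the liveness test (blink / open mouth / shake / nod),
    then photograph. Returns the captured image data, or None when any
    step fails."""
    face = detect_face()          # steps (1)-(3): camera + face recognition frame
    if face is None:
        return None               # no face detected
    if not position_ok(face):     # step (4): position suitable?
        return None
    if not is_live(face):         # step (4): living body test
        return None
    return capture(face)          # steps (5)-(6): photograph for the server 40D
```

A caller would then transmit the returned image data to the server 40D, as in step (6).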
The storage module 40C4 is disposed within the device 40C and is communicatively connected with the control module 40C3. The storage module 40C4 may store information entered by the user. The storage module 40C4 of the device 40C may store the live image data. The storage module 40C4 may store information from the server 40D.
Referring to FIG. 4A, the server 40D may be connected with the database 40E via a communication network. The server 40D may include a cache. The cache may store information. The cache of the server 40D may store information entered by the user at the user interface of the device 40C. The server 40D may receive image data acquired by the device 40C. The cache of the server 40D may store the acquired image data.
The database 40E may store sensitive information about the user, such as an identification number, a name, and a date of birth. The database 40E may store non-sensitive information about the user, such as a phone number, an email box, an account number of communication software, and an encrypted account number of communication software. The database 40E may establish logic or rules that associate particular sensitive information with particular non-sensitive information.
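The association rule of database 40E can be sketched as a keyed lookup from a non-sensitive identifier to a sensitive record. The class and field names below are illustrative assumptions, not the disclosure's schema:

```python
class UserDatabase:
    """Minimal sketch of database 40E's association logic: a
    non-sensitive identifier (phone number, email, ...) maps to a
    record of sensitive information."""

    def __init__(self):
        self._records = {}

    def register(self, non_sensitive_key: str, sensitive_record: dict):
        # Establish the rule associating the non-sensitive key with
        # the user's sensitive information.
        self._records[non_sensitive_key] = sensitive_record

    def lookup(self, non_sensitive_key: str):
        # Returns the associated sensitive record, or None when the
        # database holds no pre-stored information for this key.
        return self._records.get(non_sensitive_key)
```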
The server 40F and the server 40D may be connected via a communication network. The server 40F may generate secret information based on the information from the server 40D. In some embodiments, the secret information may be a key. Based on the information from the server 40D, the server 40F can determine whether or not the information of the server 40D corresponds to the pre-stored information of the external database.
In operation 401, the display 40C1 of the device 40C displays the user interface 2a. The user inputs information M1 according to the prompt 2a1 of the user interface 2a. In some embodiments, the information M1 may include the user's phone number, email account number, two-dimensional barcode, and so forth.
In operation 402, the server 40D receives the information M1 from the device 40C. In some embodiments, the cache of server 40D may store information M1.
In operation 403, the database 40E is searched, using the information M1, for pre-stored information PM1 corresponding to the information M1 and user information UM1 associated with the pre-stored information PM1. In certain embodiments, the information M1 may be the same as the pre-stored information PM1. In some embodiments, the user information UM1 may be sensitive information of the user.
In operation 404, when the database 40E has the pre-stored information PM1 corresponding to the information M1 and the user information UM1, the database 40E transfers the user information UM1 to the server 40D, and the server 40D accesses the server 40F based on the user information UM1.
In operation 405, the server 40F generates secret information SM1 based on the user information UM1.
In operation 406, the secret information SM1 is transmitted from the server 40F to the control module 40C3 of the device 40C via the server 40D. In some embodiments, the cache of the server 40D may store the secret information SM1.
In operation 407, the control module 40C3 of the device 40C receives the secret information SM1 from the server 40F and controls the display 40C1 of the device 40C to display the user interface mode 2c. The control module 40C3 of the device 40C transmits the state value S5 to the server 40B.
In operation 408, the server 40B transmits an instruction IS1 representing the state value S5 to the device 40A.
In operation 409, the display of the device 40A displays the user interface mode 2e in response to the status value S5.
In operation 410, the control module 40C3 of the device 40C controls the image capturing device 40C2 to capture the live image data IM4 of the user based on the secret information SM1.
In operation 411, the server 40D transmits the live image data IM4 and the user information UM1 to the server 40F. In some embodiments, the server 40D transmits the live image data IM4 and the secret information SM1 to the server 40F.
In operation 412, the server 40F determines whether the live image data IM4 corresponds to pre-stored picture information P3 of the external database, wherein the pre-stored picture information P3 is associated with the user. The server 40F determines whether the secret information SM1 corresponds to pre-stored information PM2 of the external database, wherein the pre-stored information PM2 is associated with the user. The server 40F determines whether the live image data IM4 and the secret information SM1 simultaneously correspond to the pre-stored picture information P3 and the pre-stored information PM2 of the external database. When both correspond, the method proceeds to operation 417. When either does not correspond, the method proceeds to operation 413.
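The simultaneous-correspondence rule of operation 412 is a conjunction over the two checks. A minimal sketch (the function name and boolean inputs are illustrative; how each correspondence is computed is left to the server 40F):

```python
def correspondence_branch(image_corresponds: bool,
                          secret_corresponds: bool) -> int:
    """Operation 412's branch: both the live image data (vs. P3) and the
    secret information (vs. PM2) must correspond for the flow to reach
    operation 417; if either fails, it falls through to operation 413."""
    return 417 if image_corresponds and secret_corresponds else 413
```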
In operation 413, the server 40D receives from the server 40F a state value S6 representing the non-corresponding determination of operation 412 and transmits it to the device 40C.
In operation 414, the display 40C1 of the device 40C displays the user interface mode 2g in response to the state value S6.
In operation 415, the server 40B transmits an instruction IS2 representing the state value S6 to the device 40A in response to the state value S6.
In operation 416, the display of the device 40A displays the user interface mode 2g in response to the instruction IS2.
In operation 417, the server 40D receives from the server 40F a state value S7 representing the corresponding determination of operation 412 and transmits it to the device 40C.
In operation 418, the display 40C1 of the device 40C displays the user interface mode 2h in response to the state value S7.
In operation 419, the server 40B transmits an instruction IS3 representing the state value S7 to the device 40A in response to the state value S7.
In operation 420, the display of the device 40A displays the user interface mode 2h in response to the instruction IS3.
In operation 421, the server 40D transmits the live image data IM4 and the user information UM1 to the database 40E.
In operation 422, the database 40E stores the live image data IM4 and the user information UM1.
Fig. 5 illustrates a flow diagram of an identity recognition method 5 according to some embodiments of the present application. The identity recognition method 5 is similar to the identity recognition method 4, except that the identity recognition method 5 includes operations 501 to 510 instead of operations 404 to 407 and 410 to 412 of the identity recognition method 4.
In operation 501, when the database 40E does not have the pre-stored information PM1 and the user information UM1 corresponding to the information M1, the server 40D transmits to the device 40C an instruction IS4 representing that the database 40E does not have the corresponding pre-stored information PM1 and user information UM1.
In operation 502, in response to the instruction IS4, the control module 40C3 of the device 40C controls the display 40C1 to jump to the user interface mode 2d.
In operation 503, the user inputs user information UM2 in the input boxes 2d2 and 2d3 according to the prompt of the user interface mode 2d. In some embodiments, the user information UM2 may include sensitive information. In some embodiments, the storage module 40C4 may store the user information UM2. The control module 40C3 of the device 40C transmits the state value S5 to the server 40B.
In operation 504, the server 40D accesses the server 40F based on the user information UM2.
In operation 505, the server 40F generates secret information SM2 based on the user information UM2.
In operation 506, the secret information SM2 is transmitted from the server 40F to the control module 40C3 of the device 40C via the server 40D. In some embodiments, the cache of the server 40D may store the secret information SM2.
In operation 507, the control module 40C3 of the device 40C receives the secret information SM2 from the server 40F and controls the display 40C1 of the device 40C to display the user interface mode 2c.
In operation 508, the control module 40C3 of the device 40C controls the image acquisition device 40C2 to capture the living body image IM5 of the user based on the secret information SM2.
In operation 509, the server 40D transmits the living body image data IM5 and the user information UM2 to the server 40F. In some embodiments, the server 40D transmits the living body image data IM5 and the secret information SM2 to the server 40F.
In operation 510, the server 40F determines whether the living body image data IM5 corresponds to the pre-stored picture information P3 of the external database, and whether the secret information SM2 corresponds to the user information UM1 of the external database. That is, the server 40F determines whether the living body image IM5 and the secret information SM2 correspond to the pre-stored picture information P3 and the pre-stored information PM2 of the external database, respectively. When both correspond, the method proceeds to operation 417. When either does not correspond, the method proceeds to operation 413.
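Operation 510's two-part decision (live image against pre-stored picture information, secret information against pre-stored information) can be sketched as follows. The byte-overlap comparison is a deliberately simplified stand-in, an assumption for illustration only; a real deployment would use a face-matching model, which the patent does not prescribe.

```python
def match_images(live_image: bytes, stored_image: bytes, threshold: float = 0.1) -> bool:
    """Toy image comparison: measure byte-wise disagreement and accept the
    match when the error value is below a threshold (cf. claim 11)."""
    if not live_image or not stored_image:
        return False
    overlap = sum(a == b for a, b in zip(live_image, stored_image))
    error = 1 - overlap / max(len(live_image), len(stored_image))
    return error < threshold


def operation_510(live_image_im5, secret_sm2, stored_picture_p3, stored_info_pm2):
    """Proceed to operation 417 only when BOTH checks correspond;
    when either fails, fall back to operation 413."""
    image_ok = match_images(live_image_im5, stored_picture_p3)
    secret_ok = secret_sm2 == stored_info_pm2
    return "operation_417" if (image_ok and secret_ok) else "operation_413"
```

The key design point mirrored here is that the two checks are conjunctive: a stolen verification code alone, or a matching face alone, is not enough to proceed.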
Fig. 6 illustrates a flow diagram of an identity recognition method according to some embodiments of the present application. The flowchart of Fig. 6 represents operations performed in succession in the identity recognition system described in Fig. 4A.
In operation 601, the user inputs information M1 on the display 40C1 of the device 40C.
In operation 602, the server 40D determines whether there is pre-stored information PM1 corresponding to the information M1 and user information UM1 associated with the pre-stored information PM1 in the database 40E. When present, the method proceeds to operation 603. When not present, the method proceeds to operation 607.
In operation 603, the server 40D requests the server 40F to generate the secret information SM1 based on the user information UM1.
In operation 604, the control module 40C3 of the device 40C controls the image acquisition device 40C2 to acquire the living body image data IM4 of the user based on the secret information SM1.
In operation 605, the server 40F determines whether the living body image data IM4 and the user information UM1 correspond to the pre-stored picture information P1 and the pre-stored information PM2 of the external database, respectively. When they correspond, the method proceeds to operation 606. When they do not correspond, the method proceeds to operation 607.
In operation 606, the display 40C1 of the device 40C displays the user interface mode 2h.
In operation 607, the user enters user information UM2 on the display of device 40C.
In operation 608, the server 40D requests the server 40F to generate the secret information SM2 based on the user information UM2.
In operation 609, the control module 40C3 of the device 40C controls the image acquisition device 40C2 to acquire the living body image data IM5 of the user based on the secret information SM2.
In operation 610, the server 40F determines whether the living body image data IM5 and the user information UM2 correspond to the pre-stored picture information P3 and the pre-stored information PM2 of the external database, respectively. When they correspond, the method proceeds to operation 606. When they do not correspond, the method returns to operation 607.
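The overall Fig. 6 flow, trying the pre-registered path (operations 601 to 606) and falling back to freshly entered user information (operations 607 to 610), can be sketched as one function. `FakeServer40F`, `identity_flow`, and the callback parameters are hypothetical names for exposition; they are not defined by the patent.

```python
class FakeServer40F:
    """Minimal stand-in for the external server 40F: it generates a
    secret per user and verifies live images against a registry."""
    def __init__(self, registry):
        self.registry = registry  # user information -> registered face image

    def generate_secret(self, user_info):           # operations 603 / 608
        return f"secret:{user_info}"

    def matches(self, live_image, user_info):       # operations 605 / 610
        return self.registry.get(user_info) == live_image


def identity_flow(info_m1, database_40e, server_40f, capture, prompt):
    """Fig. 6: pre-registered path first, then the new-customer path."""
    record = database_40e.get(info_m1)              # operation 602
    if record is not None:
        _pm1, um1 = record
        sm1 = server_40f.generate_secret(um1)       # operation 603
        im4 = capture(sm1)                          # operation 604
        if server_40f.matches(im4, um1):            # operation 605
            return "mode_2h"                        # operation 606
    um2 = prompt()                                  # operation 607
    sm2 = server_40f.generate_secret(um2)           # operation 608
    im5 = capture(sm2)                              # operation 609
    if server_40f.matches(im5, um2):                # operation 610
        return "mode_2h"
    return "retry"                                  # back to operation 607
```

Note how the new-customer branch reuses the same secret-generation and matching steps as the registered-user branch, which is what lets operation 610 share its outcome handling (operations 606 and 607) with operation 605.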
In some comparative embodiments, the identity authentication procedure only compares whether the verification code entered by the user on the electronic device matches the verification code generated by the server, which may allow a third party who steals the user's mobile device to complete identity authentication. The identity authentication procedure of the present disclosure requires at least both the verification code check and a comparison of the living body image data against pre-registered picture information, making identity recognition more rigorous and accurate and effectively preventing users' sensitive information from being stolen.
Furthermore, when the user is a new customer whose pre-stored information is not yet in the database 40E, the identity verification method of the present disclosure provides a channel for the user to authenticate the sensitive information to be checked in real time, which simplifies the identity recognition process and helps attract new customers to use the service or purchase the product.
As used herein, the terms "approximately," "substantially," "essentially," and "about" are used to describe and account for minor variations. When used in conjunction with an event or circumstance, the terms can refer to an instance in which the event or circumstance occurs precisely as well as an instance in which it occurs to a close approximation. When used in conjunction with a numerical value or range, the terms can refer to a range of variation of less than or equal to ±10% of the stated value, e.g., less than or equal to ±5%, ±4%, ±3%, ±2%, ±1%, ±0.5%, ±0.1%, or ±0.05%. Ranges may be expressed herein as from one endpoint to another endpoint or between two endpoints; unless otherwise specified, all ranges disclosed herein are inclusive of their endpoints. Two numerical values are considered "substantially" or "about" the same if the difference between them is less than or equal to ±10% (e.g., less than or equal to ±5%, ±4%, ±3%, ±2%, ±1%, ±0.5%, ±0.1%, or ±0.05%) of the mean of the values. "Substantially" parallel can refer to a range of angular variation of less than or equal to ±10° from 0°, e.g., less than or equal to ±5°, ±4°, ±3°, ±2°, ±1°, ±0.5°, ±0.1°, or ±0.05°; "substantially" perpendicular can refer to a range of angular variation of less than or equal to ±10° from 90° within the same bounds. The term "substantially coplanar" can refer to two surfaces located within a few micrometers (μm) of the same plane, e.g., within 10 μm, 5 μm, 1 μm, or 0.5 μm.
For example, two surfaces may be considered coplanar or substantially coplanar if the displacement between the two surfaces is equal to or less than 5 μm, equal to or less than 2 μm, equal to or less than 1 μm, or equal to or less than 0.5 μm. A surface may be considered planar or substantially planar if the displacement of the surface relative to the plane between any two points on the surface is equal to or less than 5 μm, equal to or less than 2 μm, equal to or less than 1 μm, or equal to or less than 0.5 μm.
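As a toy illustration only (these helper names are assumptions, not part of the disclosure), the displacement rules above reduce to simple threshold comparisons:

```python
def substantially_coplanar(displacement_um: float, bound_um: float = 5.0) -> bool:
    """Two surfaces count as substantially coplanar when the displacement
    between them is within the chosen micrometre bound."""
    return abs(displacement_um) <= bound_um


def substantially_planar(displacements_um, bound_um: float = 5.0) -> bool:
    """A surface counts as substantially planar when every sampled point
    stays within the bound relative to the reference plane."""
    return all(abs(d) <= bound_um for d in displacements_um)
```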
As used herein, the singular terms "a" and "the" may include plural referents unless the context clearly dictates otherwise. In the description of some embodiments, a component provided "on" or "over" another component may encompass the case where the preceding component is directly on (e.g., in physical contact with) the succeeding component, as well as the case where one or more intervening components are located between the preceding and succeeding components.
As used herein, spatially relative terms, such as "below," "lower," "above," "upper," "left," "right," and the like, may be used for ease of description to describe one component or feature's relationship to another component or feature as illustrated in the figures. Spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. It will be understood that when a component is referred to as being "connected" or "coupled" to another component, it can be directly connected or coupled to the other component, or intervening components may be present.
The foregoing outlines features of several embodiments and detailed aspects of the present disclosure. The embodiments described in this disclosure may readily be used as a basis for designing or modifying other processes and structures for carrying out the same or similar purposes and/or obtaining the same or similar advantages. Such equivalent constructions, and the various changes, substitutions, and alterations that may be made to them, do not depart from the spirit and scope of the present disclosure.

Claims (27)

1. A method of recognizing a user identity, comprising:
receiving, by an electronic device, non-sensitive information associated with a user;
determining a strong check state value associated with the non-sensitive information;
when the strong check state value is a first state value:
generating a mark by a first server;
reading first picture information of a first database;
starting an application program of the electronic device to acquire a first living body image of the user based on the mark and the first picture information; and
judging whether the first living body image corresponds to the first picture information.
2. The method of claim 1, further comprising:
when the strong check state value is a second state value:
determining a sensitive information authentication state value related to the non-sensitive information;
when the sensitive information authentication state value is a third state value:
reading first sensitive information related to the non-sensitive information from the first database;
starting the application program of the electronic device based on the first sensitive information to acquire a second living body image of the user; and
judging whether the second living body image corresponds to second picture information of a second database.
3. The method of claim 2, wherein the first picture information is different from the second picture information.
4. The method of claim 2, wherein the first database is different from the second database.
5. The method of claim 1, further comprising accessing a first interface using the mark and the first picture information to generate a first key, and invoking the application program of the electronic device based on the first key to acquire the first living body image.
6. The method of claim 2, further comprising accessing a first interface using the first sensitive information to generate a second key, and invoking the application program of the electronic device based on the second key to acquire the second living body image of the user.
7. The method of claim 2, further comprising:
when the sensitive information authentication state value is a fourth state value:
receiving sensitive information to be checked by the electronic device;
reading second sensitive information of the second database;
judging whether the sensitive information to be checked is the same as the second sensitive information;
when the sensitive information to be checked is judged to be the same as the second sensitive information:
starting the application program of the electronic device based on the sensitive information to be checked so as to capture a third living body image of the user; and
judging whether the third living body image corresponds to the second picture information.
8. The method of claim 2, further comprising:
when the strong check state value is the second state value:
determining a qualification status value for the user, comprising the steps of:
(1) reading third sensitive information of the second database;
(2) determining, by the first server, whether the third sensitive information is greater than a first threshold.
9. The method of claim 2, further comprising:
when the second live view image corresponds to the second picture information:
overwriting the strong check state value with the first state value.
10. The method of claim 9, further comprising:
reading third sensitive information of the second database;
when the third sensitive information is greater than a first threshold:
placing the user interface of the electronic device in an authentication complete mode.
11. The method of claim 1, further comprising determining whether the first living body image corresponds to the first picture information using a second interface, wherein the second interface captures data of all or a portion of the first living body image, compares the captured data with data of all or a corresponding portion of the first picture information, and determines whether an error value between the two is smaller than a second threshold.
12. The method of claim 1, further comprising:
when the first living body image does not correspond to the first picture information:
the first server transmits an instruction to place a user interface of the electronic device in a mode to receive sensitive information to be checked.
13. The method of claim 2, wherein the first sensitive information comprises two or more pieces of sensitive information.
14. A system for recognizing a user identity, comprising:
a first electronic device configured to receive non-sensitive information related to a user and comprising an image acquisition device;
a first server configured to determine a strong check state value associated with the non-sensitive information; and
a second server configured to communicatively couple with the first server,
wherein, when the first server determines that the strong check state value is a first state value, the first server generates a mark, the image acquisition device is started to acquire a first living body image of the user based on the mark and first picture information from a first database, and the second server determines whether the first living body image corresponds to the first picture information.
15. The system of claim 14, further comprising:
when the first server determines that the strong check state value is a second state value, the first server determines a sensitive information authentication state value related to the non-sensitive information; and
when the sensitive information authentication state value is a third state value:
the image acquisition device is started to acquire a second living body image of the user based on the first sensitive information related to the non-sensitive information in the first database, and the second server determines whether the second living body image corresponds to second picture information in a second database.
16. The system of claim 15, further comprising a second electronic device configured to stop an identity authentication procedure.
17. The system of claim 15, wherein the first picture information is different from the second picture information.
18. The system of claim 15, wherein the first database is different from the second database.
19. The system of claim 14, wherein the second server comprises a first interface, the mark and the first picture information access the first interface via the first server to generate a first key, and the image acquisition device of the first electronic device is activated by the first key to acquire the first living body image.
20. The system of claim 15, wherein the second server comprises a first interface, the first sensitive information accesses the first interface via the first server to generate a second key, and the image acquisition device of the first electronic device is activated based on the second key to acquire the second living body image.
21. The system of claim 15, wherein:
when the first server determines that the sensitive information authentication state value is a fourth state value:
the first electronic device receives sensitive information to be checked;
the second server determines whether the sensitive information to be checked is the same as second sensitive information of the second database;
when the sensitive information to be checked is determined to be the same as the second sensitive information:
the image acquisition device of the first electronic device is started to acquire a third living body image of the user based on the sensitive information to be checked; and
the second server determines whether the third living body image corresponds to the second picture information.
22. The system of claim 15, wherein, when the strong check state value is the second state value, the first server determines a qualification status value of the user by:
(1) reading third sensitive information of the second database; and
(2) determining whether the third sensitive information is greater than a first threshold.
23. The system of claim 15, wherein the strong check state value is overwritten with the first state value when the second living body image corresponds to the second picture information.
24. The system of claim 23, wherein the first server reads third sensitive information of the second database via the second server, and a user interface of the first electronic device is in an authentication complete mode when the third sensitive information is greater than a first threshold.
25. The system of claim 14, wherein the second server comprises a second interface, and the second interface determines whether the first living body image corresponds to the first picture information by comparing data of all or a portion of a captured area of the first living body image with data of all or a corresponding portion of the first picture information, and determining whether an error value between the two is smaller than a second threshold.
26. The system of claim 14, wherein, when the first living body image does not correspond to the first picture information, the first server transmits an instruction to place a user interface of the first electronic device in a mode for receiving sensitive information to be checked.
27. The system of claim 15, wherein the first sensitive information comprises two or more pieces of sensitive information.
CN201911358583.4A 2019-12-25 2019-12-25 System and method for identifying user identity Pending CN110895688A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911358583.4A CN110895688A (en) 2019-12-25 2019-12-25 System and method for identifying user identity


Publications (1)

Publication Number Publication Date
CN110895688A 2020-03-20

Family

ID=69787808

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911358583.4A Pending CN110895688A (en) 2019-12-25 2019-12-25 System and method for identifying user identity

Country Status (1)

Country Link
CN (1) CN110895688A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110895601A (en) * 2019-12-25 2020-03-20 深圳雾芯科技有限公司 User-identifying device and user-identifying system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106101136A (en) * 2016-07-22 2016-11-09 飞天诚信科技股份有限公司 The authentication method of a kind of biological characteristic contrast and system
CN108121902A (en) * 2017-12-21 2018-06-05 上海亦源智能科技有限公司 Recognition of face identity Self-certified method and system
US20190109834A1 (en) * 2017-10-10 2019-04-11 Truepic Inc. Methods for authenticating photographic image data
CN110245481A (en) * 2019-05-08 2019-09-17 深圳法大大网络科技有限公司 A kind of method, apparatus and terminal device of real-name authentication
CN110392041A (en) * 2019-06-17 2019-10-29 平安银行股份有限公司 Electronic authorization method, apparatus, storage equipment and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200320