
CN110825223A - Control method and intelligent glasses - Google Patents

Control method and intelligent glasses

Info

Publication number
CN110825223A
CN110825223A (application CN201911007754.9A)
Authority
CN
China
Prior art keywords
sensor
input
display screen
detected
unlocking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911007754.9A
Other languages
Chinese (zh)
Inventor
杜莉莉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911007754.9A priority Critical patent/CN110825223A/en
Publication of CN110825223A publication Critical patent/CN110825223A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a control method and smart glasses. The smart glasses include a first sensor, a second sensor, and a third sensor, and the method comprises: if a folding input is detected by the first sensor, an approach input of an external object is detected by the second sensor, and temperature data detected by the third sensor meets a first preset condition, searching for an external device and establishing a communication connection with it. The invention makes connecting the smart glasses to their electronic accessories more convenient.

Description

Control method and intelligent glasses
Technical Field
The invention relates to the technical field of electronic equipment, in particular to a control method and intelligent glasses.
Background
With the rapid development of AR (Augmented Reality) technology, AR has found useful applications in fields such as consumer products, medical treatment, and logistics. Because glasses are portable and sit close to the eyes, the industry regards them as one of the most suitable product carriers for AR technology. Compared with AR presented on the screen of a device such as a mobile phone, glasses have the advantage that both hands are left completely free for better interaction with the user.
Smart glasses are worn on the head, so the user experience suffers badly if they are too heavy. For this reason, most smart glasses adopt a split design: the power supply module, the processor module, and other functional modules are provided by separate electronic accessories (external devices such as mobile terminals). The smart glasses therefore have to be connected to the electronic accessories, for example over a wired data cable, before they can be used.
Clearly, the way smart glasses are connected to their electronic accessories in the related art is inconvenient.
Disclosure of Invention
Embodiments of the invention provide a control method and smart glasses to address the poor convenience of the connection between smart glasses and electronic accessories in the prior art.
To solve this technical problem, the invention is implemented as follows:
In a first aspect, an embodiment of the present invention provides a control method applied to smart glasses that include a first sensor, a second sensor, and a third sensor, the method including:
if a folding input is detected by the first sensor, an approach input of an external object is detected by the second sensor, and temperature data detected by the third sensor meets a first preset condition, searching for an external device and establishing a communication connection with the external device.
In a second aspect, an embodiment of the present invention further provides smart glasses, where the smart glasses include:
a first sensor, a second sensor, and a third sensor, and further include:
a searching module, configured to search for an external device and establish a communication connection with it if a folding input is detected by the first sensor, an approach input of an external object is detected by the second sensor, and temperature data detected by the third sensor meets a first preset condition.
In a third aspect, an embodiment of the present invention further provides a pair of smart glasses, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the control method.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the control method.
In the embodiments of the invention, a folding input can be detected by the first sensor, an approach input of an external object can be detected by the second sensor, and the temperature data detected by the third sensor can be checked against the first preset condition, so the action of a user putting on the glasses can be confirmed from the data detected by the three sensors; the external device is then searched for and connected automatically.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the invention, and those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a flow chart of a control method of one embodiment of the present invention;
FIG. 2 is a schematic structural diagram of smart eyewear in accordance with one embodiment of the present invention;
FIG. 3 is a flow chart of a control method of another embodiment of the present invention;
FIG. 4 is a block diagram of smart glasses according to one embodiment of the present invention;
fig. 5 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
First embodiment
Referring to fig. 1, a flowchart of a control method according to an embodiment of the present invention is shown, and the control method is applied to smart glasses.
Fig. 2 shows a schematic structural diagram of smart glasses according to an embodiment of the present invention.
The smart glasses may include a first sensor (e.g., acceleration sensor 21), a second sensor (e.g., distance sensor 22), and a third sensor (e.g., temperature sensor 23).
The method may specifically comprise the steps of:
Step 101: if a folding input is detected by the first sensor, an approach input of an external object is detected by the second sensor, and temperature data detected by the third sensor meets a first preset condition, search for an external device and establish a communication connection with the external device.
Optionally, as shown in fig. 2, the acceleration sensor 21 may be built into at least one temple of the smart glasses; preferably, one acceleration sensor 21 is disposed in each of the two temples.
The acceleration sensor 21 may be a gravity sensor.
A gravity sensor exploits the fact that a medium inside it deforms under acceleration. Because this deformation produces a voltage, the acceleration can be converted into a voltage output once the relationship between the generated voltage and the applied acceleration is calibrated. There are, of course, many other ways to build acceleration sensors, for example based on the capacitance effect, the thermal bubble effect, or optical effects, but the basic principle is the same: some medium deforms under acceleration, the deformation is measured, and related circuitry converts it into a voltage output. The correspondence between voltage and acceleration is then used to obtain the acceleration corresponding to the output voltage, yielding the acceleration data.
Specifically, in this embodiment, detecting the folding input with the first sensor may be implemented as follows: if the acceleration corresponding to the first sensing data received by the first sensor (for example, voltage data generated by deformation of the medium inside the first sensor, mapped to an acceleration value through the voltage-acceleration correspondence) is greater than a first threshold, it is determined that the first sensor has detected a folding input.
In the embodiments of the invention, the three conditions corresponding to the three sensors are mainly used to determine whether the user is putting on the smart glasses. To wear the glasses, the user first has to move the temples from the folded state to the unfolded state shown in fig. 2, and unfolding or folding the temples produces an acceleration. The acceleration sensor 21 of this embodiment therefore receives the acceleration corresponding to the first sensing data, and if the acceleration value is greater than the first threshold, it is determined that a folding input exists, that is, that the temples are currently being opened or folded.
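Purely as an illustrative sketch (the disclosure does not prescribe any particular implementation), the folding-input check could be organized as in the following Python fragment; the calibration constant VOLTS_PER_G and the value of FIRST_THRESHOLD are hypothetical names and numbers introduced here, not part of the patent.

```python
# Hypothetical sketch of the folding-input check of step 101.
VOLTS_PER_G = 0.3        # assumed calibration: volts produced per 1 g of acceleration
FIRST_THRESHOLD = 1.5    # assumed "first threshold", in g

def voltage_to_acceleration(voltage: float) -> float:
    """Convert the voltage generated by the deformed medium into acceleration (in g)."""
    return voltage / VOLTS_PER_G

def folding_input_detected(first_sensing_voltage: float) -> bool:
    """True if the acceleration corresponding to the first sensing data exceeds the
    first threshold, i.e. a temple is currently being unfolded or folded."""
    return voltage_to_acceleration(first_sensing_voltage) > FIRST_THRESHOLD
```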
The folding input detected by the first sensor alone cannot accurately determine whether the user is putting on the smart glasses, so it is also necessary to determine whether an approach input of an external object is detected by the second sensor, for example the distance sensor 22 shown in fig. 2; such an input indicates that an external object is close to the smart glasses. The second sensor is provided because putting on the smart glasses brings the human body close to them.
In fig. 2, the distance sensor 22 is disposed in the area above the nose pads of the smart glasses. In other embodiments, however, the distance sensor may be disposed at any location on the smart glasses.
Depending on its operating principle, the distance sensor 22 may be an optical distance sensor, an infrared distance sensor, an ultrasonic distance sensor, a millimeter-wave distance sensor, or another type.
The function of the distance sensor is primarily to detect the approach of an object. The specific function of the distance sensor will be explained in detail below with reference to the following embodiments.
Optionally, detecting an approach input of an external object by the second sensor includes:
transmitting, by the second sensor, a first data amount of a target substance to an external object, and receiving, by the second sensor, a second data amount of the target substance reflected back from the external object;
Taking an infrared distance sensor as an example of the second sensor: it has an infrared emitting tube and an infrared receiving tube. The emitting tube emits infrared rays in a first data amount (infrared light being an example of the target substance); if the emitted infrared rays are reflected by some external object, the receiving tube receives them, and the amount of infrared rays it receives is the second data amount. Both the first data amount and the second data amount are greater than zero.
Determining, by the second sensor, distance information between the external object and the second sensor according to the first data amount and the second data amount;
the infrared distance sensor may determine the distance between the external object and itself from the absolute value of the difference between the two data amounts: the smaller the absolute value of the difference, the closer the object, that is, the smaller the distance information.
If the distance information is less than a second threshold, determining that an approach input of an external object is detected by the second sensor.
If the distance information is small enough, for example smaller than the second threshold, it can be determined that an approach input of an external object has been detected by the second sensor, that is, an external object is close to the smart glasses.
Other types of distance sensors identify distance information on a similar principle to the infrared distance sensor: the distance between the external object (or obstacle) and the sensor is determined from the transmitted and received amounts of some substance, which may be ultrasonic waves, light pulses, and the like.
In the embodiments of the invention, by comparing the amount of the target substance emitted by the distance sensor with the amount reflected back by the external object, the distance information between the external object and the distance sensor can be determined; if this distance information is less than the second threshold, an external object other than the smart glasses is close to the smart glasses.
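As a minimal sketch of the approach-input check, assuming (as in the infrared example above) that the distance is inferred from the absolute difference between the emitted and reflected amounts; the scaling constant and the second threshold are illustrative placeholders only.

```python
# Hypothetical sketch of the approach-input check made by the second sensor.
DISTANCE_PER_UNIT_DIFF = 0.01   # assumed scaling from |emitted - received| to metres
SECOND_THRESHOLD = 0.05         # assumed "second threshold", in metres

def estimate_distance(first_amount: float, second_amount: float) -> float:
    """Estimate the distance to the external object from the emitted amount of the
    target substance and the amount reflected back (smaller difference means closer)."""
    return abs(first_amount - second_amount) * DISTANCE_PER_UNIT_DIFF

def approach_input_detected(first_amount: float, second_amount: float) -> bool:
    """True if the resulting distance information is smaller than the second threshold."""
    return estimate_distance(first_amount, second_amount) < SECOND_THRESHOLD
```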
In addition, neither the folding input detected by the first sensor alone nor the approach input detected by the second sensor alone can accurately determine whether the user is putting on the smart glasses, because the approach input detected by the second sensor may be caused by a human body or by an object; for example, an approach input may also be detected when the smart glasses are placed on a table. It is therefore necessary to use the temperature sensor to further confirm whether the external object is a human body, so that the input data detected by the three sensors together determine whether the user is putting on the smart glasses.
The third sensor, for example the temperature sensor 23 shown in fig. 2, may be provided in the nose pad region of the smart glasses or in the region where a temple connects to the frame. The temperature sensor 23 is preferably placed on the inner side of the nose pad region (which rests against the skin near the bridge of the nose when the glasses are worn) or on the inner side of the connection region (which rests against the skin near the ear when the glasses are worn).
If the temperature data detected by the temperature sensor 23 is greater than a preset temperature threshold (for example, a third threshold), or falls within a preset temperature range, it is determined that the temperature data satisfies the first preset condition.
The third threshold may take any value such as 34, 35, or 36 degrees Celsius. The preset temperature range may be 34 to 37 degrees Celsius, which corresponds to the normal body temperature of a human; the upper limit is set at 37 degrees rather than higher because a user with a high fever should not be wearing smart glasses.
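The first preset condition on the temperature data could be sketched as follows; the numeric values simply echo the examples given above and are not mandated by the disclosure.

```python
# Hypothetical sketch of the "first preset condition" on the temperature data.
THIRD_THRESHOLD = 34.0      # degrees Celsius; 34, 35 or 36 are mentioned as example values
TEMP_RANGE = (34.0, 37.0)   # preset range corresponding to normal human body temperature

def first_preset_condition_met(temperature: float) -> bool:
    """The condition holds if the temperature exceeds the third threshold
    or falls within the preset temperature range."""
    return temperature > THIRD_THRESHOLD or TEMP_RANGE[0] <= temperature <= TEMP_RANGE[1]
```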
Then in step 101, the folding input detected by the first sensor indicates that the temples are being unfolded or folded, the approach input detected by the second sensor indicates that an external object is close to the smart glasses, and the fact that the temperature data detected by the third sensor satisfies the first preset condition indicates that the external object is very likely a human.
Combining the inputs detected by the three sensors, it can be determined that the user is putting on the smart glasses, and an external device can therefore be searched for and a communication connection established with it.
The external device may be an accessory device, such as a mobile terminal (preferably a mobile phone), for providing functions such as data processing for the smart glasses.
When searching for the external device, any communication technology may be used. For example, the smart glasses may integrate a Bluetooth module, search for surrounding Bluetooth devices over Bluetooth, and select one of the found Bluetooth devices to establish a communication connection with.
The smart glasses may also integrate a WiFi module, search for nearby external devices with WiFi hotspots through the WiFi module, and select one of the found external devices to establish a communication connection with.
The present invention is not limited to a specific manner of establishing a communication connection.
Alternatively, the external device with which the smart glasses establish the communication connection may be a device, among those found, that has been paired and connected before. In that case there is no need to unlock the display screen of the smart glasses and show a list of the found devices for the user to choose from; the smart glasses can automatically and directly connect to the previously paired device.
Optionally, step 102: unlock the display screen of the smart glasses.
If a folding input is detected by the first sensor, an approach input of an external object is detected by the second sensor, and the temperature data detected by the third sensor meets the first preset condition, the display screen of the smart glasses is unlocked.
This not only makes the connection convenient but also saves the user a manual unlocking step.
For how the three sensors detect the input data, refer to the detailed description of step 101 above, which is not repeated here. In this step, if trigger condition one (defined below) exists and the display screen of the smart glasses is in the locked state, that is, if trigger condition two exists, the display screen can be unlocked directly.
It should be noted that the invention does not limit the execution order of step 101 and step 102, as long as the respective trigger conditions are satisfied.
In this embodiment, trigger condition one, for executing the step of searching for the external device, is: a folding input is detected by the first sensor, an approach input of an external object is detected by the second sensor, and the temperature data detected by the third sensor satisfies the first preset condition.
Trigger condition two, for executing the step of unlocking the display screen of the smart glasses, is: a folding input is detected by the first sensor, an approach input of an external object is detected by the second sensor, the temperature data detected by the third sensor satisfies the first preset condition, and the display screen of the smart glasses is in the locked state.
Optionally, step 103: send an unlocking instruction to the external device.
If a folding input is detected by the first sensor, an approach input of an external object is detected by the second sensor, and the temperature data detected by the third sensor meets the first preset condition, an unlocking instruction is sent to the external device.
For how the three sensors detect the input data, refer to the detailed description of step 101 above, which is not repeated here. In this step, when trigger condition two is confirmed to exist, the smart glasses may trigger unlocking of their display screen by sending an unlocking instruction to the external device that was communicatively connected in step 101.
The unlocking instruction is used to unlock the display screen of the smart glasses; after receiving it, the external device can respond by controlling the display screen of the smart glasses to unlock.
That is, step 103 is performed after step 101. Step 102 and step 103 are two implementations of unlocking the display screen; in one embodiment they may be performed alternatively or both.
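For illustration only, the two alternative unlock paths (step 102: the glasses unlock their own screen; step 103: the glasses ask the connected accessory to do it) could be modelled roughly as below; the Glasses attributes and the send method are hypothetical and not taken from the disclosure.

```python
# Hypothetical sketch of the two unlocking implementations (steps 102 and 103).
class Glasses:
    def __init__(self, connection=None):
        self.screen_locked = True
        self.connection = connection   # communication link established in step 101

    def unlock_screen_locally(self):
        """Step 102: the glasses unlock their own display screen."""
        self.screen_locked = False

    def request_unlock_via_accessory(self):
        """Step 103: send an unlocking instruction to the connected external device,
        which then controls the display screen of the glasses to unlock."""
        if self.connection is not None:
            self.connection.send({"type": "unlock_instruction"})
```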
Alternatively, in the above embodiments, when checking whether the input data detected by the three sensors satisfy the conditions, the checks may be performed in order: first determine whether the first sensor has detected a folding input; if so, determine whether the second sensor has detected an approach input of an external object; if so, determine whether the temperature data detected by the third sensor satisfies the first preset condition. Only when all three checks pass is it determined that the input data detected by the three sensors satisfy the conditions, and the corresponding operation of searching for and connecting to the external device, or of unlocking the display screen, is performed.
Judging in the order first sensor, second sensor, third sensor matches the order of the steps a user performs when putting on glasses: the user first opens the temples, then brings the smart glasses close to the face, and finally puts them on so that the temples rest above the ears and the nose pads sit on the bridge of the nose. This improves the accuracy of determining whether the user is putting on the glasses, so that when the user does so, the external device can be connected quickly and automatically and the display screens of the glasses and of the external device can be unlocked.
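Putting the three checks together in that order might look like the sketch below, which reuses the hypothetical helper functions from the earlier sketches and stops as soon as one check fails. Only when this ordered gate passes would the operation tied to trigger condition one or two be carried out.

```python
# Hypothetical sketch of the ordered wear-detection gate of step 101.
# folding_input_detected, approach_input_detected and first_preset_condition_met
# are the illustrative helpers sketched earlier in this description.

def wearing_action_detected(first_voltage, emitted, received, temperature) -> bool:
    """Check the sensors in the order matching how glasses are put on:
    open the temples, bring the glasses to the face, rest them on nose and ears."""
    if not folding_input_detected(first_voltage):        # first sensor: folding input
        return False
    if not approach_input_detected(emitted, received):   # second sensor: approach input
        return False
    return first_preset_condition_met(temperature)       # third sensor: temperature condition
```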
In the embodiments of the invention, a folding input can be detected by the first sensor, an approach input of an external object can be detected by the second sensor, and the temperature data detected by the third sensor can be checked against the first preset condition, so the action of a user putting on the glasses can be confirmed from the data detected by the three sensors.
The input data detected by the three sensors accurately identify the scenario in which the electronic accessory needs to be connected and improve the accuracy of recognizing the action of putting on the glasses.
In addition, because the folding input, the approach input of an external object, and temperature data satisfying the first preset condition together confirm that the user is putting on the glasses, the display screens of the smart glasses and of the external device can be unlocked in time, which spares the user the trouble of unlocking the screen manually and improves unlocking convenience.
Optionally, before performing the step of searching for the external device in step 101, the method according to the embodiment of the present invention may further include an identity authentication step.
The identity authentication step may specifically be: collect first biometric information of the user, and determine whether the first biometric information matches preset biometric information.
The first biometric information may be any one or more of eyeprint feature information, iris feature information, and fingerprint feature information.
The following description takes identity authentication through eyeprints as an example:
The smart glasses shown in fig. 2 may have a built-in eyeprint authentication module, and a camera 24 may be built into the inner side (the side facing the wearer) of the frame region above the bridge of the nose; the camera 24 is communicatively connected to the eyeprint authentication module. The eyeprint authentication module may be disposed at any position in the smart glasses, which is not limited by the present invention.
The camera 24 can collect facial image information of the wearer and send it to the eyeprint authentication module. The eyeprint authentication module extracts eyeprint feature information from the facial image information and matches it against pre-stored, authenticated preset eyeprint feature information; if the extracted eyeprint feature information matches one group of the preset eyeprint feature information, it is determined that the first biometric information matches the preset biometric information.
Identity authentication through the iris is taken as the next example:
Similarly to the eyeprint authentication described above, an iris authentication module is built into the smart glasses and communicatively connected to the camera 24. The iris authentication module may be disposed at any position in the smart glasses, which is not limited by the present invention.
The camera 24 can collect facial image information of the wearer and send it to the iris authentication module. The iris authentication module extracts iris feature information from the facial image information and matches it against pre-stored, authenticated preset iris feature information; if the extracted iris feature information matches one group of the preset iris feature information, it is determined that the first biometric information matches the preset biometric information.
The following description takes identity authentication through fingerprints as an example:
The smart glasses have a built-in fingerprint authentication module, which may be disposed at any position on the temples or the frame, preferably at a position that the user's finger can touch conveniently.
When the user touches the fingerprint authentication module with a finger, the module receives fingerprint information, extracts fingerprint feature information from it, and matches it against pre-stored, authenticated preset fingerprint feature information; if the extracted fingerprint feature information matches one group of the preset fingerprint feature information, it is determined that the first biometric information matches the preset biometric information.
Optionally, after the fingerprint authentication module receives the fingerprint information, the method of this embodiment may brighten the display screen and display a fingerprint guide icon to indicate that the screen has entered the fingerprint authentication state; at this point the screen is not yet unlocked and remains in the locked state, but it is lit rather than black.
Optionally, if the fingerprint feature information does not match any group of the preset fingerprint feature information, a prompt indicating that fingerprint recognition failed may be output.
If the first biometric information matches the preset biometric information, the step of searching for the external device is executed.
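A toy sketch of this authentication gate is given below: extracted feature information is compared against a set of pre-stored, authenticated templates, and the device search is executed only on a match. The matching is deliberately simplified; real eyeprint, iris, or fingerprint matching is far more involved, and the template set is a placeholder.

```python
# Hypothetical sketch of the identity-authentication gate before the device search.
PRESET_TEMPLATES = {"enrolled-user-1", "enrolled-user-2"}   # placeholder preset biometric information

def biometrics_match(extracted_features: str) -> bool:
    """True if the extracted feature information matches one group of the
    pre-stored, authenticated preset feature information."""
    return extracted_features in PRESET_TEMPLATES

def authenticate_then_search(extracted_features: str) -> str:
    if biometrics_match(extracted_features):
        return "search for external device and connect"     # placeholder for the real step
    return "authentication failed; do nothing"
```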
It should be noted that the invention does not limit the execution order of the identity authentication step and the step of determining whether trigger condition one exists; in this embodiment, however, the step of searching for the external device is executed only after trigger condition one exists and identity authentication has passed. The embodiment therefore searches for and connects to the external device only after confirming that the user is putting on the smart glasses and that the user's identity has been authenticated, which prevents unauthorized users from using the smart glasses.
Preferably, the identity authentication step is executed after trigger condition one is confirmed to exist. This avoids a useless authentication step when trigger condition one does not exist, in which case the search for the external device would not be executed even if authentication passed; it thus improves the efficiency of connecting the smart glasses to the external device and reduces unnecessary signaling overhead on the smart glasses side.
Optionally, before the step of unlocking the display screen of the smart glasses in step 102 is performed, the method according to the embodiment of the present invention may also include the above-mentioned identity authentication step. For details, reference is made to the detailed description of the identity authentication procedure described above, which is not repeated herein.
If the first biometric information matches the preset biometric information, the step of unlocking the display screen of the smart glasses is executed.
Optionally, the display screen may be unlocked and lit, and a standby interface displayed on it.
It should be noted that the present invention does not limit the execution order of the identity authentication step and the step of determining whether trigger condition two exists.
In this embodiment, the step of unlocking the display screen may be executed only after trigger condition two exists and identity authentication has passed. The display screen is thus unlocked only when it is confirmed that the user is putting on the smart glasses, the screen is in the locked state, and the user's identity has been authenticated, which simplifies unlocking, improves unlocking efficiency, and reduces the manual operations the user has to perform.
Preferably, the identity authentication step is executed after trigger condition two is confirmed to exist. This avoids a useless authentication step when trigger condition two does not exist, in which case the display screen of the smart glasses would not be unlocked even if authentication passed, and it improves the unlocking success rate and efficiency. For example, when the data received from the three sensors indicate that the user is not currently putting on the glasses, the fingerprint authentication module will not run fingerprint authentication on received fingerprint information even if the user's finger rests on its area; this reduces accidental unlocking when the user touches the fingerprint authentication area while not using the smart glasses.
Optionally, before the step of sending the unlocking instruction to the external device in step 103 is executed, the method according to the embodiment of the present invention may also include the above-mentioned identity authentication step. For details, reference is made to the detailed description of the identity authentication procedure described above, which is not repeated herein.
If the first biometric information matches the preset biometric information, the step of sending an unlocking instruction to the external device is executed.
Optionally, the display screen may be unlocked and lit, and a standby interface displayed on it.
In this embodiment, step 103 is executed after step 101. Therefore, if the identity authentication step has already been performed before searching for the external device in step 101, it does not need to be repeated before the unlocking instruction is sent to the external device in this step; if it was not performed before the search in step 101, it must be performed before the unlocking instruction is sent.
Thus, when the user is confirmed to be putting on the smart glasses, the display screen of the smart glasses is in the locked state, the user's identity has been authenticated, and the smart glasses are communicatively connected to the external device, the embodiment triggers the sending of an unlocking instruction to the connected external device so as to unlock the display screen of the smart glasses. This simplifies unlocking, improves unlocking efficiency, and reduces the manual operations the user has to perform.
Second embodiment
Referring to fig. 3, a flowchart of a control method according to another embodiment of the present invention is shown. The smart glasses include a first sensor, a second sensor, and a third sensor; for a description of the three sensors, refer to the above embodiment, which is not repeated here. In addition, the smart glasses are provided with a touch pad, which may be disposed on the outer wall of one or both temples; as shown in fig. 2, the outer wall of one temple carries a touch pad 25. The method specifically includes the following steps:
step 201, if the first sensor detects a folding input, the second sensor detects an approach input of an external object, and the temperature data detected by the third sensor satisfies a first preset condition, searching for a candidate external device;
for specific description of the input data detected by the three sensors, reference may be made to the above embodiments, which are not described herein again.
When trigger condition one is confirmed to exist, candidate external devices may be searched for, a candidate external device being an external device that can be connected to. For example, peripheral devices that have previously been paired and connected are found through a Bluetooth module: device 1, device 2, and device 3. All three are candidate external devices.
Step 203, if the display screen of the intelligent glasses is in an unlocked state, displaying an information list of the candidate external equipment on the display screen;
the display screen may be in the unlocked state due to the execution of step 102 or step 103 in the first embodiment, or due to other unlocking operations, the display screen may be unlocked and made to be in the unlocked state.
In this step, the information of the devices 1, 2 and 3 may be displayed on the two lenses (i.e. two display screens) illustrated in fig. 2 in a list manner. The information in the list may be identification information of the device, such as ID information, name information, and the like, which is available from the smart glasses.
Optionally, taking ID information as an example, the ID information list sequentially displays ID information of the device 1, the device 2, and the device 3: A. b, C are provided. Among them, the device 1 corresponding to the ID information (i.e., a) arranged first in the ID information list is the external device to be connected for communication selected by default.
Optionally, step 204, receiving a sliding input to the touch pad;
If device 1, selected by default in the list, is not the external device the user wants to connect to, the user may slide on the touch pad 25 on the outer wall of the temple in fig. 2, so that the smart glasses receive a sliding input on the touch pad. The user can slide up or down on the touch pad 25 along the directions of the arrow shown in fig. 2.
Optionally, in step 205, in response to the sliding input, switching the selected target external device in the information list on the display screen according to the sliding direction of the sliding input;
for example, if the slide direction of the slide input is downward, the selected device is switched to the device 2 arranged behind the device 1; if the slide direction of the slide input is upward, the selected device is switched to the device 3 arranged in front of the device 1. Of course, this is merely an illustrative example, and the present invention is not limited to the relationship between the sliding direction and the switching sequence of the selected target external device.
Step 206, receiving a determination input to the touch pad;
The user may make the determination input on the touch pad in the form of a single tap, a double tap, or the like.
And step 207, responding to the determined input, and establishing communication connection with the selected target external equipment displayed on the display screen.
If steps 204 and 205 were not executed, device 1, whose ID A is listed first by default, may be determined as the target external device through steps 206 and 207, and a communication connection established between the smart glasses and device 1.
If steps 204 and 205 were executed, then after steps 206 and 207 the device selected through steps 204 and 205 becomes the target external device with which the smart glasses establish the communication connection. For example, if after steps 204 and 205 the ID displayed on the screen is B, that is, the selected target external device is device 2, the smart glasses establish a communication connection with device 2 in step 207.
When establishing the communication connection with the external device, the smart glasses may send a pairing request to the target external device; if they receive response information from the target external device agreeing to the pairing request, the communication connection with the target external device can be established.
In the embodiments of the invention, when the smart glasses find multiple candidate external devices, their information may be displayed as a list on the display screen so that the user can select the target external device to connect to; the user selects it from the list with a sliding input on the touch pad of the smart glasses, so the connection between the smart glasses and the external device matches the user's intention.
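The list-selection behaviour of steps 203 to 207 could be modelled roughly as below; the candidate IDs, the mapping of slide directions to next/previous, and the method names are illustrative assumptions, not taken from the disclosure.

```python
# Hypothetical sketch of selecting a target external device from the candidate list.
class DevicePicker:
    def __init__(self, candidate_ids):
        self.candidate_ids = candidate_ids   # e.g. ["A", "B", "C"] for device 1, 2, 3
        self.index = 0                       # the first entry is selected by default

    def on_slide(self, direction: str):
        """Switch the selected device according to the sliding direction on the touch pad."""
        if direction == "down":
            self.index = (self.index + 1) % len(self.candidate_ids)
        elif direction == "up":
            self.index = (self.index - 1) % len(self.candidate_ids)

    def on_confirm(self) -> str:
        """A tap (determination input) confirms the currently selected device,
        to which a pairing request would then be sent."""
        return self.candidate_ids[self.index]

picker = DevicePicker(["A", "B", "C"])
picker.on_slide("down")          # selection moves from device 1 (A) to device 2 (B)
selected = picker.on_confirm()   # "B": establish the communication connection with device 2
```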
Optionally, the method according to the embodiment of the present invention may further include:
when the operation that the user takes off the smart glasses from the head is detected, the communication connection between the smart glasses and the external device (or the target external device in the second embodiment) is disconnected, and the display screen of the smart glasses is set to be in a screen locking state.
Specifically, if the folding input is detected by the first sensor, the remote input of the external object is detected by the second sensor, and the temperature data detected by the third sensor does not satisfy the first preset condition, the communication connection between the smart glasses and the external device (or the target external device in the second embodiment) is disconnected, and the display screen of the smart glasses is set to the screen locking state.
The determination of the distant input may be based on a determination that the distance information between the external object and the second sensor is greater than or equal to the second threshold.
The judgment basis that the temperature data detected by the third sensor does not meet the first preset condition may be that the temperature data is less than or equal to the third threshold, or that the temperature data is no longer within a preset temperature range. It may also be that the temperature data is smaller than a fourth threshold value, which is smaller than said third threshold value.
Therefore, when the operation that the user takes off the intelligent glasses is detected, the communication connection with the electronic accessory (namely the external equipment) can be automatically disconnected, and the power consumption of the intelligent glasses is reduced.
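For symmetry with the wear-detection gate, the take-off check described above is sketched below; the helper name and the threshold values mirror the earlier illustrative sketches and are placeholders, not values from the disclosure.

```python
# Hypothetical sketch of the take-off check that disconnects the accessory.
SECOND_THRESHOLD = 0.05    # metres, as in the earlier proximity sketch
FOURTH_THRESHOLD = 32.0    # degrees Celsius; assumed to be smaller than the third threshold

def taking_off_detected(first_voltage, distance, temperature) -> bool:
    """A folding input, a moving-away input, and temperature data that no longer satisfies
    the first preset condition together indicate the glasses are being taken off."""
    folding = folding_input_detected(first_voltage)   # illustrative helper from the earlier sketch
    moved_away = distance >= SECOND_THRESHOLD         # distance information >= second threshold
    too_cold = temperature < FOURTH_THRESHOLD         # one of the possible judgement bases
    return folding and moved_away and too_cold

# When taking_off_detected(...) returns True, the glasses would disconnect from the
# external device and set their display screen to the locked state.
```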
Third embodiment
Referring to fig. 4, a block diagram of smart glasses according to one embodiment of the present invention is shown. The smart glasses of this embodiment can implement the details of the control method of the above embodiments and achieve the same effect. The smart glasses shown in fig. 4 include:
first sensor 41, second sensor 42 and third sensor 43, the smart glasses further include:
a searching module 44, configured to search for an external device and establish a communication connection with the external device if a folding input is detected by the first sensor 41, an approach input of an external object is detected by the second sensor 42, and temperature data detected by the third sensor 43 satisfies a first preset condition;
optionally, the smart glasses further comprise:
the first unlocking module is used for unlocking the display screen of the intelligent glasses;
optionally, the first unlocking module is further configured to unlock the display screen of the smart glasses if the folding input is detected by the first sensor 41, the approach input of the external object is detected by the second sensor 42, and the temperature data detected by the third sensor 43 meets a first preset condition;
optionally, the smart glasses further comprise:
and a second unlocking module, configured to send an unlocking instruction to the external device, that is, to send an unlocking instruction to the external device connected to the search module 44.
Optionally, the first sensor 41 is built in at least one temple of the smart glasses, and the smart glasses further include:
the first determining module is configured to determine that the folding input is detected by the first sensor 41 if the acceleration corresponding to the first sensing data received by the first sensor 41 is greater than a first threshold.
Optionally, the second sensor 42 is configured to transmit a first data amount of the target substance to an external object and receive a second data amount of the target substance reflected from the external object;
the second sensor 42 is configured to determine distance information between the external object and the second sensor 42 according to the first data amount and the second data amount;
optionally, the smart glasses further comprise:
a second determining module, configured to determine that an approach input of an external object is detected by the second sensor 42 if the distance information determined by the second sensor 42 is smaller than a second threshold.
Optionally, the smart glasses further comprise:
an acquisition module, configured to acquire first biometric information of a user;
the searching module 44 is further configured to search for the external device if the first biometric information matches preset biometric information;
optionally, the first unlocking module is further configured to unlock the display screen of the smart glasses if the first biometric information matches preset biometric information;
optionally, the second unlocking module is further configured to send an unlocking instruction to the external device if the first biometric information matches preset biometric information.
Optionally, the smart glasses are provided with a touch pad, and the search module 44 includes:
a search sub-module for searching for candidate external devices;
the display sub-module is used for displaying the information list of the candidate external equipment on the display screen if the display screen of the intelligent glasses is in an unlocked state;
the first receiving submodule is used for receiving sliding input of the touch pad;
the switching submodule is used for responding to the sliding input and switching the selected target external equipment in the information list on the display screen according to the sliding direction of the sliding input;
the second receiving submodule is used for receiving a determination input of the touch pad;
and the connection submodule is used for responding to the determined input and establishing communication connection with the selected target external equipment displayed on the display screen.
The smart glasses provided by this embodiment can implement each process implemented by the smart glasses in the method embodiments; to avoid repetition, the details are not described again.
The smart glasses can detect a folding input through the first sensor, an approach input of an external object through the second sensor, and temperature data meeting the first preset condition through the third sensor; the action of the user putting on the glasses can thus be confirmed from the data detected by the three sensors.
Fourth embodiment
Fig. 5 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention, where the mobile terminal may be in communication connection with smart glasses.
The smart glasses comprise a first sensor, a second sensor and a third sensor;
the intelligent glasses are used for searching the mobile terminal and establishing communication connection with the mobile terminal if the folding input is detected through the first sensor, the approach input of an external object is detected through the second sensor, and the temperature data detected through the third sensor meets a first preset condition.
According to the intelligent glasses provided by the embodiment of the invention, the folding input can be detected through the first sensor, the approach input of an external object can be detected through the second sensor, and the temperature data can be detected through the third sensor to meet the first preset condition, so that the action of wearing the glasses by a user can be confirmed through the data detected by the three sensors.
The mobile terminal 400 includes, but is not limited to: radio frequency unit 401, network module 402, audio output unit 403, input unit 404, sensor 405, display unit 406, user input unit 407, interface unit 408, memory 409, processor 410, and power supply 411. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 5 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The radio frequency unit 401 is configured to establish communication connection with the smart glasses;
it should be understood that, in the embodiment of the present invention, the radio frequency unit 401 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 410; in addition, the uplink data is transmitted to the base station. Typically, radio unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio unit 401 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 402, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 403 may convert audio data received by the radio frequency unit 401 or the network module 402 or stored in the memory 409 into an audio signal and output as sound. Also, the audio output unit 403 may also provide audio output related to a specific function performed by the mobile terminal 400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 404 is used to receive audio or video signals. The input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042. The graphics processor 4041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 406, stored in the memory 409 (or another storage medium), or transmitted via the radio frequency unit 401 or the network module 402. The microphone 4042 may receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 401 and output.
The mobile terminal 400 also includes at least one sensor 405, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 4061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 4061 and/or the backlight when the mobile terminal 400 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 406 is used to display information input by the user or information provided to the user. The Display unit 406 may include a Display panel 4061, and the Display panel 4061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 407 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. Touch panel 4071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 4071 using a finger, a stylus, or any suitable object or attachment). The touch panel 4071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 410, receives a command from the processor 410, and executes the command. In addition, the touch panel 4071 can be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 4071, the user input unit 407 may include other input devices 4072. Specifically, the other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 4071 can be overlaid on the display panel 4061. When the touch panel 4071 detects a touch operation on or near it, the touch operation is transmitted to the processor 410 to determine the type of the touch event, and the processor 410 then provides a corresponding visual output on the display panel 4061 according to the type of the touch event. Although in fig. 5 the touch panel 4071 and the display panel 4061 are shown as two separate components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 4071 and the display panel 4061 may be integrated to implement the input and output functions of the mobile terminal; this is not limited herein.
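For illustration only (the types and method names below are hypothetical and not part of the original disclosure), the touch pipeline just described, from the detection device through coordinate conversion in the touch controller to the processor's response, can be sketched as follows:

/**
 * Illustrative sketch only: a raw touch reported by the detection device is
 * converted to display coordinates by the touch controller and then handled
 * by the processor, which decides the event type and the visual output.
 */
public final class TouchPipeline {

    /** Raw reading from the touch detection device (e.g. a sensing grid cell). */
    record RawTouch(int column, int row, boolean lifted) {}

    /** Touch point in display coordinates, as produced by the touch controller. */
    record TouchPoint(int x, int y, boolean lifted) {}

    enum EventType { PRESS, RELEASE }

    /** Touch controller: map grid cells to pixels before forwarding to the processor. */
    static TouchPoint toCoordinates(RawTouch raw, int cellWidthPx, int cellHeightPx) {
        return new TouchPoint(raw.column() * cellWidthPx, raw.row() * cellHeightPx, raw.lifted());
    }

    /** Processor side: classify the event and return the visual output to draw. */
    static String handle(TouchPoint p) {
        EventType type = p.lifted() ? EventType.RELEASE : EventType.PRESS;
        return "Draw " + type + " feedback at (" + p.x() + ", " + p.y() + ")";
    }

    public static void main(String[] args) {
        RawTouch raw = new RawTouch(12, 30, false);
        TouchPoint point = toCoordinates(raw, 8, 8);   // assume 8 x 8 px sensing cells
        System.out.println(handle(point));             // Draw PRESS feedback at (96, 240)
    }
}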
The interface unit 408 is an interface through which an external device is connected to the mobile terminal 400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 408 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 400 or may be used to transmit data between the mobile terminal 400 and external devices.
The memory 409 may be used to store software programs as well as various data. The memory 409 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile terminal (such as audio data and a phonebook), and the like. Further, the memory 409 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 410 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 409 and calling data stored in the memory 409, thereby monitoring the mobile terminal as a whole. The processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 410.
The mobile terminal 400 may further include a power supply 411 (e.g., a battery) for supplying power to various components, and preferably, the power supply 411 may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the mobile terminal 400 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides smart glasses, which include a processor 410, a memory 409, and a computer program stored in the memory 409 and executable on the processor 410. When executed by the processor 410, the computer program implements each process of the above control method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
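For illustration only, a minimal sketch of the control flow such a program could follow is given below. The acceleration threshold, the assumed skin-temperature range, the proximity distance, and all helper names are hypothetical and are not taken from the disclosure; the three checks are simply combined with a logical AND, mirroring the condition that the folding input, the proximity input, and the temperature condition must all be satisfied before the external device is searched for.

/**
 * Illustrative sketch only: when a folding input (first sensor), a proximity
 * input of an external object (second sensor) and a plausible skin-temperature
 * reading (third sensor) are all present, the glasses are treated as being put
 * on, so an external device is searched for and a connection is established.
 */
public final class WearDetectionController {

    private static final float FOLD_ACCELERATION_THRESHOLD = 5.0f; // m/s^2, hypothetical first threshold
    private static final float MIN_SKIN_TEMP_C = 30.0f;            // hypothetical first preset condition
    private static final float MAX_SKIN_TEMP_C = 40.0f;
    private static final float PROXIMITY_DISTANCE_MM = 15.0f;

    /** First sensor: a folding input is assumed when the temple's acceleration exceeds the threshold. */
    static boolean foldingInputDetected(float templeAcceleration) {
        return templeAcceleration > FOLD_ACCELERATION_THRESHOLD;
    }

    /** Second sensor: proximity of an external object (e.g. the wearer's head). */
    static boolean proximityInputDetected(float distanceMillimetres) {
        return distanceMillimetres < PROXIMITY_DISTANCE_MM;
    }

    /** Third sensor: temperature data must fall within a human skin-temperature range. */
    static boolean temperatureConditionMet(float temperatureCelsius) {
        return temperatureCelsius >= MIN_SKIN_TEMP_C && temperatureCelsius <= MAX_SKIN_TEMP_C;
    }

    /** Only when all three conditions hold is the external device searched for. */
    static void onSensorUpdate(float acceleration, float distanceMm, float temperatureC) {
        if (foldingInputDetected(acceleration)
                && proximityInputDetected(distanceMm)
                && temperatureConditionMet(temperatureC)) {
            searchForExternalDeviceAndConnect();
        }
    }

    static void searchForExternalDeviceAndConnect() {
        // Placeholder: scan for candidate devices and establish a communication link.
        System.out.println("Searching for external devices and connecting...");
    }

    public static void main(String[] args) {
        onSensorUpdate(6.2f, 8.0f, 33.5f);   // all three conditions hold -> connect
    }
}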
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above control method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
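For instance, the touch-pad-driven device selection covered by the embodiments (search for candidates, list them on the unlocked display screen, slide to switch the highlighted entry, confirm to connect) is a small piece of such software. The sketch below is illustrative only; the type names, the sample device list, and the slide directions are hypothetical:

import java.util.List;

/*
 * Illustrative sketch only: candidate devices are listed on the display
 * screen, a slide input on the touch pad moves the selection forwards or
 * backwards, and a determination input connects to the highlighted device.
 */
public final class DeviceSelector {

    enum Slide { FORWARD, BACKWARD }

    private final List<String> candidates;   // information list shown on the display screen
    private int selectedIndex = 0;

    DeviceSelector(List<String> candidates) {
        this.candidates = candidates;
    }

    /** A sliding input on the touch pad switches the selected target device. */
    void onSlide(Slide direction) {
        int step = (direction == Slide.FORWARD) ? 1 : -1;
        selectedIndex = Math.floorMod(selectedIndex + step, candidates.size());
        System.out.println("Highlighted: " + candidates.get(selectedIndex));
    }

    /** A determination input establishes the connection with the highlighted device. */
    void onConfirm() {
        System.out.println("Connecting to " + candidates.get(selectedIndex) + " ...");
    }

    public static void main(String[] args) {
        DeviceSelector selector = new DeviceSelector(List.of("Phone A", "Tablet B", "Earbuds C"));
        selector.onSlide(Slide.FORWARD);   // Highlighted: Tablet B
        selector.onConfirm();              // Connecting to Tablet B ...
    }
}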
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (11)

1. A control method applied to smart glasses, wherein the smart glasses comprise a first sensor, a second sensor, and a third sensor, the method comprising:
if a folding input is detected by the first sensor, a proximity input of an external object is detected by the second sensor, and temperature data detected by the third sensor meets a first preset condition, searching for an external device and establishing a communication connection with the external device.
2. The method of claim 1, further comprising:
unlocking the display screen of the smart glasses, or sending an unlocking instruction to the external device.
3. The method of claim 1, wherein the first sensor is built into at least one temple of the smart glasses, and wherein detecting the folding input by the first sensor comprises:
determining that the folding input is detected by the first sensor if an acceleration corresponding to first sensing data received by the first sensor is greater than a first threshold.
4. The method of claim 2, wherein
before the searching for the external device, the method further comprises:
collecting first biometric information of a user; and
if the first biometric information matches preset biometric information, performing the step of searching for the external device; and
before the unlocking of the display screen of the smart glasses or before the sending of the unlocking instruction to the external device, the method further comprises:
if the first biometric information matches the preset biometric information, performing the step of unlocking the display screen of the smart glasses, or performing the step of sending the unlocking instruction to the external device.
5. The method of claim 1, wherein the smart glasses are provided with a touch pad, and wherein the searching for the external device and establishing the communication connection with the external device comprise:
searching for candidate external devices;
if the display screen of the smart glasses is in an unlocked state, displaying an information list of the candidate external devices on the display screen;
receiving a sliding input on the touch pad;
in response to the sliding input, switching the selected target external device in the information list on the display screen according to a sliding direction of the sliding input;
receiving a determination input on the touch pad; and
establishing a communication connection with the selected target external device displayed on the display screen in response to the determination input.
6. Smart glasses, comprising a first sensor, a second sensor, and a third sensor, the smart glasses further comprising:
a searching module, configured to search for an external device and establish a communication connection with the external device if a folding input is detected by the first sensor, a proximity input of an external object is detected by the second sensor, and temperature data detected by the third sensor meets a first preset condition.
7. The smart glasses of claim 6, further comprising:
a first unlocking module, configured to unlock the display screen of the smart glasses; and
a second unlocking module, configured to send an unlocking instruction to the external device.
8. The smart glasses of claim 6, wherein the first sensor is built into at least one temple of the smart glasses, the smart glasses further comprising:
a first determining module, configured to determine that a folding input is detected by the first sensor if an acceleration corresponding to first sensing data received by the first sensor is greater than a first threshold.
9. The smart glasses of claim 7, further comprising:
an acquisition module, configured to acquire first biometric information of a user;
wherein the searching module is further configured to search for the external device if the first biometric information matches preset biometric information;
the first unlocking module is further configured to unlock the display screen of the smart glasses if the first biometric information matches the preset biometric information; and
the second unlocking module is further configured to send the unlocking instruction to the external device if the first biometric information matches the preset biometric information.
10. The smart glasses of claim 6, wherein the smart glasses are provided with a touch pad, and the searching module comprises:
a searching sub-module, configured to search for candidate external devices;
a display sub-module, configured to display an information list of the candidate external devices on the display screen if the display screen of the smart glasses is in an unlocked state;
a first receiving sub-module, configured to receive a sliding input on the touch pad;
a switching sub-module, configured to switch, in response to the sliding input, the selected target external device in the information list on the display screen according to a sliding direction of the sliding input;
a second receiving sub-module, configured to receive a determination input on the touch pad; and
a connection sub-module, configured to establish, in response to the determination input, a communication connection with the selected target external device displayed on the display screen.
11. Smart glasses, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the control method according to any one of claims 1 to 5.
CN201911007754.9A 2019-10-22 2019-10-22 Control method and intelligent glasses Pending CN110825223A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911007754.9A CN110825223A (en) 2019-10-22 2019-10-22 Control method and intelligent glasses

Publications (1)

Publication Number Publication Date
CN110825223A true CN110825223A (en) 2020-02-21

Family

ID=69550097

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911007754.9A Pending CN110825223A (en) 2019-10-22 2019-10-22 Control method and intelligent glasses

Country Status (1)

Country Link
CN (1) CN110825223A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120206323A1 (en) * 2010-02-28 2012-08-16 Osterhout Group, Inc. Ar glasses with event and sensor triggered ar eyepiece interface to external devices
CN106569725A (en) * 2016-11-09 2017-04-19 北京小米移动软件有限公司 A method and device for providing input for smart glasses, and touch device
CN107664841A (en) * 2016-07-29 2018-02-06 鸿富锦精密电子(郑州)有限公司 Intelligent glasses and the method for controlling intelligent glasses dormancy awakening
CN108966198A (en) * 2018-08-30 2018-12-07 Oppo广东移动通信有限公司 Network connection method and device, intelligent glasses and storage medium
US20190025587A1 (en) * 2010-02-28 2019-01-24 Microsoft Technology Licensing, Llc Ar glasses with event and user action control of external applications
CN109946853A (en) * 2019-03-26 2019-06-28 华为技术有限公司 A kind of intelligent glasses

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022089431A1 (en) * 2020-10-30 2022-05-05 维沃移动通信(杭州)有限公司 Device control method and apparatus, and electronic device
CN112925413A (en) * 2021-02-08 2021-06-08 维沃移动通信有限公司 Augmented reality glasses and touch control method thereof
CN113126301A (en) * 2021-04-23 2021-07-16 维沃移动通信有限公司 Intelligent glasses
CN113126301B (en) * 2021-04-23 2023-08-22 维沃移动通信有限公司 Intelligent glasses
CN115499787A (en) * 2022-09-19 2022-12-20 歌尔科技有限公司 Intelligent glasses interconnection method and intelligent glasses

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200221