CN110069136A - A kind of wearing state recognition methods, equipment and computer readable storage medium - Google Patents
A kind of wearing state recognition methods, equipment and computer readable storage medium Download PDFInfo
- Publication number
- CN110069136A CN110069136A CN201910357618.6A CN201910357618A CN110069136A CN 110069136 A CN110069136 A CN 110069136A CN 201910357618 A CN201910357618 A CN 201910357618A CN 110069136 A CN110069136 A CN 110069136A
- Authority
- CN
- China
- Prior art keywords
- feature
- orientation
- wearable device
- wearing state
- reference object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Abstract
This application discloses a wearing state recognition method, a device, and a computer-readable storage medium. The method comprises: obtaining image information in the current wearing state through the camera assembly of a wearable device, and obtaining motion information in the current wearing state through the sensor assembly of the wearable device; parsing the image information to obtain a first contour feature of a first reference object and a second contour feature of a second reference object; identifying a first orientation feature of the first contour feature and a second orientation feature of the second contour feature; and finally, if the first orientation feature and the second orientation feature are both in a rotated orientation, determining the worn orientation and motion state of the wearable device in combination with the motion information. This realizes a user-friendly wearing state recognition scheme: the wearable device can adaptively determine its current wearing state without any advance setting by the user, with higher recognition accuracy and lower system resource usage.
Description
Technical field
This application relates to the field of mobile communications, and in particular to a wearing state recognition method, a device, and a computer-readable storage medium.
Background technique
With the rapid development of intelligent terminals, wearable devices distinct from conventional smartphones have emerged, such as smartwatches and smart bracelets. Compared with a traditional smartphone, a wearable device differs in particulars such as its software and hardware environment, operating mode, and operating environment. If the control scheme of a traditional smartphone were transplanted directly to a wearable device, it could be inconvenient to operate and the user experience would suffer.
Summary of the invention
To address the above technical deficiencies in the prior art, the invention proposes a wearing state recognition method. The method includes:
obtaining image information in the current wearing state through the camera assembly of a wearable device, and obtaining motion information in the current wearing state through the sensor assembly of the wearable device;
parsing the image information to obtain a first contour feature of a first reference object, and obtaining a second contour feature of a second reference object;
identifying a first orientation feature of the first contour feature, and identifying a second orientation feature of the second contour feature;
if the first orientation feature and the second orientation feature are both in a rotated orientation, determining the worn orientation and motion state of the wearable device in combination with the motion information.
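The four steps above can be sketched as a minimal illustrative pipeline. This is not the patented implementation: the names `ContourFeature`, `is_rotated`, and `recognize_wearing_state`, the tolerance and threshold values, and the rule for inferring the wrist side from the rotation sign are all assumptions made for illustration.

```python
import math
from dataclasses import dataclass

# Hypothetical data holder; the patent does not specify a feature encoding.
@dataclass
class ContourFeature:
    angle_deg: float  # orientation of the contour's principal axis

def is_rotated(feature: ContourFeature, upright_deg: float = 0.0,
               tol_deg: float = 15.0) -> bool:
    """A contour counts as 'in a rotated orientation' when its axis
    deviates from the expected upright direction by more than tol_deg."""
    return abs(feature.angle_deg - upright_deg) > tol_deg

def recognize_wearing_state(first: ContourFeature, second: ContourFeature,
                            acceleration: tuple) -> dict:
    """Both reference objects must appear rotated before the motion
    information (here a 3-axis acceleration, m/s^2) is consulted."""
    if is_rotated(first) and is_rotated(second):
        ax, ay, az = acceleration
        # Magnitude noticeably above gravity suggests movement (assumption).
        moving = math.sqrt(ax * ax + ay * ay + az * az) > 10.5
        # Wrist side inferred from the sign of the shared rotation (assumption).
        orientation = "left-wrist" if first.angle_deg > 0 else "right-wrist"
        return {"worn_orientation": orientation,
                "motion_state": "moving" if moving else "static"}
    return {"worn_orientation": "unknown", "motion_state": "unknown"}
```

For example, two contours rotated by about 90 degrees together with an acceleration reading above gravity would yield a "left-wrist", "moving" result under these assumptions.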
Optionally, obtaining the image information in the current wearing state through the camera assembly of the wearable device, and obtaining the motion information in the current wearing state through the sensor assembly of the wearable device, comprises:
determining the information-viewing orientation or a default shooting orientation of the wearable device;
determining the imaging orientation of the camera assembly according to the information-viewing orientation or the default shooting orientation.
Optionally, obtaining the image information in the current wearing state through the camera assembly of the wearable device, and obtaining the motion information in the current wearing state through the sensor assembly of the wearable device, further comprises:
obtaining the image information through the camera assembly in the imaging orientation;
obtaining the motion information in the current wearing state through the sensor assembly of the wearable device, wherein the motion information includes acceleration information.
Optionally, parsing the image information to obtain the first contour feature of the first reference object and obtaining the second contour feature of the second reference object comprises:
acquiring and parsing the image information at a preset frequency;
determining at least two reference objects that have distinctive orientation features and belong to different categories, as the first reference object and the second reference object.
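The selection rule in this claim — at least two reference objects with distinctive orientation features, belonging to different categories — can be sketched as below. The detection-record format and the function name are assumptions for illustration; the patent does not specify how detections are represented.

```python
def pick_reference_objects(detections):
    """detections: list of dicts with 'category' and 'has_unique_orientation'.
    Returns the first two detections that have a distinctive orientation
    feature and belong to different categories, or None if fewer exist."""
    chosen = []
    seen_categories = set()
    for d in detections:
        if d["has_unique_orientation"] and d["category"] not in seen_categories:
            chosen.append(d)
            seen_categories.add(d["category"])
        if len(chosen) == 2:
            return chosen[0], chosen[1]
    return None
```

Requiring different categories makes it less likely that both reference objects share a systematic orientation bias, which matches the claim's motivation for using two independent references.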
Optionally, parsing the image information to obtain the first contour feature of the first reference object and obtaining the second contour feature of the second reference object further comprises:
identifying the outer contour or body contour corresponding to each of the first reference object and the second reference object;
obtaining, through image analysis, the first contour feature of the outer contour or body contour of the first reference object, and obtaining, through image analysis, the second contour feature of the outer contour or body contour of the second reference object.
Optionally, identifying the first orientation feature of the first contour feature and identifying the second orientation feature of the second contour feature comprises:
identifying the directions of the outer contours or body contours of the first reference object and the second reference object;
obtaining the first orientation feature and the second orientation feature respectively according to those directions.
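One plausible way to turn a contour's direction into an orientation feature — not specified by the patent, so an assumption here — is the principal-axis angle of the contour points, computed in closed form from their 2x2 covariance matrix:

```python
import math

def contour_orientation_deg(points):
    """Orientation (degrees) of a contour's principal axis, via the
    closed-form eigen-direction of the 2x2 covariance of its points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    cxx = sum((p[0] - mx) ** 2 for p in points) / n
    cyy = sum((p[1] - my) ** 2 for p in points) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Angle of the dominant eigenvector of [[cxx, cxy], [cxy, cyy]].
    return math.degrees(0.5 * math.atan2(2 * cxy, cxx - cyy))
```

A horizontal run of points yields 0 degrees and a vertical run 90 degrees, so the deviation of a normally upright object (a person, a tree) from its expected angle can serve as the claimed orientation feature.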
Optionally, if the first orientation feature and the second orientation feature are both in a rotated orientation, determining the worn orientation and motion state of the wearable device in combination with the motion information comprises:
detecting the first orientation feature and the second orientation feature;
obtaining a first rotation angle from the first orientation feature, and a second rotation angle from the second orientation feature.
Optionally, if the first orientation feature and the second orientation feature are both in a rotated orientation, determining the worn orientation and motion state of the wearable device in combination with the motion information further comprises:
if the first rotation angle and the second rotation angle are in the same rotation direction and have the same rotation angle value, determining that the wearable device is in a rotated state;
determining the worn orientation and motion state of the wearable device in combination with the rotated state and the motion information.
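The rotation-consistency test in this claim can be sketched as follows. Treating "the same rotation angle value" as equality within a tolerance is an assumption on top of the claim text, since measured angles rarely match exactly:

```python
def same_rotation(angle1_deg, angle2_deg, tol_deg=5.0):
    """True when two rotation angles share a direction (sign) and agree
    in value within tol_deg, mirroring the claim's two conditions."""
    same_direction = (angle1_deg >= 0) == (angle2_deg >= 0)
    same_value = abs(angle1_deg - angle2_deg) <= tol_deg
    return same_direction and same_value
```

Demanding agreement from two independent reference objects is what lets the device attribute the rotation to itself (i.e., to how it is worn) rather than to a single tilted object in the scene.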
The invention also provides a wearing state recognition device, the device comprising:
a memory, a processor, and a computer program stored on the memory and runnable on the processor;
the computer program, when executed by the processor, implements the steps of any of the methods described above.
The invention also provides a computer-readable storage medium on which a wearing state recognition program is stored; when executed by a processor, the wearing state recognition program implements the steps of the wearing state recognition method described above.
The beneficial effects of the present invention are as follows: image information in the current wearing state is obtained through the camera assembly of the wearable device, and motion information in the current wearing state is obtained through the sensor assembly; the image information is then parsed to obtain the first contour feature of the first reference object and the second contour feature of the second reference object; next, the first orientation feature of the first contour feature and the second orientation feature of the second contour feature are identified; finally, if both orientation features are in a rotated orientation, the worn orientation and motion state of the wearable device are determined in combination with the motion information. This realizes a user-friendly wearing state recognition scheme in which the wearable device can adaptively determine its current wearing state without any advance setting by the user, with higher recognition accuracy and lower system resource usage.
Detailed description of the invention
The drawings herein are incorporated into and form part of this specification; they show embodiments of the invention and, together with the description, serve to explain its principles.
To explain the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is evident that those of ordinary skill in the art could obtain other drawings from these drawings without any creative effort.
Fig. 1 is a hardware structural diagram of an embodiment of a wearable device provided by an embodiment of the present invention;
Fig. 2 is a hardware schematic of an embodiment of a wearable device provided by an embodiment of the present application;
Fig. 3 is a hardware schematic of an embodiment of a wearable device provided by an embodiment of the present application;
Fig. 4 is a hardware schematic of an embodiment of a wearable device provided by an embodiment of the present application;
Fig. 5 is a hardware schematic of an embodiment of a wearable device provided by an embodiment of the present application;
Fig. 6 is a flowchart of the first embodiment of the wearing state recognition method of the present invention;
Fig. 7 is a flowchart of the second embodiment of the wearing state recognition method of the present invention;
Fig. 8 is a flowchart of the third embodiment of the wearing state recognition method of the present invention;
Fig. 9 is a flowchart of the fourth embodiment of the wearing state recognition method of the present invention;
Fig. 10 is a flowchart of the fifth embodiment of the wearing state recognition method of the present invention;
Fig. 11 is a flowchart of the sixth embodiment of the wearing state recognition method of the present invention;
Fig. 12 is a flowchart of the seventh embodiment of the wearing state recognition method of the present invention;
Fig. 13 is a flowchart of the eighth embodiment of the wearing state recognition method of the present invention.
Specific embodiment
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements serve only to facilitate the description of the invention and have no specific meaning in themselves. Therefore, "module", "component", and "unit" may be used interchangeably.
The wearable devices provided in the embodiments of the present invention include mobile terminals such as smart bracelets, smartwatches, and smartphones. With the continuous development of screen technology and the emergence of screen forms such as flexible screens and folding screens, mobile terminals such as smartphones can also serve as wearable devices. The wearable device provided in the embodiments of the present invention may include components such as an RF (Radio Frequency) unit, a WiFi module, an audio output unit, an A/V (audio/video) input unit, sensors, a display unit, a user input unit, an interface unit, a memory, a processor, and a power supply.
The following description takes a wearable device as an example. Referring to Fig. 1, a hardware structural diagram of a wearable device for realizing the various embodiments of the present invention, the wearable device 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will understand that the wearable device structure shown in Fig. 1 does not limit the wearable device, which may include more or fewer components than illustrated, combine certain components, or arrange the components differently.
Each component of the wearable device is introduced in detail below with reference to Fig. 1:
The radio frequency unit 101 can be used for sending and receiving signals during the transmission and reception of information or during a call. Specifically, the radio frequency unit 101 can send uplink information to a base station, and can also receive downlink information sent by the base station and forward it to the processor 110 of the wearable device for processing. The downlink information that the base station sends to the radio frequency unit 101 may be generated in response to uplink information sent by the radio frequency unit 101, or may be actively pushed after the base station detects that information about the wearable device has been updated. For example, after detecting that the geographical location of the wearable device has changed, the base station can send a message notification of the geographical location change to the radio frequency unit 101 of the wearable device; after receiving the message notification, the radio frequency unit 101 can forward it to the processor 110 for processing, and the processor 110 can control the message notification to be shown on the display panel 1061 of the wearable device. In general, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with the network and other devices through wireless communication, which may specifically include communicating with a server in a network system. For example, the wearable device can download file resources from the server through wireless communication, such as downloading an application program; after the wearable device completes the download of an application program, if the file resources corresponding to that application program on the server are updated, the server can push a resource update notification to the wearable device through wireless communication to remind the user to update the application program. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), and TDD-LTE (Time Division Duplexing-Long Term Evolution).
In one embodiment, the wearable device 100 can access an existing communication network by inserting a SIM card.
In another embodiment, the wearable device 100 can access an existing communication network through an eSIM (Embedded-SIM); using an eSIM saves internal space in the wearable device and reduces its thickness.
It can be understood that although Fig. 1 shows the radio frequency unit 101, it is not an essential component of the wearable device and can be omitted entirely as needed within the scope that does not change the essence of the invention. The wearable device 100 can instead realize a communication connection with other devices or a communication network solely through the WiFi module 102; the embodiments of the present invention are not limited in this respect.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the wearable device can help the user send and receive e-mail, browse web pages, and access streaming media, providing the user with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it is not an essential component of the wearable device and can be omitted entirely as needed within the scope that does not change the essence of the invention.
When the wearable device 100 is in a mode such as call signal reception mode, call mode, recording mode, speech recognition mode, or broadcast reception mode, the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the wearable device 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of static pictures or video obtained by an image capture apparatus (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or sent via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in operating modes such as telephone call mode, recording mode, and speech recognition mode, and can process such sound into audio data. In telephone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
In one embodiment, the wearable device 100 includes one or more cameras. By turning on a camera, images can be captured, realizing functions such as photographing and video recording. The position of each camera can be configured as needed.
The wearable device 100 further includes at least one sensor 105, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensors include an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or its backlight when the wearable device 100 is moved close to the ear. As a kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and, when static, can detect the magnitude and direction of gravity; it can be used to identify device posture (for example, landscape/portrait switching, related games, and magnetometer pose calibration) and for vibration-recognition-related functions (such as a pedometer or tap detection).
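As context for how a static accelerometer reading yields a gravity direction, here is a hedged sketch (not taken from the patent) of computing the device's tilt relative to gravity from a three-axis reading:

```python
import math

def gravity_tilt_deg(ax, ay, az):
    """Angle (degrees) between the device's z-axis and the gravity vector,
    from a static three-axis accelerometer reading in m/s^2."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    # Clamp guards against rounding slightly outside acos's domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))
```

A reading of (0, 0, 9.8) gives a tilt of 0 degrees (screen facing straight up), while (9.8, 0, 0) gives 90 degrees, which is the kind of signal a wrist-posture heuristic can build on.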
In one embodiment, the wearable device 100 further includes a proximity sensor. By using the proximity sensor, the wearable device can realize non-contact manipulation, providing more modes of operation.
In one embodiment, the wearable device 100 further includes a heart rate sensor which, being close to the user when worn, can detect the user's heart rate.
In one embodiment, the wearable device 100 may also include a fingerprint sensor; by reading fingerprints, functions such as security verification can be realized.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, or the like.
In one embodiment, the display panel 1061 uses a flexible display screen. A wearable device with a flexible display screen can bend when worn, fitting the wearer more closely. Optionally, the flexible display screen may use an OLED screen body or a graphene screen body; in other embodiments, the flexible display screen may also use other display materials, and the present embodiment is not limited in this respect.
In one embodiment, the display panel 1061 of the wearable device may take a rectangular shape, making it convenient to wrap around when worn. In other embodiments, other shapes may also be adopted.
The user input unit 107 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the wearable device. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, collects the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory) and drives the corresponding connection apparatus according to a preset program. The touch panel 1071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the user's touch orientation and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 can be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys or a power key), a trackball, a mouse, and a joystick, which is not limited here.
In one embodiment, one or more buttons may be provided on the side of the wearable device 100. A button can be operated in various ways, such as a short press, a long press, or rotation, to realize a variety of operating effects. The number of buttons may vary, and multiple buttons can be used in combination to realize a variety of operating functions.
Further, the touch panel 1071 can cover the display panel 1061. After detecting a touch operation on or near it, the touch panel 1071 transmits it to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 realize the input and output functions of the wearable device as two independent components, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the wearable device, which is not limited here. For example, when receiving a message notification of an application program through the radio frequency unit 101, the processor 110 can control the message notification to be displayed in a certain preset area of the display panel 1061, the preset area corresponding to a certain area of the touch panel 1071; by performing a touch operation on that area of the touch panel 1071, the message notification displayed in the corresponding area of the display panel 1061 can be controlled.
The interface unit 108 serves as an interface through which at least one external device can connect with the wearable device 100. For example, the external device may include a wired or wireless headphone port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 can be used to receive input from an external device (for example, data information or electric power) and transfer the received input to one or more elements in the wearable device 100, or to transmit data between the wearable device 100 and the external device.
In one embodiment, the interface unit 108 of the wearable device 100 adopts a contact structure that connects with the corresponding contacts of other devices to realize functions such as charging and data connection. The contacts may also be waterproof.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area can store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area can store data created according to the use of the device (such as audio data, a phone book, etc.), and the like. In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the wearable device. It connects the various parts of the entire wearable device using various interfaces and lines, and executes the various functions of the wearable device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the wearable device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, wherein the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
The wearable device 100 may also include a power supply 111 (such as a battery) that supplies power to the various components. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to realize functions such as charging management, discharging management, and power consumption management through the power management system.
Although not shown in Fig. 1, the wearable device 100 may also include a Bluetooth module and the like, which will not be described in detail here. Through Bluetooth, the wearable device 100 can connect with other terminal devices to realize communication and information interaction.
Referring to Fig. 2 to Fig. 4, these are structural schematic diagrams of an embodiment of a wearable device provided by an embodiment of the present invention. The wearable device in the embodiment of the present invention includes a flexible screen. When the wearable device is unfolded, the flexible screen is strip-shaped; when the wearable device is in a wearing state, the flexible screen is bent into a ring shape. Fig. 2 and Fig. 3 show structural schematic diagrams of the wearable device with the screen unfolded, and Fig. 4 shows a structural schematic diagram of the wearable device with the screen bent.
Based on the above embodiments, it can be seen that if the device is a watch, a bracelet, or another wearable device, the screen of the device may or may not cover the strap area of the device. Here, this application proposes an optional embodiment. In this embodiment, the device may be a watch, a bracelet, or another wearable device, and the device includes a screen and a connecting part. The screen may be a flexible screen, and the connecting part may be a strap. Optionally, the screen of the device, or the display area of the screen, may be partly or completely covered on the strap of the device. As shown in Fig. 5, which is a hardware schematic diagram of an embodiment of a wearable device provided by an embodiment of this application, the screen of the device extends to both sides and is partly covered on the strap of the device. In other embodiments, the screen of the device may also be completely covered on the strap of the device, and the embodiment of this application is not limited thereto.
Embodiment one
Fig. 6 is a flowchart of a first embodiment of the wearing state recognition method of the present invention. The wearing state recognition method includes:
S1: obtaining image information in a current wearing state through a camera assembly of the wearable device, and obtaining motion information in the current wearing state through a sensor assembly of the wearable device;
S2: parsing the image information to obtain a first contour feature of a first reference object and, at the same time, a second contour feature of a second reference object;
S3: identifying a first orientation feature of the first contour feature, and identifying a second orientation feature of the second contour feature;
S4: if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, determining the wearing orientation and motion state of the wearable device in combination with the motion information.
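As an illustrative sketch only (the patent does not specify an implementation), the decision logic of steps S1 to S4 might look like the following Python, where `is_rotated`, `recognize_wearing_state`, and the 10-degree tolerance are hypothetical names and values, not part of the patent:

```python
# Hypothetical orientation features: each is the angle (in degrees) by which
# a reference object's contour deviates from its expected upright direction.
def is_rotated(orientation_deg, tolerance_deg=10.0):
    """An orientation feature counts as 'rotated' when it deviates from
    the expected upright direction by more than the tolerance."""
    return abs(orientation_deg) > tolerance_deg

def recognize_wearing_state(first_orient, second_orient, motion):
    """Steps S3-S4: if BOTH orientation features are rotated, combine
    them with the motion information to classify the wearing state.
    `motion` is a dict with a 'swing' flag from the sensor assembly."""
    if is_rotated(first_orient) and is_rotated(second_orient):
        state = "walking_or_running" if motion.get("swing") else "arm_lowered"
        return {"worn_on": "wrist_or_arm", "state": state}
    return {"worn_on": "unknown", "state": "upright_display"}

# A tree trunk and a building edge both appear ~90 degrees from vertical,
# and the accelerometer reports regular swinging: arm swing while walking.
result = recognize_wearing_state(90.0, 88.0, {"swing": True})
```

The key design point, per S4, is that a single rotated object is not trusted on its own; only when both independent reference objects agree is the motion information consulted.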
In this embodiment, firstly, image information in the current wearing state is obtained through the camera assembly of the wearable device, and motion information in the current wearing state is obtained through the sensor assembly of the wearable device; then, the image information is parsed to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object; next, the first orientation feature of the first contour feature is identified, and the second orientation feature of the second contour feature is identified; finally, if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, the wearing orientation and motion state of the wearable device are determined in combination with the motion information.
Optionally, in this embodiment, the motion state of the wearable device is obtained; for example, it is determined according to the motion state whether the wearable device is in a static state or a motion state, and when it is in a motion state, whether the motion is a swinging motion of a device worn on the wrist, a hand-raising motion of a device worn on the wrist, or the like;
Optionally, in this embodiment, the camera assembly is arranged at one or more positions on the wearable device, and image information is obtained through the camera assembly at a preset frequency;
Optionally, image information is obtained through the camera assembly when the above motion state changes. Cases in which the motion state changes include, but are not limited to, switching from a regular motion state to another motion state, or switching from a static state to a motion state;
Optionally, the image information is parsed to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object, wherein the first reference object and the second reference object are objects in the external environment that have a fixed orientation, for example, trees, buildings, pets, clouds, seawater, and the like;
Optionally, this embodiment is explained by taking two reference objects as an example; it can be understood that this embodiment is equally applicable to two or more reference objects;
Optionally, in this embodiment, the two reference objects belong to different types of objects, so as to improve the accuracy of recognition;
Optionally, the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object are obtained, wherein a contour feature refers to the body contour or exterior contour of the object, for example, the trunk of a tree, the exterior wall of a building, the horizon line between sea and sky, and the like;
Optionally, the first orientation feature of the first contour feature is identified, and the second orientation feature of the second contour feature is identified, wherein an orientation feature is used to indicate the direction of a reference object; for example, it is determined from the trunk of a tree that the orientation feature is vertically upward, and it is determined from the horizon line between sea and sky that the orientation feature is horizontal, and so on;
Optionally, if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, the wearing orientation and motion state of the wearable device are determined in combination with the motion information. Specifically, for example, if it is recognized that the trunk of a tree is horizontal and swings regularly while horizontal, it is determined that the current wearing orientation of the wearable device is wrist or arm wearing, and that the wearer is in a walking or running state.
The beneficial effect of this embodiment is that image information in the current wearing state is obtained through the camera assembly of the wearable device, and motion information in the current wearing state is obtained through the sensor assembly of the wearable device; then, the image information is parsed to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object; next, the first orientation feature of the first contour feature is identified, and the second orientation feature of the second contour feature is identified; finally, if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, the wearing orientation and motion state of the wearable device are determined in combination with the motion information. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment two
Fig. 7 is a flowchart of a second embodiment of the wearing state recognition method of the present invention. Based on the above embodiment, optionally, obtaining the image information in the current wearing state through the camera assembly of the wearable device and obtaining the motion information in the current wearing state through the sensor assembly of the wearable device includes:
S11: determining an information viewing orientation or a default shooting orientation of the wearable device;
S12: determining an imaging orientation of the camera assembly according to the information viewing orientation or the default shooting orientation.
In this embodiment, firstly, the information viewing orientation or default shooting orientation of the wearable device is determined; then, the imaging orientation of the camera assembly is determined according to the information viewing orientation or the default shooting orientation.
Optionally, the display orientation of information in the display area while the information is being viewed is taken as the default display orientation;
Optionally, after the camera of the wearable device is started, the display orientation of the shooting preview image is taken as the above default display orientation;
Optionally, the imaging orientation of the camera assembly, that is, the direction of the photosensitive sensor, is associated with the above default display orientation, so that after a shooting preview image is obtained through the camera assembly, the preview image is displayed upright in the above display area.
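One minimal way to read this association, offered only as a sketch under assumptions (the patent gives no formula, and both the function name and the clockwise-rotation convention are hypothetical), is that the sensor read-out must be rotated by the difference between the default display orientation and the raw preview orientation:

```python
def imaging_orientation(default_display_deg, preview_rotation_deg):
    """Return the sensor read-out rotation (degrees, clockwise) that makes
    the shooting preview appear upright in the display area. Both inputs
    are clockwise rotations measured relative to the device body."""
    return (default_display_deg - preview_rotation_deg) % 360

# If information is viewed at 90 degrees and the raw preview comes out at
# 0 degrees, the sensor output must be rotated by 90 degrees to display
# upright; this mirrors how camera stacks commonly compensate orientation.
```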
The beneficial effect of this embodiment is that the information viewing orientation or default shooting orientation of the wearable device is determined; then, the imaging orientation of the camera assembly is determined according to the information viewing orientation or the default shooting orientation. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment three
Fig. 8 is a flowchart of a third embodiment of the wearing state recognition method of the present invention. Based on the above embodiment, optionally, obtaining the image information in the current wearing state through the camera assembly of the wearable device and obtaining the motion information in the current wearing state through the sensor assembly of the wearable device further includes:
S13: obtaining the image information through the camera assembly in the imaging orientation;
S14: obtaining the motion information in the current wearing state through the sensor assembly of the wearable device, wherein the motion information includes acceleration information.
In this embodiment, firstly, the image information is obtained through the camera assembly in the imaging orientation; then, the motion information in the current wearing state is obtained through the sensor assembly of the wearable device, wherein the motion information includes acceleration information.
Optionally, in order to make reasonable use of the processing resources of the wearable device, the image information is obtained through the camera assembly when motion information with a preset feature is recognized;
Optionally, in order to make reasonable use of the processing resources of the wearable device, the image information is obtained through the camera assembly when a change in the light around the wearable device is recognized;
Optionally, in order to make reasonable use of the processing resources of the wearable device, the image information is obtained through the camera assembly when a change in the contact temperature between the wearable device and the body surface is recognized.
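The three capture triggers above can be sketched as a single gating function. This is an illustrative assumption of how the gate might be coded; the threshold values are placeholders, not values given in the patent:

```python
def should_capture(motion_has_preset_feature=False,
                   light_delta=0.0, temp_delta=0.0,
                   light_threshold=50.0, temp_threshold=1.0):
    """Trigger an image capture only when one of the three conditions
    from this embodiment holds: preset motion feature, a sufficiently
    large ambient-light change, or a sufficiently large contact-temperature
    change. Conserves processing resources by skipping capture otherwise."""
    return (motion_has_preset_feature
            or abs(light_delta) > light_threshold
            or abs(temp_delta) > temp_threshold)
```

The design choice here is that the camera, the most expensive sensor, runs only when a cheap sensor (accelerometer, light sensor, temperature sensor) suggests the wearing state may have changed.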
The beneficial effect of this embodiment is that the image information is obtained through the camera assembly in the imaging orientation; then, the motion information in the current wearing state is obtained through the sensor assembly of the wearable device, wherein the motion information includes acceleration information. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment four
Fig. 9 is a flowchart of a fourth embodiment of the wearing state recognition method of the present invention. Based on the above embodiment, optionally, parsing the image information to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object includes:
S21: obtaining and parsing the image information at a preset frequency;
S22: determining at least two reference objects that have unique orientation features and belong to different categories, as the first reference object and the second reference object.
In this embodiment, firstly, the image information is obtained and parsed at a preset frequency; then, at least two reference objects that have unique orientation features and belong to different categories are determined as the first reference object and the second reference object.
Optionally, when one of the above trigger conditions is recognized, for example, when motion information with a preset feature is recognized, or when a change in the light around the wearable device or in the contact temperature between the wearable device and the body surface is recognized, obtaining and parsing the above image information at the preset frequency is started;
Optionally, a reference object may be a static object, for example, a tree, a building, etc.;
Optionally, a reference object may be a moving object, for example, a car in the course of driving, a changing signal indicator light, etc.;
Optionally, in this embodiment, having a unique orientation feature means that the direction of the reference object can only be one direction; for example, no matter how the image is tilted, a traffic light is always actually in a horizontal state, and likewise, no matter when and where, the horizon line between sea and sky is always in a horizontal state.
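The selection rule of step S22 — keep only objects with a unique orientation feature, and require the two chosen objects to come from different categories — can be sketched as follows. The detection-dict format and the function name are hypothetical conveniences, not part of the patent:

```python
def select_reference_objects(detections):
    """S22: pick the first two detected objects that both have a unique
    orientation feature and belong to different categories.
    `detections` is a list of dicts like
    {'category': 'tree', 'unique_orientation': True}."""
    chosen = []
    for obj in detections:
        if not obj.get("unique_orientation"):
            continue  # ambiguous direction: unusable as a reference
        if any(obj["category"] == c["category"] for c in chosen):
            continue  # same category as an already-chosen object: skip
        chosen.append(obj)
        if len(chosen) == 2:
            return chosen[0], chosen[1]
    return None  # fewer than two suitable reference objects in frame

detections = [
    {"category": "tree", "unique_orientation": True},
    {"category": "tree", "unique_orientation": True},    # same category, skipped
    {"category": "cloud", "unique_orientation": False},  # ambiguous, skipped
    {"category": "building", "unique_orientation": True},
]
first, second = select_reference_objects(detections)
```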
The beneficial effect of this embodiment is that the image information is obtained and parsed at a preset frequency; then, at least two reference objects that have unique orientation features and belong to different categories are determined as the first reference object and the second reference object. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment five
Figure 10 is a flowchart of a fifth embodiment of the wearing state recognition method of the present invention. Based on the above embodiment, optionally, parsing the image information to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object further includes:
S23: identifying the exterior contour or body contour corresponding to each of the first reference object and the second reference object;
S24: obtaining, through image analysis, the first contour feature of the exterior contour or body contour of the first reference object, and obtaining, through image analysis, the second contour feature of the exterior contour or body contour of the second reference object.
In this embodiment, firstly, the exterior contour or body contour corresponding to each of the first reference object and the second reference object is identified; then, through image analysis, the first contour feature of the exterior contour or body contour of the first reference object is obtained, and, through image analysis, the second contour feature of the exterior contour or body contour of the second reference object is obtained.
Optionally, as in the examples above, the corresponding exterior contour or body contour is chosen for each type of reference object, for example, the trunk of a tree, the columns of a building, etc.;
Optionally, through image analysis, the first contour feature of the exterior contour or body contour of the first reference object is obtained, and, through image analysis, the second contour feature of the exterior contour or body contour of the second reference object is obtained; for example, the trunk of a tree has a line-type contour feature, and likewise the column of a building has a straight-line contour. The purpose of this embodiment is to abstract the reference objects appropriately, so as to facilitate the determination of orientation or direction.
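One common way to abstract a line-type contour into a single orientation value, shown here purely as a simplified stand-in for the patent's unspecified "image analysis", is a principal-axis fit of the contour pixels (the function name and the 0°=horizontal, 90°=vertical convention are assumptions):

```python
import math

def contour_orientation(points):
    """Abstract a contour (list of (x, y) pixels) into a single line-type
    orientation feature: the angle of its principal direction in degrees,
    0 = horizontal, 90 = vertical. Uses the principal axis of the point
    scatter about the centroid."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points)
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points)
    syy = sum((p[1] - cy) ** 2 for p in points)
    # Principal-axis angle of the 2x2 scatter matrix.
    angle = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return math.degrees(angle) % 180

# A vertical tree trunk sampled as a column of pixels:
trunk = [(100, y) for y in range(0, 50)]
# The sea horizon sampled as a row of pixels:
horizon = [(x, 20) for x in range(0, 200)]
```

This is exactly the "appropriate abstraction" the embodiment describes: a trunk or column collapses to a number near 90, a horizon to a number near 0, and those numbers are what the later embodiments compare.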
The beneficial effect of this embodiment is that the exterior contour or body contour corresponding to each of the first reference object and the second reference object is identified; then, through image analysis, the first contour feature of the exterior contour or body contour of the first reference object is obtained, and the second contour feature of the exterior contour or body contour of the second reference object is obtained. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment six
Figure 11 is a flowchart of a sixth embodiment of the wearing state recognition method of the present invention. Based on the above embodiment, optionally, identifying the first orientation feature of the first contour feature and identifying the second orientation feature of the second contour feature includes:
S31: identifying the direction of the exterior contour or body contour of each of the first reference object and the second reference object;
S32: obtaining the first orientation feature and the second orientation feature respectively according to the directions.
In this embodiment, firstly, the direction of the exterior contour or body contour of each of the first reference object and the second reference object is identified; then, the first orientation feature and the second orientation feature are obtained respectively according to the directions.
Optionally, if the body of a reference object is indistinct, another reference object belonging to the same type is determined instead; for example, when the foliage of the selected tree is so dense that the trunk cannot easily be distinguished, another tree is chosen as the reference object of this embodiment;
Optionally, reference objects of preset categories are set, and reference objects are identified under those categories.
The beneficial effect of this embodiment is that the direction of the exterior contour or body contour of each of the first reference object and the second reference object is identified; then, the first orientation feature and the second orientation feature are obtained respectively according to the directions. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment seven
Figure 12 is a flowchart of a seventh embodiment of the wearing state recognition method of the present invention. Based on the above embodiment, optionally, if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, determining the wearing orientation and motion state of the wearable device in combination with the motion information includes:
S41: detecting the first orientation feature and the second orientation feature;
S42: obtaining a first rotation angle according to the first orientation feature, and obtaining a second rotation angle according to the second orientation feature.
In this embodiment, firstly, the first orientation feature and the second orientation feature are detected; then, the first rotation angle is obtained according to the first orientation feature, and the second rotation angle is obtained according to the second orientation feature.
Optionally, when the wearable device is worn on the wrist and its display area is displayed upright while the user raises the wrist to check it, then when the user drops the arm, objects recognized as vertically upward, such as trees and buildings, appear perpendicular to the previous upright display direction, that is, the corresponding rotation angle is determined to be 90 degrees.
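The 90-degree example above can be written as a small angle computation. This is a sketch under assumptions: the baseline of 90 degrees (a vertical object during upright display, in the 0°=horizontal convention) and the function name are illustrative, and the normalization to (-90, 90] simply keeps a line's angle unambiguous:

```python
def rotation_angle(current_deg, baseline_deg=90.0):
    """S42: the rotation angle of an orientation feature is its signed
    deviation from the baseline direction observed during upright display.
    Normalized to the range (-90, 90], since a line at t degrees is the
    same line as one at t + 180 degrees."""
    delta = (current_deg - baseline_deg) % 180
    return delta - 180 if delta > 90 else delta

# Arm dropped: a tree trunk that appeared vertical (90 deg) during upright
# display now appears horizontal (0 deg), i.e. rotated by 90 degrees.
```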
The beneficial effect of this embodiment is that the first orientation feature and the second orientation feature are detected; then, the first rotation angle is obtained according to the first orientation feature, and the second rotation angle is obtained according to the second orientation feature. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment eight
Figure 13 is a flowchart of an eighth embodiment of the wearing state recognition method of the present invention. Based on the above embodiment, optionally, if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, determining the wearing orientation and motion state of the wearable device in combination with the motion information further includes:
S43: if the first rotation angle and the second rotation angle are in the same rotation direction and have the same rotation angle value, determining that the wearable device is in a rotation state;
S44: determining the wearing orientation and motion state of the wearable device in combination with the rotation state and the motion information.
In this embodiment, firstly, if the first rotation angle and the second rotation angle are in the same rotation direction and have the same rotation angle value, it is determined that the wearable device is in a rotation state; then, the wearing orientation and motion state of the wearable device are determined in combination with the rotation state and the motion information.
Optionally, this embodiment further includes identifying information such as the height of a tree or the height of a desk, so as to determine the wearing height of the wearable device and further accurately judge the current wearing orientation; for example, when the device is worn on the wrist, its height is close to that of a desk, and when it is worn on the arm, it is higher than the desk.
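Steps S43 and S44 can be sketched as the following decision, offered as an illustration only: the agreement tolerance, the function names, and the state labels are hypothetical, since the patent states the rule qualitatively:

```python
def in_rotation_state(angle1, angle2, tolerance=5.0):
    """S43: the device is deemed to be in a rotation state only when both
    rotation angles share a direction (same sign) and have the same
    magnitude (within a tolerance)."""
    same_direction = (angle1 >= 0) == (angle2 >= 0)
    same_value = abs(abs(angle1) - abs(angle2)) <= tolerance
    return same_direction and same_value

def classify(angle1, angle2, motion_swinging):
    """S44: combine the rotation state with the sensor motion information
    to output the wearing orientation and motion state."""
    if in_rotation_state(angle1, angle2):
        return "wrist_or_arm_walking" if motion_swinging else "wrist_or_arm_still"
    return "undetermined"
```

Requiring both angles to agree is what makes the scheme robust: if only one reference object appears rotated, the discrepancy is more likely a mis-detected contour than a real rotation of the device.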
The beneficial effect of this embodiment is that if the first rotation angle and the second rotation angle are in the same rotation direction and have the same rotation angle value, it is determined that the wearable device is in a rotation state; then, the wearing orientation and motion state of the wearable device are determined in combination with the rotation state and the motion information. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment nine
Based on the above embodiments, the present invention also provides a wearing state recognition device, the device including: a memory, a processor, and a computer program stored on the memory and executable on the processor; when the computer program is executed by the processor, the steps of the method as described in any of the above embodiments are realized.
Specifically, in this embodiment, firstly, image information in the current wearing state is obtained through the camera assembly of the wearable device, and motion information in the current wearing state is obtained through the sensor assembly of the wearable device; then, the image information is parsed to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object; next, the first orientation feature of the first contour feature is identified, and the second orientation feature of the second contour feature is identified; finally, if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, the wearing orientation and motion state of the wearable device are determined in combination with the motion information.
Optionally, in this embodiment, the motion state of the wearable device is obtained; for example, it is determined according to the motion state whether the wearable device is in a static state or a motion state, and when it is in a motion state, whether the motion is a swinging motion of a device worn on the wrist, a hand-raising motion of a device worn on the wrist, or the like;
Optionally, in this embodiment, the camera assembly is arranged at one or more positions on the wearable device, and image information is obtained through the camera assembly at a preset frequency;
Optionally, image information is obtained through the camera assembly when the above motion state changes. Cases in which the motion state changes include, but are not limited to, switching from a regular motion state to another motion state, or switching from a static state to a motion state;
Optionally, the image information is parsed to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object, wherein the first reference object and the second reference object are objects in the external environment that have a fixed orientation, for example, trees, buildings, pets, clouds, seawater, and the like;
Optionally, this embodiment is explained by taking two reference objects as an example; it can be understood that this embodiment is equally applicable to two or more reference objects;
Optionally, in this embodiment, the two reference objects belong to different types of objects, so as to improve the accuracy of recognition;
Optionally, the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object are obtained, wherein a contour feature refers to the body contour or exterior contour of the object, for example, the trunk of a tree, the exterior wall of a building, the horizon line between sea and sky, and the like;
Optionally, the first orientation feature of the first contour feature is identified, and the second orientation feature of the second contour feature is identified, wherein an orientation feature is used to indicate the direction of a reference object; for example, it is determined from the trunk of a tree that the orientation feature is vertically upward, and it is determined from the horizon line between sea and sky that the orientation feature is horizontal, and so on;
Optionally, if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, the wearing orientation and motion state of the wearable device are determined in combination with the motion information. Specifically, for example, if it is recognized that the trunk of a tree is horizontal and swings regularly while horizontal, it is determined that the current wearing orientation of the wearable device is wrist or arm wearing, and that the wearer is in a walking or running state.
The beneficial effect of this embodiment is that image information in the current wearing state is obtained through the camera assembly of the wearable device, and motion information in the current wearing state is obtained through the sensor assembly of the wearable device; then, the image information is parsed to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object; next, the first orientation feature of the first contour feature is identified, and the second orientation feature of the second contour feature is identified; finally, if the first orientation feature and the second orientation feature are simultaneously in a rotated orientation, the wearing orientation and motion state of the wearable device are determined in combination with the motion information. A humanized wearing state recognition scheme is thereby realized, so that the wearable device can adaptively determine the current wearing state without requiring the user to set it in advance; the accuracy of recognition is higher, and system resource occupancy is lower.
Embodiment ten
Based on the above embodiments, the present invention also provides a computer-readable storage medium. A wearing state recognition program is stored on the computer-readable storage medium, and when the wearing state recognition program is executed by a processor, the steps of the wearing state recognition method as described in any of the above embodiments are realized.
By implementing the wearing state recognition method, device, and computer-readable storage medium of the present invention, image information under the current wearing state is obtained through the camera assembly of the wearable device, and motion information under the current wearing state is obtained through the sensing assembly of the wearable device. The image information is then parsed to obtain a first contour feature of a first reference object and, at the same time, a second contour feature of a second reference object. Next, a first orientation feature of the first contour feature and a second orientation feature of the second contour feature are identified. Finally, if the first orientation feature and the second orientation feature are simultaneously in a rotation orientation, the worn orientation and motion state of the wearable device are determined in combination with the motion information. This realizes a user-friendly wearing state recognition scheme: the wearable device can adaptively determine the current wearing state without prior configuration by the user, with higher recognition accuracy and lower system resource occupancy.
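As an illustrative sketch only (not part of the patent text), the orientation-feature and rotation check described above can be modeled in a few lines of Python. The function names, the principal-axis angle estimate, and the angle tolerance are assumptions introduced here for illustration:

```python
import math

def contour_angle(points):
    """Estimate a contour's dominant direction in degrees from its
    principal axis; `points` is a list of (x, y) tuples standing in
    for a reference object's outer contour."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    syy = sum((p[1] - my) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    return math.degrees(0.5 * math.atan2(2 * sxy, sxx - syy))

def detect_rotation(angle1, angle2, tolerance=5.0):
    """The check described above: the device is treated as rotating
    only when BOTH reference objects turn in the same direction by
    (approximately) the same angle."""
    same_direction = (angle1 >= 0) == (angle2 >= 0)
    same_value = abs(abs(angle1) - abs(angle2)) <= tolerance
    return same_direction and same_value
```

Requiring agreement between two independent reference objects is what filters out the case where only one object in the scene moved while the device itself stayed still.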
It should be noted that, in this document, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes that element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and certainly also by hardware alone, but in many cases the former is the preferred implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a terminal (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above specific embodiments, which are merely illustrative rather than restrictive. Under the inspiration of the present invention, those skilled in the art can devise many further forms without departing from the purpose of the present invention and the scope protected by the claims, all of which fall within the protection of the present invention.
Claims (10)
1. A wearing state recognition method, characterized in that the method comprises:
Obtaining image information under a current wearing state through a camera assembly of a wearable device, and obtaining motion information under the current wearing state through a sensing assembly of the wearable device;
Parsing the image information to obtain a first contour feature of a first reference object and, at the same time, a second contour feature of a second reference object;
Identifying a first orientation feature of the first contour feature, and identifying a second orientation feature of the second contour feature;
If the first orientation feature and the second orientation feature are simultaneously in a rotation orientation, determining a worn orientation and a motion state of the wearable device in combination with the motion information.
2. The wearing state recognition method according to claim 1, characterized in that obtaining the image information under the current wearing state through the camera assembly of the wearable device, and obtaining the motion information under the current wearing state through the sensing assembly of the wearable device, comprises:
Determining an information viewing orientation or a default shooting orientation of the wearable device;
Determining an imaging orientation of the camera assembly according to the information viewing orientation or the default shooting orientation.
3. The wearing state recognition method according to claim 2, characterized in that obtaining the image information under the current wearing state through the camera assembly of the wearable device, and obtaining the motion information under the current wearing state through the sensing assembly of the wearable device, further comprises:
Obtaining the image information through the camera assembly under the imaging orientation;
Obtaining the motion information under the current wearing state through the sensing assembly of the wearable device, wherein the motion information includes acceleration information.
4. The wearing state recognition method according to claim 3, characterized in that parsing the image information to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object, comprises:
Obtaining and parsing the image information at a preset frequency;
Determining at least two reference objects that have unique orientation features and belong to different categories, as the first reference object and the second reference object.
5. The wearing state recognition method according to claim 4, characterized in that parsing the image information to obtain the first contour feature of the first reference object and, at the same time, the second contour feature of the second reference object, further comprises:
Identifying the outer contour or body contour corresponding to each of the first reference object and the second reference object;
Obtaining, through image analysis, the first contour feature of the outer contour or body contour of the first reference object, and obtaining, through image analysis, the second contour feature of the outer contour or body contour of the second reference object.
6. The wearing state recognition method according to claim 5, characterized in that identifying the first orientation feature of the first contour feature and identifying the second orientation feature of the second contour feature comprises:
Identifying the directions of the outer contours or body contours of the first reference object and the second reference object;
Obtaining the first orientation feature and the second orientation feature respectively according to the directions.
7. The wearing state recognition method according to claim 6, characterized in that, if the first orientation feature and the second orientation feature are simultaneously in a rotation orientation, determining the worn orientation and motion state of the wearable device in combination with the motion information comprises:
Detecting the first orientation feature and the second orientation feature;
Obtaining a first rotation angle according to the first orientation feature, and obtaining a second rotation angle according to the second orientation feature.
8. The wearing state recognition method according to claim 7, characterized in that, if the first orientation feature and the second orientation feature are simultaneously in a rotation orientation, determining the worn orientation and motion state of the wearable device in combination with the motion information further comprises:
If the first rotation angle and the second rotation angle are in the same rotation direction and at the same rotation angle value, determining that the wearable device is in a rotation state;
Determining the worn orientation and motion state of the wearable device in combination with the rotation state and the motion information.
9. A wearing state recognition device, characterized in that the device comprises:
A memory, a processor, and a computer program stored on the memory and executable on the processor;
When the computer program is executed by the processor, the steps of the method according to any one of claims 1 to 8 are implemented.
10. A computer-readable storage medium, characterized in that a wearing state recognition program is stored on the computer-readable storage medium; when the wearing state recognition program is executed by a processor, the steps of the wearing state recognition method according to any one of claims 1 to 8 are implemented.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910357618.6A CN110069136B (en) | 2019-04-29 | 2019-04-29 | Wearing state identification method and equipment and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910357618.6A CN110069136B (en) | 2019-04-29 | 2019-04-29 | Wearing state identification method and equipment and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110069136A true CN110069136A (en) | 2019-07-30 |
CN110069136B CN110069136B (en) | 2022-10-11 |
Family
ID=67369639
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910357618.6A Active CN110069136B (en) | 2019-04-29 | 2019-04-29 | Wearing state identification method and equipment and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110069136B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112162595A (en) * | 2020-09-23 | 2021-01-01 | 深圳市爱都科技有限公司 | Vertical arm rotation identification method and wearable terminal |
CN112543313A (en) * | 2020-12-03 | 2021-03-23 | 深圳市海洋王照明工程有限公司 | Camera lighting system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150277540A1 (en) * | 2014-03-25 | 2015-10-01 | Bradley A. Jackson | Power management for a wearable apparatus |
CN105302541A (en) * | 2014-07-23 | 2016-02-03 | 联想(新加坡)私人有限公司 | Configuring wearable devices |
WO2018134897A1 (en) * | 2017-01-17 | 2018-07-26 | マクセル株式会社 | Position and posture detection device, ar display device, position and posture detection method, and ar display method |
CN108957505A (en) * | 2018-06-27 | 2018-12-07 | 四川斐讯信息技术有限公司 | A kind of localization method, positioning system and portable intelligent wearable device |
- 2019-04-29: CN CN201910357618.6A patent/CN110069136B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150277540A1 (en) * | 2014-03-25 | 2015-10-01 | Bradley A. Jackson | Power management for a wearable apparatus |
CN105302541A (en) * | 2014-07-23 | 2016-02-03 | 联想(新加坡)私人有限公司 | Configuring wearable devices |
WO2018134897A1 (en) * | 2017-01-17 | 2018-07-26 | マクセル株式会社 | Position and posture detection device, ar display device, position and posture detection method, and ar display method |
CN108957505A (en) * | 2018-06-27 | 2018-12-07 | 四川斐讯信息技术有限公司 | A kind of localization method, positioning system and portable intelligent wearable device |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112162595A (en) * | 2020-09-23 | 2021-01-01 | 深圳市爱都科技有限公司 | Vertical arm rotation identification method and wearable terminal |
CN112162595B (en) * | 2020-09-23 | 2022-12-06 | 深圳市爱都科技有限公司 | Vertical arm rotation identification method and wearable terminal |
CN112543313A (en) * | 2020-12-03 | 2021-03-23 | 深圳市海洋王照明工程有限公司 | Camera lighting system |
Also Published As
Publication number | Publication date |
---|---|
CN110069136B (en) | 2022-10-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110086563A (en) | A kind of method of controlling rotation, equipment and computer readable storage medium | |
CN110362368A (en) | Picture customization display methods, relevant device and the storage medium of wearable device | |
CN110109544A (en) | Method of adjustment, wearable device and the readable storage medium storing program for executing of motor vibration amplitude | |
CN110062279A (en) | Video method of cutting out, wearable device and computer readable storage medium | |
CN109799912A (en) | A kind of display control method, equipment and computer readable storage medium | |
CN110177209A (en) | A kind of video parameter regulation method, equipment and computer readable storage medium | |
CN110069136A (en) | A kind of wearing state recognition methods, equipment and computer readable storage medium | |
CN110401434A (en) | Wearable device control method, wearable device and computer readable storage medium | |
CN110083513A (en) | A kind of lamp effect control method, equipment and computer readable storage medium | |
CN110072295A (en) | Dual-channels communication method, apparatus, first terminal and medium | |
CN110134312A (en) | Control method, wearable device and computer readable storage medium | |
CN110069102A (en) | A kind of display area regulation method, equipment and computer readable storage medium | |
CN110072071A (en) | A kind of video record interaction control method, equipment and computer readable storage medium | |
CN110177208A (en) | A kind of association control method of video record, equipment and computer readable storage medium | |
CN110096150A (en) | A kind of search interaction control method, equipment and computer readable storage medium | |
CN208754296U (en) | A kind of fixed device of satellite communication terminal and fixed station | |
CN110162247A (en) | A kind of screen control method, wearable device and computer readable storage medium | |
CN110109583A (en) | A kind of display and setting method, equipment and computer readable storage medium | |
CN110096210A (en) | A kind of information processing method, equipment and computer readable storage medium | |
CN109976869A (en) | A kind of operation progress control method, equipment and computer readable storage medium | |
CN110083208A (en) | A kind of overturning control method, equipment and computer readable storage medium | |
CN110109605A (en) | A kind of images share method, equipment and computer readable storage medium | |
CN110187770A (en) | A kind of wrist strap acoustic control exchange method, equipment and computer readable storage medium | |
CN110049248A (en) | A kind of reference object regulation method, equipment and computer readable storage medium | |
CN109981905A (en) | Information Quick-return method, wearable device and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
Effective date of registration: 20220921 | Address after: Room 2501, No. 163, Huangpu Avenue West, Tianhe District, Guangzhou City, Guangdong Province, 510000 (Part I) (office only) (not for plant use) | Applicant after: CHINESE FOOD ANHONG (GUANGDONG) HEALTH INDUSTRY Co.,Ltd. | Address before: 6-8/F, 10-11/F, Zone C, Han's Innovation Building, No. 9018, Beihuan Avenue, High-tech Zone, Nanshan District, Shenzhen City, Guangdong Province, 518000 | Applicant before: NUBIA TECHNOLOGY Co.,Ltd. |
GR01 | Patent grant ||