
CN101751126A - Hand-free interface based on gesture using a plurality of sensor spaces - Google Patents

Hand-free interface based on gesture using a plurality of sensor spaces

Info

Publication number
CN101751126A
CN101751126A (Application CN200910258221A)
Authority
CN
China
Prior art keywords
data
gesture
sensor
micromotion
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200910258221A
Other languages
Chinese (zh)
Inventor
孙骏恭
吴中明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of CN101751126A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A system for interpreting gestures comprises, preferably, at least one micro-electromechanical (MEMS) sensor, which comprises a base suitable for attachment at a first position on the human body (such as a finger or hand). A signal accumulation unit is connected to the sensor and has another base suitable for attachment at a second position on the body. The signal accumulation unit comprises logic for packaging and/or interpreting data about gestures sensed at one or more sensors across a plurality of sensing spaces. The signal accumulation unit also comprises a communication port for communicating with a host computer, through which the interpreted data is transmitted to the host computer.

Description

Gesture-based hands-free interface using a plurality of sensor spaces
Technical field
The present invention relates to gesture-based communication for human-machine interfaces.
Background art
Gesture-based human-machine interface systems have been proposed. For example, U.S. Patent No. 7,259,756 to Park et al. describes a system for sensing and evaluating a user's motion, which is used as a computer interface for selecting information. According to Park et al., the motion of "moving units" attached to the fingers and hand or other body parts can be sensed, interpreted, and filtered, and the result used as information input to a computer. The sensors contemplated by Park et al. include image-based sensors and giant magnetoresistance (GMR) sensors, which are remote from the moving units and are not carried by the user. With an imaging sensor, information about a gesture is easily blocked by objects, because the imaging sensor (camera) is fixed in the environment or positioned at a specific location at a distance from the user, rather than carried by the user. The gestures to be detected must therefore be confined to a small staging region within the sensor's field of view.
Beyond the problems with visual sensors noted above, GMR technology is subject to interference caused by building materials containing magnetic material (such as interior steel structural members). GMR is therefore impractical in most buildings and environments.
Park et al. also mention that micro-electromechanical-system (MEMS) inertial sensors measuring changes in acceleration and angular velocity could be used, but do not discuss them, and offer no teaching of how data would be retrieved from MEMS sensors.
Another common problem with gesture-based computer input devices is how the computer can distinguish among gestures with different meanings, while recognizing the range of variation in motion within which a given gesture keeps the same meaning. U.S. Patent No. 5,596,656 to Goldberg, for example, describes such a concept, providing an alphabet comprising a set of symbols with geometric specifications, the symbols designed so that a computer can distinguish them despite fine graphical differences.
It is therefore desirable to provide a system and tools for gesture-based navigation that are less susceptible to interference and constraint from motion in the environment, and that can support reliable interpretation of complex gestures.
Summary of the invention
A system for interpreting gestures comprises at least one, and preferably more than one, miniature sensor, such as a micro-electromechanical (MEMS) sensor, each sensor comprising a base suitable for attachment at a position on the human body (such as a finger or hand). A signal accumulation unit is connected to the sensors; it is either mounted together with a sensor or has its own base suitable for attachment at another position on the body. The signal accumulation unit comprises logic for packaging data from the one or more sensors to produce packaged data, the data including data in a plurality of spaces describing gestures sensed at one or more of the plurality of sensors. The signal accumulation unit also comprises a communication port for communicating with a host computer, over which the packaged data is transmitted to the host. The host computer comprises resources that cooperate with the processing at the signal accumulation unit, interpreting the data produced by the sensors and generating a resulting input signal. That input signal is then delivered by the host, using a suitably formatted computer-generated message, to a target system. Typical target systems include programs such as business presentation software, word processors, software managing home lighting and air conditioning, software managing audio-visual equipment, software managing robots, and the like.
Each MEMS sensor employed can produce data in one or more spaces, where a space comprises at least two dimensions sampled over time, including the displacement, velocity, and acceleration of translation in linear space, and the displacement, velocity, and acceleration of rotation in angular space. Multi-space analysis uses gesture data from a plurality of spaces, from a plurality of sensors mounted at different positions and/or from one or more sensors mounted at a single position. Using multi-space analysis markedly improves the recognition system's ability to analyze gestures and supports interpretation of complex gestures. Multi-space analysis supports machine interpretation of complex symbolic languages for which graphical segmentation in any single space may be insufficient. For example, it is contemplated that multi-space analysis, based on input from MEMS sensors on the signer's fingers, could translate complex composed languages such as American Sign Language (ASL), International Sign, fingerspelling using a manual alphabet, and the like. Moreover, using multi-space analysis with a self-learning process, special-purpose gesture dictionaries can be developed.
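As a rough illustration of what gesture data "in a plurality of spaces" can look like in practice, the following sketch (illustrative only, not part of the claimed system; names such as multi_space_record are invented) numerically integrates raw accelerometer and gyroscope samples from a single sensor to recover one trajectory per space:

```python
import numpy as np

def integrate(samples, dt):
    """Cumulative trapezoidal integration of an (N, 3) sample series."""
    out = np.zeros_like(samples)
    out[1:] = np.cumsum((samples[1:] + samples[:-1]) * 0.5 * dt, axis=0)
    return out

def multi_space_record(lin_acc, ang_vel, dt):
    """Build per-space trajectories for one sensor.

    lin_acc: (N, 3) linear acceleration from an accelerometer
    ang_vel: (N, 3) angular velocity from a gyroscope
    dt:      sample period in seconds
    """
    lin_vel = integrate(lin_acc, dt)             # linear velocity space
    lin_disp = integrate(lin_vel, dt)            # linear displacement space
    ang_acc = np.gradient(ang_vel, dt, axis=0)   # angular acceleration space
    ang_disp = integrate(ang_vel, dt)            # angular displacement space
    return {"lin_disp": lin_disp, "lin_vel": lin_vel, "lin_acc": lin_acc,
            "ang_disp": ang_disp, "ang_vel": ang_vel, "ang_acc": ang_acc}
```

In a real device, sensor drift would have to be managed before the integrated spaces become usable; the sketch only shows how the six spaces relate to one another.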
A host computer system is described, comprising an interface for communicating with the signal accumulation unit on the user's body, and resources for interpreting the data in the plurality of spaces. In addition to data processing hardware, the resources include a gesture specification database, containing specifications of one or more gestures in the plurality of spaces, and a program that compares input data with the specifications in the database. The resources in the host further include facilities for composing messages carrying the interpretation results of the gesture data, and communication resources for sending those messages to a target, where the data serve as input commands or data.
Other aspects and advantages of the invention are set forth in the following drawings, detailed description, and claims.
Brief description of the drawings
Figure 1 is a simplified block diagram of a gesture-based human-machine interface;
Figures 2-16 illustrate motion-sensing "spaces" in which gestures created and detected using the micro-sensors described herein can be interpreted;
Figure 17 is a block diagram of a micro-sensor signal accumulation unit for the human-machine interface system described herein;
Figure 18 is a block diagram of a host computer for the human-machine interface system described herein; and
Figures 19A and 19B provide a flow chart illustrating a method of operating the human-machine interface system described herein.
Detailed description
Figure 1 is a simplified block diagram of a human-machine interface based on gestures made in an environment 9. A host machine 10, such as a personal computer or another device having a graphical user interface or a display, communicates with a "hands-free" sensing system attached to the human body (so that the user need not hold or grip a sensor). In a preferred implementation, the sensors comprise very small MEMS sensors 11-17, linked by wire or wirelessly to a signal accumulation unit 18, which packages the data from the sensors and transmits the packaged data to the host machine over a wireless communication link (for example Bluetooth) or an infrared communication link. Some embodiments may also use a wired connection where needed.
As shown in the figure, sensors 13-17 are attached to each of the five fingers of one hand, preferably on the back of the second joint of the thumb and the third joints of the other fingers. In some systems, sensors can be used on both hands. Sensor units can also be attached to the body 19 as shown in the figure, for example carried in a shirt pocket 12 or worn on an ear-mounted accessory 11. The sensors 11-17 and the accumulation unit 18 comprise corresponding bases suitable for attachment to positions on the human body or to clothing (such as a glove or shirt). In some embodiments, the signal accumulation unit is mounted together with one of the sensors and needs no separate base. Also, in some embodiments, each sensor is attached to a separate signal accumulation unit, and the signal accumulation units communicate with the host, or with another signal accumulation unit that in turn communicates with the host or target system.
In systems using a plurality of sensor units, sensing fewer dimensions per unit may suffice; three linear dimensions per sensor, for example, may be enough. For systems using a single sensor unit or fewer sensor units, it may be desirable for each unit to include one or more sensors sensing up to six degrees of freedom.
Because the sensors and supporting circuits are small and lightweight, sensor units can be attached directly or indirectly at many positions on the human body, including the hands, elbows, arms, torso, belly, neck, head, feet, knees, toes, and so on. A sensor unit can be attached to a finger using an elastic ring or band, a clip, a wrist strap, adhesive tape, glue, or a strap. Alternatively, sensor units can be incorporated into clothing, such as gloves, socks, hats, shirts, or shoes. Sensor units can also be mounted in rings, wristwatches, glasses, earrings and other ear-worn accessories, necklaces, and the like.
A typical sensor unit comprises inertial sensors and gyroscopes capable of sensing up to six degrees of motion: translation along the x-, y-, and z-axes and rotation about the x-, y-, and z-axes. Motion can be interpreted by decomposing the sensor data into displacement, velocity, and acceleration spaces for both translation and rotation. A particular gesture-based system can use one or more sensor spaces, including one sensor space for each of a plurality of sensors mounted at different positions, or a plurality of sensor spaces for one or more sensors mounted at a single position. Most sensors can sense motion on several axes and of several types, which provides important information both for constructing a gesture-based language and for distinguishing different gestures. Moreover, a single sensor can provide input information in the linear and angular acceleration, velocity, and displacement spaces, giving substantially richer input data than is obtainable in existing vision-based systems.
For the purposes of this description, a micro-electromechanical sensor (MEMS) is any class of sensor forming a unit of such small volume and weight that it can be attached to a fingertip without disturbing the fingertip's natural motion while the gestures used by the system are being formed; it can be characterized as a die-level, first-level-packaged assembly, and includes pressure sensors, accelerometers, gyroscopes, microphones, and the like. A MEMS typically comprises an element interacting with the environment whose width or length is on the order of 1 millimeter, and can be packaged together with supporting circuits such as analog-to-digital converters, signal processors, and communication ports.
Typical MEMS suitable for the gesture-based systems described herein include two-axis accelerometers. For a given application, two such accelerometer sensors can be mounted at a single position to sense linear acceleration in three dimensions. Other typical MEMS for the gesture-based systems described herein include gyroscopes, including piezoelectric vibrating gyroscopes.
The host machine 10 and the signal accumulation unit 18 comprise data processing resources that provide interpretation of the gesture data received from the sensors. In some embodiments, the signal accumulation unit 18 performs more of the interpretation processing than in others, so that the amount of interpretation processing performed by the host machine 10 varies with the additional processing done at the signal accumulation unit 18. The interpreted gesture data is processed by the host to produce a specific signal.
The host machine 10 determines the specific signal that results from the interpreted gesture data, determines the target of that signal, and issues the resulting signal to the target. The target can include a computer program running on the host machine 10 or on another system operating in the user's environment, with which the user is interacting through sign language. Gesture data is thus conveyed from the user, through the host, back out to the environment and to devices for controlling the environment, including converting sign language into signals controlling audio-visual equipment, converting sign language into speech or other audio signals, converting sign language into messages sent to remote systems via the Internet or other communication protocols, and the like.
The host machine 10 also comprises resources that act as a feedback provider for the user. This yields an interactive loop in which the user provides a gesture signal to the host machine, the host machine interprets the signal and produces a response, and the user then issues a new gesture signal, and so on. Typical interactive systems include video games and the like, in which the user provides input to the game using gestures.
The host machine 10 can comprise a mapping database containing the gesture specifications to be used for communication and the mapping of gestures to specific signals. A gesture specification can take the form of a unique signature in the plurality of spaces described above. The host machine 10 can include a computer program providing an interactive learning process, by which the specification of a particular gesture is presented to the user, and the user then makes the gesture, attempting to match the presented specification. This provides a learning loop in which the computer enables the user to learn the library of gestures used to interact with the computer system.
The host machine 10 can comprise an interactive program by which the user defines the gesture specifications to be used. In this mode, the user signals the host machine 10 that a gesture to be interpreted as a particular signal is about to be defined. The user then performs the motion defining the gesture, and the host receives and stores it. The host processes the gesture data, which can include gesture data from a plurality of example performances of the gesture, to produce a signature for the gesture in the plurality of spaces, and stores that signature. The user then performs the defining motion again, and the host attempts to match the resulting gesture data with the stored signature. The process can be repeated until the self-teaching is complete.
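A minimal sketch of such an enrollment loop follows, under the assumption (made for illustration; the patent does not prescribe a particular signature computation) that a signature is the per-space mean of several resampled example recordings:

```python
import numpy as np

def resample(traj, n=64):
    """Linearly resample an (N, 3) trajectory to n points per axis."""
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, n)
    return np.stack([np.interp(t_new, t_old, traj[:, k]) for k in range(3)],
                    axis=1)

def enroll_gesture(examples):
    """Average several multi-space recordings into one signature.

    examples: list of dicts mapping space name -> (N, 3) trajectory.
    """
    spaces = examples[0].keys()
    return {s: np.mean([resample(ex[s]) for ex in examples], axis=0)
            for s in spaces}

def matches(record, signature, tol=0.5):
    """Accept a fresh recording if every space stays close to the signature."""
    errs = [np.sqrt(np.mean((resample(record[s]) - signature[s]) ** 2))
            for s in signature]
    return max(errs) < tol
```

Whatever distance measure a real implementation chooses, the loop structure — record examples, store a signature, then verify a fresh performance against it — mirrors the self-teaching process described above.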
The host machine 10 can also act as an intermediary, in which a first user signals using the gesture library, the host interprets the gesture data, and the gesture data is delivered, by direct message, annotation, or otherwise through the environment the users share, to a group of other users or to a second user, such as one or more opponents in a video game. The other users can respond, likewise using signals from the gesture library.
The host machine 10 can comprise a map of the environment 9, and can use the environment map together with a gesture dictionary to produce specific signals. For example, a gesture that includes pointing at a particular item in the environment can be interpreted as a signal expressing the intent to affect a device located in the indicated direction. Thus, gesture data can be generated indicating that the user is pointing at a particular lamp in the room, and a second gesture can produce gesture data indicating whether to increase or decrease that lamp's brightness.
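As a hedged illustration of resolving a pointing gesture against an environment map (the device names, coordinates, and angular threshold below are invented for the example), the map can simply be searched for the device whose bearing lies closest to the pointing ray:

```python
import numpy as np

# Hypothetical environment map: device name -> position in room coordinates (m).
ENV_MAP = {"desk_lamp": np.array([2.0, 0.5, 1.0]),
           "ceiling_light": np.array([1.5, 1.5, 2.4])}

def pointed_device(hand_pos, point_dir, max_angle_deg=15.0):
    """Return the mapped device nearest the pointing ray, or None."""
    d = point_dir / np.linalg.norm(point_dir)
    best, best_angle = None, max_angle_deg
    for name, pos in ENV_MAP.items():
        to_dev = pos - hand_pos
        cos_a = np.dot(d, to_dev) / np.linalg.norm(to_dev)
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```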
The systems described herein can be implemented with sensors that describe the sensor's motion in space, providing gesture data involving up to six degrees of freedom, including three translational degrees of freedom in linear space provided by accelerometers and three rotational degrees of freedom in angular space provided by gyroscopes. In theory it is also possible to describe an object's displacement in space using accelerometers for all six degrees of freedom, or gyroscopes for all six degrees of freedom. Using a plurality of spaces with sensing capability of up to six degrees of freedom enables the system to distinguish different complex gestures quickly and reliably. The gesture data produced as a sensor moves through a given gesture can be analyzed in terms of displacement, velocity, and acceleration in the linear and angular spaces. This analysis across a plurality of spaces provides a signature for each gesture, which can be used to define a specification that distinguishes the gesture from other gestures, and to recognize a particular gesture so that it can be converted into the desired signal.
In an exemplary system, MEMS accelerometers are used to provide gesture data in the linear acceleration space, and MEMS gyroscopes can be used to provide gesture data in the angular velocity space. The signature of a given gesture in the linear acceleration space and in the angular velocity space can be used to recognize the particular gesture. The choice of sensors to use, the number of sensors, and the number of spaces to analyze depends on the particular application, and affects the cost, the response time, and the feasibility of the sensor selection for a given gesture dictionary.
Figures 2-16 show how a given motion can be interpreted in various spaces to provide rich input information for gesture interpretation from even a single sensor. When a plurality of sensors are mounted at selected positions on the human body, inertial sensors can be used for detection of complex motion, using a combination of data gathered in a plurality of spaces. Some MEMS-based sensors are adapted to measurement in the acceleration spaces; their data, however, is readily interpreted into the other spaces shown. Moreover, using more than one multi-dimensional sensor, or a combination of multi-dimensional sensors with fixed-position or one-dimensional sensors, a gesture-based language can be detected from data gathered in these different spaces, using information about relative motion or relative displacement to interpret the data. Figures 2-6 show the signatures of a relatively complex motion in the displacement, linear velocity, linear acceleration, angular velocity, and angular acceleration spaces. Figures 7-11 show the signatures, in the same five spaces, of a gesture based on an interval of constant-velocity linear motion. Figures 12-16 show the signatures, in the same five spaces, of a gesture based on an interval of constant-rate angular motion. All or part of the data illustrated in these figures can serve as the signature distinguishing a particular gesture or a micromotion of a gesture.
For example, if the user pivots a finger carrying a finger-mounted sensor in space with constant angular velocity over time, the motion appears as a fixed point in the angular velocity space. The motion also appears as a fixed point located at (0,0,0) in the angular acceleration space, since its angular acceleration over time is zero.
As another example, if the user draws a straight line in space with a finger-mounted sensor at constant linear velocity over time, the motion appears as a fixed point in the linear velocity space. The motion also appears as a fixed point located at (0,0,0) in the linear acceleration space, since its linear acceleration over time is zero.
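Numerically, such a fixed point can be detected by checking that a trajectory stays within a small neighborhood of a single point in the relevant space; the tolerance below is an assumed value for illustration only:

```python
import numpy as np

def is_fixed_point(traj, tol=0.05):
    """True if an (N, 3) trajectory in some space stays near one point.

    A constant-velocity stroke appears as a fixed point in the linear
    velocity space; constant angular velocity appears as one in the
    angular velocity space, with the acceleration-space point near zero.
    """
    center = traj.mean(axis=0)
    return np.max(np.linalg.norm(traj - center, axis=1)) < tol
```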
Simple gestures, referred to herein as micromotions, can be combined to form more complex gestures. The gestures to be used can be designed so that a data processing system can distinguish them from one another, the system recognizing gestures by graphical segmentation of the detected data in one or more of the spaces. The signature of a particular gesture or micromotion can comprise gesture data from two or more sensors, so that the plurality of spaces used for the signature includes, for example: the linear displacement space of a first sensor and the linear displacement space of a second sensor; the angular displacement space of a first sensor and the linear displacement space of a second sensor; the angular acceleration space of a first sensor and the linear displacement space of a second sensor; and so on. For example, the signature of a complex gesture can comprise data from one space for each of the sensors on the thumb and the other four fingers of a hand. A variety of configurations can be used to produce a unique signature in a plurality of spaces.
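To make the composition of micromotions concrete, here is a small hedged sketch (identifiers such as GESTURE_DICT and the token labels are invented examples): each micromotion is a labeled token tagged with its sensor and space, and a complex gesture specification is an ordered sequence of such tokens:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Micromotion:
    sensor: str   # e.g. "thumb", "index"
    space: str    # e.g. "lin_vel", "ang_vel"
    label: str    # e.g. "fixed_point", "arc_cw"

# Hypothetical dictionary: gesture name -> required micromotion sequence.
GESTURE_DICT = {
    "volume_up": (Micromotion("index", "lin_vel", "fixed_point"),
                  Micromotion("index", "ang_vel", "arc_cw")),
    "select":    (Micromotion("thumb", "lin_acc", "tap_spike"),),
}

def recognize(sequence):
    """Return the gesture whose micromotion sequence matches, or None."""
    for name, spec in GESTURE_DICT.items():
        if tuple(sequence) == spec:
            return name
    return None
```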
Figure 17 is a block diagram of a gesture sensing system based on MEMS sensors. The gesture sensing system comprises a set of MEMS sensor units 30-33 (preferably a plurality of sensor units) coupled to a multi-channel analog-to-digital conversion circuit 34. The MEMS sensor units 30-33 can comprise inertial sensors such as accelerometers and gyroscopes. The conversion circuit 34 is coupled to a bus, on which a microcontroller unit (MCU) 35 executes system firmware coordinating the behavior of the several units and coordinating the processing of the application logic for gesture navigation. In the example shown, other units on the bus include a watchdog timer 36; comparator logic 37, which compares the input data sequence indicating a gesture, or a micromotion of a gesture comprising a series of micromotions, with stored data sequences that represent unique identifications of the micromotions of learned gestures; SRAM 38 serving as working memory, for example storing displacement, velocity, and acceleration data while a gesture is being made; embedded flash memory 39 storing the micromotion database and application programs, supporting self-learning and calibration; any necessary application logic 40 beyond that provided by the microcontroller unit, operating as glue logic or high-speed logic to support gesture interpretation and navigation processing; ROM 41 for storing instructions or other control data; and an output device 42 for communicating with the host computer. The watchdog timer 36 can be used to set time limits on the gesture-interpretation processing, rejecting invalid commands or recovering from them. The output device 42 can be an analog or digital channel, or another wireless or wired link capable of carrying gesture input data, such as a Bluetooth module, infrared module, or WiFi module. The components shown in Figure 17 can be mounted in a body-worn signal accumulation unit (for example unit 18 of Figure 1), or distributed as needed between body-worn units and the host computer system.
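As a hedged sketch of what the output device 42 might transmit (the patent does not specify a packet format; the field layout below is an assumption for illustration), the packaged data could be framed as a compact binary record per sensor sample:

```python
import struct
import time

# Hypothetical frame: sync byte, sensor id, timestamp in ms, then
# three int16 linear-acceleration counts and three int16 gyro counts.
FRAME_FMT = "<BBI3h3h"   # 18 bytes, little-endian, no padding

def pack_frame(sensor_id, lin_acc_counts, ang_vel_counts):
    """Package one sensor sample for transmission to the host."""
    ts = int(time.monotonic() * 1000) & 0xFFFFFFFF
    return struct.pack(FRAME_FMT, 0xA5, sensor_id, ts,
                       *lin_acc_counts, *ang_vel_counts)

def unpack_frame(frame):
    sync, sid, ts, ax, ay, az, gx, gy, gz = struct.unpack(FRAME_FMT, frame)
    assert sync == 0xA5, "bad frame"
    return sid, ts, (ax, ay, az), (gx, gy, gz)
```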
Although not shown, the gesture sensing system can include a battery or battery pack. A power coupler of the kind commonly used for radio-frequency power transfer in RFID technology can also be used.
Figure 18 is a simplified block diagram of a data processing system 100 arranged as the host computer for the gesture-based communication systems described herein. The system 100 includes one or more central processing units 110 arranged to execute computer programs stored in program memory 101, to access data memory 102, to access large-scale memory 106 such as disk drives, and to control communication ports 103, standard user input devices 104, and a display 105, where the communication ports 103 include a port for communicating with the signal accumulation unit 18 shown in Figure 1. Sign-language systems using a host computer as represented by Figure 18 include single workstations, computer networks, and software-controlled apparatus including special-purpose machines, robots, and the like.
In an example system, the data processing resources used by the gesture analysis process include logic embodied as computer programs stored in the memory 101. Alternatively, the logic can be implemented with computer programs on local or distributed machines, and can be implemented in part with dedicated hardware or other data processing resources. The logic in a typical gesture analysis system includes resources for interpreting gesture data, resources for sending messages carrying the signals resulting from interpretation, and resources for the sign-language learning and self-learning processes.
The data memory 102 is typically used to store a machine-readable gesture dictionary containing gesture definitions in the plurality of spaces, along with environment maps and other data-intensive libraries used together with the gesture data to interpret gestures. The large-scale memory is used, for example, to store multiple gesture dictionaries and other large-scale data resources.
Figures 19A and 19B provide a flow chart illustrating a simplified sequence of operations for the system, whose steps can be executed by processors designated for the purpose at the sensors, in the signal accumulation unit, in the host computer, or elsewhere in the system. The flow chart begins with power-up or initialization of the MEMS and the signal accumulation unit (50). If the system powers up successfully (that is, without a system interrupt) (51), a calibration or self-learning process is optionally performed (52). If the system does not power up successfully, the logic enters a "wait and reset" mode (53). During the calibration/self-learning process, input from the sensors is accepted (54), filtered, and interpreted. This process can be guided through a computer graphical user interface, in which the user is instructed to make a particular gesture or micromotion, and the input data gathered as a result of that gesture is compared with the expected data for the gesture or micromotion. Feedback can be given to the user to improve the gesture or micromotion so that it better matches the expected data, or the expected data can be revised to match the motion the user actually makes. The user can also use the interaction logic to designate a particular gesture to be interpreted as a particular command. In addition, this step can be used to set up the system for left-handed or right-handed use. As a result of the calibration/self-learning process, or in conjunction with it, a gesture or micromotion database is created or updated (56), which can be referenced during run-time operation to detect particular input commands or language.
After the calibration/self-learning process is determined to be complete (55), the system moves to a ready mode (57), awaiting gesture input. During the ready mode, input is gathered from the sensors (58), filtered, and analyzed to determine whether a valid gesture input signal has been received (59). The input signal can be delimited using a mechanical or audio signal, or recognized as the result of a particular gesture command, and so on. The input data can be further formatted for interpretation of displacement, velocity, and acceleration along each of the linear and angular axes described above (60). The resulting data is then compared with the information in the gesture or micromotion database (61). If a match is found (62), the output is delivered to the host computer (64) as the sign language/designated command output by the system (66). Feedback about the interpretation of the gesture or micromotion is also provided to the gesture or micromotion database 56. If the gesture or micromotion does not match any entry in the database, it can be analyzed to determine the most similar gesture or micromotion (63). The result of that analysis can serve as a learning tool, giving the user feedback to improve skill in using the gesture navigation system, or it can be used to select, on a probabilistic basis, the gesture most likely intended by the motion.
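A hedged sketch of that ready-mode matching loop follows, reusing the resample helper from the enrollment sketch above (the distance measure and threshold remain illustrative assumptions, not the patent's specification):

```python
import numpy as np

def nearest_entry(record, database):
    """Find the stored signature nearest to a multi-space recording.

    database: gesture name -> signature dict (space -> (64, 3) trajectory).
    Uses resample() from the enrollment sketch above.
    """
    def dist(sig):
        return max(np.sqrt(np.mean((resample(record[s]) - sig[s]) ** 2))
                   for s in sig)
    name = min(database, key=lambda n: dist(database[n]))
    return name, dist(database[name])

def run_step(record, database, threshold=0.5):
    """One pass of the ready-mode loop: match, else most-similar fallback."""
    name, d = nearest_entry(record, database)
    if d < threshold:
        return ("command", name)      # match found: deliver to host (62, 64)
    return ("most_similar", name)     # no match: feedback path (63)
```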
After a gesture or micromotion has been interpreted and delivered to the host computer system, the host system can apply further processing to identify the intended input signal, for example for a gesture comprising a series of micromotions; or, where the gesture is fully recognized in the signal accumulation unit, the host can send a message to the target process to execute the command indicated by the signal, or to process the data indicated by the signal as appropriate.
The system also provides a function (65) by which the user selects the interpreted command or overrides a gesture command using input from another source. This select/override facility 65 can be used for ambiguous gestures, for example by allowing selection of one of the "most similar gestures" determined in box 63; as a confirmation signal for gestures expressing commands that have significant effect on the host system; and as a second layer of interpretation. The select/override can be implemented using a voice interpreter responding to audible "go" or "no go" commands, or using other input systems such as a keyboard or touchpad.
MEMS sensor units are extremely light and small, so attaching them to the human body can be quite unobtrusive. The technology enables "hands-free" gesture sensing using small portable sensors attached to the body, so the operator need not grip a moving unit or wand to provide gesture input. It allows intuitive, interactive messages and commands to be issued to machines in a natural, human way. Moreover, by sensing displacement, velocity, and acceleration in the linear and angular spaces, fine-grained gestures can be exploited. User-defined gestures can be learned by the system to customize the user's language and commands.
The gesture-based systems described herein can employ sophisticated command sets and can be applied to interpreting human composed languages. The ability to sense and decode complex gestures has many applications, including home, office, and customer-information processing functions. Gesture sensing technology can serve as a presentation control tool, allowing a speaker to navigate displayed images using sensed gestures, such as zooming an image, scrolling, moving to the next page, and the like.
Gesture sensing technology can interpret the gestures of more than one participant. For example, opponents in a computer game can interact using the gesture sensing technology described herein.
Providing the ability to interpret gesture-based input also enables many uses unavailable in the prior art. For example, gesture-based input signals can be applied to automotive safety. While driving, sensors can be worn on the hands or attached to the steering wheel, and can be used to detect rotation of the steering wheel or motions the driver makes with the other hand. Combined with sensors such as image recognition, radar, or ultrasound detecting activity on the road (for example a stopped vehicle ahead), an alarm can be raised if the system does not detect the steering wheel being turned to avoid the obstacle. Gesture sensing technology can also be used by hospital patients to contact hospital staff outside their rooms, or by people whose ability to speak is impaired. The technology described herein can also be used in harsh environments, including industrial sites, undersea sites, firefighting, and the like.
Although the present invention has been described with reference to the preferred embodiments and examples above, it is to be understood that these examples are intended to be illustrative rather than limiting. Variations and combinations will readily occur to those skilled in the art, and such variations and combinations are within the spirit of the invention and the scope of the claims.

Claims (25)

1. A system for interpreting gestures, comprising:
a sensor comprising a base suitable for attachment at a position on a human body;
a signal accumulation unit connected to said sensor, said signal accumulation unit comprising logic to package data about gestures sensed at said sensor to produce packaged data; and
a communication link for communicating with a host computer, over which said packaged data is transmitted to said host computer.
2. The system according to claim 1, comprising a plurality of sensors connected to said signal accumulation unit, each comprising a base suitable for attachment at a position on the human body.
3. The system according to claim 1, wherein said signal accumulation unit converts the data from said sensor from analog form to digital form and assembles packets of digital gesture data, said packaged data comprising the packets.
4. The system according to claim 1, wherein said communication link comprises a wireless communication channel.
5. The system according to claim 1, wherein said signal accumulation unit comprises memory storing a micromotion database, and logic to compare the data from said sensor with data in said micromotion database to produce interpreted data, said packaged data comprising said interpreted data.
6. The system according to claim 1, wherein the gesture data received by said signal accumulation unit comprises data in a plurality of spaces, and said signal accumulation unit comprises memory storing a micromotion database, and logic to compare the data in said plurality of spaces from said sensor with data in said micromotion database to produce interpreted data, wherein said micromotion database comprises specifications defining at least one micromotion in a plurality of spaces, said packaged data comprising said interpreted data.
7. The system according to claim 1, comprising said host computer, said host computer comprising resources to interpret said packaged data to identify a resulting signal and to send said resulting signal to a target.
8. The system according to claim 1, comprising said host computer, said host computer comprising a machine-readable map of an environment, and resources to interpret said packaged data using said machine-readable map to identify a resulting signal and to send said resulting signal to a target.
9. A system for interpreting gestures, comprising:
a plurality of micro-electromechanical sensors (MEMS), comprising bases suitable for attachment at respective positions on a human body;
a signal accumulation unit connected to said sensors, said signal accumulation unit being mounted together with one of said plurality of micro-electromechanical sensors, or alternatively comprising a base suitable for attachment at another position on the human body, said signal accumulation unit comprising logic to package data about gestures sensed at said plurality of sensors to produce packaged data; and
a communication link for communicating with a host computer, over which said packaged data is transmitted to said host computer.
10. The system according to claim 9, wherein said signal accumulation unit processes the data from said sensors in terms of a plurality of displacement, velocity, and acceleration.
11. The system according to claim 9, wherein said sensors produce data indicating a plurality of linear displacement, linear velocity, linear acceleration, angular displacement, angular velocity, and angular acceleration.
12. The system according to claim 9, wherein said communication link comprises a wireless communication channel.
13. The system according to claim 9, wherein said signal accumulation unit comprises memory storing a micromotion database, and logic to compare the data from said sensors with data in said micromotion database to produce interpreted data.
14. The system according to claim 9, wherein the gesture data received by said signal accumulation unit comprises data in a plurality of spaces, and said signal accumulation unit comprises memory storing a micromotion database, and logic to compare the data in said plurality of spaces from said sensors with data in said micromotion database to produce interpreted data, wherein said micromotion database comprises specifications defining at least one micromotion in a plurality of spaces, said packaged data comprising said interpreted data.
15. The system according to claim 9, comprising said host computer, said host computer comprising resources to interpret said packaged data to identify a resulting signal and to send said resulting signal to a target.
16. The system according to claim 9, comprising said host computer, said host computer comprising a machine-readable map of an environment, and resources to interpret said packaged data using said machine-readable map to identify a resulting signal and to send said resulting signal to a target.
17. The system according to claim 9, wherein said signal accumulation unit comprises memory storing a micromotion database, and logic to compare the data from said sensors with data in said micromotion database to produce interpreted data; and said system comprises said host computer, said host computer comprising resources to process said interpreted data to identify a resulting signal and to send said resulting signal to a target, wherein said processing comprises matching sequences of micromotions with entries in a gesture dictionary to identify the resulting signal, said dictionary comprising gesture specifications based on corresponding sequences of micromotions.
18. A method for generating computer-readable signals using gestures, comprising:
sensing motion of one or more sensors mounted at one or more positions on a human body to produce gesture data in a plurality of spaces, said plurality of spaces selected from a linear displacement space, a linear velocity space, a linear acceleration space, an angular displacement space, an angular velocity space, and an angular acceleration space;
matching said gesture data with entries in a machine-readable dictionary of gesture specifications to identify a signal, said dictionary mapping its entries to a set of signals for a target process, wherein said entries comprise motion specifications, for one or more of said positions, in a plurality of spaces selected from a linear displacement space, a linear velocity space, a linear acceleration space, an angular displacement space, an angular velocity space, and an angular acceleration space; and
sending said signal to the target process.
19. The method according to claim 18, wherein said one or more positions comprise positions on a plurality of fingers of a hand.
20. The method according to claim 18, wherein said one or more positions comprise positions on fingers of a plurality of hands.
21. The method according to claim 18, wherein said one or more positions comprise positions beyond the third joint on a plurality of fingers of the human body.
22. The method according to claim 18, wherein said one or more positions comprise positions beyond the third joint on a plurality of fingers of the human body and a position beyond the second joint on a thumb of the human body.
23. The method according to claim 18, comprising transmitting said gesture data to a processor on the human body, packaging said gesture data in said processor, and forwarding the packaged gesture data from the processor on the human body to a host computer over a wireless communication link.
24. The method according to claim 18, comprising transmitting said gesture data from said one or more sensors to a processor mounted on the human body, converting said gesture data to digital form, and processing the digital gesture data to identify micromotions in a set of micromotions; wherein the gesture data used in said matching comprises the identified micromotions.
25. The method according to claim 18, comprising analyzing said gesture data to identify said signal using said dictionary and a machine-readable map of an environment.
CN200910258221A 2008-12-17 2009-12-17 Hand-free interface based on gesture using a plurality of sensor spaces Pending CN101751126A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US33731208A 2008-12-17 2008-12-17
US12/337,312 2008-12-17

Publications (1)

Publication Number Publication Date
CN101751126A true CN101751126A (en) 2010-06-23

Family

ID=42478172

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910258221A Pending CN101751126A (en) 2008-12-17 2009-12-17 Hand-free interface based on gesture using a plurality of sensor spaces

Country Status (1)

Country Link
CN (1) CN101751126A (en)

Cited By (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102478959A (en) * 2010-11-28 2012-05-30 蒋霞 Control system and method for electronic device
CN102486677A (en) * 2010-12-06 2012-06-06 鸿富锦精密工业(深圳)有限公司 Ring Sensor Interactive System
CN102671374A (en) * 2011-03-09 2012-09-19 刘胜 Intelligent clothing for game
CN102707799A (en) * 2011-09-12 2012-10-03 北京盈胜泰科技术有限公司 Gesture identification method and gesture identification device
CN102707799B (en) * 2011-09-12 2016-03-02 北京安赛博技术有限公司 A kind of gesture identification method and gesture identifying device
CN103455136A (en) * 2012-06-04 2013-12-18 中兴通讯股份有限公司 Inputting method, inputting device and inputting system based on gesture control
CN108829243B (en) * 2012-11-24 2021-10-15 奥普迪格股份有限公司 Computing interface system
CN108829243A (en) * 2012-11-24 2018-11-16 奥普迪格股份有限公司 Calculate interface system
CN103543837A (en) * 2013-11-07 2014-01-29 金纯 Keyboard-free input method and device achieving same
EP2916210A1 (en) 2014-03-05 2015-09-09 Markantus AG Finger-worn device for providing user input
US10296085B2 (en) 2014-03-05 2019-05-21 Markantus Ag Relatively simple and inexpensive finger operated control device including piezoelectric sensors for gesture input, and method thereof
CN104951061A (en) * 2014-03-28 2015-09-30 太瀚科技股份有限公司 System and method for pressure and dynamic sensing control
CN104090649A (en) * 2014-05-20 2014-10-08 上海翰临电子科技有限公司 Intelligent watchband and operating control method thereof
US10948996B2 (en) 2014-06-03 2021-03-16 Google Llc Radar-based gesture-recognition at a surface of an object
CN106489080A (en) * 2014-08-07 2017-03-08 谷歌公司 Gesture sensing data transmission based on radar
CN106489080B (en) * 2014-08-07 2019-11-05 谷歌有限责任公司 Gesture sensing and data transmission based on radar
US10642367B2 (en) 2014-08-07 2020-05-05 Google Llc Radar-based gesture sensing and data transmission
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US11221682B2 (en) 2014-08-22 2022-01-11 Google Llc Occluded gesture recognition
US11816101B2 (en) 2014-08-22 2023-11-14 Google Llc Radar recognition-aided search
US10936081B2 (en) 2014-08-22 2021-03-02 Google Llc Occluded gesture recognition
US12153571B2 (en) 2014-08-22 2024-11-26 Google Llc Radar recognition-aided search
US10409385B2 (en) 2014-08-22 2019-09-10 Google Llc Occluded gesture recognition
CN105472113A (en) * 2014-09-04 2016-04-06 深圳富泰宏精密工业有限公司 Driving mode starting system and method
US11163371B2 (en) 2014-10-02 2021-11-02 Google Llc Non-line-of-sight radar-based gesture recognition
US10664059B2 (en) 2014-10-02 2020-05-26 Google Llc Non-line-of-sight radar-based gesture recognition
US10664061B2 (en) 2015-04-30 2020-05-26 Google Llc Wide-field radar-based gesture recognition
US10817070B2 (en) 2015-04-30 2020-10-27 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US12340028B2 (en) 2015-04-30 2025-06-24 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US11709552B2 (en) 2015-04-30 2023-07-25 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10496182B2 (en) 2015-04-30 2019-12-03 Google Llc Type-agnostic RF signal representations
US10936085B2 (en) 2015-05-27 2021-03-02 Google Llc Gesture detection and interactions
CN105242780A (en) * 2015-09-23 2016-01-13 谢小强 Interactive control method and apparatus
CN105242780B (en) * 2015-09-23 2018-05-18 谢小强 A kind of interaction control method and device
US11693092B2 (en) 2015-10-06 2023-07-04 Google Llc Gesture recognition using multiple antenna
US11698439B2 (en) 2015-10-06 2023-07-11 Google Llc Gesture recognition using multiple antenna
US10503883B1 (en) 2015-10-06 2019-12-10 Google Llc Radar-based authentication
US10823841B1 (en) 2015-10-06 2020-11-03 Google Llc Radar imaging on a mobile computing device
US10540001B1 (en) 2015-10-06 2020-01-21 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US10908696B2 (en) 2015-10-06 2021-02-02 Google Llc Advanced gaming and virtual reality control using radar
US10310621B1 (en) 2015-10-06 2019-06-04 Google Llc Radar gesture sensing using existing data protocols
US10705185B1 (en) 2015-10-06 2020-07-07 Google Llc Application-based signal processing parameters in radar-based detection
US10379621B2 (en) 2015-10-06 2019-08-13 Google Llc Gesture component with gesture library
US11132065B2 (en) 2015-10-06 2021-09-28 Google Llc Radar-enabled sensor fusion
US12117560B2 (en) 2015-10-06 2024-10-15 Google Llc Radar-enabled sensor fusion
US12085670B2 (en) 2015-10-06 2024-09-10 Google Llc Advanced gaming and virtual reality control using radar
US10459080B1 (en) 2015-10-06 2019-10-29 Google Llc Radar-based object detection for vehicles
US10768712B2 (en) 2015-10-06 2020-09-08 Google Llc Gesture component with gesture library
US11175743B2 (en) 2015-10-06 2021-11-16 Google Llc Gesture recognition using multiple antenna
US11698438B2 (en) 2015-10-06 2023-07-11 Google Llc Gesture recognition using multiple antenna
US11256335B2 (en) 2015-10-06 2022-02-22 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US10401490B2 (en) 2015-10-06 2019-09-03 Google Llc Radar-enabled sensor fusion
US11385721B2 (en) 2015-10-06 2022-07-12 Google Llc Application-based signal processing parameters in radar-based detection
US11656336B2 (en) 2015-10-06 2023-05-23 Google Llc Advanced gaming and virtual reality control using radar
US11481040B2 (en) 2015-10-06 2022-10-25 Google Llc User-customizable machine-learning in radar-based gesture detection
US11592909B2 (en) 2015-10-06 2023-02-28 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US11140787B2 (en) 2016-05-03 2021-10-05 Google Llc Connecting an electronic component to an interactive textile
CN106054650A (en) * 2016-07-18 2016-10-26 汕头大学 Novel intelligent household system and multi-gesture control method thereof
US11417963B2 (en) 2016-07-21 2022-08-16 Infineon Technologies Ag Radio frequency system for wearable device
US11336026B2 (en) 2016-07-21 2022-05-17 Infineon Technologies Ag Radio frequency system for wearable device
CN107644182A (en) * 2016-07-21 2018-01-30 英飞凌科技股份有限公司 Radio system for wearable device
CN107644182B (en) * 2016-07-21 2020-09-04 英飞凌科技股份有限公司 Radio frequency system for wearable device
CN106293138A (en) * 2016-08-12 2017-01-04 包爱民 A kind of body sensing ring mouse
CN106293090A (en) * 2016-08-15 2017-01-04 北海市深蓝科技发展有限责任公司 A kind of intelligent wireless mouse
CN107943302A (en) * 2017-12-20 2018-04-20 潍坊科技学院 A kind of finger-bag type computer input unit
CN112183429B (en) * 2020-10-12 2024-01-19 苏州晴森模具有限公司 3D gesture recognition system and method
CN112183429A (en) * 2020-10-12 2021-01-05 苏州晴森模具有限公司 3D gesture recognition system and method

Similar Documents

Publication Publication Date Title
CN101751126A (en) Hand-free interface based on gesture using a plurality of sensor spaces
CN104024987B (en) Device, methods and techniques for wearable navigation equipment
CN104007844B (en) Electronic instrument and wearable type input device for same
Kandalan et al. Techniques for constructing indoor navigation systems for the visually impaired: A review
CN102460510B (en) For the space multi-mode opertaing device used together with spatial operation system
US20110234488A1 (en) Portable engine for entertainment, education, or communication
US20220155866A1 (en) Ring device having an antenna, a touch pad, and/or a charging pad to control a computing device based on user motions
US10976863B1 (en) Calibration of inertial measurement units in alignment with a skeleton model to control a computer system based on determination of orientation of an inertial measurement unit from an image of a portion of a user
KR101341481B1 (en) System for controlling robot based on motion recognition and method thereby
US11175729B2 (en) Orientation determination based on both images and inertial measurement units
US20150185852A1 (en) Ring mobile device and operation method of the same
CN103930944A (en) Adaptive tracking system for spatial input devices
JP6938980B2 (en) Information processing equipment, information processing methods and programs
RU187548U1 (en) VIRTUAL REALITY GLOVE
CN111552383A (en) Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment
WO2020009715A2 (en) Tracking user movements to control a skeleton model in a computer system
Xu et al. Intelligent wearable interfaces
JP2015176176A (en) Information input-output device and information input-output method
RU2670649C1 (en) Method of manufacturing virtual reality gloves (options)
Mao et al. Eliminating drift of the head gesture reference to enhance Google Glass-based control of an NAO humanoid robot
ES2561927B1 (en) Multisensor system for rehabilitation and interaction of people with disabilities
Mubashira et al. A comprehensive study on human interaction with IoT systems
KR102659301B1 (en) Smart ring and method for smart ring control
Sciberras Interactive gesture controller for a motorised wheelchair
CN203522997U (en) Multifunctional earphone based on MEMS sensor

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20100623