WO2013035554A1 - Method for detecting motion of input body and input device using same - Google Patents
Method for detecting motion of input body and input device using same
- Publication number
- WO2013035554A1 (PCT/JP2012/071456)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- hand
- coordinates
- fist
- fingertip
- movement
- Prior art date
Classifications
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0425—Digitisers characterised by opto-electronic transducing means, using a single imaging device like a video camera for tracking the absolute position of one or more objects with respect to an imaged reference surface
- G06T7/20—Image analysis; analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
- G06F2203/04101—2.5D-digitiser: detects the X/Y position of the input means (finger or stylus) even when it does not touch but is proximate to the interaction surface, and also measures its distance within a short range in the Z direction
- G06T2207/30196—Subject of image: human being; person
Definitions
- The present invention relates to a method for detecting the movement of a hand used to input coordinates to an input device, and to an input device using the method.
- Input devices that use a human hand or finger as the input body have been developed as user interfaces for apparatuses that interact with displayed 2D or 3D video.
- Such an input device is equipped with a three-dimensional position measurement system having a plurality of optical imaging means, such as cameras, whose shooting positions and shooting angles are predetermined. From the images (two-dimensional images) obtained from each camera,
- the three-dimensional position and the three-axis (XYZ) coordinates of the target object (input body) are obtained by calculation, and the coordinate values are output to a control means (a computer or the like) of a display device or similar apparatus (see, for example, Patent Documents 1 and 2).
- A camera or a similar device is used as the optical imaging means for photographing the input body.
- Patent Document 1: JP-A-9-53914. Patent Document 2: JP-A-11-23262.
- In a system with a plurality of cameras, however, each camera cannot always be placed at the position best suited for photographing, and may end up where the operator finds it obtrusive.
- As a result, the hand movements of a person unaccustomed to the operation may become unnatural or jerky.
- The present invention has been made in view of these circumstances, and its object is to provide an input body motion detection method capable of detecting the three-dimensional motion of a human hand by image analysis using a single optical imaging means,
- and to provide an input device for pointing operations that uses this motion detection method.
- That is, the present invention provides, as its first gist, a method for detecting, with a single optical imaging means, the three-dimensional movement of a hand used to input coordinates to an input device, the method comprising: a step of projecting light toward the hand, including the fist, from a light source disposed above or below the hand; a step of arranging the optical imaging means on the same side of the hand as the light source and acquiring the reflection of the light by the hand as a two-dimensional image on a virtual imaging plane; a step of assigning two mutually orthogonal coordinate axes to the two-dimensional image, recognizing in it the shape of the fist and the tip position of a finger protruding from the fist, and calculating the centroid coordinates of the fist's area distribution and the fingertip coordinates; and a step of repeating the above steps, comparing the distance between the fist's centroid coordinates and the fingertip coordinates before and after the repetition, determining, when the distance has not changed, that the hand including the fist has slid in the direction of the virtual imaging plane, and
- determining, when the distance has changed, that the fingertip has pivoted up and down about the wrist or elbow of the hand as a fulcrum.
- As its second gist, the present invention provides an input device comprising: a light source disposed above or below a hand, including the fist, used as the input body of the device; an optical imaging means disposed on the same side of the hand as the light source; a control means for controlling them; a shape recognition means that acquires the reflection of the projected light as a two-dimensional image and calculates from it the centroid coordinates of the fist's shape distribution and the coordinates of the tip of the protruding finger; and
- a movement determination means that compares the distance between the fist's centroid coordinates and the fingertip coordinates before and after a predetermined time interval and, when the distance has shrunk or grown across the interval, judges the movement of the hand at that time to be a vertical movement of the finger relative to the virtual imaging plane of the optical imaging means.
- To solve the above problems, the present inventor conducted extensive research, examining in detail the movement (image) of a hand photographed with one camera, and discovered that the fingertip and the fist (palm) move in characteristically different ways within the image.
- Based on this finding, the input body motion detection method of the present invention includes a step of projecting light toward the hand, including the fist, and capturing its reflection with a single optical imaging means.
- In this method, when the distance between the fist's centroid coordinates and the fingertip coordinates does not change before and after the repeated measurement, the whole hand including the fist is determined to have slid in the direction of the imaging plane (virtual plane); when the distance does change, the fingertip is determined to have pivoted up and down relative to the imaging plane about the wrist or elbow as a fulcrum. The motion detection method of the present invention can therefore detect the three-dimensional motion of a human hand by image analysis using only one optical imaging means.
- The input device of the present invention includes a light source, an optical imaging means disposed on the same side as the light source, a control means,
- a shape recognition means that calculates, from the two-dimensional image obtained by the optical imaging means, the centroid coordinates of the fist's shape distribution and the coordinates of the fingertip, and a movement determination means that compares the distance between the fist's centroid coordinates and the fingertip coordinates before and after a predetermined time interval. When this distance shrinks or grows across the interval, the movement of the hand at that time can be determined to be a vertical movement of the finger relative to the virtual imaging plane of the optical imaging means (movement in the direction perpendicular to that plane). The input device of the present invention can thus detect the three-dimensional movement of a human hand by image analysis using only the single optical imaging means with which it is equipped.
- Because the input device of the present invention requires only this one optical imaging means, a device that detects three-dimensional movement can be configured with simple equipment at low cost.
- Moreover, since the freedom in placing the optical imaging means (camera and the like) is improved, the camera can be placed (even hidden) where the operator is not conscious of it. The input device of the present invention can therefore be an intuitive, user-friendly device that is easy to operate even for beginners.
- FIGS. 1(a)-(c) are diagrams explaining the method of detecting the coordinates of the hand in the input device according to an embodiment of the present invention. FIG. 2 shows the first pattern of hand movement in the input device of the present invention.
- FIGS. 3(a) and 3(b) are diagrams explaining the method of detecting hand movement in the XY directions in the input device of the embodiment. FIG. 4 shows the second pattern of hand movement in the input device of the present invention.
- FIGS. 5(a) and 5(b) are further diagrams explaining the method of detecting hand movement in the Z direction in the input device of the embodiment. FIGS. 6 and 7 show other examples of the arrangement of the optical imaging means.
- FIG. 1(a) is a view explaining the method of detecting the coordinates of the hand H in the input device according to the embodiment of the present invention.
- FIG. 1(b) is a schematic diagram of the two-dimensional image H′ (on the virtual imaging plane P) of the hand H taken by the optical imaging means (camera C) of the input device.
- FIG. 1(c) is a schematic diagram of the image H′′ obtained by binarizing the two-dimensional image H′ of the hand H.
- A computer connected to the camera C, having the functions of the control means, shape recognition means, movement determination means, and so on for controlling the camera C and the light source L, is not shown.
- The input device of this embodiment detects, with a single optical imaging means (camera C), the three-dimensional movement of the hand H, including the fist, used as the input body of the device.
- A camera unit comprising a camera C having an image sensor and a plurality of light sources L arranged around the camera C is disposed below the hand H (substantially vertically below it).
- The input device acquires the reflection (image) of the light projected from the light sources L toward the hand H, by shape recognition means (not shown), as a two-dimensional image H′ on a virtual imaging plane P having coordinate axes in the mutually orthogonal X and Y directions, as shown in FIG. 1(b). It then binarizes the acquired two-dimensional image H′ against a threshold value, as shown in FIG. 1(c),
- identifies the shape of the fist of the hand H (shown with solid hatching in the figure), and calculates the coordinates corresponding to the center of gravity of the fist's area distribution (centroid coordinates G).
- The input device repeats the projection of light from the light sources L, the acquisition of the two-dimensional image H′ by the camera C, and the calculation of the fist's centroid coordinates G and fingertip coordinates T from that image.
- When the movement determination means (not shown) finds that the distance between the fist's centroid coordinates G and the fingertip coordinates T has changed across this repetition (see FIG. 4 and FIG. 5(a)), the movement of the hand H at that time is determined to be a vertical movement of the finger (movement in the Z direction) relative to the virtual imaging plane P of the camera C. This is the distinguishing feature of the input device of the present invention.
- The input device, and the method it uses to detect the movement of the input body (hand H), will now be described in more detail.
- As shown in FIG. 1(a), the camera unit disposed below the hand H comprises a camera C having an image sensor and a plurality of light sources L (three in this example) arranged around the camera C.
- As the image sensor of the camera C, a CMOS or CCD sensor can be used.
- Besides a camera C using a CMOS or CCD image sensor, various optical sensors employing photoelectric conversion elements such as photodiodes, phototransistors, photo ICs, and photo reflectors can be used as the optical imaging means;
- specifically, a one-dimensional or two-dimensional PSD (Position Sensitive Detector), a pyroelectric infrared sensor, a CdS sensor, or the like can be used.
- As the light source L, a light emitter such as an infrared LED or a lamp can be used.
- The camera unit may also be disposed below the input body (hand H) at an inclination relative to it (see FIG. 6), or above the hand H (see FIG. 7).
- The method for detecting the movement of a hand H inserted into the detection area of the input device will now be described process by process.
- First, light is projected from the light sources L toward the hand H; this projection may be intermittent emission [light projection step].
- Next, the hand H is photographed by the camera C disposed on the same side of the hand H as the light sources L (below it in this example), and the reflection of the light by the hand H (the reflected light or reflected image) is acquired as a two-dimensional image H′ having coordinate axes in the mutually orthogonal X and Y directions (an image corresponding to the virtual imaging plane P), as shown in FIG. 1(b) [imaging step].
- The acquired two-dimensional image H′ is then binarized against a threshold value. From the binarized image H′′, as shown in FIG. 1(c), the shape of the fist of the hand H (the solid-hatched portion in the figure) is identified, and the coordinates corresponding to the center of gravity of the fist's area distribution (centroid coordinates G) are calculated.
- Likewise, within the binarized image H′′, the finger protruding from the fist (the dotted-line-hatched portion in the figure) is identified, and the coordinates corresponding to its tip position (fingertip coordinates T) are calculated.
- The fist's centroid coordinates G and the fingertip coordinates T are stored in storage means such as the control means (computer) [coordinate specifying step].
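To make the coordinate specifying step concrete, the following is a minimal sketch in Python, not the patent's actual implementation: it assumes an 8-bit grayscale frame, treats every pixel at or above a brightness threshold as part of the hand, computes the centroid with formulas (1) and (2) given later (here over the whole hand region for simplicity, whereas the patent uses the fist region only), and approximates the fingertip T as the hand pixel farthest from the centroid (the patent identifies the protruding finger by shape recognition).

```python
import numpy as np

def specify_coordinates(frame, threshold=128):
    """Coordinate specifying step (sketch): binarize one grayscale frame,
    then return the centroid G of the bright region and a fingertip estimate T."""
    hand = np.asarray(frame) >= threshold          # binarization against a threshold
    ys, xs = np.nonzero(hand)                      # pixel coordinates of the hand
    if xs.size == 0:
        return None                                # no hand in the detection area
    # Formulas (1) and (2): centroid = sum of pixel coordinates / pixel count.
    gx, gy = xs.sum() / xs.size, ys.sum() / ys.size
    # Fingertip estimate (assumption): the hand pixel farthest from the centroid.
    i = int(np.argmax((xs - gx) ** 2 + (ys - gy) ** 2))
    return (gx, gy), (float(xs[i]), float(ys[i]))
```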
- Next, after a predetermined time interval, the light projection step, the imaging step of acquiring a two-dimensional image, and the coordinate specifying step of calculating the fist's centroid coordinates G and fingertip coordinates T are repeated, and the fist's centroid coordinates G and fingertip coordinates T after the repetition are measured again [measurement step].
- From these measurements, the movement of the hand H is classified into one of the following two patterns: the first pattern, in which the hand H slides horizontally (see FIG. 2), or the second pattern, in which the hand H swings up and down (see FIG. 4). The movement direction (X, Y, or Z) of the fingertip coordinates T and the amount of movement are then output, via the control means or the like, to a display device or other external equipment [determination step].
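Read as a processing loop, the five steps might be wired together as follows. This is a sketch under stated assumptions: `capture_frame` is a hypothetical frame grabber returning a grayscale image, the interval and tolerance `d_eps` are illustrative values, and `specify_coordinates` is the helper sketched above.

```python
import math
import time

def track(capture_frame, interval_s=0.1, d_eps=5.0):
    """Repeat [light projection -> imaging -> coordinate specifying] at a fixed
    time interval and classify the motion between consecutive measurements."""
    prev = specify_coordinates(capture_frame())
    while True:
        time.sleep(interval_s)                        # measurement step
        curr = specify_coordinates(capture_frame())
        if prev is not None and curr is not None:
            (g0, t0), (g1, t1) = prev, curr
            d_before = math.dist(g0, t0)              # G-T distance before
            d_after = math.dist(g1, t1)               # G-T distance after
            if abs(d_after - d_before) <= d_eps:      # distance unchanged:
                yield ("slide_xy", t1[0] - t0[0], t1[1] - t0[1])   # first pattern
            else:                                     # distance changed:
                yield ("swing_z", d_after - d_before)              # second pattern
        prev = curr
```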
- In the first pattern, as shown in FIG. 2, the fist's centroid coordinates G move from the initial position before the movement (coordinates G0), indicated by the two-dot chain line in the figure, to the position after the movement (coordinates G1), indicated by the solid line,
- and the fingertip coordinates T move in parallel with the fist's centroid coordinates G from their initial position (coordinates T0) to the position after the movement (coordinates T1), indicated by the solid line.
- In this case, the movement determination means of the input device determines that the hand H has slid in the horizontal direction as shown in FIG. 2, and outputs the XY coordinate values of the fingertip coordinates T1 after the movement, or the movement direction and distance of the fingertip coordinates T (from coordinates T0 to coordinates T1), to the outside as input body data.
- FIG. 3(a) shows the movement of the fingertip coordinates T (T0 → T1) on the virtual imaging plane P having coordinate axes in the X and Y directions.
- As shown in FIG. 3(b), identification areas assigned to the four directions [X(+), X(−), Y(+), Y(−)] may also be set on the virtual imaging plane P. Configured this way, the input device can, simultaneously with the determination of the movement of the hand H in the determination step, identify the identification area to which the movement belongs,
- and thereby function as a pointing device that simply outputs signals in four directions (the + and − directions of the X and Y axes) according to the movement of the fingertip coordinates T.
- The set angle θ, shape, arrangement, and so on of each identification area may be chosen to suit the device or application that receives the output signal.
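One way to realize such identification areas is to quantize the angle of the fingertip movement vector into four sectors centered on the coordinate axes. The sketch below assumes a set angle θ of 45° per area, which is an illustrative value rather than one taken from the patent.

```python
import math

def identify_direction(t0, t1, theta_deg=45.0):
    """Map a fingertip movement T0 -> T1 onto one of the four identification
    areas X(+), Y(+), X(-), Y(-); return None in the gaps between areas."""
    angle = math.degrees(math.atan2(t1[1] - t0[1], t1[0] - t0[0])) % 360.0
    sectors = {0.0: "X(+)", 90.0: "Y(+)", 180.0: "X(-)", 270.0: "Y(-)"}
    for center, label in sectors.items():
        gap = abs(angle - center)
        if min(gap, 360.0 - gap) <= theta_deg / 2.0:   # within the set angle
            return label
    return None
```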
- In the second pattern, as shown in FIG. 4, the fingertip coordinates T move from the initial position before the movement (coordinates T0), relatively far from the fist's centroid coordinates G, to the position after the movement (coordinates T2), indicated by the solid line, so as to approach the position of the fist's centroid coordinates G.
- In this case, through the repetition of the measurement step described above, the distance d0 between the centroid coordinates G0 of the fist of the hand H0 before the movement and the fingertip coordinates T0, and the corresponding distance after the movement, are obtained.
- When this distance has shrunk (or grown), the input device and its movement determination means determine that the hand H has swung in the vertical direction (Z direction) as shown in FIG. 4, and output this to the outside as input body data.
- The method (image processing flow) for detecting this vertical movement of the hand H in the Z-axis direction, that is, the movement of the fingertip coordinates T toward the centroid coordinates G on the binarized image, will now be described in more detail.
- First, infrared light is projected from each light source L (infrared LED) disposed below the hand H, including the fist, and the camera C, likewise disposed below the hand H, photographs the reflection from the hand H
- as a two-dimensional image, to which coordinate axes in the X and Y directions are assigned.
- Next, with the brightness (luminance) threshold for binarization set optimally by the shape recognition means (program), the two-dimensional image is binarized and then thinned, so that the outline of the hand H becomes clear, as shown in FIG. 5(a).
- Then, the finger portion is identified by the shape recognition means (program), and the coordinates corresponding to its tip [fingertip coordinates T (Xp, Yq)] are calculated.
- Similarly, the shape of the fist of the hand H is identified by the same program, and the coordinates corresponding to the center of gravity of the fist's area distribution [centroid coordinates G (Xm, Yn)] are calculated by the following formulas:
- Xm of centroid coordinates G = (sum of the X-coordinate values of the pixels within the fist shape) / (number of pixels within the fist shape) … (1)
- Yn of centroid coordinates G = (sum of the Y-coordinate values of the pixels within the fist shape) / (number of pixels within the fist shape) … (2)
- Then, the distance d0 between the fist's centroid coordinates G0 and the fingertip coordinates T0 of the hand H0 before the movement is compared with the distance d2 between the fist's centroid coordinates G2 and the fingertip coordinates T2 of the hand H2 after the movement.
- FIG. 5(b) is a diagram explaining, in principle, this method of comparing the distance d between the fist's centroid coordinates G and the fingertip coordinates T.
- As an additional condition for the determination, the inclination θ1 of the segment A connecting the centroid coordinates G0 and the fingertip coordinates T0 of the hand H0 before the movement is compared with the inclination θ2 of the segment B connecting the centroid coordinates G2 and the fingertip coordinates T2 of the hand H2 after the movement,
- and the absolute value of their difference (|θ1 − θ2|) must be at most a predetermined threshold. This condition is provided so that an action such as bending a finger is not misjudged as a vertical movement of the hand H.
- The comparison of the distance d described above is performed by the movement determination means or the movement determination program.
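Putting the distance comparison and the angle condition together, the movement determination might be sketched as follows; the thresholds `d_ratio` and `angle_eps_deg` are assumed values, not figures taken from the patent.

```python
import math

def is_vertical_swing(g0, t0, g2, t2, d_ratio=0.9, angle_eps_deg=15.0):
    """Judge the second pattern (vertical swing of the hand in the Z direction).

    Condition 1: the distance between fist centroid G and fingertip T shrinks
    or grows between the two measurements.
    Condition 2: segments A (G0-T0) and B (G2-T2) keep nearly the same
    inclination, so merely bending a finger is not mistaken for a Z movement.
    """
    d0, d2 = math.dist(g0, t0), math.dist(g2, t2)
    distance_changed = min(d0, d2) < d_ratio * max(d0, d2)
    theta1 = math.atan2(t0[1] - g0[1], t0[0] - g0[0])   # inclination of segment A
    theta2 = math.atan2(t2[1] - g2[1], t2[0] - g2[0])   # inclination of segment B
    diff = abs(math.degrees(theta1 - theta2)) % 360.0
    aligned = min(diff, 360.0 - diff) <= angle_eps_deg  # |theta1 - theta2| small
    return distance_changed and aligned
```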
- In this way, when the distance between the fist's centroid coordinates G and the fingertip coordinates T has shrunk or grown between measurements, the input body motion detection method can determine the movement of the hand H at that time to be a vertical movement of the finger relative to the virtual imaging plane P of the optical imaging means (camera C).
- Thus, the input device of the present invention using this input body motion detection method can, with only one camera C disposed below or above the hand H, detect from image analysis the movement of the human hand H in the Z-axis direction, that is, its three-dimensional movement.
- Because the input device can detect movement in the Z-axis direction, the horizontal (XY-direction) movement of the hand H can, for example, be assigned to the cursor movement operation of a display device, and the Z-direction movement to the decision (click) operation.
- Alternatively, the movement of the hand H in the X-axis (left-right) and Z-axis (vertical) directions can be assigned to the operation of moving an object on the display device, and the movement in the Y-axis (front-back) direction
- to the operation of enlarging or reducing the object.
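As a usage illustration, these assignments can be expressed as a small dispatcher over the motions yielded by the `track()` sketch above; the action names are hypothetical placeholders, not part of the patent.

```python
def to_ui_action(motion):
    """Translate a detected motion tuple into a hypothetical UI action."""
    if motion[0] == "slide_xy":
        _, dx, dy = motion
        return ("move_cursor", dx, dy)   # horizontal slide -> cursor movement
    if motion[0] == "swing_z":
        return ("click",)                # vertical swing -> decision (click)
    return ("noop",)
```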
- In this way, the input device of the present invention can perform operations using genuinely three-dimensional (XYZ three-axis) information on three-dimensional (3D) images and the like.
- It also has the merit of permitting more intuitive operation, since the operating environment approaches that of actual three-dimensional space.
- The input body motion detection method according to the present invention also works when the camera unit is disposed at an inclination below the input body (hand H) (FIG. 6);
- the three-dimensional movement of the human hand H can still be detected. That is, as shown in FIG. 6, when the optical imaging means (camera C) is disposed below the hand H at an inclination, just as when it is disposed directly below the hand H (see FIG. 4),
- the vertical movement of the hand H (movement in the Z direction) relative to the virtual imaging plane P′ can be recognized and determined.
- Likewise, even when the optical imaging means (camera C) is disposed above the hand H (see FIG. 7),
- it similarly recognizes the vertical movement of the hand H (movement in the Z direction) relative to the virtual imaging plane P′′. The camera unit can therefore be arranged in any manner, provided the hand H is positioned where it is not shadowed by an arm or any other object that would obstruct imaging.
- Note that, because the optical imaging means of the present invention captures and recognizes light reflected from the hand H, the camera C and the light sources L constituting the camera unit are arranged on the same side of the input body (hand H), either above or below it.
- As described above, the input body motion detection method and the input device using it can detect the three-dimensional motion of a human hand using only one camera, without requiring a plurality of cameras. As a result, more intuitive operation, as in actual three-dimensional space, becomes possible for three-dimensional (3D) images and the like.
Description
Xm of centroid coordinates G = (sum of the X-coordinate values of the pixels within the fist shape) / (number of pixels within the fist shape) … (1)
Yn of centroid coordinates G = (sum of the Y-coordinate values of the pixels within the fist shape) / (number of pixels within the fist shape) … (2)
C: camera
L: light source
P: virtual imaging plane
G: centroid coordinates of the fist
T: fingertip coordinates
Claims (2)
- 1. A method for detecting, with a single optical imaging means, the three-dimensional movement of a hand used to input coordinates to an input device, the method comprising: a step of projecting light toward the hand, including the fist, from a light source disposed above or below the hand; a step of arranging an optical imaging means on the same side of the hand as the light source and acquiring the reflection of the light by the hand as a two-dimensional image on a virtual imaging plane; a step of assigning two mutually orthogonal coordinate axes to the two-dimensional image, recognizing and extracting from the image the shape of the fist and the tip position of a finger protruding from the fist, and then calculating, by computation, the coordinates of the center of gravity of the fist's area distribution and the coordinates of the fingertip; and a step of repeating the light projecting step, the two-dimensional image acquiring step, and the step of calculating the fist's centroid coordinates and the fingertip coordinates, comparing the distance between the fist's centroid coordinates and the fingertip coordinates before and after the repetition, determining, when the distance has not changed across the repetition, that the hand including the fist has slid in the direction of the virtual imaging plane, and determining, when the distance has changed across the repetition, that the fingertip has pivoted up and down about the wrist or elbow of the hand as a fulcrum.
- 2. An input device comprising: a light source disposed above or below a hand, including the fist, used as the input body of the device; an optical imaging means disposed on the same side of the hand as the light source; a control means for controlling the light source and the optical imaging means; a shape recognition means that acquires the reflection of the light projected from the light source toward the hand as a two-dimensional image and calculates, from this two-dimensional image, the coordinates corresponding to the distribution centroid of the fist's shape and the coordinates corresponding to the tip position of a finger protruding from the fist; and a movement determination means that compares the distance between the fist's centroid coordinates and the fingertip coordinates before and after a predetermined time interval and, when the distance has shrunk or grown across the interval, judges the movement of the hand at that time to be a vertical movement of the finger relative to the virtual imaging plane of the optical imaging means.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/342,586 US20140225826A1 (en) | 2011-09-07 | 2012-08-24 | Method for detecting motion of input body and input device using same |
KR1020147005970A KR20140060297A (ko) | 2011-09-07 | 2012-08-24 | Method for detecting motion of input body and input device using same |
EP12830488.8A EP2755115A4 (en) | 2011-09-07 | 2012-08-24 | METHOD OF DETECTING INPUT BODY MOTION, AND INPUT DEVICE IMPLEMENTING THE SAME |
CN201280043180.0A CN103797446A (zh) | 2011-09-07 | 2012-08-24 | Method for detecting motion of input body and input device using the method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011194938 | 2011-09-07 | ||
JP2011-194938 | 2011-09-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013035554A1 true WO2013035554A1 (ja) | 2013-03-14 |
Family
ID=47832004
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/071456 WO2013035554A1 (ja) | Method for detecting motion of input body and input device using same | 2011-09-07 | 2012-08-24 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20140225826A1 (ja) |
EP (1) | EP2755115A4 (ja) |
JP (1) | JP2013069273A (ja) |
KR (1) | KR20140060297A (ja) |
CN (1) | CN103797446A (ja) |
TW (1) | TW201324258A (ja) |
WO (1) | WO2013035554A1 (ja) |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control |
US9070019B2 (en) | 2012-01-17 | 2015-06-30 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US9501152B2 (en) | 2013-01-15 | 2016-11-22 | Leap Motion, Inc. | Free-space user interface and control using virtual constructs |
US8693731B2 (en) | 2012-01-17 | 2014-04-08 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging |
US8638989B2 (en) | 2012-01-17 | 2014-01-28 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US11493998B2 (en) | 2012-01-17 | 2022-11-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US12260023B2 (en) | 2012-01-17 | 2025-03-25 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
TWI488092 (zh) | 2012-12-07 | 2015-06-11 | Pixart Imaging Inc | Optical touch device and operating method thereof |
US9459697B2 (en) | 2013-01-15 | 2016-10-04 | Leap Motion, Inc. | Dynamic, free-space user interactions for machine control |
WO2014200589A2 (en) | 2013-03-15 | 2014-12-18 | Leap Motion, Inc. | Determining positional information for an object in space |
US9916009B2 (en) | 2013-04-26 | 2018-03-13 | Leap Motion, Inc. | Non-tactile interface systems and methods |
NL2011182C2 (en) * | 2013-07-17 | 2015-01-21 | Aesthetic Interactions B V | Luminaire system. |
US10281987B1 (en) | 2013-08-09 | 2019-05-07 | Leap Motion, Inc. | Systems and methods of free-space gestural interaction |
TWI507919 (zh) * | 2013-08-23 | 2015-11-11 | Univ Kun Shan | Image processing method for tracking and recording fingertip trajectories |
US10846942B1 (en) | 2013-08-29 | 2020-11-24 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US9632572B2 (en) | 2013-10-03 | 2017-04-25 | Leap Motion, Inc. | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation |
US9304597B2 (en) | 2013-10-29 | 2016-04-05 | Intel Corporation | Gesture based human computer interaction |
US9996638B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Predictive information for free space gesture control and communication |
US9741169B1 (en) | 2014-05-20 | 2017-08-22 | Leap Motion, Inc. | Wearable augmented reality devices with object detection and tracking |
JP2016038889 (ja) | 2014-08-08 | 2016-03-22 | Leap Motion, Inc. | Augmented reality with motion sensing |
US10310675B2 (en) * | 2014-08-25 | 2019-06-04 | Canon Kabushiki Kaisha | User interface apparatus and control method |
CN105843456 (zh) * | 2015-01-16 | 2018-10-12 | Primax Electronics Ltd. | Touch device |
CN106201116 (zh) * | 2015-05-05 | 2021-10-22 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device |
US10222869B2 (en) * | 2015-08-03 | 2019-03-05 | Intel Corporation | State machine based tracking system for screen pointing control |
JP6579866 (ja) * | 2015-08-31 | 2019-09-25 | Canon Inc. | Information processing apparatus, control method therefor, program, and storage medium |
CN105832343 (zh) * | 2016-05-22 | 2020-04-03 | Shanghai University | Multi-dimensional visual quantitative evaluation system and method for hand function rehabilitation |
JP7017675 (ja) * | 2018-02-15 | 2022-02-09 | Watanabe Electronics Co., Ltd. | Non-contact input system, method, and program |
US11698457B2 (en) | 2019-09-04 | 2023-07-11 | Pixart Imaging Inc. | Object detecting system and object detecting method |
CN111124116 (zh) * | 2019-12-18 | 2020-05-08 | Foshan University | Method and system for interacting with distant objects in virtual reality |
JP7163526 (ja) | 2021-07-20 | 2022-10-31 | Akatsuki Inc. | Information processing system, program, and information processing method |
JP7052128 (ja) | 2021-07-20 | 2022-04-11 | Akatsuki Inc. | Information processing system, program, and information processing method |
JP7286857 (ja) * | 2021-07-20 | 2023-06-05 | Akatsuki Inc. | Information processing system, program, and information processing method |
JP7286856 (ja) * | 2022-03-30 | 2023-06-05 | Akatsuki Inc. | Information processing system, program, and information processing method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0953914 (ja) | 1995-08-14 | 1997-02-25 | Nec Corp | Three-dimensional coordinate measuring device |
JPH1123262 (ja) | 1997-07-09 | 1999-01-29 | Nekusuta:Kk | Three-dimensional position measurement system |
JPH11134089 (ja) * | 1997-10-29 | 1999-05-21 | Takenaka Komuten Co Ltd | Hand pointing device |
JP2004171476 (ja) * | 2002-11-22 | 2004-06-17 | Keio Gijuku | Hand pattern switch device |
JP2006099749 (ja) * | 2004-08-31 | 2006-04-13 | Matsushita Electric Works Ltd | Gesture switch |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3458543 (ja) * | 1995-07-25 | 2003-10-20 | Hitachi, Ltd. | Information processing apparatus with hand shape recognition function |
JP2868449 (ja) * | 1995-12-22 | 1999-03-10 | ATR Communication Systems Research Laboratories | Hand gesture recognition device |
EP0837418A3 (en) * | 1996-10-18 | 2006-03-29 | Kabushiki Kaisha Toshiba | Method and apparatus for generating information input using reflected light image of target object |
JP3752246 (ja) * | 2003-08-11 | 2006-03-08 | Keio Gijuku | Hand pattern switch device |
JP4991458 (ja) * | 2007-09-04 | 2012-08-01 | Canon Inc. | Image display device and control method thereof |
JP5228439 (ja) * | 2007-10-22 | 2013-07-03 | Mitsubishi Electric Corp. | Operation input device |
CN102099775 (zh) * | 2008-07-17 | 2014-09-24 | NEC Corp. | Information processing device, storage medium with program recorded thereon, and object movement method |
WO2012024022A2 (en) * | 2010-08-20 | 2012-02-23 | University Of Massachusetts | Hand and finger registration for control applications |
-
2012
- 2012-08-24 WO PCT/JP2012/071456 patent/WO2013035554A1/ja active Application Filing
- 2012-08-24 CN CN201280043180.0A patent/CN103797446A/zh active Pending
- 2012-08-24 KR KR1020147005970A patent/KR20140060297A/ko not_active Withdrawn
- 2012-08-24 JP JP2012185199A patent/JP2013069273A/ja active Pending
- 2012-08-24 US US14/342,586 patent/US20140225826A1/en not_active Abandoned
- 2012-08-24 TW TW101130801A patent/TW201324258A/zh unknown
- 2012-08-24 EP EP12830488.8A patent/EP2755115A4/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See also references of EP2755115A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150097812A1 (en) * | 2013-10-08 | 2015-04-09 | National Taiwan University Of Science And Technology | Interactive operation method of electronic apparatus |
US9256324B2 (en) * | 2013-10-08 | 2016-02-09 | National Taiwan University Of Science And Technology | Interactive operation method of electronic apparatus |
US9412012B2 (en) | 2013-10-16 | 2016-08-09 | Qualcomm Incorporated | Z-axis determination in a 2D gesture system |
Also Published As
Publication number | Publication date |
---|---|
KR20140060297A (ko) | 2014-05-19 |
TW201324258A (zh) | 2013-06-16 |
JP2013069273A (ja) | 2013-04-18 |
EP2755115A4 (en) | 2015-05-06 |
US20140225826A1 (en) | 2014-08-14 |
EP2755115A1 (en) | 2014-07-16 |
CN103797446A (zh) | 2014-05-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013035554A1 (ja) | Method for detecting motion of input body and input device using same | |
US9207773B1 (en) | Two-dimensional method and system enabling three-dimensional user interaction with a device | |
US8723789B1 (en) | Two-dimensional method and system enabling three-dimensional user interaction with a device | |
US20210181857A1 (en) | Systems and Methods of Creating a Realistic Displacement of a Virtual Object in Virtual Reality/Augmented Reality Environments | |
US8971565B2 (en) | Human interface electronic device | |
US9600078B2 (en) | Method and system enabling natural user interface gestures with an electronic system | |
US10234941B2 (en) | Wearable sensor for tracking articulated body-parts | |
JP5658500 (ja) | Information processing apparatus and control method thereof | |
CN105589607 (zh) | Touch system, touch display system, and touch interaction method | |
TW201508561 (zh) | Speckle sensing for motion tracking | |
WO2017147748 (zh) | Gesture control method for a wearable system, and wearable system | |
JP2004094653 (ja) | Information input system | |
JP2014186715 (ja) | Information processing apparatus and information processing method | |
JP4608326 (ja) | Pointing motion recognition device and pointing motion recognition program | |
JP6528964 (ja) | Input operation detection device, image display device, projector device, projector system, and input operation detection method | |
JPH08328735 (ja) | Hand pointing input device | |
JP2017219942 (ja) | Contact detection device, projector device, electronic blackboard device, digital signage device, projector system, contact detection method, program, and storage medium | |
JP6643825 (ja) | Apparatus and method | |
JP2018018308 (ja) | Information processing apparatus, control method therefor, and computer program | |
TWI471757 (zh) | Hover and click gesture detection device | |
TWI493382 (zh) | Hover and click gesture detection device | |
JP2006190212 (ja) | Three-dimensional position input device | |
JP6618301 (ja) | Information processing apparatus, control method thereof, program, and storage medium | |
JP2018063555 (ja) | Information processing apparatus, information processing method, and program | |
KR101695727 (ko) | Position detection system and position detection method using stereo vision | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12830488 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14342586 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20147005970 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012830488 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |