
CN101799717A - Man-machine interaction method based on hand action catch - Google Patents

Man-machine interaction method based on hand action catch

Info

Publication number
CN101799717A
CN101799717A (application CN 201010118444 / CN 201010118444A)
Authority
CN
China
Prior art keywords
hand
mark
mouse
image
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 201010118444
Other languages
Chinese (zh)
Inventor
何明霞
宁福星
李萌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN 201010118444 priority Critical patent/CN101799717A/en
Publication of CN101799717A publication Critical patent/CN101799717A/en
Pending legal-status Critical Current

Links

Images

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract


The invention belongs to the fields of visual measurement technology and computer technology, and relates to a human-computer interaction method based on hand motion capture, comprising the following steps: affixing specific markers to two different parts of the hand and placing the hand within the camera's field of view; initializing the program, including initializing the video configuration and reading the marker sample library and camera parameters; acquiring video images, detecting the markers, and identifying the different marker samples; calculating the three-dimensional coordinates of each detected marker's center point in the marker coordinate system; and controlling the mouse according to changes in the coordinate values of the different markers. The invention uses three-dimensional information to realize the basic functions of a mouse, and features a simple structure, low cost, and non-contact measurement that frees the user from communication cables. It can be used in stereo vision, virtual reality, and augmented reality systems to develop consumer-grade products, and also provides a new technique for developing novel human-computer interaction and control modes.


Description

Human-computer interaction method based on hand motion capture
Technical field
The present invention relates to a method of realizing human-computer interaction by capturing hand motion, and belongs to the fields of visual measurement technology and computer technology.
Background technology
Constrained by mainstream display devices and application software, traditional human-computer interaction is usually limited to operations in a two-dimensional plane. With the emergence of concepts such as stereo vision, virtual reality, and augmented reality, users prefer to interact with operated objects in more natural ways within more imaginative virtual environments, producing a sense of "immersion" comparable to a real environment; data-glove technology has therefore been developed and applied. A data glove can track the user's flexible, changing gestures and their spatial orientation, allowing the operator to convey intent to the computer naturally and intuitively. Several research institutions at home and abroad have done extensive work on the research and development of data gloves and have released data-glove products and applications. Data-glove products must use precision sensors to acquire their measurements, and differences among the sensors used lead to large variations in the two key specifications of measurement accuracy and maximum sampling frequency. Data gloves are mostly made of expensive elastic fiber, must be worn on the operator's hand, and must remain connected to the computer by a communication cable during use, typically over an RS232 interface with a maximum bit rate of 115.2 kbps. These products are expensive and used mainly for scientific research; no consumer-grade product has yet been released.
Summary of the invention
The object of the present invention is to address the shortcomings of the traditional interaction mode (the conventional mouse), which uses two-dimensional information and is ill-suited to novel display technologies (such as 3D projection and 3D display), and of data gloves, which are complex, expensive, and tethered by cables, by proposing a human-computer interaction mode based on hand motion capture, i.e., a vision mouse. Using non-contact vision measurement, the method acquires three-dimensional information about specific markers on the operating hand and, from the changes in the markers' coordinates, realizes human-computer interaction with the functions of a conventional mouse.
The present invention adopts the following technical scheme:
A human-computer interaction method based on hand motion capture, in which a camera captures images of a hand bearing markers and sends them to a computer that stores marker sample parameters and camera characteristic parameters, comprising the following steps:
(1) affixing different markers to at least two independently movable parts of the hand, and placing the hand within the camera's field of view;
(2) using the camera to continuously capture a sequence of hand images containing all markers, and sending it to the computer;
(3) detecting the markers and distinguishing the different markers according to the stored marker sample parameters;
(4) calculating, from the captured hand images and the camera characteristic parameters, the three-dimensional coordinates of each detected marker's center point in the marker coordinate system;
(5) setting a threshold for the change of a marker's coordinate values between a hand image and a subsequent one, and controlling the mouse according to whether each marker's coordinate change exceeds the set threshold and to the direction of the change.
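The five steps above reduce to a capture loop that tracks per-marker coordinate changes between frames. The sketch below is an illustration only (the patent publishes no source code): it abstracts steps (2)-(4) into pre-computed per-frame marker coordinates, and the function and event names are hypothetical.

```python
def interaction_loop(frames, threshold):
    """frames: iterable of {marker_id: (x, y, z)} dicts, i.e. the output of
    steps (2)-(4); returns (marker_id, delta) events whenever a marker's
    coordinate change between consecutive frames exceeds threshold (step 5)."""
    events = []
    prev = {}  # last known 3-D coordinates per marker id
    for markers in frames:
        for mid, xyz in markers.items():
            if mid in prev:
                delta = tuple(c - p for c, p in zip(xyz, prev[mid]))
                if any(abs(d) > threshold for d in delta):
                    events.append((mid, delta))
            prev[mid] = xyz
    return events
```

In a real system each event would be translated into a cursor move or button action according to which marker moved and in which direction.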
In a preferred implementation of the human-computer interaction method of the present invention, step (3) comprises the following steps:
(1) setting a binarization threshold and converting each captured hand image into a binary image;
(2) performing connected-domain analysis on the binary image, and searching for and identifying all image regions with edge features;
(3) comparing each identified image region having edge features with each marker sample's parameters, to identify the different markers.
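A minimal sketch of the binarization and connected-domain analysis described above, using a pure-stdlib BFS labelling (the patent does not specify the implementation; function names are mine, and real marker detection would additionally test each region for the square edge feature):

```python
from collections import deque

def binarize(gray, thresh):
    """Step (1): threshold a grayscale image (nested lists) to 0/1."""
    return [[1 if px >= thresh else 0 for px in row] for row in gray]

def connected_regions(binary):
    """Step (2): 4-connected component labelling via BFS; returns one set of
    (row, col) pixels per foreground region."""
    h, w = len(binary), len(binary[0])
    seen, regions = set(), []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and (y, x) not in seen:
                q, region = deque([(y, x)]), set()
                seen.add((y, x))
                while q:
                    cy, cx = q.popleft()
                    region.add((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            q.append((ny, nx))
                regions.append(region)
    return regions
```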
Suppose two different markers, A and B, are affixed to the hand and the captured hand images are plane images of the hand; step (5) then comprises the following steps:
(1) computing the three-dimensional coordinates of markers A and B in two adjacent frames and obtaining the change in each marker's coordinates; judging whether marker A's coordinate change in the left-right or fore-aft direction of the hand's operation plane exceeds the change threshold, and if so, calling the mouse_event function of the Windows API library to move the cursor, from its current position, in a direction and by a number of pixels determined by the coordinate change;
(2) if marker A has not moved, judging whether marker B's coordinate change in the up-down direction relative to the operation plane exceeds the change threshold, and if so, calling the mouse_event function of the Windows API to generate a right-button operation;
(3) if marker B has not performed a right-button operation, judging whether marker B's coordinate change on the operation plane exceeds the change threshold, and if so, calling the mouse_event function of the Windows API library to generate a left-button operation.
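The three-way decision above can be sketched as a pure function over the two markers' per-frame coordinate changes. This is an illustration, not the patented code: the function name and the single shared threshold are mine, and note that the embodiment section of this document maps the buttons the other way around (z-motion to left-click, in-plane motion to right-click).

```python
def classify(delta_a, delta_b, thresh):
    """Map the coordinate changes of markers A and B to a mouse action,
    following the order of the claimed steps: cursor movement has priority,
    then B's up-down (z) motion -> right button, then B's in-plane motion
    -> left button."""
    dax, day, _ = delta_a            # A: left-right (x), fore-aft (y)
    dbx, dby, dbz = delta_b          # B: in-plane (x, y), up-down (z)
    if abs(dax) > thresh or abs(day) > thresh:
        return ("move", dax, day)
    if abs(dbz) > thresh:
        return ("right_click",)
    if abs(dbx) > thresh or abs(dby) > thresh:
        return ("left_click",)
    return ("idle",)
```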
The interaction mode of the present invention uses vision techniques to obtain three-dimensional coordinates; limited by the camera's sampling rate and the program's running time, the maximum sampling frequency is 15.6 Hz. Cursor control accuracy reaches 2 pixels, the error rate of button control is 4%, the method realizes cursor movement control and button control and can thus perform routine computer operations (browsing web pages, file operations, etc.), and the program occupies 40 MB of system memory while running. The present invention uses three-dimensional information to realize the basic functions of a mouse, and features a simple structure, low cost, and non-contact measurement free of cable constraints.
Description of drawings
Fig. 1 shows the markers used by the present invention: (a) the back-of-hand marker; (b) the index-fingertip marker;
Fig. 2 shows the pinhole imaging model used by the present invention;
Fig. 3 is the flow chart of the mouse control program of the present invention;
Fig. 4 is the flow chart of the cursor positioning function of the present invention;
Fig. 5 is the flow chart of the mouse button control function of the present invention.
Embodiment
The invention provides a human-computer interaction method that uses a camera in place of dedicated fiber-optic or piezoelectric sensors to obtain three-dimensional information about the operator's hand. Only specific markers need be affixed to the back of the hand and a fingertip; the non-contact measurement frees the operator from gloves and communication cables and improves the comfort and freedom of the operator's hand. The present invention can perform the basic functions of a traditional mouse: the capture-based mouse control method can be used together with a mouse or independently without one, can fully realize all routine computer operations, and gave satisfactory results in experimental tests.
The invention is further described below with reference to the accompanying drawings and an example.
The peripherals of the present invention consist of a camera, the specific markers, a fixed support, an image data cable, and a computer. The camera selected is a Logitech QuickCam mini HD-type webcam with autofocus and a Carl Zeiss lens; its technical parameters are a 2-megapixel CCD image sensor and a sampling rate of 30 frames/s. The camera is fixed on the support, aimed at the plane on which the hand operates. The markers used are shown in Fig. 1: markers A and B are affixed to the back of the hand and the tip of the index finger respectively, customarily on the right hand. Each marker is of the black-square-frame type, and markers are distinguished by the pattern at their center. Marker A (back of hand) has a side length of 5 cm and marker B (fingertip) a side length of 1.5 cm; the center pattern of marker A is a circle, and that of marker B is a right triangle.
The relationship between the effective tracking range (the maximum distance between the camera and the markers' plane at which the markers can still be captured) and marker size was verified experimentally; this interaction mode's ability to capture the hand markers depends closely on marker size. Markers of four sizes (7 cm, 9 cm, 11 cm, and 18 cm) were used to obtain the effective capture range for each size. The experiments support the conclusion that the larger the marker, the larger the effective tracking range; however, because of the field-of-view limit the relationship is not linear, and the gain in tracking range diminishes as marker size increases. Since the markers must fit on the back of the hand and a fingertip, the sizes of markers A and B were finally set at 5 cm and 1.5 cm respectively; both can be captured reliably within a tracking range of 25 cm. The support height was therefore set at 30 cm, placing the camera 25 cm from the markers' plane.
The present invention uses the Microsoft Visual C++ 6.0 integrated development environment and, with the above hardware and program, realizes the functions of mouse operation. Fig. 3 is the flow chart of mouse operation. The concrete steps are as follows:
(1) Program initialization, including initializing the video configuration and reading the marker sample parameters and camera characteristic parameters. The video configuration comprises the captured image size, sampling frame rate, and color space, set to 640*480, 30 frames/s, and RGB 24 respectively.
(2) Acquire video images, detect the markers, and identify the different marker samples. A single camera captures images containing all markers and inputs them to the system; with a binarization threshold of 100, each captured color image is converted into a binary image. Connected-domain analysis is performed on the binary image, searching for and identifying all image regions with square edge features. A template matching algorithm compares the extracted image regions with the markers in the marker sample library; a region whose matching score exceeds the set parameter is taken to be one of the different markers used by the system.
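The template matching step can be illustrated with a much-simplified score over same-sized binary patches (an assumption-laden sketch: the patent does not disclose its matching algorithm, and a real fiducial-marker matcher would also normalize for rotation, scale, and perspective; all names here are mine):

```python
def match_score(region, template):
    """Fraction of pixels on which a candidate binary region agrees with a
    stored marker template of the same size."""
    h, w = len(template), len(template[0])
    hits = sum(1 for y in range(h) for x in range(w)
               if region[y][x] == template[y][x])
    return hits / (h * w)

def identify(region, samples, min_score=0.9):
    """Return the id of the best-matching marker sample whose score reaches
    min_score, or None -- mirroring 'a region whose matching score exceeds
    the set parameter is taken to be a system marker'."""
    best_id, best = None, min_score
    for mid, tpl in samples.items():
        s = match_score(region, tpl)
        if s >= best:
            best_id, best = mid, s
    return best_id
```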
(3) Calculate the three-dimensional coordinates of each detected marker's center point in the marker coordinate system. Using the pinhole imaging model, a marker coordinate system Om-XmYmZm, a camera coordinate system Oc-XcYcZc, and an ideal screen coordinate system Ou-XuYu are established; the relationships between the coordinate systems are shown in Fig. 2. From the ideal screen coordinates (xu, yu) of a marker's center point and the transformation matrix Tcm (from the marker coordinate system to the camera coordinate system), the three-dimensional coordinates (xm, ym, zm) of each marker's center point in the marker coordinate system are calculated.
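The forward direction of this model can be sketched as follows: a 4x4 homogeneous transform Tcm maps marker-frame points into the camera frame, and the ideal pinhole projection maps camera-frame points to screen coordinates. This is an illustration under stated assumptions (the focal length and matrix values are made up; the patent's actual computation runs the other way, recovering Tcm and the marker coordinates from the detected image):

```python
def to_camera(T_cm, p_m):
    """Transform a marker-frame point (x, y, z) into the camera frame.
    T_cm is a 4x4 homogeneous matrix given as row-major nested lists."""
    x, y, z = p_m
    v = (x, y, z, 1.0)
    return [sum(T_cm[r][c] * v[c] for c in range(4)) for r in range(3)]

def project(p_c, f):
    """Ideal pinhole projection onto the screen plane at focal length f."""
    x, y, z = p_c
    return (f * x / z, f * y / z)
```

For example, with the camera 25 cm from the marker plane (as in the setup above) and an illustrative focal length of 500, a 5 cm lateral offset of the marker projects to a 100-unit screen offset.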
(4) According to the changes in the coordinate values of the different markers, call the mouse_event function of the Windows API library to control the mouse; the flow of the cursor positioning operation is shown in Fig. 4. The three-dimensional coordinates of marker A are recorded and the change between two adjacent frames is calculated. If marker A's coordinate change in the x direction (left-right on the operation plane) or y direction (fore-aft on the operation plane) exceeds the threshold of 10, the SetCursorPos(xvalue, yvalue) function of the Windows API library is called to move the cursor from its current position in the corresponding direction by the corresponding number of pixels. The number of pixels moved is linear in the coordinate change, with a scale factor settable in the program.
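The thresholded linear mapping from marker A's in-plane change to a cursor displacement can be sketched as below (the gain k is the program-settable scale factor mentioned above; its value here is illustrative, as is the function name):

```python
def cursor_step(delta, k=20, thresh=10):
    """Map marker A's in-plane coordinate change (dx, dy) to a cursor
    displacement in pixels: suppressed below thresh, linear with gain k
    above it."""
    dx, dy = delta
    if abs(dx) <= thresh and abs(dy) <= thresh:
        return (0, 0)
    return (round(k * dx), round(k * dy))
```

The returned displacement would then be added to the current cursor position before the SetCursorPos-style call.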
(5) If marker A has not moved, judge whether marker B's coordinate change in the z direction (up-down relative to the operation plane) exceeds the threshold of 20; if so, call the mouse_event function of the Windows API library to generate a left-button operation. If marker B has not performed a left-button operation, judge whether its coordinate change in the x-y plane (the hand's operation plane) exceeds the set threshold; if so, call the mouse_event function of the Windows API library to generate a right-button operation. The flow chart of the left- and right-button operations is shown in Fig. 5.
(6) Close video capture; the program ends.
Finally, through hand-marker capture interaction experiments, several important technical parameters of this motion-capture-based human-computer interaction technique were quantified to evaluate its practical effect. The tests show a maximum sampling frequency of 15.6 Hz, a cursor control accuracy of 2 pixels, a button-control error rate of 4%, and 40 MB of system memory occupied at run time; the technique realizes cursor movement and button control and can perform routine computer operations (browsing web pages, file operations, etc.).
Because the system is simple, the present invention greatly reduces cost and favors the consumer commercialization of the technology. Unlike traditional human-computer interaction devices, this interaction mode uses three-dimensional spatial information to realize interactive operation; as concepts such as stereo vision, virtual reality, and augmented reality continue to be proposed, this interaction mode has notable advantages and will see wider application and longer-term development.

Claims (3)

1. A human-computer interaction method based on hand motion capture, in which a camera captures images of a hand bearing markers and sends them to a computer that stores marker sample parameters and camera characteristic parameters, comprising the following steps: (1) affixing different markers to at least two independently movable parts of the hand, and placing the hand within the camera's field of view; (2) using the camera to continuously capture a sequence of hand images containing all markers, and sending it to the computer; (3) detecting the markers and distinguishing the different markers according to the stored marker sample parameters; (4) calculating, from the captured hand images and the camera characteristic parameters, the three-dimensional coordinates of each detected marker's center point in the marker coordinate system; (5) setting a threshold for the change of a marker's coordinate values between a hand image and a subsequent one, and controlling the mouse according to whether each marker's coordinate change exceeds the set threshold and to the direction of the change.

2. The human-computer interaction method based on hand motion capture according to claim 1, wherein step (3) comprises the following steps: (1) setting a binarization threshold and converting each captured hand image into a binary image; (2) performing connected-domain analysis on the binary image, and searching for and identifying all image regions with edge features; (3) comparing each identified image region having edge features with each marker sample's parameters, to identify the different markers.

3. The human-computer interaction method based on hand motion capture according to claim 1, wherein two different markers A and B are affixed to the hand and the captured hand images are plane images of the hand, and wherein step (5) comprises the following steps: (1) computing the three-dimensional coordinates of markers A and B in two adjacent frames and obtaining the change in each marker's coordinates; judging whether marker A's coordinate change in the left-right or fore-aft direction of the hand's operation plane exceeds the change threshold, and if so, calling the mouse_event function of the Windows API library to control, according to the coordinate change, the direction and number of pixels of the cursor's movement from its current position; (2) if marker A has not moved, judging whether marker B's coordinate change in the up-down direction relative to the operation plane exceeds the change threshold, and if so, calling the mouse_event function of the Windows API to generate a right-button operation; (3) if marker B has not performed a right-button operation, judging whether marker B's coordinate change on the operation plane exceeds the change threshold, and if so, calling the mouse_event function of the Windows API library to generate a left-button operation.
CN 201010118444 2010-03-05 2010-03-05 Man-machine interaction method based on hand action catch Pending CN101799717A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010118444 CN101799717A (en) 2010-03-05 2010-03-05 Man-machine interaction method based on hand action catch

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010118444 CN101799717A (en) 2010-03-05 2010-03-05 Man-machine interaction method based on hand action catch

Publications (1)

Publication Number Publication Date
CN101799717A true CN101799717A (en) 2010-08-11

Family

ID=42595416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010118444 Pending CN101799717A (en) 2010-03-05 2010-03-05 Man-machine interaction method based on hand action catch

Country Status (1)

Country Link
CN (1) CN101799717A (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101923433A (en) * 2010-08-17 2010-12-22 北京航空航天大学 A Human-Computer Interaction Method Based on Hand Shadow Recognition
CN102156808A (en) * 2011-03-30 2011-08-17 北京触角科技有限公司 System and method for improving try-on effect of reality real-time virtual ornament
CN102411453A (en) * 2010-09-21 2012-04-11 北京市通州区科学技术协会 Method and device for enhancing outdoor practicability of virtual touch screen system
CN102662462A (en) * 2012-03-12 2012-09-12 中兴通讯股份有限公司 Electronic device, gesture recognition method and gesture application method
CN103135882A (en) * 2011-12-02 2013-06-05 深圳泰山在线科技有限公司 Method and system for control of display of window image
CN103389793A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Human-computer interaction method and human-computer interaction system
CN103699213A (en) * 2012-12-19 2014-04-02 苏州贝腾特电子科技有限公司 Control method for double click of virtual mouse
CN103941861A (en) * 2014-04-02 2014-07-23 北京理工大学 Multi-user cooperation training system adopting mixed reality technology
CN104199548A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN104199549A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN104199547A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN106293078A (en) * 2016-08-02 2017-01-04 福建数博讯信息科技有限公司 Virtual reality exchange method based on photographic head and device
CN106796649A (en) * 2014-05-24 2017-05-31 远程信息技术发展中心 Gesture-based human machine interface using markers
CN107341818A (en) * 2016-04-29 2017-11-10 北京博酷科技有限公司 Image analysis algorithm for the test of touch-screen response performance
CN107450714A (en) * 2016-05-31 2017-12-08 大唐电信科技股份有限公司 Man-machine interaction support test system based on augmented reality and image recognition
CN108523281A (en) * 2017-03-02 2018-09-14 腾讯科技(深圳)有限公司 Gloves peripheral hardware, method, apparatus and system for virtual reality system
CN109243575A (en) * 2018-09-17 2019-01-18 华南理工大学 A kind of virtual acupuncture-moxibustion therapy method and system based on mobile interaction and augmented reality
US10262197B2 (en) 2015-11-17 2019-04-16 Huawei Technologies Co., Ltd. Gesture-based object measurement method and apparatus
WO2019134606A1 (en) * 2018-01-05 2019-07-11 Oppo广东移动通信有限公司 Terminal control method, device, storage medium, and electronic apparatus
WO2022126775A1 (en) * 2020-12-14 2022-06-23 安徽鸿程光电有限公司 Cursor control method and apparatus, device and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5825352A (en) * 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
CN1664755A (en) * 2005-03-11 2005-09-07 西北工业大学 A video recognition input system
EP1852774A2 (en) * 2006-05-03 2007-11-07 Mitsubishi Electric Corporation Method and system for emulating a mouse on a multi-touch sensitive surface
CN101344816A (en) * 2008-08-15 2009-01-14 华南理工大学 Human-computer interaction method and device based on gaze tracking and gesture recognition

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5825352A (en) * 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
CN1664755A (en) * 2005-03-11 2005-09-07 西北工业大学 A video recognition input system
EP1852774A2 (en) * 2006-05-03 2007-11-07 Mitsubishi Electric Corporation Method and system for emulating a mouse on a multi-touch sensitive surface
CN101344816A (en) * 2008-08-15 2009-01-14 华南理工大学 Human-computer interaction method and device based on gaze tracking and gesture recognition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
* China Masters' Theses Full-text Database, 2008-08: "Research on interaction technology for hand capture of virtual objects", pp. 12-43 (relevant to claims 1-3)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101923433A (en) * 2010-08-17 2010-12-22 北京航空航天大学 A Human-Computer Interaction Method Based on Hand Shadow Recognition
CN102411453A (en) * 2010-09-21 2012-04-11 北京市通州区科学技术协会 Method and device for enhancing outdoor practicability of virtual touch screen system
CN102156808A (en) * 2011-03-30 2011-08-17 北京触角科技有限公司 System and method for improving try-on effect of reality real-time virtual ornament
CN103135882A (en) * 2011-12-02 2013-06-05 深圳泰山在线科技有限公司 Method and system for control of display of window image
CN103135882B (en) * 2011-12-02 2016-08-03 深圳泰山体育科技股份有限公司 Control the method and system that window picture shows
CN102662462B (en) * 2012-03-12 2016-03-30 中兴通讯股份有限公司 Electronic installation, gesture identification method and gesture application process
CN102662462A (en) * 2012-03-12 2012-09-12 中兴通讯股份有限公司 Electronic device, gesture recognition method and gesture application method
CN103389793A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Human-computer interaction method and human-computer interaction system
CN103389793B (en) * 2012-05-07 2016-09-21 深圳泰山在线科技有限公司 Man-machine interaction method and system
CN103699213A (en) * 2012-12-19 2014-04-02 苏州贝腾特电子科技有限公司 Control method for double click of virtual mouse
CN103941861B (en) * 2014-04-02 2017-02-08 北京理工大学 Multi-user cooperation training system adopting mixed reality technology
CN103941861A (en) * 2014-04-02 2014-07-23 北京理工大学 Multi-user cooperation training system adopting mixed reality technology
CN106796649A (en) * 2014-05-24 2017-05-31 远程信息技术发展中心 Gesture-based human machine interface using markers
CN104199547A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN104199549A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN104199548A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN104199547B (en) * 2014-08-29 2017-05-17 福州瑞芯微电子股份有限公司 Virtual touch screen operation device, system and method
CN104199548B (en) * 2014-08-29 2017-08-25 福州瑞芯微电子股份有限公司 A kind of three-dimensional man-machine interactive operation device, system and method
CN104199549B (en) * 2014-08-29 2017-09-26 福州瑞芯微电子股份有限公司 A kind of virtual mouse action device, system and method
US10262197B2 (en) 2015-11-17 2019-04-16 Huawei Technologies Co., Ltd. Gesture-based object measurement method and apparatus
CN107341818A (en) * 2016-04-29 2017-11-10 北京博酷科技有限公司 Image analysis algorithm for the test of touch-screen response performance
CN107450714A (en) * 2016-05-31 2017-12-08 大唐电信科技股份有限公司 Man-machine interaction support test system based on augmented reality and image recognition
CN106293078A (en) * 2016-08-02 2017-01-04 福建数博讯信息科技有限公司 Virtual reality exchange method based on photographic head and device
CN108523281A (en) * 2017-03-02 2018-09-14 腾讯科技(深圳)有限公司 Gloves peripheral hardware, method, apparatus and system for virtual reality system
WO2019134606A1 (en) * 2018-01-05 2019-07-11 Oppo广东移动通信有限公司 Terminal control method, device, storage medium, and electronic apparatus
CN109243575A (en) * 2018-09-17 2019-01-18 华南理工大学 A kind of virtual acupuncture-moxibustion therapy method and system based on mobile interaction and augmented reality
CN109243575B (en) * 2018-09-17 2022-04-22 华南理工大学 Virtual acupuncture method and system based on mobile interaction and augmented reality
WO2022126775A1 (en) * 2020-12-14 2022-06-23 安徽鸿程光电有限公司 Cursor control method and apparatus, device and medium

Similar Documents

Publication Publication Date Title
CN101799717A (en) Man-machine interaction method based on hand action catch
CN104460951A (en) Human-computer interaction method
CN112926423B (en) Pinch gesture detection and recognition method, device and system
Lee et al. Handy AR: Markerless inspection of augmented reality objects using fingertip tracking
JP4768196B2 (en) Apparatus and method for pointing a target by image processing without performing three-dimensional modeling
US6147678A (en) Video hand image-three-dimensional computer interface with multiple degrees of freedom
CN103135758B (en) Realize the method and system of shortcut function
CN102999152B (en) A kind of gesture motion recognition methods and system
US9135513B2 (en) Image processing apparatus and method for obtaining position and orientation of imaging apparatus
CN103941866B (en) Three-dimensional gesture recognizing method based on Kinect depth image
US20180004772A1 (en) Method and apparatus for identifying input features for later recognition
WO2021136386A1 (en) Data processing method, terminal, and server
CN102508578B (en) Projection positioning device and method as well as interaction system and method
JP7379065B2 (en) Information processing device, information processing method, and program
CN106547356B (en) Intelligent interaction method and device
CN102096471B (en) Human-computer interaction method based on machine vision
CN104364733A (en) Position-of-interest detection device, position-of-interest detection method, and position-of-interest detection program
JP2013250882A5 (en)
CN108027656B (en) Input device, input method, and program
CN104246664B (en) The transparent display virtual touch device of pointer is not shown
CN103925879A (en) Indoor robot vision hand-eye relation calibration method based on 3D image sensor
CN101477631A (en) Method, equipment for extracting target from image and human-machine interaction system
CN108022264A (en) Camera pose determines method and apparatus
CN113487674B (en) Human body pose estimation system and method
CN111399634B (en) Method and device for gesture-guided object recognition

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20100811