Shin et al., 2018 - Google Patents
EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm
- Document ID
- 10792337402976621539
- Author
- Shin S
- Tafreshi R
- Langari R
- Publication year
- 2018
- Publication venue
- Journal of Intelligent & Fuzzy Systems
Snippet
This study focuses on a myoelectric interface that controls a robotic manipulator via neuromuscular electrical signals generated when humans make hand gestures. The proposed system recognizes dynamic hand motions, which change shapes, poses, and …
Classifications
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
- G06K9/6268—Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches
- G06K9/6217—Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
- G06N99/005—Learning machines, i.e. computers in which a programme is changed according to experience gained by the machine itself during a complete run
- G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
- G06N5/00—Computer systems utilising knowledge based models
- G06N3/00—Computer systems based on biological models
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
Similar Documents
| Publication | Title |
|---|---|
| Chen et al. | Real-time multi-modal human–robot collaboration using gestures and speech |
| Wang et al. | Controlling object hand-over in human–robot collaboration via natural wearable sensing |
| Moon et al. | Multiple kinect sensor fusion for human skeleton tracking using Kalman filtering |
| Luzhnica et al. | A sliding window approach to natural hand gesture recognition using a custom data glove |
| Du et al. | Markerless kinect-based hand tracking for robot teleoperation |
| Shin et al. | EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm |
| Martínez-Villaseñor et al. | A concise review on sensor signal acquisition and transformation applied to human activity recognition and human–robot interaction |
| Simão et al. | Unsupervised gesture segmentation by motion detection of a real-time data stream |
| LaViola Jr | Context aware 3D gesture recognition for games and virtual reality |
| LaViola Jr | An introduction to 3D gestural interfaces |
| Pan et al. | Automated detection of handovers using kinematic features |
| Pareek et al. | MyoTrack: Real-time estimation of subject participation in robotic rehabilitation using sEMG and IMU |
| Noh et al. | A decade of progress in human motion recognition: A comprehensive survey from 2010 to 2020 |
| Ganguly et al. | Kinect Sensor Based Single Person Hand Gesture Recognition for Man–Machine Interaction |
| Aronson et al. | Inferring goals with gaze during teleoperated manipulation |
| Wu et al. | Beyond remote control: Exploring natural gesture inputs for smart TV systems |
| Villani et al. | A general pipeline for online gesture recognition in human–robot interaction |
| Bature et al. | Boosted gaze gesture recognition using underlying head orientation sequence |
| Jung et al. | Touch gesture recognition-based physical human–robot interaction for collaborative tasks |
| Pięta et al. | Automated classification of virtual reality user motions using a motion atlas and machine learning approach |
| Jindal et al. | A comparative analysis of established techniques and their applications in the field of gesture detection |
| Tani et al. | A gesture interface for radiological workstations |
| Baskaran et al. | Multi-dimensional task recognition for human-robot teaming: literature review |
| Savaş et al. | Hand gesture recognition with two stage approach using transfer learning and deep ensemble learning |
| Ahmed Al-mashhadani et al. | Human-Animal Affective Robot Touch Classification Using Deep Neural Network |