Kim et al., 2025 - Google Patents
Developing Brain-Based Bare-Handed Human–Machine Interaction via On-Skin Input
- Document ID
- 6178889019722988368
- Author
- Kim M
- Shin H
- Cho J
- Lee S
- Publication year
- 2025
- Publication venue
- IEEE Transactions on Cybernetics
Snippet
Developing natural, intuitive, and human-centric input systems for mobile human-machine interaction (HMI) poses significant challenges. Existing gaze or gesture-based interaction systems are often constrained by their dependence on continuous visual engagement …
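For orientation, the classifications below place this work under EEG-based input (G06F3/015) and gesture interaction (G06F3/017). A minimal sketch of the kind of pipeline that category implies is shown here: band-power EEG features fed to an LDA classifier, a common BCI baseline. The sampling rate, channel count, frequency bands, and gesture labels are illustrative assumptions, not details from Kim et al.

```python
# Hypothetical sketch of an EEG gesture-classification baseline.
# Nothing here is taken from Kim et al.; all parameters are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250                              # assumed sampling rate (Hz)
BANDS = [(4, 8), (8, 13), (13, 30)]   # theta, alpha, beta

def band_powers(epoch: np.ndarray) -> np.ndarray:
    """epoch: (n_channels, n_samples) -> flat vector of per-band log power."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    feats = [np.log(psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1))
             for lo, hi in BANDS]
    return np.concatenate(feats)

# Synthetic stand-in data: 120 two-second epochs, 8 channels, 3 classes
# (e.g. tap / swipe / rest as hypothetical on-skin inputs).
rng = np.random.default_rng(0)
X = np.stack([band_powers(rng.standard_normal((8, 2 * FS)))
              for _ in range(120)])
y = rng.integers(0, 3, size=120)

clf = LinearDiscriminantAnalysis()
print(cross_val_score(clf, X, y, cv=5).mean())
```

On real recordings one would replace the synthetic epochs with windowed EEG and expect above-chance accuracy; on the random data above the score should sit near 1/3.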
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/04—Detecting, measuring or recording bioelectric signals of the body or parts thereof
- A61B5/0476—Electroencephalography
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
- G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/04—Detecting, measuring or recording bioelectric signals of the body or parts thereof
- A61B5/0402—Electrocardiography, i.e. ECG
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
Similar Documents
| Publication | Title |
|---|---|
| US11493993B2 (en) | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
| US20230072423A1 (en) | Wearable electronic devices and extended reality systems including neuromuscular sensors |
| Garcia-Moreno et al. | A CNN-LSTM deep learning classifier for motor imagery EEG detection using a low-invasive and low-cost BCI headband |
| Ahsan et al. | EMG signal classification for human computer interaction: a review |
| Scheirer et al. | Expression glasses: a wearable device for facial expression recognition |
| Masai et al. | Face commands: user-defined facial gestures for smart glasses |
| US20250095302A1 (en) | Wearable electronic devices and extended reality systems including neuromuscular sensors |
| Pareek et al. | MyoTrack: Real-time estimation of subject participation in robotic rehabilitation using sEMG and IMU |
| Yang et al. | Design of virtual keyboard using blink control method for the severely disabled |
| CN113082448A (en) | Virtual immersive treatment system for children with autism based on electroencephalogram signals and an eye tracker |
| Hasan et al. | Coordination between Visualization and Execution of Movements |
| Sommer et al. | Classification of fNIRS finger tapping data with multi-labeling and deep learning |
| Suppiah et al. | Fuzzy inference system (FIS)-long short-term memory (LSTM) network for electromyography (EMG) signal analysis |
| Hayashi et al. | Human–machine interfaces based on bioelectric signals: a narrative review with a novel system proposal |
| Shen et al. | ClenchClick: Hands-free target selection method leveraging teeth-clench for augmented reality |
| Tao et al. | Review of electrooculography-based human-computer interaction: recent technologies, challenges and future trends |
| Blankertz et al. | Detecting mental states by machine learning techniques: the Berlin Brain–Computer Interface |
| Ma et al. | Using EEG artifacts for BCI applications |
| Tariq | Revolutionizing Communication: EEG-Based Brain-Computer Interface for Speech and Mood Detection |
| Kim et al. | Developing Brain-Based Bare-Handed Human–Machine Interaction via On-Skin Input |
| Šumak et al. | Design and development of contactless interaction with computers based on the Emotiv EPOC+ device |
| Deng et al. | HCI systems: Real-time Detection and Interaction based on EOG and IOG |
| Marinova et al. | Deep learning for facial expression and human activity recognition using smart glasses |
| Xing et al. | The development of EEG-based brain computer interfaces: potential and challenges |
| Pareek et al. | MyoTrack: Tracking subject participation in robotic rehabilitation using sEMG and IMU |