Kim et al., 2025 - Google Patents

Developing Brain-Based Bare-Handed Human–Machine Interaction via On-Skin Input

Document ID
6178889019722988368
Author
Kim M
Shin H
Cho J
Lee S
Publication year
2025
Publication venue
IEEE Transactions on Cybernetics

Snippet

Developing natural, intuitive, and human-centric input systems for mobile human-machine interaction (HMI) poses significant challenges. Existing gaze or gesture-based interaction systems are often constrained by their dependence on continuous visual engagement …
Continue reading at ieeexplore.ieee.org

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING; COUNTING
    • G06F — ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 — Eye tracking input arrangements
    • G06F3/014 — Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/015 — Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 — Pointing devices displaced or positioned by the user, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 — Touch pads, in which fingers can move on a surface
    • G06F3/041 — Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means
    • G06F2203/00 — Indexing scheme relating to G06F3/00 - G06F3/048
    • G06K — RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 — Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00335 — Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
    • G06K9/00355 — Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 — Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/04 — Detecting, measuring or recording bioelectric signals of the body or parts thereof
    • A61B5/0402 — Electrocardiography, i.e. ECG
    • A61B5/0476 — Electroencephalography
    • A61B5/16 — Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 — Evaluating the state of mind, e.g. depression, anxiety

Similar Documents

Publication · Publication Date · Title
US11493993B2 (en) Systems, methods, and interfaces for performing inputs based on neuromuscular control
US20230072423A1 (en) Wearable electronic devices and extended reality systems including neuromuscular sensors
Garcia-Moreno et al. A CNN-LSTM deep learning classifier for motor imagery EEG detection using a low-invasive and low-cost BCI headband
Ahsan et al. EMG signal classification for human computer interaction: a review
Scheirer et al. Expression glasses: a wearable device for facial expression recognition
Masai et al. Face commands-user-defined facial gestures for smart glasses
US20250095302A1 (en) Wearable Electronic Devices And Extended Reality Systems Including Neuromuscular Sensors
Pareek et al. Myotrack: Realtime estimation of subject participation in robotic rehabilitation using semg and imu
Yang et al. Design of virtual keyboard using blink control method for the severely disabled
CN113082448A (en) Virtual immersion type autism children treatment system based on electroencephalogram signal and eye movement instrument
Hasan, Rukaiya Khatun Moury, Nazimul Haque. Coordination between Visualization and Execution of Movements
Sommer et al. Classification of fNIRS finger tapping data with multi-labeling and deep learning
Suppiah et al. Fuzzy inference system (FIS)-long short-term memory (LSTM) network for electromyography (EMG) signal analysis
Hayashi et al. Human–machine interfaces based on bioelectric signals: a narrative review with a novel system proposal
Shen et al. Clenchclick: Hands-free target selection method leveraging teeth-clench for augmented reality
Tao et al. Review of electrooculography-based human-computer interaction: recent technologies, challenges and future trends
Blankertz et al. Detecting mental states by machine learning techniques: the berlin brain–computer interface
Ma et al. Using EEG artifacts for BCI applications
Tariq Revolutionizing Communication: EEG-Based Brain-Computer Interface for Speech and Mood Detection
Kim et al. Developing Brain-Based Bare-Handed Human–Machine Interaction via On-Skin Input
Šumak et al. Design and development of contactless interaction with computers based on the Emotiv EPOC+ device
Deng et al. HCI systems: Real-time Detection and Interaction based on EOG and IOG
Marinova et al. Deep learning for facial expression and human activity recognition using smart glasses
Xing et al. The development of EEG-based brain computer interfaces: potential and challenges
Pareek et al. MyoTrack: Tracking subject participation in robotic rehabilitation using sEMG and IMU