Bose et al., 2023 - Google Patents
In-situ enhanced anchor-free deep CNN framework for a high-speed human-machine interaction
- Document ID
- 16477739218966411041
- Author
- Bose S
- Kumar V
- Sreekar C
- Publication year
- 2023
- Publication venue
- Engineering Applications of Artificial Intelligence
Snippet
Abstract Human-Robot Interaction (HRI) constitutes a demanding research field that integrates artificial intelligence, informatics, robotics, engineering, and human-machine interaction. In the present era, there is an increased focus on natural user interfaces, with …
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
- G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N99/00—Subject matter not provided for in other groups of this subclass
Similar Documents
| Publication | Title |
|---|---|
| Kumar et al. | Hand data glove: A new generation real-time mouse for human-computer interaction |
| Lu et al. | Gesture recognition using data glove: An extreme learning machine method |
| Bose et al. | In-situ enhanced anchor-free deep CNN framework for a high-speed human-machine interaction |
| Zubrycki et al. | Using integrated vision systems: three gears and leap motion, to control a 3-finger dexterous gripper |
| Yadav et al. | Gesture recognition system for human-computer interaction using computer vision |
| Parimala et al. | Real-time brightness, contrast and the volume control with hand gesture using open CV python |
| Julian et al. | Gesture-driven virtual mouse: empowering accessibility through hand movements |
| Liu et al. | Multimodal human-robot collaboration: advancements and future directions |
| Xia et al. | Large vision-language models enabled novel objects 6D pose estimation for human-robot collaboration |
| Pajor et al. | Kinect sensor implementation in FANUC robot manipulation |
| Patra et al. | Human-Machine Interaction in Industrial Automation: Gesture-Based PLC Control |
| Charan et al. | Controlling powerpoint presentation using hand gestures in real-time |
| Sumukha et al. | Gesture controlled 6 DoF manipulator with custom gripper for pick and place operation using ROS2 framework |
| Fei et al. | Seamless robot teleoperation: Intuitive control through hand gestures and neural network decoding |
| TK et al. | Real-Time Virtual Mouse using Hand Gestures for Unconventional Environment |
| Xie et al. | One-Handed Wonders: A Remote Control Method Based on Hand Gesture for Mobile Manipulator |
| Alvarado García et al. | Gesture-Based Control of an OMRON Viper 650 Robot |
| Chen et al. | A generic framework for the design of visual-based gesture control interface |
| Jeevan et al. | AI Virtual Mouse Using Hand Gestures |
| Torielli et al. | An intuitive tele-collaboration interface exploring laser-based interaction and behavior trees |
| Darshan et al. | Gesture-Controlled Virtual Mouse |
| Jain et al. | Integrating Hand Gesture Tracking with Physical and Digital Twins of Robotic Arm |
| Cobbina et al. | Gesture Recognition Through Object Detection for Efficient Human-Robot Collaboration |
| Yadav et al. | Hand Gesture Recognition Automation System |
| Kumar et al. | Hand Gesture Controlled White Board |