
Atienza et al., 2005 - Google Patents

Intuitive human-robot interaction through active 3d gaze tracking

Document ID: 5376426600689207052
Authors: R. Atienza, A. Zelinsky
Publication year: 2005
Publication venue: Robotics Research. The Eleventh International Symposium: With 303 Figures

Snippet

One of the biggest obstacles facing humans and robots is the lack of means for natural and meaningful interaction. Robots find it difficult to understand human intentions since our way of communication is different from the way machines exchange their information. Our aim is …
Continue reading at users.cecs.anu.edu.au (PDF)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00: Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00221: Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
    • G06K9/00268: Feature extraction; Face representation
    • G06K9/00281: Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/30: Nc systems
    • G05B2219/40: Robotics, robotics mapping to robotics vision
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00: Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/36: Image preprocessing, i.e. processing the image information without deciding about the identity of the image
    • G06K9/46: Extraction of features or characteristics of the image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00: Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00362: Recognising human body or animal bodies, e.g. vehicle occupant, pedestrian; Recognising body parts, e.g. hand
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30196: Human being; Person

Similar Documents

Publication | Publication Date | Title
US12131529B2 (en) Virtual teach and repeat mobile manipulation system
Mazhar et al. Towards real-time physical human-robot interaction using skeleton information and hand gestures
US10068135B2 (en) Face detection, identification, and tracking system for robotic devices
Atienza et al. Intuitive human-robot interaction through active 3d gaze tracking
Lee et al. Visual perception framework for an intelligent mobile robot
Kozamernik et al. Visual quality and safety monitoring system for human-robot cooperation
Fernández et al. A Kinect-based system to enable interaction by pointing in smart spaces
Lin et al. The implementation of augmented reality in a robotic teleoperation system
Yonemoto et al. Egocentric articulated pose tracking for action recognition
Huang et al. Human-to-robot handover control of an autonomous mobile robot based on hand-masked object pose estimation
Freddi et al. Development and experimental validation of algorithms for human–robot interaction in simulated and real scenarios
Bdiwi et al. Handing-over model-free objects to human hand with the help of vision/force robot control
Sigalas et al. Visual estimation of attentive cues in HRI: the case of torso and head pose
Hwang et al. Neural-network-based 3-D localization and inverse kinematics for target grasping of a humanoid robot by an active stereo vision system
Knoop et al. Sensor fusion for model based 3d tracking
Morales et al. An approach to estimate the orientation and movement trend of a person in the vicinity of an industrial robot
Kahily et al. Real-time human detection and tracking from a mobile armed robot using RGB-D sensor
Durdu et al. Morphing estimated human intention via human-robot interactions
Bdiwi et al. Segmentation of model-free objects carried by human hand: Intended for human-robot interaction applications
Aguilar et al. A simple yet smart head module for mobile manipulators
Kim et al. Pointing gesture-based unknown object extraction for learning objects with robot
Cheng et al. A Vision-based Remote Assistance Method and its Application in Object Transfer
Eayrs et al. An Intelligent Autonomous Robot with Recognition, Depth-Aware Perception, and Manipulation
Walter et al. Appearance-based object reacquisition for mobile manipulation
Shen et al. A trifocal tensor based camera-projector system for robot-human interaction