Human activity tracking and recognition using Kinect sensor
Lun, 2018 - Google Patents
- Document ID
- 11105383006745668813
- Author
- Lun R
- Publication year
- 2018
Snippet
The objective of this dissertation research is to use Kinect sensor, a motion sensing input device, to develop an integrated software system that can be used for tracking non-compliant activity postures of consented health-care workers for assisting the workers' …
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with three-dimensional environments, e.g. control of viewpoint to navigate in the environment
- G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
- G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
- G06F19/34—Computer-assisted medical diagnosis or treatment, e.g. computerised prescription or delivery of medication or diets, computerised local control of medical devices, medical expert systems or telemedicine
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
- G06K9/00362—Recognising human body or animal bodies, e.g. vehicle occupant, pedestrian; Recognising body parts, e.g. hand
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N99/00—Subject matter not provided for in other groups of this subclass
Similar Documents
| Publication | Title |
|---|---|
| Caeiro-Rodríguez et al. | A systematic review of commercial smart gloves: Current status and applications |
| Ahad et al. | Vision-based Action Understanding for Assistive Healthcare: A Short Review |
| Lun et al. | A survey of applications and human motion recognition with Microsoft Kinect |
| Hondori et al. | A review on technical and clinical impact of Microsoft Kinect on physical therapy and rehabilitation |
| KR101923243B1 (en) | Inferring spatial object descriptions from spatial gestures |
| LaViola Jr | 3D gestural interaction: The state of the field |
| CN102222431A (en) | Hand language translator based on machine |
| LaViola Jr | Context aware 3D gesture recognition for games and virtual reality |
| Bigdelou et al. | Simultaneous categorical and spatio-temporal 3D gestures using Kinect |
| Pardos et al. | On unifying deep learning and edge computing for human motion analysis in exergames development |
| Shi et al. | Accurate and fast classification of foot gestures for virtual locomotion |
| Haggag et al. | Body parts segmentation with attached props using RGB-D imaging |
| Zhou | Role of human body posture recognition method based on wireless network Kinect in line dance aerobics and gymnastics training |
| Lun | Human activity tracking and recognition using Kinect sensor |
| Pięta et al. | Automated classification of virtual reality user motions using a motion atlas and machine learning approach |
| Garcia et al. | Immersive augmented reality for Parkinson disease rehabilitation |
| bin Mohd Sidik et al. | A study on natural interaction for human body motion using depth image data |
| Fu et al. | Design and application of yoga intelligent teaching platform based on Internet of Things |
| Ogiela et al. | Natural user interfaces for exploring and modeling medical images and defining gesture description technology |
| Ravi | Automatic gesture recognition and tracking system for physiotherapy |
| Etaat | An Online Balance Training Application Using Pose Estimation and Augmented Reality |
| Lun et al. | A survey of using Microsoft Kinect in healthcare |
| Narang et al. | Generating virtual avatars with personalized walking gaits using commodity hardware |
| Gavrilă et al. | Towards the development of a medical rehabilitation system |
| Khaksar | A Framework for Gamification of Human Joint Remote Rehabilitation, Incorporating Non-Invasive Sensors |