DE4201934A1 - GESTIC COMPUTER - Google Patents
GESTIC COMPUTER
- Publication number
- DE4201934A1 (application DE19924201934, DE4201934A)
- Authority
- DE
- Germany
- Prior art keywords
- data processing
- processing system
- user interface
- objects
- graphical user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Improving communication between humans and machines, and the search for new input techniques, is becoming ever more important as computer applications grow more complex. A goal of many technical developments is therefore to create, alongside a computer's classic input devices such as the keyboard, graphics tablet and mouse, and input methods still under development such as speech and handwriting recognition, new ways of interacting with the machine that are adapted to natural, spatial human communication. Human gestures and facial expressions lend themselves naturally to this, since these forms of expression are of great importance in natural communication between people. To make them usable as an input method for computers, head and body movements must be recognized and interpreted.
To date, the only user-friendly input medium for computers actually established on the market is the mouse. Considering the influence of this input medium on the overall development of computers and their user interfaces, one finds a very large increase in the user-friendliness and effectiveness of computers equipped with it. Window interfaces in which graphical objects (icons) can be manipulated with the mouse represent an entirely new way of using computers compared with the old, keyboard-oriented styles of operation. This qualitative leap in operation enabled a substantial increase in productivity, a marked reduction in training times, and made many computer applications possible in the first place.
The mouse is a two-dimensional input medium. Humans, in contrast, think and move in three-dimensional environments; their gestures and perceptions are spatial in character. Consistently, these circumstances should find their counterpart in a three-dimensional input medium and in corresponding three-dimensional user interfaces.
In addition to the mouse, another input medium, known as the Data Glove, has become known in recent years. The Data Glove was developed between 1984 and 1987 at VPL Research by Thomas G. Zimmermann and L. Young Harvill. It translates movements of the hand and fingers into electrical signals. Fiber-optic cables run between two layers of fabric along all the fingers and the thumb. Both ends of each cable terminate in an interface board near the wrist. A light-emitting diode at one end sends light along the cable to a phototransistor at the other end, which converts the light into an electrical signal. The signal travels from the wrist to the attached computer through a connecting cable. The more the fingers are bent, the greater the light loss in the fiber-optic cable, and the smaller the corresponding electrical signal. The position and orientation of the hand in space is determined with the aid of a so-called Polhemus sensor (McDonnell Douglas Corporation) mounted on the back of the hand. This sensor measures the strength and orientation of three mutually perpendicular, artificially generated magnetic fields in space.
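The bend-to-signal relationship the Data Glove relies on — the more a finger is bent, the greater the light loss in the fiber and the smaller the electrical signal — can be sketched as a simple sensor model. This is purely illustrative; the exponential form and all coefficients are assumptions, not taken from the patent or from VPL's documentation:

```python
import math

def phototransistor_signal(bend_deg, v_max=5.0, loss_per_deg=0.012):
    """Toy model of a fiber-optic bend sensor.

    Light sent down the fiber attenuates with finger curvature; the
    phototransistor converts what remains into a voltage. The
    exponential attenuation and the coefficients are illustrative
    assumptions only.
    """
    attenuation = math.exp(-loss_per_deg * bend_deg)
    return v_max * attenuation

# A straight finger yields the full signal; a strongly bent one much less.
assert phototransistor_signal(90) < phototransistor_signal(0)
```

Under this model, decoding a finger's bend amounts to inverting the attenuation curve from the measured voltage.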
The disadvantages of the Data Glove lie primarily in its high price and its relatively high susceptibility to mechanical influences. There are also considerable acceptance problems, since the Data Glove is not always comfortable to wear. Furthermore, the Data Glove restricts the user's mobility through its supply cable and its weight. In general, the Data Glove limits what the user can do, since the gloved hand cannot be used for other work, such as writing.
Xerox Palo Alto Research Center, the inventor of the desktop user interface for computer systems, recently presented a prototype of a three-dimensional user interface. The idea is that structures on the screen move in real time in three-dimensional space and are perceived and interpreted in their spatial structure by the human visual system (Mark A. Clarkson, Xerox PARC: "An Easier Interface", Byte Magazine, February 1991, pp. 277-282; George G. Robertson, Stuart K. Card and Jock D. Mackinlay, Xerox PARC: "The Cognitive Coprocessor Architecture for Interactive User Interfaces", Proceedings of the ACM SIGGRAPH Symposium on User Interface Software and Technology, New York 1989, pp. 10-18).
Meaningful work with such three-dimensional user interfaces requires a three-dimensional input device for the computer. Desirable is a three-dimensional input medium that does not hinder the user, so that he can work freely; that demands as little adaptation to the computer workstation as possible; that is able to recognize and interpret human metaphors; and that is relatively independent of environmental influences.
The invention is based on the object of specifying a data processing system that is able to translate the expressions of human gesture directly into commands for its control. This object is achieved by a data processing system with the features of claim 1.
In this data processing system, which is controlled by expressions of human gesture, means are provided for capturing images of the human body or of body parts, for extracting such expressions from these images, and for translating such expressions into commands for controlling the data processing system.
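The three means described here — image capture, extraction of the expression, and translation into a control command — form a simple pipeline. The following sketch illustrates that structure only; all class and method names are hypothetical, and a real system would perform image processing rather than a dictionary lookup:

```python
class GestureComputer:
    """Sketch of the claimed pipeline: capture, extract, translate.

    `camera` is any callable returning a captured "image"; here a
    plain dict stands in for real image data. All names are
    hypothetical, not taken from the patent.
    """

    def __init__(self, camera, gesture_commands):
        self.camera = camera                      # image-capture means
        self.gesture_commands = gesture_commands  # gesture -> command table

    def extract_gesture(self, image):
        # Stand-in for the extraction means: a real system would locate
        # hand or body features in the image and classify them.
        return image.get("gesture")

    def step(self):
        image = self.camera()                      # capture an image
        gesture = self.extract_gesture(image)      # extract the expression
        return self.gesture_commands.get(gesture)  # translate into a command

# A stub camera whose "image" already carries a recognizable gesture.
computer = GestureComputer(lambda: {"gesture": "grab"},
                           {"grab": "pick_up_icon", "point": "move_cursor"})
assert computer.step() == "pick_up_icon"
```

Unrecognized gestures simply map to no command, leaving the system state unchanged.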
Such a data processing system makes the use of technical input media such as a mouse or a keyboard largely, if not entirely, dispensable. In a preferred embodiment of the invention, the data processing system provides a graphical user interface whose controls can be manipulated with the aid of objects that represent extracted expressions of human gesture, whereby the data processing system is controlled. The graphical user interface may be either two-dimensional or three-dimensional. A further development of the invention provides that extracted expressions of human gesture are represented on this graphical user interface by objects suited to visualizing the movements and positions of the human hand, or of both of a person's hands. It is particularly advantageous if the graphical user interface of the data processing system exhibits structural features of the real working environment of a typical user, or structural features of objects from that working environment, with controls of the interface visualized by graphical representations of objects from this working environment or of parts of such objects. Depending on the type of application, the graphical user interface may, for example, exhibit structural features of a desk. Additionally or alternatively, it may exhibit structural features of one or more card-index boxes. For many applications it is advantageous if the graphical user interface also exhibits structural features of one or more documents. Finally, there are a number of applications for which the graphical user interface advantageously exhibits structural features of one or more filing cabinets. For a number of applications it is advantageous if the user interface simultaneously exhibits structural features of several of the working environments mentioned, or of several objects from one working environment.
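The desk, card-index, document, and filing-cabinet metaphors described here can be thought of as a small hierarchy of interface objects positioned in the 3D interface. The following is a hypothetical sketch of such a structure, not an implementation from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class InterfaceObject:
    """A control on the gesture-driven, desk-metaphor interface.

    The object names follow the working-environment objects the
    description lists; the API itself is an assumption.
    """
    name: str
    position: tuple          # (x, y, z) in the 3D interface
    children: list = field(default_factory=list)

def build_desk_interface():
    # A desk carrying the other working-environment objects as children.
    desk = InterfaceObject("desk", (0.0, 0.0, 0.0))
    desk.children.append(InterfaceObject("card_index", (-0.3, 0.0, 0.1)))
    desk.children.append(InterfaceObject("document", (0.2, 0.0, 0.05)))
    desk.children.append(InterfaceObject("filing_cabinet", (0.6, 0.0, 0.4)))
    return desk

desk = build_desk_interface()
assert [c.name for c in desk.children] == ["card_index", "document", "filing_cabinet"]
```

A gesture object (the visualized hand) would then manipulate these children the way a mouse cursor manipulates icons on a 2D desktop.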
In other applications, graphical user interfaces exhibiting structural features of a technical control room are particularly suitable. Finally, in a number of medical-technology applications it is advantageous if the graphical user interface exhibits structural features of certain medical devices.
The extraction of expressions of human gesture succeeds particularly reliably and efficiently if, for this purpose, the positions of markings on the surface of the human body are evaluated. In many applications, evaluating the positions of markings on the surface of the hand is preferred. In other applications it is advantageous to evaluate the user's gaze direction, eye movements, head movements, mouth movements, facial expressions, or the movement of the body as a whole. With the aid of such a data processing system, the intuitive operation of a computer is possible in a wide variety of applications. The result is, in effect, a new three-dimensional input device for computers that recognizes, tracks, and interprets hand gestures and movements of the human hand and then uses them to control computer actions.
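Evaluating marker positions on the hand surface can be illustrated with a minimal sketch: given the 2D image coordinates of the markers, their centroid approximates the hand's position, and their spread crudely distinguishes an open from a closed hand. Both heuristics are assumptions made for illustration; the patent does not specify an algorithm:

```python
def marker_centroid_and_spread(markers):
    """From 2D marker positions on the hand surface, return the
    centroid (a proxy for hand position) and the mean distance of the
    markers from it (a rough open-vs-closed-hand proxy). Illustrative
    heuristics only, not the patent's method."""
    n = len(markers)
    cx = sum(x for x, _ in markers) / n
    cy = sum(y for _, y in markers) / n
    spread = sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
                 for x, y in markers) / n
    return (cx, cy), spread

# Markers on an open hand sit farther from their centroid than on a fist.
_, open_spread = marker_centroid_and_spread([(0, 0), (4, 0), (0, 4), (4, 4)])
_, fist_spread = marker_centroid_and_spread(
    [(1.5, 1.5), (2.5, 1.5), (1.5, 2.5), (2.5, 2.5)])
assert open_spread > fist_spread
```

Tracking the centroid over successive frames would yield the hand trajectory that the system interprets as a gesture.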
Compared with existing input media such as a mouse or a trackball, the solution according to the invention has the following advantages: it ensures an even more intuitive operability of the computer, demanding a minimum of adaptation to the computer workstation from the user. The user's freedom is largely preserved, since he need not operate an additional input device and can therefore continue to work freely in his accustomed working environment. Both of the user's hands can be used simultaneously to control the data processing system. When a three-dimensional user interface is used, the user's spatial world is mapped into a spatial world of interaction with the computer.
Claims (20)
- A data processing system controlled by expressions of human gesture, characterized by means:
- - for capturing images of the human body or parts thereof,
- - for extracting such expressions from such images, and
- - for translating such expressions into commands for controlling the data processing system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE19924201934 DE4201934A1 (en) | 1992-01-24 | 1992-01-24 | GESTIC COMPUTER |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE19924201934 DE4201934A1 (en) | 1992-01-24 | 1992-01-24 | GESTIC COMPUTER |
Publications (1)
Publication Number | Publication Date |
---|---|
DE4201934A1 true DE4201934A1 (en) | 1993-07-29 |
Family
ID=6450191
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
DE19924201934 Withdrawn DE4201934A1 (en) | 1992-01-24 | 1992-01-24 | GESTIC COMPUTER |
Country Status (1)
Country | Link |
---|---|
DE (1) | DE4201934A1 (en) |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19601026A1 (en) * | 1996-01-13 | 1997-07-17 | Gordon Pipa | Human eye movement recognition equipment for computer use by severely-disabled person |
DE19845027A1 (en) * | 1998-09-30 | 2000-04-06 | Siemens Ag | Medical-science technical system for medical treatment system |
US6175610B1 (en) | 1998-02-11 | 2001-01-16 | Siemens Aktiengesellschaft | Medical technical system controlled by vision-detected operator activity |
DE10022321A1 (en) * | 2000-05-09 | 2001-11-15 | Bayerische Motoren Werke Ag | Apparatus in a vehicle for identifying or recognizing a hand position and translating this to a control command |
DE10125653C1 (en) * | 2001-05-25 | 2002-11-07 | Siemens Ag | Rehabilitation of patients with motor and cognitive disabilities with a gesture recognition system has an adaptation phase in which patients train the computer system to recognize input commands |
EP1477351A2 (en) | 2003-05-15 | 2004-11-17 | Webasto AG | Vehicle roof equipped with an operating device for electrical vehicle components and method for operating the electrical vehicle components |
EP1830244A2 (en) | 2006-03-01 | 2007-09-05 | Audi Ag | Method and device for operating at least two functional components of a system, in particular of a vehicle |
US7312788B2 (en) | 2003-03-11 | 2007-12-25 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Gesture-based input device for a user interface of a computer |
DE102008004980A1 (en) * | 2008-01-17 | 2009-08-06 | Hansjörg Lienert | Device for human-machine-communication, is embedded in environment, and is based on usage of mobile and immobile symbolic control elements, where arbitrary objects are used as control elements and for data input and output |
EP2283790A1 (en) | 2009-08-14 | 2011-02-16 | Karl Storz GmbH & Co. KG | Control and method for operating an operation light |
US8638989B2 (en) | 2012-01-17 | 2014-01-28 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
DE102014207637A1 (en) | 2014-04-23 | 2015-10-29 | Bayerische Motoren Werke Aktiengesellschaft | Gesture interaction with a driver information system of a vehicle |
US9285893B2 (en) | 2012-11-08 | 2016-03-15 | Leap Motion, Inc. | Object detection and tracking with variable-field illumination devices |
US9465461B2 (en) | 2013-01-08 | 2016-10-11 | Leap Motion, Inc. | Object detection and tracking with audio and optical signals |
US9495613B2 (en) | 2012-01-17 | 2016-11-15 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging using formed difference images |
US9613262B2 (en) | 2014-01-15 | 2017-04-04 | Leap Motion, Inc. | Object detection and tracking for providing a virtual device experience |
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control |
US9916009B2 (en) | 2013-04-26 | 2018-03-13 | Leap Motion, Inc. | Non-tactile interface systems and methods |
US9945660B2 (en) | 2012-01-17 | 2018-04-17 | Leap Motion, Inc. | Systems and methods of locating a control object appendage in three dimensional (3D) space |
US10099368B2 (en) | 2016-10-25 | 2018-10-16 | Brandon DelSpina | System for controlling light and for tracking tools in a three-dimensional space |
US10585193B2 (en) | 2013-03-15 | 2020-03-10 | Ultrahaptics IP Two Limited | Determining positional information of an object in space |
US10609285B2 (en) | 2013-01-07 | 2020-03-31 | Ultrahaptics IP Two Limited | Power consumption in motion-capture systems |
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US10739862B2 (en) | 2013-01-15 | 2020-08-11 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US10846942B1 (en) | 2013-08-29 | 2020-11-24 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11720180B2 (en) | 2012-01-17 | 2023-08-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US11740705B2 (en) | 2013-01-15 | 2023-08-29 | Ultrahaptics IP Two Limited | Method and system for controlling a machine according to a characteristic of a control object |
US11778159B2 (en) | 2014-08-08 | 2023-10-03 | Ultrahaptics IP Two Limited | Augmented reality with motion sensing |
US11775033B2 (en) | 2013-10-03 | 2023-10-03 | Ultrahaptics IP Two Limited | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation |
US11868687B2 (en) | 2013-10-31 | 2024-01-09 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US12154238B2 (en) | 2014-05-20 | 2024-11-26 | Ultrahaptics IP Two Limited | Wearable augmented reality devices with object detection and tracking |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4843568A (en) * | 1986-04-11 | 1989-06-27 | Krueger Myron W | Real time perception of and response to the actions of an unencumbered participant/user |
JPH02132510A (en) * | 1988-11-12 | 1990-05-22 | Sony Corp | Input device |
EP0393368A2 (en) * | 1989-03-20 | 1990-10-24 | Hitachi, Ltd. | Man-machine interface system |
EP0438017A2 (en) * | 1990-01-18 | 1991-07-24 | International Business Machines Corporation | Method of graphically accessing electronic data with animated icons |
- 1992-01-24: DE DE19924201934, patent DE4201934A1/en, not active (Withdrawn)
Non-Patent Citations (2)
Title |
---|
McAvinney, Paul: Telltale Gestures, In: BYTE 7/1990, pp. 237-240 *
Ohmura, Kazunori et al.: Method of detecting face direction using image processing for human interface, In: SPIE Vol. 1001, Visual Communications and Image Processing 1988, pp. 625-632 *
Cited By (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19601026A1 (en) * | 1996-01-13 | 1997-07-17 | Gordon Pipa | Human eye movement recognition equipment for computer use by severely-disabled person |
US6175610B1 (en) | 1998-02-11 | 2001-01-16 | Siemens Aktiengesellschaft | Medical technical system controlled by vision-detected operator activity |
DE19845027A1 (en) * | 1998-09-30 | 2000-04-06 | Siemens Ag | Medical-science technical system for medical treatment system |
DE19845027C2 (en) * | 1998-09-30 | 2000-08-31 | Siemens Ag | Medical technology system |
DE10022321A1 (en) * | 2000-05-09 | 2001-11-15 | Bayerische Motoren Werke Ag | Apparatus in a vehicle for identifying or recognizing a hand position and translating this to a control command |
DE10125653C1 (en) * | 2001-05-25 | 2002-11-07 | Siemens Ag | Rehabilitation of patients with motor and cognitive disabilities with a gesture recognition system has an adaptation phase in which patients train the computer system to recognize input commands |
EP1262158A3 (en) * | 2001-05-25 | 2003-10-29 | Siemens Aktiengesellschaft | Interactive rehabilitation apparatus using gesture recognition |
US7312788B2 (en) | 2003-03-11 | 2007-12-25 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Gesture-based input device for a user interface of a computer |
DE10321964A1 (en) * | 2003-05-15 | 2004-12-09 | Webasto Ag | Vehicle roof with an operating device for electrical vehicle components and method for operating electrical vehicle components |
US7342485B2 (en) | 2003-05-15 | 2008-03-11 | Webasto Ag | Motor vehicle roof with a control means for electrical motor vehicle components and process for operating electrical motor vehicle components |
DE10321964B4 (en) * | 2003-05-15 | 2008-05-29 | Webasto Ag | Vehicle roof with an operating device for electrical vehicle components and method for operating electrical vehicle components |
EP1477351A2 (en) | 2003-05-15 | 2004-11-17 | Webasto AG | Vehicle roof equipped with an operating device for electrical vehicle components and method for operating the electrical vehicle components |
EP1830244A2 (en) | 2006-03-01 | 2007-09-05 | Audi Ag | Method and device for operating at least two functional components of a system, in particular of a vehicle |
DE102008004980A1 (en) * | 2008-01-17 | 2009-08-06 | Hansjörg Lienert | Device for human-machine communication embedded in the environment, based on the use of mobile and immobile symbolic control elements, in which arbitrary objects serve as control elements and for data input and output |
US8817085B2 (en) | 2009-08-14 | 2014-08-26 | Karl Storz Gmbh & Co. Kg | Control system and method to operate an operating room lamp |
EP2283790A1 (en) | 2009-08-14 | 2011-02-16 | Karl Storz GmbH & Co. KG | Control and method for operating an operation light |
US9945660B2 (en) | 2012-01-17 | 2018-04-17 | Leap Motion, Inc. | Systems and methods of locating a control object appendage in three dimensional (3D) space |
US9767345B2 (en) | 2012-01-17 | 2017-09-19 | Leap Motion, Inc. | Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections |
US12086327B2 (en) | 2012-01-17 | 2024-09-10 | Ultrahaptics IP Two Limited | Differentiating a detected object from a background using a gaussian brightness falloff pattern |
US11994377B2 (en) | 2012-01-17 | 2024-05-28 | Ultrahaptics IP Two Limited | Systems and methods of locating a control object appendage in three dimensional (3D) space |
US10699155B2 (en) | 2012-01-17 | 2020-06-30 | Ultrahaptics IP Two Limited | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9436998B2 (en) | 2012-01-17 | 2016-09-06 | Leap Motion, Inc. | Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections |
US10767982B2 (en) | 2012-01-17 | 2020-09-08 | Ultrahaptics IP Two Limited | Systems and methods of locating a control object appendage in three dimensional (3D) space |
US9495613B2 (en) | 2012-01-17 | 2016-11-15 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging using formed difference images |
US11782516B2 (en) | 2012-01-17 | 2023-10-10 | Ultrahaptics IP Two Limited | Differentiating a detected object from a background using a gaussian brightness falloff pattern |
US9626591B2 (en) | 2012-01-17 | 2017-04-18 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging |
US9652668B2 (en) | 2012-01-17 | 2017-05-16 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9672441B2 (en) | 2012-01-17 | 2017-06-06 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control |
US9697643B2 (en) | 2012-01-17 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US9741136B2 (en) | 2012-01-17 | 2017-08-22 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US9778752B2 (en) | 2012-01-17 | 2017-10-03 | Leap Motion, Inc. | Systems and methods for machine control |
US9153028B2 (en) | 2012-01-17 | 2015-10-06 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US9934580B2 (en) | 2012-01-17 | 2018-04-03 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US8638989B2 (en) | 2012-01-17 | 2014-01-28 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US10565784B2 (en) | 2012-01-17 | 2020-02-18 | Ultrahaptics IP Two Limited | Systems and methods for authenticating a user according to a hand of the user moving in a three-dimensional (3D) space |
US11720180B2 (en) | 2012-01-17 | 2023-08-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US10366308B2 (en) | 2012-01-17 | 2019-07-30 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US10410411B2 (en) | 2012-01-17 | 2019-09-10 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US11308711B2 (en) | 2012-01-17 | 2022-04-19 | Ultrahaptics IP Two Limited | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9285893B2 (en) | 2012-11-08 | 2016-03-15 | Leap Motion, Inc. | Object detection and tracking with variable-field illumination devices |
US10609285B2 (en) | 2013-01-07 | 2020-03-31 | Ultrahaptics IP Two Limited | Power consumption in motion-capture systems |
US10097754B2 (en) | 2013-01-08 | 2018-10-09 | Leap Motion, Inc. | Power consumption in motion-capture systems with audio and optical signals |
US9465461B2 (en) | 2013-01-08 | 2016-10-11 | Leap Motion, Inc. | Object detection and tracking with audio and optical signals |
US11740705B2 (en) | 2013-01-15 | 2023-08-29 | Ultrahaptics IP Two Limited | Method and system for controlling a machine according to a characteristic of a control object |
US11353962B2 (en) | 2013-01-15 | 2022-06-07 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US10739862B2 (en) | 2013-01-15 | 2020-08-11 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US12204695B2 (en) | 2013-01-15 | 2025-01-21 | Ultrahaptics IP Two Limited | Dynamic, free-space user interactions for machine control |
US11874970B2 (en) | 2013-01-15 | 2024-01-16 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US10585193B2 (en) | 2013-03-15 | 2020-03-10 | Ultrahaptics IP Two Limited | Determining positional information of an object in space |
US11693115B2 (en) | 2013-03-15 | 2023-07-04 | Ultrahaptics IP Two Limited | Determining positional information of an object in space |
US9916009B2 (en) | 2013-04-26 | 2018-03-13 | Leap Motion, Inc. | Non-tactile interface systems and methods |
US10452151B2 (en) | 2013-04-26 | 2019-10-22 | Ultrahaptics IP Two Limited | Non-tactile interface systems and methods |
US11282273B2 (en) | 2013-08-29 | 2022-03-22 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US10846942B1 (en) | 2013-08-29 | 2020-11-24 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11461966B1 (en) | 2013-08-29 | 2022-10-04 | Ultrahaptics IP Two Limited | Determining spans and span lengths of a control object in a free space gesture control environment |
US12086935B2 (en) | 2013-08-29 | 2024-09-10 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11776208B2 (en) | 2013-08-29 | 2023-10-03 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11775033B2 (en) | 2013-10-03 | 2023-10-03 | Ultrahaptics IP Two Limited | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation |
US11868687B2 (en) | 2013-10-31 | 2024-01-09 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US9613262B2 (en) | 2014-01-15 | 2017-04-04 | Leap Motion, Inc. | Object detection and tracking for providing a virtual device experience |
DE102014207637A1 (en) | 2014-04-23 | 2015-10-29 | Bayerische Motoren Werke Aktiengesellschaft | Gesture interaction with a driver information system of a vehicle |
WO2015162058A1 (en) | 2014-04-23 | 2015-10-29 | Bayerische Motoren Werke Aktiengesellschaft | Gesture interaction with a driver information system of a vehicle |
US10585487B2 (en) | 2014-04-23 | 2020-03-10 | Bayerische Motoren Werke Aktiengesellschaft | Gesture interaction with a driver information system of a vehicle |
US12154238B2 (en) | 2014-05-20 | 2024-11-26 | Ultrahaptics IP Two Limited | Wearable augmented reality devices with object detection and tracking |
US11778159B2 (en) | 2014-08-08 | 2023-10-03 | Ultrahaptics IP Two Limited | Augmented reality with motion sensing |
US12095969B2 (en) | 2014-08-08 | 2024-09-17 | Ultrahaptics IP Two Limited | Augmented reality with motion sensing |
US10099368B2 (en) | 2016-10-25 | 2018-10-16 | Brandon DelSpina | System for controlling light and for tracking tools in a three-dimensional space |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE4201934A1 (en) | GESTIC COMPUTER | |
Gabbard | A taxonomy of usability characteristics in virtual environments | |
Bainbridge | Berkshire encyclopedia of human-computer interaction | |
DE69032645T2 (en) | Data processing system with input data based on gestures | |
EP3143478B1 (en) | Method for displaying a virtual interaction on at least one screen and input device | |
McPherson | Feminist in a software lab: difference + design | |
DE69434843T2 (en) | Head-mounted image display device and data processing system incorporating the same | |
Poupyrev et al. | Developing a generic augmented-reality interface | |
DE102009032637B4 (en) | Image magnification system for a computer interface | |
DE69432344T2 (en) | Data entry device with display keyboard | |
Su et al. | The virtual panel architecture: A 3D gesture framework | |
Duval et al. | Improving awareness for 3D virtual collaboration by embedding the features of users’ physical environments and by augmenting interaction tools with cognitive feedback cues | |
Kim et al. | Tangible 3D: Hand Gesture Interaction for Immersive 3D Modeling. | |
Jeffri et al. | Guidelines for the interface design of ar systems for manual assembly | |
EP1573502A2 (en) | Rapid input device | |
DE112019002798T5 (en) | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM | |
Lu et al. | Classification, application, challenge, and future of midair gestures in augmented reality | |
Tano et al. | Three design principles learned through developing a series of 3D sketch systems: “Memory Capacity”, “Cognitive Mode”, and “Life-size and Operability” | |
Utterson | Early Visions of Interactivity: The In(put)s and Out(put)s of Real-Time Computing | |
Heemsbergen et al. | Physical digitality: Making reality visible through multimodal digital affordances for human perception | |
Poor et al. | Applying the Norman 1986 user-centered model to post-WIMP UIs: Theoretical predictions and empirical outcomes | |
Soares et al. | Ego-exo: A cooperative manipulation technique with automatic viewpoint control | |
Su et al. | A logical hand device in virtual environments | |
DE10054242A1 (en) | Method of inputting data into a system, such as a computer, requires the user making changes to a real image by hand movement | |
Bornschein et al. | Collaborative tactile graphic workstation for touch-sensitive pin-matrix devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
8110 | Request for examination paragraph 44 | ||
8130 | Withdrawal |