USRE48054E1 - Virtual interface and control device - Google Patents
Virtual interface and control device
- Publication number
- USRE48054E1 (application US 15/252,066)
- Authority
- US
- United States
- Prior art keywords
- antenna
- control pad
- receiver
- pulse frequency
- generating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/046—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
Definitions
- FIG. 1 is a perspective view of one embodiment of the invention.
- FIG. 2 is a side view of one embodiment of the invention.
- FIG. 3 is a perspective view of a preferred embodiment of the invention as used with a personal computer.
- FIG. 1 shows a perspective view of a preferred embodiment of the invention that may be used to operate a programmable device, such as a computer.
- A first antenna 1 and a second antenna 2 are rotatably and pivotally attached to control pad 3.
- Control pad 3 may include a third antenna 4, which may be embedded in, or externally attached to, control pad 3.
- Control pad 3 may also include buttons 5 and 6, which can correspond to the left and right buttons found on a conventional mouse.
- Antennas 1 and 2 may be constructed from conventional materials known to those of ordinary skill in the art. Antennas 1 and 2 may be rotatably and pivotally attached to control pad 3 by a combination of actuators that position the antennae in optimal relationships based upon feedback from the system driver (see the sketch following this list).
- Control pad 3 may resemble a conventional mouse pad known to those in the art. Control pad 3 may be constructed from any non-conductive material that is electromagnetically invisible to signals emitted or received by the antennae. Antenna 4 may be formed from conventional materials, and can be embedded within control pad 3, or may be attached externally in a manner similar to antennae 1 and 2.
- FIG. 2 depicts a side view in which antennae 1 and 2 are in vertical positions relative to control pad 3, which is merely an example of how the antennae may be positioned. In practice, antennae 1 and 2 may be placed in any position relative to each other and to control pad 3 to achieve optimal transmission and reception.
- FIG. 3 presents a perspective view of the invention as applied to a conventional computer 7. While FIG. 3 depicts the invention being used with a personal computer (“PC”), it is important to note that the invention can be used with any size or type of computer or device that depends on input from a human operator. Examples include, but are not limited to, notebook computers, laptop computers, workstations, video game consoles, cash registers, automated teller machines, vehicle electronics or surgical/medical devices.
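A minimal sketch of that feedback-driven antenna positioning, assuming a hypothetical signal-quality metric reported back by the receiver and a single rotational actuator per antenna (neither detail is specified in the patent):

```python
import random

def signal_quality(angle_deg):
    # Stand-in for feedback from the receiver/system driver: quality is
    # simulated to peak near 60 degrees, with a little measurement noise.
    return -abs(angle_deg - 60.0) + random.uniform(-0.5, 0.5)

def self_calibrate(angle_deg=0.0, step=5.0, iterations=40):
    """Hill-climb an antenna's angle, keeping only moves that improve feedback."""
    best = signal_quality(angle_deg)
    for _ in range(iterations):
        candidate = angle_deg + random.choice((-step, step))
        quality = signal_quality(candidate)
        if quality > best:  # the actuator holds the new position only if feedback improves
            angle_deg, best = candidate, quality
    return angle_deg

print(f"settled antenna angle: {self_calibrate():.1f} degrees")
```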
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Position Input By Displaying (AREA)
Abstract
An input device for a computer or other programmable device translates the proximity of an object to one or more antennae into an electronic signal. The antennae generate a first frequency and a second frequency. When an object, such as a hand, is placed in proximity to the antenna, the object causes the first and second frequencies to heterodyne, which creates a third frequency, also referred to as a beat frequency or pulse frequency. A receiver interprets the pulse frequency and translates it into an electronic signal that can be used to command a computer or other programmable device.
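For reference, heterodyning two signals at frequencies $f_1$ and $f_2$ produces sum and difference components according to the standard product-to-sum identity; the difference component is the beat or pulse frequency that the receiver senses:

$$\sin(2\pi f_1 t)\,\sin(2\pi f_2 t) = \tfrac{1}{2}\left[\cos\big(2\pi (f_1 - f_2) t\big) - \cos\big(2\pi (f_1 + f_2) t\big)\right], \qquad f_{\text{pulse}} = \lvert f_1 - f_2 \rvert.$$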
Description
This application is a continuation of U.S. patent application Ser. No. 13/174,357, filed on Jun. 30, 2011, which claims the benefit of U.S. patent application Ser. No. 11/327,785, filed on Jan. 6, 2006, which claims the benefit of U.S. Provisional Patent Application Ser. No. 60/641,809, filed Jan. 7, 2005. Each of the above-referenced patent applications is hereby incorporated by reference in its entirety.
1. Field of the Invention
The present invention is related to methods and devices for interfacing with electronic devices that receive commands from an operator, such as computer systems.
2. Description of the Related Art
The following description and discussion of the prior art is undertaken in order to provide background information so that the present invention may be completely understood and appreciated in its proper context.
Input devices for use with a computer have transformed significantly over the last three decades. Generally speaking, punch cards gave way to terminals with keyboards; keyboards gave way to the mouse. The mouse has evolved from a unit housing a ball interacting with motion detectors, to a number of variants, some of which are as follows:
U.S. Pat. No. 6,313,825 to Gilbert discloses an input device for a computer that detects movement of an object, such as a finger, within a selected field of space. The input device is used to control movement of a cursor over a display device. The device includes directional transducers that receive reflections of EMF from an object in the field, and provides signals to an “interpreter.” The interpreter detects movements by employing a clock which determines the time difference between the reflections received by the transducers, which it then reduces to a signal that controls the cursor.
U.S. Pat. No. 6,690,357 to Dunton discloses an input device that uses images of input devices, and scanning sensors that detect user interaction with those images. The scanning sensors include digital video cameras that capture the movement of a user's hands and convert the movement into input command signals. The scanning sensors may alternatively sense the projected light reflected from the user's hands, or may detect the combination of the reflected projected light and the user's hands.
U.S. Pat. No. 6,614,422 to Rafii, et al., discloses an input device that employs a three-dimensional imaging sensor to capture three-dimensional data as to the placement of a user's fingers on a substrate that either bears or displays a template similar to a keyboard or a keypad. The three-dimensional sensor transmits optically acquired data to a companion computer system that computes the velocity and location of the user's fingers, and converts that information into a command.
U.S. Pat. No. 6,498,628 to Iwamura discloses an electronic appliance remote controller that employs a camera as a motion-sensing interface. The camera captures video images of a user's hand, evaluates the moving speed and direction of the hand, and correspondingly moves a cursor appearing on a screen.
In U.S. Patent Application Publication 2003/0048312, Zimmerman discloses an apparatus for generating control signals for the manipulation of virtual objects in a computer system. The apparatus includes a glove worn on a hand that includes sensors for detecting the gestures of the hand, and hand position. The computer system receives data from the sensors, and generates corresponding control signals in response.
U.S. Patent Application Publication 2002/0075240, Lieberman, et al., describes a device for inputting alphanumeric information into a computer that employs sensors that may be optical, acoustic or position sensors to sense the “pressing” or “striking” of virtual keys. The sensor then forwards data to a processor, which converts the “pressing” or “striking” data into characters, instructions, information or data.
U.S. Patent Application Publication 2002/0006807, Mantyjarvi, et al., teaches a device for entering data that creates a virtual keyboard by using an infrared transceiver arrangement. The infrared transceivers record reflection data obtained from an object placed within a field of infrared light, and process the data to correspond to a key position or function.
The invention comprises an input device for a computer or other programmable circuit that translates the proximity of an object to one or more antennae into an electronic signal. The antenna generates a reference first frequency and a second frequency.
In the following detailed description of the preferred embodiments, reference is made to the accompanying drawings, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
This invention may be used with one or more antennae. In operation, the antenna or antennae 1, 2 and 4 act as both emitters and receivers of electromagnetic fields. Generally, the antennae are operated in an electromagnetic spectrum range of 3 Hz to 1.24 eV (approximately 300 THz). The antenna or antennae are arrayed in various arrangements depending upon the particular application and the frequency range currently in use. Each of the antennae initially emits a reference frequency. When an object, such as a hand, is placed in the field created by the antenna or antennae over control pad 3, the object creates a disturbance to the field. This disturbance is registered as a change in value. The value change is translated into a coordinate by a device driver or other software conventionally installed in the device to be controlled. As the object moves within the field, the change in coordinates may be expressed as a command to the device to be controlled, for example, the movement of a cursor on a computer screen. If desired, the invention can register more than one disturbance to the field at a given time, giving the ability to convey more complex commands to a device to be controlled than can be achieved through conventional means.
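The patent does not spell out the translation algorithm. As a minimal sketch, assuming a hypothetical sensing function for the receiver and a simple linear mapping from pulse-frequency shifts to cursor motion (none of which is specified in the patent), the driver loop described above might look like this:

```python
import random

def read_pulse_frequencies():
    # Stand-in for the receiver: simulate the pulse (beat) frequencies, in Hz,
    # sensed for antennas 1 and 2, drifting slightly as an object moves.
    return 1000.0 + random.uniform(-5, 5), 1000.0 + random.uniform(-5, 5)

def to_cursor_delta(baseline, sample, gain=0.5):
    """Reduce one pair of pulse-frequency readings to a cursor delta (dx, dy).

    A disturbance is registered as a change in value relative to the baseline;
    the driver translates that change into screen coordinates.
    """
    d1 = sample[0] - baseline[0]
    d2 = sample[1] - baseline[1]
    # One plausible mapping: the difference of the two shifts drives x and
    # their sum drives y.  The patent leaves the actual mapping to the driver.
    return round(gain * (d1 - d2)), round(gain * (d1 + d2))

if __name__ == "__main__":
    baseline = read_pulse_frequencies()          # calibrate with no object in the field
    for _ in range(5):
        dx, dy = to_cursor_delta(baseline, read_pulse_frequencies())
        print(f"move cursor by ({dx}, {dy})")    # would be handed to the OS cursor API
```

In practice the baseline would be re-measured periodically and the gain set during calibration; the essential point is only that the driver reduces changes in the sensed value to coordinates.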
In an alternative embodiment, two or more interfaces will be linked with an imaging device that projects three-dimensional images. An example of such a three-dimensional imaging device is a holographic projector. The field emitted by the interfaces can overlay the projection. Through device drivers or other software programmed into a programmable circuit, attempts to interact with the images in the three-dimensional projection will be captured by the interface and will enable the user to move the virtual objects.
The invention will provide software specifically designed to relate two- and three-dimensional motion in two- and three-dimensional images as represented by a programmable circuit. This software can also interpret disturbances to the field for programmable circuits designed to control the motion of mechanical devices. The software may have a specific user interface that is modifiable for the user.
The invention may be linked electronically (wirelessly) or mechanically to the device to be controlled or object device, and can be powered by battery, a separate AC connection, by the object device, or any other conventional means known to those in the art. Two or more of these inventions may be connected to the device to be controlled so that a single user may use both hands simultaneously, or so that multiple users can control the device. If used in conjunction with a computer as depicted in FIG. 3, buttons 5 and 6 may be used as on a conventional mouse, or alternatively, the optional third antenna 4 can be employed to interpret movement in three dimensions so that the invention can replicate the conventional functions of buttons 5 and 6 electronically, in a manner familiar to users of conventional mouse devices.
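The patent does not describe how antenna 4's readings would be reduced to button events. One hypothetical approach, assuming a scalar vertical-disturbance value attributed to antenna 4, is a simple threshold with hysteresis:

```python
def detect_click(z_values, press_threshold=5.0, release_threshold=2.0):
    """Yield 'press'/'release' events from a stream of vertical-disturbance values.

    z_values: iterable of disturbance magnitudes attributed to antenna 4
              (hypothetical units; larger means the hand has moved closer).
    Two thresholds (hysteresis) avoid chattering near the trigger point.
    """
    pressed = False
    for z in z_values:
        if not pressed and z >= press_threshold:
            pressed = True
            yield "press"        # stands in for a button-down event
        elif pressed and z <= release_threshold:
            pressed = False
            yield "release"      # stands in for a button-up event

# Example: a hand dips toward the pad and withdraws.
print(list(detect_click([0, 1, 3, 6, 7, 4, 1, 0])))   # ['press', 'release']
```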
Although specific embodiments have been illustrated and described herein, it is appreciated by those of ordinary skill in the art that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown. This application is intended to cover any adaptations or variations of the present invention. Therefore, it is intended that the invention be limited only by the claims and their equivalents.
Claims (18)
1. A method for generating an electronic signal using an input device to command a computer or other programmable device, comprising:
generating, by a first antenna, a first frequency;
generating, by a second antenna, a second frequency;
placing an object in proximity to the first antenna and the second antenna, the object causing the first and second frequencies to generate a pulse frequency without any physical contact between the object and the input device;
sensing, by a receiver, the pulse frequency;
translating the pulse frequency into an electronic signal; and
commanding the computer or other programmable device using the electronic signal,
wherein the receiver is embedded in or externally attached to a touchpad control pad, and wherein the first antenna and the second antenna are positioned on opposite ends of the control pad and the receiver is positioned between the two antennas, wherein the object is configured to cause the first and second frequencies to be heterodyned so as to generate the heterodyned pulse frequency that is subsequently sensed by the receiver,
wherein the first antenna is attached to the control pad, wherein the second antenna is attached to the control pad, wherein the first antenna is attached to the control pad via servos and actuators such that the first antenna is rotated and pivoted about the control pad, and wherein the first antenna is configured to automatically self-calibrate and position itself in a position based on feedback input from the receiver.
2. The method of claim 1 , wherein the object placed in proximity to the first antenna and the second antenna is a hand, and wherein the pulse frequency is generated without any physical contact between the hand and the input device.
3. The method of claim 2 , wherein placing the hand in proximity to the first and second antennae causes the two frequencies to heterodyne to generate the pulse frequency.
4. The method of claim 1 , wherein the first and second antennae generate a third frequency configured to heterodyne with the first and second frequencies.
5. The method of claim 1 , wherein the first antenna is attached to the control pad.
6. The method of claim 5 , wherein the second antenna is attached to the control pad.
7. The method of claim 6 , wherein the first antenna is attached to the control pad via servos and actuators such that it may be rotated and pivoted about the control pad.
8. The method of claim 7 , wherein the first antenna is configured to automatically self-calibrate and position itself in the optimal position based on feedback input from the receiver.
9. The method of claim 1 , wherein the programmable device comprises an imaging device, and wherein the imaging device is configured to project three-dimensional images.
10. The method of claim 9 , wherein the imaging device is a holographic projector.
11. The method of claim 10 , further comprising:
projecting, by the holographic projector, a holographic image; and
overlaying the projected holographic image with a field emitted by the first antenna, the second antenna, and a third antenna.
12. The method of claim 11 , further comprising interacting with the holographic image using the field.
13. The method of claim 1, wherein the first and second antennas are configured to move from a first position to a second position different from the first position, wherein the first and second antennas are arranged to be parallel to a top surface of the control pad in the first position, and wherein the first and second antennas are arranged to be angled with respect to the top surface of the control pad in the second position.
14. A method for generating an electronic signal using an input device to command a computer or other programmable device, comprising:
generating, by a first antenna, a first frequency;
generating, by a second antenna, a second frequency;
generating a pulse frequency based on the first and second frequencies, when an object is placed in proximity to the first antenna and the second antenna, without any physical contact between the object and the input device;
sensing, by a receiver, the pulse frequency;
translating the pulse frequency into an electronic signal; and
commanding the computer or other programmable device using the electronic signal,
wherein the receiver is embedded in or externally attached to a control pad, wherein the first antenna and the second antenna are positioned on opposite ends of the control pad and the receiver is positioned between the two antennas, wherein the object is configured to cause the first and second frequencies to be heterodyned so as to generate the heterodyned pulse frequency that is subsequently sensed by the receiver,
wherein the first antenna is attached to the control pad via servos and actuators such that the first antenna is rotated and pivoted about the control pad, and wherein the first antenna is configured to automatically self-calibrate and position itself in a position based on feedback input from the receiver.
15. The method of claim 14, further comprising generating, by at least one additional antenna, at least one third frequency, wherein generating the pulse frequency is performed based on the first to third frequencies.
16. The method of claim 14, wherein the object comprises a portion of a user's body.
17. A method for generating an electronic signal using an input device to command a computer or other programmable device, comprising:
generating, by a plurality of antennas, a plurality of frequencies, the plurality of antennas comprising first and second antennas configured to respectively generate first and second frequencies;
generating a pulse frequency based on the plurality of frequencies, when an object is placed in proximity to the plurality of antennas, without any physical contact between the object and the input device;
sensing, by at least one receiver, the pulse frequency;
translating the pulse frequency into an electronic signal; and
commanding the computer or other programmable device using the electronic signal,
wherein the receiver is embedded in or externally attached to a control pad, wherein the first antenna and the second antenna are positioned on opposite ends of the control pad and the receiver is positioned between the two antennas, and wherein the object is configured to cause the first and second frequencies to be heterodyned so as to generate the heterodyned pulse frequency that is subsequently sensed by the receiver,
wherein the first antenna is attached to the control pad via a plurality of actuators such that the first antenna is rotated and pivoted about the control pad, and wherein the first antenna is configured to automatically self-calibrate and position itself in a position based on feedback input from the receiver.
18. The method of claim 17, wherein the object comprises a portion of a user's body.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/252,066 USRE48054E1 (en) | 2005-01-07 | 2016-08-30 | Virtual interface and control device |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US64180905P | 2005-01-07 | 2005-01-07 | |
US11/327,785 US20060152482A1 (en) | 2005-01-07 | 2006-01-06 | Virtual interface and control device |
US13/174,357 US8358283B2 (en) | 2005-01-07 | 2011-06-30 | Virtual interface and control device |
US13/746,244 US8823648B2 (en) | 2005-01-07 | 2013-01-21 | Virtual interface and control device |
US15/252,066 USRE48054E1 (en) | 2005-01-07 | 2016-08-30 | Virtual interface and control device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/746,244 Reissue US8823648B2 (en) | 2005-01-07 | 2013-01-21 | Virtual interface and control device |
Publications (1)
Publication Number | Publication Date |
---|---|
USRE48054E1 (en) | 2020-06-16
Family
ID=36652769
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/327,785 Abandoned US20060152482A1 (en) | 2005-01-07 | 2006-01-06 | Virtual interface and control device |
US13/174,357 Active US8358283B2 (en) | 2005-01-07 | 2011-06-30 | Virtual interface and control device |
US13/746,244 Ceased US8823648B2 (en) | 2005-01-07 | 2013-01-21 | Virtual interface and control device |
US15/252,066 Expired - Fee Related USRE48054E1 (en) | 2005-01-07 | 2016-08-30 | Virtual interface and control device |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/327,785 Abandoned US20060152482A1 (en) | 2005-01-07 | 2006-01-06 | Virtual interface and control device |
US13/174,357 Active US8358283B2 (en) | 2005-01-07 | 2011-06-30 | Virtual interface and control device |
US13/746,244 Ceased US8823648B2 (en) | 2005-01-07 | 2013-01-21 | Virtual interface and control device |
Country Status (1)
Country | Link |
---|---|
US (4) | US20060152482A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060152482A1 (en) * | 2005-01-07 | 2006-07-13 | Chauncy Godwin | Virtual interface and control device |
KR101789683B1 (en) * | 2011-06-13 | 2017-11-20 | 삼성전자주식회사 | Display apparatus and Method for controlling display apparatus and remote controller |
US9298333B2 (en) * | 2011-12-22 | 2016-03-29 | Smsc Holdings S.A.R.L. | Gesturing architecture using proximity sensing |
KR101873749B1 (en) * | 2012-01-26 | 2018-07-03 | 엘지전자 주식회사 | Mobile Terminal |
GB2515830A (en) * | 2013-07-05 | 2015-01-07 | Broadcom Corp | Method and apparatus for use in a radio communication device |
US10289771B2 (en) * | 2015-12-16 | 2019-05-14 | Dassault Systemes | Modification of a constrained asymmetrical subdivision mesh |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6710770B2 (en) * | 2000-02-11 | 2004-03-23 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US7006236B2 (en) * | 2002-05-22 | 2006-02-28 | Canesta, Inc. | Method and apparatus for approximating depth of an object's placement onto a monitored region with applications to virtual interface devices |
US6611252B1 (en) * | 2000-05-17 | 2003-08-26 | Dufaux Douglas P. | Virtual data input device |
US6650318B1 (en) * | 2000-10-13 | 2003-11-18 | Vkb Inc. | Data input device |
US20020075334A1 (en) * | 2000-10-06 | 2002-06-20 | Yfantis Evangelos A. | Hand gestures and hand motion for replacing computer mouse events |
AU2002362085A1 (en) * | 2001-12-07 | 2003-07-09 | Canesta Inc. | User interface for electronic devices |
US6654001B1 (en) * | 2002-09-05 | 2003-11-25 | Kye Systems Corp. | Hand-movement-sensing input device |
- 2006-01-06: US application Ser. No. 11/327,785 filed; published as US20060152482A1 (Abandoned)
- 2011-06-30: US application Ser. No. 13/174,357 filed; issued as US8358283B2 (Active)
- 2013-01-21: US application Ser. No. 13/746,244 filed; issued as US8823648B2 (Ceased)
- 2016-08-30: US application Ser. No. 15/252,066 filed; issued as USRE48054E1 (Expired - Fee Related)
Patent Citations (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4524348A (en) * | 1983-09-26 | 1985-06-18 | Lefkowitz Leonard R | Control interface |
US20030048312A1 (en) | 1987-03-17 | 2003-03-13 | Zimmerman Thomas G. | Computer data entry and manipulation apparatus and method |
US5392054A (en) * | 1993-01-29 | 1995-02-21 | Ericsson Ge Mobile Communications Inc. | Diversity antenna assembly for portable radiotelephones |
US5959612A (en) * | 1994-02-15 | 1999-09-28 | Breyer; Branko | Computer pointing device |
US5774045A (en) * | 1994-07-18 | 1998-06-30 | Siemens Aktiengesellschaft | Arrangement for detecting objects in a region to be monitored |
US6005547A (en) * | 1995-10-14 | 1999-12-21 | Xerox Corporation | Calibration of an interactive desktop system |
US5990865A (en) * | 1997-01-06 | 1999-11-23 | Gard; Matthew Davis | Computer interface device |
US5973677A (en) * | 1997-01-07 | 1999-10-26 | Telxon Corporation | Rechargeable, untethered electronic stylus for computer with interactive display screen |
US6542125B1 (en) * | 1997-06-04 | 2003-04-01 | Robert Bosch Gmbh | Radio device with moveable antenna |
US6243054B1 (en) * | 1998-07-01 | 2001-06-05 | Deluca Michael | Stereoscopic user interface method and apparatus |
US6690357B1 (en) | 1998-10-07 | 2004-02-10 | Intel Corporation | Input device using scanning sensors |
US6498628B2 (en) | 1998-10-13 | 2002-12-24 | Sony Corporation | Motion sensing interface |
US6313825B1 (en) | 1998-12-28 | 2001-11-06 | Gateway, Inc. | Virtual input device |
US6614422B1 (en) | 1999-11-04 | 2003-09-02 | Canesta, Inc. | Method and apparatus for entering data using a virtual input device |
US20020075240A1 (en) | 2000-05-29 | 2002-06-20 | Vkb Inc | Virtual data entry device and method for input of alphanumeric and other data |
US20020006807A1 (en) | 2000-06-28 | 2002-01-17 | Jani Mantyjarvi | Method and arrangement for entering data in an electronic apparatus and an electronic apparatus |
US20050276448A1 (en) * | 2000-07-07 | 2005-12-15 | Pryor Timothy R | Multi-functional control and entertainment systems |
US20020033803A1 (en) * | 2000-08-07 | 2002-03-21 | The Regents Of The University Of California | Wireless, relative-motion computer input device |
US7589709B2 (en) * | 2002-06-04 | 2009-09-15 | Koninklijke Philips Electronics N.V. | Method of measuring the movement of an input device |
US20040041828A1 (en) * | 2002-08-30 | 2004-03-04 | Zellhoefer Jon William | Adaptive non-contact computer user-interface system and method |
US7583983B2 (en) * | 2002-09-20 | 2009-09-01 | Kyocera Corporation | Adaptive array wireless communication apparatus, reception level display method, reception level adjusting method, reception level display program, and reception level adjusting program |
US20040135776A1 (en) * | 2002-10-24 | 2004-07-15 | Patrick Brouhon | Hybrid sensing techniques for position determination |
US20050134556A1 (en) * | 2003-12-18 | 2005-06-23 | Vanwiggeren Gregory D. | Optical navigation based on laser feedback or laser interferometry |
US20050248539A1 (en) * | 2004-05-05 | 2005-11-10 | Morrison Gerald D | Apparatus and method for detecting a pointer relative to a touch surface |
US8358283B2 (en) * | 2005-01-07 | 2013-01-22 | Chauncy Godwin | Virtual interface and control device |
US20070103440A1 (en) * | 2005-11-08 | 2007-05-10 | Microsoft Corporation | Optical tracker |
US7414705B2 (en) * | 2005-11-29 | 2008-08-19 | Navisense | Method and system for range measurement |
US20070211022A1 (en) * | 2006-03-08 | 2007-09-13 | Navisense. Llc | Method and device for three-dimensional sensing |
US8614669B2 (en) * | 2006-03-13 | 2013-12-24 | Navisense | Touchless tablet method and system thereof |
US20070220437A1 (en) * | 2006-03-15 | 2007-09-20 | Navisense, Llc. | Visual toolkit for a virtual user interface |
US20080120577A1 (en) * | 2006-11-20 | 2008-05-22 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling user interface of electronic device using virtual plane |
US20090183125A1 (en) * | 2008-01-14 | 2009-07-16 | Prime Sense Ltd. | Three-dimensional user interface |
US20110279397A1 (en) * | 2009-01-26 | 2011-11-17 | Zrro Technologies (2009) Ltd. | Device and method for monitoring the object's behavior |
US20110181509A1 (en) * | 2010-01-26 | 2011-07-28 | Nokia Corporation | Gesture Control |
US20110298708A1 (en) * | 2010-06-07 | 2011-12-08 | Microsoft Corporation | Virtual Touch Interface |
US8730162B1 (en) * | 2011-04-07 | 2014-05-20 | Google Inc. | Methods and apparatus related to cursor device calibration |
US20130002614A1 (en) * | 2011-06-28 | 2013-01-03 | Microsoft Corporation | Electromagnetic 3d stylus |
Also Published As
Publication number | Publication date |
---|---|
US20130222242A1 (en) | 2013-08-29 |
US20120025959A1 (en) | 2012-02-02 |
US8823648B2 (en) | 2014-09-02 |
US20060152482A1 (en) | 2006-07-13 |
US8358283B2 (en) | 2013-01-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
USRE48054E1 (en) | Virtual interface and control device | |
US11099655B2 (en) | System and method for gesture based data and command input via a wearable device | |
US11221730B2 (en) | Input device for VR/AR applications | |
US9268400B2 (en) | Controlling a graphical user interface | |
EP2717120B1 (en) | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications | |
US5598187A (en) | Spatial motion pattern input system and input method | |
US7358963B2 (en) | Mouse having an optically-based scrolling feature | |
US8907894B2 (en) | Touchless pointing device | |
US20030174125A1 (en) | Multiple input modes in overlapping physical space | |
US20070222746A1 (en) | Gestural input for navigation and manipulation in virtual space | |
US20110279397A1 (en) | Device and method for monitoring the object's behavior | |
US20110037695A1 (en) | Ergonomic control unit for providing a pointing function | |
US9141230B2 (en) | Optical sensing in displacement type input apparatus and methods | |
US20110095983A1 (en) | Optical input device and image system | |
EP1160651A1 (en) | Wireless cursor control | |
WO2003003185A1 (en) | System for establishing a user interface | |
US20050264522A1 (en) | Data input device | |
US11992752B2 (en) | Input apparatus for a games console | |
US20080036739A1 (en) | Integrated Wireless Pointing Device, Terminal Equipment with the Same, and Pointing Method Using Wireless Pointing Device | |
EP1785817A2 (en) | External operation signal recognition system of a mobile communication terminal | |
KR100812998B1 (en) | Mouse Control System Using Bluetooth Communication | |
JP2000187551A (en) | Input device | |
WO2003071411A1 (en) | Multiple input modes in overlapping physical space | |
JP2003108308A (en) | Mouse with lateral scroll function | |
JP2001084099A (en) | Track ball |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |