US9244538B2 - Using portable electronic devices for user input - Google Patents
- Publication number
- US9244538B2 (application US13/789,688)
- Authority
- US
- United States
- Prior art keywords
- sensor reading
- electronic device
- portable electronic
- acceleration
- accelerometer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/02—Details of telephonic subscriber devices including a Bluetooth interface
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Definitions
- In computing, a mouse is a pointing device that allows for fine control of a graphical user interface on a computer.
- Computer mice are external to computers and thus may lack portability.
- Laptops, netbooks, and other portable computing devices typically do not include mice; instead, trackpads, pointing sticks, or touchscreens are used. Though portable, such pointing devices suffer from poor precision and inferior usability when compared to computer mice.
- the portable electronic device can be a smartphone, a personal data assistant, a portable navigation device, and/or other types of electronic device.
- the portable electronic device can include a processor, a memory, and at least one of an accelerometer, a gyroscope, a magnetometer, or other suitable types of inertial measurement unit.
- the portable electronic device can be operatively coupled to the computer via a wired (e.g., a USB connection) or wireless (e.g., Bluetooth, WIFI, etc.) connection.
- the inertial measurement unit can detect and measure accelerations of the portable electronic device along at least two dimensions in a coordinate system.
- the processor, executing instructions stored in the memory, then receives and double integrates the measured accelerations with respect to time to obtain position changes of the portable electronic device along the at least two dimensions.
- the processor can then transmit the calculated position changes to the computer via the wired or wireless connection.
- the computer can move a computer cursor on a graphical user interface in accordance with the calculated position changes of the portable electronic device.
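The measure-integrate-transmit pipeline above can be sketched numerically. The following is a minimal illustration (not the patented implementation), assuming discrete acceleration samples at a fixed interval and simple Euler integration:

```python
def double_integrate(accels, dt):
    """Twice-integrate sampled net accelerations (m/s^2) over time step dt (s)
    to estimate a position change (m), using simple Euler integration."""
    velocity = 0.0
    position = 0.0
    for a in accels:
        velocity += a * dt          # first integration: acceleration -> velocity
        position += velocity * dt   # second integration: velocity -> position
    return position

# A brief push followed by an equal braking phase: the device moves, then stops.
samples = [0.0, 2.0, 2.0, -2.0, -2.0, 0.0]
dx = double_integrate(samples, dt=1 / 30)  # 30 Hz sampling, as in the description
```

In practice the samples would first be calibrated against gravity and filtered, as the later stages describe.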
- FIG. 1 is a schematic diagram illustrating a computing framework for using a portable electronic device for user input in accordance with embodiments of the present technology.
- FIG. 2 is a block diagram showing computing components suitable for the computer and the portable electronic device of FIG. 1 in accordance with embodiments of the present technology.
- FIG. 3 is a block diagram showing software modules suitable for the portable electronic device of FIG. 2 in accordance with embodiments of the present technology.
- FIG. 4 is a flow diagram illustrating a process for using a portable electronic device for user input in accordance with embodiments of the present technology.
- FIG. 5A is an example acceleration versus time plot in accordance with embodiments of the present technology.
- FIGS. 5B and 5C are example original and corrected speed versus time plots obtained based on the acceleration versus time plot of FIG. 5A , respectively.
- FIG. 5D is an example position versus time plot based on the corrected speed versus time plot of FIG. 5C .
- computer mice may lack portability for use with laptops, netbooks, and other portable computing devices.
- Trackpads, pointing sticks, touchscreens, or other pointing devices though portable, suffer from poor precision and inferior usability when compared to computer mice.
- Several embodiments of the present technology are directed to using portable electronic devices that users already carry as pointing devices.
- portable electronic devices with an inertial measurement unit can be configured to provide generally similar usability as computer mice. As a result, users can achieve fine control of graphical user interfaces on computers without carrying additional components.
- an “inertial measurement unit” generally refers to a measuring component configured to measure at least one of a velocity, orientation, or gravitational force of a physical mass.
- An inertial measurement unit can include at least one of an accelerometer, a gyroscope, a magnetometer, or other suitable types of inertial sensing element.
- the term “accelerometer” generally refers to a sensing element configured to measure a proper acceleration of a mass.
- a proper acceleration measured by an accelerometer is not necessarily a coordinate acceleration (i.e., a rate of change of velocity). For example, an accelerometer at rest on a surface of the Earth would measure a proper acceleration of 9.81 m/s² due to its weight. By contrast, an accelerometer in free fall would measure a proper acceleration of zero.
- FIG. 1 is a schematic diagram illustrating a computing framework 100 for using a portable electronic device for user input in accordance with embodiments of the present technology.
- the computing framework 100 can include a computer 110 in communication with a portable electronic device 120 via a connection 104 . Both the computer 110 and the portable electronic device 120 are resting on a surface 102 .
- the connection 104 can include a universal serial bus (“USB”) link, a serial link, a parallel link, a Bluetooth link, a WIFI link, and/or other suitable wired or wireless link.
- the computer 110 includes a laptop computer having a keyboard 115 and a display 116 .
- the computer 110 can also include a netbook, a desktop, and/or other suitable computing devices.
- the portable electronic device 120 includes a smartphone having a housing 121 carrying a touchscreen 124 and a button 125 .
- the portable electronic device 120 can also include a personal data assistant, a portable navigation device, or other suitable portable electronic devices.
- the portable electronic device 120 can also include a processor, a memory, and an inertial measurement unit (not shown in FIG. 1 ) configured to measure and track position changes of the portable electronic device 120 . The portable electronic device 120 can then transmit the position changes to the computer 110 to control positions of a cursor 118 on the display 116 .
- a user 101 can move the portable electronic device 120 along the x-, y-, or z-axis on the surface 102 .
- the portable electronic device 120 can then determine a position change along at least the x-, y-, or z-axis from a first position 129 a to a second position 129 b (shown in phantom lines for clarity), as indicated by the arrow 106 .
- the portable electronic device 120 then transmits the determined position change as a cursor control signal or other suitable types of user input signal to the computer 110 via the connection 104 .
- the computer 110 can use the user input signal to, for example, control a cursor position on the display 116 , execute computing commands, and/or perform other suitable actions based on the user input signal. For instance, as shown in FIG. 1 , the computer 110 can cause the cursor 118 to traverse from a first cursor position 119 a to a second cursor position 119 b (shown in phantom lines for clarity) based on the position change of the portable electronic device 120 . As a result, the user 101 may use the portable electronic device 120 for user input in generally the same manner as a computer mouse.
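How a reported position change might drive the cursor 118 can be illustrated with a hypothetical pixels-per-meter gain; the patent does not specify this mapping, so both the function name and the gain value below are made up for illustration:

```python
def to_cursor_delta(dx_m, dy_m, gain=4000.0):
    """Map a physical position change (meters) to a cursor movement in
    pixels. The gain (pixels per meter) is an assumed sensitivity knob,
    analogous to a mouse-speed setting; it is not taken from the patent."""
    return round(dx_m * gain), round(dy_m * gain)

# A 5 cm rightward, 1 cm toward-user slide of the device.
delta = to_cursor_delta(0.05, -0.01)
```

The computer 110 would then offset the cursor position by this delta, clamped to the display bounds.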
- FIG. 2 is a block diagram showing computing components suitable for the computer 110 and the portable electronic device 120 of FIG. 1 in accordance with embodiments of the present technology.
- individual software components, modules, and routines may be a computer program, procedure, or process written as source code in C, C++, Java, and/or other suitable programming languages.
- the computer program, procedure, or process may be compiled into object or machine code and presented for execution by one or more processors of a personal computer, a network server, a laptop computer, a smartphone, and/or other suitable computing devices.
- Various implementations of the source and/or object code and associated data may be stored in a computer memory that includes read-only memory, random-access memory, magnetic disk storage media, optical storage media, flash memory devices, and/or other suitable computer readable storage media excluding propagated signals.
- the computer 110 can include a processor 112 , a memory 113 , and an input/output component 114 operatively coupled to the display 116 .
- the processor 112 can include a microprocessor, a field-programmable gate array, and/or other suitable logic devices.
- the memory 113 can include volatile and/or nonvolatile computer readable media (e.g., ROM; RAM, magnetic disk storage media; optical storage media; flash memory devices, EEPROM, and/or other suitable storage media) configured to store data received from, as well as instructions for, the processor 112 .
- the input/output component 114 can include a digital and/or analog input/output interface configured to accept input from and/or provide output to the display 116 and/or other components of the computer 110 .
- the portable electronic device 120 can include a device processor 122 , a device memory 123 , and an inertial measurement unit 126 operatively coupled to one another.
- the device processor 122 can include a microprocessor and/or other suitable logic devices.
- the device memory 123 can include volatile and/or nonvolatile computer readable media (e.g., ROM; RAM, magnetic disk storage media; optical storage media; flash memory devices, EEPROM, and/or other suitable storage media) configured to store data received from, as well as instructions for, the device processor 122 .
- the inertial measurement unit 126 can include at least one of an accelerometer, a gyroscope, a magnetometer, or other suitable types of inertial sensing element.
- the inertial measurement unit 126 can include a piezoelectric, piezo-resistive, or capacitive accelerometer.
- One suitable accelerometer can be a 3-axis accelerometer (Model No. LIS3DH) provided by STMicroelectronics of Geneva, Switzerland.
- the inertial measurement unit 126 can include a pendulous integrating gyroscopic accelerometer, a surface micro-machined capacitive accelerometer, and/or other suitable types of accelerometer.
- the device processor 122 can be configured to execute instructions of software components.
- software components of the device processor 122 can include an input module 132 , a database module 134 , a process module 136 , and an output module 138 interconnected with one another.
- the device processor 122 may execute instructions of other suitable software components in addition to or in lieu of the foregoing software modules.
- the input module 132 can accept input data 150 (e.g., sensor readings from the inertial measurement unit 126 ) and communicate the accepted input data 150 to other components for further processing.
- the database module 134 organizes records, including measurement records 142 , and facilitates storing and retrieving of these records to and from the device memory 123 .
- the measurement records 142 may include contemporaneous and/or historical values of at least one of acceleration, speed, or position in at least one dimension. Any type of database organization may be utilized, including a flat file system, hierarchical database, relational database, or distributed database.
- the process module 136 analyzes the input data 150 from the input module 132 and/or other data sources to determine a position change of the portable electronic device 120 .
- the output module 138 generates output signals 152 based on the analyzed input data 150 and transmits the output signals 152 as user input signals to the computer 110 via the connection 104 .
- Embodiments of the process module 136 are described in more detail below with reference to FIG. 3 .
- FIG. 3 is a block diagram showing embodiments of the process module 136 in FIG. 2 .
- the process module 136 may further include a sensing module 160 , an analysis module 162 , a control module 164 , and a calculation module 166 interconnected with one another.
- Each module may be a computer program, procedure, or routine written as source code in a conventional programming language, or one or more modules may be hardware modules.
- the sensing module 160 is configured to receive the input data 150 and convert the input data 150 into suitable metrics in engineering or other units.
- the sensing module 160 may receive sensor readings from the inertial measurement unit 126 ( FIG. 2 ) and convert the received sensor readings to data of an acceleration vector in meters per second squared in a Euclidean coordinate system.
- the acceleration vector can thus have a magnitude and a direction, which may be represented as acceleration sub-vectors in two- or three-dimensions.
- the sensing module 160 may sample the input from the inertial measurement unit 126 at about 30 Hz, about 60 Hz, about 120 Hz, about 1,000 Hz, or other suitable frequencies.
- the acceleration vector can also be represented as acceleration sub-vectors in a cylindrical coordinate system, a spherical coordinate system, or other suitable coordinate systems.
- the sensing module 160 may perform other suitable conversions.
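The unit conversion performed by the sensing module 160 might look like the following sketch. The sensitivity value is a hypothetical datasheet figure (16384 LSB/g is typical of a 16-bit sensor at a ±2 g full-scale range), not one taken from the patent:

```python
G = 9.80665  # standard gravity, m/s^2

def counts_to_mps2(raw_counts, counts_per_g=16384):
    """Convert a raw accelerometer reading (signed integer counts) to m/s^2.

    counts_per_g is an assumed sensitivity; the real value comes from the
    sensor datasheet for the selected full-scale range."""
    return raw_counts / counts_per_g * G

# One reading per axis yields an acceleration vector in m/s^2.
vector = tuple(counts_to_mps2(c) for c in (0, 0, 16384))  # device at rest, z up
```

The resulting Cartesian vector could then be re-expressed in cylindrical or spherical coordinates if desired.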
- the calculation module 166 may include routines configured to perform various types of calculations to facilitate operation of other modules.
- calculation module 166 can include linear regression, polynomial regression, interpolation, extrapolation, and/or other suitable subroutines. In further examples, the calculation module 166 can also include counters, timers, and/or other suitable routines.
- the analysis module 162 can be configured to analyze the various sensed and/or calculated metrics to determine if the portable electronic device 120 is moving. For example, the analysis module 162 can compare a current acceleration vector ā to a previous acceleration vector ā′, for instance stored in the device memory 123 as the measurement records 142 . If the magnitude of the change in the acceleration vector ā is greater than a threshold (e.g., a moving average of the magnitude or other suitable values), the analysis module 162 can indicate that the portable electronic device 120 is moving.
- otherwise, if the magnitude change does not exceed the threshold, the analysis module 162 can indicate that the portable electronic device 120 is stationary and can set the current acceleration vector ā as a current value of the gravitational acceleration g of the portable electronic device 120 .
- the analysis module 162 can also monitor the velocity v . If the velocity has a magnitude greater than a speed threshold, the analysis module 162 can indicate that the portable electronic device 120 is moving. In other examples, the analysis module 162 can perform other suitable analysis to determine if the portable electronic device 120 is moving.
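A minimal sketch of the motion test described above, assuming three-axis acceleration vectors; the fixed threshold below is an illustrative value (the description suggests e.g. a moving average of the magnitude instead):

```python
import math

def is_moving(current, previous, threshold=0.5):
    """Decide whether the device is moving by comparing the magnitude of the
    change between two acceleration vectors (m/s^2) against a threshold.

    threshold=0.5 m/s^2 is an assumed illustrative value, not one from the
    patent."""
    delta = math.sqrt(sum((c - p) ** 2 for c, p in zip(current, previous)))
    return delta > threshold

at_rest = (0.0, 0.0, 9.81)   # gravity only
nudged = (1.2, 0.0, 9.81)    # a sideways push
```

A velocity-magnitude test, as also mentioned above, would follow the same comparison pattern.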
- the control module 164 may be configured to control the operation of the portable electronic device 120 based on analysis results from the analysis module 162 . For example, in one embodiment, if the analysis module 162 indicates that the portable electronic device 120 is moving, the control module 164 can instruct the calculation module 166 to calculate the position change vector p , as discussed above. The control module 164 can then provide the calculated position change vector p to the output module 138 as the output signals 152 .
- the control module 164 may also generate the output signals 152 based on user input 154 from the user 101 ( FIG. 1 ). For example, the control module 164 may interpret the user input 154 as a click, a double click, a scroll, a pan, and/or other suitable commands.
- the control module 164 may generate the output signals 152 based on other suitable information.
- the computer 110 may determine such position changes.
- the processor 112 of the computer 110 may execute instructions for at least one of the software modules 160 , 162 , 164 , and 166 .
- the portable electronic device 120 can transmit the input data 150 and/or other suitable information to the computer 110 via the connection 104 .
- the computer 110 can determine position changes of the portable electronic device 120 , as discussed above and as discussed with reference to FIG. 4 .
- FIG. 4 is a flowchart showing a process 200 for using a portable electronic device for user input in accordance with embodiments of the present technology. Even though the process 200 is described below with reference to the computing framework 100 of FIG. 1 and the software components/modules of FIGS. 2 and 3 , the process 200 may also be applied in other systems with additional or different hardware and/or software components.
- the process 200 can include acquiring a sensor signal or sensor reading at stage 202 .
- acquiring the sensor signal can include sampling the inertial measurement unit 126 ( FIG. 2 ) to obtain data of a proper acceleration vector at 30 Hz or at other suitable sampling rates.
- acquiring the sensor signal can also include acquiring data of at least one of an angular momentum, a pitch, a roll, or a yaw of the portable electronic device 120 .
- acquiring the sensor signal can include acquiring other suitable measurements.
- the process 200 can also include determining if the portable electronic device 120 is moving at stage 204 .
- the determination may be made with the analysis module 162 ( FIG. 3 ) as discussed above with reference to FIG. 3 . In other embodiments, the determination may be made in other suitable manners.
- the process 200 can optionally include storing the acquired sensor signal at stage 206 .
- the acquired sensor signal can include an acceleration vector, and can be stored as the gravitational acceleration for the portable electronic device 120 in, for example, the device memory 123 ( FIG. 2 ).
- the gravitational acceleration may or may not be normal to the surface of the Earth.
- for example, the gravitational acceleration may be canted relative to the surface 102 ( FIG. 1 ) if the surface 102 is not flat.
- other suitable sensor signals may also be stored in the device memory 123 or in other suitable locations.
- the process 200 can proceed to optional stage 207 to calibrate the sensor signal.
- the sensor signal may be calibrated by retrieving the gravitational acceleration of the portable electronic device 120 from, for example, the device memory 123 and subtracting the retrieved gravitational acceleration from the acquired sensor signal. As a result, a net acceleration vector accounting for the gravitational acceleration experienced by the portable electronic device 120 can be obtained.
- the calibration of the sensor signal at stage 207 may be omitted.
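The calibration at stage 207 amounts to a component-wise vector subtraction of the stored gravitational acceleration from the measured acceleration. A minimal sketch, with made-up sample values:

```python
def calibrate(measured, gravity):
    """Subtract the stored gravitational acceleration vector from a measured
    acceleration vector, yielding the net acceleration used for integration."""
    return tuple(m - g for m, g in zip(measured, gravity))

stored_g = (0.0, 0.3, 9.80)   # captured while stationary on a canted surface
reading = (0.8, 0.3, 9.80)    # the same tilt plus a push along x
net = calibrate(reading, stored_g)
```

Because the stored vector already reflects any tilt of the surface 102, the subtraction removes the canted gravity component along with the vertical one.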
- the process 200 can then include determining a position change of the portable electronic device 120 at stage 208 .
- the position change may be determined as a position change vector by double integrating the net acceleration with the calculation module 166 , as discussed above with reference to FIG. 3 .
- the position change may be determined by performing a first integration of the net acceleration to obtain a velocity, and a second integration of the velocity to obtain the position change vector with or without an exponential decay function. Filtering, averaging, and/or other suitable data manipulation may be performed between the first and second integration.
- the position change may be determined in other suitable manners.
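One way the exponential decay mentioned above can enter the second integration is as a per-step damping of the velocity before it is integrated into position; the following is a sketch under that assumption, with an illustrative mean lifetime τ:

```python
import math

def integrate_with_decay(accels, dt, tau=0.5):
    """First integrate net acceleration (m/s^2) into velocity, damping the
    velocity by exp(-dt/tau) each step to bleed off integration drift, then
    integrate the damped velocity into a position change (m).

    tau=0.5 s is an assumed illustrative mean lifetime; the description says
    it may be user selected or otherwise suitably determined."""
    decay = math.exp(-dt / tau)
    velocity = 0.0
    position = 0.0
    for a in accels:
        velocity = velocity * decay + a * dt
        position += velocity * dt
    return position
```

With a small constant sensor bias as input, the damped velocity saturates instead of growing without bound, so the reported position change stays bounded; a smaller τ damps drift more aggressively.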
- the process 200 can include outputting the determined position change as pointer data to the computer 110 ( FIG. 1 ) via the connection 104 at stage 210 .
- the computer 110 can cause the cursor 118 to move, for example, from the first cursor position 119 a to the second cursor position 119 b ( FIG. 1 ).
- the process 200 then includes a decision block 212 to determine whether the process continues. In one embodiment, the process 200 continues if the computer 110 is still operating. In other embodiments, the process 200 may continue based on other suitable conditions. If the process continues, the process 200 reverts to acquiring additional sensor signals at stage 202 ; otherwise, the process ends.
- FIG. 5A is an example acceleration versus time plot 300 in accordance with embodiments of the present technology.
- FIGS. 5A-5D show various moving parameters (i.e., accelerations, velocities, or position changes) of the portable electronic device 120 ( FIG. 1 ).
- the acceleration experienced by the portable electronic device 120 can have a generally constant acceleration a 0 before time t 1 , at which time, the acceleration increases until a maximum acceleration 302 is reached at time t 2 . Subsequently, the portable electronic device 120 decelerates to a negative maximum acceleration 304 at time t 3 before returning to the original acceleration a 0 at t 4 .
- FIG. 5B is an example speed versus time plot 310 obtained (e.g., by integrating with respect to time) based on the acceleration versus time plot of FIG. 5A without calibration. As shown in FIG. 5B , it is believed that a non-flat surface 102 ( FIG. 1 ) on which the portable electronic device rests may cause the obtained velocity to have non-zero values 312 before time t 1 and after time t 4 , when the portable electronic device 120 is stationary.
- the acceleration as shown in FIG. 5A may be calibrated by subtracting the gravitational acceleration experienced by the portable electronic device 120 , as discussed above with reference to FIG. 4 .
- FIG. 5C is an example speed versus time plot 320 obtained based on the acceleration versus time plot of FIG. 5A with calibration. As shown in FIG. 5C , the velocity increases after time t 1 until reaching a maximum velocity 322 . Subsequently, the velocity decreases to zero at time t 4 .
- FIG. 5D is an example position versus time plot 330 based on the corrected speed versus time plot of FIG. 5C .
- the position of the portable electronic device 120 changes from P1 to P2 over t 1 to t 4 .
- the inventor has recognized that noise in the acceleration data as shown in FIG. 5A may cause the position at P2 to be non-constant, as shown by the dashed line 324 .
- such an error may be corrected by double integrating the acceleration with an exponential decay function, as discussed above with reference to FIG. 3 .
- such an error may be corrected by setting the position at t 4 to be constant.
- other suitable techniques may also be used to correct such an error.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Telephone Function (AREA)
Abstract
Description
ā_net = ā − ḡ

where ā_net is the net acceleration vector; ā is the measured acceleration vector; and ḡ is the gravitational acceleration vector.

v̄ = ∫ ā_net dt

where v̄ is the velocity vector obtained by integrating the net acceleration with respect to time.

p̄ = ∫ v̄ e^(−t/τ) dt

where p̄ is the position change vector and τ is a mean lifetime that may be user selected or otherwise suitably determined. In other examples, the exponential decay function may be omitted.
Claims (20)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/789,688 US9244538B2 (en) | 2013-03-08 | 2013-03-08 | Using portable electronic devices for user input |
PCT/US2014/020055 WO2014137955A1 (en) | 2013-03-08 | 2014-03-04 | Using portable electronic devices for user input |
CN201480012629.6A CN105027037B (en) | 2013-03-08 | 2014-03-04 | User's input is carried out using portable electric appts |
EP14712871.4A EP2965177B1 (en) | 2013-03-08 | 2014-03-04 | Using portable electronic devices for user input on a computer |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/789,688 US9244538B2 (en) | 2013-03-08 | 2013-03-08 | Using portable electronic devices for user input |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140253443A1 US20140253443A1 (en) | 2014-09-11 |
US9244538B2 true US9244538B2 (en) | 2016-01-26 |
Family
ID=50382611
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/789,688 Active 2034-04-08 US9244538B2 (en) | 2013-03-08 | 2013-03-08 | Using portable electronic devices for user input |
Country Status (4)
Country | Link |
---|---|
US (1) | US9244538B2 (en) |
EP (1) | EP2965177B1 (en) |
CN (1) | CN105027037B (en) |
WO (1) | WO2014137955A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9423318B2 (en) * | 2014-07-29 | 2016-08-23 | Honeywell International Inc. | Motion detection devices and systems |
CN107168522A (en) * | 2017-04-10 | 2017-09-15 | 北京小鸟看看科技有限公司 | Control method, device and the virtual reality device of application |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6104380A (en) * | 1997-04-14 | 2000-08-15 | Ricoh Company, Ltd. | Direct pointing apparatus for digital displays |
US6212296B1 (en) * | 1997-12-23 | 2001-04-03 | Ricoh Company, Ltd. | Method and apparatus for transforming sensor signals into graphical images |
US20030117370A1 (en) * | 1999-12-16 | 2003-06-26 | Van Brocklin Andrew L. | Optical pointing device |
US20040236500A1 (en) * | 2003-03-18 | 2004-11-25 | Samsung Electronics Co., Ltd. | Input system based on a three-dimensional inertial navigation system and trajectory estimation method thereof |
US20060182316A1 (en) * | 2005-02-16 | 2006-08-17 | Samsung Electronics Co., Ltd. | Apparatus and method for recognizing spatial writing and recording medium storing the method |
US20060187208A1 (en) * | 2005-02-24 | 2006-08-24 | Wenstrand John S | Programmable lift response for an optical navigation device |
US20070171202A1 (en) * | 2006-01-24 | 2007-07-26 | Samsung Electronics Co., Ltd. | Trajectory estimation apparatus, method, and medium for estimating two-dimensional trajectory of gesture |
US20080120448A1 (en) | 2006-11-21 | 2008-05-22 | Microsoft Corporation | Remote mouse and keyboard using bluetooth |
US20080255795A1 (en) * | 2007-04-13 | 2008-10-16 | Keynetik, Inc. | Force Sensing Apparatus and Method to Determine the Radius of Rotation of a Moving Object |
US20080291163A1 (en) * | 2004-04-30 | 2008-11-27 | Hillcrest Laboratories, Inc. | 3D Pointing Devices with Orientation Compensation and Improved Usability |
US20090298538A1 (en) | 2008-05-30 | 2009-12-03 | Hon Fu Jin Precision Industry (Shenzhen) Co., Ltd. | Multifunction mobile phone and method thereof |
US20100033352A1 (en) * | 2008-08-08 | 2010-02-11 | Industrial Technology Research Institute | Real-time motion recognition method and inertia-sensing and trajectory-reconstruction device using the same |
US20100042358A1 (en) | 2008-08-15 | 2010-02-18 | Apple Inc. | Motion plane correction for mems-based input devices |
US20110001663A1 (en) | 2009-07-02 | 2011-01-06 | Seiko Epson Corporation | Position calculation method and position calculation apparatus |
US7870496B1 (en) | 2009-01-29 | 2011-01-11 | Jahanzeb Ahmed Sherwani | System using touchscreen user interface of a mobile device to remotely control a host computer |
US20110199301A1 (en) | 2010-02-12 | 2011-08-18 | Microsoft Corporation | Sensor-based Pointing Device for Natural Input and Interaction |
US8065508B2 (en) | 2007-11-09 | 2011-11-22 | Google Inc. | Activating applications based on accelerometer data |
US20110304577A1 (en) * | 2010-06-11 | 2011-12-15 | Sp Controls, Inc. | Capacitive touch screen stylus |
US20120179408A1 (en) * | 2011-01-06 | 2012-07-12 | Sony Corporation | Information processing apparatus, information processing system, and information processing method |
US20130069917A1 (en) * | 2011-08-04 | 2013-03-21 | Jeen-Shing WANG | Moving trajectory generation method |
- 2013-03-08: US application US13/789,688 (patent US9244538B2), active
- 2014-03-04: EP application EP14712871.4 (patent EP2965177B1), active
- 2014-03-04: WO application PCT/US2014/020055 (WO2014137955A1), application filing
- 2014-03-04: CN application CN201480012629.6 (patent CN105027037B), active
US20110199301A1 (en) | 2010-02-12 | 2011-08-18 | Microsoft Corporation | Sensor-based Pointing Device for Natural Input and Interaction |
US20110304577A1 (en) * | 2010-06-11 | 2011-12-15 | Sp Controls, Inc. | Capacitive touch screen stylus |
US20120179408A1 (en) * | 2011-01-06 | 2012-07-12 | Sony Corporation | Information processing apparatus, information processing system, and information processing method |
US20130069917A1 (en) * | 2011-08-04 | 2013-03-21 | Jeen-Shing WANG | Moving trajectory generation method |
Non-Patent Citations (10)
Title |
---|
"DroidPad: Use Your Smartphone as Your PC Mouse or Joystick [Android]", Retrieved at <<http://www.makeuseof.com/dir/droidpad-use-smartphone-as-mouse/>>, Retrieved Date: Feb. 28, 2013, pp. 4. |
"International Search Report & Written Opinion for PCT Patent Application No. PCT/US2014/020055", Mailed Date: Jul. 8, 2014, Filed Date: Mar. 4, 2014, 9 Pages. |
Dong, Chelsea, "SmartMouse: Turn Your Smart Phone into a Wireless Multi-touch Magic Mouse", Retrieved at <<http://technode.com/2012/09/17/smartmouse-turn-your-smart-phone-into-a-wireless-multi-touch-magic-mouse/>>, Sep. 17, 2012, pp. 3. |
"International Preliminary Report on Patentability for PCT Patent Application No. PCT/US2014/020055", Mailed Date: Feb. 9, 2015, Filed Date: Mar. 4, 2014, 11 Pages. |
Kiran R, et al., "Implementing Mobile Phone as a Multi-Purpose Controller using 3D Sensor Technology", Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5563675>>, In 3rd IEEE International Conference on Computer Science and Information Technology, Jul. 9, 2010, pp. 5. |
Rhee, Ed, "How to Use Your Android Device as a Cordless Mouse and Keyboard", Retrieved at <<http://howto.cnet.com/8301-11310-39-20070948-285/how-to-use-your-android-device-as-a-cordless-mouse-and-keyboard/>>, Jun. 15, 2011, pp. 6. |
Smurf, Lazy, "WiFi Mouse: Utilize your Android Smartphone as a Mouse, Keyboard and Trackpad", Retrieved at <<http://www.themobimag.com/wifi-mouse-utilize-smartphone-as-mouse/>>, Retrieved Date: Feb. 28, 2013, pp. 3. |
Also Published As
Publication number | Publication date |
---|---|
EP2965177B1 (en) | 2019-07-31 |
WO2014137955A1 (en) | 2014-09-12 |
CN105027037B (en) | 2018-02-06 |
US20140253443A1 (en) | 2014-09-11 |
CN105027037A (en) | 2015-11-04 |
EP2965177A1 (en) | 2016-01-13 |
Similar Documents
Publication | Title |
---|---|
US8928609B2 (en) | Combining touch screen and other sensing detections for user interface control |
CN102289306B (en) | Attitude sensing equipment and positioning method thereof as well as method and device for controlling mouse pointer |
US9135802B2 (en) | Hardware attitude detection implementation of mobile devices with MEMS motion sensors |
US20180058857A1 (en) | Local perturbation rejection using time shifting |
US20160047840A1 (en) | Offset estimation apparatus, offset estimation method, and computer readable medium |
CN102914414B (en) | Vibration measuring instrument based on Android platform mobile phone and detection method thereof |
US20180267074A1 (en) | Systems and methods for motion detection |
CN109425753A (en) | Hybrid altimeter for measuring vertical velocity |
Bose et al. | On the noise and power performance of a shoe-mounted multi-IMU inertial positioning system |
US20220146280A1 (en) | Walk discrimination device, walk discrimination method, and program recording medium |
US9244538B2 (en) | Using portable electronic devices for user input |
JP6175127B2 (en) | Holding state determination device and program |
KR20130053882A (en) | Terminal device for correcting gyro-sensor sensing value and accelation sensor sensing value and method for controlling thereof |
KR101639351B1 (en) | Wearable input system and method for recognizing motion |
US9927917B2 (en) | Model-based touch event location adjustment |
CN101315589A (en) | Electronic device and position detection device and method thereof |
WO2014185027A1 (en) | Offset estimation device, offset estimation method, and program |
JP6073455B2 (en) | Holding state change detection device, holding state change detection method, and program |
US11592911B2 (en) | Predictive data-reconstruction system and method for a pointing electronic device |
US11048343B2 (en) | Method and apparatus for analyzing mouse gliding performance |
CN110109599A (en) | The interactive approach of user and stylus, categorizing system and stylus product |
US20240337994A1 (en) | Human Operable Controller |
Salcedo Ortega | Microelectromechanical Systems Inertial Measurement Unit As An Attitude And Heading Reference System |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ELSON, JEREMY;REEL/FRAME:029947/0499 Effective date: 20130226 |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417 Effective date: 20141014 Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454 Effective date: 20141014 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |