US20120249475A1 - 3D user interface control - Google Patents

3D user interface control

Info

Publication number
US20120249475A1
US20120249475A1 (application US13/434,677)
Authority
US
United States
Prior art keywords
command
sensor
gesture
user interaction
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/434,677
Inventor
Mark J. Murphy
Mel J. CONWAY
Adrian Flanagan
Shrenik Deliwala
Eoin E. ENGLISH
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Analog Devices Inc
Original Assignee
Analog Devices Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Analog Devices Inc filed Critical Analog Devices Inc
Priority to US13/434,677
Assigned to ANALOG DEVICES, INC. Assignment of assignors interest (see document for details). Assignors: DELIWALA, SHRENIK; CONWAY, MEL J.; ENGLISH, EOIN E.; FLANAGAN, ADRIAN; MURPHY, MARK J.
Publication of US20120249475A1
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1684: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
    • G06F 1/1694: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2200/00: Indexing scheme relating to G06F 1/04 - G06F 1/32
    • G06F 2200/16: Indexing scheme relating to G06F 1/16 - G06F 1/18
    • G06F 2200/163: Indexing scheme relating to constructional details of the computer
    • G06F 2200/1637: Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Programmable Controllers (AREA)

Abstract

Techniques to provide a three-dimensional (“3D”) user interface (“UI”) processing system for a device that may include at least one sensor, a 3D display, and a controller. The controller may include a memory, which may store instructional information, and a processor. The processor may be configured to receive sensor data from the sensor(s) and to interpret sensor data according to the instructional information. The processor may generate a user interface command(s) and transmit the command(s) to the 3D display to control and/or manipulate the 3D display. The processor may also generate host commands to control and/or execute applications within a host system of the device.

Description

    RELATED APPLICATIONS
  • This application claims priority to provisional U.S. Patent Application Ser. No. 61/470,764, entitled “Touch Screen and Haptic Control” filed on Apr. 1, 2011, the content of which is incorporated herein in its entirety.
  • BACKGROUND
  • Electronic devices such as smart phones, tablet computers, gaming devices, etc. often include two-dimensional graphical user interfaces (“GUIs”) that provide for interaction and control of the device. A user controls the device by performing gestures or movements on or about icons or “keys” represented on the GUI or moving the device in space (e.g., shaking or turning the device).
  • Some electronic devices also provide for three-dimensional (“3D”) displays. A 3D display provides an illusion of visual depth to a user for 3D content displayed or rendered by the 3D display. Although 3D displays are well-known, user interfaces for 3D displays are primitive and often require interaction with conventional input devices, such as touch screens, to control a 3D enabled display device.
  • Accordingly, there is a need in the art for improving 3D user interface control.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a display device according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of a user interface (“UI”) controller according to an embodiment of the present invention.
  • FIG. 3 is a model of detection layers for a display device according to an embodiment of the present invention.
  • FIGS. 4-8 illustrate exemplary 3D workspaces and UI controls for use with embodiments of the present invention.
  • DETAILED DESCRIPTION
  • Embodiments of the present invention provide a 3D user interface processing system for a device that may include at least one sensor, a 3D display, and a controller. The controller may include a memory, which may store instructional information, and a processor. The processor may be configured to receive sensor data from the sensor(s) and to interpret sensor data according to the instructional information. The processor may generate a user interface command(s) and transmit the command(s) to the 3D display. The processor may also generate host commands to control and/or execute applications within a host system of the device.
  • FIG. 1 is a simplified block diagram of a display device 100 according to an embodiment of the present invention. The device 100 may include a UI controller 110, a display driver 120.1, a 3D display 130.1, an optical sensor 140, a touch sensor 150, a touch screen 160, and a host system 170. The device 100 may be embodied as a consumer electronic device such as a cell phone, PDA, tablet computer, gaming device, television, etc. The 3D display 130.1 may be embodied as a stereoscopic display or an auto-stereoscopic display.
  • The UI controller 110 may include a processor 112 and a memory 114. The processor 112 may control the operations of the UI controller 110 according to instructions stored in the memory 114. The memory 114 may store gesture definition libraries 114.1, UI maps 114.2, and command definition libraries 114.3, which may enable user control or manipulation of workspaces for the 3D display 130.1, and/or control functions or applications of the display device 100. The memory 114 may be provided as a non-volatile memory, a volatile memory such as random access memory (RAM), or a combination thereof.
  • The UI controller 110 may be coupled to the host system 170 of the device. The UI controller 110 may receive instructions from the host system 170. The host system 170 may include an operating system (“OS”) 172 and application(s) 174.1-174.N that are executed by the host system 170. The host system 170 may include program instructions to govern operations of the device 100 and manage device resources on behalf of various applications. The host system 170 may, for example, manage content of the 3D display 130.1. In an embodiment, the UI controller 110 may be integrated into the host system 170.
  • The UI controller 110 may manage user interaction for the device 100 based on the gesture definitions 114.1, UI maps 114.2, and command definitions 114.3 stored in the memory 114. User interaction with the display device 100 may be detected using various UI sensors. As illustrated in FIG. 1, the UI controller 110 may be coupled to UI sensor(s) such as the optical sensor 140 and the touch sensor 150 that may measure different user interactions with the 3D display 130.1 and/or the touch screen 160. The touch screen 160 may be overlaid on the face of a display, which may be provided as a backlit LCD display with an LCD matrix, lenticular lenses, polarizers, etc.
  • The optical sensor 140 may detect user interactions within 3D workspaces of the device 100, which are discussed in more detail with reference to FIG. 4. The optical sensor 140 may detect the location (or locations, for multiple user interactions) and the time of the user movement (e.g., of a finger, stylus, pen, etc.) as it hovers or moves within the 3D workspace(s). The optical sensor 140 may include one or more light emitters (not shown), such as a light emitting diode (“LED”) or other similar device, and one or more light sensors (not shown). During operation of the display device 100, the light emitters may emit light into the 3D workspace(s). An object or multiple objects in the field of view of the emitters may reflect light back into the light sensors, which may generate electrical impulses/signals representing the intensity of light incident thereon.
  • The signals may indicate a location of light impingement on a surface of each light sensor. The optical sensor 140 may digitize and decode the signals from the sensor(s) to calculate a three-dimensional position (i.e., using X, Y, Z coordinates) of the object. Calculation of X, Y, Z coordinates of an object in a field of view is described in the following U.S. patent applications, the contents of which are incorporated herein: U.S. patent application Ser. No. 12/327,511, entitled “Method of Locating an Object in 3D,” filed on Dec. 3, 2008; U.S. patent application Ser. No. 12/499,335, entitled “Method of Locating an Object in 3D,” filed on Jul. 8, 2009; U.S. patent application Ser. No. 12/499,369, entitled “Method of Locating an Object in 3D,” filed on Jul. 8, 2009, issued as U.S. Pat. No. 8,072,614 on Dec. 6, 2011; U.S. patent application Ser. No. 12/499,384, entitled “Method of Locating an Object in 3D,” filed on Jul. 8, 2009; U.S. patent application Ser. No. 12/499,414, entitled “Method of Locating an Object in 3D,” filed on Jul. 8, 2009; U.S. patent application Ser. No. 12/435,499, entitled “Optical Distance Measurement by Triangulation of an Active Transponder,” filed on May 5, 2009; U.S. patent application Ser. No. 12/789,190, entitled “Position Measurement Systems Using Position Sensitive Detectors,” filed on May 27, 2010; U.S. patent application Ser. No. 12/783,673, entitled “Multiuse Optical Sensor,” filed on May 20, 2010.
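  • The incorporated applications above define the actual X, Y, Z computation; purely as a toy illustration of triangulating a position from light impingement offsets, the sketch below assumes two idealized one-dimensional detectors with an invented baseline and focal length (the names DetectorReading and locate_object and all geometry are hypothetical, not the incorporated method).

```python
# Toy triangulation sketch (not the incorporated method): two idealized 1D
# position-sensitive detectors separated by a known baseline report the offset
# of the reflected-light centroid on their surfaces; depth follows from the
# disparity between the two offsets, as in classic stereo ranging.

from dataclasses import dataclass

@dataclass
class DetectorReading:
    centroid_mm: float   # light impingement offset on the detector surface (mm)

def locate_object(left: DetectorReading, right: DetectorReading,
                  baseline_mm: float = 20.0, focal_mm: float = 2.0):
    """Return an (x, y, z) estimate in millimetres; y is omitted (planar sketch)."""
    disparity = left.centroid_mm - right.centroid_mm
    if abs(disparity) < 1e-6:
        raise ValueError("object too far away to range")
    z = focal_mm * baseline_mm / disparity          # similar-triangles depth
    x = z * left.centroid_mm / focal_mm - baseline_mm / 2.0
    return (x, 0.0, z)

if __name__ == "__main__":
    print(locate_object(DetectorReading(0.45), DetectorReading(0.25)))
```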
  • The touch sensor 150 may detect user touches on or near the touch screen 160. The touch sensor 150 may detect the location (or locations for multi-touch user interactions) and time of the user touch. The touch sensor 150 may be complementary to the type of touch screen 160. For example, if the touch screen 160 is provided as a capacitive touch screen, the touch sensor 150 may be provided as a capacitive sensor (or a grid of capacitive sensors) detecting changes in respective capacitive fields. Further, the touch sensor 150 may be programmed to differentiate between desired user touch events and false positives for objects larger than a typical user interaction instrument (e.g., finger, pen, stylus). Other types of sensors may include but are not limited to resistive sensors, inductive sensors, etc.
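  • A minimal sketch of one way such false-positive rejection could work, assuming a grid of capacitance deltas: contiguous cells above a threshold are grouped, and patches larger than a typical fingertip footprint are discarded. The threshold, patch-size limit, and the function name detect_touches are invented for illustration.

```python
# Hypothetical sketch: threshold a grid of capacitance deltas and reject
# contact patches whose area exceeds a typical finger/stylus footprint,
# treating them as false positives (e.g., a palm resting on the screen).

def detect_touches(grid, threshold=30, max_patch_cells=6):
    """grid: 2D list of capacitance deltas. Returns centroids of valid touches."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= threshold and not seen[r][c]:
                # flood-fill the contiguous patch above threshold
                stack, patch = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    patch.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and grid[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(patch) <= max_patch_cells:     # reject oversized objects
                    cy = sum(p[0] for p in patch) / len(patch)
                    cx = sum(p[1] for p in patch) / len(patch)
                    touches.append((cx, cy))
    return touches
```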
  • The UI controller 110 may receive the sensor(s) results and may generate user interface commands to control or manipulate content rendered on the 3D display 130.1, via the display driver 120.1. The UI controller 110 may also generate host commands based on the sensor(s) results to control and/or execute applications in the host system 170, which may also alter content rendered on the 3D display 130.1.
  • In an embodiment, the UI controller 110 may also be coupled to a haptics driver 120.2, which may drive one or more haptics actuator(s) 130.2. In such an embodiment, the UI controller 110 may generate haptics commands to control the haptics actuator(s) 130.2 from the sensor(s) results. Haptics refers to the sense of touch. In electronic devices, haptics relates to providing sensory feedback, or “haptics effects,” to a user. The haptics actuator(s) 130.2 may be embodied as piezoelectric elements, linear resonant actuators (“LRAs”), eccentric rotating mass actuators (“ERMs”), and/or other known actuator types. The haptics driver 120.2 may transmit drive signals to the haptics actuator(s) 130.2, causing them to vibrate according to the drive signal properties. The vibrations may be felt by the user, providing various vibro-tactile sensory feedback stimuli.
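  • As a hedged sketch of the drive-signal idea, the snippet below synthesizes a short decaying sine burst at an assumed LRA resonant frequency; the 175 Hz frequency, sample rate, and the function name lra_click are illustrative assumptions, not values from the patent.

```python
# Hedged sketch: synthesize a short sine burst at an assumed LRA resonant
# frequency that a haptics driver could stream to the actuator; amplitude
# and duration shape the perceived "click".

import math

def lra_click(duration_s=0.02, freq_hz=175.0, sample_rate=8000, amplitude=1.0):
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        envelope = 1.0 - (i / n)                 # simple decaying envelope
        samples.append(amplitude * envelope * math.sin(2 * math.pi * freq_hz * t))
    return samples

drive_signal = lra_click()   # would be handed to the haptics driver (120.2)
```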
  • For a 3D enabled display device, such as device 100, haptics effects may be generated on the device itself, meaning a user may feel the haptics effects with the hand opposite the hand interacting with a 3D workspace. For example, selecting an interactive element in a 3D workspace with a finger from the user's right hand may generate a haptics effect felt by the user's left hand, which may indicate selection of the element. Similar haptics effects may be generated for other interactions; for example, scrolling a 3D workspace from left to right may generate a haptics effect which may provide a sensation of movement from left to right.
  • FIG. 2 is a functional block diagram of a UI controller 200 according to an embodiment of the present invention. The UI controller 200 may include input driver(s) 210, a processor 220, a memory 230, and output driver(s) 240. The input driver(s) 210 may receive sensor inputs (e.g., from optical sensors, touch sensors, etc.) and may generate a corresponding input signal. The sensors may be coupled to the input driver(s) 210 via a serial interface such as a high-speed I2C interface. The input driver(s) 210 may also control the coupled sensors' operations, such as when to power on, read data, etc.
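  • Illustration only: one way an input driver might poll such a sensor over I2C from Python, using the smbus2 library. The bus number, the device address 0x39, the register 0x10, and the 6-byte result format are hypothetical placeholders; a real part's datasheet would define them.

```python
# Hedged sketch of an I2C sensor poll. Device address, register, and data
# layout below are invented for illustration.

from smbus2 import SMBus

SENSOR_ADDR = 0x39        # hypothetical 7-bit device address
RESULT_REG = 0x10         # hypothetical register holding X/Y/Z result bytes

def read_position(bus_num=1):
    with SMBus(bus_num) as bus:
        raw = bus.read_i2c_block_data(SENSOR_ADDR, RESULT_REG, 6)
        x = int.from_bytes(raw[0:2], "little", signed=True)
        y = int.from_bytes(raw[2:4], "little", signed=True)
        z = int.from_bytes(raw[4:6], "little", signed=True)
        return x, y, z
```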
  • The processor 220 may control the operations of the UI controller 200 according to instructions saved in the memory 230. The memory 230 may be provided as a non-volatile memory, a volatile memory such as random access memory (“RAM”), or a combination thereof. The processor 220 may include a gesture classification module 222, a UI search module 224, and a command search module 226. The memory 230 may include gesture definition data 232, UI map data 234, and command definition data 236. The data may be stored as look-up-tables (“LUTs”).
  • For example, the gesture definition data 232 may include a LUT with possible input value(s) and corresponding gesture(s). The UI map data 234 may include a LUT with possible input value(s) and corresponding icon(s). Furthermore, the command definitions 236 may include a LUT with possible gesture and icon values and corresponding user interface commands or host commands. In an embodiment, data (e.g., gesture data, UI map data, and/or command data) may be written into the memory 230 by the host system 250 or may be pre-programmed.
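  • A minimal sketch of the three LUTs as plain Python dictionaries, assuming invented keys and values (quantized motion symbols, layer/band indices, and command descriptors); the patent does not specify these encodings.

```python
# Invented placeholder encodings for the three LUTs described above.

GESTURE_DEFINITIONS = {          # quantized input pattern -> gesture name
    ("z_dip", "z_rise"): "snap_touch",
    ("x_sweep_pos",): "swipe_right",
    ("tap", "tap"): "double_tap",
}

UI_MAP = {                       # (layer, x-band, y-band) -> UI element
    (1, 0, 0): "icon_mail",
    (1, 1, 0): "icon_maps",
    (2, 0, 0): "active_zone",
}

COMMAND_DEFINITIONS = {          # (gesture, element) -> command descriptor
    ("snap_touch", "icon_mail"): {"type": "host", "action": "launch_mail"},
    ("swipe_right", "active_zone"): {"type": "ui", "action": "scroll_right"},
    ("double_tap", "active_zone"): {"type": "ui", "action": "activate_layer"},
}
```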
  • The gesture classification module 222 may receive the input signals from the input driver(s) 210 and may decode a gesture(s) from the input signals based on the gesture definition data 232. For example, the gesture classification module 222 may compare the input signals to stored input values in the gesture definition data 232 and may match the input signals to a corresponding stored gesture value. The gesture may represent a user action within a 3D workspace or on the touch screen as indicated by the input signals. The gesture classification module 222 may also calculate location data from the input signals and may generate location data (e.g., X, Y, Z coordinates) therefrom.
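  • A hedged sketch of the classification step under the same invented encoding: raw (x, y, z, t) samples are quantized into coarse motion symbols, the symbol sequence is looked up in the gesture LUT, and the sample positions are averaged into a single location. The thresholds and the functions quantize and classify_gesture are assumptions.

```python
# Quantize raw samples into symbols, match against the gesture LUT, and
# reduce the samples to one (x, y, z) location. Thresholds are illustrative.

def quantize(samples):
    """samples: list of (x, y, z, t). Emit coarse motion symbols."""
    symbols = []
    for (x0, y0, z0, _), (x1, y1, z1, _) in zip(samples, samples[1:]):
        if z1 - z0 < -5:
            symbols.append("z_dip")
        elif z1 - z0 > 5:
            symbols.append("z_rise")
        elif x1 - x0 > 5:
            symbols.append("x_sweep_pos")
    return tuple(symbols)

def classify_gesture(samples, gesture_definitions):
    if not samples:
        return None, None
    gesture = gesture_definitions.get(quantize(samples))
    n = len(samples)
    location = (sum(s[0] for s in samples) / n,
                sum(s[1] for s in samples) / n,
                sum(s[2] for s in samples) / n)
    return gesture, location
```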
  • The UI search module 224 may receive the gesture and/or location data from the gesture classification module 222. The UI search module 224 may calculate a UI interaction such as an icon selection or workspace manipulation (scroll, zoom, etc.) based on the UI map data 234. For example, the UI search module 224 may compare the gesture and/or location data to stored map values in the UI map data 234 and may match the gestures and/or locations to corresponding 2D or 3D UI interactions.
  • The command search module 226 may receive the calculated gesture and UI interaction data, and may generate one or more commands (e.g., display commands, host commands, haptics commands, etc.) based on the command definition data 236. For example, the command search module 226 may compare the calculated gesture and UI interaction data to one or more corresponding command(s). The commands may be received by the output driver(s) 240, which, in turn, may control various output devices (e.g., a 3D display, a haptics device, etc.). For example, the commands may control or manipulate 3D content rendered in a 3D workspace of a 3D display (scrolling icons, zooming content for an application, etc.). In various embodiments, data from each of the modules 222-226 may also be communicated to the host system 250 to control OS functions or applications running in the host system 250. For example, swiping a finger from left to right in a 3D workspace of a document reader application may turn pages of a document being viewed with the application.
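  • Tying the modules together, a sketch of how the UI search and command search steps might resolve a location to an element and route the resulting command; ui_search, command_search, dispatch, and the band/layer dimensions are invented, and the callable arguments merely stand in for the output driver(s) 240 and host system 250.

```python
# End-to-end sketch (uses the LUT shapes from the earlier sketch): resolve a
# location to a UI element, look up the (gesture, element) pair, and route the
# resulting command to the display, haptics, or host path.

def ui_search(location, ui_map, band_mm=40.0, layer_height_mm=30.0):
    x, y, z = location
    key = (int(z // layer_height_mm) + 1, int(x // band_mm), int(y // band_mm))
    return ui_map.get(key, "dead_zone")

def command_search(gesture, element, command_definitions):
    return command_definitions.get((gesture, element))

def dispatch(command, display_driver, haptics_driver, host):
    if command is None:
        return                       # dead zone or unknown gesture: no-op
    if command["type"] == "ui":
        display_driver(command["action"])
    elif command["type"] == "host":
        host(command["action"])
    haptics_driver("confirm_pulse")  # optional tactile acknowledgement
```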
  • FIG. 3 illustrates a model 300 of user input detection areas, labeled “detection layer 1”-“detection layer N,” for a 3D enabled display device 310 according to an embodiment of the present invention. The detection layers 1-N may be associated with interactive spatial volumes for detecting user interactions with the device 310. As illustrated, the spatial volumes for the detection layers 1-N may emanate from the device 310 at varying angles, as provided by optical sensors (not shown) for the device 310. The varying angles may provide for detecting user interactions not only directly above the device 310, but also around the perimeter of the device 310. Although the detection layers 1-N are shown as flat spatial volumes, in practice, they may be curved radial spatial volumes.
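  • A rough geometric sketch of assigning a detected point to a detection layer, assuming (purely for illustration) that each layer is a radial shell around the optical sensor bounded by a cone-shaped field of view; the shell radii, the 120-degree FOV, and the function detection_layer are not taken from the patent.

```python
# Assign a detected point to a detection layer modeled as a radial shell
# around the sensor, rejecting points outside an assumed field of view.

import math

LAYER_SHELLS_MM = [(0, 40), (40, 80), (80, 120)]   # illustrative radial bands
FIELD_OF_VIEW_DEG = 120                            # illustrative sensor FOV

def detection_layer(point_mm, sensor_origin_mm=(0.0, 0.0, 0.0)):
    dx, dy, dz = (p - o for p, o in zip(point_mm, sensor_origin_mm))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dz <= 0:
        return None                                # behind the display face
    off_axis = math.degrees(math.atan2(math.hypot(dx, dy), dz))
    if off_axis > FIELD_OF_VIEW_DEG / 2:
        return None                                # outside the field of view
    for index, (near, far) in enumerate(LAYER_SHELLS_MM, start=1):
        if near <= distance < far:
            return index
    return None

print(detection_layer((10.0, 5.0, 60.0)))          # -> 2
```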
  • As discussed, a 3D display may create the illusion of 3D depth for content displayed or rendered on a 3D enabled display device. The content may include GUI elements such as 3D icons, buttons, etc. or 3D application content such as a map, a movie, a drawing canvas, etc. The content may be rendered in 3D workspaces or “UI layers” above the 3D display. The detection layers 1-N as described in FIG. 3 may be calibrated to overlap with the UI layers to detect user interactions with the UI layers. A user may interact within the 3D workspaces to manipulate GUI elements rendered in a workspace (e.g., scrolling, moving, or selecting GUI elements) or to control applications rendered in a workspace. FIGS. 4(a)-(d) illustrate 3D workspaces 400 for a 3D enabled display device 410 according to an embodiment of the present invention. As illustrated in FIG. 4(a), the 3D workspaces may be rendered as one or more 3D workspaces, shown here as “UI Layer 1” and “UI Layer 2.” The 3D workspaces may extend outward, away from the display device 410.
  • In an embodiment, the display device 410 may render content for a 3D workspace when the workspace is “activated” by a user for viewing. As illustrated in FIG. 4(b), a user may activate or “maximize” a 3D workspace by performing a double-tap gesture in the desired workspace. In FIG. 4(b), UI Layer 2 is shown as the “current” working layer for the user, while UI Layer 1 is shown as a desired “next” working layer that the user may wish to maximize. The user may perform the double-tap gesture in UI Layer 1 to cause the display device 410 to render the content associated with UI Layer 1.
  • In an embodiment, a display device may “push” an activated 3D workspace to an outward-most viewing perspective (away from the display device) and maximize the viewing depth of the workspace's rendered content to a predetermined viewing depth.
  • As discussed, a user may also interact with interactive elements (e.g., 3D icons) of a 3D workspace. FIG. 4(c) illustrates a plurality of 3D icons rendered in UI Layer 1 for the device 410. A user may perform a snap touch within a 3D icon rendered in UI Layer 1 (moving a finger quickly forward and back, resembling the touch of an element on a touch screen) to select the corresponding icon.
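  • A hedged sketch of recognizing a snap touch from a short Z trajectory: a quick dip toward the display followed by a return to roughly the starting depth within a small time window. The 15 mm dip, the 0.3 s window, and is_snap_touch are illustrative assumptions.

```python
# Treat a "snap touch" as a quick dip in Z (toward the display) followed by
# a return, completed within a short window. Thresholds are invented.

def is_snap_touch(samples, dip_mm=15.0, max_duration_s=0.3):
    """samples: list of (x, y, z, t) ordered by time, relative to an icon center."""
    if len(samples) < 3:
        return False
    z_start, t_start = samples[0][2], samples[0][3]
    z_min = min(s[2] for s in samples)
    z_end, t_end = samples[-1][2], samples[-1][3]
    went_in = z_start - z_min >= dip_mm            # moved toward the display
    came_back = z_end >= z_start - dip_mm / 3      # returned near start depth
    quick = (t_end - t_start) <= max_duration_s
    return went_in and came_back and quick

print(is_snap_touch([(0, 0, 60, 0.00), (0, 0, 42, 0.10), (0, 0, 58, 0.22)]))  # True
```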
  • The 3D workspaces may include other areas that do not include interactive elements. FIG. 5 illustrates a 3D workspace 500 for a 3D enabled display device 510 according to an embodiment of the present invention. For example, 3D icons may be spaced apart from each other by predetermined distances. The separation distance may be an inactive area between the 3D icons. Further, other areas of the workspaces may be unoccupied by content. In various embodiments, these areas may be designated as “dead zones,” wherein user interaction with the dead zones does not control or manipulate the workspace/device. As illustrated in FIG. 5, the device 510 may render a plurality of 3D icons in a 3D workspace, labeled “UI Layer 2.” Dead zones for UI Layer 2 are indicated as the space between the 3D icons.
  • In other embodiments, areas that do not include interactive elements may be designated as “active zones,” wherein user interaction within an active zone may control or manipulate rendering of a 3D workspace(s) or functions of the device 510. An active zone for UI Layer 2 is shown as the open area behind the 3D icons. A user may move their finger from left-to-right in UI Layer 2 to cause the 3D icons in that layer to scroll from left to right. In other examples, active zones may provide for activating workspaces, zooming, performing gesture commands, etc.
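  • A sketch of the zone logic under invented geometry: a point resolves to an icon if it falls inside an icon's bounding box, to a dead zone if it lands in the gaps between icons, and otherwise to the active zone where scrolling and other gestures apply. ICONS, ICON_ROW_BOUNDS, and hit_test are placeholders.

```python
# Illustrative hit test for icon / dead zone / active zone classification.
# Geometry below is invented for the sketch.

ICONS = {                          # name -> (x_min, x_max, y_min, y_max) in mm
    "icon_mail": (0, 20, 0, 20),
    "icon_maps": (30, 50, 0, 20),  # 10 mm gap between icons acts as dead zone
}
ICON_ROW_BOUNDS = (0, 50, 0, 20)   # bounding box around the whole icon row

def hit_test(x, y):
    for name, (x0, x1, y0, y1) in ICONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return ("icon", name)
    rx0, rx1, ry0, ry1 = ICON_ROW_BOUNDS
    if rx0 <= x <= rx1 and ry0 <= y <= ry1:
        return ("dead_zone", None)        # between icons: ignore interaction
    return ("active_zone", None)          # open area: scroll, zoom, gestures

print(hit_test(25, 10))   # ('dead_zone', None)
print(hit_test(70, 10))   # ('active_zone', None)
```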
  • As discussed, a UI controller (e.g., UI controller 200 of FIG. 2) may also provide for 3D UI controls for user gestures performed within a 3D workspace to manipulate content rendered in the workspace. The UI controller may include gesture definitions and UI maps which may enable control and/or interaction with 3D content associated with various applications rendered on a 3D display.
  • FIGS. 6(a)-(b) illustrate 3D workspaces 600 for a 3D enabled display device 610 according to an embodiment of the present invention. The device 610 may include a 3D mapping application to display a 3D map. The device 610 may render the 3D map in a 3D workspace, shown as “UI Layer 2.” A UI controller (i.e., UI controller 200 of FIG. 2) may enable user control of the 3D map through gestures, selections, etc. to manipulate views of the map or select items in the map. As illustrated in FIG. 6(a), the user may perform a “spreading” gesture to “zoom-out” the 3D map. As illustrated in FIG. 6(b), the user may perform a “pinching” gesture to “zoom-in” the 3D map.
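  • A sketch of deriving a zoom factor from two tracked fingertips, following the convention in the text above (spreading zooms out, pinching zooms in, the inverse of the usual touch convention); the zoom_factor helper and the sample coordinates are invented.

```python
# Derive a zoom factor for the 3D map from the change in distance between two
# tracked fingertips; the ratio is initial/final so that spreading yields a
# factor < 1 (zoom out) and pinching a factor > 1 (zoom in), per the text above.

import math

def zoom_factor(p0_start, p1_start, p0_end, p1_end):
    initial = math.dist(p0_start, p1_start)   # Euclidean distance (Python 3.8+)
    final = math.dist(p0_end, p1_end)
    if final < 1e-6:
        return 1.0
    return initial / final

# spreading: fingers move apart -> factor < 1 (zoom out)
print(zoom_factor((0, 0, 60), (20, 0, 60), (-10, 0, 60), (30, 0, 60)))  # 0.5
```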
  • In an embodiment, a user may also select items in an application (e.g., buildings rendered in a 3D map of a mapping application) by performing a snap touch to activate content related to the items.
  • FIG. 7 illustrates another 3D workspace 700 for a 3D enabled display device 710 according to an embodiment of the present invention. The device 710 may include a 3D keyboard application to display a 3D keyboard. The device 710 may render the 3D keyboard in 3D workspaces, shown as “UI Layer 1” and “UI Layer 2.” The 3D keyboard may be rendered in UI Layer 2, while 3D icons for text input commands such as “Caps Lock,” “Delete,” or “Space” may be rendered in UI Layer 1. A user may input text by swiping a finger through 3D letter icons of the 3D keyboard or by “tapping” the letter icons in UI Layer 2. The user may select the text input commands by performing a snap touch on corresponding 3D icons for the commands.
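  • A sketch of swipe-style entry under an assumed layout: the fingertip path through UI Layer 2 is tested against letter-icon regions, and each region crossed emits its letter with consecutive repeats collapsed. KEY_REGIONS and swipe_to_text are invented placeholders.

```python
# Map a fingertip path through an assumed letter-icon layout to text,
# collapsing consecutive hits on the same key.

KEY_REGIONS = {                       # letter -> (x_min, x_max, y_min, y_max), invented layout
    "c": (0, 20, 0, 20),
    "a": (20, 40, 0, 20),
    "t": (40, 60, 0, 20),
}

def swipe_to_text(path):
    """path: list of (x, y) samples in UI Layer 2."""
    letters = []
    for x, y in path:
        for letter, (x0, x1, y0, y1) in KEY_REGIONS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                if not letters or letters[-1] != letter:
                    letters.append(letter)
                break
    return "".join(letters)

print(swipe_to_text([(5, 10), (15, 10), (25, 10), (45, 10)]))  # "cat"
```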
  • In various embodiments, a UI controller (i.e., UI controller 200 of FIG. 2) may also include gesture definitions associated with 3D UI controls to control operations associated with host system functions of a 3D enabled display device. FIG. 8 illustrates another 3D workspace 800 for a 3D enabled display device 810 according to an embodiment of the present invention. As illustrated in FIG. 8, the device 810 may provide for user inputs through gestures performed in a 3D workspace, shown as “UI Layer 1.” For example, a user may perform an ‘L’ gesture in UI Layer 1 to lock the device 810. In an embodiment, a user may also perform predetermined gesture commands for inputting various words or phrases.
  • Several embodiments of the present invention are specifically illustrated and described herein. However, it will be appreciated that modifications and variations of the present invention are covered by the above teachings. In other instances, well-known operations, components and circuits have not been described in detail so as not to obscure the embodiments. It can be appreciated that the specific structural and functional details disclosed herein may be representative and do not necessarily limit the scope of the embodiments.
  • Those skilled in the art may appreciate from the foregoing description that the present invention may be implemented in a variety of forms, and that the various embodiments may be implemented alone or in combination. Therefore, while the embodiments of the present invention have been described in connection with particular examples thereof, the true scope of the embodiments and/or methods of the present invention should not be so limited since other modifications will become apparent to the skilled practitioner upon a study of the drawings, specification, and following claims.
  • Various embodiments may be implemented using hardware elements, software elements, or a combination of both. Examples of hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (“ASIC”), programmable logic devices (“PLD”), digital signal processors (“DSP”), field programmable gate array (“FPGA”), logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth. Examples of software may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (“API”), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints.
  • Some embodiments may be implemented, for example, using a non-transitory computer-readable medium or article which may store an instruction or a set of instructions that, if executed by a machine, may cause the machine to perform a method and/or operations in accordance with the embodiments. Such a machine may include, for example, any suitable processing platform, computing platform, computing device, processing device, computing system, processing system, computer, processor, or the like, and may be implemented using any suitable combination of hardware and/or software. The computer-readable medium or article may include, for example, any suitable type of memory unit, memory device, memory article, memory medium, storage device, storage article, storage medium and/or storage unit, for example, memory, removable or non-removable media, erasable or non-erasable media, writeable or re-writeable media, digital or analog media, hard disk, floppy disk, Compact Disc Read Only Memory (CD-ROM), Compact Disc Recordable (CD-R), Compact Disc Rewriteable (CD-RW), optical disk, magnetic media, magneto-optical media, removable memory cards or disks, various types of Digital Versatile Disc (DVD), a tape, a cassette, or the like. The instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, encrypted code, and the like, implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.

Claims (26)

1. A three-dimensional (“3D”) user interface (“UI”) processing system for a device, comprising:
a 3D display device to display 3D images;
at least one sensor to detect a user interaction event with the 3D display device; and
a controller, including
a memory to store instructional information related to interaction events; and
a processor to receive sensor data from the at least one sensor and to interpret the sensor data according to the instructional information.
2. The system of claim 1, wherein the at least one sensor is an optical sensor.
3. The system of claim 1, wherein the at least one sensor is a touch screen sensor.
4. The system of claim 3, wherein the touch screen sensor is capacitive.
5. The system of claim 3, wherein the touch screen sensor is resistive.
6. The system of claim 1, further comprising a haptics output device.
7. The system of claim 1, wherein the instructional information includes gesture definitions, UI maps, and command definitions, and wherein to interpret the sensor data includes calculating a gesture from the sensor data based on the gesture definitions, and calculating a user interaction from the sensor data based on a UI map.
8. The system of claim 7, wherein the calculated gesture and user interaction are reported to a host system.
9. The system of claim 7, further comprising the processor to correlate the calculated gesture and user interaction to a UI command, to generate the UI command, and to transmit the UI command to the 3D display device.
10. The system of claim 9, wherein the UI map includes data for 3D interactive elements where the UI command is generated if the user interaction is with a 3D interactive element.
11. The system of claim 7, further comprising the processor to correlate the calculated gesture and user interaction to a host command, to generate the host command, and to transmit the host command to a host system.
12. The system of claim 7, further comprising a haptics output device and the processor to correlate the calculated gesture and user interaction to a haptics command, to generate the haptics command, and to transmit the haptics command to the haptics output device.
13. The system of claim 7, wherein the UI map includes an active zone and a dead zone wherein the UI command is generated if the user interaction is with the active zone and no UI command is generated if the user interaction is with a dead zone.
14. The system of claim 13, further comprising a host system and, if the user interaction is with the active zone, the processor to correlate the calculated gesture and user interaction to a host command, to generate the host command, and to transmit the host command to the host system.
15. The system of claim 13, further comprising a haptics output device and, if the user interaction is with the active zone, the processor to correlate the calculated gesture and user interaction to a haptics command, to generate the haptics command, and to transmit the haptics command to the haptics output device.
16. An electronic device, comprising:
a display;
an input device to capture operator commands; and
a processing system to interpret data from the input device and to display workspaces on the display, wherein
the workspaces are linked to each other by a navigation model having at least one dimension extending laterally with respect to a face of the display and another dimension extending in depth with respect to the face of the display, and
when the processing system recognizes input data corresponding to a navigation command, the processing system identifies a new workspace according to the navigation model and the navigation command and causes the new workspace to be displayed.
17. The device of claim 16, wherein the input device is at least one optical sensor.
18. The device of claim 16, wherein the workspaces include 3D interactive elements, and wherein the navigation command is generated if the user interaction is with a 3D interactive element.
19. An electronic device, comprising:
a stereoscopic display to simulate a three-dimensional (“3D”) image;
an input device to capture operator commands; and
a processing system to interpret data from the input device and to display a 3D workspace on the display, wherein
the 3D workspace includes a plurality of interactive elements distributed across at least two 3D workspace layers at different depths, and
the processing system interprets input data corresponding to operator interaction with the interactive elements displayed in the workspace.
20. The device of claim 19, wherein the input device is at least one optical sensor.
21. The device of claim 19, wherein the processing system includes gesture definitions, workspace maps, and command definitions, and wherein to interpret data from the input device includes calculating a gesture from the input device data based on the gesture definitions, calculating a user interaction with the interactive elements from the input device data based on a workspace map, and correlating the calculated gesture and user interaction to a UI command.
22. A method of controlling a three-dimensional (“3D”) user interface, comprising:
receiving a sensor input based on a user interaction with the 3D user interface;
processing the sensor input;
generating a user interface command based on the processed sensor input and stored instructions; and
controlling operation of the 3D user interface based on the user interface command.
23. The method of claim 22, wherein the processing includes determining a gesture, location, and corresponding interaction event of the user interaction based on stored gesture definitions and user interface maps.
24. The method of claim 23, further comprising generating a host command and controlling operation of a host system based on the host command.
25. The method of claim 23, further comprising generating a haptics command and controlling a haptics output device based on the haptics command.
26. The method of claim 23, wherein user interaction with the 3D user interface includes interaction with 3D elements rendered in the 3D user interface.
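
By way of illustration, the gesture-calculation step recited in claims 1 and 7 could be sketched roughly as follows. The gesture definitions, thresholds, and the classify_gesture helper are hypothetical names introduced only for this sketch; a real controller would classify filtered sensor data rather than a raw three-sample trace.

    from dataclasses import dataclass
    from typing import List, Tuple

    Sample = Tuple[float, float, float]          # one (x, y, z) sensor reading

    @dataclass
    class GestureDef:
        name: str
        axis: int            # 0 = x (lateral), 1 = y (lateral), 2 = z (depth)
        min_travel: float    # signed minimum displacement along the axis

    # Hypothetical "gesture definitions" held in the controller memory (claim 7).
    GESTURE_DEFS = [
        GestureDef("swipe_right", axis=0, min_travel=+30.0),
        GestureDef("swipe_left",  axis=0, min_travel=-30.0),
        GestureDef("push",        axis=2, min_travel=-15.0),   # toward the display
        GestureDef("pull",        axis=2, min_travel=+15.0),   # away from the display
    ]

    def classify_gesture(trace: List[Sample]) -> str:
        """Calculate a gesture from sensor data (claim 7): pick the definition
        whose axis shows the largest qualifying displacement."""
        if len(trace) < 2:
            return "none"
        deltas = [trace[-1][i] - trace[0][i] for i in range(3)]
        best, best_excess = "none", 0.0
        for g in GESTURE_DEFS:
            travel = deltas[g.axis]
            # Same sign as the definition and at least the required travel.
            if travel * g.min_travel > 0 and abs(travel) >= abs(g.min_travel):
                excess = abs(travel) - abs(g.min_travel)
                if excess >= best_excess:
                    best, best_excess = g.name, excess
        return best

    # A short simulated trace moving toward the screen is classified as "push".
    print(classify_gesture([(0, 0, 40.0), (1, 0, 30.0), (1, 0, 18.0)]))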
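
Claims 7, 10, and 13 describe calculating a user interaction against a UI map that distinguishes 3D interactive elements, active zones, and dead zones. A minimal sketch, assuming the map is a flat list of axis-aligned regions; Region, locate_interaction, and should_generate_command are illustrative names only.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Region:
        name: str
        kind: str                       # "element" (3D interactive), "active", or "dead"
        lo: Tuple[float, float, float]  # min corner (x, y, z) over the display
        hi: Tuple[float, float, float]  # max corner

        def contains(self, p: Tuple[float, float, float]) -> bool:
            return all(self.lo[i] <= p[i] <= self.hi[i] for i in range(3))

    # Hypothetical UI map (claim 7): interactive elements plus active/dead zones.
    UI_MAP: List[Region] = [
        Region("ok_button",  "element", (10, 10, 0), (40, 30, 20)),
        Region("scroll_bar", "active",  (90, 0, 0),  (100, 100, 20)),
        Region("bezel",      "dead",    (0, 0, 0),   (10, 100, 20)),
    ]

    def locate_interaction(point: Tuple[float, float, float]) -> Optional[Region]:
        """Calculate the user interaction from sensor data based on the UI map
        (claim 7): return the first region containing the touch/hover point."""
        for region in UI_MAP:
            if region.contains(point):
                return region
        return None

    def should_generate_command(region: Optional[Region]) -> bool:
        """Claims 10 and 13: a UI command is generated only when the interaction
        is with a 3D interactive element or an active zone, never a dead zone."""
        return region is not None and region.kind in ("element", "active")

    hit = locate_interaction((20.0, 15.0, 5.0))      # falls inside "ok_button"
    print(hit.name if hit else None, should_generate_command(hit))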
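
Claims 9, 11, and 12 route the correlated gesture and interaction to different destinations: a UI command to the 3D display, a host command to a host system, and a haptics command to a haptics output device. One possible dispatch table, with purely hypothetical command names standing in for the stored command definitions of claim 7:

    from typing import Dict, List, Tuple

    # Hypothetical command definitions: a (gesture, element) pair is correlated
    # to a UI command, an optional host command, and an optional haptics command.
    COMMAND_DEFS: Dict[Tuple[str, str], Dict[str, str]] = {
        ("push", "ok_button"): {"ui": "SELECT", "host": "APP_CONFIRM",
                                "haptics": "CLICK_PULSE"},
        ("swipe_right", "scroll_bar"): {"ui": "SCROLL_RIGHT", "haptics": "TICK"},
    }

    def dispatch(gesture: str, interaction: str) -> List[Tuple[str, str]]:
        """Claims 9, 11, and 12: correlate the calculated gesture and user
        interaction to commands and transmit each to its destination."""
        cmds = COMMAND_DEFS.get((gesture, interaction), {})
        routed = []
        if "ui" in cmds:
            routed.append(("3d_display", cmds["ui"]))
        if "host" in cmds:
            routed.append(("host_system", cmds["host"]))
        if "haptics" in cmds:
            routed.append(("haptics_device", cmds["haptics"]))
        return routed

    print(dispatch("push", "ok_button"))
    print(dispatch("push", "bezel"))        # dead zone: no command is generated

Keeping the routing in a table of this kind is one way a dedicated controller could offload command generation from the host, as the reporting and correlation claims contemplate, but the claims do not require any particular data structure.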
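
Claims 16 through 18 recite a navigation model with one lateral dimension and one depth dimension linking workspaces. A toy sketch under the assumption that workspaces are keyed by a (lateral index, depth index) pair; the workspace and gesture names are invented for illustration:

    from typing import Dict, Tuple

    # Hypothetical navigation model (claim 16): one lateral dimension across the
    # face of the display and one depth dimension behind it.
    WORKSPACES: Dict[Tuple[int, int], str] = {
        (0, 0): "home",     (1, 0): "mail",     (2, 0): "photos",
        (0, 1): "settings", (1, 1): "mail_archive",
    }

    NAV_COMMANDS = {
        "swipe_left":  (+1, 0),   # move laterally
        "swipe_right": (-1, 0),
        "push":        (0, +1),   # move deeper into the stack
        "pull":        (0, -1),   # move back toward the face of the display
    }

    def navigate(current: Tuple[int, int], nav_command: str) -> Tuple[int, int]:
        """Claim 16: identify the new workspace from the navigation model and the
        recognized navigation command; stay put if no workspace exists there."""
        dx, dz = NAV_COMMANDS.get(nav_command, (0, 0))
        candidate = (current[0] + dx, current[1] + dz)
        return candidate if candidate in WORKSPACES else current

    pos = (0, 0)
    for cmd in ["swipe_left", "push", "pull", "swipe_right"]:
        pos = navigate(pos, cmd)
        print(cmd, "->", WORKSPACES[pos])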
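
Claims 19 through 26 cover a stereoscopic workspace whose interactive elements sit on layers at different depths, and a method that receives sensor input, processes it, generates a UI command from stored instructions, and controls the 3D user interface. A compact sketch under those assumptions, with every element and command name invented for illustration:

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Element:
        name: str
        layer_depth: float              # simulated depth of the workspace layer
        center: Tuple[float, float]     # position within the layer
        radius: float

    # Hypothetical 3D workspace (claim 19): elements spread over two layers.
    WORKSPACE: List[Element] = [
        Element("play",   layer_depth=10.0, center=(50, 50), radius=15),
        Element("delete", layer_depth=30.0, center=(50, 50), radius=15),
    ]

    def process_input(x: float, y: float, z: float) -> Optional[Element]:
        """Claims 19 and 23: interpret the input by finding the element whose
        footprint contains (x, y) and whose layer is nearest the sensed depth."""
        candidates = [e for e in WORKSPACE
                      if (x - e.center[0]) ** 2 + (y - e.center[1]) ** 2 <= e.radius ** 2]
        if not candidates:
            return None
        return min(candidates, key=lambda e: abs(e.layer_depth - z))

    def control_3d_ui(x: float, y: float, z: float) -> str:
        """Claim 22: receive sensor input, process it, generate a UI command, and
        return it to control operation of the 3D user interface."""
        element = process_input(x, y, z)
        return f"ACTIVATE:{element.name}" if element else "NO_COMMAND"

    print(control_3d_ui(52.0, 48.0, 12.0))   # nearer the shallow layer -> "play"
    print(control_3d_ui(52.0, 48.0, 28.0))   # nearer the deeper layer  -> "delete"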
US13/434,677 2011-04-01 2012-03-29 3d user interface control Abandoned US20120249475A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/434,677 US20120249475A1 (en) 2011-04-01 2012-03-29 3d user interface control

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161470764P 2011-04-01 2011-04-01
US13/434,677 US20120249475A1 (en) 2011-04-01 2012-03-29 3d user interface control

Publications (1)

Publication Number Publication Date
US20120249475A1 true US20120249475A1 (en) 2012-10-04

Family

ID=46926543

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/433,069 Abandoned US20120249461A1 (en) 2011-04-01 2012-03-28 Dedicated user interface controller for feedback responses
US13/433,105 Active 2032-08-25 US8937603B2 (en) 2011-04-01 2012-03-28 Method and apparatus for haptic vibration response profiling and feedback
US13/434,677 Abandoned US20120249475A1 (en) 2011-04-01 2012-03-29 3d user interface control
US13/434,623 Abandoned US20120249474A1 (en) 2011-04-01 2012-03-29 Proximity and force detection for haptic effect generation

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/433,069 Abandoned US20120249461A1 (en) 2011-04-01 2012-03-28 Dedicated user interface controller for feedback responses
US13/433,105 Active 2032-08-25 US8937603B2 (en) 2011-04-01 2012-03-28 Method and apparatus for haptic vibration response profiling and feedback

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/434,623 Abandoned US20120249474A1 (en) 2011-04-01 2012-03-29 Proximity and force detection for haptic effect generation

Country Status (2)

Country Link
US (4) US20120249461A1 (en)
WO (4) WO2012135378A1 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130268882A1 (en) * 2012-04-10 2013-10-10 Lg Electronics Inc. Display apparatus and method of controlling the same
US20140085249A1 (en) * 2012-09-25 2014-03-27 Synaptics Incorporated Systems and methods for decoupling image generation rate from reporting rate in capacitive sensing
US20140184510A1 (en) * 2013-01-02 2014-07-03 Samsung Electronics Co., Ltd. Mouse function provision method and terminal implementing the same
US20140215406A1 (en) * 2013-01-31 2014-07-31 Sony Corporation Mobile terminal
US20150121263A1 (en) * 2013-10-31 2015-04-30 Fih (Hong Kong) Limited Display device and method for navigating between display layers thereof
US20150138068A1 (en) * 2012-05-03 2015-05-21 Georgia Tech Research Corporation Methods, Controllers and Computer Program Products for Accessibility to Computing Devices
WO2016018305A1 (en) * 2014-07-30 2016-02-04 Hewlett-Packard Development Company, L.P. Detector for a display
US20160110078A1 (en) * 2011-06-09 2016-04-21 C/O Sony Corporation Information processing device, information processing method and program
US20160162095A1 (en) * 2014-05-21 2016-06-09 International Business Machines Corporation Evaluation of digital content using intentional user feedback obtained through haptic interface
WO2018067587A1 (en) 2016-10-09 2018-04-12 Alibaba Group Holding Limited Three-dimensional graphical user interface for informational input in virtual reality environment
US10198668B2 (en) 2014-07-16 2019-02-05 Samsung Electronics Co., Ltd. Apparatus and method for supporting computer aided diagnosis (CAD) based on probe speed
US10712930B2 (en) 2017-05-28 2020-07-14 International Business Machines Corporation 3D touch based user interface value pickers
CN112639685A (en) * 2018-09-04 2021-04-09 苹果公司 Display device sharing and interaction in Simulated Reality (SR)
US11099723B2 (en) 2014-10-02 2021-08-24 Huawei Technologies Co., Ltd. Interaction method for user interfaces
US11494986B2 (en) * 2017-04-20 2022-11-08 Samsung Electronics Co., Ltd. System and method for two dimensional application usage in three dimensional virtual reality environment

Families Citing this family (124)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5648207B2 (en) * 2009-09-04 2015-01-07 現代自動車株式会社 Vehicle control device
US10191546B2 (en) * 2011-06-20 2019-01-29 Immersion Corporation Haptic theme framework
US20130009915A1 (en) 2011-07-08 2013-01-10 Nokia Corporation Controlling responsiveness to user inputs on a touch-sensitive display
US9195351B1 (en) * 2011-09-28 2015-11-24 Amazon Technologies, Inc. Capacitive stylus
JP6130844B2 (en) * 2011-10-19 2017-05-17 トムソン ライセンシングThomson Licensing Remote control with feedback for blind navigation
IL216118A0 (en) * 2011-11-03 2012-02-29 Google Inc Customer support solution recommendation system
US8633911B2 (en) * 2011-12-14 2014-01-21 Synaptics Incorporated Force sensing input device and method for determining force information
EP2624100B1 (en) 2012-02-01 2017-06-14 Immersion Corporation Eccentric rotating mass actuator optimization for haptic effects
CN103425064B (en) * 2012-05-09 2017-12-22 布里斯托尔D/B/A远程自动化解决方案公司 Pass through the method and apparatus of process control equipment display information
US20150234493A1 (en) 2012-05-09 2015-08-20 Nima Parivar Varying output for a computing device based on tracking windows
US10108265B2 (en) 2012-05-09 2018-10-23 Apple Inc. Calibration of haptic feedback systems for input devices
US20130318437A1 (en) * 2012-05-22 2013-11-28 Samsung Electronics Co., Ltd. Method for providing ui and portable apparatus applying the same
WO2013188307A2 (en) 2012-06-12 2013-12-19 Yknots Industries Llc Haptic electromagnetic actuator
US9158405B2 (en) * 2012-06-15 2015-10-13 Blackberry Limited Electronic device including touch-sensitive display and method of controlling same
US9886116B2 (en) 2012-07-26 2018-02-06 Apple Inc. Gesture and touch input detection through force sensing
US9746924B2 (en) * 2012-09-11 2017-08-29 Nec Corporation Electronic device, method for controlling electronic device, and recording medium
US20140092003A1 (en) * 2012-09-28 2014-04-03 Min Liu Direct haptic feedback
KR20140047897A (en) * 2012-10-15 2014-04-23 삼성전자주식회사 Method for providing for touch effect and an electronic device thereof
US9589538B2 (en) * 2012-10-17 2017-03-07 Perceptive Pixel, Inc. Controlling virtual objects
US9524624B2 (en) * 2012-12-13 2016-12-20 Immersion Corporation Haptic system with increased LRA bandwidth
US9202350B2 (en) * 2012-12-19 2015-12-01 Nokia Technologies Oy User interfaces and associated methods
US10175874B2 (en) * 2013-01-04 2019-01-08 Samsung Electronics Co., Ltd. Display system with concurrent multi-mode control mechanism and method of operation thereof
US9304587B2 (en) 2013-02-13 2016-04-05 Apple Inc. Force sensing mouse
US10578499B2 (en) * 2013-02-17 2020-03-03 Microsoft Technology Licensing, Llc Piezo-actuated virtual buttons for touch surfaces
GB2513884B (en) 2013-05-08 2015-06-17 Univ Bristol Method and apparatus for producing an acoustic field
WO2014190018A1 (en) * 2013-05-21 2014-11-27 Stanley Innovation, Inc. A system and method for a human machine interface utilizing near-field quasi-state electrical field sensing technology
US10591992B2 (en) * 2013-06-17 2020-03-17 Lenovo (Singapore) Pte. Ltd. Simulation of control areas on touch surface using haptic feedback
JP6032364B2 (en) 2013-06-26 2016-11-24 富士通株式会社 DRIVE DEVICE, ELECTRONIC DEVICE, AND DRIVE CONTROL PROGRAM
WO2014207842A1 (en) * 2013-06-26 2014-12-31 富士通株式会社 Drive device, electronic apparatus, and drive control program
EP3019943A4 (en) * 2013-07-12 2017-05-31 Tactual Labs Co. Reducing control response latency with defined cross-control behavior
US11229239B2 (en) 2013-07-19 2022-01-25 Rai Strategic Holdings, Inc. Electronic smoking article with haptic feedback
US9520036B1 (en) * 2013-09-18 2016-12-13 Amazon Technologies, Inc. Haptic output generation with dynamic feedback control
US9213408B2 (en) * 2013-10-08 2015-12-15 Immersion Corporation Generating haptic effects while minimizing cascading
JP2015121983A (en) * 2013-12-24 2015-07-02 京セラ株式会社 Tactile sensation presentation device
US9448631B2 (en) 2013-12-31 2016-09-20 Microsoft Technology Licensing, Llc Input device haptics and pressure sensing
US20150242037A1 (en) 2014-01-13 2015-08-27 Apple Inc. Transparent force sensor with strain relief
US9471143B2 (en) * 2014-01-20 2016-10-18 Lenovo (Singapore) Pte. Ltd Using haptic feedback on a touch device to provide element location indications
US9182823B2 (en) * 2014-01-21 2015-11-10 Lenovo (Singapore) Pte. Ltd. Actuating haptic element of a touch-sensitive device
US9817489B2 (en) 2014-01-27 2017-11-14 Apple Inc. Texture capture stylus and method
US20150323994A1 (en) * 2014-05-07 2015-11-12 Immersion Corporation Dynamic haptic effect modification
US10146318B2 (en) 2014-06-13 2018-12-04 Thomas Malzbender Techniques for using gesture recognition to effectuate character selection
CN106575230A (en) 2014-09-02 2017-04-19 苹果公司 Semantic framework for variable haptic output
US10297119B1 (en) 2014-09-02 2019-05-21 Apple Inc. Feedback device in an electronic device
GB2530036A (en) 2014-09-09 2016-03-16 Ultrahaptics Ltd Method and apparatus for modulating haptic feedback
US9939901B2 (en) 2014-09-30 2018-04-10 Apple Inc. Haptic feedback assembly
US9400570B2 (en) 2014-11-14 2016-07-26 Apple Inc. Stylus with inertial sensor
US9846484B2 (en) * 2014-12-04 2017-12-19 Immersion Corporation Systems and methods for controlling haptic signals
US9575573B2 (en) 2014-12-18 2017-02-21 Apple Inc. Stylus with touch sensor
CA2976319C (en) 2015-02-20 2023-06-27 Ultrahaptics Ip Limited Algorithm improvements in a haptic system
AU2016221500B2 (en) 2015-02-20 2021-06-10 Ultrahaptics Ip Limited Perceptions in a haptic system
US9798409B1 (en) 2015-03-04 2017-10-24 Apple Inc. Multi-force input device
US9645647B2 (en) * 2015-05-13 2017-05-09 Immersion Corporation Systems and methods for haptic feedback for modular devices
EP3314369B1 (en) 2015-06-26 2021-07-21 SABIC Global Technologies B.V. Electromechanical actuators for haptic feedback in electronic devices
US10818162B2 (en) 2015-07-16 2020-10-27 Ultrahaptics Ip Ltd Calibration techniques in haptic systems
US10109161B2 (en) * 2015-08-21 2018-10-23 Immersion Corporation Haptic driver with attenuation
JP2018528534A (en) * 2015-09-25 2018-09-27 イマージョン コーポレーションImmersion Corporation Haptic effect design system
US10516348B2 (en) 2015-11-05 2019-12-24 Mems Drive Inc. MEMS actuator package architecture
JP2017111462A (en) * 2015-11-27 2017-06-22 京セラ株式会社 Feeling presentation device and feeling presentation method
WO2017111928A1 (en) * 2015-12-22 2017-06-29 Intel Corporation Reduction of touchscreen bounce
US10976819B2 (en) * 2015-12-28 2021-04-13 Microsoft Technology Licensing, Llc Haptic feedback for non-touch surface interaction
US11189140B2 (en) 2016-01-05 2021-11-30 Ultrahaptics Ip Ltd Calibration and detection techniques in haptic systems
US10061385B2 (en) 2016-01-22 2018-08-28 Microsoft Technology Licensing, Llc Haptic feedback for a touch input device
KR102334863B1 (en) * 2016-04-19 2021-12-03 니폰 덴신 덴와 가부시끼가이샤 Pseudo tactile force generation device
US10585480B1 (en) 2016-05-10 2020-03-10 Apple Inc. Electronic device with an input device having a haptic engine
DK201670737A1 (en) 2016-06-12 2018-01-22 Apple Inc Devices, Methods, and Graphical User Interfaces for Providing Haptic Feedback
DK179823B1 (en) 2016-06-12 2019-07-12 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
FR3053489A1 (en) * 2016-06-29 2018-01-05 Dav CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE
FR3053488A1 (en) * 2016-06-29 2018-01-05 Dav CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE
US10268275B2 (en) 2016-08-03 2019-04-23 Ultrahaptics Ip Ltd Three-dimensional perceptions in haptic systems
US10671167B2 (en) 2016-09-01 2020-06-02 Apple Inc. Electronic device including sensed location based driving of haptic actuators and related methods
DK179278B1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, methods and graphical user interfaces for haptic mixing
DK180050B1 (en) * 2016-09-06 2020-02-04 Apple Inc. Devices, methods and graphical user interfaces for generating tactile outputs
US10606355B1 (en) * 2016-09-06 2020-03-31 Apple Inc. Haptic architecture in a portable electronic device
DK201670720A1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs
KR102629409B1 (en) * 2016-11-11 2024-01-26 삼성전자주식회사 Method for providing object information and electronic device thereof
US10943578B2 (en) 2016-12-13 2021-03-09 Ultrahaptics Ip Ltd Driving techniques for phased-array systems
US10936067B1 (en) * 2017-02-13 2021-03-02 Snap, Inc. Generating a response that depicts haptic characteristics
US10732714B2 (en) 2017-05-08 2020-08-04 Cirrus Logic, Inc. Integrated haptic system
DK201770372A1 (en) 2017-05-16 2019-01-08 Apple Inc. Tactile feedback for locked device user interfaces
US11054932B2 (en) 2017-09-06 2021-07-06 Apple Inc. Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module
US11531395B2 (en) 2017-11-26 2022-12-20 Ultrahaptics Ip Ltd Haptic effects from focused acoustic fields
US10871829B2 (en) 2017-12-05 2020-12-22 Tactai, Inc. Touch enabling process, haptic accessory, and core haptic engine to enable creation and delivery of tactile-enabled experiences with virtual objects
US11360546B2 (en) 2017-12-22 2022-06-14 Ultrahaptics Ip Ltd Tracking in haptic systems
US11704983B2 (en) 2017-12-22 2023-07-18 Ultrahaptics Ip Ltd Minimizing unwanted responses in haptic systems
US10832537B2 (en) 2018-04-04 2020-11-10 Cirrus Logic, Inc. Methods and apparatus for outputting a haptic signal to a haptic transducer
US10911861B2 (en) 2018-05-02 2021-02-02 Ultrahaptics Ip Ltd Blocking plate structure for improved acoustic transmission efficiency
US10599221B2 (en) 2018-06-15 2020-03-24 Immersion Corporation Systems, devices, and methods for providing limited duration haptic effects
US11269415B2 (en) 2018-08-14 2022-03-08 Cirrus Logic, Inc. Haptic output systems
US10936071B2 (en) 2018-08-30 2021-03-02 Apple Inc. Wearable electronic device with haptic rotatable input
US10831276B2 (en) 2018-09-07 2020-11-10 Apple Inc. Tungsten frame of a haptic feedback module for a portable electronic device
US11098951B2 (en) 2018-09-09 2021-08-24 Ultrahaptics Ip Ltd Ultrasonic-assisted liquid manipulation
US10852830B2 (en) * 2018-09-11 2020-12-01 Apple Inc. Power efficient, dynamic management of haptic module mechanical offset
US10966007B1 (en) 2018-09-25 2021-03-30 Apple Inc. Haptic output system
US11378997B2 (en) 2018-10-12 2022-07-05 Ultrahaptics Ip Ltd Variable phase and frequency pulse-width modulation technique
GB201817495D0 (en) 2018-10-26 2018-12-12 Cirrus Logic Int Semiconductor Ltd A force sensing system and method
US11550395B2 (en) 2019-01-04 2023-01-10 Ultrahaptics Ip Ltd Mid-air haptic textures
US10828672B2 (en) 2019-03-29 2020-11-10 Cirrus Logic, Inc. Driver circuitry
US11283337B2 (en) 2019-03-29 2022-03-22 Cirrus Logic, Inc. Methods and systems for improving transducer dynamics
US11644370B2 (en) * 2019-03-29 2023-05-09 Cirrus Logic, Inc. Force sensing with an electromagnetic load
US10955955B2 (en) 2019-03-29 2021-03-23 Cirrus Logic, Inc. Controller for use in a device comprising force sensors
US11509292B2 (en) 2019-03-29 2022-11-22 Cirrus Logic, Inc. Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter
US12035445B2 (en) 2019-03-29 2024-07-09 Cirrus Logic Inc. Resonant tracking of an electromagnetic load
US11842517B2 (en) 2019-04-12 2023-12-12 Ultrahaptics Ip Ltd Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network
US10976825B2 (en) 2019-06-07 2021-04-13 Cirrus Logic, Inc. Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system
WO2020254788A1 (en) 2019-06-21 2020-12-24 Cirrus Logic International Semiconductor Limited A method and apparatus for configuring a plurality of virtual buttons on a device
US11921923B2 (en) * 2019-07-30 2024-03-05 Maxim Integrated Products, Inc. Oscillation reduction in haptic vibrators by minimization of feedback acceleration
KR20220080737A (en) 2019-10-13 2022-06-14 울트라립 리미티드 Dynamic capping by virtual microphones
US11374586B2 (en) 2019-10-13 2022-06-28 Ultraleap Limited Reducing harmonic distortion by dithering
US11408787B2 (en) 2019-10-15 2022-08-09 Cirrus Logic, Inc. Control methods for a force sensor system
US11380175B2 (en) 2019-10-24 2022-07-05 Cirrus Logic, Inc. Reproducibility of haptic waveform
US11169610B2 (en) 2019-11-08 2021-11-09 Ultraleap Limited Tracking techniques in haptic systems
US11545951B2 (en) 2019-12-06 2023-01-03 Cirrus Logic, Inc. Methods and systems for detecting and managing amplifier instability
US11715453B2 (en) 2019-12-25 2023-08-01 Ultraleap Limited Acoustic transducer structures
WO2021141936A1 (en) * 2020-01-06 2021-07-15 Tactai, Inc. Haptic waveform generation and rendering at interface device
US11662821B2 (en) 2020-04-16 2023-05-30 Cirrus Logic, Inc. In-situ monitoring, calibration, and testing of a haptic actuator
US11024135B1 (en) 2020-06-17 2021-06-01 Apple Inc. Portable electronic device having a haptic button assembly
US11816267B2 (en) 2020-06-23 2023-11-14 Ultraleap Limited Features of airborne ultrasonic fields
US11886639B2 (en) 2020-09-17 2024-01-30 Ultraleap Limited Ultrahapticons
US11775084B2 (en) * 2021-04-20 2023-10-03 Microsoft Technology Licensing, Llc Stylus haptic component arming and power consumption
US11567575B2 (en) * 2021-06-14 2023-01-31 Microsoft Technology Licensing, Llc Haptic response control
US11933822B2 (en) 2021-06-16 2024-03-19 Cirrus Logic Inc. Methods and systems for in-system estimation of actuator parameters
US11765499B2 (en) 2021-06-22 2023-09-19 Cirrus Logic Inc. Methods and systems for managing mixed mode electromechanical actuator drive
US11908310B2 (en) 2021-06-22 2024-02-20 Cirrus Logic Inc. Methods and systems for detecting and managing unexpected spectral content in an amplifier system
US11552649B1 (en) 2021-12-03 2023-01-10 Cirrus Logic, Inc. Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US6411276B1 (en) 1996-11-13 2002-06-25 Immersion Corporation Hybrid control of haptic feedback for host computer and interface device
DE20080209U1 (en) * 1999-09-28 2001-08-09 Immersion Corp Control of haptic sensations for interface devices with vibrotactile feedback
US7730401B2 (en) * 2001-05-16 2010-06-01 Synaptics Incorporated Touch screen with user interface enhancement
US11275405B2 (en) * 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
GB0323570D0 (en) * 2003-10-08 2003-11-12 Harald Philipp Touch-sensitivity control panel
US7295015B2 (en) * 2004-02-19 2007-11-13 Brooks Automation, Inc. Ionization gauge
US7956846B2 (en) 2006-01-05 2011-06-07 Apple Inc. Portable electronic device with content-dependent touch sensitivity
US7890863B2 (en) 2006-10-04 2011-02-15 Immersion Corporation Haptic effects with proximity sensing
US8103109B2 (en) 2007-06-19 2012-01-24 Microsoft Corporation Recognizing hand poses and/or object classes
KR20090006807A (en) 2007-07-11 2009-01-15 오의진 Data input device using finger motion detection and input conversion method using the same
US20090066660A1 (en) 2007-09-06 2009-03-12 Ure Michael J Interface with and communication between mobile electronic devices
KR101424259B1 (en) 2007-08-22 2014-07-31 삼성전자주식회사 Method and apparatus for providing input feedback in portable terminal
US8917247B2 (en) 2007-11-20 2014-12-23 Samsung Electronics Co., Ltd. External device identification method and apparatus in a device including a touch spot, and computer-readable recording mediums having recorded thereon programs for executing the external device identification method in a device including a touch spot
KR20090066368A (en) 2007-12-20 2009-06-24 삼성전자주식회사 A mobile terminal having a touch screen and a method of controlling the function thereof
JP5166955B2 (en) * 2008-04-24 2013-03-21 キヤノン株式会社 Information processing apparatus, information processing method, and information processing program
US20090279107A1 (en) 2008-05-09 2009-11-12 Analog Devices, Inc. Optical distance measurement by triangulation of an active transponder
US9285459B2 (en) 2008-05-09 2016-03-15 Analog Devices, Inc. Method of locating an object in 3D
US8099332B2 (en) 2008-06-06 2012-01-17 Apple Inc. User interface for application management for a mobile device
US20090309825A1 (en) * 2008-06-13 2009-12-17 Sony Ericsson Mobile Communications Ab User interface, method, and computer program for controlling apparatus, and apparatus
US8174372B2 (en) 2008-06-26 2012-05-08 Immersion Corporation Providing haptic feedback on a touch surface
KR101014263B1 (en) 2008-09-04 2011-02-16 삼성전기주식회사 Tactile sensor
KR20100036850A (en) 2008-09-30 2010-04-08 삼성전기주식회사 Touch panel apparatus using tactile sensor
KR101021440B1 (en) 2008-11-14 2011-03-15 한국표준과학연구원 Touch input device, mobile device using same and control method thereof
US9746544B2 (en) 2008-12-03 2017-08-29 Analog Devices, Inc. Position measurement systems using position sensitive detectors
US8823518B2 (en) * 2008-12-08 2014-09-02 Motorola Solutions, Inc. Method of sensor cluster processing for a communication device
US7843277B2 (en) * 2008-12-16 2010-11-30 Immersion Corporation Haptic feedback generation based on resonant frequency
US8686952B2 (en) 2008-12-23 2014-04-01 Apple Inc. Multi touch with multi haptics
US8291348B2 (en) 2008-12-31 2012-10-16 Hewlett-Packard Development Company, L.P. Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis
US8760413B2 (en) 2009-01-08 2014-06-24 Synaptics Incorporated Tactile surface
JP5343871B2 (en) 2009-03-12 2013-11-13 株式会社リコー Touch panel device, display device with touch panel including the same, and control method for touch panel device
WO2010138385A1 (en) 2009-05-27 2010-12-02 Analog Devices, Inc. Multiuse optical sensor
US8279197B2 (en) 2009-08-25 2012-10-02 Pixart Imaging Inc. Method and apparatus for detecting defective traces in a mutual capacitance touch sensing device
KR20110031797A (en) * 2009-09-21 2011-03-29 삼성전자주식회사 Input device and method of mobile terminal
US8487759B2 (en) * 2009-09-30 2013-07-16 Apple Inc. Self adapting haptic device
KR101120894B1 (en) * 2009-10-20 2012-02-27 삼성전기주식회사 Haptic feedback device and electronic device
KR101802520B1 (en) 2010-03-16 2017-11-28 임머숀 코퍼레이션 Systems and methods for pre-touch and true touch
US9019230B2 (en) 2010-10-31 2015-04-28 Pixart Imaging Inc. Capacitive touchscreen system with reduced power consumption using modal focused scanning
US9164586B2 (en) * 2012-11-21 2015-10-20 Novasentis, Inc. Haptic system with localized response

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070203545A1 (en) * 2006-02-24 2007-08-30 Medtronic, Inc. User interface with 3D environment for configuring stimulation therapy
US20100281440A1 (en) * 2008-04-24 2010-11-04 Underkoffler John S Detecting, Representing, and Interpreting Three-Space Input: Gestural Continuum Subsuming Freespace, Proximal, and Surface-Contact Modes
US20100134409A1 (en) * 2008-11-30 2010-06-03 Lenovo (Singapore) Pte. Ltd. Three-dimensional user interface
US20110093778A1 (en) * 2009-10-20 2011-04-21 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120102436A1 (en) * 2010-10-21 2012-04-26 Nokia Corporation Apparatus and method for user input for controlling displayed information

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9766793B2 (en) * 2011-06-09 2017-09-19 Sony Corporation Information processing device, information processing method and program
US20160110078A1 (en) * 2011-06-09 2016-04-21 C/O Sony Corporation Information processing device, information processing method and program
US20130268882A1 (en) * 2012-04-10 2013-10-10 Lg Electronics Inc. Display apparatus and method of controlling the same
US20150138068A1 (en) * 2012-05-03 2015-05-21 Georgia Tech Research Corporation Methods, Controllers and Computer Program Products for Accessibility to Computing Devices
US10281986B2 (en) * 2012-05-03 2019-05-07 Georgia Tech Research Corporation Methods, controllers and computer program products for accessibility to computing devices
US20140085249A1 (en) * 2012-09-25 2014-03-27 Synaptics Incorporated Systems and methods for decoupling image generation rate from reporting rate in capacitive sensing
US9874972B2 (en) * 2012-09-25 2018-01-23 Synaptics Incorporated Systems and methods for decoupling image generation rate from reporting rate in capacitive sensing
US9880642B2 (en) * 2013-01-02 2018-01-30 Samsung Electronics Co., Ltd. Mouse function provision method and terminal implementing the same
US20140184510A1 (en) * 2013-01-02 2014-07-03 Samsung Electronics Co., Ltd. Mouse function provision method and terminal implementing the same
US9582141B2 (en) * 2013-01-31 2017-02-28 Sony Corporation Three dimensional user interface for watch device
US20140215406A1 (en) * 2013-01-31 2014-07-31 Sony Corporation Mobile terminal
CN103970291A (en) * 2013-01-31 2014-08-06 索尼公司 Mobile terminal
US9569068B2 (en) * 2013-10-31 2017-02-14 Fih (Hong Kong) Limited Display device and method for navigating between display layers thereof
US20150121263A1 (en) * 2013-10-31 2015-04-30 Fih (Hong Kong) Limited Display device and method for navigating between display layers thereof
US10168815B2 (en) * 2014-05-21 2019-01-01 International Business Machines Corporation Evaluation of digital content using intentional user feedback obtained through haptic interface
US20160162095A1 (en) * 2014-05-21 2016-06-09 International Business Machines Corporation Evaluation of digital content using intentional user feedback obtained through haptic interface
US10198668B2 (en) 2014-07-16 2019-02-05 Samsung Electronics Co., Ltd. Apparatus and method for supporting computer aided diagnosis (CAD) based on probe speed
US10088917B2 (en) 2014-07-30 2018-10-02 Hewlett-Packard Development Company, L.P. Detector for a display
WO2016018305A1 (en) * 2014-07-30 2016-02-04 Hewlett-Packard Development Company, L.P. Detector for a display
US11099723B2 (en) 2014-10-02 2021-08-24 Huawei Technologies Co., Ltd. Interaction method for user interfaces
WO2018067587A1 (en) 2016-10-09 2018-04-12 Alibaba Group Holding Limited Three-dimensional graphical user interface for informational input in virtual reality environment
EP3523708A4 (en) * 2016-10-09 2019-09-04 Alibaba Group Holding Limited Three-dimensional graphical user interface for informational input in virtual reality environment
US10474242B2 (en) 2016-10-09 2019-11-12 Alibaba Group Holding Limited Three-dimensional graphical user interface for informational input in virtual reality environment
US11054912B2 (en) 2016-10-09 2021-07-06 Advanced New Technologies Co., Ltd. Three-dimensional graphical user interface for informational input in virtual reality environment
US11494986B2 (en) * 2017-04-20 2022-11-08 Samsung Electronics Co., Ltd. System and method for two dimensional application usage in three dimensional virtual reality environment
US10712930B2 (en) 2017-05-28 2020-07-14 International Business Machines Corporation 3D touch based user interface value pickers
CN112639685A (en) * 2018-09-04 2021-04-09 苹果公司 Display device sharing and interaction in Simulated Reality (SR)

Also Published As

Publication number Publication date
WO2012135534A1 (en) 2012-10-04
US8937603B2 (en) 2015-01-20
WO2012135373A3 (en) 2014-05-01
US20120249461A1 (en) 2012-10-04
WO2012135373A2 (en) 2012-10-04
WO2012135378A1 (en) 2012-10-04
WO2012135532A1 (en) 2012-10-04
US20120249474A1 (en) 2012-10-04
US20120249462A1 (en) 2012-10-04

Similar Documents

Publication Publication Date Title
US20120249475A1 (en) 3d user interface control
CN102349040B (en) For comprising the system and method at the interface of the haptic effect based on surface
US10013143B2 (en) Interfacing with a computing application using a multi-digit sensor
CN105683878B (en) User interface object operation in user interface
US9367235B2 (en) Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US8466934B2 (en) Touchscreen interface
US9423932B2 (en) Zoom view mode for digital content including multiple regions of interest
US9524097B2 (en) Touchscreen gestures for selecting a graphical object
US20110109577A1 (en) Method and apparatus with proximity touch detection
US20110069018A1 (en) Double Touch Inputs
US20120262386A1 (en) Touch based user interface device and method
US20130155018A1 (en) Device and method for emulating a touch screen using force information
US20120274550A1 (en) Gesture mapping for display device
US20130257734A1 (en) Use of a sensor to enable touch and type modes for hands of a user via a keyboard
US20110227947A1 (en) Multi-Touch User Interface Interaction
KR20070006477A (en) Variable menu arrangement method and display device using same
CN105425959A (en) Systems and methods for interfaces featuring surface-based haptic effects
US20150370472A1 (en) 3-d motion control for document discovery and retrieval
JP2011123896A (en) Method and system for duplicating object using touch-sensitive display
CN114153406A (en) Method and device for displaying application
US20140082559A1 (en) Control area for facilitating user input
KR102161061B1 (en) Method and terminal for displaying a plurality of pages
KR20160019449A (en) Disambiguation of indirect input
KR101442438B1 (en) Single touch process to achieve dual touch experience field
US10019127B2 (en) Remote display area including input lenses each depicting a region of a graphical user interface

Legal Events

Date Code Title Description
AS Assignment

Owner name: ANALOG DEVICES, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MURPHY, MARK J.;CONWAY, MEL J.;FLANAGAN, ADRIAN;AND OTHERS;SIGNING DATES FROM 20120412 TO 20120611;REEL/FRAME:028361/0451

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION