WO2010131122A2 - User interface to provide enhanced control of an application program - Google Patents
User interface to provide enhanced control of an application program
- Publication number
- WO2010131122A2 (application PCT/IB2010/001808)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch input
- gui
- touch
- input events
- user
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- a touch input event may correspond to either a further touch input caused by the user finger, or a point on the touch panel that is no longer in contact with the user finger.
- touch input events are contiguous. By contiguous, one may understand that consecutive touched - or no longer touched - points on the touch panel are next to each other within the detection range of the touch panel technology. In other words, when a next touch input event is captured, the previously captured touch input event is still ongoing.
- the first touched portion on the touch panel 111 varies in size. Indeed:
- processor 112 will check whether the further touch inputs are contiguous with one another, for example, causing the first touched portion to increase in size.
- the distance between two contiguous touch inputs may vary with the touch panel technology. Some technologies (e.g. surface acoustic wave sensing) can sense touches more or less continuously, while others may be limited to a discrete number of points (e.g. capacitive sensing, resistive sensing, ...), hereafter referred to as discrete sensing.
- processor 112 will check if the consecutive touch input events are contiguous.
- processor 112 will detect a series of touch input events contiguous with one another as the finger is moving towards or away from the touch panel. With a discrete sensing technology, processor 112 will measure the distance between two successive touch input events and compare it to the detection range of the touch panel. Provided their locations are within the detection range, processor 112 can determine that the touch input events are contiguous.
- the further touch input events monitored in act 260 will cause the size of the first touched portion to vary, either increasing (extended touch) or decreasing (extended release).
- processor 112 may associate pixels from the touch panel that correspond to the portion of the touch panel that triggered the touch input event. A possible way to ensure that the touch input events are contiguous, and cause the touched portion on the screen to increase, is to measure this at the pixel level. Provided the additional pixels associated with a further touch input event are in contact with the pixels associated with the previous touch input event, processor 112 will indeed have detected contiguous touch input events (a short sketch of both checks is given at the end of this section).
- the monitoring will end in a further act 265.
- an AP control may be imparted by processor 112.
- this corresponds to the neighborhood 432 varying in size.
- an update of the map based GUI 400 may be generated to show a neighborhood varying in size, e.g. following the finger as more touch input events are detected with the present system.
- An extended touch (as illustrated in FIG. 4B) will result in a larger neighborhood, following for instance any new touch input.
- the application program could either be a stand-alone application resident on the mobile device (such as its operating system for instance) or the client to a web based application (such as a map based application using for instance a client downloaded to the mobile device to upload a map).
- FIG. 3 is an illustration of a message flow chart between different parts of the mobile device involved in imparting an AP control according to an exemplary embodiment of the present system.
- the different parts in the present system illustrated in FIG. 3 are: the application program AP, in the present illustration a map based application program; a neighborhood engine provided to build a neighborhood overlay on a map GUI; and a user interface (UI) engine arranged to render an AP GUI of the application program.
- the exemplary flowchart of FIG. 3 corresponds to the exemplary embodiment of the present method described in relation to FIG. 2.
- the AP will add a touch event listener to the map based GUI illustrated in FIG. 4A, through processor 112.
- the AP will request from a neighborhood engine a default neighborhood.
- the neighborhood engine can be part of the application program or a separate module to define neighborhood characteristics to be rendered on a map based GUI.
- the AP GUI will be updated with the default neighborhood as seen in the illustration of FIG. 4A.
- Another touch event listener may be added by the AP (act 220 of FIG. 2) to further capture a first touch input from the user (act 230).
- the map based AP will check with the neighborhood engine whether the first touch input is located within the default neighborhood. If so, further touch input listeners will be added to the AP GUI (act 250 in FIG. 2, and loop 310 in FIG. 3). If the captured touch input events are contiguous (act 311 in FIG. 3 or Yes to act 260 in FIG. 2), the imparted AP control may include updating the map based GUI with a neighborhood 432 that varies in size, as illustrated in FIG. 4B.
- FIG. 5 shows a system 500 in accordance with an embodiment of the present system.
- the system 500 includes a user device 590 that has a processor 510 operationally coupled to a memory 520, a rendering device 530, such as one or more of a display, speaker, etc., a user input device 570, such as a sensor panel, and a connection 580 operationally coupled to the user device 590.
- the connection 580 may be an operable connection between the device 590, as a user device, and another device that has similar elements as the device 590, such as a web server such as one or more content providers.
- the user device may be for instance a mobile phone, a smart phone, a PDA (personal digital assistant) or any type of wireless portable device.
- the user device may be an electronic device such as a desktop computer or a server.
- the present method is suited for a wireless device with a display panel that is also a sensor panel to offer the user an enhanced control over an application program running on the user device.
- the memory 520 may be any type of device for storing application data, for instance data related to an application program controlled through touch inputs, to the operating system of the user device, or to a browser, as well as to other application programs controllable with the present method.
- the application data are received by the processor 510 for configuring the processor 510 to perform operation acts in accordance with the present system.
- the processor 510 so configured becomes a special purpose machine particularly suited for performing in accordance with the present system.
- the operation acts include rendering a GUI of the AP, capturing on the sensor panel a first touch input on the AP GUI and corresponding to a first touched portion of said GUI, and when further captured touch input events are identified as contiguous, imparting an AP control.
- the user input 570 may include the sensor panel as well as a keyboard, mouse, trackball, touchpad or other devices, which may be stand alone or be a part of a system, such as part of a personal computer (e.g., desktop computer, laptop computer, etc.) personal digital assistant, mobile phone, converged device, or other rendering device for communicating with the processor 510 via any type of link, such as a wired or wireless link.
- the user input device 570 is operable for interacting with the processor 510 including interaction within a paradigm of a GUI and/or other elements of the present system, such as to enable web browsing, selection of a portion of the GUI provided by a touch input.
- the rendering device 530 may operate as a touch sensitive display for communicating with the processor 510 (e.g., providing selection of portions of the AP GUI). In this way, a user may interact with the processor 510, including interaction within a paradigm of a GUI, so as to operate the present system, device and method.
- the user device 590, the processor 510, memory 520, rendering device 530 and/or user input device 570 may all or partly be portions of a computer system or other device, and/or be embedded in a portable device, such as a mobile telephone, personal computer (PC), personal digital assistant (PDA), converged device such as a smart telephone, etc.
- the device 590, corresponding user interfaces and other portions of the system 500 are provided for imparting an enhanced control in accordance with the present system over application programs.
- the methods of the present system are particularly suited to be carried out by a computer software program, such program containing modules corresponding to one or more of the individual steps or acts described and/or envisioned by the present system, such as the different engines, the application program, the user interface engine, etc.
- Such program may of course be embodied in a computer-readable medium, such as an integrated chip, a peripheral device or memory, such as the memory 520 or other memory coupled to the processor 510.
- the computer-readable medium and/or memory 520 may be any recordable medium (e.g., RAM, ROM, removable memory, CD-ROM, hard drives, DVD, floppy disks or memory cards) or may be a transmission medium utilizing one or more of radio frequency (RF) coupling, Bluetooth coupling, infrared coupling, etc. Any medium known or developed that can store and/or transmit information suitable for use with a computer system may be used as the computer-readable medium and/or memory 520.
- Additional memories may also be used. These memories configure processor 510 to implement the methods, operational acts, and functions disclosed herein.
- the operation acts may include controlling the rendering device 530 to render elements in a form of a GUI and/or controlling the rendering device 530 to render other information in accordance with the present system.
- the term "memory" should be construed broadly enough to encompass any information able to be read from or written to an address in the addressable space accessed by a processor. With this definition, information on a network is still within memory 520, for instance, because the processor 510 may retrieve the information from the network for operation in accordance with the present system. For example, a portion of the memory as understood herein may reside as a portion of the content providers, and/or the user device.
- the processor 510 is capable of providing control signals and/or performing operations in response to input signals from the user input device 570 and executing instructions stored in the memory 520.
- the processor 510 may be an application-specific or general-use integrated circuit(s).
- the processor 510 may be a dedicated processor for performing in accordance with the present system or may be a general-purpose processor wherein only one of many functions operates for performing in accordance with the present system.
- the processor 510 may operate utilizing a program portion, multiple program segments, or may be a hardware device utilizing a dedicated or multi-purpose integrated circuit.
- exemplary user interfaces are provided to facilitate an understanding of the present system, other user interfaces may be provided and/or elements of one user interface may be combined with another of the user interfaces in accordance with further embodiments of the present system.
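Returning to the contiguity checks discussed earlier in this section (comparing successive touch input events against the panel's detection range for discrete sensing, or comparing them pixel by pixel), a rough sketch of both checks follows. The detection range value, the 4-adjacency rule and the data structures are assumptions made for illustration only; the patent does not prescribe a particular implementation.

```python
import math


def contiguous_discrete(prev_point, new_point, detection_range_px):
    """Discrete sensing: two successive touch input events are considered
    contiguous if their reported locations lie within the detection range."""
    (x1, y1), (x2, y2) = prev_point, new_point
    return math.hypot(x2 - x1, y2 - y1) <= detection_range_px


def contiguous_pixels(portion_pixels, new_pixels):
    """Pixel-level check: the further event is contiguous with the touched
    portion if at least one of its pixels overlaps or borders that portion."""
    neighbourhood = {(x + dx, y + dy)
                     for (x, y) in portion_pixels
                     for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))}
    return any(p in neighbourhood for p in new_pixels)
```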
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method for imparting control to an application program (AP) running on an electronic device, said electronic device comprising a touch panel and processor for controlling said touch panel, said method being carried out by said processor and comprising the acts of: displaying a graphical user interface (GUI) of the AP on the touch panel; capturing a first touch input on the GUI, said first touch input corresponding to a first touched portion of said GUI; monitoring further touch input events on the GUI; and imparting an AP control in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI, causing the first touched portion to change in size.
Description
USER INTERFACE TO PROVIDE ENHANCED CONTROL OF AN APPLICATION PROGRAM
FIELD OF THE PRESENT SYSTEM: The present invention generally relates to mobile devices or handsets, and more specifically to mobile devices handling both touch and motion based inputs.
BACKGROUND OF THE PRESENT SYSTEM: Mobile handsets have an inherently impoverished graphical user interface
(GUI) with respect to the desktop. Small screens and tiny keyboards are typical of mobile handsets that fit in your pocket. Recent so called smart phones have introduced the use of a touch screen in an attempt to simplify the user experience with his mobile handset. For instance, the touch interface of the iPhone® has revolutionized the mobile handset industry and brought whole new mobile user experiences.
In existing smart phones, application programs (AP) may be controlled using touch inputs. Different touch inputs may control the AP in different ways. For instance, using the example of the iPhone®, the desktop GUI comprising a plurality of AP icons may be seen as an AP itself. A user touching an AP icon will cause a control of the desktop GUI that will launch the AP corresponding to the touched icon. A sliding motion across the desktop GUI, or a drag touch input, will cause another control of the desktop GUI, that will display another set of AP icons hidden so far. The user gets a feeling that he is browsing through pages of AP icons to select an interesting application program. A prolonged touch input or clutch input on any AP icon will cause all icons to start shaking around their position. The control associated with the clutch input opens the desktop GUI management. The user can then delete applications from the desktop or move them around in the desktop layout. While such a method facilitates the user experience, there is still today a lot of scope for innovation using touch interfaces of electronic devices, mobile or not.
For instance, co-pending US application 61/120,445 from the same Applicant, entitled "User interface to provide easy generation of neighborhoods in a map" and included herein by reference, discloses an innovative method to
generate a neighborhood in a map based application using touch inputs. The user will define the center of the neighborhood through a first touch input on a graphical user interface (GUI) representing a map, and may drag his finger on the GUI to define a circle around the center, the radius of which corresponding to the distance between the center and the current position of his finger. The GUI is automatically updated with a circle representing the neighborhood and following the finger.
The existing examples all use well known touch inputs, such as a short touch, a clutch or a drag. The touch inputs may include touch inputs from two fingers that are dragged away or closer to each other which can control a zoom in or a zoom out of a map GUI or a picture.
Other touch inputs could be used to enhance the user experience with touch screens or panels.
SUMMARY OF THE PRESENT SYSTEM:
It is an object of the present system to overcome disadvantages and/or make improvements in the prior art.
The present system relates to a method for imparting control to an application program (AP) running on an electronic device, said electronic device comprising a touch panel and processor for controlling said touch panel, said method being carried out by said processor and comprising the acts of:
- displaying a graphical user interface (GUI) of the AP on the touch panel;
- capturing a first touch input on the GUI, said first touch input corresponding to a first touched portion of said GUI;
- monitoring further touch input events on the GUI,
- imparting an AP control in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI, causing the first touched portion to change in size. In the present system, a novel touch input is disclosed that allows further control of application programs. The user can either follow a first touch input with the tip of his finger by pressing the finger progressively in contact with the touch panel (extended touch input) or release his finger away from the panel when the first touch input is itself an extended touch input (release touch input). This new
touch input differs from existing multi-touch input as the touched portion or region of the panel varies in size as the user moves his finger against or away from the screen. Such a touch input may be used for instance to zoom in or out or to change the size of a selected portion of the AP GUI. The present system also relates to a mobile device for imparting control to an application program (AP) running on said mobile device, said mobile device being arranged to:
- display a graphical user interface (GUI) of the AP on the touch panel;
- capture a first touch input on the GUI, said first touch input corresponding to a first touched portion of said GUI;
- monitor further touch input events on the GUI,
- impart an AP control in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI, causing the first touched portion to change in size.
The present system also relates to an application embodied on a computer readable medium and arranged to impart control to an application program (AP) running on a mobile device, the application comprising:
- instructions to display a graphical user interface (GUI) of the AP on the touch panel;
- instructions to capture a first touch input on the GUI, said first touch input corresponding to a first touched portion of said GUI;
- instructions to monitor further touch input events on the GUI,
- instructions to impart an AP control in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI, causing the first touched portion to change in size.
BRIEF DESCRIPTION OF THE DRAWINGS: The invention is explained in further detail, and by way of example, with reference to the accompanying drawings wherein:
FIG. 1 shows a mobile device in accordance with an embodiment of the present system;
FIG. 2 shows an illustrative process flow diagram in accordance with an embodiment of the present system;
FIG. 3 shows an exemplary flowchart in accordance with an embodiment of the present system; FIGs. 4A-4B show exemplary illustrations of an application program controlled according to an embodiment of the present system; and,
FIG. 5 shows an exemplary implementation in accordance with an embodiment of the present system.
DETAILED DESCRIPTION OF THE PRESENT SYSTEM:
The following are descriptions of illustrative embodiments that when taken in conjunction with the following drawings will demonstrate the above noted features and advantages, as well as further ones. In the following description, for purposes of explanation rather than limitation, illustrative details are set forth such as architecture, interfaces, techniques, element attributes, etc. However, it will be apparent to those of ordinary skill in the art that other embodiments that depart from these details would still be understood to be within the scope of the appended claims. Moreover, for the purpose of clarity, detailed descriptions of well known devices, circuits, tools, techniques and methods are omitted so as not to obscure the description of the present system. It should be expressly understood that the drawings are included for illustrative purposes and do not represent the scope of the present system. In the accompanying drawings, like reference numbers in different drawings may designate similar elements.
For purposes of simplifying a description of the present system, the terms "operatively coupled", "coupled" and formatives thereof as utilized herein refer to a connection between devices and/or portions thereof that enables operation in accordance with the present system. For example, an operative coupling may include one or more of a wired connection and/or a wireless connection between two or more devices that enables a one and/or two-way communication path between the devices and/or portions thereof. An operative coupling may also include a wired and/or wireless coupling to enable communication between a service platform, such as the profiling platform in accordance with an embodiment of the present system, and one or more user devices. An operative coupling may also relate to an interaction between program portions and thereby
may not describe a physical connection so much as an interaction based coupling.
The term rendering and formatives thereof as utilized herein refer to providing content, such as digital media or a graphical user interface (GUI), such that it may be perceived by at least one user sense, such as a sense of sight and/or a sense of hearing. For example, the present system may render a user interface on a display device so that it may be seen and interacted with by a user. The term rendering may also comprise all the actions required to generate a GUI prior to the display, like e.g. a map representation generated on a server side for a browser application on a user device.
The system, device(s), method, user interface, etc., described herein address problems in prior art systems. In accordance with an embodiment of the present system, an electronic device provides a GUI for controlling an application program (AP) through touch inputs. In the description hereafter, reference will be made to a mobile device or handsets. The person skilled in the art may easily apply the present teachings to any electronic device presenting a touch sensitive panel, also referred to hereafter as a touch sensitive display or screen.
A graphical user interface (GUI) may be provided in accordance with an embodiment of the present system:
- by an application program running locally on a device processor, such as part of a computer system of a mobile device, and/or,
- as provided by a network connected device, such as a web-based server hosting the application, the GUI being rendered on the mobile device through a local application program connected to the web-based server. Applications like Google Maps® are implemented today using that approach.
The provided visual environment may be displayed by the processor on a display device of the mobile device, namely a touch sensitive panel (touch panel in short), which a user may use to provide a number of touch inputs of different types. A GUI is a type of user interface which allows a user to interact with electronic devices such as computers, hand-held devices, household appliances, office equipment and the likes. GUIs are typically used to render visual and textual images which describe various visual metaphors of an operating system, an application, etc., and implemented on a processor/computer including rendering
on a display device. Furthermore, GUIs can represent programs, files and operational functions with graphical images, objects, or vector representations. The graphical images can include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, maps, etc. Such images can be arranged in predefined layouts, or can be created dynamically (by the device itself or by a web-based server) to serve the specific actions being taken by a user. In general, the user can select and/or activate various graphical images in order to initiate functions and tasks, i.e. controls, associated therewith. By way of example, a user can select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular application program. By way of another example, the GUI may present a typical user interface including a windowing environment and as such, may include menu items, pull-down menu items, icons, pop-up windows, etc., that are typical of those provided in a windowing environment, such as may be represented within a Windows™ Operating System GUI as provided by Microsoft Corporation and/or an OS X™ Operating System GUI, such as provided on an iPhone™, MacBook™, iMac™, etc., as provided by Apple, Inc., and/or another operating system.
In the description here after, an application program (AP) - or software - may be seen as any tool that functions and is operated by means of a computer, with the purpose of performing one or more functions or tasks for a user or another application program. To interact with and control an AP, a GUI of the AP may be displayed on the mobile device display.
FIG. 1 is an illustration of an exemplary mobile device 110 used in the present system. The mobile device 110 comprises a display device 111, a processor 112, a controller 113 of the display device, and an input device 115. Mobile device 110 may be for instance a desktop or laptop computer, a mobile device, a PDA (personal digital assistant)...
In the present system, the user interaction with and manipulation of the application program rendered on a GUI is achieved using the display device 111, or screen, which is presently a touch panel operationally coupled to the processor 112 controlling the displayed interface.
Processor 112 may control the rendering and/or the display of the GUI on the display device 111 depending on the type of application program, i.e. resident or web-based. Processor 112 may also handle the user entries according to the present method. The user entries to interact with an application program may be provided through interactions with the touch panel 111.
The touch panel 111 can be seen as an input device allowing interactions with a finger of a user or other devices such as a stylus. Such an input device can, for example, be used to make selections of portions of the GUI of the AP. The input received from a user's touch is sent to the processor 112. The touch panel is configured to detect and report the (location of the) touches to the processor 112 and the processor 112 can interpret the touches in accordance with the application program and the currently displayed GUI. For example, the processor 112 can initiate a task, i.e. a control of the AP, in accordance with a particular touch.
The controller 113, i.e. a dedicated processor, can be used to process touches locally and reduce demand for the main processor 112 of the computer system. The touch panel 111 can be based on sensing technologies including but not limited to capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and/or the likes. Hereafter, for simplification purposes, reference will be made to a finger of the user touching panel 111; other devices such as a stylus may be used in place of the user finger.
The touch interface:
In the present system, different types of touch inputs can be monitored through touch panel 111. For instance, the touch panel 111 can be based on single point sensing or multipoint sensing. Single point sensing can be capable of only distinguishing a single touch, while multipoint sensing can be capable of distinguishing multiple touches that occur at the same time.
In the present system, once the type of touch input has been captured and identified, the captured touch input may be referred to as a touch input event (or touch event in short) that allows imparting a control on the AP. For single point sensing, the duration and/or frequency of the touch inputs may be taken into account to distinguish different types of touch events. One of the touch inputs illustrated herein may be seen as touching and holding a point on the screen with a single finger, or "clutching" the screen. Clutching the screen is distinguishable from conventional touch inputs by the amount of time it takes to press the finger down on the screen and when the finger is lifted from the screen. A clutch event
would only be captured if the finger has not been released from the point or portion on the screen before a given time threshold CLUTCH_THRESHOLD.
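As a rough sketch of the timing logic described above, the following fragment classifies a single-point input as a plain tap or a clutch depending on whether the finger stays on the panel past CLUTCH_THRESHOLD. The threshold value, event hooks and class name are assumptions made for illustration; the patent does not specify them.

```python
import time

CLUTCH_THRESHOLD = 0.8  # seconds; illustrative value only, not given by the patent


class SinglePointTouchClassifier:
    """Distinguishes a short tap from a 'clutch' (touch-and-hold) input."""

    def __init__(self):
        self._touch_down_at = None

    def on_touch_down(self, x, y):
        # Remember when the finger first pressed the panel.
        self._touch_down_at = time.monotonic()

    def on_touch_up(self, x, y):
        # Classify the input when the finger is lifted.
        if self._touch_down_at is None:
            return None
        held_for = time.monotonic() - self._touch_down_at
        self._touch_down_at = None
        # A clutch is only captured if the finger was not released
        # before the CLUTCH_THRESHOLD duration elapsed.
        return "clutch" if held_for >= CLUTCH_THRESHOLD else "tap"
```

In a real implementation the clutch could equally be reported as soon as the threshold elapses, without waiting for the finger to lift; the sketch only illustrates the role of the threshold.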
In the present system, another type of touch input is introduced. The user may start with a first touch input on the GUI of the AP, the first touch input corresponding to a first touched portion of said GUI. In the present method, the processor 112 is arranged to monitor further touch input event/events (hereinafter, simply events) on the GUI, and to impart a control of the AP in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI. For example, said further touch input events may cause the first touched portion to change in size, such as grow in size. In accordance with further embodiments of the present system, the changing in size may represent a control metaphor that may be applied to other types of control and/or to other parameters. For example, with the proposed system, the brightness of a TV may be adjusted by using the touch panel as a remote control. Other applications would readily occur to a person of ordinary skill in the art and are intended to be encompassed in the description of the present system.
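A minimal sketch of this monitoring behaviour is given below. It tracks the first touched portion as a set of pixels and only imparts an AP control when a further event overlaps that portion (i.e. is contiguous with it) and changes its size. The event representation, the callback name impart_control and the overlap-based contiguity test are simplifying assumptions, not the patent's own implementation.

```python
class ExtendedTouchMonitor:
    """Monitors further touch input events after a first touch and imparts an
    AP control when contiguous events make the touched portion change in size."""

    def __init__(self, impart_control):
        # impart_control(size_px): callback towards the AP, e.g. resizing
        # a map neighborhood or adjusting a parameter such as brightness.
        self.impart_control = impart_control
        self.portion = set()  # pixels of the currently touched first portion

    def on_first_touch(self, touched_pixels):
        self.portion = set(touched_pixels)

    def on_further_touch_event(self, touched_pixels):
        event = set(touched_pixels)
        # Contiguous here means the newly reported pixels overlap the portion
        # still being touched; a second finger elsewhere would not qualify.
        if not (event & self.portion):
            return False
        if len(event) != len(self.portion):
            # Growing (extended touch) or shrinking (extended release).
            self.portion = event
            self.impart_control(len(self.portion))
        return True


# Example: report the portion size each time the extended touch grows or shrinks.
monitor = ExtendedTouchMonitor(impart_control=lambda size: print("portion:", size, "px"))
monitor.on_first_touch({(10, 10), (10, 11), (11, 10)})
monitor.on_further_touch_event({(10, 10), (10, 11), (11, 10), (11, 11), (12, 10)})
```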
Examples of touch inputs: Illustrations of such touch events are presented in FIGs. 4A and 4B. In FIG.
4A, representing a GUI of a map based application, the user may touch a position on the map to define a neighborhood on this map with the tip of his finger 411. The map based application is configured to update, either locally (local application) or remotely (using a web based server), the GUI and to display a default neighborhood 431 centered on the captured first touch input. The captured first touch input corresponds to a first portion of the GUI. As can be seen by the finger position 412 in FIG. 4B, the user may start pressing his finger sideways to the first touch input so as to have his first phalange come into touch with the screen. As the user's phalange comes progressively into contact with the GUI, the first touch portion will increase in size. Indeed, processor 112 is capturing more and more contiguous touch input events, i.e. successive touch inputs next to each other. When the user is performing such a touch input, more local touch input events are captured, while previously captured touch inputs are still active. This is what causes the whole touched first portion to increase in size. The new touch
input of the present system can be seen as an extended touch input. This differs from multi-touch inputs where two or more fingers are used. Even if the touched surface on the touch panel increases as more fingers come into touch with the screen, the touch inputs are not contiguous with one another. In some cases when the user performed the first touch input using the tip of his finger, the tip may be released as more touch events are caused by the rest of the phalange. The first touch portion may first be translated sideways, following the new touch events from the moving finger. Nevertheless the touched portion on the GUI will at some point increase in size as the user tries to press his whole finger or the outermost phalange against the screen. As more touch events are detected in the present method, an AP control is imparted as seen in FIG. 4B as the neighborhood 432 changes in size. The AP control here corresponds to an update of the GUI with an increased neighborhood size, for instance proportional to the size of the touched first portion. In FIGs. 4A and 4B, reference is made to a phalange of a finger but the portion of the finger coming into touch with the screen may be larger or smaller than the tip phalange depending on the type of control the user may need to impart on the AP. Furthermore, the first touch input corresponds to a touch input with the tip of the finger 411. Conversely, the start position may be the full finger, or portion thereof, against the GUI, causing the GUI to be updated and to show a large default neighborhood as a large first touched portion has been detected by processor 112 on touch panel 111. As the user lifts his finger off the screen so that only the tip of the finger is left at the end of the motion, the touch input events actually correspond to release events from the screen. In this alternative embodiment of the present system, the new touch input can be seen as an extended release from a full finger touch input.
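To make the "proportional" behaviour of FIG. 4B concrete, a possible mapping from the size of the touched portion to the neighborhood radius is sketched below. The baseline fingertip area and the default radius are invented constants; the patent only states that the neighborhood size may, for instance, be proportional to the size of the touched first portion.

```python
def neighborhood_radius(touched_area_px, default_radius_m=500.0, fingertip_area_px=300.0):
    """Map the area of the touched portion (in pixels) to a neighborhood radius.

    fingertip_area_px is an assumed typical fingertip contact area, taken to
    correspond to the default neighborhood 431; pressing more of the finger
    (or the whole phalange) onto the panel enlarges the area and hence the
    neighborhood, while an extended release shrinks it again.
    """
    return default_radius_m * (touched_area_px / fingertip_area_px)


print(neighborhood_radius(300))  # 500.0 -> default neighborhood for a fingertip touch
print(neighborhood_radius(900))  # 1500.0 -> roughly three times larger touched portion
```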
Exemplary embodiments of the present system and method:
In the description hereafter, the present system will be illustrated using a map application that can be controlled using the novel touch input of the present system. As explained earlier in relation to FIGs. 4A and 4B, the user is presented with a default neighborhood centered on a selected point of the map and may control the size of the neighborhood using the novel touch input. The present
teaching may be generalized to any AP that can be controlled through touch inputs.
FIG. 2 shows an illustrative process flow diagram in accordance with an embodiment of the present system. An application program is running on the processor 112 of the mobile device 110. The application program running (e.g., being executed) by the processor changes the processor into a special purpose processor for operation in accordance with the present system. Such an AP may for instance be a map application, such as Google Maps™ or Yahoo Maps™. The map application may run locally or be a web-based application connected to a distant geo-server hosting the main application. Such an AP is illustrated in FIGs. 4A and 4B mentioned hereinbefore. An optional activation phase may be carried out to trigger the monitoring of the novel touch input of the present system. Such an activation phase may be useful if several types of touch inputs (simple, clutch, ...) can be handled by the touch panel. More generally, the present new touch input may only be activated provided some criteria are matched. In a first additional embodiment of the present system, this activation phase may require:
- the capture of an initial touch input on the GUI, and;
- allowing the capture of the first touch input and further touch input events when the initial touch input matches a predefined criterion. In other words, the new touch input may be monitored only if the initial touch input matches a predefined criterion.
In a second additional embodiment of the present system, this activation phase may require: - allowing the capture of the further touch input events when the first touch input matches a predefined criterion. In other words, the new touch input may be monitored only if the first touch input matches a predefined criterion. Both alternative activation phases are sketched below.
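Purely as an illustrative sketch of these two optional activation phases (the criterion and all function names are hypothetical, not taken from the disclosure), the gating could look as follows, with each alternative expressed as its own predicate:

```python
# Hypothetical sketch of the two optional activation phases (alternative embodiments).

def matches_criterion(touch, region):
    """Illustrative predefined criterion: the touch lies inside region (x, y, width, height)."""
    x, y, width, height = region
    tx, ty = touch
    return x <= tx <= x + width and y <= ty <= y + height

def allow_capture_first_embodiment(initial_touch, region):
    # First additional embodiment: the first touch input and further events are
    # captured only when the initial touch input matches the predefined criterion.
    return matches_criterion(initial_touch, region)

def allow_capture_second_embodiment(first_touch, region):
    # Second additional embodiment: further touch input events are captured only
    # when the first touch input itself matches the predefined criterion.
    return matches_criterion(first_touch, region)

print(allow_capture_first_embodiment((120, 80), (100, 50, 200, 100)))   # True
print(allow_capture_second_embodiment((10, 10), (100, 50, 200, 100)))   # False
```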
In a preliminary act 200, a Graphical User Interface (GUI) of the AP is rendered on the touch panel 111. The GUI as seen in FIG. 4A renders a geographical map, i.e. a map based GUI 400. The map may be for instance a default map based on the user's current location or on a user preferred location, as known from a user profile. In the illustration of FIG. 4A the displayed map is centered on the user's office location. Once the map based GUI is displayed on the touch panel 111, the user may provide an initial touch input on the map so as to select a center of a neighborhood.
In a further act 210, following the initial touch input, map based GUI 400 may be updated to display the default neighborhood 431. The update of the GUI 400 may be server based or provided locally as described hereabove, depending on the AP.
In a further act 220, a touch event listener may be provided so as to monitor any further touch input from the user on GUI 431.
In a subsequent act 230, the user may provide a first touch input that is captured by processor 112 through the touch panel 111. That first touch input corresponds to a first touched portion on the touch panel 111. The touched portion may differ in size depending on the touch panel technology but is generally limited to the tip of the finger surface. Optionally, an activation test 240 may be performed to determine whether the first touch input is provided within the default neighborhood 431. If not (no to act 240), the monitoring of further touch input is ended in act 250. The activation test of act 240 corresponds to the criterion mentioned in the optional activation phase of the second additional embodiment hereabove.
If the first touch input is provided within the default neighborhood 431 (yes to act 240), the new touch input may be monitored and further touch input events may be monitored by processor 112 through touch panel 111. A touch input event may correspond to either a further touch input caused by the user's finger, or a point on the touch panel that is no longer in contact with the user's finger.
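One way to represent such events in code, assumed here solely for illustration, is a small record carrying a panel location and whether that point has just come into contact with the finger or is no longer in contact with it:

```python
# Hypothetical representation of a touch input event: a point either coming into
# contact with the finger (touch) or no longer being in contact with it (release).
from dataclasses import dataclass
from enum import Enum

class EventKind(Enum):
    TOUCH = "touch"      # a further touch input caused by the user's finger
    RELEASE = "release"  # a point on the panel that is no longer touched

@dataclass
class TouchInputEvent:
    x: int
    y: int
    kind: EventKind

# An extended touch is a stream of contiguous TOUCH events;
# an extended release is a stream of contiguous RELEASE events.
events = [TouchInputEvent(100, 100, EventKind.TOUCH),
          TouchInputEvent(103, 100, EventKind.TOUCH)]
print(events)
```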
In the new touch input of the present system, as the user lowers his finger against the touch panel after a first touch input using the tip of his finger, more points from his finger will come into touch with the touch panel 111. When starting from a first touch input wherein the whole finger or a portion of it is held flat against the panel 111, raising the finger away from the touch screen will cause touch input events corresponding to points on the panel becoming untouched. In both types of sequences, touch input events are contiguous. By contiguous, one may understand that consecutive touched - or no longer touched - points on the touch panel are next to each other within the detection range of the touch panel technology. In other words, when a next touch input event is captured, the previously captured touch input event is still ongoing. This could also be seen as continuous touch input events. This differs, as mentioned before, from multi-touch inputs as - even if the touched region increases as more fingers come into touch with the touch panel 111 - the locations of the touch inputs are neither contiguous nor continuous. This does not correspond to an extended touch or extended release of touch.
Whether the user lifts his finger away from the touch screen or lowers it against the touch screen, the first touched portion on the touch panel 111 varies in size. Indeed:
- more points on the finger come into touch with the touch panel when the user lowers or flattens his finger, for instance the first phalange, against the panel,
- fewer points remain in contact with the touch panel as the user raises or lifts his finger away from the panel.
In a further act 260, processor 112 will check whether the further touch inputs are contiguous with one another, for example, causing the first touched portion to increase in size. Depending on the touch panel technology, two contiguous touch inputs may vary in distance. Some touch panel technologies (e.g. surface acoustic wave sensing) may offer a monitoring of the whole touch panel surface, hereafter referred to as continued sensing. Others may be limited to a discrete number of points (e.g. capacitive sensing, resistive sensing, ...), hereafter referred to as discrete sensing.
In any event, when a new touch input event is detected while the previously detected one is still active, i.e. its status has not changed, processor 112 will check if the consecutive touch input events are contiguous.
With a continued sensing technology, processor 112 will detect a series of touch input events contiguous with one another as the finger is moving towards or away from the touch panel. With a discrete sensing technology, processor 112 will compare the distance between two successive touch input events to the detection range of the touch panel. Provided their locations are within the detection range, processor 112 can determine that the touch input events are contiguous.
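A minimal sketch of the comparison described for discrete sensing, assuming a Euclidean distance and a panel-specific detection range (both hypothetical values), might read:

```python
# Hypothetical contiguity check for discrete sensing panels.
import math

DETECTION_RANGE = 8.0  # assumed spacing of the discrete sensing grid, in pixels

def contiguous(previous_event, new_event, detection_range=DETECTION_RANGE):
    """Two successive events are contiguous when the distance between their
    locations is within the detection range of the touch panel."""
    return math.dist(previous_event, new_event) <= detection_range

print(contiguous((100, 100), (105, 102)))  # True: within range, contiguous
print(contiguous((100, 100), (160, 140)))  # False: a separate, non-contiguous touch
```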
In either sensing technology, the further touch input events monitored in act 260 will cause the size of the first touched portion to vary, either increasing (extended touch) or decreasing (extended release). For each touch input event, including the first touch input from the user, processor 112 may associate pixels from the touch panel that correspond to the portion of the touch panel that triggered the touch input event. A possible way to ensure that the touch input events are contiguous, and cause the touched portion on the screen to increase, is to measure this at the pixel level. Provided the additional pixels associated to a further touch input event are in contact with the pixels associated to the previous touch input event, processor 112 will indeed have detected contiguous touch input events.
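The pixel-level check could be sketched as below, under the assumption that each touch input event is associated with a set of (x, y) pixels and that "in contact" means 8-neighbor adjacency:

```python
# Hypothetical pixel-level contiguity check (8-neighbor adjacency assumed).

def pixels_in_contact(previous_pixels, new_pixels):
    """Contiguous if at least one newly touched pixel touches a previously touched pixel."""
    for nx, ny in new_pixels:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if (nx + dx, ny + dy) in previous_pixels:
                    return True
    return False

previous = {(10, 10), (10, 11), (11, 10)}
print(pixels_in_contact(previous, {(12, 10), (12, 11)}))  # True: extends the touched portion
print(pixels_in_contact(previous, {(40, 40)}))            # False: not contiguous
```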
If the further touch input events are not contiguous (no to act 260), the monitoring will end in a further act 265. When the further touch input events are determined to be contiguous, thereby causing the first touched portion to vary in size, an AP control may be imparted by processor 112. In the illustration of FIG. 4B, this corresponds to the neighborhood 432 varying in size. As further touch inputs are monitored as contiguous, an update of the map based GUI 400 may be generated to show a neighborhood varying in size, e.g. following the finger as more touch input events are detected with the present system. An extended touch (as illustrated in FIG. 4B) will result in a larger neighborhood, following for instance any new touch input. The user will feel like the neighborhood contour is following his finger as the finger is coming into touch with the touch panel. Conversely, an extended release (not shown) will result in a smaller neighborhood, and the user will feel like the neighborhood is following his finger as he releases more points from the screen.
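To make the imparted control concrete, the following sketch (hypothetical names and an arbitrary proportionality factor, not the claimed implementation) derives a neighborhood radius from the size of the touched portion, so that an extended touch enlarges the overlay and an extended release shrinks it:

```python
# Hypothetical sketch: neighborhood size proportional to the size of the touched portion.

def neighborhood_from_touch(center, touched_pixels, scale=25.0):
    """Return a circular neighborhood whose radius grows with the touched portion.

    touched_pixels is the set of pixels associated with the contiguous touch input
    events; it grows on an extended touch and shrinks on an extended release.
    The proportionality factor 'scale' is arbitrary and purely illustrative.
    """
    return {"center": center, "radius": len(touched_pixels) / scale}

def render_overlay(gui_updates, neighborhood):
    # Placeholder for the map GUI update (e.g. redrawing overlay 431/432 of FIGs. 4A/4B).
    gui_updates.append(neighborhood)

gui_updates = []
touched = {(100, 100)}                                   # first touch input (finger tip)
render_overlay(gui_updates, neighborhood_from_touch((100, 100), touched))
touched |= {(101, 100), (102, 100), (102, 101)}          # contiguous further touch events
render_overlay(gui_updates, neighborhood_from_touch((100, 100), touched))  # larger neighborhood
print(gui_updates)
```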
In the present system, the application program could either be a stand-alone application resident on the mobile device (such as its operating system for instance) or the client to a web based application (such as a map based application using for instance a client downloaded to the mobile device to upload a map).
FIG. 3 is an illustration of a message flow chart between different parts of the mobile device involved in imparting an AP control according to an exemplary embodiment of the present system.
The different parts in the present system illustrated in FIG. 3 are: - the application program AP, in the present illustration a map based application program,
- the screen or touch panel,
- a neighborhood engine provided to build a neighborhood overlay on a map GUI, and;
- a user interface (UI) engine arranged to render an AP GUI of the application program.
The exemplary flowchart of FIG. 3 corresponds to the exemplary embodiment of the present method described in relation to FIG. 2. In order to receive the initial touch input from the user of act 200, the AP will add a touch event listener to the map based GUI illustrated in FIG. 4A, through processor 112. Once an initial touch input event has been captured on the touch panel, thereby defining a neighborhood center on the map, the AP will request a default neighborhood from a neighborhood engine. The neighborhood engine can be part of the application program or a separate module to define neighborhood characteristics to be rendered on a map based GUI. Once the default neighborhood has been provided by the neighborhood engine, the AP GUI will be updated with the default neighborhood as seen in the illustration of FIG. 4A.
Another touch event listener may be added by the AP (act 220 of FIG. 2) to further capture a first touch input from the user (act 230). The map based AP will check with the neighborhood engine whether the first touch input is located within the default neighborhood. If so, further touch input listeners will be added to the AP GUI (act 250 in FIG. 2, and loop 310 in FIG. 3). If the captured touch input events are contiguous (act 311 in FIG. 3 or yes to act 260 in FIG. 2), the imparted AP control may include (as sketched after this list):
- the calculation of the distance between the center of the neighborhood and the latest captured touch input event, using for instance the neighborhood engine,
- the rendering of an AP GUI update showing the new neighborhood overlay as seen for instance in FIG. 4B.
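The message flow of FIG. 3 could be approximated as in the sketch below, a non-authoritative illustration with hypothetical engine interfaces, showing the listener loop (310), the contiguity test (311), the distance calculation to the neighborhood center, and the AP GUI update:

```python
# Hypothetical end-to-end sketch of the FIG. 3 message flow (all interfaces assumed).
import math

class NeighborhoodEngine:
    """Builds the neighborhood overlay for the map based GUI (hypothetical interface)."""
    def default_neighborhood(self, center):
        return {"center": center, "radius": 10.0}

    def contains(self, neighborhood, point):
        return math.dist(neighborhood["center"], point) <= neighborhood["radius"]

    def resized(self, neighborhood, latest_event):
        # Imparted AP control: the radius follows the distance between the
        # neighborhood center and the latest captured touch input event.
        return {"center": neighborhood["center"],
                "radius": math.dist(neighborhood["center"], latest_event)}

class UiEngine:
    """Renders the AP GUI (hypothetical interface)."""
    def render(self, neighborhood):
        print(f"render overlay: center={neighborhood['center']}, radius={neighborhood['radius']:.1f}")

def run_map_ap(initial_touch, first_touch, further_events, detection_range=8.0):
    engine, ui = NeighborhoodEngine(), UiEngine()
    neighborhood = engine.default_neighborhood(initial_touch)   # acts 200-210
    ui.render(neighborhood)
    if not engine.contains(neighborhood, first_touch):          # act 240
        return                                                  # act 250: stop monitoring
    previous = first_touch
    for event in further_events:                                # loop 310
        if math.dist(previous, event) > detection_range:        # act 311 / act 260
            break                                               # act 265: not contiguous
        neighborhood = engine.resized(neighborhood, event)      # imparted AP control
        ui.render(neighborhood)                                 # AP GUI update (FIG. 4B)
        previous = event

run_map_ap((100, 100), (102, 101), [(105, 101), (109, 102), (114, 103)])
```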
FIG. 5 shows a system 500 in accordance with an embodiment of the present system. The system 500 includes a user device 590 that has a processor 510 operationally coupled to a memory 520, a rendering device 530, such as one or more of a display, speaker, etc., a user input device 570, such as a sensor panel, and a connection 580 operationally coupled to the user device 590. The connection 580 may be an operable connection between the device 590, as a user device, and another device that has similar elements as the device 590, such as a web server such as one or more content providers. The user device may be for instance a mobile phone, a smart phone, a PDA (personal digital assistant) or
any type of wireless portable device. Alternatively, the user device may be an electronic device such as a desktop computer or a server. The present method is suited for a wireless device with a display panel that is also a sensor panel to offer the user an enhanced control over an application program running on the user device.
The memory 520 may be any type of device for storing for instance application data related to an application program controlled through touch inputs, to the operating system of the user device, to a browser as well as other application programs controllable with the present method. The application data are received by the processor 510 for configuring the processor 510 to perform operation acts in accordance with the present system. The processor 510 so configured becomes a special purpose machine particularly suited for performing in accordance with the present system. The operation acts include rendering a GUI of the AP, capturing on the sensor panel a first touch input on the AP GUI and corresponding to a first touched portion of said GUI, and when further captured touch input events are identified as contiguous, imparting an AP control.
The user input 570 may include the sensor panel as well as a keyboard, mouse, trackball, touchpad or other devices, which may be stand-alone or be a part of a system, such as part of a personal computer (e.g., desktop computer, laptop computer, etc.), personal digital assistant, mobile phone, converged device, or other rendering device for communicating with the processor 510 via any type of link, such as a wired or wireless link. The user input device 570 is operable for interacting with the processor 510, including interaction within a paradigm of a GUI and/or other elements of the present system, such as to enable web browsing or selection of a portion of the GUI through a touch input.
In accordance with an embodiment of the present system, the rendering device 530 may operate as a touch sensitive display for communicating with the processor 510 (e.g., providing selection of portions of the AP GUI). In this way, a user may interact with the processor 510, including interaction within a paradigm of a GUI, such as to enable operation of the present system, device and method. Clearly the user device 590, the processor 510, memory 520, rendering device 530 and/or user input device 570 may all or partly be portions of a computer system or other device, and/or be embedded in a portable device, such as a mobile telephone,
personal computer (PC), personal digital assistant (PDA), converged device such as a smart telephone, etc.
The system, device and method described herein address problems in prior art systems. In accordance with an embodiment of the present system, the device 590, corresponding user interfaces and other portions of the system 500 are provided for imparting an enhanced control in accordance with the present system over application programs.
The methods of the present system are particularly suited to be carried out by a computer software program, such program containing modules corresponding to one or more of the individual steps or acts described and/or envisioned by the present system, such as the different engines, the application program, the user interface engine, etc. Such program may of course be embodied in a computer-readable medium, such as an integrated chip, a peripheral device or memory, such as the memory 520 or other memory coupled to the processor 510.
The computer-readable medium and/or memory 520 may be any recordable medium (e.g., RAM, ROM, removable memory, CD-ROM, hard drives, DVD, floppy disks or memory cards) or may be a transmission medium utilizing one or more of radio frequency (RF) coupling, Bluetooth coupling, infrared coupling, etc. Any medium known or developed that can store and/or transmit information suitable for use with a computer system may be used as the computer-readable medium and/or memory 520.
Additional memories may also be used. These memories configure processor 510 to implement the methods, operational acts, and functions disclosed herein. The operation acts may include controlling the rendering device 530 to render elements in a form of a GUI and/or controlling the rendering device 530 to render other information in accordance with the present system.
Moreover, the term "memory" should be construed broadly enough to encompass any information able to be read from or written to an address in the addressable space accessed by a processor. With this definition, information on a network is still within memory 520, for instance, because the processor 510 may retrieve the information from the network for operation in accordance with the present system. For example, a portion of the memory as understood herein may reside as a portion of the content providers, and/or the user device.
The processor 510 is capable of providing control signals and/or performing operations in response to input signals from the user input device 570 and executing instructions stored in the memory 520. The processor 510 may be an application-specific or general-use integrated circuit(s). Further, the processor 510 may be a dedicated processor for performing in accordance with the present system or may be a general-purpose processor wherein only one of many functions operates for performing in accordance with the present system. The processor 510 may operate utilizing a program portion, multiple program segments, or may be a hardware device utilizing a dedicated or multi-purpose integrated circuit.
Finally, the above discussion is intended to be merely illustrative of the present system and should not be construed as limiting the appended claims to any particular embodiment or group of embodiments. Thus, while the present system has been described with reference to exemplary embodiments, including user interfaces, it should also be appreciated that numerous modifications and alternative embodiments may be devised by those having ordinary skill in the art without departing from the broader and intended spirit and scope of the present system as set forth in the claims that follow. For example, in the illustrative description regarding a map AP, the first and further contiguous touch input events together were described as causing the first touched portion to change in size, such as grow in size. In accordance with further embodiments of the present system, the changing in size may represent a control metaphor that may be applied to other types of control and/or to other parameters. For example, with the proposed system, the brightness of a TV may be adjusted by using the touch panel of a remote control. Other applications would readily occur to a person of ordinary skill in the art and are intended to be encompassed in the description of the present system. Further, while exemplary user interfaces are provided to facilitate an understanding of the present system, other user interfaces may be provided and/or elements of one user interface may be combined with another of the user interfaces in accordance with further embodiments of the present system.
The section headings included herein are intended to facilitate a review but are not intended to limit the scope of the present system. Accordingly, the
specification and drawings are to be regarded in an illustrative manner and are not intended to limit the scope of the appended claims.
In interpreting the appended claims, it should be understood that: a) the word "comprising" does not exclude the presence of other elements or acts than those listed in a given claim; b) the word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements; c) any reference signs in the claims do not limit their scope; d) several "means" may be represented by the same item or hardware or software implemented structure or function; e) any of the disclosed elements may be comprised of hardware portions (e.g., including discrete and integrated electronic circuitry), software portions (e.g., computer programming), and any combination thereof; f) hardware portions may be comprised of one or both of analog and digital portions; g) any of the disclosed devices or portions thereof may be combined together or separated into further portions unless specifically stated otherwise; h) no specific sequence of acts or steps is intended to be required unless specifically indicated; and i) the term "plurality of" an element includes two or more of the claimed element, and does not imply any particular range of number of elements; that is, a plurality of elements may be as few as two elements, and may include an immeasurable number of elements.
Claims
1. A method for imparting control to an application program (AP) running on an electronic device, said electronic device comprising a touch panel and processor for controlling said touch panel, said method being carried out by said processor and comprising the acts of: - displaying a graphical user interface (GUI) of the AP on the touch panel;
- capturing a first touch input on the GUI, said first touch input corresponding to a first touched portion of said GUI;
- monitoring further touch input events on the GUI,
- imparting an AP control in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI, causing the first touched portion to change in size.
2. A method according to claim 1, further comprising the act of: - allowing the capture of the further touch input events when the first touch input matches a predefined criterion.
3. A method according to claim 1, further comprising prior to the act of capturing a first touch input on the GUI the acts of: - capturing of an initial touch input on the GUI, and;
- allowing the capture of the first touch input and further touch input events when the initial touch input matches a predefined criterion.
4. A method according to claim 1, wherein the first touch input and the further touch input events are associated to pixels, and wherein the act of determining that the further touch input events are contiguous comprises an act of determining that the corresponding pixels are contiguous.
5. A method according to claim 1, wherein the further touch input events are touch inputs.
6. A method according to claim 1, wherein the further touch input events are touch releases.
7. A mobile device for imparting control to an application program (AP) running on said mobile device, said mobile device being arranged to:
- display a graphical user interface (GUI) of the AP on the touch panel;
- capture a first touch input on the GUI, said first touch input corresponding to a first touched portion of said GUI;
- monitor further touch input events on the GUI,
- impart an AP control in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI, causing the first touched portion to change in size.
8. An application embodied on a computer readable medium and arranged to impart control to an application program (AP) running on a mobile device, the application comprising: - instructions to display a graphical user interface (GUI) of the AP on the touch panel;
- instructions to capture a first touch input on the GUI, said first touch input corresponding to a first touched portion of said GUI;
- instructions to monitor further touch input events on the GUI, - instructions to impart an AP control in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI, causing the first touched portion to change in size.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17802509P | 2009-05-13 | 2009-05-13 | |
US61/178,025 | 2009-05-13 | | |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2010131122A2 (en) | 2010-11-18 |
WO2010131122A3 (en) | 2011-01-06 |
Family
ID=43003481
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2010/001808 WO2010131122A2 (en) | User interface to provide enhanced control of an application program | 2009-05-13 | 2010-05-11 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2010131122A2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103309591A (en) * | 2012-03-09 | 2013-09-18 | 宏碁股份有限公司 | touchable electronic device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12044508B2 (en) | 2021-06-03 | 2024-07-23 | Fechheimer Brothers Company | Cover for ballistic carrier |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9292111B2 (en) * | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US7663607B2 (en) * | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
US20070097096A1 (en) * | 2006-03-25 | 2007-05-03 | Outland Research, Llc | Bimodal user interface paradigm for touch screen devices |
US20080024454A1 (en) * | 2006-07-31 | 2008-01-31 | Paul Everest | Three-dimensional touch pad input device |
US7973778B2 (en) * | 2007-04-16 | 2011-07-05 | Microsoft Corporation | Visual simulation of touch pressure |
2010-05-11: WO PCT/IB2010/001808 patent/WO2010131122A2/en, active, Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2010131122A3 (en) | 2011-01-06 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10750164; Country of ref document: EP; Kind code of ref document: A2 |
 | NENP | Non-entry into the national phase in: | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 10750164; Country of ref document: EP; Kind code of ref document: A2 |