US20130265264A1 - Electronic device with switchable user interface and electronic device with accessible touch operation - Google Patents
- Publication number
- US20130265264A1 (application Ser. No. 13/907,968)
- Authority
- US
- United States
- Prior art keywords
- touch
- display
- user interface
- electronic device
- finger
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present invention generally relates to an electronic device.
- the present invention relates to an electronic device with a switchable user interface and an electronic device with accessible touch operation.
- touch screens are being adopted as a new input interface to replace traditional keyboards in electronic products, such as notebook computers, cell phones, and portable multi-media players.
- when a user uses a traditional cell phone to input text or select a menu item, the user has to click the keyboard while looking at the image displayed on the screen at the same time.
- touch screen is a more convenient input technique.
- the sizes of touch screens on some compact handheld electronic devices are very limited; therefore, the sizes of menu items in the user interfaces have to be reduced so that the screens can display as many function options as possible.
- OS: Windows operating system
- the Windows OS supports many different functions and these functions are mostly displayed in the “start” function list as hierarchic menu items.
- to start a specific function, the user has to click the “start” function list, then look for the menu item corresponding to the specific function in the function list, and eventually click that menu item to start the specific function.
- a desired menu has to be opened through such hierarchical selection, which is very inconvenient on a handheld electronic device.
- the case of a traditional handheld electronic device is usually closely attached around the display area of the touch display and is much higher than the touch sensing surface of the touch display.
- the protruding part of the case may encumber the operation of an input tool (for example, a finger or a stylus) and may hurt the user's finger; therefore the user cannot touch the pixels at the edges of the display area quickly and effectively and accordingly cannot operate the user interface smoothly.
- the non-display area of the touch display may also sense touches and is usually covered by the case of a handheld electronic device, which not only obstructs the operation of the user but also restricts the application of the touch display.
- the present invention is directed to an electronic device with a switchable user interface and an electronic device with accessible touch operation.
- the electronic device activates, closes and switches user interfaces according to a position and a moving direction of a touch generated by an input tool touching a touch display.
- the present invention is directed to an electronic device with a switchable user interface.
- the electronic device comprises a display, a touch sensing means, a position detecting module and a processing module.
- the touch sensing means is suitable for detecting a touch of an input tool.
- the position detecting module coupled to the display and the touch sensing means is suitable for determining whether or not the touch is generated on a specific area of the touch sensing means and determining whether or not a position of the touch on the touch sensing means is changed if the touch is generated on the specific area.
- the processing module coupled to the position detecting module is suitable for activating a user interface and displaying the user interface on the display if the position of the touch on the touch sensing means is changed.
- the present invention further provides an electronic device with a switchable user interface.
- the electronic device comprises a display, a touch sensing means, a position detecting module, a direction detecting module and a processing module.
- the touch sensing means is suitable for detecting a touch of an input tool.
- the position detecting module is coupled to the display and the touch sensing means and suitable for determining whether or not the touch is generated on a specific area of the touch sensing means, and determining whether or not a position of the touch on the touch sensing means is changed if the touch is generated on the specific area.
- the direction detecting module coupled to the position detecting module is suitable for detecting a moving direction of the touch.
- the processing module coupled to the position detecting module and the direction detecting module is suitable for switching a first user interface to a second user interface on the display according to the moving direction if the position of the touch on the touch sensing means is changed.
- the present invention further provides an electronic device with accessible touch operation which is suitable for activating a user interface.
- the electronic device comprises a case, a touch display and a processor.
- the case has an opening, and the touch display disposed in the opening of the case.
- the touch display is suitable for receiving a touch of an input tool and has a touch sensing surface.
- An external surface of the case is substantially not higher than the touch sensing surface.
- the processor coupled to the touch display is suitable for determining whether or not the touch is generated on a specific area of the touch display, determining whether or not a position of the touch on the touch display is changed if the touch is generated on the specific area, and activating and displaying the user interface on the touch display if the position of the touch on the touch display is changed.
- the present invention further provides an electronic device with accessible touch operation which is suitable for switching a user interface.
- the electronic device comprises a case, a touch display and a processor. Wherein the case has an opening and the touch display is disposed in the opening of the case.
- the touch display is suitable for receiving a touch of an input tool, and the touch display has a touch sensing surface.
- An external surface of the case is substantially not higher than the touch sensing surface.
- the processor coupled to the touch display is suitable for determining whether or not the touch is generated on a specific area of the touch display, determining whether or not a position of the touch on the touch display is changed if the touch is generated on the specific area, detecting a moving direction of the touch, and switching the user interface to another user interface on the touch display according to the moving direction if the position of the touch on the touch display is changed.
- the electronic device provided by the present invention comprises a touch sensing surface for receiving the operation of the input tool, wherein the touch sensing surface and the case of the electronic device are located on the same surface.
- the electronic device activates, closes and switches user interfaces according to the position and the moving direction of the touch generated by the input tool, so as to enter the desired user interface more quickly and increase the convenience of operating the electronic device.
- FIG. 1 is a flowchart illustrating a method for operating a user interface according to an embodiment of the present invention.
- FIG. 2 is a diagram of an electronic device according to an embodiment of the present invention.
- FIG. 3 is a diagram of a user interface according to an embodiment of the present invention.
- FIG. 4 is a diagram of a user interface according to an embodiment of the present invention.
- FIG. 5 is a diagram of a user interface according to an embodiment of the present invention.
- FIG. 6 is a diagram of a 3D motion user interface according to an embodiment of the present invention.
- FIG. 7 is a diagram of an electronic device with a switchable user interface according to an embodiment of the present invention.
- FIG. 8 is a flowchart illustrating a method for operating a user interface according to another embodiment of the present invention.
- FIG. 9 is a diagram of an electronic device with a switchable user interface according to an embodiment of the present invention.
- FIG. 10A is a perspective view of an electronic device with accessible touch operation according to an embodiment of the present invention.
- FIG. 10B is a cross-sectional view of the electronic device in FIG. 10A .
- the present invention is directed to a method for operating a user interface which offers foregoing advantages. Embodiments of the present invention will be described in detail with reference to the accompanying drawings.
- FIG. 1 is a flowchart illustrating a method for operating a user interface according to an embodiment of the present invention.
- the electronic device may be a cell phone, a personal digital assistant (PDA), a smart phone, or a notebook computer, etc., and the scope thereof is not limited herein.
- PDA: personal digital assistant
- a touch generated while touching the touch display with an input tool is detected in step 110 , wherein the input tool may be a finger of the user or a stylus.
- the touch display 210 is divided into a display area 211 and a non-display area 213 , wherein the display area 211 is a well-known touch screen area which is used for both displaying images and receiving operations such as text input or menu selections from the user. Even though the non-display area 213 cannot display images, it can still sense user contacts and therefore can be used for detecting the operation of the input tool.
- the display area 211 and the non-display area 213 of the touch display 210 may be located on the same surface or different surfaces, which is not restricted herein.
- the specific area includes a marginal area close to an edge of the display area (i.e. the area going inwardly from an edge of the display area within the size of a finger contact, for example, the marginal area 215 as shown in FIG. 2 ), a central area of the display area 211 (i.e. the area in the display area excluding the marginal area, for example, the central area 230 as shown in FIG. 2 ), or the non-display area of the touch display (for example, the non-display area 213 as shown in FIG. 2 ).
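The three specific areas described above can be sketched as a simple hit test. This is an illustrative sketch only: the function name, coordinate convention, and the assumed finger-contact width are not taken from the patent.

```python
# Classify a touch point into the marginal area (a finger-width band along
# the display edges), the central area, or the non-display area.
FINGER_WIDTH = 40  # assumed size of a finger contact, in pixels


def classify_touch(x, y, disp_w, disp_h):
    """Return which specific area a touch at (x, y) falls into.

    The display area spans 0..disp_w by 0..disp_h; any point outside it
    is treated as belonging to the non-display (touch-only) area.
    """
    if not (0 <= x < disp_w and 0 <= y < disp_h):
        return "non-display"
    near_edge = (x < FINGER_WIDTH or x >= disp_w - FINGER_WIDTH or
                 y < FINGER_WIDTH or y >= disp_h - FINGER_WIDTH)
    return "marginal" if near_edge else "central"
```

For instance, on a 320x480 display area, a touch at (5, 100) lands in the marginal band, while (160, 240) lands in the central area.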
- in step 130, whether or not the position of the touch on the touch display is changed is determined.
- a moving distance of the touch has to be calculated, wherein the moving distance may be a moving distance produced within a particular time (for example, 0.2 second).
- the moving distance is compared with a predetermined value, and it is determined that the position of the touch on the touch display is changed if the distance is greater than the predetermined value.
- the change in the position of the touch shows that the input tool moves within the particular time after it touches the touch display.
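The time-bounded distance check above can be sketched as follows. The sample format, threshold value, and function name are illustrative assumptions; only the 0.2-second window comes from the text.

```python
import math

MOVE_THRESHOLD = 10.0   # assumed "predetermined value", in pixels
SAMPLE_WINDOW = 0.2     # the "particular time" mentioned above, in seconds


def position_changed(samples):
    """Decide whether the touch position changed, given (t, x, y) samples.

    Only movement accumulated within SAMPLE_WINDOW of the first sample
    is considered, mirroring the time-bounded distance calculation.
    """
    if not samples:
        return False
    t0, x0, y0 = samples[0]
    moved = 0.0
    for t, x, y in samples[1:]:
        if t - t0 > SAMPLE_WINDOW:
            break
        moved = math.hypot(x - x0, y - y0)
    return moved > MOVE_THRESHOLD
```

A drag of 20 pixels within the window counts as a position change; a 3-pixel jitter does not, which is what prevents a simple tap from being treated as a gesture.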
- the user interface may be activated according to the position of the specific area.
- the user interface is dragged out from a display edge of the touch display and then displayed, while in the present embodiment, the user interface is dragged out from the edge of the display area closest to the touch and then displayed in the display area.
- the marginal area 215 is used as the specific area for activating a user interface in the electronic device 200 , after the user touches the marginal area 215 with his/her finger and moves his/her finger a distance longer than the predetermined value, a new user interface is dragged out from the edge 217 of the display area 211 and displayed in the display area 211 .
- the non-display area 213 is used as the specific area for activating a user interface in the electronic device 200 , after the user touches the touch point 220 in the non-display area 213 with his/her finger and moves his/her finger a distance longer than the predetermined value, the user interface is activated at the edge closest to the touch point 220 (i.e. the edge 217 of the display area 211 ) and dragged into the display area 211 .
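Activating the interface at "the edge closest to the touch point" reduces to a nearest-edge computation; a minimal sketch, with an assumed function name and coordinate convention:

```python
def closest_edge(x, y, disp_w, disp_h):
    """Return the display edge nearest to the touch point (x, y).

    The new user interface would be dragged out from this edge into
    the display area.
    """
    distances = {
        "left": x,
        "right": disp_w - x,
        "top": y,
        "bottom": disp_h - y,
    }
    return min(distances, key=distances.get)
```

A touch near the left border of a 320x480 display area, e.g. at (5, 240), selects the left edge as the entry edge of the interface.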
- the activated user interface may be a user interface having a finger-touchable icon (for example, the user interface 300 as shown in FIG. 3 ), a user interface having a finger-touchable image (for example, the user interface 400 as shown in FIG. 4 ), or a common user interface (for example, the user interface 500 as shown in FIG. 5 ), wherein the finger-touchable image may be a miniature of the screenshot of an application program.
- a user interface has been displayed in an electronic device, and another user interface can be activated, switched, or closed on the original user interface from a display edge of the touch display (for example, the edge of the display area closest to the touch) through the same or a similar procedure as illustrated in FIG. 1 .
- a user can activate a common user interface (for example, the user interface 500 as shown in FIG. 5 ) on a user interface having a finger-touchable icon (for example, the user interface 300 as shown in FIG. 3 ).
- the user can activate a common user interface (for example, the user interface 500 as shown in FIG. 5 ) on a user interface having a finger-touchable image (for example, the user interface 400 as shown in FIG. 4 ).
- the user also can activate a user interface having a finger-touchable icon or a finger-touchable image (for example, the user interface 300 in FIG. 3 and the user interface 400 in FIG. 4 ) on a common user interface (for example, the user interface 500 as shown in FIG. 5 ).
- the user can activate a user interface having a finger-touchable image (or finger-touchable icon) on another user interface having a finger-touchable image (or finger-touchable icon).
- the original user interface and the user interface activated later on are respectively located on two surfaces (for example, two adjacent surfaces) of a 3D motion user interface (for example, the 3D motion user interface 610 as shown in FIG. 6 ).
- the 3D motion user interface 610 may be a polyhedron or a cube, and the scope thereof is not limited herein.
- the 3D motion user interface 610 rotates around an axis. For example, as shown in FIG. 6 , the 3D motion user interface 610 may rotate upwards around the axis X 611 , rotate rightwards around the axis Z 613 , or rotate around the tilted axis 615 .
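One way to choose among the rotations named above is to map the dominant component of the drag vector to an axis. This mapping is an assumption for illustration; the patent does not specify how the axis is selected.

```python
def rotation_for_drag(dx, dy):
    """Pick a rotation of the 3D motion user interface from a drag vector.

    A strongly horizontal drag rotates around the Z axis, a strongly
    vertical drag around the X axis, and a diagonal drag around a
    tilted axis (thresholds are illustrative).
    """
    if abs(dx) > 2 * abs(dy):
        return "rotate rightwards around Z" if dx > 0 else "rotate leftwards around Z"
    if abs(dy) > 2 * abs(dx):
        return "rotate upwards around X" if dy < 0 else "rotate downwards around X"
    return "rotate around tilted axis"
```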
- the method for operating a user interface described above may be executed in any system having computer functionalities.
- foregoing embodiment may be implemented as a computer program, and the computer program can be stored in a computer readable recording medium (for example, a CD-ROM or a hard disk) and loaded into a computer system so that foregoing method for operating a user interface can be executed.
- FIG. 7 is a diagram of an electronic device with a switchable user interface according to an embodiment of the present invention.
- the electronic device 700 includes a display 710 , a touch sensing means 720 , a position detecting module 730 , and a processing module 740 .
- the touch sensing means 720 detects a touch generated by an input tool (for example, a finger of a user or a stylus).
- the touch sensing means 720 may be a touch pad or a touch panel.
- the touch sensing means 720 and the display 710 form a touch display if the touch sensing means 720 is a touch panel.
- the touch panel includes a display area and a non-display area corresponding to the display 710 , wherein the display area and the non-display area may be located on the same surface or different surfaces.
- the position detecting module 730 is connected to the display 710 and the touch sensing means 720 and is used for determining whether or not the touch is generated on a specific area of the touch sensing means 720 .
- the specific area includes a marginal area of the display area, a central area of the display area excluding the marginal area, or the non-display area. If the touch is generated on the specific area, the position detecting module 730 determines whether or not the position of the touch on the touch display is changed.
- the processing module 740 is connected to the position detecting module 730 , and when the position detecting module 730 determines that the position of the touch on the touch display is changed, the processing module 740 activates and displays a user interface on the display 710 according to the position of the specific area.
- a user can operate the electronic device 700 to activate a new user interface or to activate another user interface for replacing an existing user interface by using a finger or a stylus.
- the operating method is similar to the embodiment described above and therefore will not be repeated herein.
- FIG. 8 is a flowchart illustrating a method for operating a user interface according to another embodiment of the present invention.
- the touch display is divided into a display area and a non-display area.
- step 810 when the user operates the electronic device using an input tool, a touch generated while the input tool touches the touch display is detected, wherein the input tool may be a finger of the user or a stylus.
- the specific area may be a marginal area in the display area of the touch display (for example, the marginal area 215 as shown in FIG. 2 ), a central area in the display area excluding the marginal area (for example, the central area 230 as shown in FIG. 2 ), or a non-display area (for example, the non-display area 213 as shown in FIG. 2 ).
- whether the position of the touch on the touch display is changed is then determined in step 830 . For example, when the user drags the input tool on the touch display, the position of the touch between the input tool and the touch display changes, and accordingly a moving distance of the touch is calculated. It is determined that the position of the touch on the touch display is changed when the moving distance is greater than a predetermined value.
- thus, the action of the user selecting a menu item with the input tool will not be misinterpreted as an action to switch a user interface, and accordingly an incorrect response of the electronic device can be avoided.
- the user interface is switched according to the moving direction of the touch.
- the user interface is dragged out from a display edge of the touch display opposite to the moving direction of the touch.
- the user interface may be a user interface having a finger-touchable icon (for example, the user interface 300 as shown in FIG. 3 ), a user interface having a finger-touchable image (for example, the user interface 400 as shown in FIG. 4 ), or a common user interface (for example, the user interface 500 as shown in FIG. 5 ), wherein the finger-touchable image may be a miniature of the screenshot of an application program.
- the central area 230 in FIG. 2 is used as the specific area and the touch generated when the input tool touches the touch display is located at the touch point 231 , then after the user drags the input tool to the right of the touch display to a distance longer than the predetermined value, the user interface slides out from the edge 233 (i.e. the edge of the display area to the left of the touch point 231 ) and is displayed in the display area 211 .
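The example above, where the new interface slides out from the edge opposite to the moving direction, can be sketched as a small lookup. Function and table names are illustrative assumptions:

```python
# Dragging in one direction slides the incoming interface in from the
# opposite display edge, as in the central-area example above.
OPPOSITE_EDGE = {
    "right": "left",
    "left": "right",
    "down": "top",
    "up": "bottom",
}


def entry_edge(dx, dy):
    """Return the edge the incoming interface slides out from: the
    display edge opposite to the dominant moving direction of the touch."""
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return OPPOSITE_EDGE[direction]
```

A rightward drag (dx > 0) therefore yields the left edge, matching the slide-out from edge 233 described above.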
- a user interface may have been activated in an electronic device, and this original user interface can be switched through a similar method as described above.
- Switching a user interface includes closing an original user interface, activating another user interface on the original user interface, and switching from the original user interface to another user interface.
- a user interface having a finger-touchable icon or a finger-touchable image (for example, the user interface 300 in FIG. 3 or the user interface 400 in FIG. 4 ) may be switched to a common user interface (for example, the user interface 500 as shown in FIG. 5 ).
- a common user interface may also be switched to a user interface having a finger-touchable icon or a finger-touchable image.
- a user interface having a finger-touchable icon or a finger-touchable image may also be switched to another user interface having a finger-touchable icon or a finger-touchable image.
- the original user interface and the user interface activated later on are respectively located on two surfaces (for example, two adjacent surfaces) of a 3D motion user interface (for example, the 3D motion user interface 610 as shown in FIG. 6 ).
- the 3D motion user interface 610 may be a polyhedron or a cube. As shown in FIG. 6 , the 3D motion user interface 610 rotates around an axis. For example, the 3D motion user interface 610 may rotate upwards around the axis X 611 , rotate rightwards around the axis Z 613 , or rotate around a tilted axis 615 .
- the method for operating a user interface described above may be executed in any system having computer functionalities.
- foregoing embodiment may be implemented as a computer program, and the computer program can be stored in a computer readable recording medium (for example, a CD-ROM or a hard disk) and loaded into a computer system so that foregoing method for operating a user interface can be executed.
- FIG. 9 is a diagram of an electronic device with a switchable user interface according to an embodiment of the present invention.
- the electronic device 900 includes a display 910 , a touch sensing means 920 , a position detecting module 930 , a direction detecting module 940 , and a processing module 950 .
- the touch sensing means 920 detects a touch generated by an input tool (for example, a finger of a user or a stylus).
- the touch sensing means 920 may be a touch pad or a touch panel.
- the touch sensing means 920 and the display 910 form a touch display if the touch sensing means 920 is a touch panel.
- the touch panel includes a display area and a non-display area corresponding to the display 910 , wherein the display area and the non-display area may be located on the same surface or different surfaces, which is not restricted herein.
- the position detecting module 930 is connected to both the display 910 and the touch sensing means 920 and determines whether the touch is generated on a specific area of the touch sensing means 920 .
- the specific area includes a marginal area of the display area, a central area excluding the marginal area in the display area, or the non-display area. If the touch is generated on the specific area, the position detecting module 930 determines whether or not the position of the touch on the touch display is changed.
- the direction detecting module 940 is connected to the position detecting module 930 and detects the moving direction of the touch.
- the processing module 950 is connected to both the position detecting module 930 and the direction detecting module 940 , and when the position detecting module 930 determines that the position of the touch on the touch display is changed, the processing module 950 switches the user interface on the display 910 according to the moving direction of the touch detected by the direction detecting module 940 .
- a user can operate the electronic device 900 to switch a user interface or to switch to a new user interface on an existing user interface by using a finger or a stylus.
- the operating method is similar to the embodiment described above and therefore will not be repeated herein.
- FIG. 10A is a perspective view of an electronic device with accessible touch operation according to an embodiment of the present invention
- FIG. 10B is a cross-sectional view of the electronic device in FIG. 10A
- the electronic device (e.g. a handheld electronic device) includes a case 1001 , a touch display 1002 , and a processor 1003 .
- the case 1001 has an external surface 1004 and a containing space 1005 , wherein the containing space 1005 is connected to the surroundings through an opening 1006 on the external surface 1004 .
- the touch display 1002 includes a display 1007 and a touch sensing means 1008 .
- the display 1007 is disposed in the containing space 1005 of the case 1001 .
- the touch sensing means 1008 is disposed in the opening 1006 on the external surface 1004 of the case 1001 and receives a touch generated by an input tool.
- the touch sensing means 1008 has a touch sensing surface 1009 , wherein the touch sensing surface 1009 includes a display area 1010 and a non-display area 1011 .
- the fringe of the opening 1006 on the case 1001 is connected continuously to the touch sensing surface 1009 , and the external surface 1004 of the case 1001 is not higher than the touch sensing surface 1009 .
- the case 1001 does not include the hotkeys or buttons of a conventional electronic device.
- the processor 1003 is coupled to the display 1007 and the touch sensing means 1008 and determines whether or not the touch is generated on a specific area of the touch display 1002 . If the touch is generated on the specific area, then the processor 1003 determines whether the position of the touch on the touch display is changed and detects the moving direction of the touch. When the position of the touch on the touch display is changed, the processor 1003 activates and displays a user interface or switches the user interface in the touch display 1002 according to the moving direction of the touch.
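The processor's overall decision flow can be sketched as a small dispatcher. All names and the input representation here are illustrative assumptions, not the patent's implementation:

```python
def handle_touch(touch, ui_active):
    """Return the action the processor takes for a touch gesture.

    touch: dict with 'in_specific_area' (bool), 'moved' (bool), and
           'direction' (str) keys, summarizing the detected gesture.
    ui_active: whether a user interface is already displayed.
    """
    if not touch["in_specific_area"]:
        return "ignore"            # touch outside the specific area
    if not touch["moved"]:
        return "ignore"            # no position change: treat as a plain tap
    if ui_active:
        # an interface is already shown: switch it according to direction
        return f"switch interface ({touch['direction']})"
    return "activate interface"    # no interface yet: activate and display one
```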
- because the external surface 1004 of the case 1001 is not higher than the touch sensing surface 1009 , the external surface 1004 of the case 1001 and the touch sensing surface 1009 form a continuous smooth surface which allows the user to move and operate the input tool without obstruction.
- unlike in the prior art, the non-display area 1011 exposed by the touch sensing surface 1009 is not covered by the case 1001 ; thus, the input tool can be moved and operated without obstruction, the non-display area 1011 can be fully utilized, and more convenient operations can be provided to the user.
- the processor 1003 can activate, switch, or close a user interface according to the position and moving direction of a touch generated while the input tool touches the touch sensing means 1008 .
- the details of the method, such as the operation procedure and functionalities thereof, have been described in the foregoing embodiment and therefore will not be repeated herein.
- the user interface can be activated, switched and closed according to the position and the moving direction of the touch generated by the input tool.
- the convenience of using the input tool can be increased, and an easier operation way is provided for the user to improve the efficiency of operating the electronic device.
Abstract
An electronic device with switchable user interface and an electronic device with accessible touch operation are provided. The electronic device with switchable user interface includes a display, a touch sensing means, a position detecting module and a processing module. The touch sensing means is used for sensing a touch of an input tool. The position detecting module is used for determining whether or not the touch is generated on a specific area of the touch sensing means. If the touch is generated on the specific area, the position detecting module determines whether the position of the touch is varied. The processing module coupled to the position detecting module is used for activating a user interface on the display if there is a position variation of the touch. Consequently, the convenience of operating the electronic device is improved.
Description
- This application is a continuation application of and claims the priority benefit of U.S. application Ser. No. 12/109,359, filed on Apr. 25, 2008, which claims the priority benefit of Taiwan application serial no. 96117290, filed on May 15, 2007. All disclosure of the Taiwan application and co-pending US patent application (Ser. No. 12/109,357, filed on Apr. 25, 2008) filed concurrently by the same applicant, which claims the priority benefit of Taiwan application serial no. 96117292, filed on May 15, 2007, are incorporated herein by reference.
- 1. Field of the Invention
- The present invention generally relates to an electronic device. In particular, the present invention relates to an electronic device with a switchable user interface and an electronic device with accessible touch operation.
- 2. Description of Related Art
- Along with the advancement of pointing stick and touch pad techniques, touch screens are being adopted as a new input interface to replace traditional keyboards in electronic products such as notebook computers, cell phones, and portable multimedia players. For example, when a user uses a traditional cell phone to input text or select a menu item, the user has to press keys on the keyboard while looking at the image displayed on the screen at the same time. However, if a cell phone having a touch screen is used, the user can input text or start an application program directly on the screen by using a stylus. Accordingly, the touch screen is a more convenient input technique.
- The sizes of touch screens on some compact handheld electronic devices are very limited; therefore, the sizes of menu items in the user interfaces have to be reduced so that the screens can display as many function options as possible. For example, regarding a handheld electronic device built with a Windows operating system (OS), the Windows OS supports many different functions, and these functions are mostly displayed in the "start" function list as hierarchic menu items. Thus, if a user wants to start a specific function, the user has to click the "start" function list, then look for the menu item corresponding to the specific function in the function list, and eventually click that menu item to start the specific function. A desired menu has to be opened through such hierarchical selection, which is very inconvenient on a handheld electronic device.
- Besides, if the user loses the stylus or forgets to bring it with him/her, the user may have to input text or select menu items with his/her finger. In this case, an incorrect menu item may be selected, and accordingly an undesired user interface may be activated, due to the small size of the menu. The user then has to return to the previous level of the user interface and re-select the menu item to activate the correct user interface. As described above, unnecessary operation time is consumed in such a situation. Accordingly, how to allow a user to activate various user interfaces on a touch screen directly with his/her finger, and accordingly to operate these user interfaces conveniently, is one of the most important factors for improving the convenience of operating an electronic device.
- In addition, the case of a traditional handheld electronic device is usually closely attached around the display area of the touch display and is much higher than the touch sensing surface of the touch display. The protruding part of the case may encumber the operation of an input tool (for example, a finger or a stylus) and may hurt the user's finger; therefore, the user cannot touch the pixels at the edges of the display area quickly and effectively, and accordingly cannot operate the user interface smoothly. In addition, the non-display area of the touch display may also sense touches but is usually covered by the case of a handheld electronic device, which not only obstructs the operation of the user but also restricts the application of the touch display.
- Accordingly, the present invention is directed to an electronic device with a switchable user interface and an electronic device with accessible touch operation, wherein the electronic device activates, closes, and switches user interfaces according to a position and a moving direction of a touch generated by an input tool on a touch display.
- The present invention is directed to an electronic device with a switchable user interface. The electronic device comprises a display, a touch sensing means, a position detecting module, and a processing module, wherein the touch sensing means is suitable for detecting a touch of an input tool. The position detecting module, coupled to the display and the touch sensing means, is suitable for determining whether or not the touch is generated on a specific area of the touch sensing means and determining whether or not a position of the touch on the touch sensing means is changed if the touch is generated on the specific area. The processing module, coupled to the position detecting module, is suitable for activating a user interface and displaying the user interface on the display if the position of the touch on the touch sensing means is changed.
- The present invention further provides an electronic device with a switchable user interface, wherein the electronic device comprises a display, a touch sensing means, a position detecting module, a direction detecting module, and a processing module. The touch sensing means is suitable for detecting a touch of an input tool. The position detecting module is coupled to the display and the touch sensing means and is suitable for determining whether or not the touch is generated on a specific area of the touch sensing means, and determining whether or not a position of the touch on the touch sensing means is changed if the touch is generated on the specific area. The direction detecting module, coupled to the position detecting module, is suitable for detecting a moving direction of the touch. The processing module, coupled to the position detecting module and the direction detecting module, is suitable for switching a first user interface to a second user interface on the display according to the moving direction if the position of the touch on the touch sensing means is changed.
- The present invention further provides an electronic device with accessible touch operation which is suitable for activating a user interface, wherein the electronic device comprises a case, a touch display, and a processor. The case has an opening, and the touch display is disposed in the opening of the case. The touch display is suitable for receiving a touch of an input tool and has a touch sensing surface. An external surface of the case is substantially not higher than the touch sensing surface. The processor, coupled to the touch display, is suitable for determining whether or not the touch is generated on a specific area of the touch display, determining whether or not a position of the touch on the touch display is changed if the touch is generated on the specific area, and activating and displaying the user interface on the touch display if the position of the touch on the touch display is changed.
- The present invention further provides an electronic device with accessible touch operation which is suitable for switching a user interface. The electronic device comprises a case, a touch display, and a processor, wherein the case has an opening and the touch display is disposed in the opening of the case. The touch display is suitable for receiving a touch of an input tool, and the touch display has a touch sensing surface. An external surface of the case is substantially not higher than the touch sensing surface. The processor, coupled to the touch display, is suitable for determining whether or not the touch is generated on a specific area of the touch display, determining whether or not a position of the touch on the touch display is changed if the touch is generated on the specific area, detecting a moving direction of the touch, and switching the user interface to another user interface on the touch display according to the moving direction if the position of the touch on the touch display is changed.
- The electronic device provided by the present invention comprises a touch sensing surface for receiving the operation of the input tool, wherein the touch sensing surface and the case of the electronic device are located on the same surface. The electronic device activates, closes, and switches user interfaces according to the position and the moving direction of the touch generated by the input tool, so as to enter the desired user interface more quickly and increase the convenience of operating the electronic device.
- The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
- FIG. 1 is a flowchart illustrating a method for operating a user interface according to an embodiment of the present invention.
- FIG. 2 is a diagram of an electronic device according to an embodiment of the present invention.
- FIG. 3 is a diagram of a user interface according to an embodiment of the present invention.
- FIG. 4 is a diagram of a user interface according to an embodiment of the present invention.
- FIG. 5 is a diagram of a user interface according to an embodiment of the present invention.
- FIG. 6 is a diagram of a 3D motion user interface according to an embodiment of the present invention.
- FIG. 7 is a diagram of an electronic device with a switchable user interface according to an embodiment of the present invention.
- FIG. 8 is a flowchart illustrating a method for operating a user interface according to another embodiment of the present invention.
- FIG. 9 is a diagram of an electronic device with a switchable user interface according to an embodiment of the present invention.
- FIG. 10A is a perspective view of an electronic device with accessible touch operation according to an embodiment of the present invention.
- FIG. 10B is a cross-sectional view of the electronic device in FIG. 10A .
- Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
- While using an electronic device having a touch display, the operation time could be shortened and the operation efficiency could be improved if different user interfaces could be activated or switched without selecting any menu items. Accordingly, the present invention is directed to a method for operating a user interface which offers the foregoing advantages. Embodiments of the present invention will be described in detail with reference to the accompanying drawings.
-
FIG. 1 is a flowchart illustrating a method for operating a user interface according to an embodiment of the present invention. In the present embodiment, the detailed procedure of activating a user interface in the touch display of an electronic device will be described, wherein the electronic device may be a cell phone, a personal digital assistant (PDA), a smart phone, or a notebook computer, etc., and the scope thereof is not limited herein. - Referring to
FIG. 1 , first, a touch generated while touching the touch display with an input tool is detected in step 110 , wherein the input tool may be a finger of the user or a stylus. In the electronic device 200 illustrated in FIG. 2 , the touch display 210 is divided into a display area 211 and a non-display area 213 , wherein the display area 211 is a well-known touch screen area which is used for both displaying images and receiving operations such as text input or menu selections from the user. Even though the non-display area 213 cannot display images, it can still sense user contacts and therefore can be used for detecting the operation of the input tool. In the present embodiment, the display area 211 and the non-display area 213 of the touch display 210 may be located on the same surface or different surfaces, which is not restricted herein. - Next, in
step 120 , whether or not the touch is generated on a specific area of the touch display is determined. The specific area includes a marginal area close to an edge of the display area (i.e. the area going inwardly from an edge of the display area within the size of a finger contact, for example, the marginal area 215 as shown in FIG. 2 ), a central area of the display area 211 (i.e. the area in the display area excluding the marginal area, for example, the central area 230 as shown in FIG. 2 ), or the non-display area of the touch display (for example, the non-display area 213 as shown in FIG. 2 ). - If the touch is generated on the specific area of the touch display, then in
step 130 , whether or not the position of the touch on the touch display is changed is determined. To be specific, to determine whether or not the position of the touch on the touch display is changed, first, a moving distance of the touch has to be calculated, wherein the moving distance may be a moving distance produced within a particular time (for example, 0.2 second). Next, the moving distance is compared with a predetermined value, and it is determined that the position of the touch on the touch display is changed if the distance is greater than the predetermined value. In other words, the change in the position of the touch shows that the input tool moves within the particular time after it touches the touch display. - Finally, in
step 140 , the user interface may be activated according to the position of the specific area. To activate the user interface, the user interface is dragged out from a display edge of the touch display and then displayed; in the present embodiment, the user interface is dragged out from the edge of the display area closest to the touch and then displayed in the display area. - For example, if the
marginal area 215 is used as the specific area for activating a user interface in the electronic device 200 , after the user touches the marginal area 215 with his/her finger and moves his/her finger a distance longer than the predetermined value, a new user interface is dragged out from the edge 217 of the display area 211 and displayed in the display area 211 . - In another embodiment of the present invention, if the
non-display area 213 is used as the specific area for activating a user interface in the electronic device 200 , after the user touches the touch point 220 in the non-display area 213 with his/her finger and moves his/her finger a distance longer than the predetermined value, the user interface is activated at the edge closest to the touch point 220 (i.e. the edge 217 of the display area 211 ) and dragged into the display area 211 . - In the embodiment described above, the activated user interface may be a user interface having a finger-touchable icon (for example, the
user interface 300 as shown in FIG. 3 ), a user interface having a finger-touchable image (for example, the user interface 400 as shown in FIG. 4 ), or a common user interface (for example, the user interface 500 as shown in FIG. 5 ), wherein the finger-touchable image may be a miniature of the screenshot of an application program. - In another embodiment of the present invention, a user interface has been displayed in an electronic device, and another user interface can be activated, switched, or closed on the original user interface from a display edge of the touch display (for example, the edge of the display area closest to the touch) through the same or a similar procedure as illustrated in
FIG. 1 . - For example, through the method described above, a user can activate a common user interface (for example, the
user interface 500 as shown in FIG. 5 ) on a user interface having a finger-touchable icon (for example, the user interface 300 as shown in FIG. 3 ). Or, the user can activate a common user interface (for example, the user interface 500 as shown in FIG. 5 ) on a user interface having a finger-touchable image (for example, the user interface 400 as shown in FIG. 4 ). Moreover, the user can also activate a user interface having a finger-touchable icon or a finger-touchable image (for example, the user interface 300 in FIG. 3 and the user interface 400 in FIG. 4 ) on a common user interface (for example, the user interface 500 as shown in FIG. 5 ). Furthermore, the user can activate a user interface having a finger-touchable image (or finger-touchable icon) on another user interface having a finger-touchable image (or finger-touchable icon). - In the present embodiment, the original user interface and the user interface activated later on are respectively located on two surfaces (for example, two adjacent surfaces) of a 3D motion user interface (for example, the 3D
motion user interface 610 as shown in FIG. 6 ). The 3D motion user interface 610 may be a polyhedron or a cube, and the scope thereof is not limited herein. The 3D motion user interface 610 rotates around an axis. For example, as shown in FIG. 6 , the 3D motion user interface 610 may rotate upwards around the axis X 611 , rotate rightwards around the axis Z 613 , or rotate around the tilted axis 615 . - It should be mentioned that the method for operating a user interface described above may be executed in any system having computer functionalities. In other words, the foregoing embodiment may be implemented as a computer program, and the computer program can be stored in a computer readable recording medium (for example, a CD-ROM or a hard disk) and loaded into a computer system so that the foregoing method for operating a user interface can be executed.
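- The activation procedure of steps 110 through 140 can be sketched in Python as follows. This is only an illustrative sketch: the screen geometry, the 40-pixel finger-contact margin, the 30-pixel predetermined value, and all function names are assumptions for illustration; only the 0.2-second particular time is taken from the embodiment above.

```python
import math

# Illustrative sketch of steps 110-140 (assumed geometry and thresholds).
DISPLAY_W, DISPLAY_H = 480, 640   # size of the display area 211 (assumed)
PANEL_H = 720                     # full touch sensing surface height (assumed)
MARGIN = 40                       # size of a finger contact, in pixels (assumed)
TIME_WINDOW = 0.2                 # "particular time" from the text, in seconds
MIN_DISTANCE = 30                 # "predetermined value", in pixels (assumed)

def classify_area(x, y):
    """Step 120: determine which specific area the touch falls in."""
    if DISPLAY_H <= y < PANEL_H:
        return "non-display"
    near_edge = (x < MARGIN or x >= DISPLAY_W - MARGIN or
                 y < MARGIN or y >= DISPLAY_H - MARGIN)
    return "marginal" if near_edge else "central"

def touch_moved(samples):
    """Step 130: did the touch move farther than the predetermined value
    within the particular time? samples is a list of (t, x, y), oldest first."""
    t0, x0, y0 = samples[0]
    return any(math.hypot(x - x0, y - y0) > MIN_DISTANCE
               for t, x, y in samples[1:] if t - t0 <= TIME_WINDOW)

def activation_edge(x, y):
    """Step 140: the display edge closest to the touch, from which the
    user interface is dragged out."""
    x, y = min(max(x, 0), DISPLAY_W), min(max(y, 0), DISPLAY_H)
    d = {"left": x, "right": DISPLAY_W - x, "top": y, "bottom": DISPLAY_H - y}
    return min(d, key=d.get)
```

Under these assumptions, a touch at (240, 700) lies in the non-display area, and a qualifying movement there would drag the user interface in from the bottom edge of the display area, mirroring the touch point 220 example above.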
-
FIG. 7 is a diagram of an electronic device with a switchable user interface according to an embodiment of the present invention. Referring to FIG. 7 , the electronic device 700 includes a display 710 , a touch sensing means 720 , a position detecting module 730 , and a processing module 740 . - The touch sensing means 720 detects a touch generated by an input tool (for example, a finger of a user or a stylus). In the present embodiment, the touch sensing means 720 may be a touch pad or a touch panel. The touch sensing means 720 and the
display 710 form a touch display if the touch sensing means 720 is a touch panel. Accordingly, the touch panel includes a display area and a non-display area corresponding to the display 710 , wherein the display area and the non-display area may be located on the same surface or different surfaces. - The
position detecting module 730 is connected to the display 710 and the touch sensing means 720 and is used for determining whether or not the touch is generated on a specific area of the touch sensing means 720 . In the present embodiment, the specific area includes a marginal area of the display area, a central area of the display area excluding the marginal area, or the non-display area. If the touch is generated on the specific area, the position detecting module 730 determines whether or not the position of the touch on the touch display is changed. - The
processing module 740 is connected to the position detecting module 730 , and when the position detecting module 730 determines that the position of the touch on the touch display is changed, the processing module 740 activates and displays a user interface on the display 710 according to the position of the specific area. - Through the user interface operating method described above, a user can operate the
electronic device 700 to activate a new user interface or to activate another user interface to replace an existing one by using a finger or a stylus. The operating method is similar to that of the embodiment described above and therefore will not be described herein. -
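- The arrangement of FIG. 7 might be wired up as in the following sketch, where the position detecting module reports a qualifying movement and the processing module activates the interface. The class names, the area-test callback, and the 30-pixel threshold are invented for illustration and are not part of the embodiment.

```python
class PositionDetectingModule:
    """Sketch of module 730: tracks a touch that began in the specific area
    and reports whether its position has changed beyond a threshold."""

    def __init__(self, in_specific_area, threshold=30):
        self.in_specific_area = in_specific_area   # callback: (x, y) -> bool
        self.threshold = threshold
        self.origin = None

    def touch_down(self, x, y):
        # Only a touch that begins in the specific area is tracked further.
        self.origin = (x, y) if self.in_specific_area(x, y) else None

    def position_changed(self, x, y):
        if self.origin is None:          # touch began outside the specific area
            return False
        dx, dy = x - self.origin[0], y - self.origin[1]
        return (dx * dx + dy * dy) ** 0.5 > self.threshold


class ProcessingModule:
    """Sketch of module 740: activates a user interface on the display."""

    def __init__(self):
        self.active_ui = None

    def activate(self, ui_name):
        self.active_ui = ui_name         # drag-in animation omitted in this sketch
```

A touch landing in an assumed left marginal area and dragged past the threshold would then activate an interface, for example: `detector.touch_down(10, 100)` followed by `detector.position_changed(80, 100)` returning `True`.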
FIG. 8 is a flowchart illustrating a method for operating a user interface according to another embodiment of the present invention. In the present embodiment, the method provided by the present invention for switching a user interface in an electronic device will be further described in detail. The touch display is divided into a display area and a non-display area. Referring to FIG. 8 , first, in step 810 , when the user operates the electronic device using an input tool, a touch generated while the input tool touches the touch display is detected, wherein the input tool may be a finger of the user or a stylus. - Next, in
step 820, whether the touch is generated on a specific area of the touch display is determined. In the present embodiment, the specific area may be a marginal area in the display area of the touch display (for example, themarginal area 215 as shown inFIG. 2 ), a central area in the display area excluding the marginal area (for example, thecentral area 230 as shown inFIG. 2 ), or a non-display area (for example, thenon-display area 213 as shown inFIG. 2 ). - If the touch is generated on the specific area, whether the position of the touch on the touch display is changed is then determined in
step 830 . For example, when the user drags the input tool on the touch display, the position of the touch between the input tool and the touch display is then changed, and accordingly a moving distance of the touch is calculated. It is determined that the position of the touch on the touch display is changed when the moving distance is greater than a predetermined value. To be specific, by calculating the moving distance of the touch within a particular time and comparing the moving distance with the predetermined value to determine whether the position of the touch on the touch display is changed, the action of the user to select a menu item with the input tool will not be misunderstood as an action to switch a user interface, and accordingly an incorrect response of the electronic device can be avoided. - Finally, in
step 840 , the user interface is switched according to the moving direction of the touch. To switch a user interface, the user interface is dragged out from a display edge of the touch display opposite to the moving direction of the touch. In the present embodiment, the user interface may be a user interface having a finger-touchable icon (for example, the user interface 300 as shown in FIG. 3 ), a user interface having a finger-touchable image (for example, the user interface 400 as shown in FIG. 4 ), or a common user interface (for example, the user interface 500 as shown in FIG. 5 ), wherein the finger-touchable image may be a miniature of the screenshot of an application program. - For example, if the
central area 230 in FIG. 2 is used as the specific area and the touch generated when the input tool touches the touch display is located at the touch point 231 , then after the user drags the input tool to the right of the touch display to a distance longer than the predetermined value, the user interface slides out from the edge 233 (i.e. the edge of the display area to the left of the touch point 231 ) and is displayed in the display area 211 . - In another embodiment of the present invention, a user interface may have been activated in an electronic device, and this original user interface can be switched through a similar method as described above. Switching a user interface includes closing an original user interface, activating another user interface on the original user interface, and switching from the original user interface to another user interface. For example, through the operating method described above, a user interface having a finger-touchable icon or a finger-touchable image (for example, the
user interface 300 in FIG. 3 or the user interface 400 in FIG. 4 ) may be switched to a common user interface (for example, the user interface 500 as shown in FIG. 5 ). Contrarily, a common user interface may also be switched to a user interface having a finger-touchable icon or a finger-touchable image. Besides, a user interface having a finger-touchable icon or a finger-touchable image may also be switched to another user interface having a finger-touchable icon or a finger-touchable image. - In the present embodiment, the original user interface and the user interface activated later on are respectively located on two surfaces (for example, two adjacent surfaces) of a 3D motion user interface (for example, the 3D
motion user interface 610 as shown in FIG. 6 ). The 3D motion user interface 610 may be a polyhedron or a cube. As shown in FIG. 6 , the 3D motion user interface 610 rotates around an axis. For example, the 3D motion user interface 610 may rotate upwards around the axis X 611 , rotate rightwards around the axis Z 613 , or rotate around a tilted axis 615 . - The method for operating a user interface described above may be executed in any system having computer functionalities. In other words, the foregoing embodiment may be implemented as a computer program, and the computer program can be stored in a computer readable recording medium (for example, a CD-ROM or a hard disk) and loaded into a computer system so that the foregoing method for operating a user interface can be executed.
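- The rule of step 840 — the incoming interface slides out from the display edge opposite to the moving direction of the touch — can be sketched as below. The function name and the tie-breaking toward the horizontal axis are assumptions for illustration.

```python
def slide_in_edge(dx, dy):
    """Map a touch displacement (dx, dy) to the display edge from which the
    replacement user interface is dragged out, i.e. the edge opposite to the
    moving direction. Positive dy means a downward drag in this sketch."""
    if abs(dx) >= abs(dy):                       # mostly horizontal drag
        return "left" if dx > 0 else "right"
    return "top" if dy > 0 else "bottom"
```

Under this sketch, dragging the input tool rightwards from the touch point 231 (dx > 0) makes the new interface slide out from the left edge, matching the edge 233 example above.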
-
FIG. 9 is a diagram of an electronic device with a switchable user interface according to an embodiment of the present invention. Referring to FIG. 9 , the electronic device 900 includes a display 910 , a touch sensing means 920 , a position detecting module 930 , a direction detecting module 940 , and a processing module 950 . - The touch sensing means 920 detects a touch generated by an input tool (for example, a finger of a user or a stylus). In the present embodiment, the touch sensing means 920 may be a touch pad or a touch panel. The touch sensing means 920 and the
display 910 form a touch display if the touch sensing means 920 is a touch panel. The touch panel includes a display area and a non-display area corresponding to the display 910 , wherein the display area and the non-display area may be located on the same surface or different surfaces, which is not restricted herein. - The
position detecting module 930 is connected to both the display 910 and the touch sensing means 920 and determines whether the touch is generated on a specific area of the touch sensing means 920 . In the present embodiment, the specific area includes a marginal area of the display area, a central area excluding the marginal area in the display area, or the non-display area. If the touch is generated on the specific area, the position detecting module 930 determines whether or not the position of the touch on the touch display is changed. - The
direction detecting module 940 is connected to the position detecting module 930 and detects the moving direction of the touch. The processing module 950 is connected to both the position detecting module 930 and the direction detecting module 940 , and when the position detecting module 930 determines that the position of the touch on the touch display is changed, the processing module 950 switches the user interface on the display 910 according to the moving direction of the touch detected by the direction detecting module 940 . - Through the user interface operating method described above, a user can operate the
electronic device 900 to switch a user interface or to switch to a new user interface on an existing one by using a finger or a stylus. The operating method is similar to that of the embodiment described above and therefore will not be described herein. -
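- The direction detecting module 940 could be sketched as a small class that reduces a pair of touch positions to one of four moving directions. The class name and the four-way quantization are assumptions for illustration; the embodiment only states that the module detects the moving direction of the touch.

```python
class DirectionDetectingModule:
    """Sketch of module 940: quantizes the movement of a touch into one of
    four moving directions for the processing module to act on."""

    def moving_direction(self, start, end):
        dx, dy = end[0] - start[0], end[1] - start[1]
        if dx == dy == 0:
            return None                      # no movement detected
        if abs(dx) >= abs(dy):               # dominant horizontal component
            return "right" if dx > 0 else "left"
        return "down" if dy > 0 else "up"    # positive y points downward here
```

For example, a touch that travels from (10, 10) to (80, 12) is reported as moving "right", which the processing module 950 could then map to a leftward slide-in of the next user interface.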
FIG. 10A is a perspective view of an electronic device with accessible touch operation according to an embodiment of the present invention, and FIG. 10B is a cross-sectional view of the electronic device in FIG. 10A . The electronic device (e.g. a handheld electronic device) includes a case 1001 , a touch display 1002 , and a processor 1003 . The case 1001 has an external surface 1004 and a containing space 1005 , wherein the containing space 1005 is connected to the surroundings through an opening 1006 on the external surface 1004 . The touch display 1002 includes a display 1007 and a touch sensing means 1008 . The display 1007 is disposed in the containing space 1005 of the case 1001 . The touch sensing means 1008 is disposed in the opening 1006 on the external surface 1004 of the case 1001 and receives a touch generated by an input tool. The touch sensing means 1008 has a touch sensing surface 1009 , wherein the touch sensing surface 1009 includes a display area 1010 and a non-display area 1011 . The fringe of the opening 1006 on the case 1001 is connected continuously to the touch sensing surface 1009 , and the external surface 1004 of the case 1001 is not higher than the touch sensing surface 1009 . Here the case 1001 does not include the hotkeys or buttons of the electronic device. The processor 1003 is coupled to the display 1007 and the touch sensing means 1008 and determines whether or not the touch is generated on a specific area of the touch display 1002 . If the touch is generated on the specific area, then the processor 1003 determines whether the position of the touch on the touch display is changed and detects the moving direction of the touch. When the position of the touch on the touch display is changed, the processor 1003 activates and displays a user interface or switches the user interface in the touch display 1002 according to the moving direction of the touch. - It should be mentioned that since the
external surface 1004 of the case 1001 is not higher than the touch sensing surface 1009 , the external surface 1004 of the case 1001 and the touch sensing surface 1009 form a continuous smooth surface which allows the user to move and operate the input tool without obstruction. Moreover, the non-display area 1011 exposed by the touch sensing surface 1009 is not covered by the case 1001 as in the prior art; thus, in the design of an electronic device, an input tool can be moved and operated without obstruction, and besides, the non-display area 1011 can be fully utilized and more convenient operations can be provided to the user. - As described in the foregoing embodiment, the
processor 1003 can activate, switch, or close a user interface according to the position and moving direction of a touch generated while the input tool touches the touch sensing means 1008 . The details of the method, such as the operation procedure and functionalities thereof, have been described in the foregoing embodiment and therefore will not be described herein. - In overview, in the electronic device with the switchable user interface and the electronic device with the accessible touch operation described in the above embodiments, the user interface can be activated, switched, and closed according to the position and the moving direction of the touch generated by the input tool. As a result, the convenience of using the input tool is increased, and an easier way of operation is provided for the user, improving the efficiency of operating the electronic device. -
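- Putting the pieces together, the behavior of the processor 1003 — activate, switch, or close according to where the touch starts and how it moves — might be dispatched as in this sketch. The concrete (area, direction) pairings in the rule table are invented for illustration; the embodiments only state that all three operations depend on the position and the moving direction of the touch.

```python
# Hypothetical dispatch table for processor 1003: the pair of the specific
# area where the touch began and its moving direction selects one of the
# operations described above. These pairings are illustrative assumptions.
RULES = {
    ("non-display", "up"):    ("activate", "finger-touchable icon UI"),
    ("marginal",    "right"): ("switch",   "common UI"),
    ("central",     "down"):  ("close",    None),
}

def dispatch(area, direction):
    """Return (operation, target user interface) for a detected touch."""
    return RULES.get((area, direction), ("ignore", None))
```

Any (area, direction) pair outside the table is ignored, which corresponds to a touch that does not qualify as a user-interface operation.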
- It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the present invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the present invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.
Claims (20)
1. An electronic device, comprising:
a display configured to display a first user interface, which has a plurality of finger-touchable icons, wherein each of the finger-touchable icons corresponds to an application program;
a touch sensing means configured to detect a touch of an input tool and having a specific area;
a position detecting module coupled to the touch sensing means and configured to determine whether the touch is generated on the specific area of the touch sensing means and to determine whether a position of the touch on the touch sensing means is changed if the touch is generated on the specific area; and
a processing module coupled to the position detecting module and configured to activate a second user interface, which has at least one event notification, by dragging the second user interface out from a display edge of the display to replace the first user interface on the display if the touch is generated on the specific area and the position of the touch on the touch sensing means is changed;
wherein the processing module is further configured to close the second user interface on the first user interface if the position detecting module determines that another touch is generated on another specific area of the touch sensing means and that a position of the another touch is changed.
2. The electronic device according to claim 1 , wherein the event notification is associated with a personal schedule.
3. The electronic device according to claim 1 , wherein the event notification is associated with an unread message.
4. The electronic device according to claim 1 , wherein the second user interface further has a time or a date.
5. The electronic device according to claim 1 , wherein the position detecting module is further configured to calculate a moving distance of the touch, determine whether the moving distance is greater than a predetermined value, and determine that the position of the touch is changed if the moving distance is greater than the predetermined value.
6. The electronic device according to claim 5 , wherein the position detecting module is configured to calculate the moving distance of the touch within a predetermined time.
7. The electronic device according to claim 1 , wherein the touch sensing means is a touch panel, and the touch panel and the display constitute a touch display.
8. The electronic device according to claim 7 , wherein the touch panel comprises a display area and a non-display area on the display, and the display area and the non-display area are located on the same surface.
9. The electronic device according to claim 8 , wherein the specific area is a marginal area of the display area, the non-display area, or a central area of the display area.
10. The electronic device according to claim 1 , wherein the finger-touchable icons are arranged in an array form.
11. The electronic device according to claim 1 , wherein the electronic device is a personal digital assistant (PDA) or a smart phone.
12. An electronic device, comprising:
a case having an external surface and an opening defined therein;
a touch display disposed in the opening of the case and configured to display a first user interface, which has a plurality of finger-touchable icons, and to receive a touch of an input tool, wherein each of the finger-touchable icons corresponds to an application program, the touch display has a touch sensing surface, and the external surface of the case is substantially not higher than the touch sensing surface; and
a processing means coupled to the touch display and configured to determine whether the touch is generated on a specific area of the touch display, to determine whether a position of the touch on the touch display is changed if the touch is generated on the specific area, and to activate a second user interface, which has at least one event notification, by dragging the second user interface out from a display edge of the touch display to replace the first user interface on the touch display if the touch is generated on the specific area and the position of the touch on the touch display is changed;
wherein the processing means is further configured to close the second user interface on the first user interface if the processing means determines that another touch is generated on another specific area of the touch display and that a position of the another touch is changed.
13. The electronic device according to claim 12 , wherein the event notification is associated with a personal schedule.
14. The electronic device according to claim 12 , wherein the event notification is associated with an unread message.
15. The electronic device according to claim 12 , wherein the second user interface further has a time or a date.
16. The electronic device according to claim 12 , wherein the processing means is further configured to calculate a moving distance of the touch within a predetermined time, determine whether or not the moving distance is greater than a predetermined value, and determine that the position of the touch is changed if the moving distance is greater than the predetermined value.
17. The electronic device according to claim 12 , wherein the finger-touchable icons are arranged in an array form.
18. The electronic device according to claim 12 , wherein the electronic device is a personal digital assistant (PDA) or a smart phone.
19. An electronic device, comprising:
a touch display configured to display a first user interface, which has a plurality of finger-touchable icons arranged in a first array form, and configured to detect a touch of an input tool, wherein the touch display has a display area and a non-display area, and each of the plurality of finger-touchable icons corresponds to a first application program;
a position detecting module coupled to the touch display for determining whether or not the touch is generated on the non-display area of the touch display; and
a processing module coupled to the position detecting module and configured to switch the first user interface, having the plurality of finger-touchable icons arranged in the first array form, to a second user interface, which has at least one finger-touchable image, on the touch display if the touch is generated on the non-display area, wherein the finger-touchable image is a screenshot miniature of a second application program.
20. The electronic device according to claim 19 , wherein the display area and the non-display area are located on the same surface.
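The position-change test recited in claims 5, 6, and 16 — a moving distance exceeding a predetermined value within a predetermined time — can be sketched as below. The threshold constants and the sample format are assumptions chosen for illustration; the claims deliberately leave the actual values open.

```python
import math

# Illustrative thresholds; the claims do not fix concrete values.
MIN_DISTANCE = 30.0   # "predetermined value" (pixels)
MAX_DURATION = 0.5    # "predetermined time" (seconds)

def position_changed(samples):
    """samples: list of (t, x, y) touch samples, earliest first.

    Returns True when the touch moved farther than MIN_DISTANCE within
    MAX_DURATION of the first sample, mirroring the distance/time test
    of claims 5, 6, and 16.
    """
    t0, x0, y0 = samples[0]
    for t, x, y in samples[1:]:
        if t - t0 > MAX_DURATION:
            break  # outside the predetermined time window
        if math.hypot(x - x0, y - y0) > MIN_DISTANCE:
            return True
    return False
```

Filtering on both distance and elapsed time distinguishes a deliberate drag from a stationary press or an accidental graze, which is the practical point of the claimed test.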
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/907,968 US20130265264A1 (en) | 2007-05-15 | 2013-06-02 | Electronic device with switchable user interface and electronic device with accessible touch operation |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW96117290 | 2007-05-15 | ||
TW096117290A TWI337321B (en) | 2007-05-15 | 2007-05-15 | Electronic device with switchable user interface and accessable touch operation |
US12/109,359 US8456442B2 (en) | 2007-05-15 | 2008-04-25 | Electronic device with switchable user interface and electronic device with accessible touch operation |
US13/907,968 US20130265264A1 (en) | 2007-05-15 | 2013-06-02 | Electronic device with switchable user interface and electronic device with accessible touch operation |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/109,359 Continuation US8456442B2 (en) | 2007-05-15 | 2008-04-25 | Electronic device with switchable user interface and electronic device with accessible touch operation |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130265264A1 true US20130265264A1 (en) | 2013-10-10 |
Family
ID=39498230
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/109,359 Active 2031-04-09 US8456442B2 (en) | 2007-05-15 | 2008-04-25 | Electronic device with switchable user interface and electronic device with accessible touch operation |
US13/907,968 Abandoned US20130265264A1 (en) | 2007-05-15 | 2013-06-02 | Electronic device with switchable user interface and electronic device with accessible touch operation |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/109,359 Active 2031-04-09 US8456442B2 (en) | 2007-05-15 | 2008-04-25 | Electronic device with switchable user interface and electronic device with accessible touch operation |
Country Status (3)
Country | Link |
---|---|
US (2) | US8456442B2 (en) |
EP (1) | EP2003539A1 (en) |
TW (1) | TWI337321B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105892907A (en) * | 2016-03-25 | 2016-08-24 | 乐视控股(北京)有限公司 | Use method of rolling screen capture, and terminal |
CN106657619A (en) * | 2016-11-30 | 2017-05-10 | 努比亚技术有限公司 | Screenshot method and device |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI461045B (en) | 2008-05-02 | 2014-11-11 | Htc Corp | Handheld electronic device and executing application method and digital data storage media |
US20100083108A1 (en) * | 2008-09-26 | 2010-04-01 | Research In Motion Limited | Touch-screen device having soft escape key |
TWI397852B (en) * | 2008-11-12 | 2013-06-01 | Htc Corp | Function selection systems and methods, and machine readable medium thereof |
TW201035822A (en) * | 2009-03-23 | 2010-10-01 | Coretronic Display Solution Corp | Touch display system and control method thereof |
TW201035829A (en) * | 2009-03-31 | 2010-10-01 | Compal Electronics Inc | Electronic device and method of operating screen |
JP5563250B2 (en) * | 2009-06-30 | 2014-07-30 | 株式会社ジャパンディスプレイ | Stereoscopic image display device |
TWI445384B (en) * | 2010-04-26 | 2014-07-11 | Htc Corp | Method, communication devices, and computer program product for controlling communication |
TW201142777A (en) * | 2010-05-28 | 2011-12-01 | Au Optronics Corp | Sensing display panel |
TWI435261B (en) | 2010-08-17 | 2014-04-21 | Wistron Corp | Electronic device and method for implementing icon board based operation interface thereof |
TWI438662B (en) * | 2010-12-01 | 2014-05-21 | Wintek China Technology Ltd | Touch panel and touch display panel having the same |
US9436301B2 (en) | 2011-06-29 | 2016-09-06 | Google Technology Holdings LLC | Portable electronic device having interchangeable user interfaces and method thereof |
US9411442B2 (en) | 2011-06-29 | 2016-08-09 | Google Technology Holdings LLC | Electronic device having managed input components |
CA2808792C (en) * | 2011-08-12 | 2016-02-16 | Research In Motion Limited | Portable electronic device and method of controlling same |
US8884892B2 (en) | 2011-08-12 | 2014-11-11 | Blackberry Limited | Portable electronic device and method of controlling same |
TWI461963B (en) | 2011-08-17 | 2014-11-21 | Wistron Corp | Computer keyboard and control method thereof |
EP2581817A1 (en) * | 2011-10-14 | 2013-04-17 | Research In Motion Limited | System and method for controlling an electronic device having a touch-sensitive non-display area |
US8610684B2 (en) | 2011-10-14 | 2013-12-17 | Blackberry Limited | System and method for controlling an electronic device having a touch-sensitive non-display area |
TWI462069B (en) * | 2011-12-30 | 2014-11-21 | Au Optronics Corp | Touch display panel |
KR102157270B1 (en) * | 2013-04-26 | 2020-10-23 | 삼성전자주식회사 | User terminal device with a pen and control method thereof |
CN104978134B (en) * | 2014-04-08 | 2018-06-12 | 宏碁股份有限公司 | Method for detecting upper cover touch to adjust display picture and electronic device |
JP2017182259A (en) * | 2016-03-29 | 2017-10-05 | パナソニックIpマネジメント株式会社 | Display processing apparatus and display processing program |
CN106599170A (en) * | 2016-12-09 | 2017-04-26 | 联想(北京)有限公司 | Media file browsing method and information processing equipment |
US10790682B2 (en) | 2018-03-30 | 2020-09-29 | Intel Corporation | Hybrid power boost charging with peak power protection |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5715416A (en) * | 1994-09-30 | 1998-02-03 | Baker; Michelle | User definable pictorial interface for a accessing information in an electronic file system |
US20060020899A1 (en) * | 2004-04-26 | 2006-01-26 | Microsoft Corporation | Scaling icons for representing files |
US20060101350A1 (en) * | 2004-11-09 | 2006-05-11 | Research In Motion Limited | Dynamic bar oriented user interface |
US20060209040A1 (en) * | 2005-03-18 | 2006-09-21 | Microsoft Corporation | Systems, methods, and computer-readable media for invoking an electronic ink or handwriting interface |
US7166791B2 (en) * | 2002-07-30 | 2007-01-23 | Apple Computer, Inc. | Graphical user interface and methods of use thereof in a multimedia player |
US20070091194A1 (en) * | 2005-10-24 | 2007-04-26 | Samsung Electronics Co., Ltd. | Apparatus to provide a screen capturing function and a method of providing the screen capturing function |
US20070211034A1 (en) * | 2006-02-13 | 2007-09-13 | Griffin Jason T | Handheld wireless communication device with function keys in exterior key columns |
US7289083B1 (en) * | 2000-11-30 | 2007-10-30 | Palm, Inc. | Multi-sided display for portable computer |
US20080066016A1 (en) * | 2006-09-11 | 2008-03-13 | Apple Computer, Inc. | Media manager with integrated browsers |
US20080201307A1 (en) * | 1998-06-12 | 2008-08-21 | Swartz Gregory J | System and method for iconic software environment management |
US20080205772A1 (en) * | 2006-10-06 | 2008-08-28 | Blose Andrew C | Representative image selection based on hierarchical clustering |
US20100017872A1 (en) * | 2002-12-10 | 2010-01-21 | Neonode Technologies | User interface for mobile computer unit |
US20100214250A1 (en) * | 2001-05-16 | 2010-08-26 | Synaptics Incorporated | Touch screen with user interface enhancement |
US7899829B1 (en) * | 2005-12-14 | 2011-03-01 | Unifi Scientific Advances, Inc. | Intelligent bookmarks and information management system based on same |
US8533199B2 (en) * | 2005-12-14 | 2013-09-10 | Unifi Scientific Advances, Inc | Intelligent bookmarks and information management system based on the same |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5821930A (en) * | 1992-08-23 | 1998-10-13 | U S West, Inc. | Method and system for generating a working window in a computer system |
JPH09190268A (en) | 1996-01-11 | 1997-07-22 | Canon Inc | Information processor and method for processing information |
US5729219A (en) | 1996-08-02 | 1998-03-17 | Motorola, Inc. | Selective call radio with contraposed touchpad |
US5956025A (en) * | 1997-06-09 | 1999-09-21 | Philips Electronics North America Corporation | Remote with 3D organized GUI for a home entertainment system |
US5943052A (en) * | 1997-08-12 | 1999-08-24 | Synaptics, Incorporated | Method and apparatus for scroll bar control |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US6292173B1 (en) * | 1998-09-11 | 2001-09-18 | Stmicroelectronics S.R.L. | Touchpad computer input system and method |
US6897833B1 (en) * | 1999-09-10 | 2005-05-24 | Hewlett-Packard Development Company, L.P. | Portable user interface |
US6690365B2 (en) | 2001-08-29 | 2004-02-10 | Microsoft Corporation | Automatic scrolling |
US7249327B2 (en) * | 2002-03-22 | 2007-07-24 | Fuji Xerox Co., Ltd. | System and method for arranging, manipulating and displaying objects in a graphical user interface |
US11275405B2 (en) * | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
US20050168441A1 (en) * | 2002-11-05 | 2005-08-04 | Fujitsu Limited | Display control device, display control method, computer product |
CN1291303C (en) | 2003-12-05 | 2006-12-20 | 陞达科技股份有限公司 | Method for controlling scrolling of window screen of electronic device |
JP4855654B2 (en) | 2004-05-31 | 2012-01-18 | ソニー株式会社 | On-vehicle device, on-vehicle device information providing method, on-vehicle device information providing method program, and on-vehicle device information providing method program |
CN1306377C (en) | 2004-07-14 | 2007-03-21 | 义隆电子股份有限公司 | How to control the scrolling of the scroll on the touchpad |
US7636889B2 (en) | 2006-01-06 | 2009-12-22 | Apple Inc. | Controlling behavior of elements in a display environment |
CN100381997C (en) | 2006-04-29 | 2008-04-16 | 怡利电子工业股份有限公司 | Menu selection method of touch key |
US20070281668A1 (en) * | 2006-05-31 | 2007-12-06 | Cisco Technology, Inc. | Dialing assistant that includes an interface with a geographic display |
US7864163B2 (en) * | 2006-09-06 | 2011-01-04 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying structured electronic documents |
KR101558207B1 (en) * | 2009-09-14 | 2015-10-07 | 엘지전자 주식회사 | Item setting method of mobile terminal and mobile terminal |
2007
- 2007-05-15 TW TW096117290A patent/TWI337321B/en active
2008
- 2008-04-25 US US12/109,359 patent/US8456442B2/en active Active
- 2008-04-29 EP EP08008196A patent/EP2003539A1/en not_active Ceased
2013
- 2013-06-02 US US13/907,968 patent/US20130265264A1/en not_active Abandoned
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5715416A (en) * | 1994-09-30 | 1998-02-03 | Baker; Michelle | User definable pictorial interface for a accessing information in an electronic file system |
US20080201307A1 (en) * | 1998-06-12 | 2008-08-21 | Swartz Gregory J | System and method for iconic software environment management |
US7289083B1 (en) * | 2000-11-30 | 2007-10-30 | Palm, Inc. | Multi-sided display for portable computer |
US20100214250A1 (en) * | 2001-05-16 | 2010-08-26 | Synaptics Incorporated | Touch screen with user interface enhancement |
US7166791B2 (en) * | 2002-07-30 | 2007-01-23 | Apple Computer, Inc. | Graphical user interface and methods of use thereof in a multimedia player |
US20100017872A1 (en) * | 2002-12-10 | 2010-01-21 | Neonode Technologies | User interface for mobile computer unit |
US20060020899A1 (en) * | 2004-04-26 | 2006-01-26 | Microsoft Corporation | Scaling icons for representing files |
US20110314424A1 (en) * | 2004-04-26 | 2011-12-22 | Microsoft Corporation | Scaling type overlay icons |
US20060101350A1 (en) * | 2004-11-09 | 2006-05-11 | Research In Motion Limited | Dynamic bar oriented user interface |
US20060209040A1 (en) * | 2005-03-18 | 2006-09-21 | Microsoft Corporation | Systems, methods, and computer-readable media for invoking an electronic ink or handwriting interface |
US20070091194A1 (en) * | 2005-10-24 | 2007-04-26 | Samsung Electronics Co., Ltd. | Apparatus to provide a screen capturing function and a method of providing the screen capturing function |
US7899829B1 (en) * | 2005-12-14 | 2011-03-01 | Unifi Scientific Advances, Inc. | Intelligent bookmarks and information management system based on same |
US8533199B2 (en) * | 2005-12-14 | 2013-09-10 | Unifi Scientific Advances, Inc | Intelligent bookmarks and information management system based on the same |
US20130311862A1 (en) * | 2005-12-14 | 2013-11-21 | Prajno Malla | Intelligent bookmarks and information management system based on the same |
US20070211034A1 (en) * | 2006-02-13 | 2007-09-13 | Griffin Jason T | Handheld wireless communication device with function keys in exterior key columns |
US20080066016A1 (en) * | 2006-09-11 | 2008-03-13 | Apple Computer, Inc. | Media manager with integrated browsers |
US20080205772A1 (en) * | 2006-10-06 | 2008-08-28 | Blose Andrew C | Representative image selection based on hierarchical clustering |
Also Published As
Publication number | Publication date |
---|---|
EP2003539A1 (en) | 2008-12-17 |
TW200844815A (en) | 2008-11-16 |
US20090278805A1 (en) | 2009-11-12 |
TWI337321B (en) | 2011-02-11 |
US8456442B2 (en) | 2013-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9411496B2 (en) | Method for operating user interface and recording medium for storing program applying the same | |
US8456442B2 (en) | Electronic device with switchable user interface and electronic device with accessible touch operation | |
JP3143462U (en) | Electronic device having switchable user interface and electronic device having convenient touch operation function | |
CN201107762Y (en) | Electronic device with switchable user interface and barrier-free touch operation | |
CN101308415B (en) | Electronic device with switchable user interface and unobstructed touch operation | |
US9489107B2 (en) | Navigating among activities in a computing device | |
RU2335011C2 (en) | System and method for navigation on graphic user interface on reduced display | |
EP2433275B1 (en) | Hand-held device with ancillary touch activated zoom or transformation of active element, method of operation and computer readable medium | |
US8749484B2 (en) | Multi-screen user interface with orientation based control | |
EP3198391B1 (en) | Multi-finger touchpad gestures | |
TWI381305B (en) | Method for displaying and operating user interface and electronic device | |
US10509549B2 (en) | Interface scanning for disabled users | |
US20110047459A1 (en) | User interface | |
US20090213081A1 (en) | Portable Electronic Device Touchpad Input Controller | |
KR101690111B1 (en) | Dual configuartion computer | |
TWI389015B (en) | Method for operating software input panel | |
CN101308416A (en) | User interface operation method and recording medium thereof | |
EP2065794A1 (en) | Touch sensor for a display screen of an electronic device | |
US20090049411A1 (en) | Method and apparatus to control portable device based on graphical user interface | |
US20030179182A1 (en) | Article comprising an adaptable input devce | |
CN102830908A (en) | Electronic device and desktop browsing method thereof | |
US20130086502A1 (en) | User interface | |
US20090135156A1 (en) | Touch sensor for a display screen of an electronic device | |
JP2009087075A (en) | Information processor, and information processor control method and program | |
Yang | Blurring the boundary between direct & indirect mixed mode input environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |