
AU2015258317A1 - Apparatus and method for controlling motion-based user interface - Google Patents

Apparatus and method for controlling motion-based user interface

Info

Publication number
AU2015258317A1
Authority
AU
Australia
Prior art keywords
image
touch screen
zooming
input
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
AU2015258317A
Other versions
AU2015258317B2 (en)
Inventor
Hyun-Su Hong
Yung-Keun Jung
Il-Hwan Kim
Jae-Myeon Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020100100435A external-priority patent/KR101915615B1/en
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to AU2015258317A priority Critical patent/AU2015258317B2/en
Publication of AU2015258317A1 publication Critical patent/AU2015258317A1/en
Application granted granted Critical
Publication of AU2015258317B2 publication Critical patent/AU2015258317B2/en
Priority to AU2017210607A priority patent/AU2017210607B2/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

An apparatus for controlling a motion-based user interface, the apparatus comprising: a touch screen for displaying an image and for receiving input of a user touch; a sensor unit for sensing a motion of the apparatus in response to the input of the user touch; and a controller configured to: receive information of the sensed motion for zoom in or zoom out from the sensor unit, zoom in or zoom out the image based on the information of the sensed motion, and display the image which is changed according to the zoom in or zoom out.

[Drawings: FIG. 1, block diagram (output unit 101, controller 103, memory 105, input unit 107); FIGs. 2A and 2B; FIG. 3, block diagram (output unit 301, controller 303, memory 305, input unit 307, motion calculator 309, sensor unit 311); FIG. 4A]

Description

APPARATUS AND METHOD FOR CONTROLLING A MOTION-BASED USER INTERFACE

Related Application

This application is a divisional application of Australian application no. 2011314532, the disclosure of which, as filed or as accepted, is incorporated herein by reference.

Field of Invention

The present invention relates to a user interface and, more particularly, to an apparatus and method for controlling an interface according to a motion that a mobile terminal has made.

Background Art

At present, electronic devices such as TVs, MP3 players, Portable Multimedia Players (PMPs), and smart phones are equipped with a variety of input/output devices that enable a user to control them conveniently. Among these devices, the use of smart phones has recently grown exponentially. In particular, the share of touch panels is growing steadily in the market of mobile terminals, including portable phones, smart phones, and laptops. As touch screen panels come to be adopted in most smart phones, the market for mobile-terminal touch screens is expected to expand rapidly. Touch screen panels are also widely used in electronic appliances such as TVs and refrigerators, and the electronic appliance market is expected to rank second, after the mobile terminal market, in adopting them.

Recently, extensive research has been conducted on recognizing a user's intention and action from visual information, for natural interaction between a user and a touch screen. In particular, user-friendly user interfaces are configured to recognize a gesture input by a finger or a touch pen. The trend in user interfaces is now shifting from interfaces that operate on a single finger-based single-touch input to interfaces that operate on a multi-finger-based multi-touch input.

A touch screen is a stack of a plane for sensing input and a plane serving as a display. A user's intention can therefore be analyzed and perceived from a multi-touch input on the touch screen, and the result of that analysis may be output on the touch screen. In particular, a multi-touch-based user interface is designed so that the number of finger or pen touches on the touch screen and an associated operation are recognized and an associated command is executed. The interior structure of a mobile terminal supporting multi-touch input is described below.

FIG. 1 is a block diagram of a conventional mobile terminal supporting multi-touch input. In FIG. 1, the mobile terminal includes an output unit 101, a controller 103, a memory 105, and an input unit 107.

Referring to FIG. 1, the output unit 101 outputs an image, such as a drawing or a Web page, on a touch screen. The image is stored in the memory 105. When the image is displayed on the touch screen, the user may enlarge or shrink it with the input unit 107, according to user selection. An intended part of the displayed image may be enlarged or contracted by touching it with two fingers simultaneously; a touch pen may be used instead of fingers. Upon input of multiple touches through the input unit 107, the controller 103 controls the output unit 101 to display the multi-touched area enlarged or shrunk on the touch screen.
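For orientation, the conventional two-finger zoom described in this background can be sketched in a few lines of Kotlin. The types and the distance-ratio rule below are illustrative assumptions, not code from any cited terminal:

```kotlin
import kotlin.math.hypot

// Illustrative touch point in screen coordinates.
data class TouchPoint(val x: Float, val y: Float)

// Conventional pinch zoom: the scale factor follows the ratio of the
// current to the initial distance between the two touch points.
class PinchZoomTracker {
    private var initialDistance = 0f

    fun onPinchStart(a: TouchPoint, b: TouchPoint) {
        initialDistance = hypot(a.x - b.x, a.y - b.y)
    }

    /** Returns > 1 when the fingers spread apart, < 1 when they pinch together. */
    fun onPinchMove(a: TouchPoint, b: TouchPoint): Float {
        val current = hypot(a.x - b.x, a.y - b.y)
        return if (initialDistance > 0f) current / initialDistance else 1f
    }
}
```

The farther apart the two fingers are spread, the larger the factor, which matches the behavior attributed to FIGs. 2A and 2B below.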
An exemplary operation performed upon input of multiple touches in the mobile terminal having the configuration illustrated in FIG. 1 is described below.

FIGs. 2A and 2B illustrate an exemplary conventional operation performed upon input of multiple touches, specifically enlarging a specific area through multiple touches on a touch screen. Referring to FIG. 2A, a user touches the touch screen 201 with two fingers 205 in a pinching motion to enlarge a car image 203 displayed on the touch screen 201. The enlarged area may differ depending on the position at which the two fingers 205 touch. Referring to FIG. 2B, the car image 203 is enlarged by spreading out the two fingers 205 on the touch screen 201. The degree to which the car image 203 is enlarged may depend on the distance between the two spread fingers 205. While FIGs. 2A and 2B illustrate only enlargement, the enlarged image may likewise be shrunk using the two fingers 205.

Conventionally, the zoom-in or zoom-out ratio of a predetermined part of the touch screen can be adjusted only by pressing a predefined zoom-in or zoom-out key or by making multiple touches. When a user wants to move from one page to another, or from one part of a page to another, the user must press a predefined move key or touch and drag an area of the touch screen. Continuous zooming therefore requires consecutive key inputs or continuous multiple touches, and moving within a page requires successive key inputs or repeated touch-and-drag gestures. As a result, it is difficult to perform a user-intended operation quickly and accurately using only inputs or gestures on the touch screen.

Summary of Invention

Accordingly, an embodiment of the present invention provides an apparatus and method for controlling a user interface so that an image may be enlarged, shrunk, and moved based on a motion.

In accordance with an embodiment of the present invention, there is provided an apparatus for controlling a motion-based user interface, the apparatus comprising: a touch screen for displaying an image and for receiving input of a user touch; a sensor unit for sensing a motion of the apparatus in response to the input of the user touch; and a controller configured to: receive information of the sensed motion for zoom in or zoom out from the sensor unit, zoom in or zoom out the image based on the information of the sensed motion, and display the image which is changed according to the zoom in or zoom out.

In accordance with another embodiment of the present invention, there is provided a method for controlling a motion-based user interface, the method comprising: displaying an image on a touch screen; receiving an input of a user touch on the touch screen; sensing a motion of an apparatus in response to the input of the user touch; when obtaining information of the sensed motion of the apparatus for zoom in or zoom out, zooming in or zooming out the image based on the obtained information; and displaying the image which is changed according to the zooming in or the zooming out.
In accordance with another embodiment of the present invention, there is provided a method for controlling a motion-based user interface, the method comprising: outputting an image to an output unit; in response to sensing input of at least one button through an input unit, obtaining a number of the at least one button; obtaining information of a motion of an apparatus based on the number of the at least one button; zooming in or zooming out the image based on the obtained information; and displaying the image which is changed according to the operation.

In accordance with another embodiment of the present invention, there is provided a non-transitory computer-readable recording medium having recorded thereon a program to be executed on a computer, wherein the program comprises an executable which, when executed by a processor, causes the processor to: receive an input of a user touch on a touch screen; sense a motion of an apparatus in response to the input of the user touch; when obtaining information of the sensed motion of the apparatus for zoom in or zoom out, zoom in or zoom out the image based on the obtained information; and display the image which is changed according to the zooming in or zooming out.

Advantageous Effects

According to the apparatus and method of an embodiment of the present invention, the user may easily enlarge, shrink, and move an image using a motion.

Brief Description of Drawings

Objects, features and advantages of certain embodiments of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:

FIG. 1 is a block diagram of a conventional mobile terminal supporting multi-touch input;
FIGs. 2A and 2B illustrate an exemplary conventional operation that is performed based on a multi-touch input;
FIG. 3 is a block diagram of a mobile terminal for controlling a motion-based user interface according to an embodiment of the present invention;
FIGs. 4A to 5C are exemplary views illustrating motions of the mobile terminal according to the embodiment of the present invention;
FIGs. 6A and 6B are exemplary views illustrating an operation for enlarging or shrinking an image according to an embodiment of the present invention;
FIGs. 7A and 7B are exemplary views illustrating an operation for moving an image according to an embodiment of the present invention;
FIGs. 8A and 8B are exemplary views illustrating operations for controlling a motion-based user interface using predetermined keys according to an embodiment of the present invention; and
FIG. 9 is a flowchart illustrating an operation for controlling a motion-based user interface according to an embodiment of the present invention.

Throughout the drawings, the same drawing reference numerals will be understood to refer to the same elements, features and structures.

Detailed Description of Embodiments of the Invention

Reference will now be made in detail to the preferred embodiments of the present invention with reference to the accompanying drawings. While specific details such as components are described in the following description, they are provided to aid a comprehensive understanding of the present invention, and it will be clear to those skilled in the art that variations or modifications can be made to these details within the scope and spirit of the present invention.
FIG. 3 is a block diagram of a mobile terminal for controlling a motion-based user interface according to an embodiment of the present invention. In FIG. 3, the mobile terminal includes an output unit 301, a controller 303, a memory 305, an input unit 307, a motion calculator 309, and a sensor unit 311.

Referring to FIG. 3, the output unit 301 outputs an image, such as a photo or a Web page, on a touch screen. The image is stored in the memory 305. When needed, another image stored in the memory 305 may be used, or an intended photo may be captured using a camera module equipped in the mobile terminal. The input unit 307 receives information about the number of spots touched on the touch screen, any drag made on the touch screen, and so on.

The motion calculator 309 calculates the movement direction or angle of the mobile terminal and transmits this information to the controller 303. The mobile terminal may move up, down, to the left, or to the right; it may also move forward horizontally toward the user or backward horizontally away from the user. The angle at which the mobile terminal has moved is calculated as the angle at which it is inclined relative to its current position, which is set as 0 degrees. The motion calculator 309 uses the sensor unit 311 to calculate the movement direction or angle; more specifically, it calculates the direction or angle in which the mobile terminal has moved using a gyroscope sensor from among the one or more sensors of the sensor unit 311. The gyroscope sensor senses rotation in addition to providing the functions of an acceleration sensor, and can thus track motion along six axes. It therefore senses motion more precisely than an acceleration sensor, measuring height and rotational inclination as well as acceleration and deceleration.

The controller 303 receives information about the number of touched spots and about any touch-and-drag on the touch screen from the input unit 307, receives information about the calculated motion of the mobile terminal from the motion calculator 309, and controls the image displayed on the touch screen in a different manner according to the number of touched spots.

If two or more spots are touched in any area of the touch screen, the controller 303 determines the multi-touch input to be a command for enlarging or shrinking the displayed image, and enlarges or shrinks the image according to the change-of-motion information received from the motion calculator 309. For example, when two spots are touched and the mobile terminal is brought closer to the user at an unchanged angle, or the touch screen is inclined forward with respect to the user, the image is enlarged. Conversely, when the mobile terminal recedes from the user at an unchanged angle, or the touch screen is reclined backward from the user, the image is shrunk. This method achieves the same effect as the conventional multi-touch scheme in which an image is enlarged or shrunk using two fingers, as will be apparent from the description of FIGs. 4A, 4B and 4C.
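As a concrete illustration of this branching, here is a minimal Kotlin sketch. It is not the patent's implementation: the unit names, thresholds, and the simple proportional factors are invented for illustration.

```kotlin
// Hypothetical stand-ins for the units of FIG. 3.
enum class Mode { NONE, SHIFT, ZOOM }

/** Motion report from a motion calculator (309): Z travel plus X/Y travel. */
data class Motion(val dz: Float, val dx: Float, val dy: Float)

/** Mutable transform applied to the displayed image by the output unit (301). */
class ImageTransform(var scale: Float = 1f, var offsetX: Float = 0f, var offsetY: Float = 0f)

// Controller 303: two or more touched spots select zoom; one spot selects shift.
fun modeFor(touchedSpots: Int): Mode = when {
    touchedSpots >= 2 -> Mode.ZOOM
    touchedSpots == 1 -> Mode.SHIFT
    else -> Mode.NONE
}

fun apply(mode: Mode, m: Motion, image: ImageTransform) {
    when (mode) {
        // Pulling the terminal closer (-Z) enlarges; pushing it away (+Z) shrinks.
        Mode.ZOOM -> image.scale *= if (m.dz < 0f) 1.05f else 0.95f
        // The image follows the direction in which the terminal moved.
        Mode.SHIFT -> { image.offsetX += m.dx; image.offsetY += m.dy }
        Mode.NONE -> Unit
    }
}
```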
FIGs. 4A, 4B and 4C are exemplary views illustrating motions of the mobile terminal according to an embodiment of the present invention, in which it is assumed that the user touches two spots on the touch screen.

Referring to FIG. 4A, when the user pulls the mobile terminal in the -Z-axis direction with its current angle unchanged, facing the touch screen while touching any two spots on it, the displayed image is enlarged. Conversely, when the user pushes the mobile terminal in the Z-axis direction while facing the touch screen, the displayed image is shrunk. FIG. 4B is a side view of the mobile terminal illustrating the path along which it moves; it shows the mobile terminal moving only in the Z-axis or -Z-axis direction, without a change in its angle. FIG. 4C illustrates a variation in the inclination of the mobile terminal, with the Y and -Y axes set as a rotational axis. An image displayed on the touch screen may be enlarged or shrunk by inclining the mobile terminal forward or backward with respect to the user.

If the user touches one spot in any area of the touch screen, the controller 303 determines the single-touch input to be a command for moving the displayed image and moves the image up, down, to the left, or to the right according to the change-of-motion information received from the motion calculator 309. For instance, if the user moves or rotates the mobile terminal to the left while touching one spot in any area of the touch screen, the displayed image moves to the left. This achieves the same effect as moving an image in an intended direction by touching and dragging an area of the touch screen, as will be apparent from FIGs. 5A to 5C.

FIGs. 5A, 5B and 5C illustrate motions of the mobile terminal according to an embodiment of the present invention, in which it is assumed that the user touches one spot on the touch screen.

Referring to FIG. 5A, when the user moves the mobile terminal in the X-axis, -X-axis, Y-axis or -Y-axis direction while touching one spot on the touch screen, with the terminal's current angle unchanged, the image displayed on the touch screen moves in the direction in which the mobile terminal moves. FIG. 5B illustrates changes in the rotation of the mobile terminal with the Z and -Z axes set as a rotational axis, and FIG. 5C illustrates changes in the rotation of the mobile terminal with the Y and -Y axes set as a rotational axis. If the Z and -Z axes are set as a rotational axis and the mobile terminal is rotated to the right or to the left, the displayed image may move sideways. If the Y and -Y axes are set as a rotational axis and the touch screen is inclined or reclined, the displayed image may move up or down. Note that with the Y and -Y axes set as the rotational axis, the displayed image is transformed differently according to the number of touched spots on the touch screen: in FIG. 4C the image is enlarged or shrunk, whereas in FIG. 5C it is moved up or down.
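These axis conventions can be condensed into a small interpreter. The sketch below is one possible reading of FIGs. 4A to 5C, with invented type names, assumed sign conventions, and an arbitrary pixels-per-degree gain:

```kotlin
// Rotation of the terminal about its Y and Z axes, in degrees (assumed signs:
// positive aboutY = inclined toward the user, positive aboutZ = rotated right).
data class Rotation(val aboutY: Float, val aboutZ: Float)

sealed interface UiAction
data class Zoom(val factor: Float) : UiAction
data class Pan(val dx: Float, val dy: Float) : UiAction
object NoAction : UiAction

fun interpret(touchedSpots: Int, rot: Rotation, pxPerDegree: Float = 8f): UiAction = when {
    // FIG. 4C: with two spots touched, tilting about Y enlarges or shrinks.
    touchedSpots >= 2 -> Zoom(factor = 1f + rot.aboutY / 90f)
    // FIGs. 5B and 5C: with one spot touched, rotation about Z pans sideways
    // and tilting about Y pans vertically.
    touchedSpots == 1 -> Pan(dx = rot.aboutZ * pxPerDegree, dy = rot.aboutY * pxPerDegree)
    else -> NoAction
}
```

Note how the same Y-axis tilt yields a zoom with two touches but a vertical pan with one, mirroring the FIG. 4C versus FIG. 5C distinction.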
As is done conventionally, the controller 303 may also control an image displayed on the touch screen in correspondence with a gesture, such as a drag that follows a touch of at least one spot in any area of the touch screen. For instance, if one spot is touched in any area of the touch screen and the touched spot is dragged to the left, the displayed image may be moved to the left or the next image may be displayed. If two spots are touched, the image may be enlarged or shrunk according to the conventional multi-touch scheme.

Now a description will be given of an operation for controlling a displayed image, with reference to FIG. 3. FIGs. 6A and 6B are exemplary views illustrating an operation for enlarging or shrinking an image according to an embodiment of the present invention.

Referring to FIG. 6A, the user pulls the mobile terminal while viewing the touch screen and grabbing the terminal so that two spots in any area of the touch screen are touched; the displayed image is enlarged in correspondence with the pulling motion. FIG. 6B is a side view of the user pulling the mobile terminal. It shows that the image is enlarged simply by bringing the mobile terminal closer to the user, as illustrated in FIGs. 4A and 4B, without inclining the terminal forward with respect to the user. Although not shown in FIGs. 6A and 6B, when the user pushes the mobile terminal backward while grabbing it in the same way, the displayed image may be shrunk.

The mapping of pulling to enlargement and pushing to shrinking may be changed by a user setting; for instance, the functions may be swapped so that an image is shrunk by pulling the mobile terminal and enlarged by pushing it. The zoom-in or zoom-out ratio of an image may be set to be proportional to the rotational angle measured in each axis direction illustrated in FIGs. 4A to 5C, using the gyroscope sensor included in the sensor unit 311 of the mobile terminal, which is displaced according to the user's motion.

Even where this motion-based zoom function is available, the user may still enlarge or shrink an image according to the conventional multi-touch scheme. While the conventional multi-touch zoom-in or zoom-out function is in progress, the motion-based zoom function may be deactivated to avoid mutual interference between the conventional and proposed zoom functions. If only one spot is touched or no touch is made on the touch screen, the image zoom-in and zoom-out operation is deactivated in the present invention.

FIGs. 7A and 7B are exemplary views illustrating an operation for moving an image according to an embodiment of the present invention. Referring to FIG. 7A, the user moves the mobile terminal from left to right while viewing the touch screen and grabbing the terminal so that one spot is touched in any area of the touch screen. The image displayed on the touch screen moves from right to left in correspondence with the motion of the mobile terminal. Referring to FIG. 7B, which focuses on the touch screen, the left touch screen displays the image prior to the movement, and the right touch screen displays the image output when the mobile terminal moves from left to right.
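Two behaviours just described, a zoom ratio proportional to the measured rotation angle and suppression of motion zoom while a conventional pinch is in progress, fit in a short sketch; the class name, gain, and flag below are assumptions for illustration:

```kotlin
class MotionZoom(private val ratioPerDegree: Float = 0.01f) {
    /** Set by the touch pipeline while a conventional two-finger pinch is active. */
    var pinchInProgress = false

    /**
     * Multiplicative zoom for a tilt of [angleDeg] degrees. Returns 1 (a no-op)
     * while the conventional pinch owns the gesture, avoiding interference.
     */
    fun zoomFor(angleDeg: Float): Float =
        if (pinchInProgress) 1f
        else 1f + angleDeg * ratioPerDegree  // proportional to the rotation angle
}
```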
As the image moves, a portion of the image hidden beyond the right edge of the screen appears on the screen. Where a plurality of images are arranged, the images may show up on the screen sequentially as the mobile terminal is moved. The distance by which an image moves may be set to be proportional to the rotational angle measured in each axis direction illustrated in FIGs. 4A to 5C, using the gyroscope sensor included in the sensor unit 311 of the mobile terminal, which is displaced according to the user's motion.

Even where this image shift function is available, the user may still move an image by touching one spot and dragging the touch on the touch screen. While the dragging function is in progress, the motion-based image shift function may be deactivated to avoid mutual interference between the dragging function and the image shift function of the present invention. If two or more spots are touched or no touch is made on the touch screen, the image shift operation is deactivated in the present invention.

While the present invention describes controlling a displayed image in a different manner according to the number of spots touched in any area of the touch screen, the same result may be achieved by assigning specific keys and controlling the displayed image according to the number of pressed keys, as described below.

FIGs. 8A and 8B are exemplary views illustrating an operation for controlling a motion-based user interface using assigned specific keys according to an embodiment of the present invention. Referring to FIG. 8A, when the user presses the digit-1 and digit-3 buttons on a keypad at the same time, or within a predetermined interval, while an image viewer is executing, the mobile terminal may be set to recognize the input as a command for enlarging or shrinking the displayed image; the image is then enlarged or shrunk according to the subsequent motion of the mobile terminal. Referring to FIG. 8B, when the user presses either the digit-1 or the digit-3 button alone while the image viewer is executing, the mobile terminal may be set to recognize the input as a command for shifting the displayed image. The keypad may be replaced with buttons that operate based on user touches.
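One possible reading of this key-based variant, with hypothetical key codes and an assumed simultaneity window (the text specifies neither):

```kotlin
enum class KeyMode { ZOOM, SHIFT }

class KeypadModeSelector(private val windowMs: Long = 300L) {
    private val pressTimes = mutableMapOf<Char, Long>()

    /** Returns the motion mode armed by pressing the '1' or '3' key, or null otherwise. */
    fun onKeyDown(key: Char, nowMs: Long): KeyMode? {
        if (key != '1' && key != '3') return null
        pressTimes[key] = nowMs
        val other = if (key == '1') '3' else '1'
        val otherTime = pressTimes[other]
        // Both keys within the window: zoom command (FIG. 8A);
        // a single key on its own: shift command (FIG. 8B).
        return if (otherTime != null && nowMs - otherTime <= windowMs) KeyMode.ZOOM
        else KeyMode.SHIFT
    }
}
```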
FIG. 9 is a flowchart illustrating an operation for controlling a motion-based user interface according to an embodiment of the present invention; a compact code restatement of this flow appears at the end of this section.

Referring to FIG. 9, upon receipt of a user input, the mobile terminal executes an image viewer to display an image, such as a photo or a Web page, in step 901. The image is stored in the memory 305; if the mobile terminal is wireless-Internet-enabled, it may display a Web page without executing the image viewer. In step 903, the mobile terminal determines whether a touch input has been sensed through the input unit 307. If no touch input has been sensed, the mobile terminal continues executing the image viewer in step 901; upon sensing a touch input, it proceeds to step 905.

In step 905, the mobile terminal counts the number of spots touched on the touch screen and controls the displayed image in a different manner according to that number. If two or more spots have been touched, the mobile terminal determines to enlarge or shrink the image and starts to sense its own motion. In step 907, when the mobile terminal moves forward or backward, or is inclined forward or reclined backward with respect to the user as illustrated in FIGs. 4A, 4B and 4C, the mobile terminal calculates the degree to which it moves up, down, to the left, to the right, back, or forth, or the degree to which it is inclined forward, reclined backward, rotated to the left, or rotated to the right, using the gyroscope sensor included in the sensor unit, and enlarges or shrinks the image in correspondence with the calculated degree. The motion of the mobile terminal is sensed by the gyroscope sensor of the sensor unit 311, and the distance moved or the degree of inclination is calculated by the motion calculator 309. In addition, two or more buttons may be assigned and pressed instead of touching two spots so as to enlarge or shrink the image.

If a single spot has been touched, the mobile terminal determines to shift the image and starts to sense its own motion. When the mobile terminal moves up, down, to the left, or to the right, or is inclined forward, reclined backward, rotated to the left, or rotated to the right as illustrated in FIGs. 5A, 5B and 5C, the moved distance or the inclination or rotation degree is calculated, and the image is shifted in the direction corresponding to the calculated degree. As stated above, the motion of the mobile terminal is sensed by the gyroscope sensor of the sensor unit 311, and the distance moved or the degree of inclination is calculated by the motion calculator 309. In addition, a specific button may be assigned and pressed instead of touching one spot so as to shift the image.

While the present invention has been particularly shown and described with reference to embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.

In the claims which follow and in the preceding description of the invention, except where the context requires otherwise due to express language or necessary implication, the word "comprise" or variations such as "comprises" or "comprising" is used in an inclusive sense, that is, to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention.
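As referenced above, here is the FIG. 9 flow (steps 901 to 907) restated as a compact event loop. Every type and the zoom factor below are stand-ins invented for illustration; only the control flow mirrors the flowchart:

```kotlin
data class TouchEvent(val touchedSpots: Int)
data class MotionSample(val tiltDeg: Float, val dx: Float, val dy: Float)

interface Viewer {
    fun zoomBy(factor: Float)
    fun shiftBy(dx: Float, dy: Float)
}

fun runViewerLoop(events: Sequence<TouchEvent>, readMotion: () -> MotionSample, viewer: Viewer) {
    for (event in events) {                       // step 901: image viewer running
        if (event.touchedSpots == 0) continue     // step 903: no touch, keep displaying
        val m = readMotion()                      // step 907: gyroscope measurement
        if (event.touchedSpots >= 2) {            // step 905: two or more spots -> zoom
            viewer.zoomBy(1f + m.tiltDeg / 90f)   // ratio follows the calculated degree
        } else {                                  // one spot -> shift
            viewer.shiftBy(m.dx, m.dy)
        }
    }
}
```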

Claims (19)

1. An apparatus for controlling a motion-based user interface, the apparatus comprising: a touch screen for displaying an image and for receiving input of a user touch; a sensor unit for sensing a motion of the apparatus in response to the input of the user touch; and a controller configured to: receive information of the sensed motion for zoom in or zoom out from the sensor unit, zoom in or zoom out the image based on the information of the sensed motion, and display the image which is changed according to the zoom in or zoom out.
2. The apparatus of claim 1, wherein the controller is further configured to zoom in the image on the touch screen in response to the apparatus moving in a facing direction of the touch screen.
3. The apparatus of claim 1, wherein the controller is further configured to zoom out the image on the touch screen in response to the apparatus moving opposite to a facing direction of the touch screen.
4. The apparatus of claim 1, wherein, if the number of touched spots based on the input of the user touch is 2 or more, the controller is further configured to recognize the input as a command for zooming in or zooming out the image.
5. The apparatus of claim 1, wherein, if the number of touched spots based on the input of the user touch is 1, the controller is further configured to recognize the input as a command for shifting the image and, when the input is recognized as a command to move the image, control shifting of the image based on information of the motion of the apparatus for shifting the image.
6. The apparatus of claim 1, wherein the sensor unit includes a gyroscope, and wherein the controller is further configured to determine the information of the motion of the apparatus according to a degree to which the apparatus moves up, down, to the left, to the right, back, or forth or a degree to which the apparatus is inclined forward, reclined backward, rotated to the left, or rotated to the right, using the gyroscope.
7. A method for controlling a motion-based user interface, the method comprising: displaying an image on a touch screen; receiving an input of a user touch on the touch screen; sensing a motion of an apparatus in response to the input of the user touch; when obtaining information of the sensed motion of the apparatus for zoom in or zoom out, zooming in or zooming out the image based on the obtained information; and displaying the image which is changed according to the zooming in or the zooming out.
8. The method of claim 7, wherein the zooming in or zooming out of the image on the touch screen comprises zooming in the image on the touch screen in response to the apparatus moving in a facing direction of the touch screen.
9. The method of claim 7, wherein the zooming in or zooming out of the image comprises zooming in the image on the touch screen in response to the apparatus being inclined in a facing direction of the touch screen.
10. The method of claim 7, wherein the zooming in or zooming out of the image on the touch screen comprises zooming out the image on the touch screen in response to the apparatus moving in a direction opposite to a facing direction of the touch screen.
11. The method of claim 7, wherein the zooming in or zooming out of the image on the touch screen comprises zooming out the image on the touch screen in response to the apparatus reclining in a direction opposite to a facing direction of the touch screen.
12. The method of claim 7, wherein, if a number of touched spots based on the input of the user touch is 2 or more, the input of the user touch is recognized as a command for zooming in or zooming out the image on the touch screen.
13. The method of claim 7, further comprising: when obtaining information of the motion of the apparatus for shifting the image on the touch screen in response to the input of the user touch, shifting the image based on the obtained information, wherein the input of the user touch is recognized as a command for shifting the image, if a number of at least one touched spot obtained in response to the input of the user touch is 1.
14. The method of claim 7, wherein the information of the motion of the apparatus is determined according to a degree to which the apparatus moves up, down, to the left, to the right, back, or forth or a degree to which the apparatus is inclined forward, backward, to the left, or to the right.
15. A method for controlling a motion-based user interface, the method comprising: outputting an image to an output unit; in response to sensing input of at least one button through an input unit, obtaining a number of the at least one button; obtaining information of a motion of an apparatus based on the number of the at least one button; zooming in or zooming out the image based on the obtained information; and displaying the image which is changed according to the operation.
16. The apparatus of claim 1, wherein the image is a Web page.
17. The method of claim 7, wherein the image is a Web page.
18. The method of claim 15, wherein the image is a Web page.
19. A non-transitory computer-readable recording medium having recorded thereon a program to be executed on a computer, wherein the program comprises an executable which, when executed by a processor, causes the processor to: receive an input of a user touch on a touch screen; sense a motion of an apparatus in response to the input of the user touch; when obtaining information of the sensed motion of the apparatus for zoom in or zoom out, zoom in or zoom out the image based on the obtained information; and display the image which is changed according to the zooming in or zooming out.
AU2015258317A 2010-10-14 2015-11-20 Apparatus and method for controlling motion-based user interface Ceased AU2015258317B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2015258317A AU2015258317B2 (en) 2010-10-14 2015-11-20 Apparatus and method for controlling motion-based user interface
AU2017210607A AU2017210607B2 (en) 2010-10-14 2017-08-04 Apparatus and method for controlling motion-based user interface

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR10-2010-0100435 2010-10-14
KR1020100100435A KR101915615B1 (en) 2010-10-14 2010-10-14 Apparatus and method for controlling user interface based motion
AU2011314532A AU2011314532B2 (en) 2010-10-14 2011-10-13 Apparatus and method for controlling motion-based user interface
PCT/KR2011/007628 WO2012050377A2 (en) 2010-10-14 2011-10-13 Apparatus and method for controlling motion-based user interface
AU2015258317A AU2015258317B2 (en) 2010-10-14 2015-11-20 Apparatus and method for controlling motion-based user interface

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
AU2011314532A Division AU2011314532B2 (en) 2010-10-14 2011-10-13 Apparatus and method for controlling motion-based user interface

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2017210607A Division AU2017210607B2 (en) 2010-10-14 2017-08-04 Apparatus and method for controlling motion-based user interface

Publications (2)

Publication Number Publication Date
AU2015258317A1 (en) 2015-12-10
AU2015258317B2 AU2015258317B2 (en) 2017-05-04

Family

ID=54775765

Family Applications (2)

Application Number Title Priority Date Filing Date
AU2015258317A Ceased AU2015258317B2 (en) 2010-10-14 2015-11-20 Apparatus and method for controlling motion-based user interface
AU2017210607A Ceased AU2017210607B2 (en) 2010-10-14 2017-08-04 Apparatus and method for controlling motion-based user interface

Family Applications After (1)

Application Number Title Priority Date Filing Date
AU2017210607A Ceased AU2017210607B2 (en) 2010-10-14 2017-08-04 Apparatus and method for controlling motion-based user interface

Country Status (1)

Country Link
AU (2) AU2015258317B2 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4720879B2 (en) * 2008-08-29 2011-07-13 Sony Corporation Information processing apparatus and information processing method

Also Published As

Publication number Publication date
AU2015258317B2 (en) 2017-05-04
AU2017210607A1 (en) 2017-11-02
AU2017210607B2 (en) 2019-03-21

Similar Documents

Publication Publication Date Title
US10360655B2 (en) Apparatus and method for controlling motion-based user interface
US10216407B2 (en) Display control apparatus, display control method and display control program
KR101500051B1 (en) Gui applications for use with 3d remote controller
US20120079421A1 (en) Electronic device system with information processing mechanism and method of operation thereof
JP5304577B2 (en) Portable information terminal and display control method
US20120079420A1 (en) Electronic device system with process continuation mechanism and method of operation thereof
US20100097337A1 (en) Method for operating page and electronic device
CN103218149A (en) Remote controller, and system and method using the same
US20130239032A1 (en) Motion based screen control method in a mobile terminal and mobile terminal for the same
AU2017210607B2 (en) Apparatus and method for controlling motion-based user interface
KR101499018B1 (en) An apparatus for providing a user interface supporting prompt and fine-grained scroll speed and the method thereof
KR20110066545A (en) Method and terminal for displaying an image using a touch screen
KR102049259B1 (en) Apparatus and method for controlling user interface based motion
JP6971573B2 (en) Electronic devices, their control methods and programs
KR101136327B1 (en) A touch and cursor control method for portable terminal and portable terminal using the same
JP5516794B2 (en) Portable information terminal, display control method and program
KR20160040028A (en) Display apparatus and control methods thereof

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
MK14 Patent ceased section 143(a) (annual fees not paid) or expired