WO2013068793A1 - A method, apparatus, computer program and user interface - Google Patents
A method, apparatus, computer program and user interface
- Publication number
- WO2013068793A1 (PCT application PCT/IB2011/055048)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch sensitive
- sensitive display
- user interface
- user
- mode
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/56—Details of telephonic subscriber devices including a user help function
Definitions
- Embodiments of the present disclosure relate to a method, apparatus, computer program and user interface.
- In particular, they relate to a method, apparatus, computer program and user interface which is convenient for a user to use without viewing the apparatus.
- Apparatus comprising touch sensitive displays which enable the user to control the apparatus are well known.
- For example, apparatus such as mobile telephones or tablet computers or satellite navigation apparatus may have a user interface which comprises a touch sensitive display.
- Problems may arise if the user needs to actuate the touch sensitive display but is unable to view the display. For instance, if the user is carrying out more than one task simultaneously they might not be able to view the apparatus properly. As an example, if the user is driving or walking while using the apparatus they would need to be looking at where they are driving or walking rather than at the apparatus. Alternatively a user may be visually impaired and might not be capable of visually distinguishing between the items on the touch sensitive display.
- a method comprising: providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
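As a purely illustrative sketch of the two modes of operation described above (not the claimed implementation), the behaviour can be modelled in plain Python; the names TouchUi and Item, and the use of print as the audio channel, are assumptions.

```python
# Minimal model of the first (perform) and second (announce-only) modes.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Item:
    label: str                      # e.g. "Music player"
    action: Callable[[], None]      # function performed in the first mode

class TouchUi:
    def __init__(self, items, speak: Callable[[str], None]):
        self.items = items          # user interface items on the display
        self.speak = speak          # non-visual (audio) output channel
        self.second_mode = False    # False = first (normal) mode

    def actuate(self, index: int) -> None:
        """Handle actuation of the display area showing items[index]."""
        item = self.items[index]
        if self.second_mode:
            # Second mode: indicate the function without enabling it.
            self.speak(item.label)
        else:
            # First mode: perform the associated function.
            item.action()

ui = TouchUi([Item("Music player", lambda: print("opening music player"))],
             speak=lambda text: print("audio output:", text))
ui.actuate(0)                       # first mode: performs the function
ui.second_mode = True
ui.actuate(0)                       # second mode: only announces it
```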
- the non-visual output may comprise an audio output. In some embodiments the non-visual output may comprise a tactile output.
- the touch sensitive display may be configured in the second mode of operation in response to a user input.
- the user input may comprise actuating a designated portion of the touch sensitive display.
- the user input may comprise actuating the designated portion of the touch sensitive display simultaneously to actuating the portion of the touch sensitive display in which the user interface items are displayed.
- When the user ends the actuation of the designated portion of the touch sensitive display, the function associated with the simultaneously actuated user interface item may be enabled.
- Alternatively, when the user ends the actuation of the designated portion of the touch sensitive display, the function associated with the simultaneously actuated user interface item might not be enabled.
- the designated portion of the touch sensitive display may comprise a corner of the touch sensitive display. In some embodiments a plurality of designated portions of the touch sensitive display may be provided.
- the user interface items may comprise user selectable icons.
- the plurality of user selectable icons may be provided in a different portion of the touch sensitive display to the designated area.
- the method may further comprise configuring the touch sensitive display back into the first mode of operation in response to a further user input.
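The designated-portion input described in the preceding items might, for example, be modelled as a simple hit test on an assumed corner region; the display size, corner size and class names below are illustrative assumptions only, not values from the disclosure.

```python
# Entering the second mode by touching a designated corner portion and
# returning to the first mode when that touch ends.
WIDTH, HEIGHT = 480, 800            # assumed display size in pixels
CORNER = 80                         # assumed size of each designated corner portion

def in_designated_portion(x: float, y: float) -> bool:
    """True if (x, y) lies in the upper-left or upper-right corner portion."""
    return y <= CORNER and (x <= CORNER or x >= WIDTH - CORNER)

class ModeController:
    def __init__(self):
        self.second_mode = False

    def touch_down(self, x: float, y: float) -> None:
        if in_designated_portion(x, y):
            self.second_mode = True   # configure the second mode of operation

    def touch_up(self, x: float, y: float) -> None:
        if in_designated_portion(x, y):
            self.second_mode = False  # further input returns the first mode

modes = ModeController()
modes.touch_down(20, 20)             # finger held on the upper-left corner
print(modes.second_mode)             # True while the corner is actuated
modes.touch_up(20, 20)
print(modes.second_mode)             # False again
```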
- an apparatus comprising: at least one processor; and at least one memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, enable the apparatus to: provide a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configure the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
- the non-visual output may comprise an audio output. In some embodiments the non-visual output may comprise a tactile output.
- the at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect a user input and configure the touch sensitive display in the second mode of operation in response to the detection of the user input.
- the user input may comprise actuating a designated portion of the touch sensitive display.
- the user input may comprise actuating the designated portion of the touch sensitive display simultaneously to actuating the portion of the touch sensitive display in which the user interface items are displayed.
- the at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect when the user ends the actuation of the designated portion of the touch sensitive display, and in response to detecting the end of the actuation of the designated portion of the touch sensitive display, enable the function associated with the simultaneously actuated user interface item to be performed.
- the at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect when the user ends the actuation of the designated portion of the touch sensitive display, wherein in response to detecting the end of the actuation of the designated portion of the touch sensitive display, the function associated with the simultaneously actuated user interface item is not performed.
- the designated portion of the touch sensitive display may comprise a corner of the touch sensitive display. In some embodiments a plurality of designated portions of the touch sensitive display may be provided.
- the user interface items may comprise user selectable icons.
- the plurality of user selectable icons may be provided in a different portion of the touch sensitive display to the designated area.
- the at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect a further user input and configure the touch sensitive display back in the first mode of operation in response to the further user input.
- a computer program comprising computer program instructions that, when executed by at least one processor, enable an apparatus at least to perform: providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
- a computer program comprising program instructions for causing a computer to perform the method as described above.
- There may also be provided an electromagnetic carrier signal carrying the computer program as described above.
- a user interface comprising: a touch sensitive display wherein the touch sensitive display is configured to; provide a plurality of user interface items on a portion of the touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and when configured in a second mode of operation cause a non-visual output indicative of the function associated with a user interface item in response to actuation of an area of the touch sensitive display in which a user interface item is displayed without enabling the function.
- the non-visual output may comprise an audio output.
- the non-visual output may comprise a tactile output.
- the touch sensitive display may be configured in the second mode of operation in response to a user input.
- the user input may comprise actuating a designated portion of the touch sensitive display.
- the apparatus may be for wireless communication.
- Fig. 1 schematically illustrates an apparatus according to an exemplary embodiment of the disclosure.
- Fig. 2 schematically illustrates an apparatus according to another exemplary embodiment of the disclosure.
- Figs. 3A and 3B are block diagrams which schematically illustrate methods according to an exemplary embodiment of the disclosure.
- Figs. 4A to 4D illustrate graphical user interfaces according to an exemplary embodiment of the disclosure.
- the Figures illustrate a method comprising: providing a plurality of user interface items 53 on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
- Fig. 1 schematically illustrates an apparatus 1 according to an embodiment of the disclosure.
- the apparatus 1 may be an electronic apparatus.
- the apparatus 1 may be, for example, a mobile cellular telephone, a tablet computer, a personal computer, a camera, a gaming device, a personal digital assistant, a personal music player, an electronic reader or any other apparatus which may enable a user to make inputs via a touch sensitive display.
- the apparatus 1 may be a handheld apparatus 1 which can be carried in a user's hand, handbag or jacket pocket for example.
- the apparatus 1 may comprise additional features that are not illustrated.
- the apparatus 1 may comprise one or more transmitters and receivers.
- the apparatus 1 illustrated in Fig. 1 comprises: a user interface 13 and a controller 4.
- the controller 4 comprises at least one processor 3 and at least one memory 5 and the user interface 13 comprises a touch sensitive display 15 and a user input device 17.
- the controller 4 provides means for controlling the apparatus 1.
- the controller 4 may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions 11 in one or more general-purpose or special-purpose processors 3 that may be stored on a computer readable storage medium 23 (e.g. disk, memory etc.) to be executed by such processors 3.
- the controller 4 may be configured to control the apparatus 1 to perform functions.
- the functions may comprise, for example, communications functions such as telephone calls, email services or messages such as SMS (short message service) messages, MMS (multimedia message service) messages or instant messages, or access to social networking applications or any other communications functions, media functions such as access to image capturing devices, stored images or videos or data files, access to music or other audio files or any other media functions.
- Other functions may include games or calendar applications or access to services such as satellite navigation systems.
- the controller 4 may also be configured to enable the apparatus 1 to perform a method comprising: providing a plurality of user interface items on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item 53 to be provided without enabling the function.
- the at least one processor 3 is configured to receive input commands from the user interface 13 and also to provide output commands to the user interface 13.
- the at least one processor 3 is also configured to write to and read from the at least one memory 5. Outputs of the user interface 13 may be provided as inputs to the controller 4.
- the user input device 17 provides means for enabling a user of the apparatus 1 to input information which may be used to control the apparatus 1 .
- the user input device 17 may comprise any means which enables a user to input information into the apparatus 1 .
- the user input device 17 may comprise a touch sensitive display 15 or a portion of a touch sensitive display 15, a key pad, an accelerometer or other means configured to detect orientation and/or movement of the apparatus 1 , audio input means which enable an audio input signal to be detected and converted into a control signal for the controller 4 or a combination of different types of user input devices.
- the touch sensitive display 15 may be actuated by a user contacting the surface of the touch sensitive display 15 with an object such as their finger or a stylus. A user may contact the surface of the touch sensitive display 15 by physically touching the surface of the touch sensitive display 15 with an object or by bringing the object close enough to the surface to activate the sensors of the touch sensitive display 15.
- the touch sensitive display 15 may comprise a capacitive touch sensitive display, a resistive touch sensitive display 15 or any other suitable means for detecting a touch input.
- the touch sensitive display 15 may comprise any means which enables information to be displayed to a user of the apparatus 1 .
- the information which is displayed may comprise a plurality of user interface items 53.
- the user interface items may enable the user of the apparatus 1 to control the apparatus 1 .
- the user interface items 53 may comprise user selectable icons 61 .
- the user selectable icons 61 may be associated with functions of the apparatus 1 so that user selection of the icon causes the associated function to be performed.
- where the user interface 13 comprises a touch sensitive display 15, the user selectable icons 61 may be selected by actuating the area of the touch sensitive display 15 in which the user selectable icon 61 is displayed.
- the touch sensitive display 15 may be configured to display graphical user interfaces 51 as illustrated in Figs. 4A to 4D.
- the at least one memory 5 is configured to store a computer program 9 comprising computer program instructions 11 that control the operation of the apparatus 1 when loaded into the at least one processor 3.
- the computer program instructions 11 provide the logic and routines that enable the apparatus 1 to perform the exemplary methods illustrated in Figs. 3A and 3B.
- the at least one processor 3 by reading the at least one memory 5 is able to load and execute the computer program 9.
- the computer program instructions 11 may provide computer readable program means configured to control the apparatus 1.
- the program instructions 11 may provide, when loaded into the controller 4: means for providing a plurality of user interface items 53 on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and means for configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item 53 to be provided without enabling the function.
- the computer program 9 may arrive at the apparatus 1 via any suitable delivery mechanism 21 .
- the delivery mechanism 21 may comprise, for example, a computer-readable storage medium, a computer program product 23, a memory device, a record medium such as a CD-ROM or DVD, an article of manufacture that tangibly embodies the computer program 9.
- the delivery mechanism may be a signal configured to reliably transfer the computer program 9.
- the apparatus 1 may propagate or transmit the computer program 9 as a computer data signal.
- the memory 5 may comprise a single component or it may be implemented as one or more separate components some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/ dynamic/cached storage.
- references to 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc. or a 'controller', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (e.g. Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific integrated circuits (ASIC), signal processing devices and other devices. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.
- Fig. 2 illustrates an apparatus 1' according to another embodiment of the disclosure.
- the apparatus 1' illustrated in Fig. 2 may be a chip or a chip-set.
- the apparatus 1' comprises at least one processor 3 and at least one memory 5 as described above in relation to Fig. 1.
- Figs. 3A and 3B schematically illustrate methods according to exemplary embodiments of the disclosure which may be performed by the apparatus 1 as illustrated in Figs. 1 and 2.
- Fig. 3A illustrates an exemplary method of configuring an apparatus 1 into a mode of operation which enables a user to control the apparatus 1 without having to view the apparatus 1.
- a touch sensitive display 15 is configured in a first mode of operation.
- the first mode of operation may be a default mode of operation.
- a plurality of user interface items 53 are displayed on the touch sensitive display 15.
- the user interface items 53 are associated with functions of the apparatus 1 so that actuation of the area of the touch sensitive display 15 in which a user interface item 53 is displayed causes the function associated with that user interface item 53 to be performed.
- the actuation required to cause a function to be performed may be a specific type of user input, for example it may comprise actuating the area in which the user interface item 53 is displayed for at least a minimum period of time or making multiple inputs in the same area, for example, a tap input, a double tap input, a trace or swipe gesture input, a flick gesture input or any other suitable input.
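The actuation types mentioned above (tap, double tap, trace or swipe, long press) could, as a rough illustration only, be distinguished from touch-down/up events along the following lines; the timing and distance thresholds are assumed values, not values from the disclosure.

```python
# Classify a single touch into one of the actuation types named above.
import math

LONG_PRESS_S = 0.5     # assumed minimum hold time for a long press
DOUBLE_TAP_S = 0.3     # assumed maximum gap between taps of a double tap
SWIPE_PX = 40.0        # assumed minimum travel for a swipe / trace gesture

def classify(down, up, previous_tap_time=None):
    """down/up are (time_seconds, x, y) tuples for one touch."""
    (t0, x0, y0), (t1, x1, y1) = down, up
    travel = math.hypot(x1 - x0, y1 - y0)
    if travel >= SWIPE_PX:
        return "swipe"
    if t1 - t0 >= LONG_PRESS_S:
        return "long press"
    if previous_tap_time is not None and t0 - previous_tap_time <= DOUBLE_TAP_S:
        return "double tap"
    return "tap"

print(classify((0.0, 10, 10), (0.1, 12, 11)))                         # tap
print(classify((1.0, 10, 10), (1.1, 12, 11), previous_tap_time=0.8))  # double tap
print(classify((2.0, 10, 10), (2.2, 90, 12)))                         # swipe
```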
- the user input may comprise any input which is made by the user and which is detected by the user input means 17.
- the user input may be an input which a user can make without viewing the apparatus 1 .
- it may comprise physical actuation of a portion of the apparatus 1 which is easy for the user to find without viewing the apparatus 1 .
- the user input may comprise actuation of a designated portion 55, 59 of the touch sensitive display 15.
- the designated portion 55, 59 of the touch sensitive display 15 may comprise an edge portion or a corner portion of the touch sensitive display 15 which may be easy for a user to find without viewing the apparatus 1 .
- user input may comprise actuation of a key.
- the key may be provided in a location so that it is easy for a user to find without looking at the apparatus 1 , for example, it may be located on a side of the apparatus 1 and not positioned adjacent to or in proximity to any other keys.
- the user input may comprise a different type of input, for example it may comprise an audio input which may enable voice control of the apparatus 1 .
- the user input may comprise an input which may be made to any part of the apparatus 1 , for example, a shaking or movement of the apparatus 1 which may be detected by an accelerometer.
- the user input means 17 provides an output signal to the controller 4 indicative of the detected input and at block 35 the apparatus 1 is configured into a second mode of operation.
- the touch sensitive display 15 is configured so that actuation of the area of the touch sensitive display 15 in which a user interface item 53 is displayed does not cause the function associated with that user interface item to be performed. Instead a non-visual output is provided indicative of the function associated with the user interface item 53.
- the non-visual output may comprise any output which can be detected by the user without viewing the apparatus 1 .
- the non-visual output may comprise an audio output.
- the audio output may comprise an acoustic signal which may be provided by an audio output device such as a loudspeaker.
- the acoustic signal may comprise a pressure wave which may be detected by the user, for example by the user's ear.
- the audio output device may be configured to provide the audio output in response to an electrical input signal provided by the controller 4.
- the electrical input signal may contain information indicative of the audio output which is to be provided.
- the non-visual output may comprise a tactile output.
- the tactile output may be provided instead of or in addition to an audio output.
- the tactile output may comprise any output which the user of the apparatus 1 may sense through touch.
- the tactile output may comprise a raised portion or an indented portion of the touch sensitive display 15, a change in the texture of the surface or part of the touch sensitive display 15, or a vibration of the apparatus 1 or part of the apparatus 1 .
- the tactile output may be provided as localized projections or indentations in the surface of the touch sensitive display 15.
- the projections and indentations may be provided by any suitable means such as a layer of electroactive polymer, a mechanical or fluid pumping system or a piezoelectric transducer.
- the tactile output may be provided by providing a stimulus such as an electrical impulse to the user rather than making a physical modification to the surface of the touch sensitive display 15 or any other part of the apparatus 1 .
- a current may be provided to the touch sensitive display 15 to change the electrical charge of the touch sensitive display 15. The user may be able to sense this through their skin and so be provided with a tactile output.
- Fig. 3B illustrates an exemplary method in which the user actuates the touch sensitive display 15 when the apparatus 1 is configured in the second mode of operation as described above.
- the user actuates the area of the touch sensitive display 15 in which a user interface item 53 is displayed.
- the user interface item 53 may be normally associated with a function such that in a normal or default mode of operation actuation of the area in which the user interface item 53 is displayed causes the function to be performed.
- an output signal is provided to the controller 4 indicative of the user interface item 53 which has been actuated.
- the controller 4 controls the apparatus 1 to provide a non-visual output indicative of the function which is normally associated with the user interface item.
- the non-visual output may comprise an audio output and/or a tactile output.
- the non-visual output may provide sufficient information to the user of the apparatus 1 to enable them to determine the function normally associated with the user interface item 53 without having to view the apparatus 1 .
- the non-visual output indicative of the function is performed without causing the function itself to be performed. If the user wishes to enable the function to be performed they may need to make a further user input.
- the touch sensitive display may be configured back into the first mode of operation in response to a further user input.
- a user may be able to make a user input, to indicate that they wish the function indicated by the non-visual output to be performed.
- Figs. 4A to 4D illustrate graphical user interfaces 51 according to an exemplary embodiment of the disclosure.
- the graphical user interfaces 51 may be displayed on a touch sensitive display 15 of an apparatus 1 as described above.
- the graphical user interface 51 comprises a plurality of user interface items 53.
- the plurality of graphical user interface items 53 may comprise any item displayed on the touch sensitive display 15.
- the plurality of user interface items 53 comprises a plurality of user selectable icons 61 .
- the plurality of user selectable icons 61 comprise graphical items which are associated with functions of the apparatus 1 such that, in a normal mode of operation, actuation of the area in which the respective user selectable icon is displayed causes the associated function to be performed.
- the plurality of user selectable icons 61 may be arranged in a menu structure or on a home screen. The user may be able to navigate through the plurality of user selectable icons 61 by scrolling through the menu structure or home screen.
- the plurality of user selectable icons 61 which are provided may be dynamic, that is they are not necessarily provided in a fixed position on the touch sensitive display 15.
- the user selectable icons 61 which are provided and their respective positions may depend on a plurality of factors such as the mode of operation of the apparatus 1, the position of the menu or home screen to which the user has scrolled or navigated, or the functions which are currently available on the apparatus. This may make it difficult for a user to locate a specific one of the plurality of user selectable icons 61 without viewing the touch sensitive display 15.
- the plurality of user interface items 53 also comprises a plurality of designated portions 55, 57, 59 of the touch sensitive display 15.
- three designated portions 55, 57, 59 are provided.
- a first designated portion 55 is provided in the upper left hand corner of the touch sensitive display 15
- a second designated portion 59 is provided in the upper right hand corner of the touch sensitive display 15
- a third designated portion 57 is provided in the centre of the upper side of the touch sensitive display 15.
- the designated portions 55, 57, 59 may be fixed with respect to the edge of the touch sensitive display 15 so that they do not move even when the user scrolls through the plurality of user selectable icons 61 .
- the designated portions 55, 57, 59 may be located in any portion of the display 15 or even the housing of the apparatus 1 .
- the first and second designated portions 55, 59 enable the touch sensitive display 15 to be configured into a second mode of operation as described in more detail below.
- the third designated portion 57 is provided in the centre of the upper side of the touch sensitive display 15 and is associated with a reader function such that the actuation of the third designated portion 57 causes the apparatus 1 to provide an audio output indicating all the user interface items currently displayed on the touch sensitive display 15. This may be an advantageous feature if the user of the apparatus 1 is visually impaired.
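As a hypothetical sketch of the reader function described for the third designated portion 57, the apparatus might simply announce each currently displayed item in turn; the function and item names below are assumptions, not the disclosed implementation.

```python
# Announce every visible user interface item through the audio output.
def read_out_all(items, speak):
    for position, label in enumerate(items, start=1):
        speak(f"Item {position}: {label}")

displayed = ["Phone", "Messages", "Music player", "Navigation"]
read_out_all(displayed, speak=lambda text: print("audio output:", text))
```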
- user interface items 53 may also be provided which are not illustrated in Figs. 4A to 4D.
- For example, user interface items 53 may be provided which present information to a user but do not cause a function to be performed, such as a graphical item indicating the remaining power available in a power source, the available signal strength or the status of a communication link.
- the plurality of user selectable icons 61 are provided in a different area of the touch sensitive display 15 to the designated portions 55, 57, 59. This may make it easier for a user to locate the respective user interface items 53.
- the touch sensitive display 15 is configured in the first mode of operation so that if a user were to actuate the area in which any of the plurality of user selectable icons 61 is displayed the controller 4 would control the apparatus 1 to cause the associated function to be performed.
- In Fig. 4B the user actuates the first designated portion 55 of the touch sensitive display 15 by touching this portion with their finger 63. This causes the controller 4 to configure the touch sensitive display 15 into the second mode of operation as described above.
- In Fig. 4C the user maintains the actuation of the first designated portion 55 of the touch sensitive display 15 with the first finger 63 and simultaneously actuates the area of the touch sensitive display 15 in which a user selectable icon 61 is displayed with a second finger 67.
- a non-visual output may be an audio output and/or a tactile output.
- the content of the non-visual output may be selected by a user to provide them with information which enables them to easily determine the function associated with the user selectable icon 61 .
- a visual output may be provided in addition to the non-visual output.
- the user may actuate a plurality of different areas of the touch sensitive display 15 for example they may hold their first finger 63 fixed in the upper left hand corner of the touch sensitive display 15 and move their second finger 67 across the touch sensitive display 15. This may cause a plurality of different non-visual outputs to be provided as a plurality of different areas of the touch sensitive display 15 are actuated.
- In Fig. 4D the user ends the actuation of the first designated area 55 by lifting their finger off the surface of the touch sensitive display 15. In the embodiment of the disclosure illustrated in Figs. 4A to 4D this causes the touch sensitive display 15 to be switched back to the first, normal mode of operation.
- the user may also configure the touch sensitive display 15 into the second mode of operation by actuating the second designated portion 59.
- the second designated portion 59 may operate in an identical manner to the first designated portion 55, so that in order to maintain the touch sensitive display 15 in the second mode of operation the user has to keep their finger in contact with the designated portion 55, 59. This may provide the advantage that the user could actuate either designated portion 55, 59; it may make it easier for the user to find a designated portion 55, 59 and they could use whichever designated portion 55, 59 is most convenient for them. The user could then control the apparatus to cause a function to be performed by ending the actuation of the designated portion 55, 59 whilst maintaining the actuation of the area in which the user selectable icon 61 is displayed.
- the two designated portions 55, 59 may have different modes of actuation.
- the first designated portion 55 may be actuated by touching the first designated portion 55 and maintaining contact with the surface of the touch sensitive display 15 while simultaneously actuating the area of the touch sensitive display 15 in which the plurality of user selectable icons 61 is displayed.
- the second designated portion 59 may be actuated by touching the second designated portion 59 and then subsequently actuating the area of the touch sensitive display 15 in which the plurality of user selectable icons 61 is displayed without having to maintain the simultaneous actuation of the second designated portion 59. This may enable a user to end the actuation of the designated portion 59 without causing the function to be performed. This may make the apparatus 1 easier for a user to use with one hand.
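The two different actuation behaviours described for the designated portions 55 and 59 could be modelled, purely as an illustrative sketch with assumed names, as a "hold" portion that keeps the second mode only while touched (and performs the still-actuated item when released) and a "latched" portion that keeps the second mode until it is tapped again.

```python
class DualPortionController:
    def __init__(self, speak):
        self.speak = speak
        self.holding = False        # first designated portion 55 held down
        self.latched = False        # second designated portion 59 toggled on
        self.previewed_item = None  # item currently actuated by the other finger

    def second_mode(self) -> bool:
        return self.holding or self.latched

    def touch_item(self, label: str, action) -> None:
        if self.second_mode():
            self.previewed_item = (label, action)
            self.speak(label)       # non-visual output only, no function
        else:
            action()                # first mode: perform the function

    def hold_portion_down(self):
        self.holding = True

    def hold_portion_up(self):
        # Releasing portion 55 while an item is still actuated performs it.
        self.holding = False
        if self.previewed_item and not self.latched:
            label, action = self.previewed_item
            action()
        self.previewed_item = None

    def latch_portion_tap(self):
        # Portion 59 toggles the second mode without requiring a held finger.
        self.latched = not self.latched

ui = DualPortionController(speak=lambda t: print("audio:", t))
ui.hold_portion_down()
ui.touch_item("Music player", lambda: print("opening music player"))
ui.hold_portion_up()            # performs the previewed function
ui.latch_portion_tap()
ui.touch_item("Camera", lambda: print("opening camera"))  # preview only
ui.latch_portion_tap()          # leave the second mode without performing anything
```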
- the plurality of user selectable icons 61 may be dynamic, that is their position on the touch sensitive display 15 is not fixed and they may be moved. In some embodiments the plurality of user selectable icons 61 may be moved in response to an input by a user, for example a user may make an input which causes the plurality of user selectable icons 61 to be scrolled through. The scrolling may continue until another user input or interrupt is detected. In other embodiments the movement of the plurality of user selectable icons 61 on the touch sensitive display 15 may be automatic, without any direct input from the user of the apparatus 1 .
- the plurality of user selectable icons 61 may comprise items within a list of received messages such as emails or notifications from a social networking site.
- the list of received messages or notifications may be automatically refreshed whenever a new message or notification is received. This may cause the items within the list to be moved on the display 15. When a user selects one of the items in the list this may enable the function of opening the message or notification or accessing information relating to the message or notification.
- Configuring the apparatus 1 into the second mode of operation may cause the dynamic items of the touch sensitive display 15 to be temporarily fixed so that they remain in the same position on the touch sensitive display 15. For example, if a user is scrolling through the plurality of user selectable icons 61 and then actuates one of the designated portions 55, 57, 59, this may cause the scrolling to be suspended while the apparatus 1 is in the second mode of operation. The scrolling may be resumed once the apparatus 1 is configured out of the second mode of operation.
- Where the plurality of user selectable icons 61 comprises items within a list of received messages, such as emails or notifications from a social networking site, actuating one of the designated portions 55, 57, 59 and enabling the second mode of operation may cause the automatic refreshing of the list to be temporarily disabled. This may enable a user to obtain non-visual indications of new notifications and messages which have been received.
- the non-visual indications of new notifications and messages which have been received may comprise an indication of the type of message or notification.
- different non-visual outputs may be provided for different types of messages, such as emails, SMS or MMS messages, instant messages or notifications from social networking sites.
- the non-visual indication may include further details such as the sender of the message or notification and the time and date at which the message or notification was received.
- the non-visual indication may comprise a non-visual output of the whole received message or notification, for example, the text of the message may be converted into an audio output which may be provided in response to the user input.
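As an illustrative sketch of such a non-visual indication, a spoken summary of a received message might be composed from its type, sender and time, with the full text optionally read out; the Message fields and the wording are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Message:
    kind: str        # e.g. "email", "SMS", "social networking notification"
    sender: str
    received: str    # time and date as text
    body: str

def announce(message: Message, speak, include_body: bool = False) -> None:
    """Speak the type, sender and time of a message, optionally its full text."""
    summary = f"{message.kind} from {message.sender}, received {message.received}"
    speak(summary)
    if include_body:
        speak(message.body)   # e.g. the text converted into an audio output

msg = Message("email", "Alice", "9:15 on Monday", "Meeting moved to 10.")
announce(msg, speak=lambda text: print("audio output:", text), include_body=True)
```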
- the apparatus 1 may be configured in a locked mode of operation.
- In the locked mode of operation the touch sensitive display 15 may be configured so that it is not responsive to user inputs, so that actuating an area of the touch sensitive display 15 in which a user selectable icon 61 is displayed would not enable the function associated with the user selectable icon to be performed.
- When the apparatus 1 is in the locked mode of operation the user may still be able to make a user input to cause the apparatus 1 to be configured in the second mode of operation and enable the non-visual outputs to be provided.
- the designated portions 55, 57, 59 of the touch sensitive display 15 may still be responsive to a user input to enable the apparatus 1 to be configured out of the locked mode of operation and into the second mode of operation. In some embodiments this may enable the user to control the apparatus 1 to provide a non-visual output indicative of a received message or notification without having to configure the apparatus 1 into the unlocked mode of operation.
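A minimal sketch of the locked-mode behaviour described above, under assumed names, might ignore all touches except those on a designated portion, so the non-visual outputs remain reachable without unlocking the apparatus.

```python
class LockAwareController:
    def __init__(self):
        self.locked = True
        self.second_mode = False

    def touch(self, x: float, y: float, on_designated_portion: bool) -> None:
        if self.locked:
            if on_designated_portion:
                self.second_mode = True   # leave the locked mode for the second mode
                self.locked = False
            return                        # all other touches are ignored while locked
        # ... normal first / second mode handling would follow here ...

ctrl = LockAwareController()
ctrl.touch(200, 400, on_designated_portion=False)  # ignored: display is locked
print(ctrl.second_mode)                            # False
ctrl.touch(20, 20, on_designated_portion=True)     # designated corner still works
print(ctrl.second_mode)                            # True
```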
- Embodiments of the disclosure provide an apparatus 1 with a touch screen display 15 which can be configured to enable the apparatus 1 to be used without looking at the touch screen display 15.
- Configuring the apparatus 1 into a mode of operation in which a non-visual output is provided but a function is not performed enables a user to easily find the user interface items 53 they need without having to look at the apparatus 1. This could be useful for users who are performing other tasks, such as driving or walking, whilst using the apparatus 1. It may also be useful for visually impaired users or users who may have difficulty viewing certain user interface items 53. The embodiments of the disclosure also enable an apparatus to be easily switched between the respective modes of operation by user inputs such as actuating designated areas of the touch sensitive display or voice inputs. These inputs may be inputs which a user can make easily and accurately without looking at the apparatus 1 or the touch sensitive display 15.
- every user selectable icon 61 may be associated with a non-visual output so that actuation of any of the respective areas in which the user selectable icons 61 are displayed causes a non-visual output to be provided. This may be particularly advantageous where the user is visually impaired and may need assistance in finding and distinguishing between the respective user interface items 53.
- a subset of the plurality of user selectable icons 61 may be associated with a non-visual output. The user may be able to select which user selectable icons are to be associated with a non-visual output.
- This may be particularly advantageous where the user intends to use the apparatus while performing other tasks. For example, a user who is using the apparatus as a navigation system or music player whilst driving might find it useful to be able to find the user interface icons associated with these functions, but might not be interested in the user interface icons associated with, for example, viewing captured or stored images, as these functions would be unlikely to be used while they are not able to look at the apparatus. This may make it easier for the user to find the user selectable items 61 they need.
- the blocks illustrated in the Figs. 3A and 3B may represent steps in a method and/or sections of code in the computer program 9.
- the illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.
- In the embodiments illustrated in Figs. 4A to 4D the designated portions are provided in the upper left hand and upper right hand corners respectively.
- the designated portion may comprise any corner of the touch sensitive display. This may provide the advantage that the user of the apparatus does not need to know the orientation of the apparatus in order to find a designated portion of the touch sensitive display. This may be particularly beneficial if the user is visually impaired.
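Treating every corner as a designated portion could be sketched, with assumed display geometry, as a simple orientation-independent hit test; the pixel values are illustrative assumptions.

```python
def in_any_corner(x: float, y: float, width: float, height: float,
                  corner: float = 80.0) -> bool:
    """True if (x, y) falls within `corner` pixels of any corner of the display."""
    near_left_or_right = x <= corner or x >= width - corner
    near_top_or_bottom = y <= corner or y >= height - corner
    return near_left_or_right and near_top_or_bottom

print(in_any_corner(10, 10, 480, 800))    # upper-left corner -> True
print(in_any_corner(470, 790, 480, 800))  # lower-right corner -> True
print(in_any_corner(240, 400, 480, 800))  # centre of the display -> False
```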
- the apparatus 1 may also be configured to provide a non-visual output for items which do not have a specific function associated with them.
- the display 15 may comprise icons indicative of the status of the apparatus 1 for example, the battery power level of the apparatus 1 or the signal strength available to the apparatus 1 .
- the apparatus may be configured to provide a non-visual output relating to these status indicators as well as the user selectable items.
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method, apparatus, computer program and user interface wherein the method comprises: providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
Description
A Method, Apparatus, Computer Program and User Interface TECHNOLOGICAL FIELD
Embodiments of the present disclosure relate to a method, apparatus, computer program and user interface. In particular, they relate to a method, apparatus, computer program and user interface which is convenient for a user to use without viewing the apparatus.
BACKGROUND
Apparatus comprising touch sensitive displays which enable the user to control the apparatus are well known. For example apparatus such as mobile telephones or tablet computers or satellite navigation apparatus may have a user interface which comprises a touch sensitive display.
Problems may arise if the user needs to actuate the touch sensitive display but is unable to view the display. For instance, if the user is carrying out more than one task simultaneously they might not be able to view the apparatus properly. As an example, if the user is driving or walking while using the apparatus they would need to be looking at where they are driving or walking rather than at the apparatus. Alternatively a user may be visually impaired and might not be capable of visually distinguishing between the items on the touch sensitive display.
BRIEF SUMMARY
According to various, but not necessarily all, embodiments of the disclosure there is provided a method comprising: providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
In some embodiments the non-visual output may comprise an audio output. In some embodiments the non-visual output may comprise a tactile output.
In some embodiments the touch sensitive display may be configured in the second mode of operation in response to a user input. The user input may comprise actuating a designated portion of the touch sensitive display. The user input may comprise actuating the designated portion of the touch sensitive display simultaneously to actuating the portion of the touch sensitive display in which the user interface items are displayed. When the user ends the actuation of the designated portion of the touch sensitive display the function associated with the simultaneously actuated user interface item may be enabled. Alternatively when the user ends the actuation of the designated portion of the touch sensitive display the function associated with the simultaneously actuated user interface item might not be enabled.
In some embodiments the designated portion of the touch sensitive display may comprise a corner of the touch sensitive display.
In some embodiments a plurality of designated portions of the touch sensitive display may be provided.
In some embodiments the user interface items may comprise user selectable icons. The plurality of user selectable icons may be provided in a different portion of the touch sensitive display to the designated area.
In some embodiments the method may further comprise configuring the touch sensitive display back into the first mode of operation in response to a further user input.
According to various, but not necessarily all, embodiments of the disclosure there is also provided an apparatus comprising: at least one processor; and at least one memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, enable the apparatus to: provide a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configure the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
In some embodiments the non-visual output may comprise an audio output. In some embodiments the non-visual output may comprise a tactile output.
In some embodiments the at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect a user input and configure the touch sensitive display in the second mode of operation in response to the detection of the user input. The user input may comprise actuating a designated portion of the touch sensitive display. The user input may comprise actuating the designated portion of the touch sensitive display simultaneously to actuating the portion of the touch sensitive display in which the user interface items are displayed.
In some embodiments the at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect when the user ends the actuation of the designated portion of the touch sensitive display, and in response to detecting the end of the actuation of the designated portion of the touch sensitive display, enable the function associated with the simultaneously actuated user interface item to be performed. Alternatively the at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect when the user ends the actuation of the designated portion of the touch sensitive display, wherein in response to detecting the end of the actuation of the designated portion of the touch sensitive display, the function associated with the simultaneously actuated user interface item is not performed.
In some embodiments the designated portion of the touch sensitive display may comprise a corner of the touch sensitive display. In some embodiments a plurality of designated portions of the touch sensitive display may be provided.
In some embodiments the user interface items may comprise user selectable icons. The plurality of user selectable icons may be provided in a different portion of the touch sensitive display to the designated area.
In some embodiments the at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect a further user input and configure the touch sensitive display back in the first mode of operation in response to the further user input.
According to various, but not necessarily all, embodiments of the disclosure there is also provided a computer program comprising computer program instructions that, when executed by at least one processor, enable an apparatus at least to perform: providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function. In some embodiments there may also be provided a computer program comprising program instructions for causing a computer to perform the method as described above.
In some embodiments there may also be provided a non-transitory entity embodying the computer program as described above.
In some embodiments there may also be provided an electromagnetic carrier signal carrying the computer program as described above.
According to various, but not necessarily all, embodiments of the disclosure there is also provided a user interface comprising: a touch sensitive display wherein the touch sensitive display is configured to: provide a plurality of user interface items on a portion of the touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and when configured in a second mode of operation cause a non-visual output indicative of the function associated with a user interface item in response to actuation of an area of the touch sensitive display in which a user interface item is displayed without enabling the function.
In some embodiments the non-visual output may comprise an audio output.
In some embodiments the non-visual output may comprise a tactile output.
In some embodiments the touch sensitive display may be configured in the second mode of operation in response to a user input. The user input may comprise actuating a designated portion of the touch sensitive display.
The apparatus may be for wireless communication.
BRIEF DESCRIPTION
For a better understanding of various examples of embodiments of the present disclosure reference will now be made by way of example only to the accompanying drawings in which:
Fig. 1 schematically illustrates an apparatus according to an exemplary embodiment of the disclosure;
Fig. 2 schematically illustrates an apparatus according to another exemplary embodiment of the disclosure;
Figs. 3A and 3B are block diagrams which schematically illustrate methods according to an exemplary embodiment of the disclosure; and
Figs. 4A to 4D illustrate graphical user interfaces according to an exemplary embodiment of the disclosure.
DETAILED DESCRIPTION
The Figures illustrate a method comprising: providing a plurality of user interface items 53 on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function. Fig. 1 schematically illustrates an apparatus 1 according to an embodiment of the disclosure. The apparatus 1 may be an electronic apparatus. The apparatus 1 may be, for example, a mobile cellular telephone, a tablet computer, a personal computer, a camera, a gaming device, a personal digital assistant, a personal music player, an electronic reader or any other apparatus which may enable a user to make inputs via a touch sensitive display. The apparatus 1 may be a handheld apparatus 1 which can be carried in a user's hand, handbag or jacket pocket for example.
Features referred to in the following description are illustrated in Figs. 1 and 2. However, it should be understood that the apparatus 1 may comprise additional features that are not illustrated. For example, in embodiments of the disclosure where the apparatus 1 is configured for wireless communication the apparatus 1 may comprise one or more transmitters and receivers.
The apparatus 1 illustrated in Fig. 1 comprises: a user interface 13 and a controller 4. In the illustrated embodiment the controller 4 comprises at least one processor 3 and at least one memory 5 and the user interface 13 comprises a touch sensitive display 15 and a user input device 17.
The controller 4 provides means for controlling the apparatus 1. The controller 4 may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions 11 in one or more general-purpose or special-purpose processors 3 that may be stored on a computer readable storage medium 23 (e.g. disk, memory etc.) to be executed by such processors 3.
The controller 4 may be configured to control the apparatus 1 to perform functions. A person skilled in the art would appreciate that the apparatus 1 may be used for any number and range of functions and applications. The functions may comprise, for example, communications functions such as telephone calls, email services or messages such as SMS (short message service) messages, MMS (multimedia message service) messages or instant messages, or access to social networking applications or any other communications functions, media functions such as access to image capturing devices, stored images or videos or data files, access to music or other audio files or any other media functions. Other functions may include games or calendar applications or access to services such as satellite navigation systems.
The controller 4 may also be configured to enable the apparatus 1 to perform a method comprising: providing a plurality of user interface items on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item 53 to be provided without enabling the function.
The at least one processor 3 is configured to receive input commands from the user interface 13 and also to provide output commands to the user interface 13. The at least one processor 3 is also configured to write to and read from the at least one memory 5. Outputs of the user interface 13 may be provided as inputs to the controller 4. The user input device 17 provides means for enabling a user of the apparatus 1 to input information which may be used to control the apparatus 1. The user input device 17 may comprise any means which enables a user to input information into the apparatus 1. For example the user input device 17 may comprise a touch sensitive display 15 or a portion of a touch sensitive display 15, a key pad, an accelerometer or other means configured to detect orientation and/or movement of the apparatus 1, audio input means which enable an audio input signal to be detected and converted into a control signal for the controller 4 or a combination of different types of user input devices. The touch sensitive display 15 may be actuated by a user contacting the surface of the touch sensitive display 15 with an object such as their finger or a stylus. A user may contact the surface of the touch sensitive display 15 by physically touching the surface of the touch sensitive display 15 with an object or by bringing the object close enough to the surface to activate the sensors of the touch sensitive display 15. The touch sensitive display 15 may comprise a capacitive touch sensitive display, a resistive touch sensitive display 15 or any other suitable means for detecting a touch input.
The touch sensitive display 15 may comprise any means which enables information to be displayed to a user of the apparatus 1 . The information which is displayed may comprise a plurality of user interface items 53. The
user interface items may enable the user of the apparatus 1 to control the apparatus 1.
In some embodiments of the disclosure the user interface items 53 may comprise user selectable icons 61. The user selectable icons 61 may be associated with functions of the apparatus 1 so that user selection of the icon causes the associated function to be performed. In embodiments of the disclosure where the user interface 13 comprises a touch sensitive display 15 the user selectable icons 61 may be selected by actuating the area of the touch sensitive display 15 in which the user selectable icon 61 is displayed.
The touch sensitive display 15 may be configured to display graphical user interfaces 51 as illustrated in Figs. 4A to 4D. The at least one memory 5 is configured to store a computer program 9 comprising computer program instructions 11 that control the operation of the apparatus 1 when loaded into the at least one processor 3. The computer program instructions 11 provide the logic and routines that enable the apparatus 1 to perform the exemplary methods illustrated in Figs. 3A and 3B.
The at least one processor 3 by reading the at least one memory 5 is able to load and execute the computer program 9.
The computer program instructions 11 may provide computer readable program means configured to control the apparatus 1. The program instructions 11 may provide, when loaded into the controller 4: means for providing a plurality of user interface items 53 on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and means for configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of
the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item 53 to be provided without enabling the function. The computer program 9 may arrive at the apparatus 1 via any suitable delivery mechanism 21. The delivery mechanism 21 may comprise, for example, a computer-readable storage medium, a computer program product 23, a memory device, a record medium such as a CD-ROM or DVD, or an article of manufacture that tangibly embodies the computer program 9. The delivery mechanism may be a signal configured to reliably transfer the computer program 9. The apparatus 1 may propagate or transmit the computer program 9 as a computer data signal.
The memory 5 may comprise a single component or it may be implemented as one or more separate components some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/ dynamic/cached storage.
References to 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc. or a 'controller', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (e.g. Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific integrated circuits (ASIC), signal processing devices and other devices. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device, whether instructions for a processor or configuration settings for a fixed-function device, gate array or programmable logic device etc.
Fig. 2 illustrates an apparatus 1' according to another embodiment of the disclosure. The apparatus 1' illustrated in Fig. 2 may be a chip or a chip-set. The apparatus 1' comprises at least one processor 3 and at least one memory 5 as described above in relation to Fig. 1.
Figs. 3A and 3B schematically illustrate methods according to exemplary embodiments of the disclosure which may be performed by the apparatus 1 as illustrated in Figs. 1 and 2. Fig. 3A illustrates an exemplary method of configuring an apparatus 1 into a mode of operation which enables a user to control the apparatus 1 without having to view the apparatus 1.
At block 31 a touch sensitive display 15 is configured in a first mode of operation. The first mode of operation may be a default mode of operation. In the first mode of operation a plurality of user interface items 53 are displayed on the touch sensitive display 15. In the first mode of operation the user interface items 53 are associated with functions of the apparatus 1 so that actuation of the area of the touch sensitive display 15 in which a user interface item 53 is displayed causes the function associated with that user interface item 53 to be performed.
In some embodiments of the disclosure the actuation required to cause a function to be performed may be a specific type of user input, for example it may comprise actuating the area in which the user interface item 53 is displayed for at least a minimum period of time or making multiple inputs in the same area, for example, a tap input, a double tap input, a trace or swipe gesture input, a flick gesture input or any other suitable input.
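As a purely illustrative sketch, one way such actuation types might be distinguished is by the duration of a touch and the gap between successive touches; the class name and threshold values below are assumptions, not part of the disclosure.

```kotlin
// Illustrative sketch only: distinguishing a tap, a double tap and a press
// held for at least a minimum period of time. Thresholds are assumptions.

enum class Actuation { TAP, DOUBLE_TAP, LONG_PRESS }

class ActuationClassifier(
    private val longPressMs: Long = 600,     // "minimum period of time"
    private val doubleTapGapMs: Long = 300   // maximum gap between taps
) {
    private var previousUpTime: Long? = null

    fun onTouch(downTime: Long, upTime: Long): Actuation {
        val prev = previousUpTime
        val result = when {
            upTime - downTime >= longPressMs -> Actuation.LONG_PRESS
            prev != null && downTime - prev <= doubleTapGapMs -> Actuation.DOUBLE_TAP
            else -> Actuation.TAP
        }
        previousUpTime = upTime
        return result
    }
}

fun main() {
    val classifier = ActuationClassifier()
    println(classifier.onTouch(downTime = 0, upTime = 100))      // TAP
    println(classifier.onTouch(downTime = 250, upTime = 350))    // DOUBLE_TAP
    println(classifier.onTouch(downTime = 2000, upTime = 2800))  // LONG_PRESS
}
```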
At block 33 a user input is detected. The user input may comprise any input which is made by the user and which is detected by the user input means 17.
The user input may be an input which a user can make without viewing the apparatus 1. For example, it may comprise physical actuation of a portion of the apparatus 1 which is easy for the user to find without viewing the apparatus 1. In some exemplary embodiments of the disclosure the user input may comprise actuation of a designated portion 55, 59 of the touch sensitive display 15. The designated portion 55, 59 of the touch sensitive display 15 may comprise an edge portion or a corner portion of the touch sensitive display 15 which may be easy for a user to find without viewing the apparatus 1. In other embodiments of the disclosure the user input may comprise actuation of a key. The key may be provided in a location so that it is easy for a user to find without looking at the apparatus 1, for example, it may be located on a side of the apparatus 1 and not positioned adjacent to or in proximity to any other keys. In other embodiments of the disclosure the user input may comprise a different type of input, for example it may comprise an audio input which may enable voice control of the apparatus 1. Alternatively the user input may comprise an input which may be made to any part of the apparatus 1, for example, a shaking or movement of the apparatus 1 which may be detected by an accelerometer.
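A minimal sketch, assuming a rectangular display addressed in pixels, of how edge or corner designated portions might be hit-tested so that a user can find them by feel; the region size and names are illustrative assumptions.

```kotlin
// Illustrative hit test for corner/edge designated portions. The 80-pixel
// region size and the portion names are assumptions, not from the source.

data class Point(val x: Float, val y: Float)
data class Size(val width: Float, val height: Float)

enum class DesignatedPortion { TOP_LEFT, TOP_RIGHT, TOP_CENTRE }

fun designatedPortionAt(touch: Point, display: Size, region: Float = 80f): DesignatedPortion? {
    val nearTop = touch.y <= region
    return when {
        nearTop && touch.x <= region -> DesignatedPortion.TOP_LEFT
        nearTop && touch.x >= display.width - region -> DesignatedPortion.TOP_RIGHT
        nearTop && kotlin.math.abs(touch.x - display.width / 2) <= region / 2 ->
            DesignatedPortion.TOP_CENTRE
        else -> null
    }
}

fun main() {
    val display = Size(480f, 800f)
    println(designatedPortionAt(Point(10f, 10f), display))    // TOP_LEFT
    println(designatedPortionAt(Point(470f, 5f), display))    // TOP_RIGHT
    println(designatedPortionAt(Point(240f, 20f), display))   // TOP_CENTRE
    println(designatedPortionAt(Point(240f, 400f), display))  // null: not a designated portion
}
```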
Once the user input has been detected the user input means 17 provides an output signal to the controller 4 indicative of the detected input and at block 35 the apparatus 1 is configured into a second mode of operation. In the second mode of operation the touch sensitive display 15 is configured so that actuation of the area of the touch sensitive display 15 in which a user interface item 53 is displayed does not cause the function associated with that user interface item to be performed. Instead a non-visual output is provided indicative of the function associated with the user interface item 53.
The non-visual output may comprise any output which can be detected by the user without viewing the apparatus 1 . In some embodiments of the disclosure
the non-visual output may comprise an audio output. The audio output may comprise an acoustic signal which may be provided by an audio output device such as a loudspeaker. The acoustic signal may comprise a pressure wave which may be detected by the user, for example by the user's ear. The audio output device may be configured to provide the audio output in response to an electrical input signal provided by the controller 4. The electrical input signal may contain information indicative of the audio output which is to be provided.
In some embodiments of the disclosure the non-visual output may comprise a tactile output. The tactile output may be provided instead of or in addition to an audio output.
The tactile output may comprise any output which the user of the apparatus 1 may sense through touch. For example, the tactile output may comprise a raised portion or an indented portion of the touch sensitive display 15, a change in the texture of the surface or part of the touch sensitive display 15, or a vibration of the apparatus 1 or part of the apparatus 1.
In some embodiments of the disclosure the tactile output may be provided as localized projections or indentations in the surface of the touch sensitive display 15. The projections and indentations may be provided by any suitable means such as a layer of electroactive polymer, a mechanical or fluid pumping system or a piezoelectric transducer. In some embodiments the tactile output may be provided by providing a stimulus such as an electrical impulse to the user rather than making a physical modification to the surface of the touch sensitive display 15 or any other part of the apparatus 1. For example, a current may be provided to the touch sensitive display 15 to change the electrical charge of the touch sensitive display 15. The user may be able to sense this through their skin and so be provided with a tactile output.
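The tactile mechanisms listed above might sit behind a common interface such as the illustrative sketch below; the interface, its methods and the logging stand-in are assumptions used only to make the idea concrete.

```kotlin
// Platform-neutral sketch of a tactile output abstraction. The concrete
// mechanisms named in the text (electroactive polymer layer, piezoelectric
// transducer, vibration) would sit behind an interface like this one;
// the interface and its methods are illustrative assumptions.

interface TactileOutput {
    fun vibrate(pattern: LongArray)                      // alternating off/on durations in ms
    fun raiseRegion(x: Float, y: Float, radius: Float)   // localized projection
    fun flattenAll()                                     // remove any raised regions
}

// A logging stand-in so the sketch runs without device hardware.
class LoggingTactileOutput : TactileOutput {
    override fun vibrate(pattern: LongArray) =
        println("vibrate pattern: ${pattern.joinToString()} ms")
    override fun raiseRegion(x: Float, y: Float, radius: Float) =
        println("raise ${radius}px bump at ($x, $y)")
    override fun flattenAll() = println("surface flattened")
}

fun main() {
    val haptics: TactileOutput = LoggingTactileOutput()
    // A short double pulse might, for example, indicate a messaging function.
    haptics.vibrate(longArrayOf(0, 40, 60, 40))
    // A raised bump under the touched icon helps the user feel its position.
    haptics.raiseRegion(x = 120f, y = 300f, radius = 24f)
    haptics.flattenAll()
}
```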
Fig. 3B illustrates an exemplary method in which the user actuates the touch sensitive display 15 when the apparatus 1 is configured in the second mode of operation as described above. At block 41 the user actuates the area of the touch sensitive display 15 in which a user interface item 53 is displayed. The user interface item 53 may be normally associated with a function such that in a normal or default mode of operation actuation of the area in which the user interface item 53 is displayed causes the function to be performed. In response to the actuation of the touch sensitive display 15 an output signal is provided to the controller 4 indicative of the user interface item 53 which has been actuated.
At block 43, in response to the detected user input, the controller 4 controls the apparatus 1 to provide a non-visual output indicative of the function which is normally associated with the user interface item. As described above the non-visual output may comprise an audio output and/or a tactile output.
The non-visual output may provide sufficient information to the user of the apparatus 1 to enable them to determine the function normally associated with the user interface item 53 without having to view the apparatus 1 .
In the embodiments of the disclosure the non-visual output indicative of the function is provided without causing the function itself to be performed. If the user wishes to enable the function to be performed they may need to make a further user input. For example, the touch sensitive display may be configured back into the first mode of operation in response to a further user input. Alternatively a user may be able to make a user input to indicate that they wish the function indicated by the non-visual output to be performed. Figs. 4A to 4D illustrate graphical user interfaces 51 according to an exemplary embodiment of the disclosure. The graphical user interfaces 51
may be displayed on a touch sensitive display 15 of an apparatus 1 as described above.
The graphical user interface 51 comprises a plurality of user interface items 53. The plurality of graphical user interface items 53 may comprise any item displayed on the touch sensitive display 15.
In the embodiments of the disclosure illustrated in Figs. 4A to 4D the plurality of user interface items 53 comprises a plurality of user selectable icons 61. The plurality of user selectable icons 61 comprise graphical items which are associated with functions of the apparatus 1 such that, in a normal mode of operation, actuation of the area in which the respective user selectable icon is displayed causes the associated function to be performed. The plurality of user selectable icons 61 may be arranged in a menu structure or on a home screen. The user may be able to navigate through the plurality of user selectable icons 61 by scrolling through the menu structure or home screen. The plurality of user selectable icons 61 which are provided may be dynamic, that is, they are not necessarily provided in a fixed position on the touch sensitive display 15. The user selectable icons 61 which are provided and their respective positions may depend on a plurality of factors such as the mode of operation of the apparatus 1, the position of the menu or home screen to which the user has scrolled or navigated, or the functions which are currently available on the apparatus 1. This may make it difficult for a user to locate a specific one of the plurality of user selectable icons 61 without viewing the touch sensitive display 15.
In the embodiments of the disclosure illustrated in Figs. 4A to 4D the plurality of user interface items 53 also comprises a plurality of designated portions 55, 57, 59 of the touch sensitive display 15. In the exemplary embodiments three designated portions 55, 57, 59 are provided. A first designated portion 55 is provided in the upper left hand corner of the touch sensitive display 15, a
second designated portion 59 is provided in the upper right hand corner of the touch sensitive display 15 and a third designated portion 57 is provided in the centre of the upper side of the touch sensitive display 15. The designated portions 55, 57, 59 may be fixed with respect to the edge of the touch sensitive display 15 so that they do not move even when the user scrolls through the plurality of user selectable icons 61. This may enable a user to easily locate the respective designated portions 55, 57, 59 even when they are not looking at the apparatus 1. It is to be appreciated that in the various embodiments the designated portions 55, 57, 59 may be located in any portion of the display 15 or even the housing of the apparatus 1.
In the exemplary embodiments of the disclosure the first and second designated portions 55, 59 enable the touch sensitive display 15 to be configured into a second mode of operation as described in more detail below. The third designated portion 57 is provided in the centre of the upper side of the touch sensitive display 15 and is associated with a reader function such that actuation of the third designated portion 57 causes the apparatus 1 to provide an audio output indicating all the user interface items currently displayed on the touch sensitive display 15. This may be an advantageous feature if the user of the apparatus 1 is visually impaired.
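A hedged sketch of such a reader function is given below: actuating the third designated portion could cause every currently displayed item to be announced in turn. The names and the speech stand-in are illustrative assumptions.

```kotlin
// Illustrative sketch of a reader function that announces every displayed item.
// DisplayedItem, Reader and the println speech stand-in are assumptions.

data class DisplayedItem(val label: String)

class Reader(private val speak: (String) -> Unit) {
    fun readAll(visible: List<DisplayedItem>) {
        if (visible.isEmpty()) {
            speak("Nothing is displayed")
            return
        }
        speak("Currently displayed:")
        visible.forEach { speak(it.label) }
    }
}

fun main() {
    val reader = Reader(speak = { println("speak: $it") })
    reader.readAll(listOf(DisplayedItem("Phone"), DisplayedItem("Messages"), DisplayedItem("Music")))
}
```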
In other embodiments of the disclosure other user interface items 53 may also be provided which are not illustrated in Figs. 4A to 4D. For example there may be user interface items 53 which provide information to a user but do not cause a function to be performed; for example a graphical item may be provided indicating the remaining power available in a power source, an available signal strength or the status of a communication link.
In the exemplary embodiment illustrated in Figs. 4A to 4D the plurality of user selectable icons 61 are provided in a different area of the touch sensitive display 15 to the designated portions 55, 57, 59. This may make it easier for a user to locate the respective user interface items 53.
In Fig. 4A the touch sensitive display 15 is configured in the first mode of operation so that if a user were to actuate the area in which any of the plurality of user selectable icons 61 is displayed the controller 4 would control the apparatus 1 to cause the associated function to be performed.
In Fig. 4B the user actuates the first designated portion 55 of the touch sensitive display 15 by touching this portion with their finger 63. This causes the controller 4 to configure the touch sensitive display 15 into the second mode of operation as described above.
In Fig. 4C the user maintains the actuation of the first designated portion 55 of the touch sensitive display 15 with the first finger 63 and simultaneously actuates the area of the touch sensitive display 15 in which a user selectable icon 61 is displayed with a second finger 67. As the touch sensitive display 15 is in the second mode of operation this causes a non-visual output to be provided indicative of the function associated with the user selectable icon 61 displayed in the actuated area of the touch sensitive display 15. The non-visual output may be an audio output and/or a tactile output. The content of the non-visual output may be selected by a user to provide them with information which enables them to easily determine the function associated with the user selectable icon 61. In some embodiments a visual output may be provided in addition to the non-visual output.
The user may actuate a plurality of different areas of the touch sensitive display 15 for example they may hold their first finger 63 fixed in the upper left hand corner of the touch sensitive display 15 and move their second finger 67 across the touch sensitive display 15. This may cause a plurality of different non-visual outputs to be provided as a plurality of different areas of the touch sensitive display 15 are actuated.
In Fig. 4D the user ends the actuation of the first designated area 55 by lifting their finger off the surface of the touch sensitive display 15. In the embodiment of the disclosure illustrated in Figs. 4A to 4D this causes the touch sensitive display 15 to be switched back to the first normal mode of operation.
As the user has continued to actuate the area of the touch sensitive display 15 in which the user selectable icons 61 are displayed this causes the function associated with the currently selected user selectable icon 61 to be performed.
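The interaction of Figs. 4B to 4D might be sketched, under illustrative names only, as the small state machine below: holding the designated corner enters the second mode, touching icons while exploring only announces them, and releasing the corner while an icon is still held performs that icon's function.

```kotlin
// A minimal sketch, with assumed names, of the interaction in Figs. 4B to 4D.

data class Icon(val label: String, val action: () -> Unit)

class ExploreSession(private val announce: (String) -> Unit) {
    private var exploring = false
    private var heldIcon: Icon? = null

    fun onDesignatedCornerDown() { exploring = true }          // Fig. 4B

    fun onIconTouched(icon: Icon) {
        heldIcon = icon
        if (exploring) announce(icon.label) else icon.action() // Fig. 4C vs. first mode
    }

    fun onIconReleased() { heldIcon = null }

    // Lifting the corner finger leaves the second mode; if an icon is still
    // held at that moment its function is performed (Fig. 4D behaviour).
    fun onDesignatedCornerUp() {
        exploring = false
        heldIcon?.action()
    }
}

fun main() {
    val session = ExploreSession(announce = { println("speak: $it") })
    val music = Icon("Music player") { println("launching music player") }
    val camera = Icon("Camera") { println("launching camera") }

    session.onDesignatedCornerDown()   // enter the second mode
    session.onIconTouched(camera)      // announced, not launched
    session.onIconTouched(music)       // second finger moves to another icon
    session.onDesignatedCornerUp()     // the Music player function is performed
}
```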
In some embodiments of the disclosure the user may also configure the touch sensitive display 15 into the second mode of operation by actuating the second designated portion 59. In some embodiments of the disclosure the second designated portion 59 may operate in an identical manner to the first designated portion 55 so that in order to maintain the touch sensitive display 15 in the second mode of operation the user has to keep their finger in contact with the designated portion 55, 59. This may provide the advantage that the user could actuate either designated portion 55, 59; it may make it easier for the user to find a designated portion 55, 59 and they could use whichever designated portion 55, 59 is most convenient for them. The user could then control the apparatus to cause a function to be performed by ending the actuation of the designated portion 55, 59 whilst maintaining the actuation of the area in which the user selectable icon 61 is displayed.
In some embodiments of the disclosure the two designated portions 55, 59 may have different modes of actuation. For example, the first designated portion 55 may be actuated by touching the first designated portion 55 and maintaining contact with the surface of the touch sensitive display 15 while simultaneously actuating the area of the touch sensitive display 15 in which the plurality of user selectable icons 61 is displayed. The second designated portion 59 may be actuated by touching the second designated portion 59 and
then subsequently actuating the area of the touch sensitive display 15 in which the plurality of user selectable icons 61 is displayed without having to maintain the simultaneous actuation of the second designated portion 59. This may enable a user to end the actuation of the designated portion 59 without causing the function to be performed. This may make the apparatus 1 easier for a user to use with one hand.
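As an illustrative sketch, the momentary and latched styles of actuation described above might be modelled as follows; the names are assumptions rather than terms from the disclosure.

```kotlin
// Illustrative sketch: a momentary portion keeps the second mode active only
// while held, a latched portion leaves it active after a single touch.

class ModeController {
    var secondMode = false
        private set
    private var latched = false

    fun onMomentaryPortionDown() { secondMode = true }
    fun onMomentaryPortionUp() { if (!latched) secondMode = false }

    fun onLatchedPortionTap() {      // one-handed use: no need to keep holding
        latched = true
        secondMode = true
    }

    fun onFurtherInput() {           // a later input returns to the first mode
        latched = false
        secondMode = false
    }
}

fun main() {
    val modes = ModeController()
    modes.onLatchedPortionTap()
    println(modes.secondMode)   // true, even with no finger held down
    modes.onFurtherInput()
    println(modes.secondMode)   // false
}
```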
As mentioned above, in some embodiments the plurality of user selectable icons 61 may be dynamic, that is their position on the touch sensitive display 15 is not fixed and they may be moved. In some embodiments the plurality of user selectable icons 61 may be moved in response to an input by a user, for example a user may make an input which causes the plurality of user selectable icons 61 to be scrolled through. The scrolling may continue until another user input or interrupt is detected. In other embodiments the movement of the plurality of user selectable icons 61 on the touch sensitive display 15 may be automatic, without any direct input from the user of the apparatus 1 . For example, the plurality of user selectable icons 61 may comprise items within a list of received messages such as emails or notifications from a social networking site. The list of received messages or notifications may be automatically refreshed whenever a new message or notification is received. This may cause the items within the list to be moved on the display 15. When a user selects one of the items in the list this may enable the function of opening the message or notification or accessing information relating to the message or notification.
In some embodiments when the user actuates one of the designated portions 55, 57, 59 this may cause the dynamic items of the touch sensitive display 15 to be temporarily fixed so that they remain in the same position on the touch sensitive display 15. For example if a user is scrolling through the plurality of user selectable icons 61 and then actuates one of the designated portions 55, 57, 59, this may cause the scrolling to be suspended while the apparatus 1 is
in the second mode of operation. The scrolling may be resumed once the apparatus 1 is configured out of the second mode of operation.
If the plurality of user selectable icons 61 comprises items within a list of received messages such as emails or notifications from a social networking site, actuating one of the designated portions 55, 57, 59 and enabling the second mode of operation may cause the automatic refreshing of the list to be temporarily disabled. This may enable a user to obtain non-visual indications of new notifications and messages which have been received.
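A sketch, with illustrative names, of suspending automatic refresh while the second mode is active: messages arriving during exploration are queued and merged into the displayed list only once the second mode ends.

```kotlin
// Illustrative sketch only: the class and method names are assumptions.

class MessageList {
    private val displayed = mutableListOf<String>()
    private val pending = mutableListOf<String>()

    var frozen = false        // true while the second mode is active
        private set

    fun onMessageReceived(summary: String) {
        if (frozen) pending += summary else displayed += summary
    }

    fun enterSecondMode() { frozen = true }

    fun exitSecondMode() {    // leaving the second mode: apply queued updates
        frozen = false
        displayed += pending
        pending.clear()
    }

    fun visibleItems(): List<String> = displayed.toList()
}

fun main() {
    val list = MessageList()
    list.onMessageReceived("Email from Alice")
    list.enterSecondMode()                    // user actuates a designated portion
    list.onMessageReceived("SMS from Bob")    // held back: the displayed list stays stable
    println(list.visibleItems())              // [Email from Alice]
    list.exitSecondMode()                     // second mode ends
    println(list.visibleItems())              // [Email from Alice, SMS from Bob]
}
```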
The non-visual indications of new notifications and messages which have been received may comprise an indication of the type of message or notification. For example, different non-visual outputs may be provided for different types of messages, such as emails, SMS or MMS messages, instant messages or notifications from social networking sites. In some embodiments the non-visual indication may include further details such as the sender of the message or notification and the time and date at which the message or notification was received. In some embodiments the non-visual indication may comprise a non-visual output of the whole received message or notification, for example, the text of the message may be converted into an audio output which may be provided in response to the user input.
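One illustrative way of mapping message types to distinct announcements is sketched below; the type names and phrasing are assumptions, not part of the disclosure.

```kotlin
// Illustrative mapping from message type to a spoken announcement.

enum class MessageType { EMAIL, SMS, MMS, INSTANT_MESSAGE, SOCIAL_NOTIFICATION }

data class Received(val type: MessageType, val sender: String, val receivedAt: String)

fun announcementFor(message: Received): String {
    val kind = when (message.type) {
        MessageType.EMAIL -> "Email"
        MessageType.SMS -> "Text message"
        MessageType.MMS -> "Multimedia message"
        MessageType.INSTANT_MESSAGE -> "Instant message"
        MessageType.SOCIAL_NOTIFICATION -> "Social network notification"
    }
    return "$kind from ${message.sender}, received ${message.receivedAt}"
}

fun main() {
    val msg = Received(MessageType.SMS, "Alice", "09:42 on 11 November")
    println("speak: ${announcementFor(msg)}")
}
```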
In some embodiments the apparatus 1 may be configured in a locked mode of operation. In the locked mode of operation the touch sensitive display 15 may be configured so that it is not responsive to user inputs so that actuating an area of the touch sensitive display 15 in which a user selectable icon 61 is displayed would not enable the function associated with the user selectable icon to be performed. When the apparatus 1 is in the locked mode of operation the user may still be able to make a user input to cause the apparatus 1 to be configured in the second mode of operation and enable the non-visual outputs to be provided.
For example, when the apparatus 1 is in the locked mode of operation the designated portions 55, 57, 59 of the touch sensitive display 15 may still be responsive to a user input to enable the apparatus 1 to be configured out of the locked mode of operation and into the second mode of operation. In some embodiments this may enable the user to control the apparatus 1 to provide a non-visual output indicative of a received message or notification without having to configure the apparatus 1 into the unlocked mode of operation.
Embodiments of the disclosure provide an apparatus 1 with a touch screen display 15 which can be configured to enable the apparatus 1 to be used without looking at the touch screen display 15. Configuring the apparatus 1 into a mode of operation in which a non-visual output is provided but a function is not performed enables a user to easily find the user interface items 53 they need without having to look at the apparatus 1. This could be useful for users who are performing other tasks, such as driving or walking, whilst using the apparatus 1. It may also be useful for visually impaired users or users who may have difficulty viewing certain user interface items 53. The embodiments of the disclosure also enable an apparatus to be easily switched between the respective modes of operation by user inputs such as actuating designated areas of the touch sensitive display or voice inputs. These inputs may be inputs which a user can make easily and accurately without looking at the apparatus 1 or the touch sensitive display 15.
In some embodiments of the disclosure when the touch sensitive display 15 is in the second mode of operation every user selectable icon 61 may be associated with a non-visual output so that actuation of any of the respective areas in which the user selectable icons 61 are displayed causes a non-visual output to be provided. This may be particularly advantageous where the user of the apparatus 1 is visually impaired and may need assistance in finding or distinguishing between the respective user interface items 53. In other embodiments of the disclosure a subset of the plurality of user selectable icons 61 may be associated with a non-visual output. The user may be able to select which user selectable icons are to be associated with a non-visual output. This may be particularly advantageous where the user intends to use the apparatus 1 while performing other tasks. For example, if the user is using the apparatus as a navigation system or music player whilst driving they might find it useful to be able to find the user interface icons associated with these functions, but might not be interested in the user interface icons associated with, for example, viewing captured or stored images, as these functions would be unlikely to be used while they are not able to look at the apparatus. This may make it easier for the user to find the user selectable icons 61 they need.
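An illustrative sketch of letting the user choose which icons are announced in the second mode is given below; the names are assumptions used only to make the idea concrete.

```kotlin
// Illustrative sketch: a user-chosen subset of icons receives a non-visual output.

class AnnouncementPreferences {
    private val enabled = mutableSetOf<String>()

    fun enableFor(iconLabel: String) { enabled += iconLabel }
    fun disableFor(iconLabel: String) { enabled -= iconLabel }

    /** Returns the announcement for the icon, or null if the user excluded it. */
    fun announcementFor(iconLabel: String): String? =
        if (iconLabel in enabled) iconLabel else null
}

fun main() {
    val prefs = AnnouncementPreferences()
    prefs.enableFor("Navigation")
    prefs.enableFor("Music player")

    println(prefs.announcementFor("Music player"))   // Music player
    println(prefs.announcementFor("Image gallery"))  // null: no output provided
}
```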
The blocks illustrated in Figs. 3A and 3B may represent steps in a method and/or sections of code in the computer program 9. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.
Although embodiments of the present disclosure have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the disclosure as claimed. For example in some embodiments of the disclosure a visual output may be provided in addition to the non-visual output.
In the illustrated embodiments of the disclosure the designated portions are provided in the upper left hand and upper right hand corners respectively. In other embodiments of the disclosure the designated portion may comprise any corner of the touch sensitive display. This may provide the advantage that the user of the apparatus does not need to know the orientation of the
apparatus in order to find a designated portion of the touch sensitive display. This may be particularly beneficial if the user is visually impaired.
In some embodiments the apparatus 1 may also be configured to provide a non-visual output for items which do not have a specific function associated with them. For example, the display 15 may comprise icons indicative of the status of the apparatus 1, for example, the battery power level of the apparatus 1 or the signal strength available to the apparatus 1. When the apparatus 1 is in the second mode of operation the apparatus may be configured to provide a non-visual output relating to these status indicators as well as the user selectable items.
Features described in the preceding description may be used in combinations other than the combinations explicitly described.
Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not. Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.
Whilst endeavouring in the foregoing specification to draw attention to those features of the disclosure believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.
I/we claim:
Claims
1. A method comprising:
providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and
configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
2. A method as claimed in claim 1 wherein the non-visual output comprises an audio output.
3. A method as claimed in any preceding claim wherein the non-visual output comprises a tactile output.
4. A method as claimed in any preceding claim wherein the touch sensitive display is configured in the second mode of operation in response to a user input.
5. A method as claimed in claim 4 wherein the user input comprises actuating a designated portion of the touch sensitive display.
6. A method as claimed in claim 5 wherein the user input comprises actuating the designated portion of the touch sensitive display simultaneously to actuating the portion of the touch sensitive display in which the user interface items are displayed.
7. A method as claimed in claim 6 wherein when the user ends the actuation of the designated portion of the touch sensitive display the function associated with the simultaneously actuated user interface item is enabled.
8. A method as claimed in claim 6 wherein when the user ends the actuation of the designated portion of the touch sensitive display the function associated with the simultaneously actuated user interface item is not enabled.
9. A method as claimed in any of claims 5 to 8 wherein the designated portion of the touch sensitive display comprises a corner of the touch sensitive display.
10. A method as claimed in any of claims 5 to 9 wherein a plurality of designated portions of the touch sensitive display are provided.
11. A method as claimed in any preceding claim wherein the user interface items comprise user selectable icons.
12. A method as claimed in claim 11 wherein the plurality of user selectable icons are provided in a different portion of the touch sensitive display to the designated area.
13. A method as claimed in any preceding claim further comprising configuring the touch sensitive display back into the first mode of operation in response to a further user input.
14. An apparatus comprising:
at least one processor; and
at least one memory including computer program code;
wherein the at least one memory and the computer program code are configured to, with the at least one processor, enable the apparatus to: provide a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and
configure the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
15. An apparatus as claimed in claim 14 wherein the non-visual output comprises an audio output.
16. An apparatus as claimed in any of claims 14 to 15 wherein the non-visual output comprises a tactile output.
17. An apparatus as claimed in any of claims 14 to 16 wherein the at least one memory and the computer program code are configured to, with the at least one processor, enable the apparatus to detect a user input and configure the touch sensitive display in the second mode of operation in response to the detection of the user input.
18. An apparatus as claimed in claim 17 wherein the user input comprises actuating a designated portion of the touch sensitive display.
19. An apparatus as claimed in claim 18 wherein the user input comprises actuating the designated portion of the touch sensitive display simultaneously to actuating the portion of the touch sensitive display in which the user interface items are displayed.
20. An apparatus as claimed in claim 19 wherein the at least one memory and the computer program code are configured to, with the at least one processor, enable the apparatus to detect when the user ends the actuation of the designated portion of the touch sensitive display, and in response to detecting the end of the actuation of the designated portion of the touch sensitive display, enable the function associated with the simultaneously actuated user interface item to be performed.
21. An apparatus as claimed in claim 19 wherein the at least one memory and the computer program code are configured to, with the at least one processor, enable the apparatus to detect when the user ends the actuation of the designated portion of the touch sensitive display, wherein in response to detecting the end of the actuation of the designated portion of the touch sensitive display, the function associated with the simultaneously actuated user interface item is not performed.
22. An apparatus as claimed in any of claims 14 to 17 wherein the designated portion of the touch sensitive display comprises a corner of the touch sensitive display.
23. An apparatus as claimed in any of claims 18 to 22 wherein a plurality of designated portions of the touch sensitive display are provided.
24. An apparatus as claimed in any of claims 14 to 21 wherein the user interface items comprise user selectable icons.
25. An apparatus as claimed in claim 24 wherein the plurality of user selectable icons are provided in a different portion of the touch sensitive display to the designated area.
26. An apparatus as claimed in any of claims 14 to 25 wherein the at least one memory and the computer program code are configured to, with the at least one processor, enable the apparatus to detect a further user input and configure the touch sensitive display back in the first mode of operation in response to the further user input.
27. A computer program comprising computer program instructions that, when executed by at least one processor, enable an apparatus at least to perform:
providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and
configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
28. A computer program comprising program instructions for causing a computer to perform the method of any of claims 1 to 13.
29. A non-transitory entity embodying the computer program as claimed in any of claims 27 to 28.
30. An electromagnetic carrier signal carrying the computer program as claimed in any of claims 27 to 28.
31. A user interface comprising:
a touch sensitive display wherein the touch sensitive display is configured to:
provide a plurality of user interface items on a portion of the touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and when configured in a second mode of operation cause a non-visual output indicative of the function associated with a user interface item to be provided in response to actuation of an area of the touch sensitive display in which a user interface item is displayed, without enabling the function.
32. A user interface as claimed in claim 31 wherein the non-visual output comprises an audio output.
33. A user interface as claimed in any of claims 31 to 32 wherein the non-visual output comprises a tactile output.
34. A user interface as claimed in any of claims 31 to 33 wherein the touch sensitive display is configured in the second mode of operation in response to a user input.
35. A user interface as claimed in claim 34 wherein the user input comprises actuating a designated portion of the touch sensitive display.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2011/055048 WO2013068793A1 (en) | 2011-11-11 | 2011-11-11 | A method, apparatus, computer program and user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013068793A1 true WO2013068793A1 (en) | 2013-05-16 |
Family
ID=48288602
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2011/055048 WO2013068793A1 (en) | 2011-11-11 | 2011-11-11 | A method, apparatus, computer program and user interface |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2013068793A1 (en) |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6532005B1 (en) * | 1999-06-17 | 2003-03-11 | Denso Corporation | Audio positioning mechanism for a display |
US20030234824A1 (en) * | 2002-06-24 | 2003-12-25 | Xerox Corporation | System for audible feedback for touch screen displays |
US20060046031A1 (en) * | 2002-12-04 | 2006-03-02 | Koninklijke Philips Electronics N.V. | Graphic user interface having touch detectability |
US20090167701A1 (en) * | 2007-12-28 | 2009-07-02 | Nokia Corporation | Audio and tactile feedback based on visual environment |
US20100079410A1 (en) * | 2008-09-30 | 2010-04-01 | Sony Ericsson Mobile Communications Ab | Three-dimensional touch interface |
WO2010116028A2 (en) * | 2009-04-06 | 2010-10-14 | Aalto-Korkeakoulusäätiö | Method for controlling an apparatus |
GB2470418A (en) * | 2009-05-22 | 2010-11-24 | Nec Corp | Haptic information delivery |
US20110113328A1 (en) * | 2009-11-12 | 2011-05-12 | International Business Machines Corporation | System and method to provide access for blind users on kiosks |
US20110138284A1 (en) * | 2009-12-03 | 2011-06-09 | Microsoft Corporation | Three-state touch input system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11875380 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 11875380 Country of ref document: EP Kind code of ref document: A1 |