
CN113093983B - Device and method for accessing common device functions - Google Patents

Device and method for accessing common device functions

Info

Publication number
CN113093983B
CN113093983B (application CN202110560299.6A)
Authority
CN
China
Prior art keywords
user interface
application
display
input
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110560299.6A
Other languages
Chinese (zh)
Other versions
CN113093983A (en)
Inventor
S·O·勒梅
C·P·福斯
R·R·德林杰
J·R·达斯科拉
C·G·斯塔克
I·A·乔德里
M·范奥斯
A·贝尔扎迪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DKPA201670616A (published as DK201670616A1)
Priority claimed from DKPA201670621A (published as DK201670621A1)
Application filed by Apple Inc
Publication of CN113093983A
Application granted
Publication of CN113093983B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234 Power saving characterised by the action undertaken
    • G06F1/325 Power saving in peripheral device
    • G06F1/3265 Power saving in display device
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/66 Substation equipment, e.g. for use by subscribers with means for preventing unauthorised or fraudulent calling
    • H04M1/667 Preventing unauthorised calls from a telephone set
    • H04M1/67 Preventing unauthorised calls from a telephone set by electronic means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70 Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/71 Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information
    • G06F21/74 Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information operating in dual or compartmented mode, i.e. at least one secure mode
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention is entitled "Device and method for accessing common device functions". An electronic device displays a first user interface that includes a plurality of application icons corresponding to different applications among a plurality of applications installed on the device. The device detects a first input at a location on a touch-sensitive surface that corresponds to a first application icon of the plurality of application icons, the first application icon corresponding to a first application of the plurality of applications. In response to detecting the first input, the device displays, in an overlay region, a first mini-application object or a preview of the first mini-application object, where the first mini-application object corresponds to the first application, and the overlay region includes an affordance for adding the first mini-application object to a second user interface that displays a plurality of mini-application objects. The device detects a second input at a location on the touch-sensitive surface that corresponds to the affordance for adding the first mini-application object to the second user interface. In response to detecting the second input, the device adds the first mini-application object to the second user interface that displays the plurality of mini-application objects.

Description

Device and method for accessing common device functions
This application is a divisional application of the Chinese patent application filed on May 24, 2017, with national application number 201710383083.0 and the title "Device and method for accessing common device functions".
Technical Field
The present disclosure relates generally to electronic devices having touch-sensitive surfaces, including but not limited to electronic devices having touch-sensitive surfaces that include multiple user interfaces for accessing common device functions.
Background
In recent years, the use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has grown significantly. Exemplary touch-sensitive surfaces include touch pads and touch screen displays. Such surfaces are widely used to manipulate user interface objects on a display.
Exemplary manipulations include accessing controls for controlling device functions and remote device functions. However, methods for performing these manipulations are cumbersome and inefficient. As the range of uses for remote devices grows, accessing particular controls and functions becomes more time-consuming. For example, entering a sequence of inputs to access a control through a drop-down menu is time-consuming for a user, particularly if the user does not know in advance where the desired control is located. Navigating between applications to access device functions is likewise time-consuming.
Disclosure of Invention
Accordingly, there is a need for electronic devices with faster, more efficient methods and interfaces for accessing common device functions. Such methods and interfaces optionally complement or replace conventional methods for accessing common device functions. Such methods and interfaces reduce the cognitive burden on the user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
The above-described drawbacks and other problems associated with user interfaces of electronic devices having touch-sensitive surfaces may be reduced or eliminated with the disclosed devices. In some embodiments, the device is a desktop computer. In some embodiments, the device is portable (e.g., a notebook, tablet, or handheld device). In some embodiments, the device has a touch pad. In some implementations, the device has a touch sensitive display (also referred to as a "touch screen" or "touch screen display"). In some embodiments, the device has a Graphical User Interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing a variety of functions. In some embodiments, the user interacts with the GUI on the touch-sensitive surface primarily through stylus and/or finger contacts and gestures. In some embodiments, these functions optionally include image editing, drawing, rendering, word processing, website creation, disk editing, spreadsheet making, game playing, phone calls, video conferencing, email sending and receiving, instant messaging, fitness support, digital photography, digital video recording, web browsing, digital music playing, and/or digital video playing. Executable instructions for performing these functions are optionally included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
According to some embodiments, a method is performed at an electronic device having a display and a touch-sensitive surface. The method includes: detecting a first input while the device is in a display-off state; in response to detecting the first input, activating the display of the device and displaying, on the display, a first user interface that corresponds to a display-on state of the device; while displaying the first user interface that corresponds to the display-on state of the device, detecting a swipe gesture on the touch-sensitive surface; and in response to detecting the swipe gesture on the touch-sensitive surface: in accordance with a determination that the device is in a locked mode of the display-on state and the swipe gesture is in a first direction, replacing display of the first user interface with display of a second user interface that displays first content; and in accordance with a determination that the device is in an unlocked mode of the display-on state and the swipe gesture is in the first direction, replacing display of the first user interface with display of the second user interface, the second user interface displaying the first content and first additional content that is not displayed when the device is in the locked mode of the display-on state.
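To make the branch concrete, the following is a minimal Swift sketch of the locked/unlocked behavior just described. Every name in it (WakeScreenController, secondUserInterfaceContent, the content strings) is hypothetical, and treating a leftward swipe as the "first direction" is an assumption; the method does not prescribe an implementation.

```swift
// Hypothetical sketch; not a prescribed implementation.
enum LockState { case locked, unlocked }
enum SwipeDirection { case left, right, up, down }

struct WakeScreenController {
    var lockState: LockState

    /// Content shown when the first user interface is replaced by the
    /// second user interface after a swipe in the assumed "first direction".
    func secondUserInterfaceContent(for swipe: SwipeDirection) -> [String]? {
        guard swipe == .left else { return nil }  // only the first direction triggers the replacement
        switch lockState {
        case .locked:
            return ["firstContent"]                               // restricted view while locked
        case .unlocked:
            return ["firstContent", "firstAdditionalContent"]     // full view while unlocked
        }
    }
}
```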
According to some embodiments, a method is performed at an electronic device having a display and a touch-sensitive surface. The method includes: detecting a first input to activate the display of the device while the device is in a display-off state; in response to detecting the first input, activating the display of the device and displaying, on the display, a first user interface that corresponds to a display-on state of the device; while displaying the first user interface that corresponds to the display-on state of the device, detecting a swipe gesture on the touch-sensitive surface; and in response to detecting the swipe gesture: in accordance with a determination that the swipe gesture is in a first direction, replacing display of the first user interface with display of a camera application user interface; in accordance with a determination that the swipe gesture is in a second direction different from the first direction, replacing display of the first user interface with display of a mini-application object user interface that is configured to include a plurality of mini-application objects, where respective mini-application objects of the plurality of mini-application objects have corresponding applications stored on the device; in accordance with a determination that the swipe gesture is in a third direction different from the first direction and the second direction, displaying a first page of a multi-page control panel user interface; and in accordance with a determination that the swipe gesture is in a fourth direction different from the first direction, the second direction, and the third direction, displaying a notifications user interface that is configured to display a plurality of notifications.
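A hedged Swift sketch of the four-way dispatch follows; the pairing of concrete directions with destinations is an illustrative assumption, since the method only requires four distinct directions.

```swift
// Hypothetical sketch; the direction-to-destination pairing is assumed.
enum SwipeDirection { case left, right, up, down }
enum Destination {
    case cameraApplication        // first direction
    case miniApplicationObjects   // second direction
    case controlPanelFirstPage    // third direction
    case notifications            // fourth direction
}

func destination(for swipe: SwipeDirection) -> Destination {
    switch swipe {
    case .left:  return .cameraApplication
    case .right: return .miniApplicationObjects
    case .up:    return .controlPanelFirstPage
    case .down:  return .notifications
    }
}
```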
According to some embodiments, a method is performed at an electronic device having a display and a touch-sensitive surface. The method includes: displaying a first user interface that includes a plurality of application icons corresponding to different applications among a plurality of applications installed on the device; detecting a first input at a location on the touch-sensitive surface that corresponds to a first application icon of the plurality of application icons, the first application icon corresponding to a first application of the plurality of applications; in response to detecting the first input, displaying a first mini-application object or a preview of the first mini-application object in an overlay region, where the first mini-application object corresponds to the first application of the plurality of applications, and the overlay region includes an affordance for adding the first mini-application object to a second user interface that displays a plurality of mini-application objects; detecting a second input at a location on the touch-sensitive surface that corresponds to the affordance for adding the first mini-application object to the second user interface; and in response to detecting the second input, adding the first mini-application object to the second user interface that displays the plurality of mini-application objects.
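The two-input add flow can be sketched in Swift as follows; MiniAppObject and MiniAppUserInterface are invented names for illustration only.

```swift
// Hypothetical sketch of the preview-then-add flow.
struct MiniAppObject: Equatable {
    let applicationID: String   // the application this object corresponds to
}

final class MiniAppUserInterface {
    private(set) var objects: [MiniAppObject] = []   // the "second user interface" contents

    /// First input on an application icon: produce the object (or a
    /// preview of it) shown in the overlay region.
    func preview(forApplication id: String) -> MiniAppObject {
        MiniAppObject(applicationID: id)
    }

    /// Second input on the "add" affordance: pin the object if not present.
    func add(_ object: MiniAppObject) {
        guard !objects.contains(object) else { return }
        objects.append(object)
    }
}
```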
According to some embodiments, a method is performed at an electronic device having a display and a touch-sensitive surface. The method includes: detecting a first gesture on the touch-sensitive surface while an initial user interface is displayed on the display; in response to detecting the first gesture, displaying a first page of a multi-page control panel on the display, where the first page of the multi-page control panel includes a plurality of device control affordances; while displaying the first page of the multi-page control panel, detecting a second gesture on the touch-sensitive surface; and in response to detecting the second gesture, displaying a second page of the multi-page control panel, where the second page of the multi-page control panel includes a plurality of content playback control affordances, and the second page of the multi-page control panel replaces the first page of the multi-page control panel on the display.
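In sketch form, the paging behavior reduces to replacing the current page on each qualifying gesture; the two-page model below mirrors only what the method states, and all names are hypothetical.

```swift
// Hypothetical sketch of multi-page control panel paging.
enum ControlPanelPage {
    case deviceControls      // first page: device control affordances
    case playbackControls    // second page: content playback control affordances
}

struct MultiPageControlPanel {
    private let pages: [ControlPanelPage] = [.deviceControls, .playbackControls]
    private(set) var pageIndex = 0

    var currentPage: ControlPanelPage { pages[pageIndex] }

    /// The second gesture replaces the first page with the second page.
    mutating func handlePagingGesture() {
        pageIndex = (pageIndex + 1) % pages.count
    }
}
```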
According to some embodiments, an electronic device includes a display unit configured to display a user interface, a touch-sensitive surface unit configured to detect contacts, and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a first input while the device is in a display-off state. In response to detecting the first input, the processing unit is configured to activate the display unit of the device and enable display, on the display unit, of a first user interface that corresponds to a display-on state of the device. The processing unit is configured to detect a swipe gesture on the touch-sensitive surface unit while displaying the first user interface that corresponds to the display-on state of the device. In response to detecting the swipe gesture on the touch-sensitive surface unit, the processing unit is configured to: in accordance with a determination that the device is in a locked mode of the display-on state and the swipe gesture is in a first direction, replace display of the first user interface with display of a second user interface that displays first content; and in accordance with a determination that the device is in an unlocked mode of the display-on state and the swipe gesture is in the first direction, replace display of the first user interface with display of the second user interface, the second user interface displaying the first content and first additional content that is not displayed when the device is in the locked mode of the display-on state.
According to some embodiments, an electronic device includes a display unit configured to display a user interface, a touch-sensitive surface unit configured to detect contacts, and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a first input to activate the display unit of the device while the device is in a display-off state. In response to detecting the first input, the processing unit is configured to activate the display unit of the device and enable display, on the display unit, of a first user interface that corresponds to a display-on state of the device. The processing unit is configured to detect a swipe gesture on the touch-sensitive surface unit while displaying the first user interface that corresponds to the display-on state of the device. In response to detecting the swipe gesture, the processing unit is configured to: in accordance with a determination that the swipe gesture is in a first direction, replace display of the first user interface with display of a camera application user interface; in accordance with a determination that the swipe gesture is in a second direction different from the first direction, replace display of the first user interface with display of a mini-application object user interface that is configured to include a plurality of mini-application objects, where respective mini-application objects of the plurality of mini-application objects have corresponding applications stored on the device; in accordance with a determination that the swipe gesture is in a third direction different from the first direction and the second direction, enable display of a first page of a multi-page control panel user interface; and in accordance with a determination that the swipe gesture is in a fourth direction different from the first direction, the second direction, and the third direction, enable display of a notifications user interface that is configured to display a plurality of notifications.
According to some embodiments, an electronic device includes a display unit configured to display a first user interface that includes a plurality of application icons corresponding to different applications among a plurality of applications installed on the device, a touch-sensitive surface unit configured to detect contacts, and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a first input at a location on the touch-sensitive surface unit that corresponds to a first application icon of the plurality of application icons, the first application icon corresponding to a first application of the plurality of applications. In response to detecting the first input, the processing unit is configured to enable display, in an overlay region, of a first mini-application object or a preview of the first mini-application object, where the first mini-application object corresponds to the first application of the plurality of applications, and the overlay region includes an affordance for adding the first mini-application object to a second user interface that displays a plurality of mini-application objects. The processing unit is further configured to detect a second input at a location on the touch-sensitive surface unit that corresponds to the affordance for adding the first mini-application object to the second user interface. In response to detecting the second input, the processing unit is configured to add the first mini-application object to the second user interface that displays the plurality of mini-application objects.
According to some embodiments, an electronic device includes a display unit configured to display a user interface, a touch-sensitive surface unit configured to detect contact, and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a first gesture on the touch-sensitive surface unit while the initial user interface is displayed on the display unit. In response to detecting the first gesture, the processing unit is configured to enable display of a first page of the multi-page control panel on the display unit, wherein the first page of the multi-page control panel includes a plurality of device control affordances. While displaying a first page of the multi-page control panel, the processing unit is configured to detect a second gesture on the touch-sensitive surface unit. In response to detecting the second gesture, the processing unit is configured to enable display of a second page of the multi-page control panel, wherein the second page of the multi-page control panel includes a plurality of content playback control affordances, and the second page of the multi-page control panel replaces the first page of the multi-page control panel on the display unit.
According to some embodiments, an electronic device includes a display, a touch-sensitive surface, one or more sensors optionally for detecting intensity of contact with the touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for performing or causing performance of the operations of any of the methods described herein. According to some embodiments, a computer-readable storage medium has stored therein instructions that, when executed by an electronic device having a display, a touch-sensitive surface, and optionally one or more sensors for detecting intensity of contact with the touch-sensitive surface, cause the device to perform or cause to perform operations of any of the methods described herein. According to some implementations, a graphical user interface on an electronic device with a display, a touch-sensitive surface, optionally one or more sensors to detect intensity of contacts with the touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes one or more of the elements displayed in any of the methods described above, the one or more elements updated in response to an input, as described in any of the methods described herein. According to some embodiments, an electronic device includes a display, a touch-sensitive surface, and optionally one or more sensors for detecting intensity of contact with the touch-sensitive surface, and means for performing or causing performance of the operations of any of the methods described herein. According to some embodiments, an information processing apparatus for use in an electronic device having a display, a touch-sensitive surface, and optionally one or more sensors for detecting intensity of contact with the touch-sensitive surface, comprises means for performing or causing performance of operations of any of the methods described herein.
Thus, an electronic device having a display, a touch-sensitive surface, and optionally one or more sensors for detecting the intensity of contact with the touch-sensitive surface is provided with faster, more efficient methods and interfaces for accessing common device functions, thereby improving the effectiveness, efficiency, and user satisfaction of such devices. Such methods and interfaces may supplement or replace conventional methods for accessing common device functions.
Drawings
For a better understanding of the various described embodiments, reference should be made to the following detailed description taken in conjunction with the following drawings, in which like reference numerals designate corresponding parts throughout the several views.
FIG. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display, in accordance with some embodiments.
FIG. 1B is a block diagram illustrating exemplary components for event handling, in accordance with some embodiments.
FIG. 2 illustrates a portable multifunction device with a touch screen, in accordance with some embodiments.
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface, in accordance with some embodiments.
FIG. 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device, in accordance with some embodiments.
FIG. 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface that is separate from the display, in accordance with some embodiments.
FIGS. 4C-4E illustrate exemplary dynamic intensity thresholds, in accordance with some embodiments.
FIGS. 5A-5BJ illustrate exemplary user interfaces for accessing common device functions, in accordance with some embodiments.
FIGS. 6A-6E are flow diagrams illustrating a method of accessing controls from a display-on user interface, in accordance with some embodiments.
FIGS. 7A-7E are flow diagrams illustrating a method of accessing controls from a display-on user interface, in accordance with some embodiments.
FIGS. 8A-8C are flow diagrams illustrating a method of adding a mini-application object to a mini-applications user interface, in accordance with some embodiments.
FIGS. 9A-9E are flow diagrams illustrating a method of navigating controls using a multi-page control panel, in accordance with some embodiments.
FIGS. 10-13 are functional block diagrams of electronic devices, in accordance with some embodiments.
Detailed Description
Many electronic devices provide various interfaces for accessing application features and modifying device settings. Such interfaces may require a user to activate the device, provide authentication information, and/or navigate down through multiple menus to access desired application features or device settings. The methods described herein provide interfaces for accessing common device functions, such as application features and device settings. Such interfaces reduce the number of inputs required to access those features, which reduces power usage and extends the battery life of the device by enabling the user to use the device faster and more efficiently.
FIGS. 1A-1B, 2, and 3, below, provide a description of exemplary devices. FIGS. 4A-4B and 5A-5BJ illustrate exemplary user interfaces for accessing common device functions. FIGS. 6A-6E illustrate a flow diagram of a method of accessing controls from a display-on user interface, in accordance with some embodiments. FIGS. 7A-7E illustrate a flow diagram of a method of accessing controls from a display-on user interface, in accordance with some embodiments. FIGS. 8A-8C illustrate a flow diagram of a method of adding a mini-application object to a mini-applications user interface, in accordance with some embodiments. FIGS. 9A-9E illustrate a flow diagram of a method of navigating controls using a multi-page control panel, in accordance with some embodiments. The user interfaces in FIGS. 5A-5BJ are used to illustrate the processes in FIGS. 6A-6E, 7A-7E, 8A-8C, and 9A-9E.
Exemplary apparatus
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. Numerous specific details are set forth in the following detailed description in order to provide a thorough understanding of the various described embodiments. It will be apparent, however, to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms "first," "second," etc. may be used herein to describe various elements in some cases, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first contact may be named a second contact, and similarly, a second contact may be named a first contact without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact unless the context clearly indicates otherwise.
The terminology used in the description of the various illustrated embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" is optionally interpreted to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is optionally interpreted to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described herein. In some embodiments, the device is a portable communication device, such as a mobile phone, that also includes other functions, such as PDA and/or music player functions. Exemplary embodiments of the portable multifunction device include, but are not limited to, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. (Cupertino, California). Other portable electronic devices, such as a laptop or tablet computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad), are optionally used. It should also be appreciated that in some embodiments, the device is not a portable communication device, but rather a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).
In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the electronic device optionally includes one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
The device typically supports one or more of a variety of applications such as a notepad application, a drawing application, a presentation application, a word processing application, a website creation application, a disk editing application, a spreadsheet application, a gaming application, a telephony application, a video conferencing application, an email application, an instant messaging application, a workout support application, a photograph management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications executing on the device optionally use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device are optionally adjusted and/or changed from application to application and/or within a respective application. In this way, the common physical architecture of the devices (such as the touch-sensitive surface) optionally supports various applications with a user interface that is intuitive and transparent to the user.
Attention is now directed to embodiments of a portable device having a touch sensitive display. Fig. 1A is a block diagram illustrating a portable multifunction device 100 with a touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display system 112 is sometimes referred to as a "touch screen" for convenience and is sometimes referred to simply as a touch-sensitive display. Device 100 includes memory 102 (which optionally includes one or more computer-readable storage media), memory controller 122, one or more processing units (CPUs) 120, peripheral interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input or control devices 116, and external ports 124. The apparatus 100 optionally includes one or more optical sensors 164. The device 100 optionally includes one or more intensity sensors 165 for detecting the intensity of a contact on the device 100 (e.g., a touch-sensitive surface, such as the touch-sensitive display system 112 of the device 100). Device 100 optionally includes one or more haptic output generators 167 for generating haptic outputs on device 100 (e.g., generating haptic outputs on a touch-sensitive surface, such as touch-sensitive display system 112 of device 100 or touch pad 355 of device 300). These components optionally communicate via one or more communication buses or signal lines 103.
As used in this specification and in the claims, the term "haptic output" refers to a physical displacement of a device relative to a previous position of the device, a physical displacement of a component of the device (e.g., a touch-sensitive surface) relative to another component of the device (e.g., the housing), or a displacement of a component relative to the center of mass of the device, that will be detected by a user with the user's sense of touch. For example, in situations where the device or a component of the device is in contact with a touch-sensitive part of the user (e.g., a finger, palm, or other part of the user's hand), the haptic output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in a physical characteristic of the device or component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or touch pad) is optionally interpreted by the user as a "down click" or "up click" of a physical actuator button. In some cases, the user will feel such a tactile sensation, such as a "down click" or "up click," even when the physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movement does not move. As another example, movement of the touch-sensitive surface is optionally interpreted or perceived by the user as "roughness" of the touch-sensitive surface, even when there is no change in the smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the user's individualized sensory perceptions, there are many sensory perceptions of touch that are common to most users. Thus, when a haptic output is described as corresponding to a particular sensory perception of a user (e.g., a "down click," "roughness"), unless otherwise stated, the haptic output generated corresponds to a physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
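For orientation, the public iOS UIFeedbackGenerator API is one way an application can request haptic outputs of the kind described here; this short sketch is illustrative only and is not tied to any implementation in this document.

```swift
import UIKit

let tapFeedback = UIImpactFeedbackGenerator(style: .medium)
tapFeedback.prepare()          // warm up the haptic hardware to reduce latency
tapFeedback.impactOccurred()   // emit a short "tap"-like haptic output
```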
It should be understood that the device 100 is only one example of a portable multifunction device, and that the device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in fig. 1A are implemented in hardware, software, firmware, or any combination thereof (including one or more signal processing and/or application specific integrated circuits).
Memory 102 optionally includes high-speed random access memory, and also optionally includes non-volatile memory, such as one or more disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as one or more CPUs 120 and peripheral interfaces 118, is optionally controlled by a memory controller 122.
The peripheral interface 118 can be used to couple input and output peripherals of the device to the one or more CPUs 120 and the memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions for the device 100 and process data.
In some embodiments, peripheral interface 118, one or more CPUs 120, and memory controller 122 are optionally implemented on a single chip, such as chip 104. In some other embodiments, they are optionally implemented on separate chips.
The RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates by wireless communication with networks, such as the Internet (also referred to as the World Wide Web (WWW)), an intranet, and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN), and with other devices. The wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between the user and device 100. Audio circuitry 110 receives audio data from peripheral interface 118, converts the audio data to electrical signals, and transmits the electrical signals to speaker 111. The speaker 111 converts electrical signals into human audible sound waves. The audio circuit 110 also receives an electrical signal converted from sound waves through a microphone 113. The audio circuitry 110 converts the electrical signals to audio data and transmits the audio data to the peripheral interface 118 for processing. The audio data is optionally retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 108 by the peripheral interface 118. In some embodiments, the audio circuit 110 also includes a headset jack (e.g., 212 in fig. 2). The headset jack provides an interface between the audio circuit 110 and a removable audio input/output peripheral such as an output-only earphone or a headset having both an output (e.g., a monaural or binaural earphone) and an input (e.g., a microphone).
The I/O subsystem 106 couples input/output peripheral devices on the device 100, such as the touch-sensitive display system 112 and other input or control devices 116, to the peripheral device interface 118. The I/O subsystem 106 optionally includes a display controller 156, an optical sensor controller 158, an intensity sensor controller 159, a haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive electrical signals from/transmit electrical signals to other input or control devices 116. Other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and the like. In some alternative implementations, one or more input controllers 160 are optionally coupled to (or not coupled to) any of a keyboard, an infrared port, a USB port, a stylus, and/or a pointing device such as a mouse. One or more buttons (e.g., 208 in fig. 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206 in fig. 2).
The touch-sensitive display system 112 provides an input interface and an output interface between the device and the user. The display controller 156 receives electrical signals from and/or sends electrical signals to the touch-sensitive display system 112. The touch-sensitive display system 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output corresponds to user interface objects. As used herein, the term "affordance" refers to a user-interactive graphical user interface object (e.g., a graphical user interface object that is configured to respond to inputs directed toward the graphical user interface object). Examples of user-interactive graphical user interface objects include, without limitation, a button, slider, icon, selectable menu item, switch, or other user interface control.
The touch-sensitive display system 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from a user based on haptic and/or tactile contact. The touch-sensitive display system 112 and the display controller 156 (along with any associated modules and/or sets of instructions in the memory 102) detect contact (and any movement or interruption of the contact) on the touch-sensitive display system 112 and translate the detected contact into interactions with user interface objects (e.g., one or more soft keys, icons, web pages, or images) displayed on the touch-sensitive display system 112. In one exemplary embodiment, the point of contact between the touch sensitive display system 112 and the user corresponds to a user's finger or stylus.
Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch-sensitive display system 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display system 112. In one exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. (Cupertino, California).
The touch sensitive display system 112 optionally has a video resolution in excess of 100 dpi. In some implementations, the touch screen video resolution exceeds 400dpi (e.g., 500dpi, 800dpi, or greater). The user optionally contacts the touch-sensitive display system 112 using any suitable object or appendage, such as a stylus, finger, or the like. In some embodiments, the user interface is designed to work with finger-based contacts and gestures, which may not be as accurate as stylus-based input due to the large contact area of the finger on the touch screen. In some embodiments, the device translates the finger-based coarse input into a precise pointer/cursor position or command for performing the action desired by the user.
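One plausible, deliberately simplified way to translate coarse finger input into a steadier cursor position is to average recent contact samples, as in the Swift sketch below; real devices may use weighting, filtering, and touch-model heuristics instead, and all names here are hypothetical.

```swift
struct TouchPoint { var x: Double; var y: Double }

/// Hypothetical smoothing step: the centroid of the most recent samples
/// is steadier than any single raw sample of a broad finger contact.
func refinedCursorPosition(from samples: [TouchPoint]) -> TouchPoint? {
    guard !samples.isEmpty else { return nil }
    let n = Double(samples.count)
    let sumX = samples.reduce(0.0) { $0 + $1.x }
    let sumY = samples.reduce(0.0) { $0 + $1.y }
    return TouchPoint(x: sumX / n, y: sumY / n)
}
```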
In some embodiments, the device 100 optionally includes a touch pad (not shown) for activating or deactivating specific functions in addition to the touch screen. In some implementations, the touch pad is a touch sensitive area of the device that, unlike the touch screen, does not display visual output. The touch pad is optionally a touch-sensitive surface separate from the touch-sensitive display system 112 or an extension of the touch-sensitive surface formed by the touch screen.
The apparatus 100 also includes a power system 162 for powering the various components. The power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating Current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., light Emitting Diode (LED)), and any other components associated with the generation, management, and distribution of power in the portable device.
The apparatus 100 optionally further comprises one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to an optical sensor controller 158 in the I/O subsystem 106. The one or more optical sensors 164 optionally include a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The one or more optical sensors 164 receive light projected through the one or more lenses from the environment and convert the light into data representing an image. In conjunction with imaging module 143 (also referred to as a camera module), one or more optical sensors 164 optionally capture still images and/or video. In some embodiments, an optical sensor is located on the rear of the device 100, opposite the touch sensitive display system 112 on the front of the device, so that the touch screen can be used as a viewfinder for still image and/or video image acquisition. In some embodiments, another optical sensor is located on the front of the device to acquire an image of the user (e.g., for self-timer shooting, for video conferencing while the user views other video conference participants on a touch screen, etc.).
The apparatus 100 optionally further comprises one or more contact intensity sensors 165. FIG. 1A shows a contact intensity sensor coupled to an intensity sensor controller 159 in the I/O subsystem 106. The one or more contact strength sensors 165 optionally include one or more piezoresistive strain gauges, capacitive force sensors, electrical force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other strength sensors (e.g., sensors for measuring force (or pressure) of a contact on a touch-sensitive surface). One or more contact strength sensors 165 receive contact strength information (e.g., pressure information or a surrogate for pressure information) from the environment. In some implementations, at least one contact intensity sensor is juxtaposed or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the rear of the device 100, opposite the touch sensitive display system 112 located on the front of the device 100.
The device 100 optionally further includes one or more proximity sensors 166. Fig. 1A shows a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 is coupled to the input controller 160 in the I/O subsystem 106. In some implementations, the proximity sensor turns off and disables the touch-sensitive display system 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
The device 100 optionally further comprises one or more tactile output generators 167. FIG. 1A illustrates a haptic output generator coupled to a haptic feedback controller 161 in the I/O subsystem 106. The one or more tactile output generators 167 optionally include one or more electroacoustic devices such as speakers or other audio components, and/or electromechanical devices that convert energy into linear motion such as motors, solenoids, electroactive polymers, piezoelectric actuators, electrostatic actuators, or other tactile output generating components (e.g., components that convert electrical signals into tactile outputs on a device). The one or more haptic output generators 167 receive haptic feedback generation instructions from the haptic feedback module 133 and generate haptic output on the device 100 that can be perceived by a user of the device 100. In some embodiments, at least one tactile output generator is juxtaposed or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112), and optionally generates a tactile output by moving the touch-sensitive surface vertically (e.g., inward/outward of the surface of device 100) or laterally (e.g., backward and forward in the same plane as the surface of device 100). In some embodiments, at least one tactile output generator sensor is located on the rear of the device 100, opposite the touch sensitive display system 112 located on the front of the device 100.
The device 100 optionally also includes one or more accelerometers 168. FIG. 1A shows accelerometer 168 coupled to the peripheral interface 118. Alternatively, accelerometer 168 is optionally coupled to the input controller 160 in the I/O subsystem 106. In some embodiments, information is displayed in a portrait view or a landscape view on the touch screen display based on an analysis of data received from the one or more accelerometers. In addition to the one or more accelerometers 168, the device 100 optionally includes a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information concerning the position and orientation (e.g., portrait or landscape) of the device 100.
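As an illustration of this orientation heuristic, the following minimal Swift sketch (hypothetical; the names and the axis convention are assumptions, not taken from any actual implementation) infers portrait versus landscape from the gravity components reported along the device's x and y axes:

```swift
// Hypothetical sketch: inferring the display orientation from accelerometer
// data. Gravity dominates the low-pass-filtered signal; comparing its x and
// y components suggests whether the device is upright or on its side.
enum DisplayOrientation { case portrait, landscape }

func inferOrientation(ax: Double, ay: Double) -> DisplayOrientation {
    // When |ay| exceeds |ax|, gravity runs along the long (y) axis: portrait.
    return abs(ay) >= abs(ax) ? .portrait : .landscape
}

print(inferOrientation(ax: 0.10, ay: -0.98))  // portrait
print(inferOrientation(ax: 0.97, ay: 0.05))   // landscape
```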
In some embodiments, the software components stored in memory 102 include an operating system 126, a communication module (or instruction set) 128, a contact/motion module (or instruction set) 130, a graphics module (or instruction set) 132, a haptic feedback module (or instruction set) 133, a text input module (or instruction set) 134, a Global Positioning System (GPS) module (or instruction set) 135, and applications (or instruction sets) 136. Further, in some embodiments, memory 102 stores device/global internal state 157, as shown in FIGS. 1A and 3. The device/global internal state 157 includes one or more of: an active application state, indicating which applications (if any) are currently active; a display state, indicating which applications, views, or other information occupy various regions of the touch-sensitive display system 112; a sensor state, including information obtained from the device's various sensors and other input or control devices 116; and location information concerning the device's location and/or attitude.
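For concreteness, the fields of the device/global internal state 157 described above could be modeled as in the following Swift sketch (all names and types are illustrative assumptions, not the actual data structure):

```swift
// Hypothetical sketch of a device/global internal state record with the
// four kinds of state described above.
struct DeviceGlobalState {
    var activeApplications: [String]            // which applications are active
    var displayState: [String: String]          // display region -> occupying view/app
    var sensorState: [String: Double]           // latest reading per sensor
    var location: (latitude: Double, longitude: Double)?          // device location, if known
    var attitude: (pitch: Double, roll: Double, yaw: Double)?     // device pose, if known
}

let state = DeviceGlobalState(activeApplications: ["mail"],
                              displayState: ["main": "mail inbox view"],
                              sensorState: ["proximity": 0.0],
                              location: (37.33, -122.03),
                              attitude: nil)
print(state.activeApplications)  // ["mail"]
```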
Operating system 126 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between the various hardware and software components.
The communication module 128 facilitates communication with other devices through one or more external ports 124 and also includes various software components for processing data received by the RF circuitry 108 and/or the external ports 124. The external port 124 (e.g., Universal Serial Bus (USB), FireWire, etc.) is adapted to be coupled directly to other devices or indirectly via a network (e.g., the Internet, a wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used in some iPod devices from Apple Inc. (Cupertino, California). In some embodiments, the external port is a Lightning connector that is the same as, or similar to and/or compatible with, the Lightning connector used in some iPod devices from Apple Inc. (Cupertino, California).
The contact/motion module 130 optionally detects contact with the touch-sensitive display system 112 (in conjunction with the display controller 156) and with other touch-sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to the detection of contact (e.g., by a finger or a stylus), such as determining whether contact has occurred (e.g., detecting a finger-down event), determining the intensity of the contact (e.g., the force or pressure of the contact, or a substitute for the force or pressure of the contact), determining whether there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-drag events), and determining whether the contact has ceased (e.g., detecting a finger-up event or a break in contact). The contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining the speed (magnitude), velocity (magnitude and direction), and/or acceleration (change in magnitude and/or direction) of the point of contact. These operations are optionally applied to single contacts (e.g., one-finger contacts or stylus contacts) or to multiple simultaneous contacts (e.g., "multi-touch"/multi-finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 detect contact on a touchpad.
The contact/motion module 130 optionally detects gesture input by the user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different movements, timings, and/or intensities of the detected contacts). Thus, gestures are optionally detected by detecting a specific contact pattern. For example, detecting a single-finger tap gesture includes detecting a finger-down event, and then detecting a finger-up (lift-off) event at the same location (or substantially the same location) as the finger-down event (e.g., at an icon location). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, then detecting one or more finger-drag events, then detecting a finger-up (lift-off) event. Similarly, by detecting specific contact patterns of the stylus, taps, swipes, drags, and other gestures of the stylus are optionally detected.
In some embodiments, detecting a single-finger tap gesture depends on the length of time between detecting the finger-down event and the finger-up event, but is independent of the intensity of the finger contact between those two events. In some embodiments, the tap gesture is detected in accordance with a determination that the length of time between the finger-down event and the finger-up event is less than a predetermined value (e.g., less than 0.1, 0.2, 0.3, 0.4, or 0.5 seconds), regardless of whether the intensity of the finger contact during the tap reaches a given intensity threshold (greater than a nominal contact-detection intensity threshold), such as a light-press or deep-press intensity threshold. Thus, a single-finger tap gesture can satisfy particular input criteria that do not require the characteristic intensity of a contact to meet a given intensity threshold in order for the criteria to be met. For clarity, the finger contact in a tap gesture typically needs to meet the nominal contact-detection intensity threshold for a finger-down event to be detected; below that threshold, no contact is detected. A similar analysis applies to detecting a tap gesture by a stylus or other contact. Where the device is capable of detecting a finger or stylus contact hovering over the touch-sensitive surface, the nominal contact-detection intensity threshold optionally does not correspond to physical contact between the finger or stylus and the touch-sensitive surface.
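The time-based, intensity-independent tap detection described above can be illustrated with the following Swift sketch (hypothetical; the 0.3-second threshold and all names are illustrative assumptions):

```swift
import Foundation

// Hypothetical sketch: a tap is recognized from the time between the
// finger-down and finger-up events; the recorded intensity is ignored.
struct ContactEvent {
    let timestamp: TimeInterval
    let intensity: Double   // force reading; deliberately unused below
}

func isTap(down: ContactEvent, up: ContactEvent) -> Bool {
    let maxTapDuration: TimeInterval = 0.3  // illustrative predetermined value
    return up.timestamp - down.timestamp < maxTapDuration
}

let down = ContactEvent(timestamp: 0.00, intensity: 0.9)
let up = ContactEvent(timestamp: 0.15, intensity: 0.0)
print(isTap(down: down, up: up))  // true, regardless of how hard the press was
```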
The same concepts apply in a similar manner to other types of gestures. For example, a swipe gesture, a pinch gesture, a spread gesture, and/or a long-press gesture are optionally detected based on the satisfaction of criteria that are either independent of the intensities of the contacts included in the gesture, or that do not require the contact(s) performing the gesture to reach an intensity threshold in order to be recognized. For example, a swipe gesture is detected based on an amount of movement of one or more contacts; a pinch gesture is detected based on movement of two or more contacts toward each other; a spread gesture is detected based on movement of two or more contacts away from each other; and a long-press gesture is detected based on a duration of a contact on the touch-sensitive surface with less than a threshold amount of movement. As such, the statement that particular gesture recognition criteria do not require that the intensity of the contact(s) meet a respective intensity threshold in order for the criteria to be met means that the criteria are capable of being satisfied when the contact(s) in the gesture do not reach the respective intensity threshold, and are also capable of being satisfied when one or more contacts in the gesture do reach or exceed the respective intensity threshold. In some embodiments, a tap gesture is detected based on a determination that a finger-down event and a finger-up event are detected within a predefined time period, without regard to whether the contact is above or below a respective intensity threshold during that period; and a swipe gesture is detected based on a determination that the movement of the contact is greater than a predefined magnitude, even if the contact is above the respective intensity threshold at the end of its movement. Even in implementations where detection of a gesture is influenced by the intensity of the contacts performing the gesture (e.g., the device detects a long press more quickly when the intensity of the contact is above an intensity threshold, or delays detection of a tap input when the intensity of the contact is higher), detection of those gestures does not require that the contacts reach a particular intensity threshold, so long as the criteria for recognizing the gesture can be met when the contacts do not reach the particular intensity threshold (e.g., even if the amount of time it takes to recognize the gesture changes).
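The movement- and duration-based criteria above might be expressed as in this Swift sketch (hypothetical; thresholds and names are illustrative simplifications of the criteria described in the text):

```swift
// Hypothetical sketch: intensity-independent gesture criteria. A swipe is
// recognized from the amount of movement; a long press from duration with
// little movement; neither consults contact intensity.
enum Gesture { case swipe, longPress, unrecognized }

func classify(totalMovement: Double, duration: Double) -> Gesture {
    let movementThreshold = 10.0  // points (illustrative)
    let pressDuration = 0.5       // seconds (illustrative)
    if totalMovement > movementThreshold { return .swipe }
    if duration >= pressDuration { return .longPress }
    return .unrecognized
}

print(classify(totalMovement: 24.0, duration: 0.2))  // swipe
print(classify(totalMovement: 2.0, duration: 0.8))   // longPress
```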
In some cases, contact intensity thresholds, duration thresholds, and movement thresholds are combined in various different combinations in order to create heuristics for distinguishing two or more different gestures directed at the same input element or region, so that multiple different interactions with the same input element can provide a richer set of user interactions and responses. The statement that a particular set of gesture recognition criteria does not require that the intensity of the contact(s) meet a respective intensity threshold in order for the criteria to be met does not preclude the simultaneous evaluation of other, intensity-dependent gesture recognition criteria for identifying other gestures whose criteria are met when a gesture includes a contact with an intensity above the respective intensity threshold. For example, in some cases, first gesture recognition criteria for a first gesture, which do not require that the intensity of the contact(s) meet a respective intensity threshold, compete with second gesture recognition criteria for a second gesture, which do depend on the contact(s) reaching the respective intensity threshold. In such competition, the gesture is optionally not recognized as meeting the first gesture recognition criteria for the first gesture if the second gesture recognition criteria for the second gesture are met first. For example, if a contact reaches the respective intensity threshold before moving by a predefined amount of movement, a deep-press gesture is detected rather than a swipe gesture. Conversely, if the contact moves by the predefined amount of movement before reaching the respective intensity threshold, a swipe gesture is detected rather than a deep-press gesture. Even in this case, the first gesture recognition criteria for the first gesture still do not require that the intensity of the contact(s) meet the respective intensity threshold, because if the contact stays below the respective intensity threshold until the gesture ends (e.g., a swipe gesture whose contact never increases in intensity above the respective intensity threshold), the gesture is recognized as a swipe gesture under the first gesture recognition criteria. As such, particular gesture recognition criteria that do not require the intensity of the contact(s) to meet a respective intensity threshold will (A) in some circumstances ignore the intensity of the contact with respect to the intensity threshold (e.g., for a tap gesture) and/or (B) in some circumstances still depend on the intensity of the contact with respect to the intensity threshold, in the sense that the particular gesture recognition criteria (e.g., for a long-press gesture) will fail if a competing set of intensity-dependent gesture recognition criteria (e.g., for a deep-press gesture) recognizes the input as corresponding to an intensity-dependent gesture before the particular gesture recognition criteria recognize the gesture corresponding to the input.
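The race between an intensity-dependent recognizer (deep press) and an intensity-independent one (swipe) can be sketched as follows (hypothetical Swift; thresholds and names are illustrative, and contacts that end without satisfying either criterion are simply reported as undecided):

```swift
// Hypothetical sketch of competing gesture recognition criteria: samples
// arrive in time order, and whichever threshold is crossed first wins.
enum Outcome { case deepPress, swipe, undecided }

func resolve(samples: [(intensity: Double, movement: Double)]) -> Outcome {
    let intensityThreshold = 0.8  // illustrative deep-press threshold
    let movementThreshold = 10.0  // illustrative swipe movement threshold
    for sample in samples {
        if sample.intensity >= intensityThreshold { return .deepPress }  // intensity won
        if sample.movement >= movementThreshold { return .swipe }        // movement won
    }
    return .undecided  // contact ended before either criterion was met
}

print(resolve(samples: [(0.3, 2.0), (0.9, 4.0)]))   // deepPress
print(resolve(samples: [(0.3, 6.0), (0.4, 12.0)]))  // swipe
```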
Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other displays, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual attribute) of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including but not limited to text, web pages, icons (such as user interface objects including soft keys), digital images, video, animation, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is optionally assigned a corresponding code. Graphics module 132 receives, from applications or the like, one or more codes designating the graphics to be displayed, together with coordinate data and other graphic attribute data if needed, and then generates screen image data to output to the display controller 156.
Haptic feedback module 133 includes various software components for generating instructions for use by haptic output generator 167 to generate haptic output at one or more locations on device 100 in response to user interaction with device 100.
Text input module 134, which is optionally a component of graphics module 132, provides a soft keyboard for entering text in various applications (e.g., contacts 137, email 140, IM 141, browser 147, and any other application requiring text input).
The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to phone 138 for use in location-based dialing, to camera 143 as picture/video metadata, and to applications that provide location-based services, such as weather desktop applets, local yellow pages desktop applets, and map/navigation desktop applets).
The application 136 optionally includes the following modules (or instruction sets) or a subset or superset thereof:
a contacts module 137 (sometimes referred to as an address book or contact list);
A telephone module 138;
A video conference module 139;
An email client module 140;
An Instant Messaging (IM) module 141;
a fitness support module 142;
A camera module 143 for still and/or video images;
An image management module 144;
A browser module 147;
a calendar module 148;
Desktop applet modules 149 optionally including one or more of a weather desktop applet 149-1, a stock desktop applet 149-2, a calculator desktop applet 149-3, an alarm desktop applet 149-4, a dictionary desktop applet 149-5 and other desktop applets obtained by a user, and a user-created desktop applet 149-6;
a desktop applet creator module 150 for forming a user-created desktop applet 149-6;
a search module 151;
A video and music player module 152, optionally consisting of a video player module and a music player module;
a notepad module 153;
a map module 154; and/or
An online video module 155.
Examples of other applications 136 optionally stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 includes executable instructions for managing an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding one or more names to the address book; deleting one or more names from the address book; associating one or more telephone numbers, one or more email addresses, one or more physical addresses, or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers and/or email addresses to initiate and/or facilitate communications by phone 138, videoconference 139, email 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, telephone module 138 includes executable instructions for entering a sequence of characters corresponding to a telephone number, accessing one or more telephone numbers in address book 137, modifying the entered telephone numbers, dialing the corresponding telephone numbers, conducting a conversation, and disconnecting or hanging up when the conversation is completed. As described above, wireless communication optionally uses any of a variety of communication standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, one or more optical sensors 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephony module 138, videoconferencing module 139 includes executable instructions for initiating, conducting, and terminating a videoconference between a user and one or more other participants according to user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, email client module 140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 144, the email client module 140 makes it very easy to create and send emails with still or video images captured by the camera module 143.
In conjunction with the RF circuitry 108, the touch-sensitive display system 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134, the instant messaging module 141 includes executable instructions for entering a sequence of characters corresponding to an instant message, modifying previously entered characters, transmitting a respective instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages, or using XMPP, SIMPLE, Apple Push Notification Service (APNs), or IMPS for Internet-based instant messages), receiving instant messages, and viewing received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos, audio files, video files, and/or other attachments as supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, "instant message" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).
In conjunction with the RF circuitry 108, the touch-sensitive display system 112, the display controller 156, the contact module 130, the graphics module 132, the text input module 134, the GPS module 135, the map module 154, and the music player module 146, the workout support module 142 includes executable instructions for creating a workout (e.g., including time, distance, and/or calorie burn targets), communicating with workout sensors (in sports devices and smart watches), receiving workout sensor data, calibrating sensors for monitoring the workout, selecting and playing music for the workout, and displaying, storing, and transmitting workout data.
In conjunction with touch-sensitive display system 112, display controller 156, one or more optical sensors 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions for capturing still images or video (including video streams) and storing them into memory 102, modifying features of the still images or video, and/or deleting the still images or video from memory 102.
In conjunction with the touch-sensitive display system 112, the display controller 156, the contact module 130, the graphics module 132, the text input module 134, and the camera module 143, the image management module 144 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, tagging, deleting, presenting (e.g., in a digital slide or album), and storing still images and/or video images.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions for browsing the Internet (including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages) in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, email client module 140, and browser module 147, calendar module 148 includes executable instructions for creating, displaying, modifying, and storing calendars and data associated with calendars (e.g., calendar entries, to-do items, etc.) according to user instructions.
In conjunction with the RF circuitry 108, the touch-sensitive display system 112, the display system controller 156, the contact module 130, the graphics module 132, the text input module 134, and the browser module 147, the desktop applet modules 149 are mini-applications that are optionally downloaded and used by a user (e.g., weather desktop applet 149-1, stock desktop applet 149-2, calculator desktop applet 149-3, alarm clock desktop applet 149-4, and dictionary desktop applet 149-5) or created by a user (e.g., user-created desktop applet 149-6). In some embodiments, a desktop applet includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a desktop applet includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! desktop applets).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, desktop applet creator module 150 includes executable instructions for creating an applet (e.g., turning a user-specified portion of a web page into the applet).
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions for searching for text, music, sound, images, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) according to user instructions.
In conjunction with the touch-sensitive display system 112, the display system controller 156, the contact module 130, the graphics module 132, the audio circuit 110, the speaker 111, the RF circuit 108, and the browser module 147, the video and music player module 152 includes executable instructions that allow a user to download and playback recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions for displaying, rendering, or otherwise playing back video (e.g., on the touch-sensitive display system 112, or on an external display that is wirelessly connected or via the external port 124). In some embodiments, the device 100 optionally includes the functionality of an MP3 player such as iPod (trademark of Apple inc.).
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notepad module 153 includes executable instructions for creating and managing notepads, to-do lists, and the like in accordance with user instructions.
In conjunction with the RF circuitry 108, the touch-sensitive display system 112, the display system controller 156, the contact module 130, the graphics module 132, the text input module 134, the GPS module 135, and the browser module 147, the map module 154 includes executable instructions for receiving, displaying, modifying, and storing maps and data associated with maps (e.g., driving directions; data related to shops and other points of interest at or near a particular location; and other location-based data) according to user instructions.
In conjunction with the touch-sensitive display system 112, the display system controller 156, the contact module 130, the graphics module 132, the audio circuitry 110, the speaker 111, the RF circuitry 108, the text input module 134, the email client module 140, and the browser module 147, the online video module 155 includes executable instructions that allow a user to access, browse, receive (e.g., by streaming and/or downloading), play back (e.g., on the touch screen 112, or on an external display connected wirelessly or via the external port 124), send emails containing links to particular online videos, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, the instant messaging module 141, rather than the email client module 140, is used to send links to particular online videos.
Each of the above-described modules and applications corresponds to a set of executable instructions for performing one or more of the functions described above, as well as the methods described herein (e.g., computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented in separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures described above. Further, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device on which the operation of a predefined set of functions is performed exclusively by a touch screen and/or touch pad. The number of physical input control devices (such as push buttons, dials, etc.) on the device 100 is optionally reduced by using a touch screen and/or touch pad as the primary input control device for operation of the device 100.
The predefined set of functions performed solely by the touch screen and/or touch pad optionally includes navigation between user interfaces. In some embodiments, the touch pad, when touched by a user, navigates the device 100 from any user interface displayed on the device 100 to a main menu, home menu, or root menu. In such implementations, a "menu button" is implemented using a touch pad. In some other embodiments, the menu button is a physical push button or other physical input control device, rather than a touch pad.
FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments. In some implementations, memory 102 (in fig. 1A) or memory 370 (fig. 3) includes event sorter 170 (e.g., in operating system 126) and respective applications 136-1 (e.g., any of the aforementioned applications 136, 137-155, 380-390).
The event classifier 170 receives event information and determines the application 136-1, and the application view 191 of application 136-1, to which the event information is to be delivered. The event classifier 170 includes an event monitor 171 and an event dispatcher module 174. In some embodiments, the application 136-1 includes an application internal state 192 that indicates the current application view(s) displayed on the touch-sensitive display system 112 when the application is active or executing. In some embodiments, the device/global internal state 157 is used by the event classifier 170 to determine which application or applications are currently active, and the application internal state 192 is used by the event classifier 170 to determine the application view 191 to which to deliver event information.
In some embodiments, the application internal state 192 includes additional information such as one or more of resume information to be used when the application 136-1 resumes execution, user interface state information indicating information being displayed by the application 136-1 or information ready for display by the application, a state queue for enabling a user to return to a previous state or view of the application 136-1, and a repeat/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripheral interface 118. The event information includes information about sub-events (e.g., user touches on the touch sensitive display system 112 as part of a multi-touch gesture). The peripheral interface 118 transmits information it receives from the I/O subsystem 106 or sensors, such as a proximity sensor 166, one or more accelerometers 168, and/or microphone 113 (via audio circuitry 110). The information received by the peripheral interface 118 from the I/O subsystem 106 includes information from the touch-sensitive display system 112 or touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripheral interface 118 at predetermined intervals. In response, the peripheral interface 118 transmits event information. In other embodiments, the peripheral interface 118 transmits event information only if there is a significant event (e.g., an input above a predetermined noise threshold and/or an input exceeding a predetermined duration is received).
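A minimal Swift sketch of such significance filtering (hypothetical; the noise and duration thresholds are illustrative assumptions) might look like:

```swift
// Hypothetical sketch: event information is forwarded only for significant
// events, i.e., inputs above a noise threshold and/or longer than a
// predetermined duration.
func isSignificant(amplitude: Double, duration: Double) -> Bool {
    let noiseFloor = 0.05   // illustrative noise threshold
    let minDuration = 0.02  // illustrative duration threshold (seconds)
    return amplitude > noiseFloor || duration > minDuration
}

print(isSignificant(amplitude: 0.01, duration: 0.005))  // false: filtered out
print(isSignificant(amplitude: 0.30, duration: 0.100))  // true: forwarded
```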
In some implementations, the event classifier 170 also includes a hit view determination module 172 and/or an active event identifier determination module 173.
When the touch sensitive display system 112 displays more than one view, the hit view determination module 172 provides a software process for determining where within one or more views a sub-event has occurred. The view is made up of controls and other elements that are viewable by a user on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes referred to herein as application views or user interface windows, in which information is displayed and touch-based gestures occur. The application view (of the respective application) in which the touch is detected optionally corresponds to a level of programming within the application's programming or view hierarchy. For example, the lowest horizontal view in which a touch is detected is optionally referred to as a hit view, and the set of events that are considered to be correct inputs is optionally determined based at least in part on the hit view of the initial touch that begins a touch-based gesture.
The hit view determination module 172 receives information related to sub-events of a contact-based gesture. When an application has multiple views organized in a hierarchy, the hit view determination module 172 identifies the hit view as the lowest view in the hierarchy that should handle the sub-event. In most circumstances, the hit view is the lowest-level view in which the initiating sub-event (i.e., the first sub-event in the sequence of sub-events that form the event or potential event) occurs. Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
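Hit-view determination as described above amounts to a recursive descent of the view hierarchy; the following Swift sketch (hypothetical; not the actual module) returns the lowest view containing the initial touch point:

```swift
// Hypothetical sketch of hit-view determination: descend the hierarchy and
// return the deepest view whose frame contains the touch point.
struct View {
    let name: String
    let frame: (x: Double, y: Double, width: Double, height: Double)
    var subviews: [View] = []

    func contains(_ p: (x: Double, y: Double)) -> Bool {
        p.x >= frame.x && p.x < frame.x + frame.width &&
        p.y >= frame.y && p.y < frame.y + frame.height
    }
}

func hitView(in view: View, at point: (x: Double, y: Double)) -> View? {
    guard view.contains(point) else { return nil }
    for sub in view.subviews {
        if let hit = hitView(in: sub, at: point) { return hit }
    }
    return view  // no subview contains the point, so this view is the hit view
}

let button = View(name: "button", frame: (10, 10, 80, 30))
let root = View(name: "root", frame: (0, 0, 320, 480), subviews: [button])
print(hitView(in: root, at: (20, 20))?.name ?? "none")  // button
```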
The activity event recognizer determination module 173 determines which view or views within the view hierarchy should receive a particular sequence of sub-events. In some implementations, the active event identifier determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, the activity event recognizer determination module 173 determines that all views including the physical location of the sub-event are actively engaged views, and thus determines that all actively engaged views should receive a particular sequence of sub-events. In other embodiments, even if the touch sub-event is completely confined to the region associated with one particular view, the view higher in the hierarchy will remain the actively engaged view.
The event dispatcher module 174 dispatches event information to an event recognizer (e.g., event recognizer 180). In embodiments that include an active event recognizer determination module 173, the event dispatcher module 174 delivers event information to the event recognizers determined by the active event recognizer determination module 173. In some embodiments, the event dispatcher module 174 stores event information in an event queue that is retrieved by the corresponding event receiver module 182.
In some embodiments, the operating system 126 includes an event classifier 170. Alternatively, the application 136-1 includes an event classifier 170. In another embodiment, the event sorter 170 is a stand-alone module or part of another module stored in the memory 102 (such as the contact/motion module 130).
In some embodiments, the application 136-1 includes a plurality of event handlers 190 and one or more application views 191, where each application view includes instructions for processing touch events that occur within a respective view of the user interface of the application. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, the respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module, such as a user interface toolkit (not shown) or a higher level object from which the application 136-1 inherits methods and other properties. In some implementations, the respective event handler 190 includes one or more of a data updater 176, an object updater 177, a GUI updater 178, and/or event data 179 received from the event classifier 170. Event handler 190 optionally utilizes or invokes data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of application views 191 include one or more corresponding event handlers 190. Additionally, in some implementations, one or more of the data updater 176, the object updater 177, and the GUI updater 178 are included in a respective application view 191.
The corresponding event identifier 180 receives event information (e.g., event data 179) from the event classifier 170 and identifies events from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 further includes at least a subset of metadata 183 and event delivery instructions 188 (which optionally include sub-event delivery instructions).
The event receiver 182 receives event information from the event sorter 170. The event information includes information about sub-events (e.g., touches or touch movements). The event information also includes additional information, such as the location of the sub-event, according to the sub-event. When a sub-event relates to movement of a touch, the event information optionally also includes the rate and direction of the sub-event. In some embodiments, the event includes rotation of the device from one orientation to another orientation (e.g., from a portrait orientation to a landscape orientation, and vice versa), and the event information includes corresponding information about a current orientation of the device (also referred to as a device pose).
The event comparator 184 compares the event information with predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and other events. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across the touch-sensitive display system 112, and a lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
In some implementations, the event definitions 187 include definitions of events for respective user interface objects. In some implementations, the event comparator 184 performs a hit test to determine which user interface object is associated with a sub-event. For example, in an application view that displays three user interface objects on touch-sensitive display system 112, when a touch is detected on touch-sensitive display system 112, event comparator 184 performs a hit test to determine which of the three user interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the results of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object that triggered the hit test.
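The comparison of a received sub-event sequence against an event definition, such as the double tap above, can be sketched as follows (hypothetical Swift; real recognizers track partial matches and phases rather than comparing completed sequences):

```swift
// Hypothetical sketch: an event definition is a predefined sequence of
// sub-events; recognition succeeds when the received sequence matches it.
enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

let doubleTapDefinition: [SubEvent] = [.touchBegin, .touchEnd, .touchBegin, .touchEnd]

func matches(_ received: [SubEvent], definition: [SubEvent]) -> Bool {
    return received == definition
}

print(matches([.touchBegin, .touchEnd, .touchBegin, .touchEnd],
              definition: doubleTapDefinition))  // true: recognized as a double tap
print(matches([.touchBegin, .touchMove, .touchEnd],
              definition: doubleTapDefinition))  // false: not a double tap
```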
In some implementations, the definition of the respective event 187 also includes a delay action that delays delivery of the event information until it has been determined whether the sequence of sub-events does or does not correspond to an event type of the event recognizer.
When the respective event recognizer 180 determines that the sequence of sub-events does not match any of the events in the event definition 186, the respective event recognizer 180 enters an event impossible, event failed, or event end state after which subsequent sub-events of the touch-based gesture are ignored. In this case, the other event recognizers (if any) that remain active for the hit view continue to track and process sub-events of the sustained touch-based gesture.
In some embodiments, the respective event recognizer 180 includes metadata 183 with configurable attributes, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to the actively engaged event recognizer. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate how event recognizers interact or are able to interact with each other. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to different levels in a view or programmatic hierarchy.
In some embodiments, when one or more particular sub-events of an event are recognized, the respective event recognizer 180 activates the event handler 190 associated with the event. In some embodiments, the respective event recognizer 180 delivers event information associated with the event to the event handler 190. Activating an event handler 190 is distinct from sending (and deferring the sending of) sub-events to a respective hit view. In some embodiments, the event recognizer 180 throws a flag associated with the recognized event, and the event handler 190 associated with the flag catches the flag and performs a predefined process.
In some implementations, the event delivery instructions 188 include sub-event delivery instructions that deliver event information about the sub-event without activating the event handler. Instead, the sub-event delivery instructions deliver the event information to an event handler associated with the sub-event sequence or to an actively engaged view. Event handlers associated with the sequence of sub-events or with the actively engaged views receive the event information and perform a predetermined process.
In some embodiments, the data updater 176 creates and updates data used in the application 136-1. For example, the data updater 176 updates a telephone number used in the contact module 137 or stores a video file used in the video player module 145. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, the object updater 177 creates a new user interface object or updates the location of the user interface object. GUI updater 178 updates the GUI. For example, the GUI updater 178 prepares the display information and sends it to the graphics module 132 for display on a touch-sensitive display.
In some embodiments, one or more event handlers 190 include or have access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, the data updater 176, the object updater 177, and the GUI updater 178 are included in a single module of the respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It should be appreciated that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user input for operating the multifunction device 100 with input devices, not all of which are initiated on touch screens. For example, mouse movements and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements on a touchpad, such as taps, drags, scrolls, etc.; stylus inputs; movements of the device; verbal instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events that define an event to be recognized.
Fig. 2 illustrates a portable multifunction device 100 having a touch screen (e.g., touch-sensitive display system 112 in fig. 1A) in accordance with some embodiments. The touch screen optionally displays one or more graphics within a user interface (UI) 200. In this embodiment, as well as in other embodiments described below, a user is able to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward), and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with the device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application.
The device 100 optionally also includes one or more physical buttons, such as a "home" button or menu button 204. As previously described, menu button 204 is optionally used to navigate to any application 136 in a set of applications that are optionally executed on device 100. Alternatively, in some embodiments, the menu buttons are implemented as soft keys in a GUI displayed on the touch screen display 112.
In some embodiments, the device 100 includes a touch screen display, menu button 204, push button 206 for powering the device on/off and for locking the device, one or more volume adjustment buttons 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and a docking/charging external port 124. The push button 206 is optionally used to turn the power on/off on the device by depressing the button and holding it in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing it before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In some embodiments, the device 100 also accepts verbal input through the microphone 113 for activating or deactivating some functions. The device 100 also optionally includes one or more contact intensity sensors 165 for detecting intensities of contacts on the touch-sensitive display system 112, and/or one or more tactile output generators 167 for generating tactile outputs for a user of the device 100.
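The press-and-hold timing described above might be modeled as in this Swift sketch (hypothetical; the 3-second interval and names are illustrative assumptions):

```swift
import Foundation

// Hypothetical sketch: releasing the push button before a predefined
// interval locks the device; holding it past the interval toggles power.
enum ButtonAction { case lockDevice, togglePower }

func action(forHoldDuration t: TimeInterval) -> ButtonAction {
    let powerHoldInterval: TimeInterval = 3.0  // illustrative predefined interval
    return t >= powerHoldInterval ? .togglePower : .lockDevice
}

print(action(forHoldDuration: 0.4))  // lockDevice: released before the interval
print(action(forHoldDuration: 3.5))  // togglePower: held past the interval
```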
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. The device 300 need not be portable. In some embodiments, the device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child learning toy), a gaming system, or a control device (e.g., a home controller or an industrial controller). The device 300 generally includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication bus 320 optionally includes circuitry (sometimes called a chipset) that interconnects and controls communications between system components. The device 300 includes an input/output (I/O) interface 330 with a display 340, typically a touch screen display. The I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and a touch pad 355, a tactile output generator 357 (e.g., similar to the one or more tactile output generators 167 described above with reference to fig. 1A) for generating tactile outputs on the device 300, a sensor 359 (e.g., an optical sensor, an acceleration sensor, a proximity sensor, a touch-sensitive sensor, and/or a contact intensity sensor similar to the one or more contact intensity sensors 165 described above with reference to fig. 1A). Memory 370 includes high-speed random access memory such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices, and optionally includes non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state memory devices. Memory 370 optionally includes one or more storage devices located remotely from one or more CPUs 310. In some embodiments, memory 370 stores programs, modules, and data structures similar to those stored in memory 102 of portable multifunction device 100 (fig. 1A), or a subset thereof. Further, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk editing module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (fig. 1A) optionally does not store these modules.
Each of the above identified elements in fig. 3 is optionally stored in one or more of the previously mentioned memory devices. Each of the identified modules corresponds to a set of instructions for performing the functions described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures described above. Further, memory 370 optionally stores additional modules and data structures not described above.
Attention is now directed to embodiments of a user interface ("UI") optionally implemented on the portable multifunction device 100.
Fig. 4A illustrates an exemplary user interface of an application menu on the portable multifunction device 100 in accordance with some embodiments. A similar user interface is optionally implemented on device 300. In some embodiments, the user interface 400 includes the following elements, or a subset or superset thereof:
one or more signal strength indicators 402 for one or more wireless communications, such as cellular signals and Wi-Fi signals;
time 404;
A bluetooth indicator 405;
A battery status indicator 406;
a tray 408 with commonly used application icons such as:
an icon 416 labeled "phone" of phone module 138, optionally including an indicator 414 of the number of missed or voicemail messages;
An icon 418 of the email client module 140 labeled "mail," optionally including an indicator 410 of the number of unread emails;
an icon 420 labeled "browser" of browser module 147, and
An icon 422 labeled "iPod" of the video and music player module 152 (also known as iPod (trademark of Apple inc.) module 152), and
Icons of other applications, such as:
icon 424 of IM module 141 labeled "message";
Icon 426 of calendar module 148 labeled "calendar";
icon 428 of image management module 144 labeled "photo";
an icon 430 labeled "camera" of camera module 143;
icon 432 of online video module 155 labeled "online video";
Icon 434 labeled "stock market" for stock market desktop applet 149-2;
Icon 436 of map module 154 labeled "map";
icon 438 labeled "weather" for weather desktop applet 149-1;
an icon 440 labeled "clock" for the alarm desktop applet 149-4;
an icon 442 labeled "fitness support" for fitness support module 142;
icon 444 labeled "notepad" of notepad module 153, and
An icon 446 of a setup application or module that provides access to the settings of the device 100 and its various applications 136.
It should be noted that the iconic labels shown in fig. 4A are merely exemplary. For example, in some embodiments, the icon 422 of the video and music player module 152 is labeled "music" or "music player". Other labels are optionally used for various application icons. In some embodiments, the label of the respective application icon includes a name of the application corresponding to the respective application icon. In some embodiments, the label of a particular application icon is different from the name of the application corresponding to the particular application icon.
Fig. 4B illustrates an exemplary user interface on a device (e.g., device 300 in fig. 3) having a touch-sensitive surface 451 (e.g., tablet or touchpad 355 in fig. 3) separate from display 450. The device 300 also optionally includes one or more contact intensity sensors (e.g., one or more of the sensors 357) for detecting the intensity of contacts on the touch-sensitive surface 451 and/or one or more tactile output generators 357 for generating tactile outputs for a user of the device 300.
Although some examples will be given later with reference to inputs on touch screen display 112 (where the touch sensitive surface and the display are combined), in some embodiments the device detects inputs on a touch sensitive surface separate from the display, as shown in fig. 4B. In some implementations, the touch-sensitive surface (e.g., 451 in fig. 4B) has a primary axis (e.g., 452 in fig. 4B) that corresponds to the primary axis (e.g., 453 in fig. 4B) on the display (e.g., 450). According to these implementations, the device detects contacts (e.g., 460 and 462 in fig. 4B) with the touch-sensitive surface 451 at locations corresponding to respective locations on the display (e.g., 460 corresponds to 468 and 462 corresponds to 470 in fig. 4B). In this way, user inputs (e.g., contacts 460 and 462 and movement thereof) detected by the device on the touch-sensitive surface (e.g., 451 in FIG. 4B) are used by the device to manipulate a user interface on the display (e.g., 450 in FIG. 4B) of the multifunction device when the touch-sensitive surface is separated from the display. It should be appreciated that similar approaches are optionally used for other user interfaces described herein.
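The correspondence between locations on a separate touch-sensitive surface and locations on the display can be sketched as a proportional mapping along the two primary axes (hypothetical Swift; the actual mapping may differ):

```swift
// Hypothetical sketch: map a contact on a separate touch-sensitive surface
// to the display by scaling along the corresponding primary axes.
struct Extent { let width: Double; let height: Double }

func mapToDisplay(touch: (x: Double, y: Double),
                  surface: Extent, display: Extent) -> (x: Double, y: Double) {
    // Normalize against the surface, then scale into display coordinates.
    return (touch.x / surface.width * display.width,
            touch.y / surface.height * display.height)
}

let mapped = mapToDisplay(touch: (x: 50, y: 20),
                          surface: Extent(width: 100, height: 80),
                          display: Extent(width: 400, height: 320))
print(mapped)  // (x: 200.0, y: 80.0)
```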
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, single-finger tap gestures, finger swipe gestures, etc.), it should be understood that, in some embodiments, one or more of these finger inputs are replaced with input from another input device (e.g., a mouse-based input or a stylus input). For example, a swipe gesture is optionally replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is optionally replaced with a mouse click while the cursor is located over the position of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are detected simultaneously, it should be understood that multiple computer mice are optionally used simultaneously, or that a mouse and finger contacts are optionally used simultaneously.
As used herein, the term "focus selector" is an input element that indicates the current portion of a user interface with which a user is interacting. In some implementations that include a cursor or other position marker, the cursor acts as a "focus selector" such that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touch pad 355 in fig. 3 or touch-sensitive surface 451 in fig. 4B) while the cursor is hovering over the particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted according to the detected input. In some implementations including a touch screen display (e.g., touch sensitive display system 112 in fig. 1A or the touch screen in fig. 4A) that enables direct interaction with user interface elements on the touch screen display, the detected contact on the touch screen acts as a "focus selector" such that when an input (e.g., a press input by a contact) is detected on the touch screen display at the location of a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted according to the detected input. In some implementations, the focus moves from one area of the user interface to another area of the user interface without a corresponding movement of the cursor or movement of contact on the touch screen display (e.g., by moving the focus from one button to another using tab or arrow keys), in which the focus selector moves according to focus movement between the different areas of the user interface. Regardless of the particular form that the focus selector takes, the focus selector is typically a user interface element (or contact on a touch screen display) that is controlled by the user to communicate the user's intended interaction with the user interface (e.g., by indicating to the device the element with which the user of the user interface intends to interact). For example, upon detection of a press input on a touch-sensitive surface (e.g., a touchpad or touch screen), the position of a focus selector (e.g., a cursor, contact, or selection box) over a respective button will indicate that the user desires to activate the respective button (rather than other user interface elements shown on the device display).
As used in this specification and the claims, the term "intensity" of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact or stylus contact) on the touch-sensitive surface, or to a substitute (surrogate) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (e.g., at least 256). The intensity of a contact is optionally determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are optionally used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average or a sum) to determine an estimated force of contact. Similarly, a pressure-sensitive tip of a stylus is optionally used to determine the pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are optionally used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows the user to access additional device functions that would otherwise not be readily accessible on a reduced-size device with limited area for displaying affordances and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
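To make the surrogate-combination idea above concrete, the following Swift sketch illustrates one plausible weighted-average scheme; the types, weights, and units are hypothetical illustrations for this description, not part of any actual device API.

```swift
import Foundation

// Hypothetical reading from one force sensor: a raw surrogate value plus a
// weight reflecting, e.g., the sensor's proximity to the contact location.
struct ForceSample {
    let rawForce: Double
    let weight: Double
}

// Combine several sensor readings into a single estimated force by
// weighted average, one of the combination schemes mentioned above.
func estimatedForce(from samples: [ForceSample]) -> Double {
    let totalWeight = samples.reduce(0.0) { $0 + $1.weight }
    guard totalWeight > 0 else { return 0 }
    let weightedSum = samples.reduce(0.0) { $0 + $1.rawForce * $1.weight }
    return weightedSum / totalWeight
}

// A surrogate measurement can be compared directly against an intensity
// threshold expressed in the same surrogate units.
func exceedsIntensityThreshold(_ samples: [ForceSample], threshold: Double) -> Bool {
    return estimatedForce(from: samples) >= threshold
}
```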
In some embodiments, the contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by the user (e.g., to determine whether the user has "clicked" on an icon). In some implementations, at least a subset of the intensity thresholds is determined in accordance with software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and may be adjusted without changing the physical hardware of the device 100). For example, the mouse "click" threshold of a touchpad or touch screen display may be set to any of a large range of predefined thresholds without changing the touchpad or touch screen display hardware. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click "intensity" parameter).
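Because the thresholds are software parameters rather than properties of a physical actuator, they can be modeled as plain mutable values. A minimal sketch, assuming arbitrary normalized units and hypothetical names:

```swift
// Software-defined intensity thresholds (normalized 0...1 units, assumed).
struct IntensityThresholds {
    var contactDetection: Double = 0.05   // IT 0
    var lightPress: Double = 0.30         // IT L
    var deepPress: Double = 0.60          // IT D

    // Adjust several thresholds at once with a single system-level
    // "click intensity" style parameter (e.g., 0.5 = lighter, 2.0 = firmer).
    mutating func applySystemScale(_ scale: Double) {
        lightPress *= scale
        deepPress *= scale
    }
}
```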
As used in the specification and claims, the term "characteristic intensity" of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is optionally based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined time period (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, or 10 seconds) relative to a predefined event (e.g., after detecting the contact, before or after detecting liftoff of the contact, before or after detecting a start of movement of the contact, before or after detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). The characteristic intensity of a contact is optionally based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, a top 10-percentile value of the intensities of the contact, a value at half maximum of the intensities of the contact, a value at 90 percent maximum of the intensities of the contact, and the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by the user. For example, the set of one or more intensity thresholds may include a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold but does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second threshold results in a third operation. In some implementations, a comparison between the characteristic intensity and one or more intensity thresholds is used to determine whether to perform one or more operations at all (e.g., whether to perform a respective operation or to forgo performing the respective operation), rather than to determine whether to perform a first operation or a second operation.
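The statistics listed above are straightforward to express over a window of intensity samples. The following Swift sketch is illustrative only; the sampling window, units, and names are assumptions:

```swift
// Candidate "characteristic intensity" statistics over a sample window.
enum CharacteristicIntensity {
    static func maximum(_ samples: [Double]) -> Double {
        return samples.max() ?? 0
    }

    static func mean(_ samples: [Double]) -> Double {
        return samples.isEmpty ? 0 : samples.reduce(0, +) / Double(samples.count)
    }

    // Top-10-percentile value: the intensity that only 10% of samples exceed.
    static func top10Percentile(_ samples: [Double]) -> Double {
        guard !samples.isEmpty else { return 0 }
        let sorted = samples.sorted()
        return sorted[Int(Double(sorted.count - 1) * 0.9)]
    }
}

// Comparing a characteristic intensity against two thresholds selects among
// three operations, mirroring the example in the paragraph above.
func selectOperation(characteristicIntensity: Double,
                     firstThreshold: Double,
                     secondThreshold: Double) -> String {
    if characteristicIntensity > secondThreshold { return "third operation" }
    if characteristicIntensity > firstThreshold { return "second operation" }
    return "first operation"
}
```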
In some implementations, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, a touch-sensitive surface may receive a continuous swipe contact that transitions from a start location to an end location (e.g., a drag gesture), at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end location may be based on only a portion of the continuous swipe contact, rather than the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some implementations, a smoothing algorithm may be applied to the intensities of the swipe contact prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining the characteristic intensity.
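Two of the named smoothing algorithms are sketched below; window sizes are arbitrary illustrative choices. The median filter in particular removes the narrow spikes and dips mentioned above without flattening sustained intensity changes.

```swift
// Unweighted sliding-average smoothing over a trailing window.
func unweightedMovingAverage(_ samples: [Double], window: Int = 3) -> [Double] {
    guard window > 0 else { return samples }
    return samples.indices.map { i -> Double in
        let lo = max(0, i - window + 1)
        let slice = samples[lo...i]
        return slice.reduce(0, +) / Double(slice.count)
    }
}

// Median filter smoothing over a centered window.
func medianFilter(_ samples: [Double], window: Int = 3) -> [Double] {
    return samples.indices.map { i -> Double in
        let lo = max(0, i - window / 2)
        let hi = min(samples.count - 1, i + window / 2)
        let sorted = samples[lo...hi].sorted()
        return sorted[sorted.count / 2]
    }
}
```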
The user interface figures described herein optionally include various intensity diagrams that show the current intensity of the contact on the touch-sensitive surface relative to one or more intensity thresholds (e.g., a contact detection intensity threshold IT 0, a light press intensity threshold IT L, a deep press intensity threshold IT D (e.g., that is at least initially higher than IT L), and/or one or more other intensity thresholds (e.g., an intensity threshold IT H that is lower than IT L)). These intensity diagrams are typically not part of the displayed user interface, but are provided to aid in the interpretation of the figures. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a touchpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from the operations typically associated with clicking a button of a physical mouse or a touchpad. In some implementations, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact detection intensity threshold IT 0, below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
In some embodiments, the response of the device to inputs detected by the device depends on criteria based on the contact intensity during the input. For example, for some "light press" inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response. In some embodiments, the response of the device to inputs detected by the device depends on criteria that include both the contact intensity during the input and time-based criteria. For example, for some "deep press" inputs, the intensity of a contact exceeding a second intensity threshold during the input, greater than the first intensity threshold for a light press, triggers a second response only if a delay time has elapsed between meeting the first intensity threshold and meeting the second intensity threshold. This delay time is typically less than 200 ms in duration (e.g., 40, 100, or 120 ms, depending on the magnitude of the second intensity threshold, with the delay time increasing as the second intensity threshold increases). This delay time helps to avoid accidental deep press inputs. As another example, for some "deep press" inputs, there is a reduced-sensitivity time period that occurs after the time at which the first intensity threshold is met. During the reduced-sensitivity time period, the second intensity threshold is increased. This temporary increase in the second intensity threshold also helps to avoid accidental deep press inputs. For other deep press inputs, the response to detection of a deep press input does not depend on time-based criteria.
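The interplay of the delay time and the reduced-sensitivity period can be captured in a small decision function. A minimal sketch, with all constants assumed for illustration:

```swift
import Foundation

// Time-based deep press criteria: the second threshold only counts after a
// delay has elapsed since the first threshold was met, and is temporarily
// raised during a reduced-sensitivity period. Constants are illustrative.
struct DeepPressCriteria {
    let delay: TimeInterval = 0.10                 // e.g., 100 ms
    let reducedSensitivityPeriod: TimeInterval = 0.15
    let baseSecondThreshold: Double = 0.60
    let raisedSecondThreshold: Double = 0.80

    /// `t` is the time elapsed since the first intensity threshold was met.
    func deepPressTriggered(intensity: Double, t: TimeInterval) -> Bool {
        guard t >= delay else { return false }     // delay time not yet elapsed
        let threshold = (t < reducedSensitivityPeriod)
            ? raisedSecondThreshold                // temporarily increased
            : baseSecondThreshold
        return intensity >= threshold
    }
}
```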
In some embodiments, one or more of the input intensity thresholds and/or the corresponding outputs vary based on one or more factors, such as user settings, contact motion, input timing, application execution, rate at which the intensity is applied, number of concurrent inputs, user history, environmental factors (e.g., ambient noise), focus selector position, and the like. Exemplary factors are described in U.S. patent applications 14/399,606 and 14/624,296, which are incorporated by reference herein in their entireties.
For example, fig. 4C illustrates a dynamic intensity threshold 480 that changes over time based in part on the intensity of touch input 476 over time. Dynamic intensity threshold 480 is a sum of two components: a first component 474 that decays over time after a predefined delay time p1 from when touch input 476 is initially detected, and a second component 478 that tracks the intensity of touch input 476 over time. The initial high intensity threshold of the first component 474 reduces accidental triggering of a "deep press" response, while still allowing an immediate "deep press" response if touch input 476 provides sufficient intensity. The second component 478 reduces unintentional triggering of a "deep press" response by gradual intensity fluctuations in a touch input. In some implementations, when touch input 476 satisfies dynamic intensity threshold 480 (e.g., at point 481 in fig. 4C), the "deep press" response is triggered.
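The two-component structure of threshold 480 can be stated directly: a decaying boost plus a term that tracks the touch intensity. The sketch below is one plausible reading of fig. 4C; the decay curve and all constants are assumptions for illustration:

```swift
import Foundation

// Dynamic intensity threshold as in fig. 4C: a first component that decays
// after delay p1, plus a second component proportional to a lagged copy of
// the touch intensity. All constants are illustrative assumptions.
struct DynamicIntensityThreshold {
    let initialBoost: Double = 0.5      // starting height of component 474
    let decayRate: Double = 3.0         // decay per second after p1
    let p1: TimeInterval = 0.08         // predefined delay before decay
    let trackingFraction: Double = 0.75 // component 478 tracks the intensity

    func value(at t: TimeInterval, laggedIntensity: Double) -> Double {
        let firstComponent = (t <= p1)
            ? initialBoost
            : initialBoost * exp(-decayRate * (t - p1))
        let secondComponent = trackingFraction * laggedIntensity
        return firstComponent + secondComponent
    }
}

// A "deep press" fires when the current intensity meets the dynamic sum.
func deepPressResponse(intensity: Double, threshold: Double) -> Bool {
    return intensity >= threshold
}
```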
Fig. 4D shows another dynamic intensity threshold 486 (e.g., intensity threshold I D). Fig. 4D also shows two other intensity thresholds: a first intensity threshold I H and a second intensity threshold I L. In fig. 4D, although touch input 484 satisfies the first intensity threshold I H and the second intensity threshold I L prior to time p2, no response is provided until delay time p2 has elapsed at time 482. Also in fig. 4D, dynamic intensity threshold 486 decays over time, with the decay starting at time 488 after a predefined delay time p1 has elapsed from time 482 (when the response associated with the second intensity threshold I L was triggered). This type of dynamic intensity threshold reduces accidental triggering of a response associated with the dynamic intensity threshold I D immediately after, or concurrently with, triggering a response associated with a lower intensity threshold, such as the first intensity threshold I H or the second intensity threshold I L.
Fig. 4E shows yet another dynamic intensity threshold 492 (e.g., intensity threshold I D). In fig. 4E, after a delay time p2 has elapsed from when the touch input 490 was initially detected, a response associated with an intensity threshold I L is triggered. Meanwhile, the dynamic intensity threshold 492 decays after a predefined delay time p1 has elapsed since the initial detection of the touch input 490. Thus, without releasing the touch input 490, the intensity of the touch input 490 decreases after triggering the response associated with the intensity threshold I L, followed by an increase in the intensity of the touch input 490, which may trigger the response associated with the intensity threshold I D (e.g., at time 494), even when the intensity of the touch input 490 is below another intensity threshold (e.g., the intensity threshold I L).
An increase of the characteristic intensity of a contact from an intensity below the light press intensity threshold IT L to an intensity between the light press intensity threshold IT L and the deep press intensity threshold IT D is sometimes referred to as a "light press" input. An increase of the characteristic intensity of a contact from an intensity below the deep press intensity threshold IT D to an intensity above the deep press intensity threshold IT D is sometimes referred to as a "deep press" input. An increase of the characteristic intensity of a contact from an intensity below the contact detection intensity threshold IT 0 to an intensity between the contact detection intensity threshold IT 0 and the light press intensity threshold IT L is sometimes referred to as detecting the contact on the touch surface. A decrease of the characteristic intensity of a contact from an intensity above the contact detection intensity threshold IT 0 to an intensity below the contact detection intensity threshold IT 0 is sometimes referred to as detecting liftoff of the contact from the touch surface. In some embodiments, IT 0 is zero. In some embodiments, IT 0 is greater than zero. In some illustrations, a shaded circle or oval is used to represent the intensity of a contact on the touch-sensitive surface. In some illustrations, an unshaded circle or oval is used to represent a respective contact on the touch-sensitive surface without specifying the intensity of the respective contact.
In some implementations described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting a respective press input performed with a respective contact (or contacts), wherein a respective press input is detected based at least in part on detecting an increase in intensity of the contact (or contacts) above a press input intensity threshold. In some implementations, the respective operation is performed in response to detecting that the intensity of the respective contact increases above a press input intensity threshold (e.g., the respective operation is performed on a "downstroke" of the respective press input). In some implementations, the press input includes an increase in intensity of the respective contact above a press input intensity threshold and a subsequent decrease in intensity of the contact below the press input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press input threshold (e.g., the respective operation is performed on an "upstroke" of the respective press input).
In some implementations, the device employs intensity hysteresis to avoid accidental inputs, sometimes referred to as "jitter," where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press input intensity threshold, or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., the respective operation is performed on an "upstroke" of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity threshold, and the respective operation is performed in response to detecting the press input (e.g., the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
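Expressed as a small state machine, hysteresis means the press commits at one threshold and releases at a lower one, so intensity jitter near the press threshold cannot produce spurious up/down events. A sketch with assumed values:

```swift
// Press detection with intensity hysteresis, suppressing "jitter".
struct PressDetector {
    let pressThreshold: Double = 0.50
    let hysteresisThreshold: Double = 0.375   // e.g., 75% of press threshold
    private(set) var isPressed = false

    // Returns "downstroke" when the press begins, "upstroke" when it ends.
    mutating func update(intensity: Double) -> String? {
        if !isPressed && intensity > pressThreshold {
            isPressed = true
            return "downstroke"
        }
        if isPressed && intensity < hysteresisThreshold {
            isPressed = false
            return "upstroke"
        }
        return nil   // small fluctuations between thresholds are ignored
    }
}
```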
For ease of explanation, the descriptions of operations performed in response to a press input associated with a press input intensity threshold, or in response to a gesture including a press input, are optionally triggered in response to detecting: an increase in intensity of a contact above the press input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press input intensity threshold, a decrease in intensity of the contact below the press input intensity threshold, or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press input intensity threshold, the operation is optionally performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold that corresponds to, and is lower than, the press input intensity threshold. As described above, in some embodiments, the triggering of these responses also depends on time-based criteria being met (e.g., a delay time has elapsed between a first intensity threshold being met and a second intensity threshold being met).
User interfaces and associated processes
Attention is now directed to embodiments of a user interface ("UI") and associated processes that may be implemented on an electronic device, such as portable multifunction device 100 or device 300, having a display, a touch-sensitive surface, and one or more sensors for detecting the intensity of contact with the touch-sensitive surface.
Fig. 5A-5 BJ illustrate exemplary user interfaces for accessing common device functions according to some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 6A-6E, 7A-7E, 8A-8C, and 9A-9E. Although some examples will be given later with reference to inputs on a touch screen display (where the touch sensitive surface and display are combined), in some embodiments the device detects inputs on a touch sensitive surface 451 separate from the display 450, as shown in fig. 4B.
Fig. 5A-1 through 5A-4 illustrate activation of the display (e.g., touch-sensitive display 112) in response to a change in orientation of the device 100. In some implementations, the device uses one or more sensors (e.g., an accelerometer, gyroscope, audio sensor, thermal sensor, and/or light sensor) to determine whether the orientation of the device has changed. For example, the device determines whether the device has been rotated by more than a threshold angle (e.g., rotated along an axis of the device, such as tilted from a position in which the device is substantially horizontal to a position in which the device is substantially vertical). In fig. 5A-1, the device is held flat in the user's hand 502, such that the device display is substantially horizontal. In fig. 5A-2, the device is tilted such that the display is more vertical than in fig. 5A-1. Because the tilt angle of the device has not increased above a threshold tilt angle in fig. 5A-2, the display is not activated. In fig. 5A-3, the device is tilted such that the display is more vertical than in fig. 5A-2. Because the tilt angle of the device has increased above the threshold tilt angle in fig. 5A-3, the display is activated (e.g., display content 504 is shown by the display). Display content 504 includes, for example, a time, a date, a signal indicator, a battery charge level indicator, a lock indicator, and/or one or more notifications (such as notification 506). In fig. 5A-4, the display is substantially vertical. In some embodiments, when the display is activated (e.g., as shown in fig. 5A-3), some or all of the display content 504 is shown at a smaller size (e.g., a narrower width) than a default size of the display content 504. As the device continues to tilt after the display is activated, the size of the display content 504 gradually increases to the default size, as shown in fig. 5A-4.
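One way to read the tilt-to-wake behavior of figs. 5A-1 through 5A-4: the display activates past a threshold tilt angle, and the content scale ramps from a reduced size to the default size as tilting continues. The angles and scale in the sketch below are illustrative assumptions, not values from the embodiment:

```swift
// Tilt-to-wake sketch: activation threshold plus gradual content growth.
struct TiltToWake {
    let thresholdDegrees: Double = 45    // tilt at which the display activates
    let fullSizeDegrees: Double = 80     // tilt at which content is full size
    let minimumScale: Double = 0.6       // content starts narrower than default

    func displayActive(tiltDegrees: Double) -> Bool {
        return tiltDegrees > thresholdDegrees
    }

    // Content scale ramps from minimumScale at activation up to 1.0 (default).
    func contentScale(tiltDegrees: Double) -> Double {
        guard displayActive(tiltDegrees: tiltDegrees) else { return 0 }
        let progress = min(1, (tiltDegrees - thresholdDegrees)
                              / (fullSizeDegrees - thresholdDegrees))
        return minimumScale + (1 - minimumScale) * progress
    }
}
```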
Fig. 5B-5E illustrate user interfaces displayed in response to input provided by contact 508 at physical button 204 (e.g., a "home" or menu button). In fig. 5B, the physical button 204 detects a contact 508 that activates but does not actuate the physical button (e.g., when the contact 508 rests on the surface of the physical button without depressing it). For example, the physical button is activated (but not actuated) upon detecting a contact 508 at the physical button 204 that does not meet the criteria for actuating the physical button 204 (e.g., the characteristic intensity of the contact 508 exceeds the contact detection intensity threshold IT 0 but does not exceed the actuation threshold IT A, as shown by the actuation meter 510). In some embodiments, activation of the physical button 204 wakes the device and/or the touch screen display 112. For example, in response to activation of the physical button 204, the device activates the display and displays a wake screen user interface 512, as shown in fig. 5C. In other words, in response to an input that activates but does not actuate the physical button, the device is in a woken (display-on) state but is not fully activated.
In some embodiments, in response to a contact meeting the criteria for actuating the physical button (e.g., when the intensity of the contact 508 with the physical button 204 exceeds the actuation threshold IT A, as shown by the actuation meter 510 of fig. 5D), the device 100 displays a user interface that includes a plurality of application icons (e.g., application icons 416-446), such as the application springboard user interface 400, as shown in fig. 5E. For example, in response to an input that actuates the physical button, the device is fully activated.
In some embodiments, the physical button 204 includes a biometric sensor, such as a fingerprint sensor for a user verification scan (e.g., a Touch ID scan). In some embodiments, activation and/or actuation of the physical button unlocks the device, in accordance with successful authentication of the user, e.g., when the biometric sensor of the physical button recognizes a fingerprint that corresponds to the contact 508. For example, in response to an activation of the physical button 204 during which successful user authentication occurred, the unlocked state of the lock icon 514 is displayed in the wake screen interface 512 of fig. 5C.
In some embodiments, the device 100 includes a virtual button (e.g., rather than a physical button) that functions as a home or menu button 204. While button 204 is described herein, it should be understood that a mechanical button, a virtual button, or another type of button may be used. For example, the virtual button is activated in accordance with a determination that a characteristic intensity of a contact with the virtual button does not exceed an actuation threshold, and the virtual button is actuated in accordance with a determination that the characteristic intensity of the contact with the virtual button exceeds the actuation threshold. In some embodiments, the virtual button includes a biometric sensor for user authentication.
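The activation/actuation distinction for a virtual button reduces to comparing the contact's characteristic intensity against two thresholds. A minimal sketch with hypothetical values:

```swift
enum ButtonResponse { case none, activated, actuated }

// Below the actuation threshold the button activates (wakes the device);
// above it the button actuates (fully activates the device).
func virtualButtonResponse(characteristicIntensity: Double,
                           contactDetectionThreshold: Double = 0.05,
                           actuationThreshold: Double = 0.50) -> ButtonResponse {
    if characteristicIntensity > actuationThreshold { return .actuated }
    if characteristicIntensity > contactDetectionThreshold { return .activated }
    return .none
}
```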
Fig. 5F-1 through 5F-4 illustrate a sequence of user interfaces displayed in response to a detected input when the device 100 is in a locked mode (e.g., as shown by the locked state of the lock icon 514). In FIG. 5F-1, a wake screen interface 512 is shown. In some implementations, the locked mode of the wake screen interface 512 is displayed, for example, in response to an input at the push button 206 to turn the device on/off and lock the device, in response to an input at the physical button 204 that does not unlock the device (e.g., an input by a user whose fingerprint is not recognized by a fingerprint scanner of the physical button 204), and/or in response to an input at the virtual button that activates the device display but does not unlock the device.
Fig. 5F-1 illustrates a swipe down gesture, in which the contact 516 moves along a path indicated by arrow 518. In response to the swipe down gesture, a lock mode of notification user interface 520 is displayed, as shown in fig. 5F-2. In some embodiments, as the swipe down occurs, an animation is displayed in which notification user interface 520 appears to gradually descend from the top of the display.
In response to input of the contact 522 detected at a location corresponding to the search input area 524 of fig. 5F-2, a user interface 526 with a keyboard 528 is displayed, as shown in fig. 5F-3.
In fig. 5F-3, a search term ("movie") has been entered in the search input area 524 (e.g., using the keyboard 528). In response to an input that initiates a search (e.g., input of a contact 530 detected at an affordance corresponding to the "return" key of the keyboard 528), a lock mode of the search results user interface 532 is displayed, as shown in fig. 5F-4.
The user may wish to limit the information displayed when the device is locked so that others cannot see private and/or sensitive information. In some embodiments, the device is configurable such that particular types of notifications, notification content, and/or search results displayed when the device is unlocked are not visible when the device is locked. Fig. 5F-5 through 5F-8 illustrate additional content displayed in an unlocked mode of notification user interface 520 and search results user interface 532.
Fig. 5F-5 through 5F-8 illustrate a sequence of user interfaces displayed in response to detected inputs while the device 100 is in an unlocked mode (e.g., as shown by the unlocked state of the lock icon 514). In some implementations, the unlock mode of the wake screen interface 512 is displayed, for example, in response to an input at the physical button 204 that unlocks the device (e.g., an input by a user whose fingerprint is recognized by the fingerprint scanner of the physical button 204).
Fig. 5F-5 illustrates a swipe down gesture, in which contact 534 moves along the path indicated by arrow 536. In response to the swipe down gesture, an unlock mode of notification user interface 520 is displayed, as shown in fig. 5F-6.
In response to detecting an input of the contact 538 at a location corresponding to the search input area 524 of FIGS. 5F-6, a user interface 526 with a keyboard 528 is displayed, as shown in FIGS. 5F-7.
In fig. 5F-7, a search term ("movie") has been entered in the search input area 524 (e.g., using the keyboard 528). In response to an input that initiates a search (e.g., input of a contact 540 detected at the "return" key affordance of the keyboard 528), an unlock mode of the search results user interface 532 is displayed, as shown in fig. 5F-8.
Fig. 5G illustrates a lock mode of notification user interface 520 (e.g., as indicated by the lock state of lock icon 514). The locked mode of message notification 542 and the locked mode of voicemail notification 544 are displayed in the locked mode of notification user interface 520.
Fig. 5H illustrates an unlock mode of notification user interface 520 (e.g., as indicated by the unlocked state of lock icon 514). The unlock mode of the message notification 542 and the unlock mode of the voicemail notification 544 are displayed in the unlock mode of notification user interface 520. In contrast to the locked mode of message notification 542, the unlock mode of message notification 542 includes additional information, such as message content (e.g., text 546 and/or image 548 from one or more messages). The unlock mode of notification user interface 520 also displays an invitation notification 550 that is not displayed in the lock mode of notification user interface 520.
FIG. 5I illustrates a lock mode of the search results user interface 532 (e.g., as indicated by the locked state of the lock icon 514). Search results displayed in the locked mode of search results user interface 532 that input the search term "movie" in search input area 524 include, for example, location-based results (e.g., suggested movies played near the location of the device, location suggestions for movie theatres near the location of the device), application results (iMovie), and suggested website results.
Fig. 5J illustrates an unlock mode of the search results user interface 532 (e.g., as indicated by the unlocked state of the lock icon 514). The unlock mode of the search results user interface 532 displays messages 552 and 554 and an email 556 that are not displayed in the lock mode of the search results user interface 532.
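The locked/unlocked redaction shown in figs. 5G-5J amounts to filtering a list of items by per-item visibility rules. A sketch under assumed types (none of these names come from the embodiment):

```swift
// Per-item redaction rules for the locked mode.
struct DisplayItem {
    let title: String
    let content: String
    let hiddenWhenLocked: Bool        // e.g., invitation notification 550
    let contentHiddenWhenLocked: Bool // e.g., message text 546 and image 548
}

func itemsToDisplay(_ items: [DisplayItem], deviceUnlocked: Bool) -> [DisplayItem] {
    return items.compactMap { item -> DisplayItem? in
        if deviceUnlocked { return item }
        if item.hiddenWhenLocked { return nil }   // drop the item entirely
        if item.contentHiddenWhenLocked {
            // Keep the item but strip its content in the locked mode.
            return DisplayItem(title: item.title, content: "",
                               hiddenWhenLocked: false,
                               contentHiddenWhenLocked: true)
        }
        return item
    }
}
```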
Fig. 5K-5N illustrate a sequence in which, in response to input received while message notification 542 is in the locked mode, the user is prompted to provide a password to view an expanded view of message notification 542. In fig. 5K, an input for expanding message notification 542 is provided by contact 557. Because the device is in the locked mode (as shown by the locked state of the lock icon 514), user authentication is required before the expanded view of the message notification 542 is displayed. Fig. 5L shows a user authentication prompt user interface 558 including a password input interface 560. Upon determining that the user authentication information (e.g., provided by input of contact 562 at password input interface 560 and/or the fingerprint sensor of physical button 204) is valid, an expanded view of notification 542 is displayed, as shown in fig. 5M.
In some embodiments, the expanded view of notification 542 includes, for example, an avatar 564 of the message sender (and/or a name of the message sender), an affordance 566 for dismissing the notification (e.g., an "X" affordance), message content (e.g., a message including image 548 and a message including text 546), and/or contextual conversation information (e.g., message 568, which preceded message 548 in the conversation transcript). In some embodiments, when a new message is received in the conversation, the expanded view of message notification 542 is updated to display the new message (e.g., new message 570 is displayed in message notification 542). In some embodiments, the expanded view of message notification 542 includes an affordance 572 for displaying a reply view of message notification 542 (e.g., in response to input received by contact 574, such as a tap input at a location corresponding to the affordance 572 for displaying the reply view).
Fig. 5N illustrates the reply view of the message notification 542, which enables a user to reply to a message within the notification (e.g., without navigating from the notification user interface to an instant messaging application user interface). When the reply view of the message notification 542 is displayed, the keyboard 576 is displayed. Input received at the keyboard 576 is displayed in the message input area 578 of the message notification 542.
Fig. 5O-5Q illustrate navigation from the calendar invitation message notification 550 to an expanded view of the calendar invitation message notification 550, and from the expanded view of the calendar invitation message notification 550 to a calendar application user interface 594. In fig. 5O, an input by contact 580 is received at a location corresponding to invitation message notification 550 in the unlocked view of notification user interface 520. In response to detecting the input, the device displays an expanded view of calendar invitation message notification 550, as shown in fig. 5P. The expanded view of calendar invitation message notification 550 includes contextual calendar information 582 corresponding to the invited event 584 and an action item menu including action items 586 ("accept"), 588 ("maybe"), and 590 ("decline"). An input by contact 592 is received at a location corresponding to calendar invitation message notification 550. In response to detecting the input, the device displays a calendar application user interface 594 (e.g., an event details page corresponding to the invited event 584, as shown in fig. 5Q).
Fig. 5R-1 shows a voicemail notification 544. Fig. 5R-2 shows an expanded view of the voicemail notification 544. The expanded view of the voicemail notification 544 includes, for example, playback controls for playing back the voicemail audio (e.g., play/pause control 596, playback position slider 598, and/or volume control 5100), transcribed text 5102 of the voicemail (e.g., automatically generated by the device based on the voicemail audio), and/or an action item menu including action items 5104 ("call back") and 5106 ("delete").
Fig. 5S-1 illustrates a notification 5108 for another exemplary application (e.g., a car service application). Fig. 5S-2 shows an expanded view of notification 5108. The expanded view of notification 5108 includes, for example, a map 5112 and/or an action item 5118 ("call driver"). In some embodiments, the map 5112 includes content that is updated in real time and/or near real time. For example, a representation 5114 of the real-time location of a car is displayed on the map 5112 relative to a representation 5116 of the destination of the car. As the car moves, the representation 5114 of the car is updated to reflect the movement of the car.
Fig. 5T-1 to 5T-2 illustrate a sequence of user interfaces displayed in response to a detected input when the device 100 is in the locked mode. In fig. 5T-1, the wakeup screen interface 512 is displayed in a lock mode, as shown by the locked state of the lock icon 514.
Fig. 5T-1 illustrates a swipe right gesture, in which the contact 5117 moves along the path indicated by arrow 5120. In response to the swipe right gesture, a lock mode of the mini-application user interface 5122 is displayed, as shown in fig. 5T-2. In some embodiments, as the swipe to the right occurs, an animation is displayed in which the mini-application user interface 5122 appears to gradually slide rightward into view from the left side of the display.
In some embodiments, the mini-application user interface 5122 includes a search input area 5124. In response to input received at search input area 5124, a lock mode of the search results user interface is displayed (e.g., similar to search results user interface 532 shown in fig. 5F-4 and 5I).
Fig. 5T-3 to 5T-4 illustrate a sequence of user interfaces displayed in response to a detected input when the device 100 is in the unlocked mode. In fig. 5T-3, the wakeup screen interface 512 is displayed in an unlocked mode, as shown by the unlocked state of the lock icon 514.
Fig. 5T-3 illustrates a swipe right gesture, in which the contact 5119 moves along the path indicated by arrow 5121. In response to the swipe right gesture, an unlock mode of the mini-application user interface 5122 is displayed, as shown in fig. 5T-4. In response to input received at search input area 5124, an unlock mode of the search results user interface is displayed (e.g., similar to search results user interface 532 shown in fig. 5F-8 and 5J).
Fig. 5U illustrates a lock mode of the mini-application user interface 5122. The lock mode of the mini-application user interface displays, for example, a lock mode of the calendar mini-application 5125 and/or a lock mode of the weather mini-application 5126. In some embodiments, the locked and/or unlocked mode of the mini-application user interface 5122 includes information 5123, such as information about points of interest in the vicinity of the device 100 (e.g., as determined using GPS and/or Wi-Fi data received by the device 100 and/or point-of-interest data stored and/or accessed by the device 100). In some embodiments, the calendar mini-application 5125 includes an identifier (e.g., text 5131 identifying the corresponding application and/or an icon 5135 identifying the corresponding application).
In fig. 5U, the contact 5128 provides input at the "display more" affordance 5127 for displaying an expanded view of the calendar mini-application 5125. Because the device is in the locked mode (as shown by the lock icon 514), user authentication is required before the expanded view of the mini-application 5125 is displayed. Fig. 5V shows a user authentication prompt user interface 558 that includes a password input interface 560. Upon determining that the user authentication information (e.g., provided by the input of the contact 5129 at the password input interface 560 and/or the fingerprint sensor of the physical button 204) is valid, an expanded view of the mini-application 5125 is displayed, as shown in fig. 5W. In some embodiments, the expanded view of the mini-application 5125 includes, for example, an expanded time range (as compared to the locked and/or unlocked views of the calendar mini-application 5125) and calendar event information not displayed in the locked view of the calendar mini-application 5125 (e.g., "drink coffee with Jon," "team meeting," "meet Shelby at Shellfish Market"). The weather mini-application 5126 shifts downward in the mini-application user interface 5122 to accommodate the expansion of the calendar mini-application 5125.
In fig. 5W, an input is provided through contact 5137 at a location corresponding to affordance 5130 for collapsing the expanded view of the calendar mini-application 5125 (as shown in fig. 5W) into the non-expanded view of the calendar mini-application 5125, shown in the unlock mode of the mini-application user interface 5122 of fig. 5X.
In the unlocked mode of the mini-application user interface 5122, an unlocked (non-expanded) view of the calendar mini-application 5125 is displayed. In contrast to the locked view of the calendar mini-application 5125, the unlocked view of the calendar mini-application 5125 includes additional information, such as calendar event information (e.g., "drink coffee with Jon" and/or "team meeting") that is not displayed in the locked view of the calendar mini-application 5125. The unlock mode of the mini-application user interface 5122 also displays an "up next" mini-application 5132 that is not displayed in the lock mode of the mini-application user interface 5122.
In fig. 5X, the contact 5139 provides input at a location corresponding to the "display more" affordance 5141 for displaying an expanded view of the "up next" mini-application 5132.
Fig. 5Y shows an expanded view of the "up next" mini-application 5132. The expanded view of the "up next" mini-application 5132 includes a map 5134 indicating the location of the next event and an action item menu including action items 5136 ("navigate"), 5138 ("snooze"), 5140 ("invitee info"), and 5142 ("delete event"). An input by the contact 5143 is detected at a location corresponding to the "up next" mini-application 5132. In response to detecting the input, the device displays a calendar application user interface 5144 that includes an indication 5146 of the next event, as shown in fig. 5Z.
Fig. 5AA to 5AD illustrate changes that occur in the application springboard user interface 400 when an input for displaying a quick action menu is received. In fig. 5AA, an input of contact 5150 is detected at a location corresponding to mail application icon 418. The mail application icon 418 corresponds to a mail application (e.g., a mail application that is launched in response to a tap input detected at the mail application icon 418). A characteristic intensity of the contact is indicated by intensity level meter 5148. In fig. 5AB, the characteristic intensity of the contact has increased above the hint intensity threshold IT H, as indicated by the intensity level meter 5148. In some embodiments, in accordance with a determination that the characteristic intensity of the contact 5150 has increased above the hint intensity threshold IT H, a blurring effect is applied to at least a portion of the user interface 400. As shown in fig. 5AB to 5AD, the blurring effect intensifies as the characteristic intensity of the contact increases. In some embodiments, in accordance with a determination that the characteristic intensity of the contact has increased above the hint intensity threshold IT H, a preview 5152 of a quick action menu 5154 corresponding to the mail icon 418 is displayed. As shown in fig. 5AB to 5AC, the size of the preview 5152 increases as the characteristic intensity of the contact increases.
In fig. 5AD, the characteristic intensity of the contact 5150 has increased above the deep press intensity threshold IT D. In some embodiments, in accordance with a determination that the characteristic intensity of the contact has increased above the deep press intensity threshold IT D, one or more tactile output generators of the device 100 output a tactile output, as indicated at 5150. In some embodiments, a quick action menu 5154 corresponding to the mail application icon 418 is displayed. In some embodiments, a mini-application preview 5156 corresponding to a mail mini-application 5162 is displayed (e.g., at a location adjacent to the quick action menu 5154 and/or in a platter that includes the quick action menu 5154 and/or the mini-application preview 5156). In some implementations, the quick action menu 5154 and/or the mini-application preview 5156 are displayed at least partially overlaying the application springboard user interface 400.
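The hint-then-commit progression of figs. 5AA-5AD can be modeled as a progress value between the hint threshold and the deep press threshold that drives both the background blur and the preview's scale. The mapping below is an illustrative assumption, not the embodiment's actual curves:

```swift
// Progress from the hint threshold toward the deep press threshold (0...1).
func hintProgress(intensity: Double,
                  hintThreshold: Double = 0.30,
                  deepPressThreshold: Double = 0.60) -> Double {
    guard intensity > hintThreshold else { return 0 }
    return min(1, (intensity - hintThreshold) / (deepPressThreshold - hintThreshold))
}

// Blur intensifies with progress; the preview grows toward full size.
func blurRadius(progress: Double, maximumRadius: Double = 20) -> Double {
    return progress * maximumRadius
}

func previewScale(progress: Double) -> Double {
    return 0.9 + 0.1 * progress   // from 90% of full size up to 100%
}

// At progress == 1 the deep press threshold is met: commit the quick action
// menu and emit a tactile output.
func deepPressCommitted(progress: Double) -> Bool {
    return progress >= 1
}
```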
In fig. 5AE, an input by contact 5158 is detected at the "add widget" affordance 5160 of the mini-application preview 5156. In response to detecting the input, the device adds a mail mini-application 5162 corresponding to the mini-application preview 5156 to the mini-application user interface 5122, as shown in fig. 5AF. In some embodiments, in response to detecting the input of contact 5158 at the "add widget" affordance 5160, the device displays the mini-application user interface 5122 with the newly added mail mini-application 5162 included.
In fig. 5AF, the mail mini-application 5162 is displayed in the mini-application user interface 5122. In some embodiments, the mail mini-application 5162 includes avatars 5164 (e.g., 5164a, 5164b, 5164c, and/or 5164d) corresponding to one or more email senders that most recently and/or most frequently sent email to the device 100 and/or received email from the device 100. A badge 5166 at a location corresponding to the avatar 5164a indicates, for example, the number of unread emails received from the email sender corresponding to the avatar 5164a. An input of the contact 5170 is detected at a location corresponding to an affordance 5168 for displaying an expanded view of the mail mini-application 5162. In response to detecting the input, the device displays an expanded view of the mail mini-application 5162, as shown in fig. 5AG.
Fig. 5AG shows an expanded view of the mail mini-application 5162 displayed in the mini-application user interface 5122. Additional avatars 5164e, 5164f, 5164g, and/or 5164h, which are not shown in the unexpanded view of the mail mini-application 5162, are shown in the expanded view of the mail mini-application 5162.
Fig. 5AH illustrates the transition from the state of displaying the wake screen interface 512 to the state of displaying the lock mode of the camera application user interface 5172. At a first time t 1, a swipe left gesture is initiated, in which the contact 5176 begins to move along a first portion of the swipe gesture, as indicated by arrow 5178. At a second time t 2, which is subsequent to time t 1, a first portion of the camera application user interface 5172 is displayed by the first portion of the swipe gesture, and the contact 5176 continues to move along the second portion of the swipe gesture, shown by arrow 5180. At a third time t 3, subsequent to time t 2, the camera application user interface 5172 is further displayed by the second portion of the swipe gesture, and the contact 5176 continues to move along the third portion of the swipe gesture, shown by arrow 5182. At a fourth time t 4, subsequent to time t 3, the camera application user interface 5172 is further displayed by the third portion of the swipe gesture, and the contact 5176 continues to move along the fourth portion of the swipe gesture, shown by arrow 5184. At a fifth time t 5, which is subsequent to time t 4, the lock mode of the camera application user interface 5172 is fully displayed.
In some embodiments, the one or more camera controls 5186 are not initially displayed in the locked mode of the camera application user interface 5172. For example, after a delay from time t 1 when the swipe left gesture is initiated (e.g., a delay of a period of time between time t 1 and time t 3), one or more camera controls 5186 are displayed in the camera application user interface 5172. In some embodiments, after displaying the threshold portion of the camera application user interface 5172, one or more camera controls 5186 are displayed in the camera application user interface 5172.
In some implementations, one or more images in the image library of the device 100 are inaccessible when the locked mode of the camera application user interface 5172 is displayed (e.g., as shown at time t 5 of fig. 5 AH). For example, when the unlock mode of the camera application user interface 5172 is displayed (e.g., as shown at time t 5 in fig. 5 AI), the most recently captured image will be displayed in the region 5188, and when the lock mode of the camera application user interface 5172 is displayed (e.g., as shown in the region 5188 at time t 5 in fig. 5 AH), the image will not be displayed in the region 5188.
Fig. 5AI illustrates a transition from a state in which the wake screen interface 512 is displayed to a state in which the unlock mode of the camera application user interface 5172 is displayed. At a first time t 1, a swipe gesture to the left is initiated, where contact 5190 begins to move along a first portion of the swipe gesture, as indicated by arrow 5192. At a second time t 2, which is subsequent to time t 1, a first portion of the camera application user interface 5172 is displayed by the first portion of the swipe gesture, and the contact 5190 continues to move along the second portion of the swipe gesture, which is shown by arrow 5194. At a third time t 3, subsequent to time t 2, the camera application user interface 5172 is further displayed by the second portion of the swipe gesture, and the contact 5190 continues to move along the third portion of the swipe gesture shown by arrow 5196. At a fourth time t 4, which is subsequent to time t 3, the camera application user interface 5172 is further displayed by the third portion of the swipe gesture, and the contact 5190 continues to move along the fourth portion of the swipe gesture, which is shown by arrow 5198. At a fifth time t 5, which is subsequent to time t 4, the unlock mode of the camera application user interface 5172 is fully displayed.
In some embodiments, the one or more camera controls 5186 are not initially displayed in the unlocked mode of the camera application user interface 5172. For example, after a delay from time t 1 when the swipe left gesture is initiated (e.g., a delay of a period of time between time t 1 and time t 3), one or more camera controls 5186 are displayed in the camera application user interface 5172. In some embodiments, after displaying the threshold portion of the camera application user interface 5172, one or more camera controls 5186 are displayed in the camera application user interface 5172. This avoids accidental operation of the camera application user interface.
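The deferred appearance of camera controls 5186 can be expressed as a simple gate on both elapsed time and the fraction of the interface revealed. A minimal sketch, with the threshold fraction and delay assumed for illustration:

```swift
import Foundation

// Show camera controls only after enough of the camera interface has been
// revealed and enough time has passed since the swipe began, reducing the
// chance that a brief accidental swipe exposes operable controls.
func shouldShowCameraControls(revealFraction: Double,       // 0...1 revealed
                              elapsedSinceSwipeStart: TimeInterval,
                              revealThreshold: Double = 0.5,
                              delay: TimeInterval = 0.25) -> Bool {
    return revealFraction >= revealThreshold && elapsedSinceSwipeStart >= delay
}
```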
In fig. 5AJ, an input that partially displays the camera application user interface 5172 is followed by an input that redisplays the wake screen interface 512. In some implementations, the input for displaying the camera application user interface 5172 is canceled by stopping the swipe left gesture (e.g., completing only a first portion of the gesture before all of the camera application user interface 5172, or a threshold portion of the camera application user interface 5172, has been displayed). In some implementations, the input for displaying the camera application user interface 5172 is canceled by providing an input (e.g., a swipe right gesture) before all of the camera application user interface 5172, or a threshold portion of the camera application user interface 5172, has been displayed. For example, an input that includes the first portion 5178 and the second portion 5180 of a leftward gesture is detected, followed by a rightward gesture along the path indicated by arrow 5202. The rightward gesture cancels the display of the camera application user interface 5172. In this way, accidental display and/or operation of the camera application interface is avoided.
In fig. 5AK, the wake screen interface 512 is displayed in a locked mode, as indicated by the locked state of the lock icon 514. Fig. 5AK illustrates a swipe up gesture (e.g., from the lower edge of touch screen 112), in which contact 5204 moves along the path indicated by arrow 5206. In response to the swipe up gesture, a first page of a multi-page control panel user interface 5208 is displayed, as shown in fig. 5AL. In some implementations, as the swipe up occurs, an animation is displayed in which the multi-page control panel user interface 5208 appears to gradually slide upward from the lower edge of the display.
Fig. 5AL shows a first page of the multi-page control panel user interface 5208. The multi-page control panel user interface 5208 includes a plurality of control affordances, such as 5210 (airplane mode), 5212 (Wi-Fi), 5214 (Bluetooth), 5216 (do not disturb mode), 5218 (rotation lock), 5220 (flashlight), 5222 (timer), 5224 (night shift), 5226 (calculator), 5228 (camera), 5230 (Apple TV mirroring), and/or 5232 (brightness). In some embodiments, one or more control affordances of the first page of the multi-page control panel user interface 5208 are not available in the locked mode of the first page of the multi-page control panel user interface 5208. In some implementations, the appearance of one or more of the control affordances of the first page of the multi-page control panel user interface 5208 indicates the state of the control affordance. For example, Wi-Fi control affordance 5212 is shown without shading to indicate that Wi-Fi is enabled, and Bluetooth control affordance 5214 is shown with shading to indicate that Bluetooth is disabled.
In some embodiments, the multi-page control panel user interface 5208 includes a control user interface dismissal affordance 5234. In some implementations, the page indicator 5236 is used to indicate the page of the multi-page control panel that is currently displayed. In some embodiments, the multi-page control panel user interface 5208 is displayed partially or fully overlaying another user interface on which the input for displaying the control user interface is detected (e.g., an initial screen such as a lock screen and/or a wake screen, the application springboard user interface 400, and/or an application user interface, as shown in fig. 5AL). In some embodiments, the appearance of the user interface that is partially covered by the multi-page control panel user interface 5208 (e.g., wake screen user interface 512) is altered, as indicated at 5238. For example, the partially covered user interface 512 is blurred and/or has a lower brightness than the prior appearance of the user interface.
In response to an input (e.g., a swipe left) of the contact 5241 with movement along the path indicated by the arrow 5240, a second page of the multi-page control panel user interface 5208 is displayed, as shown in fig. 5AM. In some implementations, the second page of the multi-page control panel user interface 5208 includes a plurality of content playback control affordances, such as a playback position control 5242, a previous track control 5246, a pause/play control 5248, a next track control 5250, and/or a volume adjustment control 5252. In some implementations, the second page of the multi-page control panel user interface 5208 includes content routing destination information 5260 and/or an affordance 5262 for displaying/ceasing to display a routing destination menu. In some implementations, the second page of the multi-page control panel user interface 5208 includes content information for the currently playing and/or most recently played content, such as an image 5264 representing the content (e.g., album art) and/or identifying information 5266 for the content (e.g., track name, album name, and/or artist). In some embodiments, one or more control affordances of the second page of the multi-page control panel user interface 5208 are not available in the locked mode of the second page of the multi-page control panel user interface 5208.
In response to an input that includes movement of contact 5268 along the path indicated by arrow 5270 (e.g., a swipe left), a third page of the multi-page control panel user interface 5208 is displayed, as shown in fig. 5AN. In some embodiments, the third page of the multi-page control panel user interface 5208 includes a plurality of remote device control affordances, such as a temperature control 5272, a fan control 5274, a shades control 5276, a light control 5278, a door control 5280, a camera control 5282, a smoke alarm control 5284, and/or a sleep control 5286. In some embodiments, one or more of the remote device controls are not available in the locked mode of the third page of the multi-page control panel user interface 5208. For example, the remote device control affordances 5272-5286 are shown visually altered (e.g., shaded) from the default state of the control affordances 5272-5286 displayed in the unlocked mode of the multi-page user interface 5208, to indicate that one or more remote device controls corresponding to the visually altered remote device control affordances are not available in the locked mode. In this way, the ability to adjust remote devices is limited to users who have provided authentication information to unlock the device. In some embodiments, the third page of the multi-page control panel user interface 5208 includes identifying information for a defined area (e.g., a scene, such as a "living room") in which one or more remote devices controlled by the control affordances 5272-5286 are located (e.g., a name 5288 of the defined area and/or an image 5290 corresponding to the area).
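The locked-mode gating of remote device controls in fig. 5AN comes down to branching each affordance's appearance and tap handling on the unlock state. A sketch with hypothetical types:

```swift
// A remote device control affordance whose appearance and behavior depend
// on whether the device has been unlocked.
struct RemoteDeviceControl {
    let name: String   // e.g., "lights", "doors", "temperature"

    func appearance(deviceUnlocked: Bool) -> String {
        return deviceUnlocked ? "default" : "shaded"   // shaded when locked
    }

    func handleTap(deviceUnlocked: Bool) -> String {
        return deviceUnlocked
            ? "toggle \(name)"
            : "display user authentication prompt"     // as in fig. 5AO
    }
}
```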
In some embodiments, the remote device control affordances 5272-5286 displayed in the third page of the multi-page control panel user interface 5208 are remote device control affordances that correspond to (e.g., are located in) a defined area (e.g., a room) in which the device 100 is currently located (e.g., as determined by GPS and/or Wi-Fi signals received by the device). In some embodiments, the remote device control affordances 5272-5286 displayed in the third page of the multi-page control panel user interface 5208 are remote device control affordances corresponding to a most recently accessed defined area. In some embodiments, the third page of the multi-page control panel user interface 5208 includes an affordance 5292 for displaying/ceasing to display a plurality of defined area identifiers (e.g., a menu 5294 for displaying the plurality of defined area identifiers, as shown in fig. 5AQ).
In fig. 5AN, an input for operating the light control affordance 5278 is provided through contact 5296. Because the device is in the locked mode (as shown by the lock icon 514), user authentication is required before the light control affordance 5278 can be operated. In some implementations, in response to detecting an input at the control affordance 5278 while the locked mode is active, the device displays a user authentication prompt user interface 558. Fig. 5AO shows the user authentication prompt user interface 558, which includes a password input interface 560. In accordance with a determination that the user authentication information (e.g., provided by the input of contact 5295 at password input interface 560 and/or by the fingerprint sensor of physical button 204) is valid, an unlocked view of the third page of the multi-page control panel user interface 5208 is displayed, as shown in fig. 5AP.
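The authentication gate described above can be sketched in code. The following Swift sketch is illustrative only: the type RemoteControlGate, its callbacks, and the use of the LocalAuthentication framework are assumptions for demonstration, not the patent's implementation.

import Foundation
import LocalAuthentication

// A minimal sketch of gating a remote device control behind user
// authentication while the device is in the locked mode. All names
// here are illustrative.
final class RemoteControlGate {
    var isDeviceLocked = true

    // Called when the user taps a remote device control affordance
    // (e.g., the light control) in the third page of the control panel.
    func handleControlTap(perform action: @escaping () -> Void,
                          presentPasscodePrompt: @escaping () -> Void) {
        guard isDeviceLocked else {
            action() // Unlocked mode: operate the control immediately.
            return
        }
        let context = LAContext()
        var error: NSError?
        // Use biometric or passcode authentication when available;
        // otherwise fall back to a password input interface.
        guard context.canEvaluatePolicy(.deviceOwnerAuthentication, error: &error) else {
            presentPasscodePrompt()
            return
        }
        context.evaluatePolicy(.deviceOwnerAuthentication,
                               localizedReason: "Unlock to adjust home controls") { success, _ in
            DispatchQueue.main.async {
                if success {
                    self.isDeviceLocked = false
                    action() // Show the unlocked view and operate the control.
                }
            }
        }
    }
}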
In fig. 5AP, the remote device control affordances 5272-5286 are shown without shading because the control affordances 5272-5286 are available in the unlocked mode of the device. An input (e.g., a tap gesture) is provided through contact 5296 at a location corresponding to the affordance 5292 for displaying/ceasing to display the plurality of defined area identifiers. In response to the input, a menu 5294 including the plurality of defined area identifiers is displayed, as shown in fig. 5AQ. In some embodiments, the visual appearance of the affordance 5292 for displaying/ceasing to display the plurality of defined area identifiers changes in response to the input provided at the location corresponding to the affordance 5292. For example, the orientation of the V-shaped icon of affordance 5292 changes from fig. 5AP to fig. 5AQ.
In fig. 5AQ, a menu 5294 of a plurality of defined area identifiers (e.g., "favorites" 5300, "current room" 5302, "master bedroom" 5304, "office" 5306, and/or "kitchen" 5308) is displayed. An input (e.g., a tap gesture) is provided through contact 5298 at a location corresponding to the affordance 5292 for displaying/ceasing to display the plurality of defined area identifiers. In response to the input, the menu 5294 of the plurality of defined area identifiers 5300-5308 ceases to be displayed and the third page of the multi-page control panel user interface 5208 is redisplayed, for example, as shown in fig. 5AR. In some implementations, in response to an input (e.g., a tap input) selecting a respective defined area identifier of the menu 5294, a set of control affordances corresponding to the selected defined area identifier is displayed.
As shown in figs. 5AR-5AS, in response to an input of contact 5310 with movement along the path indicated by arrow 5312 (e.g., a downward swipe), display of the third page of the multi-page control panel user interface 5208 is stopped and the underlying user interface 512 is redisplayed, as shown in fig. 5AS. (In some embodiments, an input received at a location corresponding to the control user interface dismissal affordance 5234 likewise causes the multi-page control panel user interface 5208 to cease to be displayed and the underlying wake screen user interface 512 to be redisplayed, as shown in fig. 5AS.)
As shown in figs. 5AS-5AT, the multi-page control panel user interface 5208 is redisplayed in response to an input of contact 5314 having movement along the path indicated by arrow 5316 (e.g., an upward swipe). In some embodiments, the most recently displayed page of the multi-page control panel user interface 5208 (e.g., the third page of the multi-page control panel user interface 5208) is redisplayed in response to the input, as shown in fig. 5AT.
Fig. 5AU shows an input (e.g., an upward swipe along the path indicated by arrow 5320) received through contact 5322 at an application user interface (e.g., calendar application user interface 5318) to display a first page of the multi-page control panel user interface 5208, as shown in fig. 5AV.

In fig. 5AV, an input of contact 5235 is received (e.g., a leftward swipe along the path indicated by arrow 5327) to display a second page of the multi-page control panel user interface 5208, as shown in fig. 5AW.

In fig. 5AW, the second page of the multi-page control panel user interface 5208 is displayed partially overlaying the calendar application user interface 5318. In response to an input to stop displaying the multi-page control panel user interface 5208 (e.g., a downward swipe of contact 5329 along the path indicated by arrow 5331), display of the second page of the multi-page control panel user interface is stopped and the calendar application user interface 5318 is redisplayed, as shown in fig. 5AX.

In fig. 5AX, an input for redisplaying the multi-page control panel user interface 5208 (e.g., an upward swipe of contact 5333 along the path indicated by arrow 5335) is detected at the calendar application user interface 5318. In response to detecting the input, the device redisplays the second page of the multi-page control panel user interface 5208, as shown in fig. 5AY.

In fig. 5AY, an input for displaying a third page of the multi-page control panel user interface 5208 (e.g., a horizontal swipe of contact 5339 along the path indicated by arrow 5337) is detected at the second page of the multi-page control panel user interface 5208. In response to detecting the input, the device displays the third page of the multi-page control panel user interface 5208, as shown in fig. 5AZ.

In fig. 5AZ, an input for displaying the second page of the multi-page control panel user interface 5208 (e.g., a horizontal swipe of contact 5343 along the path indicated by arrow 5341) is detected at the third page of the multi-page control panel user interface 5208. In response to detecting the input, the device displays the second page of the multi-page control panel user interface 5208, as shown in fig. 5BA.
In fig. 5BA, an input (e.g., a tap input) of contact 5324 is detected at a location corresponding to the affordance 5262 for displaying/ceasing to display the routing destination menu. In response to the input, the routing destination menu 5326 is displayed, as shown in fig. 5BB.
In fig. 5BB, the second page of the multi-page control panel user interface 5208 includes a routing destination menu 5326 containing routing destinations 5328 ("iPhone"), 5260 ("AirPods"), and 5332 ("Bluetooth speaker"). For example, the routing destination menu 5326 expands from the content routing destination information area 5260, and the multi-page control panel user interface 5208 expands in size (vertically) to accommodate the routing destination menu 5326. In some implementations, one or more routing destinations include information corresponding to the routing destination, such as an indication 5334 of the current routing destination (e.g., a check mark icon) and/or a battery level indication 5336. In some implementations, the visual appearance of the affordance 5262 changes in response to the input provided at the location corresponding to the affordance 5262. For example, the orientation of the V-shaped icon of affordance 5262 changes from fig. 5BA to fig. 5BB.
In fig. 5BB, in response to an input (e.g., a tap input) of contact 5338 at a location corresponding to routing destination 5332 ("Bluetooth speaker"), the routing destination of the content (e.g., the currently playing content) changes from routing destination 5260 ("AirPods") to routing destination 5332 ("Bluetooth speaker"), as shown in fig. 5BC.
In fig. 5BC, the content has been routed to destination 5332 ("Bluetooth speaker") and/or has stopped being routed to routing destination 5260 ("AirPods"), e.g., as indicated by the absence of the indication 5334 of the current routing destination in the area corresponding to routing destination 5260 ("AirPods") and the presence of the indication 5334 of the current routing destination in the area corresponding to routing destination 5332 ("Bluetooth speaker"). In response to an input (e.g., a tap input) of contact 5340 at a location corresponding to the affordance 5262 for displaying/ceasing to display the routing destination menu 5326, the routing destination menu 5326 is dismissed, as shown in fig. 5BD.
In fig. 5BD, an input for changing the content routing destination (e.g., a horizontal swipe of contact 5323 along the path indicated by arrow 5345) is shown. In response to detecting the input for changing the content routing destination, the device updates the content routing destination information area 5260 to indicate the changed content routing destination (e.g., "AirPods"), as shown in fig. 5BE.
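To make the routing model above concrete, here is a minimal Swift sketch of a destination menu with a single current destination. The type names, the battery field, and the callback are illustrative assumptions, not API from the patent.

// A minimal sketch of a routing-destination menu: a list of
// destinations, one of which carries the current-destination
// indication, where selecting an entry re-routes playback.
struct RoutingDestination {
    let name: String       // e.g., "iPhone", "AirPods", "Bluetooth speaker"
    var batteryLevel: Int? // shown when the destination reports one
}

final class ContentRouter {
    private(set) var destinations: [RoutingDestination]
    private(set) var currentIndex: Int
    var onRouteChanged: ((RoutingDestination) -> Void)?

    init(destinations: [RoutingDestination], currentIndex: Int = 0) {
        self.destinations = destinations
        self.currentIndex = currentIndex
    }

    // Selecting a menu row moves the check-mark indication and routes
    // the currently playing content to the chosen destination.
    func select(at index: Int) {
        guard destinations.indices.contains(index), index != currentIndex else { return }
        currentIndex = index
        onRouteChanged?(destinations[index])
    }
}

// Example: switch playback from "AirPods" to "Bluetooth speaker".
let router = ContentRouter(destinations: [
    RoutingDestination(name: "iPhone", batteryLevel: nil),
    RoutingDestination(name: "AirPods", batteryLevel: 80),
    RoutingDestination(name: "Bluetooth speaker", batteryLevel: nil)
], currentIndex: 1)
router.select(at: 2)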
Figs. 5BF-5BJ illustrate functionality for modifying a control affordance of the first page of the multi-page control panel user interface 5208, in accordance with some embodiments. In fig. 5BF, an input of contact 5342 is detected at a location corresponding to the Wi-Fi control affordance 5212 of the first page of the multi-page control panel user interface 5208. As indicated by the intensity meter 5344, the characteristic intensity of the contact is below the hint press intensity threshold IT H, the light press intensity threshold IT L, and the deep press intensity threshold IT D. In accordance with a determination that the characteristic intensity of the contact meets control toggle criteria (e.g., the characteristic intensity of the contact is below the hint press intensity threshold IT H), the input of contact 5342 toggles the Wi-Fi control corresponding to the Wi-Fi control affordance 5212 from a disabled state to an enabled state.
Fig. 5BG illustrates the first page of the multi-page control panel user interface 5208 after the Wi-Fi control has been toggled from the disabled state shown in fig. 5BF to the enabled state shown in fig. 5BG. To indicate that the Wi-Fi control has been toggled to the enabled state, the Wi-Fi control affordance 5212 changes in appearance (e.g., changes visually, such as from a shaded state to an unshaded state), and/or a notification 5346 is displayed (e.g., "Wi-Fi: On").
Figs. 5BH-5BI show an input for displaying modification options for the Wi-Fi control. In fig. 5BH, the input of contact 5348 at the location corresponding to the Wi-Fi control affordance 5212 meets enhanced control criteria (e.g., as indicated by intensity meter 5344, the characteristic intensity of contact 5348 increases beyond the light press intensity threshold IT L, and/or the characteristic intensity of contact 5348 increases beyond the deep press intensity threshold IT D). In response to the input, a modification options menu 5350 is displayed, which includes modification options 5354 ("disconnect from home network"), 5356 ("turn off for 1 hour"), 5358 ("turn off until I leave here"), 5360 ("connect to other networks"), and 5362 ("Wi-Fi settings"). An input of contact 5352 is then detected at a location corresponding to modification option 5356 ("turn off for 1 hour").
In fig. 5BJ, the Wi-Fi control is (temporarily) disabled in response to the input shown in fig. 5BI, e.g., as indicated by the shaded state of the Wi-Fi control affordance 5212 and the notification 5346 (e.g., "Wi-Fi: Off").
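The intensity-dependent behavior in figs. 5BF-5BJ, where a press below the hint threshold toggles a control and a harder press opens a modification options menu, can be sketched as follows in Swift. The class name, the normalized thresholds (the code collapses IT H, IT L, and IT D into two illustrative values), and the callbacks are assumptions for demonstration.

import UIKit

// A minimal sketch of a control that toggles on a light touch and
// shows a modification options menu on a hard press, using touch
// force as a stand-in for the characteristic intensity thresholds.
final class PressSensitiveControl: UIControl {
    private let togglePressThreshold: CGFloat = 0.4 // stand-in for IT H
    private let deepPressThreshold: CGFloat = 0.8   // stand-in for IT D
    private var maxNormalizedForce: CGFloat = 0

    var onToggle: (() -> Void)?      // e.g., enable/disable Wi-Fi
    var onShowOptions: (() -> Void)? // e.g., show the options menu

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesMoved(touches, with: event)
        guard let touch = touches.first, touch.maximumPossibleForce > 0 else { return }
        maxNormalizedForce = max(maxNormalizedForce,
                                 touch.force / touch.maximumPossibleForce)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        if maxNormalizedForce >= deepPressThreshold {
            onShowOptions?() // enhanced control criteria met
        } else if maxNormalizedForce < togglePressThreshold {
            onToggle?()      // control toggle criteria met
        }
        maxNormalizedForce = 0
    }
}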
Figs. 6A-6E illustrate a flowchart of a method 600 of accessing controls from a display-on user interface, in accordance with some embodiments. The method 600 is performed on an electronic device (e.g., the device 300 of fig. 3, or the portable multifunction device 100 of fig. 1A) having a display, a touch-sensitive surface, and one or more sensors for detecting intensities of contacts with the touch-sensitive surface. In some implementations, the display is a touch screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 600 are optionally combined, and/or the order of some operations is optionally changed.
As described below, the method 600 provides an intuitive way of accessing controls from a display-on user interface. The method reduces the cognitive burden on a user when accessing controls from a display-on user interface, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to access controls faster and more efficiently conserves power and increases the time interval between battery charges.
While in the display off state (e.g., as shown in fig. 5B and 5D), the device detects a first input (602). In some implementations, the display off state is a state in which the display is turned off or darkened while the touch-sensitive surface in the touch screen remains active, a state in which the touch screen is turned off, and/or a state in which the display functionality of the touch screen is turned off, e.g., as implemented in a sleep/hibernate mode, a power saving mode, or another mode with limited functionality. In some embodiments, the first input is an input on a display activation affordance (e.g., a physical button or a virtual button) that meets display activation criteria for activating the device display (e.g., waking the device and/or waking the touch screen display). For example, the first input is an input of a contact 508 at the physical button 204, as shown in fig. 5B and 5D. In some implementations, the display activation criteria do not require that the characteristic intensity of the contact in the detected input meet a respective intensity threshold (e.g., the display activation criteria are met by a tap gesture). In some embodiments, the display activation criteria are met when lift-off of the contact is detected before the characteristic intensity of the contact exceeds a predetermined intensity threshold (if the display activation affordance is a virtual button), or when the contact does not physically actuate the display activation affordance (if the display activation affordance is a physical button, such as physical button 204). In some embodiments, the display activation criteria are met by other types of inputs (e.g., a voice input, the user picking up and tilting the device, etc.) without requiring a contact to be detected.
In response to detecting the first input, the device (604) activates a display (e.g., touch screen display 112) of the device and displays a first user interface on the display corresponding to a display on state of the device (e.g., the device displays a wake screen 512 as shown in fig. 5C, the device displays a user interface that is displayed immediately upon waking up the device, or the device activates the display of the device).
While the first user interface corresponding to the display on state of the device is displayed, the device detects (606) a swipe gesture on the touch-sensitive surface (e.g., a downward gesture along the path shown by arrow 518 in fig. 5F-1, a downward gesture along the path shown by arrow 536 in fig. 5F-5, a rightward gesture along the path shown by arrow 5120 in fig. 5T-1, a rightward gesture along the path shown by arrow 5121 in fig. 5T-3, a leftward gesture as shown in fig. 5AH, a leftward gesture as shown in fig. 5AI, an upward gesture along the path shown by arrow 5206 in fig. 5AK, and/or an upward gesture along the path shown by arrow 5314 in fig. 5AS).
In response to detecting a swipe gesture on the touch-sensitive surface (608), in accordance with a determination that the device is in the locked mode of the display on state (e.g., as shown by the locked state of the lock icon 514) and the swipe gesture is in a first direction (e.g., downward and/or rightward), the device replaces a display of the first user interface (e.g., the wake screen 512 shown in fig. 5C) with a display of a second user interface that displays first content (e.g., the notification screen shown in fig. 5F-2 and/or the mini-application object screen shown in fig. 5T-2). In accordance with a determination that the device is in the unlocked mode of the display on state (e.g., as shown by the unlocked state of the lock icon 514) and the swipe gesture is in the first direction (e.g., downward and/or rightward), the device replaces the display of the first user interface with the display of the second user interface, which displays the first content and first additional content (e.g., limited notification content and/or limited mini-application object content) that is not displayed when the device is in the locked mode of the display on state. For example, as shown in figs. 5F-6 and 5H, the unlocked mode of notification user interface 520 includes additional content not shown in the locked mode of notification user interface 520, such as notification 550, text 546 of notification 542, and/or image 548 of notification 542. As shown in figs. 5T-4 and 5X, the unlocked mode of the mini-application user interface 5122 includes additional content, such as calendar event information (e.g., "Coffee with Jon", "Team Meeting"), not shown in the locked view of the next mini-application 5132 and/or calendar mini-application 5125.
Displaying different content depending on whether the device is in a locked or unlocked mode may effectively access information available in the device while maintaining security of sensitive information available in the device. Providing secure access to information stored in the device enhances the operability of the device and makes the user-device interface more efficient (e.g., by allowing a user to access information in a display-on state of the device without fully activating the device), which additionally reduces power usage and extends battery life of the device by enabling the user to use the device more quickly and efficiently.
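As a concrete illustration of the locked/unlocked distinction above, the following Swift sketch redacts restricted notification fields in the locked mode of the display-on state. The types and field names are illustrative assumptions, not the patent's data model.

// A minimal sketch of showing first content in the locked mode and
// first content plus additional content in the unlocked mode.
struct NotificationContent {
    let title: String            // e.g., sender name (first content)
    let body: String             // message text (restricted)
    let attachmentName: String?  // e.g., an image (restricted)
}

enum DisplayOnMode { case locked, unlocked }

func visibleFields(of content: NotificationContent,
                   in mode: DisplayOnMode) -> [String] {
    switch mode {
    case .unlocked:
        // Unlocked mode: first content plus the additional content.
        var fields = [content.title, content.body]
        if let attachment = content.attachmentName { fields.append(attachment) }
        return fields
    case .locked:
        // Locked mode: only the non-restricted first content.
        return [content.title]
    }
}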
In some embodiments, the notification includes information received by the device corresponding to the communication, such as a telephone call, a video call, a voice mail, an email, an SMS, an MMS, and/or a calendar invitation. In some embodiments, the notification is a calendar appointment notification, a calendar appointment invitation, a reminder, and/or another type of notification generated by an application installed on the device.
In some embodiments, the first content includes (610) one or more notifications (e.g., not including the limited notification information) and the first additional content includes limited notification information (e.g., notification content and/or other notification information that is subject to restricted access through settings of the device, such as privacy and/or security settings). For example, the first additional content includes contextual information, such as earlier communication content in a thread that includes the first content in a conversation transcript (such as message 568 preceding the received message 548 for which notification 542 was generated, as shown in fig. 5M) and/or calendar information for a time range corresponding to a time associated with a calendar appointment notification in the first content (e.g., contextual calendar information 582 shown in the expanded view of calendar invitation message notification 550, as shown in fig. 5P). In some embodiments, the second user interface (e.g., as shown in figs. 5F-6, 5G, 5M, 5N, 5P, 5R-2, and 5S-2) includes a longer version of a notification that is displayed in a shorter version in the second user interface when the device is in the locked mode of the display on state (e.g., as shown in figs. 5F-2 and 5H). The longer version of the notification includes expanded notification content, including, for example, all of the content in the short or standard version of the notification, as well as some additional content not included in the short or standard version of the notification. In some embodiments, the expanded notification content includes a more complete version of the notification content shown in the short version of the notification. In some embodiments, the expanded notification content includes images, interactive controls, and/or selectable options for performing actions on the notification that are not included in the short version of the notification.
In some embodiments, the first content includes (612) one or more mini-application objects (e.g., one or more "desktops"), and the first additional content includes restricted mini-application object information (e.g., mini-application object information and/or other information subject to restricted access through settings of the device, such as privacy and/or security settings). For example, the calendar mini-application 5125 as shown in the unlocked mode of the mini-application user interface 5122 (e.g., as shown in figs. 5T-4, 5W, and 5X) includes mini-application object information that is not displayed in the calendar mini-application 5125 in the locked mode (e.g., as shown in figs. 5T-2 and 5U).
In some embodiments, the limited mini-application object information and/or limited notification information includes, for example, contextual content (such as additional messages in a conversation transcript or an expanded view of a calendar), photo content, video, audio, real-time updated content (such as a traffic application map view showing real-time vehicle locations), controls (e.g., action buttons), a user's list of frequently used contacts, and/or a keyboard for inline input of text (e.g., when operating a "reply" control in a communication notification). Additional content includes, for example, contextual content (e.g., messages 568 and 570) and/or reply affordance 572 in an expanded view of message notification 542, as shown in fig. 5M; a message input area 578 and/or a keyboard 576 in the expanded view of message notification 542, as shown in fig. 5N; an invitation message notification 550 that is not presented in the locked mode of the notification user interface 520, as shown in figs. 5O and 5P; playback controls for playing back voicemail audio (such as a play/pause control 596, a playback position slider 598, and/or a volume control 5100), transcribed text 5102 of the voicemail, and/or action items 5104 ("call back") and 5106 ("delete") in the expanded view of voicemail notification 544, as shown in fig. 5R-2; a map 5112 and/or action item 5118 ("call driver") in the expanded view of notification 5108, as shown in fig. 5S-2; expanded calendar event information (e.g., "Coffee with Jon", "Team Meeting", "Meet Shelby") and an expanded calendar time range in the expanded view of calendar mini-application 5125; an expanded view of the next mini-application 5132; and/or additional avatars 5164e-5164h in the expanded view of mail mini-application 5162.
Displaying the mini-application object in an interface accessible in the display-on state of the device allows a user to view information (e.g., common information such as frequently accessed, user specified, and/or otherwise specified information) from the application without having to fully activate the device and/or activate the application. Providing access to application information in the display-on state of the device enhances the operability of the device and makes the user-device interface more efficient (e.g., by allowing a user to access information in the display-on state of the device without fully activating the device), which additionally reduces power usage and extends battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, in response to detecting the swipe gesture (614) on the touch-sensitive surface, in accordance with a determination that the device is in the locked mode of the display on state (e.g., as shown in figs. 5F-1 and/or 5T-1) and the swipe gesture is in a second direction (e.g., rightward as shown in fig. 5T-1 and/or downward as shown in fig. 5F-1) different from the first direction, the device replaces the display of the first user interface with a display of a third user interface that displays second content (e.g., the mini-application object screen as shown in fig. 5T-2 and/or the notification screen as shown in fig. 5F-2); and, in accordance with a determination that the device is in the unlocked mode of the display on state (e.g., as shown in figs. 5F-5 and 5T-3) and the swipe gesture is in the second direction (e.g., rightward as shown in fig. 5T-3 and/or downward as shown in fig. 5F-5), the device replaces the display of the first user interface with a display of the third user interface that displays the second content and second additional content that is not displayed when the device is in the locked mode of the display on state (e.g., restricted mini-application object content (e.g., as shown in fig. 5T-4) and/or restricted notification content or other notification information to which access is restricted by privacy and/or security settings on the electronic device (e.g., as shown in fig. 5F-6)).
In some implementations, the swipe gesture is in a third direction (e.g., leftward, as shown in figs. 5AH and 5AI) that is different from the first direction and the second direction. In accordance with a determination that the device is in the unlocked mode (e.g., as shown in fig. 5AI), the first user interface is replaced with a display of a fourth user interface associated with the camera application (e.g., camera application user interface 5172). In accordance with a determination that the device is in the locked mode (e.g., as shown in fig. 5AH), the first user interface is replaced with a display of the fourth user interface associated with the camera application, and access to an image library associated with the camera application is limited (e.g., only images captured after the first input was received are accessible).
In some implementations, the swipe gesture is in a fourth direction (e.g., upward, along the path shown by arrow 5206 in fig. 5AK and/or along the path shown by arrow 5314 in fig. 5AS) that is different from the first direction, the second direction, and the third direction. In accordance with a determination that the device is in the unlocked mode (e.g., as shown in fig. 5AS), the first user interface 512 is replaced with a display of a control panel user interface (e.g., the multi-page control panel user interface 5208). In some embodiments, in accordance with a determination that the device is in the locked mode (e.g., as shown in fig. 5AK), the first user interface 512 is replaced with a display of the multi-page control panel user interface 5208, and at least one panel and/or control of the control panel user interface 5208 is not accessible (e.g., as shown in fig. 5AN). For example, when the device is in the locked mode (e.g., as shown by the locked state of the lock icon 514 in fig. 5AN), remote device controls, such as home accessory controls (e.g., remote device control affordances 5272 through 5286), are inoperable and/or not displayed. In some embodiments, in accordance with a determination that the device is in the locked mode, the device displays a locked-mode indication (e.g., a closed padlock icon 514) on the first user interface 512, the second user interface (e.g., notification user interface 520 and/or mini-application user interface 5122), the third user interface (e.g., notification user interface 520 and/or mini-application user interface 5122), the fourth user interface (e.g., camera application user interface 5172), and/or the control panel user interface 5208. In some embodiments, in accordance with a determination that the device is in the unlocked mode, the device displays an unlocked-mode indication (e.g., an open padlock icon, such as the unlocked state of lock icon 514) on the first user interface, the second user interface, the third user interface, the fourth user interface, and/or the control panel user interface.
In some embodiments, the second content includes (616) one or more mini-application objects (e.g., calendar mini-application 5125, weather mini-application 5126, next mini-application 5132, and/or mail mini-application 5162), and the second additional content includes limited mini-application object information.
In some embodiments, the second content includes (618) one or more notifications (e.g., calendar invitation notification 550, message notification 542, voice mail notification 544, and/or notification 5108), and the second additional content includes limited notification information.
In some embodiments, the second user interface includes (620) a first search input area 524 (e.g., as shown in fig. 5F-2 through 5F-4 and 5F-6 through 5F-8), and the third user interface includes a second search input area 5124 (e.g., as shown in fig. 5T-2 and 5T-4).
In some implementations, the device detects (622) an input (e.g., text input for filtering a search) in respective search input areas of the first and second search input areas (e.g., search input area 524, which contains the input word "movie", and/or search input area 5124).
In response to detecting an input in the respective search input area, in accordance with a determination that the device is in the locked mode of the display on state, the device displays (624) a first set of search results (e.g., as shown in figs. 5F-4 and 5I). In accordance with a determination that the device is in the unlocked mode of the display on state, the device displays a second set of search results that is different from the first set of search results (e.g., a set of search results that includes additional search results that are restricted due to security and/or privacy policies of the device and are not included in the first set of search results) (e.g., as shown in figs. 5F-6 and 5J). For example, message 552, message 554, and email 556 are shown in fig. 5J, which illustrates the unlocked mode of the device, but are not shown in fig. 5I, which illustrates the locked mode of the device.
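A small Swift sketch of this lock-state-dependent filtering follows; the result type and the privacy flag are illustrative assumptions.

import Foundation

// A minimal sketch of producing different search result sets for the
// locked and unlocked modes of the display-on state.
struct SearchResult {
    let text: String
    let isPrivacyRestricted: Bool // e.g., message and email contents
}

func results(matching query: String,
             in allResults: [SearchResult],
             deviceIsLocked: Bool) -> [SearchResult] {
    allResults.filter { result in
        result.text.localizedCaseInsensitiveContains(query)
            && (!deviceIsLocked || !result.isPrivacyRestricted)
    }
}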
In some embodiments, in response to detecting a first input (e.g., while the display is active), such as an input through contact 508 shown in fig. 5D, it is determined whether the device is in an unlocked mode or a locked mode of the display on state (626).
In some implementations, in response to detecting a swipe gesture (e.g., as described with reference to 606) on the touch-sensitive surface (e.g., after display activation), it is determined whether the device is in a locked mode or an unlocked mode of the display on state (628).
In some embodiments, the electronic device includes (630) one or more sensors (including, for example, accelerometers, gyroscopes, microphones, vibration sensors, thermal sensors, touch sensors, and/or light sensors), and detecting the first input (to activate a display of the device) includes detecting a change in the device environment (e.g., the device is picked up, as shown in fig. 5A-1 through 5A-4) with the one or more sensors. In some embodiments, the change in the device's environment is, for example, a change in the internal and/or external state of the device, such as a change in the device's orientation and/or a change in the environment (e.g., light level). For example, the environment of the device changes when the user picks up the device from a table and/or removes the device from a pocket to view the device held in the hand 502, and/or in response to detecting a trigger phrase or keyword spoken by the user. In response to detecting the change in the device environment, the device determines whether the change in the device environment meets display activation criteria. In some implementations, the display activation criteria include one or more of tilt criteria (e.g., a threshold deviation from an initial position and/or a threshold deviation from a horizontal axis), acceleration criteria, and/or illumination level change criteria. In some embodiments, the output of one or more sensors is used to prevent false positives. For example, if the light level variation criteria are met, but the amount of acceleration is below a threshold acceleration level (e.g., different from the acceleration criteria), the display activation criteria are not met. In some embodiments, as the orientation of the device changes, the wallpaper and/or notification expands as the orientation changes (e.g., as shown in fig. 5A-3 through 5A-4).
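The sensor-fusion idea above, requiring both a tilt change and sufficient acceleration so that a lighting change alone cannot wake the display, can be sketched in Swift with Core Motion. The class name, update rate, and threshold values are illustrative assumptions.

import CoreMotion

// A minimal sketch of display activation criteria that combine a tilt
// criterion with an acceleration criterion to avoid false positives.
final class RaiseToWakeDetector {
    private let motionManager = CMMotionManager()
    private let tiltThreshold = 0.5          // radians of pitch change
    private let accelerationThreshold = 0.15 // in g, above resting noise

    func start(onWake: @escaping () -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
        var referencePitch: Double?
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            let pitch = motion.attitude.pitch
            if referencePitch == nil { referencePitch = pitch }
            let tilted = abs(pitch - (referencePitch ?? pitch)) > self.tiltThreshold
            let a = motion.userAcceleration
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            // Both criteria must hold: a light-level change alone (or a
            // tilt with almost no acceleration) does not wake the display.
            if tilted && magnitude > self.accelerationThreshold {
                self.motionManager.stopDeviceMotionUpdates()
                onWake()
            }
        }
    }
}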
In some implementations, detecting the first input (to activate the display of the device) includes (632) detecting activation of a display activation affordance (e.g., a physical button 204 and/or a virtual button on the touch-sensitive surface 112). In some embodiments, detecting activation of the display activation affordance includes detecting operation of a physical button 204 (such as a sleep/wake button 206). In some embodiments, detecting activation of the display activation affordance includes detecting, by one or more sensors for detecting intensities of contacts with the touch-sensitive surface (e.g., at a location corresponding to a virtual button, such as a virtual home button configured to detect force, contact area, and/or fingerprints), an increase in the characteristic intensity of a contact that meets a respective intensity threshold (e.g., IT A, as indicated by intensity meter 510 of fig. 5D).
In some embodiments, detecting the first input (to activate the display of the device) includes (634) detecting a contact with a display activation affordance (e.g., the physical sleep/wake button 206 and/or the physical home button 204) that activates the display of the device without actuating the display activation affordance. In some embodiments, a contact with the display activation affordance that does not actuate the display activation affordance is a contact with the physical button 204 that does not move and/or depress the physical button (e.g., a light touch on a button containing a fingerprint sensor, such as the physical home button, for a Touch ID scan), as shown in fig. 5B.
In some embodiments, detecting the first input (to activate the display of the device) includes (636) detecting a contact with a display activation affordance (e.g., a virtual button and/or physical button 204) that activates the display of the device (e.g., wakes the device or wakes the touch screen display) without activating the display activation affordance to perform at least one additional function associated with the display activation affordance. In some embodiments, the display activation affordance is a virtual button or physical button 204 that triggers performance of different functions in response to contacts at different intensity levels. For example, a contact having a characteristic intensity below a first intensity threshold (e.g., IT A, as indicated by the intensity meter 510 of fig. 5B) activates a first function associated with the virtual button (e.g., activating the display of the device), and a contact 508 having a characteristic intensity at or above the first intensity threshold (e.g., as shown in fig. 5D) activates a second function associated with the virtual button (e.g., unlocking the device (if the device is locked) and displaying a home screen (e.g., application springboard user interface 400, as shown in fig. 5E)).
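In code, this two-threshold behavior reduces to a simple mapping from the measured characteristic intensity to an action; the threshold value and action names below are illustrative assumptions.

// A minimal sketch of a display activation affordance that performs
// different functions at different intensity levels: below IT A the
// press only wakes the display; at or above IT A it unlocks the
// device and shows the home screen.
enum AffordanceAction { case wakeDisplay, unlockAndShowHome }

func action(forNormalizedIntensity intensity: Double,
            thresholdITA: Double = 0.5) -> AffordanceAction {
    intensity >= thresholdITA ? .unlockAndShowHome : .wakeDisplay
}

// Example: a light touch wakes the display; a firm press goes home.
let light = action(forNormalizedIntensity: 0.2) // .wakeDisplay
let firm = action(forNormalizedIntensity: 0.7)  // .unlockAndShowHome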
In some implementations, while the first user interface 512 is displayed, the device detects (638) a second input and, in response to detecting the second input, the device replaces the display of the first user interface with a display of a user interface that includes a plurality of application icons corresponding to different ones of a plurality of applications (e.g., at a time after a second swipe gesture in a direction opposite the first direction is detected on the touch-sensitive surface). For example, the device detects a second input including a contact 508 on the display activation affordance (e.g., physical button 204 or a virtual button) that, e.g., meets device unlocking criteria for unlocking the device (if the device is locked), and in response to the second input, the device displays a home screen (e.g., application springboard user interface 400, as shown in fig. 5E). In some embodiments, the device unlocking criteria require that the characteristic intensity of the contact in the detected input meet or exceed a respective intensity threshold (e.g., if the display activation affordance is a virtual button). For example, in some embodiments, the device unlocking criteria include a criterion that is met when the characteristic intensity of the contact increases above the deep press intensity threshold IT D. In some embodiments, the device unlocking criteria require that the contact physically actuate the display activation affordance (if the display activation affordance is a physical button 204), for example, as shown in fig. 5D. In some embodiments, in response to detecting the second input, the device replaces the display of the first user interface (e.g., the wake screen user interface 512) with a display of a user interface that includes a plurality of application icons corresponding to different ones of the plurality of applications (e.g., the user interface that includes the plurality of application icons is the application springboard user interface 400, e.g., as shown in fig. 5E). In some implementations, the second input includes a press of a physical button (e.g., a home button). In some embodiments, the second input includes detecting, by one or more sensors for detecting intensities of contacts with the touch-sensitive surface (e.g., at a location corresponding to a virtual button, such as a virtual home button configured to detect force and/or fingerprints), an increase in the characteristic intensity of a contact that meets the device unlocking criteria for unlocking the device (if the device is locked) and displaying the home screen. In some embodiments, when an input detected on a respective application icon in the user interface that includes the plurality of application icons meets application launch criteria, the device launches an application corresponding to the respective application icon. In some implementations, the application launch criteria are met by a tap input, e.g., without requiring the characteristic intensity of the tap input to meet a respective intensity threshold.
In some embodiments, in response to detecting the second input, the device determines (640) whether the device is in the locked mode or the unlocked mode of the display on state, and in accordance with a determination that the device is in the locked mode of the display on state (e.g., the device is not unlocked by a Touch ID scan occurring when the second input is provided, and/or the device has not been unlocked by a previous input provided before the second input), the device displays an authentication user interface (e.g., prompting the user to provide an unlock input to unlock the device), such as the user authentication prompt user interface 558 including password input interface 560, as shown in fig. 5L.
In some embodiments, detecting the first input includes (642) detecting activation of the display activation affordance. For example, the first input includes a contact 508 on the display activation affordance (e.g., physical button 204 or virtual button).
In some embodiments, the respective mini-application object is configured (644) to perform a subset, less than all, of the functions of a corresponding application of the respective mini-application object.
In some embodiments, the respective mini-application object (e.g., mini-application object 5125 shown in fig. 5U) displays (646) an identifier of the corresponding application for the respective mini-application object (e.g., text 5131 identifying the corresponding application and/or icon 5135 identifying the corresponding application, as shown in fig. 5U).
In some implementations, the respective mini-application object displays (648) a portion of content from the corresponding application of the respective mini-application object. For example, calendar mini-application 5125 includes a portion of content from a calendar application, as shown in calendar application user interface 5144 in fig. 5Z (e.g., the portion of content includes a subset of hours, such as the hours near the current time, from a day's calendar in the calendar application).
In some embodiments, predefined inputs (650) on a respective mini-application object (e.g., including inputs corresponding to contacts 5143 at the location of the next mini-application object 5132, as shown in fig. 5Y) launch an application (e.g., a calendar application, as shown in calendar application user interface 5144 in fig. 5Z) corresponding to the respective mini-application object.
In some embodiments, the respective mini-application object is run (652) as a separate application residing in the memory of the device, the separate application being different from an associated application also residing in the memory of the device.
In some embodiments, the respective mini-application object is run as an extension or component of the associated application on the device (654).
In some implementations, the respective mini-application object has (656) a dedicated memory portion for temporarily storing information.
In some implementations, this memory portion is accessible by a corresponding full function application of the respective mini-application object (658).
In some embodiments, a notification is a data object issued by an application (or a server associated with the application) for display outside of the normal user interface of the application by the operating system (or a notification management module of the operating system). The notification may include data retrieved from a data store that is accessible to both the notification and the application associated with the notification. In some embodiments, the notification may include a programmable component (e.g., a mini-application object or extension) that dynamically loads or generates data for display on the device. In some embodiments, the notification received from the application (or a server associated with the application) includes data for generating both a short, more abbreviated displayable version of the notification and a long, more complex displayable version of the notification for display on the device.
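A notification payload of this kind might be modeled as below; the field names and the Codable encoding are illustrative assumptions, not the patent's format.

import Foundation

// A minimal sketch of a notification data object that carries both an
// abbreviated version (for the standard display) and an expanded
// version (for the longer display) of the notification.
struct NotificationPayload: Codable {
    struct ShortVersion: Codable {
        let title: String
        let summary: String
    }
    struct LongVersion: Codable {
        let title: String
        let fullText: String
        let imageName: String?  // e.g., an attached photo
        let actions: [String]   // e.g., ["Reply", "Delete"]
    }
    let shortVersion: ShortVersion // abbreviated display
    let longVersion: LongVersion   // expanded display
}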
In some embodiments, the mini-application object is configured to perform a subset of less than all of the functionality of the corresponding application.
In some embodiments, the mini-application object displays an identifier of the corresponding application.
In some implementations, the mini-application object displays a portion of content from the corresponding application. For example, the map mini-application object displays a portion of a map displayed in a map application corresponding to the map mini-application object. For example, a calendar mini-application object displays a portion of a calendar displayed in a corresponding calendar application.
In some implementations, predefined inputs on the mini-application object launch the corresponding application.
In some embodiments, the mini-application object operates as a standalone application residing in the memory of the device, the standalone application being different from an associated application also residing in the memory of the device. For example, a mini-application object corresponding to a social networking application operates as a single-purpose or streamlined application with a subset, less than all, of the functionality of the corresponding application, but is associated with the full-featured social networking application. In this embodiment, the mini-application object runs independently of the social networking application and continues to run even when the social networking application is not running.
In some embodiments, the mini-application object operates as an extension or component of an associated application on the device. For example, the mini-application object of the calendar application is a single feature or operating component of the full-function calendar application. In this embodiment, if the calendar application is not running (e.g., in the background), then the calendar mini-application object is also not running.
In some embodiments, the mini-application object has a dedicated memory portion for temporarily storing information. In some embodiments, the memory portion is accessible by a corresponding full-function application. For example, a mini-application object for an instant messaging application has a memory portion for temporarily storing a partially written reply message. In this embodiment, if the user opens the corresponding application in the middle of writing the reply message, the contents of the reply message are retrieved from the temporary storage location and used by the full-function application to allow the user to complete his reply message.
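The shared temporary-storage arrangement in the instant-messaging example above could look like the following Swift sketch, where the mini application and the full-featured application share an app-group UserDefaults suite. The suite name and keys are illustrative assumptions.

import Foundation

// A minimal sketch of a dedicated memory portion shared between a
// mini-application object and its full-featured application, used to
// hand off a partially written reply message.
enum ReplyDraftStore {
    private static let shared = UserDefaults(suiteName: "group.example.messages")
    private static let draftKey = "inlineReplyDraft"

    // Called by the mini-application object as the user types.
    static func saveDraft(_ text: String) {
        shared?.set(text, forKey: draftKey)
    }

    // Called by the full-featured application when it opens, so the
    // user can finish the reply begun in the mini application.
    static func loadDraft() -> String? {
        shared?.string(forKey: draftKey)
    }

    static func clearDraft() {
        shared?.removeObject(forKey: draftKey)
    }
}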
In some embodiments, the mini-application is a combination of any of the features described in paragraphs [00299] to [00305 ].
It should be understood that the particular order in which the operations in fig. 6A-6E are described is merely exemplary and is not intended to represent the only order in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein. In addition, it should be noted that the details of other processes described herein with reference to other methods described herein (e.g., methods 700, 800, and 900) are equally applicable in a similar manner to method 600 described above with reference to fig. 6A-6E. For example, the contacts, gestures, user interface objects, intensity thresholds, focus selectors, and animations described above with reference to method 600 optionally have one or more of the features of the contacts, gestures, user interface objects, intensity thresholds, focus selectors, and animations described herein with reference to other methods described herein (e.g., methods 700, 800, and 900). For brevity, these details are not repeated here.
Figs. 7A-7E illustrate a flowchart of a method 700 of accessing controls from a display-on user interface, in accordance with some embodiments. The method 700 is performed on an electronic device (e.g., the device 300 of fig. 3, or the portable multifunction device 100 of fig. 1A) having a display, a touch-sensitive surface, and one or more sensors for detecting intensities of contacts with the touch-sensitive surface. In some implementations, the display is a touch screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 700 are optionally combined, and/or the order of some operations is optionally changed.
As described below, the method 700 provides an intuitive way of accessing controls from a display-on user interface. The method reduces the cognitive burden on a user when accessing controls, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to access controls faster and more efficiently conserves power and increases the time interval between battery charges.
While the device is in a display off state (e.g., a state in which the display is turned off or darkened with the touch-sensitive surface in the touch screen remaining active, a state in which the touch screen is turned off, and/or a state in which the display functionality of the touch screen is turned off, e.g., implemented in a sleep/hibernate mode, a power save mode, or another mode with limited functionality), the device detects (702) a first input, e.g., an input via contact 508 at physical button 204 as shown in fig. 5B, to activate the display of the device (e.g., wake the device or wake the touch screen display).
In response to detecting the first input (704), the device activates the display of the device and displays a first user interface 512 (e.g., a wake screen 512 as shown in FIG. 5C, a user interface that is displayed immediately upon waking up the device or activating the display of the device) on the display corresponding to the display on state of the device.
The device detects (706) a swipe gesture on the touch-sensitive surface while a first user interface corresponding to a display-on state of the device is displayed.
In response to detecting the swipe gesture (708), in accordance with a determination that the swipe gesture is in a first direction (e.g., leftward, as shown in figs. 5AH and 5AI), the device replaces the display of the first user interface 512 with a display of the camera application user interface 5172. In accordance with a determination that the swipe gesture is in a second direction (e.g., rightward, along the path indicated by arrow 5120 in fig. 5T-1 and/or along the path indicated by arrow 5121 in fig. 5T-3) that is different from the first direction, the device replaces the display of the first user interface 512 with a display of a mini-application object user interface 5122 that is configured to include a plurality of mini-application objects (e.g., calendar mini-application 5125, weather mini-application 5126, next mini-application 5132, and/or mail mini-application 5162). Respective ones of the plurality of mini-application objects have corresponding applications (e.g., a calendar application, a weather application, and/or a mail application) stored in the device. In accordance with a determination that the swipe gesture is in a third direction (e.g., upward, along the path indicated by arrow 5206 in fig. 5AK and/or along the path indicated by arrow 5314 in fig. 5AS) that is different from the first direction and the second direction, the device displays a first page of the multi-page control panel user interface 5208. For example, the device displays a multi-page control panel user interface 5208 that includes a first page, as shown in fig. 5AL, having control affordances (e.g., 5210 through 5232) for adjusting device settings; a second page, as shown in fig. 5AM, having a second control user interface that includes media player control affordances (e.g., 5242 through 5252); and/or a third page, as shown in fig. 5AN, having a third control user interface that includes remote device control affordances (e.g., 5272 through 5286). In accordance with a determination that the swipe gesture is in a fourth direction (e.g., downward, along the path indicated by arrow 518 in fig. 5F-1 and/or along the path indicated by arrow 536 in fig. 5F-5) that is different from the first direction, the second direction, and the third direction, the device displays a notification user interface 520 that is configured to display a plurality of notifications (e.g., calendar invitation notification 550, message notification 542, voicemail notification 544, and/or notification 5108). In some embodiments, the notification includes information received by the device corresponding to a communication, such as a telephone call, a video call, a voicemail, an email, an SMS, an MMS, and/or a calendar invitation. In some embodiments, the notification is a calendar appointment notification, a calendar appointment invitation, a reminder, and/or another type of notification generated by an application installed on the device.
Providing a camera application user interface, mini-application object user interface, multi-page control panel user interface, and notification user interface that is accessible via input received at the user interface corresponding to the display on state of the device allows a user to view information (e.g., common information such as application information and device settings) on the device display without fully activating the device. Providing access to such information in the display-on state of the device enhances the operability of the device and makes the user-device interface more efficient (e.g., by allowing a user to access the information in the display-on state of the device without fully activating the device), which additionally reduces power usage and extends battery life of the device by enabling the user to use the device more quickly and efficiently.
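The four-way dispatch described above maps naturally onto a direction switch; the destination names in the following Swift sketch are illustrative placeholders.

import UIKit

// A minimal sketch of routing the four swipe directions detected on
// the first (wake screen) user interface to the four destinations
// described above.
enum WakeScreenDestination {
    case cameraUI            // swipe left
    case miniApplicationsUI  // swipe right
    case controlPanelUI      // swipe up
    case notificationsUI     // swipe down
}

func destination(for direction: UISwipeGestureRecognizer.Direction)
    -> WakeScreenDestination? {
    switch direction {
    case .left:  return .cameraUI
    case .right: return .miniApplicationsUI
    case .up:    return .controlPanelUI
    case .down:  return .notificationsUI
    default:     return nil // combined or unrecognized directions
    }
}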
In some embodiments, replacing the display of the first user interface 512 with the display of the camera application user interface 5172 includes (710) delaying the display of one or more control affordances associated with the camera application (e.g., the control affordances 5186 shown in figs. 5AH and 5AI) until a control display interval has elapsed (e.g., measured from the point in time when the swipe gesture was detected or from the point in time when the camera application user interface was displayed). For example, in figs. 5AH and 5AI, the control affordances 5186 are not displayed until time t3, after the leftward gesture was initiated at time t1. In some embodiments, replacing the display of the first user interface 512 with the display of the camera application user interface 5172 includes progressively displaying the camera application user interface as the swipe gesture is received, and displaying the one or more control affordances in accordance with a determination that the displayed portion of the camera application user interface 5172 has increased above a threshold portion. In some embodiments, by delaying the display of the one or more control affordances 5186 associated with the camera application until the control display interval has elapsed or until the threshold portion of the camera application user interface 5172 is displayed, the user is given some time and opportunity to revert to a previous user interface (e.g., a home screen, a wake screen, etc.) without lifting off contact 5176. In other words, navigation of the user interface becomes smoother and more efficient, and the user can change his/her mind after seeing a hint of the result of continuing the swipe gesture in the current direction (e.g., continuing to swipe in the current direction will enter the camera application user interface). In some embodiments, some camera control functions associated with the control affordances (such as some auxiliary light sensors and/or some front-end control functions) and some control functions not associated with the control affordances (e.g., some back-end control functions) take time and power to activate, and activation of these control functions is delayed until the user's intent to activate the camera application is confirmed by the user continuing the swipe gesture past the relevant threshold.
In some implementations, the device detects a gesture (e.g., a swipe gesture in an opposite direction of the swipe gesture that resulted in the display of the camera application user interface) back to the first user interface during the control display interval (712), and in response to detecting the gesture back to the first user interface during the control display interval, the device replaces the display of the camera application user interface (e.g., an already displayed portion of the camera application user interface) with the display of the first user interface (e.g., resumes the first user interface) (714).
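A timer-based sketch of this control display interval, including cancellation when the user reverses the gesture, might look like the following Swift code. The interval, the reveal threshold, and all names are illustrative assumptions.

import Foundation

// A minimal sketch of delaying the camera control affordances until a
// control display interval elapses or a threshold portion of the
// camera UI is revealed, while letting a reversed gesture cancel them.
final class CameraRevealController {
    private let controlDisplayInterval: TimeInterval = 0.5
    private let revealThreshold: Double = 0.6 // fraction of the screen
    private var timer: Timer?
    var showControls: (() -> Void)?

    // Called when the leftward swipe begins revealing the camera UI.
    func swipeBegan() {
        timer = Timer.scheduledTimer(withTimeInterval: controlDisplayInterval,
                                     repeats: false) { [weak self] _ in
            self?.showControls?()
        }
    }

    // Called repeatedly with the revealed fraction of the camera UI.
    func swipeProgressed(revealedFraction: Double) {
        if revealedFraction >= revealThreshold {
            timer?.invalidate()
            showControls?()
        }
    }

    // Called when the gesture reverses back to the first user
    // interface during the interval; the controls never appear.
    func swipeReversed() {
        timer?.invalidate()
        timer = nil
    }
}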
In some implementations, while the multi-page control panel user interface 5208 is displayed (e.g., as shown in fig. 5AL), the device detects (716) a gesture (e.g., a leftward gesture along the path shown by arrow 5240) on the touch-sensitive surface at a location corresponding to a first page of the multi-page control panel user interface 5208 (e.g., within and/or adjacent to the first page of the multi-page control panel user interface 5208). In some embodiments, in response to detecting the gesture at the location corresponding to the first page of the multi-page control panel, the device displays (718) a second page of the multi-page control panel user interface (e.g., including controls for media playback and/or controls for accessory devices (such as home device accessories)), such as the second page of the multi-page control panel user interface 5208 shown in fig. 5AM.

In some implementations, while displaying the second page of the multi-page control panel user interface 5208, the device detects (720) a gesture (e.g., a leftward gesture along the path shown by arrow 5270) on the touch-sensitive surface at a location corresponding to the second page of the multi-page control panel user interface (e.g., within and/or adjacent to the second page of the multi-page control panel user interface), and in response to detecting the gesture at the location corresponding to the second page of the multi-page control panel user interface, the device displays (722) a third page of the multi-page control panel user interface (e.g., including controls for one or more accessory devices (such as home devices) communicatively coupled to the electronic device), such as the third page of the multi-page control panel user interface 5208 shown in fig. 5AN.
In some implementations, upon displaying notification user interface 520, the device detects (724) a dismissal gesture (e.g., swipe left or a gesture received at an affordance such as an "X" affordance (e.g., 566 in FIG. 5M) or a quick-action item such as an "accept calendar invitation" quick-action item) on touch-sensitive surface 112 at a location corresponding to a respective notification included in the notification user interface (e.g., within and/or adjacent to a notification tray at the location corresponding to the respective notification), and in response to detecting the dismissal gesture, the device stops (726) displaying the respective notification in the notification user interface. In some embodiments, when the notification user interface is subsequently displayed, the corresponding notification is not displayed.
In some implementations, while displaying the notification user interface, the device detects (728) a launch gesture (e.g., a tap gesture, such as a tap by contact 592 at a location corresponding to the notification 550 (as shown in fig. 5P) or at a location corresponding to an expanded version of the notification) on the touch-sensitive surface at a location corresponding to a first notification included in the notification user interface, and in response to detecting the launch gesture, the device launches (730) an application corresponding to the first notification (e.g., as shown in fig. 5Q).
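A minimal Swift sketch of the dismissal and launch behaviors from the two preceding paragraphs. The model type, its fields, and the launcher closure are hypothetical; the point is that dismissal removes the notification from the backing collection, so it is absent when the notification user interface is subsequently displayed, while a launch gesture hands control to the corresponding application.

```swift
import Foundation

struct AppNotification {
    let id: UUID
    let appIdentifier: String // identifies the application to launch (assumed field)
}

final class NotificationListModel {
    private(set) var notifications: [AppNotification] = []

    func post(_ notification: AppNotification) {
        notifications.append(notification)
    }

    // Dismissal gesture (e.g., a leftward swipe or an "X" affordance): remove the
    // notification so it is not shown when the notification UI is redisplayed.
    func dismiss(id: UUID) {
        notifications.removeAll { $0.id == id }
    }

    // Launch gesture (e.g., a tap on the notification or its expanded version):
    // hand off to the corresponding application.
    func launch(id: UUID, launcher: (String) -> Void) {
        guard let notification = notifications.first(where: { $0.id == id }) else { return }
        launcher(notification.appIdentifier)
    }
}
```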
In some implementations, while displaying the mini-application object user interface, the device detects (732) a launch gesture (e.g., a tap gesture) on the touch-sensitive surface at a location corresponding to a first mini-application object included in the mini-application object user interface. For example, the launch gesture is an input by contact 5143, as shown in fig. 5Y. In some implementations, the launch gesture is detected at an expanded version of the mini-application object. In some implementations, the launch gesture is detected at a non-expanded version of the mini-application object.
In response to detecting the launch gesture, the device launches (734) an application corresponding to the first mini-application object. For example, in response to a launch gesture performed by contact 5143 at a location corresponding to the next mini-application object 5132, as shown in fig. 5Y, the corresponding calendar application 5144 is displayed, as shown in fig. 5Z.
In some embodiments, while displaying the mini-application object user interface 5122, the device detects (736) an information expansion gesture (e.g., a tap gesture at a "display more" affordance on a mini-application object platter) on the touch-sensitive surface at a location corresponding to a second mini-application object included in the mini-application object user interface, wherein the second mini-application object includes mini-application object information (e.g., content, functions, and/or an input device such as a control affordance and/or a keyboard). For example, the information expansion gesture is an input by contact 5128 at a location corresponding to the "display more" affordance 5127 of the calendar mini-application 5125, as shown in fig. 5U, and/or an input by contact 5139 at a location corresponding to the "display more" affordance 5141 of the next mini-application 5132, as shown in fig. 5X. In response to detecting the information expansion gesture (738), the device expands the second mini-application object (e.g., increases the vertical size of the mini-application object, as shown by the expanded view of calendar mini-application object 5125 in fig. 5W and the expanded view of the next mini-application object 5132 in fig. 5Y). The device displays the mini-application object information and additional mini-application object information in the expanded second mini-application object. For example, as shown in fig. 5Y, the expanded view of the next mini-application 5132 includes the information displayed in the view of the next mini-application 5132 shown in fig. 5X, as well as additional information, such as a map 5134 and/or action items 5136 ("navigation"), 5138 ("snooze"), 5140 ("invitee information"), and/or 5142 ("delete event"). In some embodiments, other mini-application objects move downward and/or scroll off the bottom of the mini-application object user interface (e.g., as shown by weather mini-application object 5126 in fig. 5Y), making room for the expanded mini-application object.
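As a hedged sketch of how such an expansion might look in Swift/UIKit: activating the "display more" affordance grows the platter's height constraint and reveals a detail view, and Auto Layout pushes the platters below it downward. The heights, names, and animation duration are assumptions.

```swift
import UIKit

final class MiniAppObjectView: UIView {
    private var heightConstraint: NSLayoutConstraint!
    private let collapsedHeight: CGFloat = 110 // assumed platter height
    private let expandedHeight: CGFloat = 260  // assumed expanded height

    // Holds the additional information (e.g., a map and action items);
    // hidden while the platter is collapsed.
    let detailView = UIView()

    override init(frame: CGRect) {
        super.init(frame: frame)
        translatesAutoresizingMaskIntoConstraints = false
        heightConstraint = heightAnchor.constraint(equalToConstant: collapsedHeight)
        heightConstraint.isActive = true
        detailView.isHidden = true
        addSubview(detailView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not supported") }

    // Information-expansion gesture handler: grow the platter vertically and
    // reveal the additional mini-application object information. Sibling
    // platters below this one are pushed down by the layout pass.
    func expand() {
        heightConstraint.constant = expandedHeight
        detailView.isHidden = false
        UIView.animate(withDuration: 0.25) { self.superview?.layoutIfNeeded() }
    }
}
```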
In some embodiments, the respective mini-application object is configured to perform (740) a subset of less than all of the functionality of the corresponding application of the respective mini-application object.
In some embodiments, the respective mini-application object displays (742) an identifier of the corresponding application for the respective mini-application object (e.g., text 5131 identifying the corresponding application and/or icon 5135 identifying the corresponding application, as shown in fig. 5U).
In some embodiments, the respective mini-application object displays (744) a portion of content from the corresponding application of the respective mini-application object. For example, calendar mini-application 5125 includes a portion of content from a calendar application, as shown in calendar application user interface 5144 in fig. 5Z (e.g., a subset of hours, such as hours near the current time, from a day's calendar in the calendar application).
In some embodiments, a predefined input on a respective mini-application object (e.g., an input by contact 5143 at a location corresponding to the next mini-application object 5132, as shown in fig. 5Y) launches (746) the corresponding application of the respective mini-application object (e.g., a calendar application, as shown by calendar application user interface 5144 in fig. 5Z).
In some embodiments, the respective mini-application object is run (748) as a separate application residing in the memory of the device that is different from an associated application also residing in the memory of the device.
In some embodiments, the respective mini-application object is run as an extension or component of the associated application on the device (750).
In some implementations, the respective mini-application object has (752) a dedicated memory portion for temporarily storing information.
In some implementations, the memory portion is accessible by a corresponding full function application of the respective mini-application object (754).
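On iOS, one plausible realization of a dedicated, temporarily stored memory portion that both the mini-application object and its corresponding full-function application can access is an App Group shared container. The sketch below assumes such a group; the suite name and keys are illustrative.

```swift
import Foundation

// Shared defaults backed by an App Group container (hypothetical group name).
let sharedDefaults = UserDefaults(suiteName: "group.com.example.miniapp")

// Mini-application object (widget) side: temporarily store information.
sharedDefaults?.set(Date(), forKey: "lastWidgetRefresh")

// Full-function application side: read what the widget stored.
if let lastRefresh = sharedDefaults?.object(forKey: "lastWidgetRefresh") as? Date {
    print("Widget last refreshed at \(lastRefresh)")
}
```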
It should be understood that the particular order in which the operations in fig. 7A-7E are described is merely exemplary and is not intended to represent the only order in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein. Additionally, it should be noted that the details of other processes described herein with reference to other methods described herein (e.g., methods 600, 800, and 900) are equally applicable in a similar manner to method 700 described above with reference to fig. 7A-7E. For example, the contacts, gestures, and user interface objects and animations described above with reference to method 700 optionally have one or more of the features of the contacts, gestures, and user interface objects and animations described herein with reference to other methods described herein (e.g., methods 600, 800, and 900). For brevity, these details are not repeated here.
Fig. 8A-8C illustrate a flowchart of a method 800 of adding a mini-application object to a mini-application user interface, according to some embodiments. The method 800 is performed on an electronic device (e.g., the device 300 of fig. 3, or the portable multifunction device 100 of fig. 1A) having a display, a touch-sensitive surface, and one or more sensors for detecting the intensity of a contact with the touch-sensitive surface. In some implementations, the display is a touch screen display and the touch sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 800 are optionally combined, and/or the order of some operations is optionally changed.
As described below, the method 800 provides an intuitive way of adding mini-application objects to a mini-application user interface. The method reduces the cognitive burden on the user when adding mini-application objects to the mini-application user interface, thereby creating a more efficient human-machine interface. For battery-powered electronic devices, enabling a user to more quickly and efficiently add mini-application objects to the mini-application user interface saves power and increases the interval between battery charges.
The device displays (802) a first user interface 400 comprising a plurality of application icons corresponding to different applications of a plurality of applications installed on the device (e.g., the device displays a home screen or an application springboard 400, as shown in fig. 5AA).
The device detects (804) a first input (e.g., a light press or a deep press input by a first contact 5150) on the touch-sensitive surface at a location corresponding to a first application icon (e.g., mail application icon 418) of the plurality of application icons, the first application icon corresponding to a first application of the plurality of applications. For example, the first input is an input by contact 5150, as shown in figs. 5AB-5AD.
In response to detecting the first input, the device displays (806) a first mini-application object or a preview of the first mini-application object (e.g., a representation or contracted version of the first mini-application object, such as preview 5156, shown in fig. 5AD, of mini-application object 5162, shown in fig. 5AF) in an overlay area (e.g., overlaying the application springboard user interface 400). In some embodiments, one or more functional items of the mini-application object are not activated in the preview of the mini-application object. For example, no response occurs when an input is received at avatar 5164a of the mini-application preview 5156 of fig. 5AD, whereas a response occurs in response to an input received at avatar 5164a of the mini-application object 5162 of fig. 5AF (e.g., mail received from and/or sent to the user corresponding to avatar 5164a is displayed). As another example, the preview 5156 of the mini-application displays four avatars 5164a-5164d, and the mini-application object 5162 includes an affordance (the "display more" affordance 5168) for expanding the mini-application object 5162 to display additional avatars (e.g., 5164e-5164h, as shown in fig. 5AG). The first mini-application object (e.g., preview 5156) corresponds to a first application (e.g., a mail application) of the plurality of applications, and the overlay area includes an affordance (the "add widget" affordance 5160) for adding the first mini-application object (e.g., mini-application object 5162 corresponding to preview 5156) to a second user interface (e.g., mini-application user interface 5122) that displays a plurality of mini-application objects (e.g., including one or more user-selected mini-application objects and/or one or more default mini-application objects).
The device detects (808) a second input on the touch-sensitive surface at a location corresponding to the affordance 5160 for adding the first mini-application object 5162 to the second user interface (e.g., an input by a second contact 5150, such as a tap input, a light press input in which the characteristic intensity of the contact increases above a light press intensity threshold IT_L, or a deep press input in which the characteristic intensity of the contact increases above a deep press intensity threshold IT_D).
In response to detecting the second input, the device adds (810) the first mini-application object 5162 to the second user interface that displays the plurality of mini-application objects. In some embodiments, the plurality of mini-application objects in the second user interface (e.g., mail mini-application object 5162, calendar mini-application object 5125, and/or next mini-application object 5132 in mini-application user interface 5122) are displayed as a vertical stack of platters, as shown in fig. 5AF. In some embodiments, respective platters of the vertical stack correspond to respective mini-application objects of the plurality of mini-application objects, and each platter has the same initial platter height.
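A small Swift sketch of the add operation: activating the add affordance inserts the mini-application object at the topmost position of the vertical stack. The types, and the step that removes any existing copy before inserting, are assumptions made for illustration.

```swift
import Foundation

struct MiniAppObject: Equatable {
    let appIdentifier: String
}

final class MiniAppStack {
    private(set) var objects: [MiniAppObject] = []

    // Second input on the add affordance: place the new mini-application
    // object at the topmost position (index 0) of the vertical stack.
    func add(_ object: MiniAppObject) {
        objects.removeAll { $0 == object } // avoid duplicates (assumed policy)
        objects.insert(object, at: 0)
    }
}

// Usage: adding the mail application's mini-application object.
let stack = MiniAppStack()
stack.add(MiniAppObject(appIdentifier: "com.example.mail"))
```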
Adding mini-application objects to a mini-application object user interface in response to input corresponding to an application icon is an intuitive and efficient way to facilitate user selection of a desired mini-application object. Providing the ability to add mini-application objects from the application icons to the set of mini-application objects enhances the operability of the device and makes the user-device interface more efficient, which in addition reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, while the first user interface (e.g., the application springboard user interface 400) is displayed, the device detects (812) a third input on the touch-sensitive surface 112 that meets a first criterion (e.g., a navigation criterion, such as the third input being one or more rightward swipe gestures) for navigating from the first user interface to the second user interface (mini-application user interface 5122).
In some embodiments, in response to detecting the third input, the device replaces (814) the display of the first user interface (e.g., the application springboard user interface 400) with the second user interface (mini-application user interface 5122). Replacing the display of the first user interface with the second user interface includes ceasing to display the first user interface (e.g., the application springboard user interface 400) and displaying the second user interface (mini-application user interface 5122), including concurrently displaying the first mini-application object 5162 (e.g., with updated content from the first application) and at least one second mini-application object (e.g., calendar mini-application object 5125 and/or next mini-application object 5132) corresponding to a second application of the plurality of applications that is different from the first application, as shown in fig. 5AF. In some embodiments, the second user interface includes an "edit" affordance for rearranging the mini-application objects in the mini-application object stack and/or adding a new mini-application object to the mini-application object stack.
In some embodiments, the first mini-application object includes (816) at least one selectable information item (e.g., avatar 5164a, 5164b, 5164c, and/or 5164d). In some embodiments, a selectable information item is an item that displays information related to a particular type of object, such as a stock listing that displays stock information (e.g., opening price, daily high, price change), a weather forecast item that displays weather forecast information for a location (e.g., a city), an avatar 5164 of an e-mail sender who recently and/or frequently sent e-mail to device 100 and/or received e-mail from device 100, or a calendar event.
While the second user interface is displayed, the device detects (818) a fourth input on the touch-sensitive surface at a location corresponding to the at least one selectable information item.
In response to detecting the fourth input, the device displays (820) additional information associated with the selectable information item. For example, in a mini-application object that displays a plurality of stock listings and corresponding stock prices, each stock listing is a selectable information item; in response to a tap input detected at one of the stock listings, additional information about that stock listing (such as a stock value history) is displayed in the mini-application object. In some embodiments, the information associated with the selectable information item replaces the display of at least one previously displayed portion of the mini-application object. In some embodiments, the mini-application object expands and the additional information is displayed in the expanded region of the mini-application object.
In some implementations, the first mini-application object 5162 is added (822) to the topmost position in a vertical stack of mini-application objects (e.g., mini-application objects 5162, 5125, and/or 5132) in the second user interface (e.g., as shown in fig. 5AF). In some implementations, the first mini-application object 5162 is added at another location (e.g., bottommost, leftmost, or rightmost) in a stack, grid, or other collection of mini-application objects. In some embodiments, the user can change the location of one or more mini-application objects in the collection of mini-application objects.
In some embodiments, the first mini-application object includes (824) identifying information identifying the first application (e.g., text 5131 identifying the corresponding application and/or icon 5135 identifying the corresponding application, as shown in fig. 5U).
In some embodiments, the first mini-application object is configured to provide (826) a subset of the functionality provided by the first application (e.g., obtaining data from a remote device (such as weather data, stock data, traffic data, location data for the remote device, and/or map data), determining an upcoming calendar appointment, and/or determining travel time to a predefined location and/or point of interest).
In some implementations, the first mini-application object includes (828) a subset of content (e.g., text, an image, a portion of a calendar, a map (such as a real-time updated map showing the location of the electronic device and/or the location of a vehicle relative to the location of the electronic device), travel time to a predefined location and/or point of interest, weather data, and/or stock data) from the first application. In some embodiments, the content in the mini-application object is updated at periodic intervals and/or in response to user input.
In some implementations, in response to detecting the first input, the device displays (830) a menu 5154 of selectable options (e.g., one or more quick-action menu items) corresponding to the first application (e.g., in a panel displayed adjacent to the mini-application object).
In some implementations, a predefined input on the first mini-application object 5162 launches (832) the first application (e.g., launches a mail application).
In some embodiments, the first mini-application object is run (834) as a stand-alone application, residing in the memory of the device, that is different from the first application.
In some embodiments, the first mini-application object is run as an extension or component of the first application (836).
In some implementations, the first mini-application object has (838) a dedicated memory portion for temporarily storing information.
In some embodiments, the memory portion is accessible by the first application (840).
It should be understood that the particular order in which the operations in fig. 8A-8C are described is merely exemplary and is not intended to represent the only order in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein. Additionally, it should be noted that the details of other processes described herein with reference to other methods described herein (e.g., methods 600, 700, and 900) are equally applicable in a similar manner to method 800 described above with reference to fig. 8A-8C. For example, the contacts, gestures, user interface objects, and intensity thresholds described above with reference to method 800 optionally have one or more of the features of the contacts, gestures, user interface objects, and intensity thresholds described herein with reference to other methods described herein (e.g., methods 600, 700, and 900). For brevity, these details are not repeated here.
Fig. 9A-9E illustrate a flowchart of a method 900 of using a multi-page control panel navigation control, according to some embodiments. The method 900 is performed on an electronic device (e.g., the device 300 of fig. 3, or the portable multifunction device 100 of fig. 1A) having a display, a touch-sensitive surface, and one or more sensors for detecting intensity of contacts with the touch-sensitive surface. In some implementations, the display is a touch screen display and the touch sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 900 are optionally combined, and/or the order of some operations is optionally changed.
As described below, the method 900 provides an intuitive way of navigating controls using a multi-page control panel. The method reduces the cognitive burden of the user when using the multi-page control panel navigation control, thereby creating a more effective human-machine interface. For battery-operated electronic devices, enabling a user to more quickly and efficiently use multi-page control panel navigation controls saves power and increases the interval between battery charges.
While an initial user interface (e.g., an application user interface 5318, a springboard user interface 400 comprising a plurality of application icons, or a wake screen user interface 512 that is displayed when the electronic device wakes from a display off state) is displayed on the display, the device detects (902) a first gesture on the touch-sensitive surface (e.g., a first swipe gesture, such as an upward swipe by contact 5322 along the path indicated by arrow 5320, as shown in fig. 5AU).
In response to detecting the first gesture, the device displays (904) a first page of the multi-page control panel 5208 on the display, as shown in fig. 5AV. In some implementations, the first page of the multi-page control panel is displayed overlaid on a portion of the initial user interface (e.g., overlaid on calendar application user interface 5318). In some implementations, the first page of the multi-page control panel 5208 gradually rises from the lower edge of the display in accordance with the upward movement of the contact in the first gesture. The first page of the multi-page control panel includes a plurality of device control affordances (e.g., control affordances 5210-5228 described with reference to fig. 5AL). In some embodiments, a control affordance corresponds to a control for one or more features of the device and/or a control for initiating display of an application interface (such as an application interface of an application run by the device).
While a first page of the multi-page control panel is displayed, the device detects (906) a second gesture on the touch-sensitive surface. For example, the second gesture is a second swipe gesture, such as a horizontal swipe (e.g., via contact 5325 along a path indicated by arrow 5327) in a first direction orthogonal to the first swipe gesture direction. In some embodiments, the second gesture is not associated with a particular control affordance, e.g., the second gesture has the same effect at any location (within and/or adjacent to the multi-page control panel user interface) where the second gesture is received.
In response to detecting the second gesture, the device displays (908) a second page of the multi-page control panel 5208 (e.g., as shown in fig. 5AW), wherein the second page of the multi-page control panel includes a plurality of content playback control affordances (e.g., a playback position control slider, a volume control slider, play/pause, fast-forward, and/or rewind controls, such as control affordances 5242-5252 described with reference to fig. 5AM). In some embodiments, the content playback control affordances are different from the device control affordances. The second page of the multi-page control panel 5208 replaces the first page of the multi-page control panel 5208 on the display. In some implementations, during the transition in which the second page of the multi-page control panel user interface 5208 replaces the first page, an animation of the first page sliding out (e.g., sliding horizontally in accordance with the detected gesture) is displayed.
The multi-page control panel provides efficient access to common application functions and/or device settings to a greater extent than a single-page control panel. In general, the control panel may be accessed via input detected while the device is in any of a variety of device states (e.g., display on state, full active state displaying home screen user interface including a plurality of application icons, and/or full active state displaying application user interface). Providing a multi-page control panel increases the number of common features and settings that are accessible to the user without, for example, exiting the application, fully activating the device, and/or accessing a settings menu to access those features, which enhances the operability of the device and makes the user-device interface more efficient, which additionally reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some implementations, while displaying the second page of the multi-page control panel, the device detects (910) a third gesture on the touch-sensitive surface (e.g., a gesture made through contact 5329 moving along the path indicated by arrow 5331, as shown in fig. 5AW, or a gesture made through contact 5339 moving along the path indicated by arrow 5337, as shown in fig. 5AY).
In some embodiments, in response to detecting the third gesture, in accordance with a determination that the third gesture is of a first gesture type (e.g., the gesture is a downward swipe, as shown in fig. 5AW, such as a downward swipe starting near the top of the second control user interface; the gesture is a tap outside the second control user interface; and/or the gesture is a tap at a location corresponding to the control user interface dismiss affordance 5234), the device ceases (912) to display the second page of the multi-page control panel 5208 on the display (and displays the portion of the initial user interface that was overlaid, as shown in fig. 5AX). For example, as the contact 5329 moves downward in the third gesture, the vertical length of the multi-page control panel 5208 decreases, and the multi-page control panel 5208 appears to gradually descend.
In some embodiments, in accordance with a determination that the third gesture is of a second gesture type that is different from the first gesture type (e.g., the third gesture is a horizontal swipe in the first direction, such as an input through contact 5339 moving along the path indicated by arrow 5337, as shown in fig. 5AY), the device displays (914) a third page of the multi-page control panel, wherein the third page of the multi-page control panel includes a plurality of remote device control affordances (e.g., as shown in fig. 5AZ). In some embodiments, the third page of the multi-page control panel includes virtual remote control affordances (e.g., 5272-5286) for controlling one or more home automation devices, such as described with reference to fig. 5AN. In some embodiments, the remote device control affordances are different from the content playback control affordances in the second page and the device control affordances in the first page of the multi-page control panel.
In some embodiments, the plurality of device control affordances, the plurality of content playback control affordances, and/or the plurality of remote device control affordances are arranged according to a predefined layout. In some embodiments, the third gesture is not associated with a particular control affordance, e.g., the third gesture has the same effect at any location (within and/or adjacent to the multi-page control panel user interface) where the third gesture is detected.
In some embodiments, the plurality of remote device control affordances included in the third page of the multi-page control panel 5208 correspond (916) to different devices in a respective defined area (e.g., a "scene," such as a room and/or a building, e.g., the "living room" shown in fig. 5AZ) of a plurality of defined areas (e.g., rooms of a house and/or buildings of a campus) that is selectable by the user (e.g., as described with reference to figs. 5AP-5AQ).
In some embodiments, displaying the third page of the multi-page control panel 5208 includes (918) redisplaying one or more remote device controls that were included in the third page of the multi-page control panel when the third page of the multi-page control panel was last displayed.
In some embodiments, displaying the third page of the multi-page control panel includes displaying (920) one or more remote device controls in the third page of the multi-page control panel that correspond to a defined area in which the device is currently located (e.g., as determined using GPS and/or Wi-Fi signals received by the device 100).
In some embodiments, the third page of the multi-page control panel includes (922) a defined-area list affordance (e.g., an affordance 5292 for displaying/ceasing to display a plurality of defined-area identifiers) that, when activated, is configured to cause display of a plurality of user-selectable defined-area identifiers (e.g., a list of "scenes," such as described with reference to figs. 5AP-5AQ).
In some implementations, the first page of the multi-page control panel 5208 overlays (924) a portion of the initial user interface 5318 (e.g., as shown in fig. 5AV). For example, the first page of the multi-page control panel 5208 and the initial user interface 5318 are displayed simultaneously, and a portion of the initial user interface 5318 is visually obscured by the first page of the multi-page control panel 5208. In some embodiments, the first page of the multi-page control panel is translucent, and the portion of the initial user interface 5318 behind the first page of the multi-page control panel 5208 is visible through the first page of the multi-page control panel 5208. In some embodiments, the second page of the multi-page control panel 5208 also overlays a portion of the initial user interface 5318. In some embodiments, one or more pages of the multi-page control panel user interface 5208 have a fixed size that is smaller than the size of the display and the size of the initial user interface 5318.
In some embodiments, the initial user interface is (926) an application user interface (e.g., calendar application user interface 5318), and the multi-page control panel 5208 and the application user interface 5318 are displayed simultaneously, e.g., as shown in fig. 5AW. In some implementations, the multi-page control panel user interface can be accessed without exiting the application.
In some implementations, after ceasing to display the second page of the multi-page control panel, the device detects (928) a fourth gesture on the touch-sensitive surface (e.g., an upward swipe for redisplaying the control user interface, such as an input through contact 5333 moving along the path indicated by arrow 5335, as shown in fig. 5AX).
In some implementations, in response to detecting the fourth gesture, the device redisplays (930) the second page of the multi-page control panel 5208 (e.g., as shown in fig. 5AY). In some embodiments, when an input is received to dismiss the multi-page control panel user interface and an input is subsequently received to redisplay it, the page of the multi-page control panel user interface that was most recently displayed (before the dismissal input) is the page that is redisplayed in response to the subsequent input. For example, if the third page of the multi-page control panel was displayed immediately before the input dismissing the multi-page control panel was received, the third page of the multi-page control panel is displayed again in response to a subsequent input for redisplaying the multi-page control panel user interface (e.g., as shown in figs. 5AR-5AT).
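The redisplay behavior amounts to remembering the page index that was showing when the panel was dismissed and returning to it when the panel is summoned again; a minimal sketch with assumed names follows.

```swift
import Foundation

final class ControlPanelState {
    private(set) var lastDisplayedPage = 0

    // Called when the panel is dismissed while showing `page`.
    func panelDismissed(showingPage page: Int) {
        lastDisplayedPage = page // e.g., remember the third page (index 2)
    }

    // Called when a subsequent input redisplays the panel.
    func pageToShowOnRedisplay() -> Int {
        lastDisplayedPage
    }
}
```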
In some embodiments, the plurality of device control affordances (e.g., the control affordances 5210-5228 described with reference to fig. 5AL) includes (932) at least one toggle control. A toggle control is, for example, a multi-state control, such as a control that sets a state of a component of the electronic device (e.g., a torch control that turns the device torch on and off, a Wi-Fi control, a Bluetooth control) and/or a control that sets a mode of the electronic device as enabled or disabled (e.g., a night mode control, a flight mode control, a do-not-disturb mode control, a rotation lock control, and/or a mirroring mode control (e.g., "Apple TV Mirroring: On")). For example, as shown in figs. 5BF-5BG, the Wi-Fi control 5212 switches from a disabled state to an enabled state in response to an input by contact 5342.
In some embodiments, as shown in fig. 5BA, the second page of the multi-page control panel 5208 includes (934) an indication (e.g., content routing destination information area 5260) of the routing destination of the currently playing media (e.g., the device to which the media is being routed or transmitted, such as "playing on AirPods").
In some implementations, while displaying the second page of the multi-page control panel (e.g., as shown in fig. 5BD), the device detects (936) an input (e.g., a swipe gesture by movement of contact 5323 along the path indicated by arrow 5345) on the touch-sensitive surface at a location corresponding to the indication 5260 of the routing destination of the currently playing media, where the currently playing media is routed to a first device corresponding to a first zone (e.g., "Bluetooth Speaker"). In some embodiments, the routing destination corresponding to the first zone is, for example, headphones connected to the electronic device, speakers of the electronic device, and/or one or more remote devices (e.g., remote speakers and/or remote displays in a first room). In response to detecting the input on the touch-sensitive surface at the location corresponding to the indication of the routing destination of the currently playing media, the device routes (938) the media output to a second device corresponding to a second zone (e.g., "Airpods," as shown in fig. 5BE). In some embodiments, the second zone includes, for example, remote speakers and/or remote displays in a second room. In some embodiments, the indication 5260 changes to reflect the change in routing destination (as shown in fig. 5BE).
In some implementations, the second page of the multi-page control panel 5208 includes (940) a media routing destination list affordance (e.g., a chevron adjacent to an indication of the current destination of the currently playing media, such as affordance 5262 for displaying/ceasing to display a routing destination menu).
In some implementations, the device detects (942) an input (e.g., an input by contact 5324, as shown in fig. 5BA) on the touch-sensitive surface at a location corresponding to the media routing destination list affordance 5262.
In some embodiments, in response to detecting the input at the location corresponding to the media routing destination list affordance, the device displays (944) a list 5326 of media routing destination options, as shown in fig. 5BB (e.g., replacing the display of at least some content in the second page with the list of media routing destination options and/or shifting the display location of at least some of that content).
In some embodiments, the device populates the list 5326 of media routing destination options with an identifier of the electronic device and at least one additional device. In some embodiments, populating the list of media routing destination options with the at least one additional device includes (946): in accordance with a determination that the electronic device is currently routing media output to one or more remote devices, populating the list of media routing destination options with those one or more remote devices; and in accordance with a determination that the electronic device has previously been paired with one or more pairing-enabled devices, populating the list of media routing destination options with the one or more pairing-enabled devices. In some embodiments, if the number of pairing-enabled devices with which the electronic device recently paired exceeds a threshold number, the list is populated with only the threshold number of most recently paired devices. In some embodiments, pairing-enabled devices that are not currently detected are not included among the one or more pairing-enabled devices. In some embodiments, the electronic device (e.g., iPhone 5328, as shown in fig. 5BB) is the first media routing destination in the list of media routing destination options, followed by the device (if any) to which the electronic device is currently routing media (e.g., Airpods 5260, as shown in fig. 5BB), followed by the pairing-enabled devices (if any) with which the electronic device has previously been paired (e.g., Bluetooth Speaker 5332, as shown in fig. 5BB), followed by the signaling devices (if any) detected by the electronic device.
In some embodiments, in accordance with a determination that one or more signaling (e.g., Bluetooth signaling) devices are detected in proximity to the electronic device, the device populates (948) the list of media routing destination options with the one or more signaling devices.
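The population order described above (the electronic device itself, then the current routing destination, then recently paired pairing-enabled devices up to a threshold, then detected signaling devices) can be sketched as a pure function in Swift. The parameter names and the default threshold value are illustrative assumptions.

```swift
import Foundation

struct RouteDestination {
    let name: String
}

// Build the list of media routing destination options in the order described:
// this device first, then the current route (if any), then up to `pairedLimit`
// most recently paired and currently detected devices, then signaling devices.
func buildRoutingDestinationOptions(
    deviceName: String,
    currentRoute: RouteDestination?,
    recentlyPaired: [RouteDestination],   // most recent first, detected only
    signalingDevices: [RouteDestination],
    pairedLimit: Int = 3                  // assumed threshold number
) -> [RouteDestination] {
    var options = [RouteDestination(name: deviceName)]
    if let current = currentRoute {
        options.append(current)
    }
    options.append(contentsOf: recentlyPaired.prefix(pairedLimit))
    options.append(contentsOf: signalingDevices)
    return options
}
```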
In some embodiments, the electronic device includes (950) one or more sensors for detecting the intensity of contacts with the touch-sensitive surface 112. The device detects (952) a first input, including detecting a first contact 5342 on the touch-sensitive surface 112 at a location corresponding to a first control affordance 5212 of the plurality of device control affordances, as shown in figs. 5BF-5BI. In response to detecting the first input (954): in accordance with a determination that the first input meets second criteria (e.g., enhanced control criteria, including a criterion that is met when the characteristic intensity of the contact increases above a deep press intensity threshold IT_D, as shown by intensity level meter 5344), where the second criteria require the characteristic intensity of the first contact to meet a first intensity threshold in order for the second criteria to be met, the device displays one or more modification options (e.g., 5354, 5356, 5358, 5360, 5362) for the control corresponding to the first control affordance 5212, as shown in fig. 5BI. For example, the modification options are mode modification options (e.g., enable Wi-Fi with the selected network) and/or duration modification options (e.g., enable Wi-Fi for the next hour). In some implementations, the modification options are displayed in menu 5350. In accordance with a determination that the first input meets third criteria (e.g., control toggle criteria, such as the first input being a tap input), where the third criteria do not require the characteristic intensity of the first contact to meet the first intensity threshold, the device toggles the function of the control corresponding to the first control affordance (e.g., enables/disables Wi-Fi, as shown in figs. 5BF-5BG).
In some embodiments, while displaying the one or more modification options for the control corresponding to the first control affordance, the device detects (956) a second input (e.g., by contact 5352, as shown in fig. 5BI) that activates a first modification option (e.g., modification option 5356) of the one or more modification options. In some embodiments, the second input is a continuation of the first input by the same contact. In some embodiments, the second input is a separate tap input on the modification option. In response to detecting the second input, the device modifies (958) the control corresponding to the first control affordance in accordance with the activated first modification option (e.g., as shown in fig. 5BJ).
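A hedged Swift sketch of the branch between the second criteria (intensity-based) and the third criteria (a plain tap): on hardware that reports touch intensity, crossing an assumed force fraction shows the modification options, while a release that never crossed it toggles the control. The threshold fraction and the callback names are assumptions.

```swift
import UIKit

final class ControlAffordanceView: UIView {
    var toggle: () -> Void = {}
    var showModificationOptions: () -> Void = {}
    private var deepPressTriggered = false

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, !deepPressTriggered else { return }
        // UITouch.force is reported on devices with intensity-sensing hardware.
        if touch.maximumPossibleForce > 0,
           touch.force / touch.maximumPossibleForce > 0.75 { // assumed threshold
            deepPressTriggered = true
            showModificationOptions() // second criteria met: show options
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        if !deepPressTriggered {
            toggle() // third criteria: a tap with no intensity requirement
        }
        deepPressTriggered = false
    }
}
```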
In some embodiments, a respective page of the multi-page control panel includes (960) an indicator 5236 of the total number of pages that the multi-page control panel has (e.g., a dot corresponding to each page), and the indicator corresponding to the currently displayed page of the multi-page control panel is highlighted in the respective page (e.g., as shown by indicator 5236 of figs. 5AL, 5AM, and 5AN). For example, the indicator 5236 indicates the position of the current page among the plurality of pages.
It should be understood that the particular order in which the operations in fig. 9A-9E are described is merely exemplary and is not intended to represent the only order in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein. In addition, it should be noted that the details of other processes described herein with reference to other methods described herein (e.g., methods 600, 700, and 800) are equally applicable in a similar manner to method 900 described above with reference to fig. 9A-9E. For example, the contacts, gestures, user interface objects, intensity thresholds, focus selectors, and animations described above with reference to method 900 optionally have one or more of the features of the contacts, gestures, user interface objects, intensity thresholds, focus selectors, and animations described herein with reference to other methods described herein (e.g., methods 600, 700, and 800). For brevity, these details are not repeated here.
Fig. 10 illustrates a functional block diagram of an electronic device 1000 configured in accordance with the principles of various described embodiments, in accordance with some embodiments. The functional blocks of the apparatus are optionally implemented by hardware, software, or a combination of hardware and software that implements the principles of the various described embodiments. Those skilled in the art will appreciate that the functional blocks described in fig. 10 are optionally combined or separated into sub-blocks in order to implement the principles of the various described embodiments. Thus, the description herein optionally supports any possible combination or separation or further definition of functional blocks described herein.
As shown in fig. 10, the electronic device 1000 includes a display unit 1002, a touch-sensitive surface unit 1004, and a processing unit 1008 coupled with the display unit 1002 and the touch-sensitive surface unit 1004. In some embodiments, the electronic device includes one or more sensor units 1006, and the processing unit 1008 is coupled with the display unit 1002, the touch-sensitive surface unit 1004, and the one or more sensor units 1006. In some embodiments, processing unit 1008 includes a detection unit 1010, an activation unit 1012, a display enabling unit 1014, a replacement unit 1016, and a determination unit 1018.
The processing unit 1008 is configured to detect (e.g., with the detection unit 1010) a first input when the device is in a display off state. In response to detecting the first input, the processing unit 1008 is configured to activate the display unit 1002 of the device (e.g., with the activation unit 1012) and enable display of a first user interface corresponding to a display on state of the device on the display unit 1002 (e.g., with the display enabling unit 1014). While displaying a first user interface corresponding to a display on state of the device, the processing unit 1008 is configured to detect (e.g., with the detection unit 1010) a swipe gesture on the touch-sensitive surface unit 1004. In response to detecting a swipe gesture on the touch-sensitive surface unit 1004, the processing unit 1008 is configured to replace (e.g., with the replacement unit 1016) a display of a first user interface with a display of a second user interface that displays first content according to determining that the device is in a locked mode of the display on state and the swipe gesture is in a first direction, and to replace (e.g., with the replacement unit 1016) a display of the first user interface with a display of a second user interface that displays first content and first additional content that is not displayed when the device is in the locked mode of the display on state according to determining that the device is in an unlocked mode of the display on state and the swipe gesture is in the first direction.
In some embodiments, the first content includes one or more notifications and the first additional content includes limited notification information.
In some embodiments, the first content includes one or more mini-application objects and the first additional content includes restricted mini-application object information.
In some embodiments, the processing unit 1008 is further configured to, in response to detecting a swipe gesture on the touch-sensitive surface unit 1004, replace (e.g., with the replacement unit 1016) a display of the first user interface with a display of a third user interface that displays second content in accordance with determining that the device is in the locked mode of the display on state and the swipe gesture is in a second direction, and replace (e.g., with the replacement unit 1016) a display of the first user interface with a display of a third user interface that displays second content and second additional content that is not displayed when the device is in the locked mode of the display on state in accordance with determining that the device is in the unlocked mode of the display on state and the swipe gesture is in the second direction.
In some embodiments, the second content includes one or more mini-application objects and the second additional content includes restricted mini-application object information.
In some embodiments, the second content includes one or more notifications, and the second additional content includes limited notification information.
In some embodiments, the second user interface includes a first search input area and the third user interface includes a second search input area.
In some implementations, the processing unit 1008 is further configured to detect (e.g., with the detection unit 1010) an input in a respective search input area of the first search input area and the second search input area. In response to detecting the input in the respective search input area, the processing unit 1008 is configured to: in accordance with a determination that the device is in the locked mode of the display on state, enable display of a first set of search results (e.g., with the display enabling unit 1014); and in accordance with a determination that the device is in the unlocked mode of the display on state, enable display of a second set of search results that is different from the first set of search results (e.g., with the display enabling unit 1014).
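One simple way the locked-mode result set could differ from the unlocked-mode set is by filtering out results that would expose private data while the device is locked. The sketch below assumes that policy; it is an illustration, not something the embodiment specifies.

```swift
import Foundation

struct SearchResult {
    let title: String
    let containsPrivateData: Bool
}

// Return the restricted first set of results in the locked mode of the
// display on state, and the full second set in the unlocked mode.
func searchResults(from all: [SearchResult], deviceUnlocked: Bool) -> [SearchResult] {
    deviceUnlocked ? all : all.filter { !$0.containsPrivateData }
}
```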
In some embodiments, the determination of whether the device is in the unlocked mode or the locked mode of the display on state is made in response to detecting the first input.
In some implementations, whether the device is in a locked mode or an unlocked mode of the display on state is determined in response to detecting a swipe gesture on the touch-sensitive surface unit 1004.
In some implementations, the electronic device includes one or more sensor units 1006, and detecting the first input includes detecting (e.g., with the detection unit 1010), with the one or more sensor units 1006, a change in the environment of the device and, in response to detecting the change in the environment of the device, determining whether the change meets display activation criteria.
In some embodiments, detecting the first input includes detecting (e.g., with detection unit 1010) activation of the display activation affordance.
In some embodiments, detecting the first input includes detecting (e.g., with the detection unit 1010) contact with the display activation affordance that activates the display activation affordance without actuating the display activation affordance.
In some embodiments, detecting the first input includes detecting a contact with the display activation affordance that activates the display activation affordance (e.g., to activate the display of the device) and that actuates the display activation affordance (e.g., to perform at least one additional function associated with the display activation affordance).
In some embodiments, the processing unit 1008 is configured to detect the second input (e.g., with the detection unit 1010) while the first user interface is displayed. In response to detecting the second input, the processing unit 1008 is configured to replace (e.g., with the replacement unit 1016) the display of the first user interface with a display of a user interface comprising a plurality of application icons corresponding to different ones of the plurality of applications.
In some implementations, in response to detecting the second input, the processing unit 1008 is configured to determine (e.g., with the determining unit 1018) whether the device is in a locked mode or an unlocked mode in a display on state, and enable display of the authentication user interface (e.g., with the display enabling unit 1014) in accordance with determining that the device is in the locked mode in the display on state.
In some embodiments, the respective mini-application object is configured to perform a subset of less than all of the functionality of the corresponding application of the respective mini-application object.
In some embodiments, the respective mini-application object displays an identifier of the corresponding application of the respective mini-application object.
In some implementations, the respective mini-application object displays a portion of content from the corresponding application of the respective mini-application object.
In some embodiments, predefined inputs on a respective mini-application object launch a corresponding application of the respective mini-application object.
In some embodiments, the respective mini-application object operates as a stand-alone application residing in the memory of the device, the stand-alone application being different from an associated application also residing in the memory of the device.
In some embodiments, the respective mini-application objects are run as extensions or components of the associated application on the device.
In some implementations, the respective mini-application object has a dedicated memory portion for temporarily storing information.
In some embodiments, the memory portion is accessible by a corresponding full-function application of the respective mini-application object.
The operations in the information processing method described above are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general purpose processor (e.g., as described above with respect to fig. 1A and 3) or an application specific chip.
The operations described above with reference to fig. 6A-6E are optionally implemented by the components depicted in fig. 1A-1B or fig. 10. For example, detection operations 602 and 606 and display operations 604 and 608 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 communicates the event information to the application 136-1. The respective event identifier 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface (or whether the rotation of the device) corresponds to a predefined event or sub-event, such as a selection of an object on the user interface, or a rotation of the device from one orientation to another. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally uses or invokes data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update what is displayed by the application. Similarly, it will be apparent to one of ordinary skill in the art how other processes may be implemented based on the components shown in fig. 1A-1B.
Fig. 11 illustrates a functional block diagram of an electronic device 1100 configured according to the principles of various described embodiments, according to some embodiments. The functional blocks of the apparatus are optionally implemented by hardware, software, or a combination of hardware and software that implements the principles of the various described embodiments. Those skilled in the art will appreciate that the functional blocks described in fig. 11 are optionally combined or separated into sub-blocks in order to implement the principles of the various described embodiments. Thus, the description herein optionally supports any possible combination or separation or further definition of functional blocks described herein.
As shown in fig. 11, the electronic device 1100 includes a display unit 1102, a touch-sensitive surface unit 1104, and a processing unit 1108 coupled with the display unit 1102 and the touch-sensitive surface unit 1104. In some embodiments, processing unit 1108 includes a detection unit 1110, a display enable unit 1112, an activation unit 1114, and a replacement unit 1116.
While the device is in the display off state, the processing unit 1108 is configured to detect (e.g., with the detection unit 1110) a first input to activate (e.g., with the activation unit 1114) the display of the device. In response to detecting the first input, the processing unit 1108 is configured to activate the display unit 1102 of the device (e.g., with the activation unit 1114) and to enable display, on the display unit 1102, of a first user interface corresponding to a display on state of the device (e.g., with the display enabling unit 1112). While the first user interface corresponding to the display on state of the device is displayed, the processing unit 1108 is configured to detect (e.g., with the detection unit 1110) a swipe gesture on the touch-sensitive surface unit 1104. In response to detecting the swipe gesture, the processing unit 1108 is configured to: in accordance with a determination that the swipe gesture is in a first direction, replace (e.g., with the replacement unit 1116) the display of the first user interface with the display of a camera application user interface; in accordance with a determination that the swipe gesture is in a second direction different from the first direction, replace (e.g., with the replacement unit 1116) the display of the first user interface with the display of a mini-application object user interface, wherein the mini-application object user interface is configured to include a plurality of mini-application objects and respective mini-application objects of the plurality of mini-application objects have corresponding applications stored on the device; in accordance with a determination that the swipe gesture is in a third direction different from the first direction and the second direction, enable display of a first page of a multi-page control panel user interface (e.g., with the display enabling unit 1112); and in accordance with a determination that the swipe gesture is in a fourth direction different from the first direction, the second direction, and the third direction, enable display of a notification user interface (e.g., with the display enabling unit 1112).
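The four-direction dispatch described in the preceding paragraph can be sketched with one swipe recognizer per direction. The claim language leaves the concrete directions open, so the mapping below (left to the camera, right to the mini-application objects, up to the control panel, down to notifications) and the placeholder method names are illustrative assumptions.

```swift
import UIKit

final class WakeScreenViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        for direction: UISwipeGestureRecognizer.Direction in [.left, .right, .up, .down] {
            let recognizer = UISwipeGestureRecognizer(target: self,
                                                      action: #selector(handleSwipe(_:)))
            recognizer.direction = direction
            view.addGestureRecognizer(recognizer)
        }
    }

    @objc private func handleSwipe(_ recognizer: UISwipeGestureRecognizer) {
        switch recognizer.direction {
        case .left:  showCameraApplication()       // first direction
        case .right: showMiniApplicationObjects()  // second direction
        case .up:    showMultiPageControlPanel()   // third direction
        case .down:  showNotifications()           // fourth direction
        default: break
        }
    }

    private func showCameraApplication() {}
    private func showMiniApplicationObjects() {}
    private func showMultiPageControlPanel() {}
    private func showNotifications() {}
}
```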
In some embodiments, replacing the display of the first user interface with the display of the camera application user interface includes delaying the display of one or more control affordances associated with the camera application until a control display interval has elapsed.
In some embodiments, the processing unit 1108 is further configured to detect a gesture back to the first user interface during the control display interval (e.g., with the detection unit 1110). In response to detecting a gesture that returns to the first user interface during the control display interval, the processing unit 1108 is configured to replace (e.g., with the replacement unit 1116) the display of the camera application user interface with the display of the first user interface.
In some implementations, the processing unit 1108 is further configured to, while the multi-page control panel user interface is displayed, detect a gesture on the touch-sensitive surface unit 1104 (e.g., with the detection unit 1110) at a location corresponding to a first page of the multi-page control panel user interface and, in response to detecting the gesture at the location corresponding to the first page of the multi-page control panel, enable display of a second page of the multi-page control panel user interface (e.g., with the display enabling unit 1112).
In some implementations, the processing unit 1108 is further configured to, while displaying the second page of the multi-page control panel user interface, detect a gesture on the touch-sensitive surface unit 1104 (e.g., with the detection unit 1110) at a location corresponding to the second page of the multi-page control panel user interface, and, in response to detecting the gesture at the location corresponding to the second page of the multi-page control panel user interface, enable display of a third page of the multi-page control panel user interface (e.g., with the display enabling unit 1112).
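The page-to-page navigation in the two preceding paragraphs reduces to a small state machine; a sketch follows, with the three-page layout taken from the later description of device controls, playback controls, and remote device controls, and everything else assumed.

```swift
// Minimal paging model for the multi-page control panel user interface.
struct MultiPageControlPanel {
    let pageCount = 3 // assumed: device controls, playback, remote devices
    private(set) var currentPage = 0

    // A gesture at a location corresponding to the current page advances
    // the display to the next page, as described above.
    mutating func handleGestureOnCurrentPage() {
        currentPage = min(currentPage + 1, pageCount - 1)
    }
}

var panel = MultiPageControlPanel()
panel.handleGestureOnCurrentPage() // first page -> second page
panel.handleGestureOnCurrentPage() // second page -> third page
```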
In some implementations, the processing unit 1108 is further configured to, while displaying the notification user interface, detect a dismissal gesture on the touch-sensitive surface unit 1104 (e.g., with the detection unit 1110) at a location corresponding to a respective notification included in the notification user interface and, in response to detecting the dismissal gesture, cease displaying the respective notification in the notification user interface.
In some implementations, the processing unit 1108 is further configured to, while displaying the notification user interface, detect a launch gesture on the touch-sensitive surface unit 1104 (e.g., with the detection unit 1110) at a location corresponding to a first notification included in the notification user interface and launch an application corresponding to the first notification in response to detecting the launch gesture.
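A sketch of the two notification gestures just described, dismissal and launch, using assumed types; in particular, NotificationItem and the launch callback are illustrative, not part of the embodiments.

```swift
import Foundation

struct NotificationItem {
    let id: UUID
    let applicationIdentifier: String
}

final class NotificationUserInterfaceModel {
    private(set) var notifications: [NotificationItem]
    var launchApplication: (String) -> Void = { _ in }

    init(notifications: [NotificationItem]) {
        self.notifications = notifications
    }

    // Dismissal gesture at the location of a respective notification:
    // the notification ceases to be displayed.
    func handleDismissalGesture(on item: NotificationItem) {
        notifications.removeAll { $0.id == item.id }
    }

    // Launch gesture at the location of a notification: the corresponding
    // application is launched.
    func handleLaunchGesture(on item: NotificationItem) {
        launchApplication(item.applicationIdentifier)
    }
}
```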
In some embodiments, the processing unit 1108 is further configured to detect a launch gesture on the touch-sensitive surface unit 1104 (e.g., with the detection unit 1110) at a location corresponding to a first mini-application object included in the mini-application object user interface while the mini-application object user interface is displayed, and launch an application corresponding to the first mini-application object in response to detecting the launch gesture.
In some embodiments, while the mini-application object user interface is displayed, the processing unit 1108 is configured to detect (e.g., with the detection unit 1110) an information expansion gesture on the touch-sensitive surface unit 1104 at a location corresponding to a second mini-application object included in the mini-application object user interface, the second mini-application object including mini-application object information, and, in response to detecting the information expansion gesture, to expand the second mini-application object and enable display (e.g., with the display enabling unit 1112) of the mini-application object information and additional mini-application object information in the expanded second mini-application object.
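The launch and expansion behaviors for mini-application objects can be sketched with a small value type; the field names, and the idea that expansion simply reveals a second pool of information in place, are assumptions of the sketch.

```swift
// Sketch of a mini-application object that can be launched or expanded.
struct MiniApplicationObjectModel {
    let applicationIdentifier: String
    let info: String            // always-visible mini-application information
    let additionalInfo: String  // shown only after the expansion gesture
    private(set) var isExpanded = false

    var visibleInfo: [String] {
        isExpanded ? [info, additionalInfo] : [info]
    }

    // Information expansion gesture: expand in place, without launching
    // the corresponding application.
    mutating func handleInformationExpansionGesture() {
        isExpanded = true
    }
}
```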
In some embodiments, the respective mini-application object is configured to perform a subset of less than all of the functionality of the corresponding application of the respective mini-application object.
In some embodiments, the respective mini-application object displays an identifier of the corresponding application of the respective mini-application object.
In some implementations, the respective mini-application object displays a portion of content from the corresponding application of the respective mini-application object.
In some embodiments, predefined inputs on a respective mini-application object launch a corresponding application of the respective mini-application object.
In some embodiments, the respective mini-application object operates as a stand-alone application residing in the memory of the device, the stand-alone application being different from an associated application also residing in the memory of the device.
In some embodiments, the respective mini-application object operates as an extension or component of the associated application on the device.
In some implementations, the respective mini-application object has a dedicated memory portion for temporarily storing information.
In some embodiments, the memory portion is accessible by a corresponding full-function application of the respective mini-application object.
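One concrete mechanism that matches the dedicated memory portion described above, on Apple platforms, is an app-group container shared between a widget-like extension and its full application; treating the memory portion this way is an assumption of the sketch, and the suite name below is hypothetical.

```swift
import Foundation

// Shared, temporary storage written by the mini-application object and
// readable by the corresponding full-function application.
struct MiniAppSharedStore {
    private let defaults: UserDefaults

    init?(appGroupIdentifier: String) {
        guard let shared = UserDefaults(suiteName: appGroupIdentifier) else {
            return nil
        }
        defaults = shared
    }

    func store(_ value: String, forKey key: String) {
        defaults.set(value, forKey: key)
    }

    func retrieve(forKey key: String) -> String? {
        defaults.string(forKey: key)
    }
}

// Hypothetical suite name; a real app group identifier is declared in the
// app's entitlements.
let store = MiniAppSharedStore(appGroupIdentifier: "group.example.miniapp")
```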
The operations in the information processing method described above are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general purpose processor (e.g., as described above with respect to fig. 1A and 3) or an application specific chip.
The operations described above with reference to fig. 7A-7E are optionally implemented by the components depicted in fig. 1A-1B or fig. 11. For example, detection operations 702 and 706 and display operations 704 and 708 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. A respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether a first contact at a first location on the touch-sensitive surface (or whether rotation of the device) corresponds to a predefined event or sub-event, such as selection of an object on a user interface, or rotation of the device from one orientation to another. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. The event handler 190 optionally uses or invokes data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, the event handler 190 accesses a corresponding GUI updater 178 to update what is displayed by the application. Similarly, it will be apparent to one of ordinary skill in the art how other processes may be implemented based on the components shown in fig. 1A-1B.
Fig. 12 illustrates a functional block diagram of an electronic device 1200 configured in accordance with the principles of various described embodiments, according to some embodiments. The functional blocks of the apparatus are optionally implemented by hardware, software, or a combination of hardware and software that implements the principles of the various described embodiments. Those skilled in the art will appreciate that the functional blocks described in fig. 12 are optionally combined or separated into sub-blocks in order to implement the principles of the various described embodiments. Thus, the description herein optionally supports any possible combination or separation or further definition of functional blocks described herein.
As shown in fig. 12, the electronic device 1200 includes a display unit 1202 configured to display a first user interface including a plurality of application icons corresponding to different ones of a plurality of applications installed on the device, a touch-sensitive surface unit 1204, and a processing unit 1208 coupled with the display unit 1202 and the touch-sensitive surface unit 1204. In some embodiments, the processing unit 1208 includes a detection unit 1210, a display enabling unit 1212, an adding unit 1214, and a replacing unit 1216.
The processing unit 1208 is configured to detect (e.g., with the detection unit 1210) a first input on the touch-sensitive surface unit 1204 at a location corresponding to a first application icon of the plurality of application icons, the first application icon corresponding to a first application of the plurality of applications. In response to detecting the first input, the processing unit 1208 is configured to enable display (e.g., with the display enabling unit 1212) of a first mini-application object or a preview of the first mini-application object in an overlay area. In some embodiments, the first mini-application object corresponds to the first application of the plurality of applications, and the overlay area includes an affordance for adding the first mini-application object to a second user interface that displays a plurality of mini-application objects. The processing unit 1208 is further configured to detect (e.g., with the detection unit 1210) a second input on the touch-sensitive surface unit 1204 at a location corresponding to the affordance for adding the first mini-application object to the second user interface. In response to detecting the second input, the processing unit 1208 is configured to add (e.g., with the adding unit 1214) the first mini-application object to the second user interface that displays the plurality of mini-application objects.
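A sketch of the add flow in the preceding paragraph: a first input on an application icon produces an overlay with a preview and an add affordance, and a second input on that affordance appends the mini-application object to the second user interface. All names are assumptions for the sketch.

```swift
final class MiniAppAdditionCoordinator {
    private(set) var secondUserInterfaceObjects: [String] = []
    private(set) var overlaidPreview: String? // app whose preview is shown

    // First input at the location of an application icon: show the overlay
    // containing a preview of the first mini-application object.
    func handleFirstInput(onApplicationIcon appID: String) {
        overlaidPreview = appID
    }

    // Second input at the location of the add affordance in the overlay:
    // add the mini-application object to the second user interface.
    func handleAddAffordanceInput() {
        guard let appID = overlaidPreview else { return }
        secondUserInterfaceObjects.append(appID)
        overlaidPreview = nil // dismiss the overlay area
    }
}
```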
In some implementations, while the first user interface is displayed, the processing unit 1208 is configured to detect (e.g., with the detection unit 1210) a third input on the touch-sensitive surface unit 1204, the third input meeting first criteria for navigating from the first user interface to the second user interface. In response to detecting the third input, the processing unit 1208 is configured to replace (e.g., with the replacement unit 1216) the display of the first user interface with the second user interface, wherein replacing the display of the first user interface with the second user interface includes ceasing to display (e.g., with the display enabling unit 1212) the first user interface and enabling display (e.g., with the display enabling unit 1212) of the second user interface, including concurrently displaying, in the second user interface, the first mini-application object and at least one second mini-application object that corresponds to a second application of the plurality of applications that is different from the first application.
In some embodiments, the first mini-application object includes at least one selectable information item, and the processing unit 1208 is configured to detect a fourth input on the touch-sensitive surface unit 1204 (e.g., with the detection unit 1210) at a location corresponding to the at least one selectable information item while the second user interface is displayed, and enable display of additional information associated with the selectable information item (e.g., with the display enabling unit 1212) in response to detecting the fourth input.
In some implementations, the first mini-application object is added at the highest position in the vertical stack of mini-application objects in the second user interface.
In some embodiments, the first mini-application object includes identification information identifying the first application.
In some embodiments, the first mini-application object is configured to provide a subset of the functionality provided by the first application.
In some implementations, the first mini-application object includes a subset of content from the first application.
In some implementations, the processing unit 1208 is further configured to enable display of a menu of selectable options corresponding to the first application (e.g., with the display enabling unit 1212) in response to detecting the first input.
In some implementations, predefined input on the first mini-application object launches the first application.
In some embodiments, the first mini-application object operates as a stand-alone application, residing in the memory of the device, that is separate from the first application.
In some embodiments, the first mini-application object is run as an extension or component of the first application.
In some implementations, the first mini-application object has a dedicated memory portion for temporarily storing information.
In some embodiments, the memory portion is accessible by the first application.
The operations in the information processing method described above are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general purpose processor (e.g., as described above with respect to fig. 1A and 3) or an application specific chip.
The operations described above with reference to fig. 8A-8C are optionally implemented by the components depicted in fig. 1A-1B or fig. 12. For example, display operations 802 and 806 and detection operations 804 and 808 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. A respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether a first contact at a first location on the touch-sensitive surface (or whether rotation of the device) corresponds to a predefined event or sub-event, such as selection of an object on a user interface, or rotation of the device from one orientation to another. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. The event handler 190 optionally uses or invokes data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, the event handler 190 accesses a corresponding GUI updater 178 to update what is displayed by the application. Similarly, it will be apparent to one of ordinary skill in the art how other processes may be implemented based on the components shown in fig. 1A-1B.
Fig. 13 illustrates a functional block diagram of an electronic device 1300 configured in accordance with the principles of various described embodiments, according to some embodiments. The functional blocks of the apparatus are optionally implemented by hardware, software, or a combination of hardware and software that implements the principles of the various described embodiments. Those skilled in the art will appreciate that the functional blocks described in fig. 13 are optionally combined or separated into sub-blocks in order to implement the principles of the various described embodiments. Thus, the description herein optionally supports any possible combination or separation or further definition of functional blocks described herein.
As shown in fig. 13, the electronic device 1300 includes a display unit 1302, a touch-sensitive surface unit 1304, one or more sensor units 1306, and a processing unit 1308 coupled with the display unit 1302, the touch-sensitive surface unit 1304, and the one or more sensor units 1306. In some embodiments, the processing unit 1308 includes a detection unit 1310, a display enabling unit 1312, a routing unit 1314, a populating unit 1316, a modification unit 1318, and a switching unit 1320.
While an initial user interface is displayed on the display unit 1302, the processing unit 1308 is configured to detect (e.g., with the detection unit 1310) a first gesture on the touch-sensitive surface unit 1304. In response to detecting the first gesture, the processing unit 1308 is configured to enable display (e.g., with the display enabling unit 1312) of a first page of a multi-page control panel on the display unit 1302. In some embodiments, the first page of the multi-page control panel includes a plurality of device control affordances. While the first page of the multi-page control panel is displayed, the processing unit 1308 is configured to detect (e.g., with the detection unit 1310) a second gesture on the touch-sensitive surface unit 1304. In response to detecting the second gesture, the processing unit 1308 is configured to enable display (e.g., with the display enabling unit 1312) of a second page of the multi-page control panel. In some embodiments, the second page of the multi-page control panel includes a plurality of content playback control affordances, and the second page of the multi-page control panel replaces the first page of the multi-page control panel on the display unit 1302.
In some implementations, while displaying the second page of the multi-page control panel, the processing unit 1308 is configured to detect (e.g., with the detection unit 1310) a third gesture on the touch-sensitive surface unit 1304 and, in response to detecting the third gesture, in accordance with a determination that the third gesture is of a first gesture type, to cease displaying (e.g., with the display enabling unit 1312) the second page of the multi-page control panel on the display unit 1302.
In some embodiments, the processing unit 1308 is configured to enable display of a third page of the multi-page control panel (e.g., with the display enabling unit 1312) in accordance with determining that the third gesture is of a second gesture type that is different from the first gesture type. In some embodiments, the third page of the multi-page control panel includes a plurality of remote device control affordances.
In some embodiments, the plurality of remote device control affordances included in the third page of the multi-page control panel correspond to different devices in a respective defined area of a plurality of defined areas that are selectable by a user.
In some embodiments, displaying the third page of the multi-page control panel includes redisplaying (e.g., with the display enabling unit 1312) one or more remote device control affordances that were included in the third page of the multi-page control panel when the third page of the multi-page control panel was last displayed.
In some embodiments, displaying the third page of the multi-page control panel includes displaying (e.g., with the display enabling unit 1312), in the third page of the multi-page control panel, one or more remote device control affordances that correspond to a defined area in which the device is currently located.
In some embodiments, the third page of the multi-page control panel includes a defined area list affordance that, when activated, causes display of a plurality of defined area identifiers that are selectable by a user.
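The defined-area behavior in the last few paragraphs suggests a simple data model: each defined area carries its own remote device control affordances, and the defined area list affordance exposes the selectable identifiers. The sketch below assumes string identifiers throughout; none of the type names come from the embodiments.

```swift
struct DefinedArea {
    let identifier: String             // e.g. a user-visible room name
    let remoteDeviceControls: [String] // control affordances for the area
}

final class RemoteDevicesPageModel {
    private let areas: [DefinedArea]
    private(set) var selectedArea: DefinedArea?

    init(areas: [DefinedArea]) { self.areas = areas }

    // Activating the defined area list affordance yields the identifiers
    // from which the user can select.
    func selectableAreaIdentifiers() -> [String] {
        areas.map(\.identifier)
    }

    func selectArea(withIdentifier id: String) {
        selectedArea = areas.first { $0.identifier == id }
    }
}
```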
In some embodiments, a first page of the multi-page control panel overlays a portion of the initial user interface.
In some embodiments, the initial user interface is an application user interface and the multi-page control panel and the application user interface are displayed simultaneously.
In some implementations, after ceasing to display the second page of the multi-page control panel, the processing unit 1308 is configured to detect (e.g., with the detecting unit 1310) a fourth gesture on the touch-sensitive surface unit 1304, and in response to detecting the fourth gesture, redisplay (e.g., with the display enabling unit 1312) the second page of the multi-page control panel.
In some embodiments, the plurality of device control affordances includes at least one toggle control.
In some implementations, the second page of the multi-page control panel includes an indication of a routing destination of currently playing media.
In some implementations, while displaying the second page of the multi-page control panel, the processing unit 1308 is configured to detect (e.g., with the detection unit 1310) an input on the touch-sensitive surface unit 1304 at a location corresponding to the indication of the routing destination of the currently playing media. In some implementations, the routing destination of the currently playing media includes a first device corresponding to a first zone. In response to detecting the input on the touch-sensitive surface unit 1304 at the location corresponding to the indication of the routing destination of the currently playing media, the processing unit 1308 is configured to route (e.g., with the routing unit 1314) media output to a second device corresponding to a second zone.
In some implementations, the second page of the multi-page control panel includes a media routing destination list affordance, and the processing unit 1308 is configured to detect (e.g., with the detection unit 1310) an input on the touch-sensitive surface unit 1304 at a location corresponding to the media routing destination list affordance, and to enable display (e.g., with the display enabling unit 1312) of a list of media routing destination options in response to detecting the input at the location corresponding to the media routing destination list affordance.
In some embodiments, the processing unit 1308 is configured to populate (e.g., with the populating unit 1316) the list of media routing destination options with an identifier of the electronic device and at least one additional device, wherein populating the list of media routing destination options with the at least one additional device includes: in accordance with a determination that the electronic device is currently routing media output to one or more remote devices, populating (e.g., with the populating unit 1316) the list of media routing destination options with the one or more remote devices to which the electronic device is currently routing media output; and, in accordance with a determination that the electronic device has previously paired with one or more pairing-enabled devices, populating (e.g., with the populating unit 1316) the list of media routing destination options with the one or more pairing-enabled devices.
In some embodiments, the processing unit 1308 is further configured to, in accordance with a determination that one or more signal-transmitting devices are detected in proximity to the electronic device, populate (e.g., with the populating unit 1316) the list of media routing destination options with the one or more signal-transmitting devices.
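Putting the population rules of the last two paragraphs together, a sketch of building the media routing destination list follows; the struct fields and the ordering are assumptions, and real discovery of paired or nearby devices is out of scope for the sketch.

```swift
struct RoutingEnvironment {
    let currentlyRoutedDevices: [String]          // currently receiving output
    let previouslyPairedDevices: [String]         // pairing-enabled devices
    let nearbySignalTransmittingDevices: [String] // detected in proximity
}

func mediaRoutingDestinationOptions(deviceIdentifier: String,
                                    environment: RoutingEnvironment) -> [String] {
    var options = [deviceIdentifier] // the electronic device itself
    options += environment.currentlyRoutedDevices
    options += environment.previouslyPairedDevices
    options += environment.nearbySignalTransmittingDevices
    return options
}
```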
In some embodiments, the electronic device includes one or more sensor units 1306 to detect intensities of contacts with the touch-sensitive surface unit 1304, and the processing unit 1308 is configured to detect (e.g., with the detection unit 1310) a first input, including detecting a first contact on the touch-sensitive surface unit 1304 at a location corresponding to a first control affordance of the plurality of device control affordances. In response to detecting the first input: in accordance with a determination that the first input meets second criteria, wherein the second criteria require that a characteristic intensity of the first contact meet a first intensity threshold in order for the second criteria to be met, the processing unit 1308 is configured to enable display (e.g., with the display enabling unit 1312) of one or more modification options for the control that corresponds to the first control affordance; and, in accordance with a determination that the first input meets third criteria, wherein the third criteria do not require that the characteristic intensity of the first contact meet the first intensity threshold, the processing unit 1308 is configured to toggle (e.g., with the switching unit 1320) a function of the control that corresponds to the first control affordance.
In some embodiments, while displaying the one or more modification options for the control that corresponds to the first control affordance, the processing unit 1308 is configured to detect (e.g., with the detection unit 1310) a second input activating a first modification option of the one or more modification options and, in response to detecting the second input, to modify (e.g., with the modification unit 1318) the control that corresponds to the first control affordance in accordance with the activated first modification option.
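A sketch of the intensity-dependent dispatch just described: an input whose characteristic intensity meets the first intensity threshold reveals the modification options, while one that does not simply toggles the control. The threshold value is an assumed placeholder.

```swift
enum ControlAffordanceResponse {
    case showModificationOptions // second criteria met (deep press)
    case toggleControlFunction   // third criteria met (ordinary tap)
}

func respond(toCharacteristicIntensity intensity: Double,
             firstIntensityThreshold: Double = 0.5) -> ControlAffordanceResponse {
    intensity >= firstIntensityThreshold
        ? .showModificationOptions
        : .toggleControlFunction
}
```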
In some embodiments, the respective page of the multi-page control panel includes indicia of a total number of pages the multi-page control panel has, and an indicator corresponding to a currently displayed page of the multi-page control panel is highlighted in the respective page.
The operations in the information processing method described above are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general purpose processor (e.g., as described above with respect to fig. 1A and 3) or an application specific chip.
The operations described above with reference to fig. 9A-9E are optionally implemented by the components depicted in fig. 1A-1B or fig. 13. For example, display operations 902, 904, and 908 and detection operation 906 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. A respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether a first contact at a first location on the touch-sensitive surface (or whether rotation of the device) corresponds to a predefined event or sub-event, such as selection of an object on a user interface, or rotation of the device from one orientation to another. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. The event handler 190 optionally uses or invokes data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, the event handler 190 accesses a corresponding GUI updater 178 to update what is displayed by the application. Similarly, it will be apparent to one of ordinary skill in the art how other processes may be implemented based on the components shown in fig. 1A-1B.
The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilize the invention and various described embodiments with various modifications as are suited to the particular use contemplated.

Claims (18)

1. A method of displaying a user interface, comprising:
at an electronic device having a display and one or more input devices:
while the electronic device is in a power saving mode, detecting a first input; and
in response to detecting the first input:
activating the display of the electronic device from the power saving mode, and displaying a first user interface on the display, wherein the first user interface is an initial user interface that is displayed when the electronic device exits the power saving mode in response to the first input;
while displaying the first user interface, detecting a swipe gesture; and,
in response to detecting the swipe gesture:
in accordance with a determination that the swipe gesture is in a first direction and that the electronic device is in an unlocked mode when the swipe gesture is detected, replacing display of the first user interface with a second user interface while maintaining the unlocked mode of the electronic device, and
in accordance with a determination that the swipe gesture is in the first direction and that the electronic device is in a locked mode when the swipe gesture is detected, replacing display of the first user interface with the second user interface while maintaining the locked mode of the electronic device, wherein:
the first user interface and the second user interface are different from a home screen user interface that includes a plurality of application launch icons corresponding to a plurality of applications; and
the second user interface includes a first search input area;
detecting an input in the first search input area of the second user interface; and
in response to detecting the input in the first search input area of the second user interface:
in accordance with a determination that the electronic device is in the locked mode, displaying a first set of search results on the second user interface; and
in accordance with a determination that the electronic device is in the unlocked mode, displaying, on the second user interface, a second set of search results that is different from the first set of search results.
2. The method according to claim 1, comprising:
in response to detecting the swipe gesture:
in accordance with a determination that the electronic device is in the locked mode of the electronic device and the swipe gesture is in a second direction different from the first direction, replacing display of the first user interface with display of a third user interface that displays second content; and,
in accordance with a determination that the electronic device is in the unlocked mode of the electronic device and the swipe gesture is in the second direction, replacing display of the first user interface with display of the third user interface, the third user interface displaying the second content and second additional content that is not displayed when the electronic device is in the locked mode.
3. The method according to claim 2, wherein:
the second content includes one or more mini-application objects; and
the second additional content includes restricted mini-application object information.
4. The method according to claim 2, wherein:
the second content includes one or more notifications; and
the second additional content includes restricted notification information.
5. The method according to claim 2, wherein the third user interface includes a second search input area.
6. The method according to claim 5, comprising:
detecting an input in the second search input area; and
in response to detecting the input in the second search input area:
in accordance with a determination that the electronic device is in the locked mode, displaying a third set of search results; and
in accordance with a determination that the electronic device is in the unlocked mode, displaying a fourth set of search results that is different from the third set of search results.
7. The method according to any one of claims 1-6, wherein the determination of whether the electronic device is in the unlocked mode or the locked mode is made in response to detecting the first input.
8. The method according to any one of claims 1-6, wherein the determination of whether the electronic device is in the locked mode or the unlocked mode is made in response to detecting the swipe gesture.
9. The method according to any one of claims 1-6, wherein detecting the first input includes:
detecting a change in an environment of the electronic device using one or more sensors; and
in response to detecting the change in the environment of the electronic device, determining whether the change in the environment of the electronic device meets display-activation criteria.
10. The method according to any one of claims 1-6, wherein detecting the first input includes detecting activation of a display-activation affordance.
11. The method according to any one of claims 1-6, wherein detecting the first input includes detecting a contact with a display-activation affordance, the contact activating the display-activation affordance without actuating the display-activation affordance.
12. The method according to any one of claims 1-6, wherein detecting the first input includes detecting a contact with a display-activation affordance, the contact activating the display-activation affordance to activate the display of the electronic device and cause display of the first user interface.
13. The method according to any one of claims 1-6, comprising:
while displaying the first user interface, detecting a second input; and
in response to detecting the second input, replacing display of the first user interface with display of a user interface that includes a plurality of application icons corresponding to different applications of a plurality of applications.
14. The method according to claim 13, comprising:
in response to detecting the second input:
determining whether the electronic device is in the locked mode or the unlocked mode of the electronic device; and
in accordance with a determination that the electronic device is in the locked mode, displaying an authentication user interface.
15. The method according to any one of claims 1-6, comprising:
in response to detecting a respective input:
in accordance with a determination that the respective input meets device-unlocking criteria, replacing display of the first user interface with the home screen user interface that includes the plurality of application launch icons corresponding to the plurality of applications.
16. The method according to any one of claims 1-6, wherein:
displaying the first set of search results on the second user interface includes, in response to the input in the first search input area, displaying on the second user interface at least a first application icon that corresponds to a first application; and
displaying the second set of search results on the second user interface includes displaying, on the second user interface, the first application icon and at least a representation of first content within a second application, wherein the representation of the first content within the second application is not in the first set of search results.
17. A computer-readable storage medium storing one or more programs, the one or more programs including instructions that, when executed by an electronic device having a display and one or more input devices, cause the electronic device to perform the method according to any one of claims 1 to 16.
18. An electronic device, comprising:
a display;
one or more input devices;
one or more processors;
memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing the method according to any one of claims 1 to 16.
CN202110560299.6A 2016-06-12 2017-05-24 Device and method for accessing common device functions Active CN113093983B (en)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
US201662349100P 2016-06-12 2016-06-12
US62/349,100 2016-06-12
DKPA201670616A DK201670616A1 (en) 2016-06-12 2016-08-11 Devices and Methods for Accessing Prevalent Device Functions
DK201670616 2016-08-11
DK201670620 2016-08-12
DKPA201670621A DK201670621A1 (en) 2016-06-12 2016-08-12 Devices and Methods for Accessing Prevalent Device Functions
DK201670621 2016-08-12
DKPA201670620A DK201670620A1 (en) 2016-06-12 2016-08-12 Devices and Methods for Accessing Prevalent Device Functions
CN201710383083.0A CN107491257B (en) 2016-06-12 2017-05-24 Apparatus and method for accessing common device functions

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201710383083.0A Division CN107491257B (en) 2016-06-12 2017-05-24 Apparatus and method for accessing common device functions

Publications (2)

Publication Number Publication Date
CN113093983A CN113093983A (en) 2021-07-09
CN113093983B true CN113093983B (en) 2025-04-04

Family

ID=59054207

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201710383083.0A Active CN107491257B (en) 2016-06-12 2017-05-24 Apparatus and method for accessing common device functions
CN202110560299.6A Active CN113093983B (en) 2016-06-12 2017-05-24 Device and method for accessing common device functions

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201710383083.0A Active CN107491257B (en) 2016-06-12 2017-05-24 Apparatus and method for accessing common device functions

Country Status (2)

Country Link
CN (2) CN107491257B (en)
WO (1) WO2017218153A1 (en)

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8502856B2 (en) 2010-04-07 2013-08-06 Apple Inc. In conference display adjustments
US10521107B2 (en) 2016-09-24 2019-12-31 Apple Inc. Devices, methods, and graphical user interfaces for selecting and interacting with different device modes
US10466889B2 (en) 2017-05-16 2019-11-05 Apple Inc. Devices, methods, and graphical user interfaces for accessing notifications
US10372298B2 (en) 2017-09-29 2019-08-06 Apple Inc. User interface for multi-user communication session
KR102469754B1 (en) * 2018-02-13 2022-11-22 삼성전자주식회사 Image display apparatus and operating method thereof
CN110457104B (en) * 2018-05-07 2024-12-06 苹果公司 Multi-participant real-time communication user interface
DK201870335A1 (en) 2018-05-07 2019-12-04 Apple Inc. Devices, methods, and graphical user interfaces for proactive management of notifications
CN110456935A (en) * 2018-05-07 2019-11-15 苹果公司 Apparatus and method for adjusting the provision of notifications
DK201870364A1 (en) 2018-05-07 2019-12-03 Apple Inc. MULTI-PARTICIPANT LIVE COMMUNICATION USER INTERFACE
DK180118B1 (en) 2018-05-07 2020-05-15 Apple Inc. DEVICES AND METHODS FOR ADJUSTING THE PROVISION OF NOTIFICATIONS
DK201870358A1 (en) * 2018-06-03 2020-01-03 Apple Inc. Accelerated task performance
CN108923969A (en) * 2018-06-29 2018-11-30 海尔优家智能科技(北京)有限公司 A kind of method, apparatus of equipment linkage, equipment and computer readable storage medium
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
CN109711133B (en) * 2018-12-26 2020-05-15 巽腾(广东)科技有限公司 Identity information authentication method and device and server
CN109918001B (en) * 2019-03-28 2022-07-29 北京小米移动软件有限公司 Interface display method, device and storage medium
CN112035175B (en) * 2019-05-17 2023-04-07 成都鼎桥通信技术有限公司 Application setting method and device
CN112486604B (en) * 2019-09-12 2024-10-29 北京搜狗科技发展有限公司 Toolbar setting method and device for setting toolbar
CN111263002B (en) * 2020-01-19 2022-08-26 华为技术有限公司 Display method and electronic equipment
CN114766015A (en) * 2020-03-10 2022-07-19 苹果公司 Device, method and graphical user interface for interacting with user interface objects corresponding to an application
US11416127B2 (en) 2020-03-10 2022-08-16 Apple Inc. Devices, methods, and graphical user interfaces for interacting with user interface objects corresponding to applications
AU2021102471B4 (en) 2020-05-11 2022-02-10 Apple Inc. System, method and user interface for supporting scheduled mode changes on electronic devices
AU2020233622B2 (en) 2020-05-11 2022-03-10 Apple Inc. System, method and user interface for supporting scheduled mode changes on electronic devices
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
US11671697B2 (en) 2021-01-31 2023-06-06 Apple Inc. User interfaces for wide angle video conference
US12301979B2 (en) 2021-01-31 2025-05-13 Apple Inc. User interfaces for wide angle video conference
US12170579B2 (en) 2021-03-05 2024-12-17 Apple Inc. User interfaces for multi-participant live communication
US11379106B1 (en) 2021-05-12 2022-07-05 Apple Inc. Devices, methods, and graphical user interfaces for adjusting the provision of notifications
CN120881039A (en) 2021-05-15 2025-10-31 苹果公司 Real-time communication user interface
US11907605B2 (en) 2021-05-15 2024-02-20 Apple Inc. Shared-content session user interfaces
US11822761B2 (en) 2021-05-15 2023-11-21 Apple Inc. Shared-content session user interfaces
US11893214B2 (en) 2021-05-15 2024-02-06 Apple Inc. Real-time communication user interface
US12449961B2 (en) 2021-05-18 2025-10-21 Apple Inc. Adaptive video conference user interfaces
US12368946B2 (en) 2021-09-24 2025-07-22 Apple Inc. Wide angle video conference
US11770600B2 (en) 2021-09-24 2023-09-26 Apple Inc. Wide angle video conference
US12267622B2 (en) 2021-09-24 2025-04-01 Apple Inc. Wide angle video conference
CN114416190B (en) * 2022-01-19 2023-11-24 深圳市诠云科技有限公司 Android device and computer USB linkage dormancy awakening circuit
US12265687B2 (en) 2022-05-06 2025-04-01 Apple Inc. Devices, methods, and graphical user interfaces for updating a session region
US12405706B2 (en) 2022-05-10 2025-09-02 Apple Inc. Devices, methods, and graphical user interfaces for providing focus modes
US12541277B2 (en) 2022-06-05 2026-02-03 Apple Inc. Systems and methods for interacting with multiple applications on an electronic device
CN117555410B (en) * 2022-08-05 2025-09-12 荣耀终端股份有限公司 5G function switch state switching method and electronic device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105122267A (en) * 2013-03-15 2015-12-02 苹果公司 Mobile computing device with multiple access modes

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070177804A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
CN101090518B (en) * 2006-06-12 2010-09-29 彭建强 System and method for quickly starting application function
US10169484B2 (en) * 2010-09-23 2019-01-01 Fisher-Rosemount Systems, Inc. Methods and apparatus to manage process control search results
JP6073782B2 (en) * 2011-05-16 2017-02-01 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Display device, display control method and display control program, and input device, input support method and program
CN102841732B (en) * 2011-06-23 2017-11-14 腾讯科技(深圳)有限公司 The method and device of task management in a kind of terminal
KR101563150B1 (en) * 2011-09-09 2015-10-28 주식회사 팬택 Method for providing shortcut in lock screen and portable device employing the same
TW201324307A (en) * 2011-12-08 2013-06-16 Acer Inc Electronic apparatus and method for controlling the same
US9213822B2 (en) * 2012-01-20 2015-12-15 Apple Inc. Device, method, and graphical user interface for accessing an application in a locked device
US9778706B2 (en) * 2012-02-24 2017-10-03 Blackberry Limited Peekable user interface on a portable electronic device
CN103873634A (en) * 2012-12-07 2014-06-18 盛乐信息技术(上海)有限公司 Terminal function setting method and terminal function setting system
CN104424410B (en) * 2013-09-05 2018-10-19 深圳市艾酷通信软件有限公司 Mobile intelligent terminal divides the method and its system that safe class quickly starting is applied
CN103955337B (en) * 2014-05-06 2017-07-21 北京金山安全软件有限公司 Method and system for opening application program in mobile terminal

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105122267A (en) * 2013-03-15 2015-12-02 苹果公司 Mobile computing device with multiple access modes

Also Published As

Publication number Publication date
CN113093983A (en) 2021-07-09
CN107491257A (en) 2017-12-19
WO2017218153A1 (en) 2017-12-21
CN107491257B (en) 2021-05-18

Similar Documents

Publication Publication Date Title
JP7781803B2 (en) Electronic device for accessing general device functionality, method and computer program executed on the electronic device - Patent Application 20070122999
CN113093983B (en) Device and method for accessing common device functions
WO2015183537A1 (en) User interface for phone call routing among devices
EP3469470A1 (en) Accelerated scrolling
US20220365638A1 (en) Devices, Methods, and Graphical User Interfaces for Displaying Media Items Shared from Distinct Applications
CN117321560A (en) Systems and methods for interacting with user interfaces
DK201670621A1 (en) Devices and Methods for Accessing Prevalent Device Functions

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant