CN111339032B - Device, method and graphical user interface for managing folders with multiple pages - Google Patents
- Publication number
- CN111339032B (application CN202010125835.5A)
- Authority
- CN
- China
- Prior art keywords
- folder
- page
- icon
- icons
- user interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A device displays a plurality of selectable user interface objects, including one or more folder icons, on a display. The device detects a first input corresponding to a request to select a folder icon for a folder, where the folder contains selectable icons that are divided among a plurality of distinct, separately displayed pages of a folder view. In response to detecting the first input, the device displays a first page of the folder view. While displaying the first page of the folder view, the device detects a second input corresponding to a request to display a second page of the folder view. In response to detecting the second input, the device ceases to display the first page and displays the second page of the folder view.
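The paging behavior summarized above can be sketched as a small model: the folder's icons are divided among distinct pages, exactly one of which is displayed at a time. This is an illustrative sketch only, not the patented implementation; the `FolderView` class, its method names, and the page size of 9 are hypothetical.

```python
class FolderView:
    """Hypothetical sketch of a folder view whose icons are split across pages."""

    def __init__(self, icons, icons_per_page=9):
        # Divide the folder's icons among distinct, separately displayed pages.
        self.pages = [icons[i:i + icons_per_page]
                      for i in range(0, len(icons), icons_per_page)]
        self.current_page = 0

    def visible_icons(self):
        # Only the current page of the folder view is displayed.
        return self.pages[self.current_page]

    def show_page(self, index):
        # Cease displaying the current page and display the requested one.
        if 0 <= index < len(self.pages):
            self.current_page = index

view = FolderView([f"app{n}" for n in range(12)])
first = view.visible_icons()   # first page: app0..app8
view.show_page(1)              # e.g. a swipe requesting the second page
second = view.visible_icons()  # second page: app9..app11
```

Twelve icons at nine per page yield two pages; switching pages replaces the visible set rather than extending it, matching the "ceases to display the first page" language.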
Description
The present application is a divisional application of Chinese patent application No. 201480001676.0, filed on May 30, 2014 and entitled "Device, Method and Graphical User Interface for Managing Folders with Multiple Pages".
Cross Reference to Related Applications
The present application is a continuation-in-part of U.S. application Ser. No. 12/888,362, entitled "Device, Method, and Graphical User Interface for Managing Folders," filed on September 22, 2010, which claims priority from U.S. Provisional Application Ser. No. 61/321,872, filed in April 2010, both of which are incorporated herein by reference in their entirety.
The present application also relates to the following applications, each entitled "Device, Method, and Graphical User Interface for Managing Folders" and filed on September 22, 2010: (1) U.S. application Ser. No. 12/888,366; (2) U.S. application Ser. No. 12/888,370; (3) U.S. application Ser. No. 12/888,373; (4) U.S. application Ser. No. 12/888,375; (5) U.S. application Ser. No. 12/888,376; and (6) U.S. application Ser. No. 12/888,377. All of these applications are incorporated herein by reference in their entirety.
Technical Field
Embodiments of the present disclosure generally relate to electronic devices having touch-sensitive surfaces, including but not limited to electronic devices having touch-sensitive surfaces for managing folders.
Background
In recent years, the use of touch-sensitive surfaces as input devices for computers and other electronic computing devices has increased significantly. Exemplary touch-sensitive surfaces include touch pads and touch screen displays. These surfaces are widely used to manage folders by manipulating selectable user interface objects on a display.
Exemplary manipulations include creating a folder, displaying a folder view associated with the folder, adding selectable user interface objects (e.g., application icons, file icons, folder icons, etc.) to the folder, removing selectable user interface objects from the folder, repositioning selectable user interface objects within the folder view of the folder, repositioning folder icons within the arrangement of selectable user interface objects, and deleting the folder. Exemplary selectable user interface objects include icons representing applications, digital images, videos, text, and other files, as well as application icons associated with computing applications (e.g., mobile device applications and/or personal computer applications).
However, existing methods for performing these manipulations are cumbersome and inefficient. For example, using a series of inputs to create, modify, and/or delete folders and folder content is tedious and places a significant cognitive burden on the user. In addition, existing methods take longer than necessary, thereby wasting energy. This latter consideration is particularly important in battery-powered devices.
Disclosure of Invention
Accordingly, there is a need for a computing device having faster, more efficient methods and interfaces for managing folders. These methods and interfaces may supplement or replace traditional methods for managing folders. These methods and interfaces reduce the cognitive burden on the user and result in a more efficient human-machine interface. For battery powered computing devices, these methods and interfaces save power and increase the time between battery charges.
The above-described drawbacks and other problems associated with user interfaces for computing devices having touch-sensitive surfaces are reduced or eliminated by the devices of the present disclosure. In some embodiments, the device is a desktop computer. In some embodiments, the device is portable (e.g., a notebook, tablet, or handheld device). In some embodiments, the device has a touch pad. In some embodiments, the device has a touch sensitive display (also known as a "touch screen" or "touch screen display"). In some embodiments, the device has a Graphical User Interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing a plurality of functions. In some embodiments, the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive surface. In some embodiments, the functions may include image editing, drawing, rendering, word processing, website creation, disk authoring, spreadsheet making, game playing, phone calls, video conferencing, sending email, instant messaging, workout support, digital photography, digital video recording, web browsing, digital music playing, and/or digital video playing. Executable instructions for performing these functions may be included in a computer-readable storage medium or other computer program product configured for execution by one or more processors.
According to some embodiments, a multi-function device includes a display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: displaying a plurality of selectable user interface objects on a display; detecting a first input; and in response to detecting the first input, moving a first object of the plurality of selectable user interface objects across the display to a position on the display that is in proximity to a second object of the plurality of selectable user interface objects. The one or more programs further include instructions for: detecting that the first input meets a predefined folder creation criterion while the first object is in proximity to the second object; and in response to detecting that the first input satisfies the predefined folder creation criteria while the first object is in proximity to the second object, creating a folder containing the first object and the second object.
According to some embodiments, a method is performed at a multifunction device with a display. The method comprises the following steps: displaying a plurality of selectable user interface objects on a display; detecting a first input; and in response to detecting the first input, moving a first object of the plurality of selectable user interface objects across the display to a position on the display that is in proximity to a second object of the plurality of selectable user interface objects. The method further comprises the steps of: detecting that the first input meets a predefined folder creation criterion while the first object is in proximity to the second object; and in response to detecting that the first input satisfies the predefined folder creation criteria while the first object is in proximity to the second object, creating a folder containing the first object and the second object.
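The folder-creation flow described above can be sketched as follows. The patent leaves the "predefined folder creation criteria" abstract; as an assumption for illustration, this sketch combines proximity with a dwell time, and the names `Icon`, `near`, `maybe_create_folder`, and the numeric thresholds are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: float
    y: float

@dataclass
class Folder:
    items: list

def near(a, b, proximity_threshold=40.0):
    # "In proximity to": within a small distance of each other on the display.
    return (abs(a.x - b.x) <= proximity_threshold
            and abs(a.y - b.y) <= proximity_threshold)

def maybe_create_folder(dragged, target, dwell_time, min_dwell=0.5):
    """Create a folder containing both objects when the input meets the
    predefined folder-creation criteria (here, assumed to be proximity
    plus a brief pause of the dragged object over the target)."""
    if near(dragged, target) and dwell_time >= min_dwell:
        return Folder(items=[dragged, target])
    return None  # criteria not met; no folder is created

mail = Icon("Mail", 100, 100)
notes = Icon("Notes", 120, 110)
folder = maybe_create_folder(mail, notes, dwell_time=0.8)
```

With the two icons 20 points apart and a 0.8 s dwell, the criteria are met and a folder containing both objects is created; dropping an icon far from any target, or moving it past without pausing, creates nothing.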
According to some embodiments, a graphical user interface on a multi-function device with a display, a memory, and one or more processors to execute one or more programs stored in the memory includes a plurality of selectable user interface objects. A first input is detected, and in response to detecting the first input, a first object of the plurality of selectable user interface objects is moved across the display to a position on the display that is in proximity to a second object of the plurality of selectable user interface objects. Detecting that the first input meets a predefined folder creation criterion while the first object is in proximity to the second object; and creating a folder containing the first object and the second object in response to detecting that the first input satisfies the predefined folder creation criteria while the first object is in proximity to the second object.
According to some embodiments, a computer readable storage medium has stored therein instructions that, when executed by a multifunction device with a display, cause the device to: displaying a plurality of selectable user interface objects on a display; detecting a first input; and in response to detecting the first input, moving a first object of the plurality of selectable user interface objects across the display to a position on the display that is in proximity to a second object of the plurality of selectable user interface objects. The instructions further cause the apparatus to: detecting that the first input meets a predefined folder creation criterion while the first object is in proximity to the second object; and in response to detecting that the first input satisfies the predefined folder creation criteria while the first object is in proximity to the second object, creating a folder containing the first object and the second object.
According to some embodiments, a multifunction device includes: a display; means for displaying a plurality of selectable user interface objects on a display; means for detecting a first input; and in response to detecting the first input, means for moving a first object of the plurality of selectable user interface objects across the display to a position on the display that is in proximity to a second object of the plurality of selectable user interface objects. The apparatus further includes means for detecting that the first input meets a predefined folder creation criterion while the first object is in proximity to the second object; and means for creating a folder containing the first object and the second object in response to detecting that the first input satisfies the predefined folder creation criteria while the first object is in proximity to the second object.
According to some embodiments, an information processing apparatus for use in a multifunction device with a display includes: means for detecting a first input; and in response to detecting the first input, means for moving a first object of the plurality of selectable user interface objects across the display to a position on the display that is in proximity to a second object of the plurality of selectable user interface objects. The information processing apparatus further includes means for detecting that the first input meets a predefined folder creation criterion when the first object is located in proximity to the second object; and means for creating a folder containing the first object and the second object in response to detecting that the first input satisfies the predefined folder creation criteria while the first object is in proximity to the second object.
According to some embodiments, a multi-function device includes a display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: one or more action icons and one or more folder icons are displayed simultaneously on the display. The multifunction device has a normal operating mode for activating an application and a user interface reconfiguration mode for rearranging action icons and folder icons on a display. The one or more programs further include instructions for: detecting a first input; and in response to detecting the first input: displaying folder content associated with a respective folder icon when the first input corresponds to a request to select the respective folder icon of the one or more folder icons, regardless of whether the multifunction device is in a normal operating mode or a user interface reconfiguration mode; and when the first input corresponds to a request to select a respective action icon of the one or more action icons: activating an application associated with a corresponding action icon when the multifunction device is in a normal operating mode; and continuing to display the corresponding action icon without activating the application associated with the corresponding action icon while the multifunction device is in the user interface reconfiguration mode.
According to some embodiments, a method is performed at a multifunction device with a display. The method comprises the following steps: one or more action icons and one or more folder icons are displayed simultaneously on the display. The multifunction device has a normal operating mode for activating an application and a user interface reconfiguration mode for rearranging action icons and folder icons on a display. The method further comprises the steps of: detecting a first input; and in response to detecting the first input: displaying folder content associated with a respective folder icon when the first input corresponds to a request to select the respective folder icon of the one or more folder icons, regardless of whether the multifunction device is in a normal operating mode or a user interface reconfiguration mode; and when the first input corresponds to a request to select a respective action icon of the one or more action icons: activating an application associated with a corresponding action icon when the multifunction device is in a normal operating mode; and continuing to display the corresponding action icon without activating the application associated with the corresponding action icon while the multifunction device is in the user interface reconfiguration mode.
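The mode-dependent dispatch described above reduces to a small decision: folder icons open their folder view in either mode, while action icons activate their application only in the normal operating mode. The following sketch is illustrative (the dictionary shape, constants, and callback names are hypothetical, not the patented implementation):

```python
NORMAL, RECONFIGURE = "normal", "reconfigure"

def handle_selection(icon, mode, open_folder, activate_app):
    """Dispatch a selection input according to the current operating mode."""
    if icon["kind"] == "folder":
        # Folder content is displayed regardless of the current mode.
        return open_folder(icon)
    if mode == NORMAL:
        # Normal operating mode: activate the associated application.
        return activate_app(icon)
    # Reconfiguration mode: the action icon stays displayed (it may be
    # rearranged), but its application is not activated.
    return None

events = []
open_folder = lambda icon: events.append(("open", icon["name"]))
activate_app = lambda icon: events.append(("launch", icon["name"]))

handle_selection({"kind": "folder", "name": "Games"}, RECONFIGURE, open_folder, activate_app)
handle_selection({"kind": "action", "name": "Mail"}, RECONFIGURE, open_folder, activate_app)
handle_selection({"kind": "action", "name": "Mail"}, NORMAL, open_folder, activate_app)
```

Of the three selections, only the folder tap in reconfiguration mode and the action tap in normal mode produce an effect; the action tap in reconfiguration mode is deliberately inert.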
According to some embodiments, a graphical user interface on a multi-function device with a display, a memory, and one or more processors to execute one or more programs stored in the memory includes one or more action icons and one or more folder icons. One or more action icons and one or more folder icons are simultaneously displayed on the display. The multifunction device has a normal operating mode for activating an application and a user interface reconfiguration mode for rearranging action icons and folder icons on a display. Detecting a first input; and in response to detecting the first input: when the first input corresponds to a request to select a respective folder icon of the one or more folder icons, folder content associated with the respective folder icon is displayed regardless of whether the multifunction device is in a normal operating mode or a user interface reconfiguration mode; and when the first input corresponds to a request to select a respective action icon of the one or more action icons: when the multifunctional device is in a normal operation mode, an application associated with the corresponding action icon is activated; and, while the multifunction device is in the user interface reconfiguration mode, the respective action icon continues to be displayed without activating the application associated with the respective action icon.
According to some embodiments, a computer readable storage medium has stored therein instructions that, when executed by a multifunction device with a display, cause the device to: one or more action icons and one or more folder icons are displayed simultaneously on the display. The multifunction device has a normal operating mode for activating an application and a user interface reconfiguration mode for rearranging action icons and folder icons on a display. The instructions further cause the apparatus to: detecting a first input; and in response to detecting the first input: displaying folder content associated with a respective folder icon when the first input corresponds to a request to select the respective folder icon of the one or more folder icons, regardless of whether the multifunction device is in a normal operating mode or a user interface reconfiguration mode; and when the first input corresponds to a request to select a respective action icon of the one or more action icons: activating an application associated with a corresponding action icon when the multifunction device is in a normal operating mode; and continuing to display the corresponding action icon without activating the application associated with the corresponding action icon while the multifunction device is in the user interface reconfiguration mode.
According to some embodiments, a multi-function device includes: a display; means for simultaneously displaying one or more action icons and one or more folder icons on the display. The multifunction device has a normal operating mode for activating an application and a user interface reconfiguration mode for rearranging action icons and folder icons on a display. The apparatus further comprises: means for detecting a first input; and in response to detecting the first input, means for: displaying folder content associated with a respective folder icon when the first input corresponds to a request to select the respective folder icon of the one or more folder icons, regardless of whether the multifunction device is in a normal operating mode or a user interface reconfiguration mode; and when the first input corresponds to a request to select a respective action icon of the one or more action icons: activating an application associated with a corresponding action icon when the multifunction device is in a normal operating mode; and continuing to display the corresponding action icon without activating the application associated with the corresponding action icon while the multifunction device is in the user interface reconfiguration mode.
According to some embodiments, an information processing apparatus for use in a multifunction device with a display includes: means for simultaneously displaying one or more action icons and one or more folder icons on the display. The multifunction device has a normal operating mode for activating an application and a user interface reconfiguration mode for rearranging action icons and folder icons on a display. The information processing apparatus further includes: means for detecting a first input; and in response to detecting the first input, means for: displaying folder content associated with a respective folder icon when the first input corresponds to a request to select the respective folder icon of the one or more folder icons, regardless of whether the multifunction device is in a normal operating mode or a user interface reconfiguration mode; and when the first input corresponds to a request to select a respective action icon of the one or more action icons: activating an application associated with a corresponding action icon when the multifunction device is in a normal operating mode; and continuing to display the corresponding action icon without activating the application associated with the corresponding action icon while the multifunction device is in the user interface reconfiguration mode.
According to some embodiments, a multi-function device includes a display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: receiving input corresponding to a request to create a folder containing a first item and a second item; and in response to receiving the input, creating a folder containing the first item and the second item; determining a first plurality of descriptors associated with the first item; and determining a second plurality of descriptors associated with the second item. The one or more programs further include instructions for: when the first plurality of descriptors and the second plurality of descriptors share at least a first common descriptor: automatically generating a folder name for the folder based on the first common descriptor; and displaying an icon for the folder on the display with the automatically generated folder name.
According to some embodiments, a method is performed at a multifunction device with a display. The method includes: receiving input corresponding to a request to create a folder containing a first item and a second item; and in response to receiving the input: creating a folder containing the first item and the second item; determining a first plurality of descriptors associated with the first item; and determining a second plurality of descriptors associated with the second item. The method further includes, when the first plurality of descriptors and the second plurality of descriptors share at least a first common descriptor: automatically generating a folder name for the folder based on the first common descriptor; and displaying an icon for the folder on the display with the automatically generated folder name.
According to some embodiments, a graphical user interface on a multifunction device with a display, a memory, and one or more processors to execute one or more programs stored in the memory includes a first item and a second item. An input corresponding to a request to create a folder containing the first item and the second item is received; and in response to receiving the input: a folder is created containing the first item and the second item; a first plurality of descriptors associated with the first item is determined; and a second plurality of descriptors associated with the second item is determined. When the first plurality of descriptors and the second plurality of descriptors share at least a first common descriptor: a folder name for the folder is automatically generated based on the first common descriptor; and an icon for the folder is displayed on the display with the automatically generated folder name.
According to some embodiments, a computer readable storage medium has stored therein instructions that, when executed by a multifunction device with a display, cause the device to: receiving input corresponding to a request to create a folder containing a first item and a second item; and in response to receiving the input: creating a folder containing the first item and the second item; determining a first plurality of descriptors associated with the first item; and determining a second plurality of descriptors associated with the second item. The instructions further cause the device to, when the first plurality of descriptors and the second plurality of descriptors share at least a first common descriptor: automatically generating a folder name for the folder based on the first common descriptor; and displaying an icon for the folder on the display with the automatically generated folder name.
According to some embodiments, a multifunction device includes: a display; means for receiving input corresponding to a request to create a folder containing a first item and a second item; and means, in response to receiving the input, for: creating a folder containing the first item and the second item; determining a first plurality of descriptors associated with the first item; and determining a second plurality of descriptors associated with the second item. The device further includes means for performing the following when the first plurality of descriptors and the second plurality of descriptors share at least a first common descriptor: automatically generating a folder name for the folder based on the first common descriptor; and displaying an icon for the folder on the display with the automatically generated folder name.
According to some embodiments, an information processing apparatus for use in a multifunction device with a display includes: means for receiving input corresponding to a request to create a folder containing a first item and a second item; and means for performing the following in response to receiving the input: creating a folder containing the first item and the second item; determining a first plurality of descriptors associated with the first item; and determining a second plurality of descriptors associated with the second item. The information processing apparatus further includes means for performing the following when the first plurality of descriptors and the second plurality of descriptors share at least a first common descriptor: automatically generating a folder name for the folder based on the first common descriptor; and displaying an icon for the folder on the display with the automatically generated folder name.
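The automatic naming behavior in the embodiments above can be sketched as follows. This is an illustrative Python sketch under the assumption that each item carries an ordered list of descriptor strings; the function name and sample descriptors are hypothetical.

```python
def auto_folder_name(first_descriptors, second_descriptors):
    """Return a folder name taken from the first descriptor the two
    items share, or None when no descriptor is common to both."""
    # Preserve the order of the first item's descriptors so that the
    # "first common descriptor" is well defined.
    common = [d for d in first_descriptors if d in second_descriptors]
    return common[0] if common else None
```

When the two descriptor lists share no element, no name is generated and the device would fall back to some other naming scheme.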
According to some embodiments, a multifunction device includes a display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: a plurality of icons is displayed on a display. A first icon of the plurality of icons is displayed at a first location on the display. A second icon of the plurality of icons that is distinct from the first icon has an activation region of a default size. The one or more programs further include instructions for: detecting an input corresponding to a request to move the first icon; changing the size of the activation region for the second icon from a default size based on a distance from the first location to the location of the second icon; and in response to detecting the input, moving the first icon across the display away from the first location. The one or more programs also include instructions for: detecting that the input meets a predefined trigger criterion; and in response to detecting that the input meets a predefined trigger criterion: performing a first operation associated with the second icon when the first icon is at least partially within the activation region of the second icon; and performing a second operation distinct from the first operation when the first icon is outside the activation region of the second icon.
According to some embodiments, a method is performed at a multifunction device with a display. The method includes: a plurality of icons is displayed on a display. A first icon of the plurality of icons is displayed at a first location on the display. A second icon of the plurality of icons that is distinct from the first icon has an activation region of a default size. The method further includes: detecting an input corresponding to a request to move the first icon; changing the size of the activation region for the second icon from a default size based on a distance from the first location to the location of the second icon; and in response to detecting the input, moving the first icon across the display away from the first location. The method further includes: detecting that the input meets a predefined trigger criterion; and in response to detecting that the input meets a predefined trigger criterion: performing a first operation associated with the second icon when the first icon is at least partially within the activation region of the second icon; and performing a second operation distinct from the first operation when the first icon is outside the activation region of the second icon.
According to some embodiments, a graphical user interface on a multifunction device with a display, a memory, and one or more processors to execute one or more programs stored in the memory includes a plurality of icons displayed on the display. A first icon of the plurality of icons is displayed at a first location on the display. A second icon of the plurality of icons that is distinct from the first icon has an activation region of a default size. An input corresponding to a request to move a first icon is detected. The size of the activation region for the second icon is changed from a default size based on the distance from the first location to the location of the second icon. In response to detecting the input, the first icon is moved across the display away from the first location. It is detected that the input meets a predefined trigger criterion; and in response to detecting that the input meets the predefined trigger criterion: a first operation associated with the second icon is performed when the first icon is at least partially within the activation region of the second icon; and a second operation distinct from the first operation is performed when the first icon is outside the activation region of the second icon.
According to some embodiments, a computer readable storage medium has stored therein instructions that, when executed by a multifunction device with a display, cause the device to: a plurality of icons is displayed on a display. A first icon of the plurality of icons is displayed at a first location on the display. A second icon of the plurality of icons that is distinct from the first icon has an activation region of a default size. The instructions further cause the device to: detecting an input corresponding to a request to move the first icon; changing the size of the activation region for the second icon from a default size based on a distance from the first location to the location of the second icon; and in response to detecting the input, moving the first icon across the display away from the first location. The instructions further cause the device to: detecting that the input meets a predefined trigger criterion; and in response to detecting that the input meets a predefined trigger criterion: performing a first operation associated with the second icon when the first icon is at least partially within the activation region of the second icon; and performing a second operation distinct from the first operation when the first icon is outside the activation region of the second icon.
According to some embodiments, a multifunction device includes: a display; means for displaying a plurality of icons on the display. A first icon of the plurality of icons is displayed at a first location on the display. A second icon of the plurality of icons that is distinct from the first icon has an activation region of a default size. The device further comprises: means for detecting an input corresponding to a request to move the first icon; means for changing the size of the activation region for the second icon from a default size based on a distance from the first location to the location of the second icon; and means for moving the first icon across the display away from the first location in response to detecting the input. The device further comprises: means for detecting that the input meets a predefined trigger criterion; and means for, in response to detecting that the input meets a predefined trigger criterion, performing the following: performing a first operation associated with the second icon when the first icon is at least partially within the activation region of the second icon; and performing a second operation distinct from the first operation when the first icon is outside the activation region of the second icon.
According to some embodiments, an information processing apparatus for use in a multifunction device with a display includes: means for displaying a plurality of icons on the display. A first icon of the plurality of icons is displayed at a first location on the display. A second icon of the plurality of icons that is distinct from the first icon has an activation region of a default size. The information processing apparatus further includes: means for detecting an input corresponding to a request to move the first icon; means for changing the size of the activation region for the second icon from a default size based on a distance from the first location to the location of the second icon; and means for moving the first icon across the display away from the first location in response to detecting the input. The information processing apparatus further includes: means for detecting that the input meets a predefined trigger criterion; and means for, in response to detecting that the input meets a predefined trigger criterion, performing the following: performing a first operation associated with the second icon when the first icon is at least partially within the activation region of the second icon; and performing a second operation distinct from the first operation when the first icon is outside the activation region of the second icon.
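The activation-region logic in the embodiments above can be sketched as follows. This is an illustrative Python sketch: the embodiments only require that the region's size change from the default "based on" the distance from the first location to the second icon, so the specific growth rule (and the function names and `scale` parameter) are assumptions for illustration.

```python
import math

def activation_radius(default_radius, distance, scale=0.25):
    """Illustrative sizing rule: grow the second icon's activation
    region with the distance between the first icon's original
    location and the second icon."""
    return default_radius * (1.0 + scale * distance / 100.0)

def operation_on_trigger(first_icon_point, second_icon_center, radius):
    """Pick the operation once the input meets the trigger criterion."""
    if math.dist(first_icon_point, second_icon_center) <= radius:
        return "first_operation"   # at least partially within the region
    return "second_operation"      # outside the activation region
```

A larger activation region for distant icons makes a long drag more forgiving, since the dragged icon need not land precisely on the target.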
According to some embodiments, a multifunction device includes a display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: displaying a plurality of icons on the display in a first arrangement; detecting an input corresponding to a request to move a first icon of the plurality of icons from a first location on the display to a second location on the display; and in response to detecting the input: moving the first icon from the first location to the second location; and maintaining the position of each respective icon of the plurality of icons other than the first icon until the auto-reconfiguration criteria have been met. The one or more programs further include instructions for: moving one or more of the plurality of icons other than the first icon to form a second arrangement that is distinct from the first arrangement when the auto-reconfiguration criteria have been met.
According to some embodiments, a method is performed at a multifunction device with a display. The method includes: displaying a plurality of icons on the display in a first arrangement; detecting an input corresponding to a request to move a first icon of the plurality of icons from a first location on the display to a second location on the display; and in response to detecting the input: moving the first icon from the first location to the second location; and maintaining the position of each respective icon of the plurality of icons other than the first icon until the auto-reconfiguration criteria have been met. The method further includes: moving one or more of the plurality of icons other than the first icon to form a second arrangement that is distinct from the first arrangement when the auto-reconfiguration criteria have been met.
According to some embodiments, a graphical user interface on a multifunction device with a display, a memory, and one or more processors to execute one or more programs stored in the memory includes a plurality of icons in a first arrangement on the display. An input corresponding to a request to move a first icon of the plurality of icons from a first location on the display to a second location on the display is detected; and in response to detecting the input: the first icon is moved from the first location to the second location; and the position of each respective icon of the plurality of icons other than the first icon is maintained until the auto-reconfiguration criteria have been met. One or more of the plurality of icons other than the first icon are moved to form a second arrangement that is distinct from the first arrangement when the auto-reconfiguration criteria have been met.
According to some embodiments, a computer readable storage medium has stored therein instructions that, when executed by a multifunction device with a display, cause the device to: displaying a plurality of icons on a display in a first arrangement; detecting an input corresponding to a request to move a first icon of the plurality of icons from a first location on the display to a second location on the display; and in response to detecting the input: moving the first icon from the first location to the second location; and maintaining the position of each respective icon of the plurality of icons other than the first icon until the auto-reconfiguration criteria have been met. The instructions further cause the device to: moving one or more of the plurality of icons other than the first icon to form a second arrangement that is distinct from the first arrangement when the auto-reconfiguration criteria have been met.
According to some embodiments, a multifunction device includes: a display; means for displaying a plurality of icons on the display in a first arrangement; means for detecting an input corresponding to a request to move a first icon of the plurality of icons from a first location on the display to a second location on the display; and means, in response to detecting the input, for: moving the first icon from the first location to the second location; and maintaining the position of each respective icon of the plurality of icons other than the first icon until the auto-reconfiguration criteria have been met. The device further comprises means for: moving one or more of the plurality of icons other than the first icon to form a second arrangement that is distinct from the first arrangement when the auto-reconfiguration criteria have been met.
According to some embodiments, an information processing apparatus for use in a multifunction device with a display includes: means for displaying a plurality of icons on the display in a first arrangement; means for detecting an input corresponding to a request to move a first icon of the plurality of icons from a first location on the display to a second location on the display; and means, in response to detecting the input, for: moving the first icon from the first location to the second location; and maintaining the position of each respective icon of the plurality of icons other than the first icon until the auto-reconfiguration criteria have been met. The information processing apparatus further includes means for: moving one or more of the plurality of icons other than the first icon to form a second arrangement that is distinct from the first arrangement when the auto-reconfiguration criteria have been met.
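The deferred reflow described above can be sketched as follows. This is an illustrative Python sketch: the arrangement is modeled as an ordered list, and the trigger for reflow (for example, the end of the drag input) is reduced to a boolean `criteria_met` flag; the names are hypothetical.

```python
def reconfigure(arrangement, first_icon, target_index, criteria_met):
    """Sketch of deferred reflow: while the auto-reconfiguration
    criteria are unmet, every icon other than the dragged one keeps
    its slot; once the criteria are met, the remaining icons shift
    to form the second arrangement."""
    if not criteria_met:
        # Positions of the other icons are maintained.
        return list(arrangement)
    others = [icon for icon in arrangement if icon != first_icon]
    return others[:target_index] + [first_icon] + others[target_index:]
```

Deferring the reflow keeps the grid visually stable while the user is still dragging, instead of reshuffling icons under the finger.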
According to some embodiments, a multifunction device includes a display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: displaying a dynamic folder icon. The dynamic folder icon includes a visual indicator of the current content in the folder associated with the dynamic folder icon. The one or more programs further include instructions for: detecting an input corresponding to a request to modify the content in the folder; and in response to detecting the input: modifying the content in the folder; and updating the dynamic folder icon to include a visual indicator of the spatial arrangement of the modified content within the folder.
According to some embodiments, a method is performed at a multifunction device with a display. The method includes: displaying a dynamic folder icon. The dynamic folder icon includes a visual indicator of the current content in the folder associated with the dynamic folder icon. The method further includes: detecting an input corresponding to a request to modify the content in the folder; and in response to detecting the input: modifying the content in the folder; and updating the dynamic folder icon to include a visual indicator of the spatial arrangement of the modified content within the folder.
According to some embodiments, a graphical user interface on a multifunction device with a display, a memory, and one or more processors to execute one or more programs stored in the memory includes a dynamic folder icon. The dynamic folder icon includes a visual indicator of the current content in the folder associated with the dynamic folder icon. An input corresponding to a request to modify the content in the folder is detected; and in response to detecting the input: the content in the folder is modified; and the dynamic folder icon is updated to include a visual indicator of the spatial arrangement of the modified content within the folder.
According to some embodiments, a computer readable storage medium has stored therein instructions that, when executed by a multifunction device with a display, cause the device to: displaying a dynamic folder icon. The dynamic folder icon includes a visual indicator of the current content in the folder associated with the dynamic folder icon. The instructions further cause the device to: detecting an input corresponding to a request to modify the content in the folder; and in response to detecting the input: modifying the content in the folder; and updating the dynamic folder icon to include a visual indicator of the spatial arrangement of the modified content within the folder.
According to some embodiments, a multifunction device includes: a display; means for displaying a dynamic folder icon. The dynamic folder icon includes a visual indicator of the current content in the folder associated with the dynamic folder icon. The device further comprises: means for detecting an input corresponding to a request to modify the content in the folder; and means, in response to detecting the input, for: modifying the content in the folder; and updating the dynamic folder icon to include a visual indicator of the spatial arrangement of the modified content within the folder.
According to some embodiments, an information processing apparatus for use in a multifunction device with a display includes: means for displaying a dynamic folder icon. The dynamic folder icon includes a visual indicator of the current content in the folder associated with the dynamic folder icon. The information processing apparatus further includes: means for detecting an input corresponding to a request to modify the content in the folder; and means, in response to detecting the input, for: modifying the content in the folder; and updating the dynamic folder icon to include a visual indicator of the spatial arrangement of the modified content within the folder.
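The spatial-arrangement indicator in the embodiments above can be sketched as follows. This is an illustrative Python sketch in which the dynamic folder icon is modeled as a small grid of reduced-scale thumbnails; the function name, grid dimensions, and sample item names are hypothetical.

```python
def folder_icon_preview(folder_items, columns=3, capacity=9):
    """Map each of the first `capacity` items in the folder to the
    (row, column) cell it occupies in the reduced-scale preview, so
    the folder icon mirrors the spatial arrangement of the folder's
    current content."""
    return {item: (i // columns, i % columns)
            for i, item in enumerate(folder_items[:capacity])}
```

Recomputing this mapping after any modification of the folder's content is what keeps the icon "dynamic": adding, removing, or reordering items changes the cells the remaining thumbnails occupy.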
According to some embodiments, a multifunction device includes a display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: displaying a folder icon over a wallpaper background on the display, the folder icon corresponding to a folder containing content including one or more selectable user interface objects. The one or more programs further include instructions for: detecting a first input corresponding to a request to display the folder content; and in response to detecting the first input: dividing the wallpaper background into a first portion and a second portion; moving the second portion away from the first portion; and displaying the contents of the folder in an area between the first portion and the second portion.
According to some embodiments, a method is performed at a multifunction device with a display. The method includes: displaying a folder icon over a wallpaper background on the display, the folder icon corresponding to a folder containing content including one or more selectable user interface objects. The method further includes: detecting a first input corresponding to a request to display the folder content; and in response to detecting the first input: dividing the wallpaper background into a first portion and a second portion; moving the second portion away from the first portion; and displaying the contents of the folder in an area between the first portion and the second portion.
According to some embodiments, a graphical user interface on a multifunction device with a display, a memory, and one or more processors to execute one or more programs stored in the memory includes a folder icon displayed over a wallpaper background on the display, the folder icon corresponding to a folder containing content including one or more selectable user interface objects. A first input corresponding to a request to display the folder content is detected; and in response to detecting the first input: the wallpaper background is divided into a first portion and a second portion; the second portion is moved away from the first portion; and the contents of the folder are displayed in an area between the first portion and the second portion.
According to some embodiments, a computer readable storage medium has stored therein instructions that, when executed by a multifunction device with a display, cause the device to: displaying a folder icon over a wallpaper background on the display, the folder icon corresponding to a folder containing content including one or more selectable user interface objects. The instructions further cause the device to: detecting a first input corresponding to a request to display the folder content; and in response to detecting the first input: dividing the wallpaper background into a first portion and a second portion; moving the second portion away from the first portion; and displaying the contents of the folder in an area between the first portion and the second portion.
According to some embodiments, a multifunction device includes: a display; means for displaying a folder icon over a wallpaper background on the display, the folder icon corresponding to a folder containing content, the content comprising one or more selectable user interface objects. The device further comprises: means for detecting a first input corresponding to a request to display the folder content; and means, in response to detecting the first input, for: dividing the wallpaper background into a first portion and a second portion; moving the second portion away from the first portion; and displaying the contents of the folder in an area between the first portion and the second portion.
According to some embodiments, an information processing apparatus for use in a multifunction device with a display includes: means for displaying a folder icon over a wallpaper background on the display, the folder icon corresponding to a folder containing content, the content comprising one or more selectable user interface objects. The information processing apparatus further includes: means for detecting a first input corresponding to a request to display the folder content; and means, in response to detecting the first input, for: dividing the wallpaper background into a first portion and a second portion; moving the second portion away from the first portion; and displaying the contents of the folder in an area between the first portion and the second portion.
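The wallpaper-splitting geometry described above can be sketched as follows. This is an illustrative Python sketch using vertical (top, bottom) coordinate spans; the assumption that the split is horizontal and that the second portion slides downward is one plausible reading of the embodiments, and all names are hypothetical.

```python
def split_wallpaper(screen_height, split_y, folder_view_height):
    """Return the vertical (top, bottom) spans of the two wallpaper
    portions and of the folder-content area after the background
    divides at split_y and the second portion moves away by
    folder_view_height."""
    first_portion = (0, split_y)
    folder_area = (split_y, split_y + folder_view_height)
    second_portion = (split_y + folder_view_height,
                      screen_height + folder_view_height)
    return first_portion, second_portion, folder_area
```

The folder content is rendered in the gap opened between the two portions, so closing the folder is simply the reverse translation of the second portion.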
According to some embodiments, an electronic device includes a display, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: a plurality of selectable user interface objects including one or more folder icons are simultaneously displayed on a display. The one or more programs further include instructions for: a first input corresponding to a request to select a respective folder icon for a respective folder is detected, the respective folder including a first number of selectable icons divided among a plurality of distinct separately displayed pages including a first page and a second page. The one or more programs further include instructions for: in response to detecting the first input, a folder view for the respective folder is displayed. The folder view includes space to simultaneously display no more than a second number of selectable icons, the second number of selectable icons being less than the first number of selectable icons. The folder view displays a first page that includes a first subset of selectable icons in the folder. The one or more programs further include instructions for: while the first page of the folder view is displayed, a second input corresponding to a request to display a second page of the folder view is detected, and in response to detecting the second input, the display of the first page of the folder view is stopped and the second page of the folder view for the respective folder is displayed. The second page of the folder view includes a second subset of selectable icons that is different from the first subset of selectable icons.
According to some embodiments, a method is performed at a multifunction device with a display. The method includes: a plurality of selectable user interface objects including one or more folder icons are simultaneously displayed on a display. The method further includes: a first input corresponding to a request to select a respective folder icon for a respective folder is detected, the respective folder including a first number of selectable icons divided among a plurality of distinct separately displayed pages including a first page and a second page. The method further includes: in response to detecting the first input, a folder view for the respective folder is displayed. The folder view includes space to simultaneously display no more than a second number of selectable icons, the second number of selectable icons being less than the first number of selectable icons. The folder view displays a first page that includes a first subset of selectable icons in the folder. The method further includes: while the first page of the folder view is displayed, a second input corresponding to a request to display a second page of the folder view is detected, and in response to detecting the second input, the display of the first page of the folder view is stopped and the second page of the folder view for the respective folder is displayed. The second page of the folder view includes a second subset of selectable icons that is different from the first subset of selectable icons.
According to some embodiments, a graphical user interface on an electronic device with a display, a memory, and one or more processors to execute one or more programs stored in the memory includes a plurality of selectable user interface objects including one or more folder icons concurrently displayed on the display. A first input corresponding to a request to select a respective folder icon for a respective folder is detected, the respective folder including a first number of selectable icons divided among a plurality of distinct separately displayed pages including a first page and a second page. In response to detecting the first input, a folder view for the respective folder is displayed. The folder view includes space to simultaneously display no more than a second number of selectable icons, less than the first number of selectable icons. The folder view displays a first page that includes a first subset of selectable icons in a folder. While the first page of the folder view is displayed, a second input corresponding to a request to display a second page of the folder view is detected. In response to detecting the second input, the graphical user interface is configured to stop displaying the first page of the folder view and display the second page of the folder view for the respective folder. The second page of the folder view includes a second subset of selectable icons that is different from the first subset of selectable icons.
According to some embodiments, a computer-readable storage medium has stored therein instructions that, when executed by an electronic device with a display, cause the device to: a plurality of selectable user interface objects including one or more folder icons are simultaneously displayed on a display. The instructions further cause the device to: a first input corresponding to a request to select a respective folder icon for a respective folder is detected, the respective folder including a first number of selectable icons divided among a plurality of distinct separately displayed pages including a first page and a second page. The instructions further cause the device to: in response to detecting the first input, a folder view for the respective folder is displayed. The folder view includes space to simultaneously display no more than a second number of selectable icons, the second number of selectable icons being less than the first number of selectable icons. The folder view displays a first page that includes a first subset of selectable icons in the folder. The instructions further cause the device to: while the first page of the folder view is displayed, a second input corresponding to a request to display a second page of the folder view is detected, and in response to detecting the second input, the display of the first page of the folder view is stopped and the second page of the folder view for the respective folder is displayed. The second page of the folder view includes a second subset of selectable icons that is different from the first subset of selectable icons.
According to some embodiments, an electronic device includes a display; means for simultaneously displaying a plurality of selectable user interface objects including one or more folder icons on a display; means for detecting a first input corresponding to a request to select a respective folder icon for a respective folder, the respective folder including a first number of selectable icons divided among a plurality of distinct separately displayed pages including a first page and a second page; and means for displaying a folder view for the respective folder in response to detecting the first input, wherein: the folder view includes a space to simultaneously display no more than a second number of selectable icons, the second number of selectable icons being less than the first number of selectable icons; and the folder view displays a first page including a first subset of selectable icons in the folder; means for detecting a second input corresponding to a request to display a second page of the folder view while the first page of the folder view is displayed; and means for ceasing to display the first page of the folder view and displaying a second page of the folder view for the respective folder in response to detecting the second input, wherein the second page of the folder view includes a second subset of selectable icons different from the first subset of selectable icons.
According to some embodiments, an information processing apparatus for use in an electronic device having a display includes: means for simultaneously displaying a plurality of selectable user interface objects including one or more folder icons on a display; means for detecting a first input corresponding to a request to select a respective folder icon for a respective folder, the respective folder including a first number of selectable icons divided among a plurality of distinct separately displayed pages including a first page and a second page; and means for displaying a folder view for the respective folder in response to detecting the first input, wherein: the folder view includes space to simultaneously display no more than a second number of selectable icons, the second number of selectable icons being less than the first number of selectable icons; and the folder view displays a first page including a first subset of selectable icons in the folder; means for detecting a second input corresponding to a request to display a second page of the folder view while the first page of the folder view is displayed; and means for ceasing to display the first page of the folder view and displaying a second page of the folder view for the respective folder in response to detecting the second input, wherein the second page of the folder view includes a second subset of selectable icons different from the first subset of selectable icons.
According to some embodiments, an electronic device includes: a display unit configured to simultaneously display a plurality of selectable user interface objects including one or more folder icons; an input unit configured to receive a first input and a second input, wherein: the first input corresponds to a request to select a respective folder icon for a respective folder, the respective folder including a first number of selectable icons divided among a plurality of distinct, separate display pages in a folder view including a first page and a second page; and the second input corresponds to a request to display a second page; a processing unit coupled to the display unit and the input unit, the processing unit configured to: detecting a first input; in response to detecting the first input, a folder view is displayed for the respective folder, wherein: the folder view includes a space to simultaneously display no more than a second number of selectable icons, the second number of selectable icons being less than the first number of selectable icons; and the folder view displays a first page including a first subset of selectable icons in the folder; detecting a second input while displaying a first page of the folder view; and in response to detecting the second input, ceasing to display the first page of the folder view and displaying a second page of the folder view for the respective folder, wherein the second page of the folder view includes a second subset of selectable icons different from the first subset of selectable icons.
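The paged folder view described in the embodiments above can be summarized as a small data model. The sketch below is illustrative only; the class name, page capacity, and icon names are assumptions chosen for the example, not values taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class FolderView:
    """A folder holds a first number of selectable icons, divided among
    distinct, separately displayed pages that each show no more than
    `page_capacity` icons at once (the "second number", less than the
    total)."""
    icons: list          # all selectable icons in the folder
    page_capacity: int   # maximum icons displayed simultaneously
    current_page: int = 0

    def pages(self):
        # Split the icons into distinct, separately displayed pages.
        cap = self.page_capacity
        return [self.icons[i:i + cap] for i in range(0, len(self.icons), cap)]

    def visible_icons(self):
        # The subset of icons shown on the currently displayed page.
        return self.pages()[self.current_page]

    def show_next_page(self):
        # Respond to an input requesting the next page: stop displaying
        # the current page and display the next one.
        if self.current_page < len(self.pages()) - 1:
            self.current_page += 1

# Example: 12 icons, at most 9 shown at once -> two pages (9 + 3),
# with disjoint first and second subsets as the claims require.
folder = FolderView(icons=[f"app{i}" for i in range(12)], page_capacity=9)
first = folder.visible_icons()
folder.show_next_page()
second = folder.visible_icons()
```

Note that the two pages partition the folder's icons, which is why the second subset is necessarily different from the first.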
Thus, a faster and more efficient method and interface for managing folders is provided for a multifunction device with a display, thereby increasing the effectiveness, efficiency, and satisfaction of the user with such a device. These methods and interfaces may supplement or replace traditional methods for managing folders.
Drawings
For a better understanding of the embodiments of the present invention mentioned above, and additional embodiments thereof, reference will be made to the description of the specific embodiments in conjunction with the following drawings in which like reference numerals designate corresponding parts throughout the figures.
Fig. 1A and 1B are block diagrams illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
FIG. 1C is a block diagram illustrating exemplary components for event processing according to some embodiments.
Fig. 2 illustrates a portable multifunction device with a touch screen in accordance with some embodiments.
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
Fig. 4A and 4B illustrate exemplary user interfaces for application menus on a portable multifunction device in accordance with some embodiments.
FIG. 4C illustrates an exemplary user interface for a multi-function device having a touch-sensitive surface separate from a display, in accordance with some embodiments.
Fig. 5A-5LLLL illustrate exemplary user interfaces for managing folders according to some embodiments.
Fig. 6A-6E are flowcharts illustrating methods for creating new folders according to some embodiments.
Fig. 7A-7C are flowcharts illustrating methods for managing folder icons and action icons, according to some embodiments.
Fig. 8A-8C are flowcharts illustrating methods for naming new folders according to some embodiments.
Fig. 9A-9B are flowcharts illustrating methods for adjusting an activation region for a selectable user interface object in response to an icon management input, according to some embodiments.
Fig. 10A-10B are flowcharts illustrating methods for reconfiguring icons on a display in response to an icon management input, according to some embodiments.
Fig. 11A-11C are flowcharts illustrating methods for updating dynamic folder icons to provide visual indications of the contents of folders associated with the dynamic folder icons, in accordance with some embodiments.
Fig. 12A-12E are flowcharts illustrating methods for providing context information in connection with displaying the contents of a folder, according to some embodiments.
Fig. 13A-13E are flowcharts illustrating methods for displaying and navigating a multi-page folder, according to some embodiments.
Fig. 14 is a functional block diagram of an electronic device according to some embodiments.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will be further understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These designations are used merely to distinguish one element from another. For example, a first contact may be referred to as a second contact, and similarly, a second contact may be referred to as a first contact, without departing from the scope of the invention. The first contact and the second contact are both contacts, but they are not the same contact.
The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" may be interpreted to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be interpreted to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.
As used herein, the term "resolution" of a display refers to the number of pixels along each axis of the display or in each dimension of the display (also referred to as "pixel count" or "pixel resolution"). For example, the display may have a resolution of 320x480 pixels. Furthermore, as used herein, the term "resolution" of the multifunction device refers to the resolution of the display in the multifunction device. The term "resolution" does not imply a limitation on the size of each pixel or the pixel spacing. For example, a second display having a resolution of 320x480 pixels has a lower resolution than a first display having a resolution of 1024x768 pixels. However, it should be noted that the physical size of the display depends not only on the pixel resolution, but also on many other factors, including pixel size and pixel spacing. Thus, the first display may have the same, smaller, or larger physical dimensions than the second display.
As used herein, the term "video resolution" of a display refers to the pixel density along each axis of the display or in each dimension of the display. Video resolution is often measured in dots per inch (DPI), which counts the number of pixels that can be placed in a line within a span of one inch along the corresponding dimension of the display.
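The distinction drawn in the two paragraphs above can be illustrated with a short computation: pixel resolution (pixel count) says nothing about physical size until a pixel density is also given. The function and the sample values below are illustrative assumptions, not figures from the disclosure:

```python
def physical_size_inches(pixel_resolution, dpi):
    """Physical dimensions implied by a pixel resolution (pixel count
    per axis) and a pixel density (DPI) along each axis.  Shows that
    pixel resolution alone does not determine physical size."""
    w_px, h_px = pixel_resolution
    return (w_px / dpi, h_px / dpi)

# A 320x480-pixel display at an assumed 160 DPI is 2.0 x 3.0 inches...
low_res = physical_size_inches((320, 480), 160)
# ...while a 1024x768-pixel display at the same density is far larger.
# At a sufficiently high density, the higher-resolution panel could
# nonetheless be physically smaller.
high_res = physical_size_inches((1024, 768), 160)
```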
Embodiments of computing devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the computing device is a portable communication device, such as a mobile phone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, but are not limited to, the iPhone and iPod Touch devices from Apple Inc. of Cupertino, California. Other portable devices, such as a laptop or tablet computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad), may also be used. It should also be appreciated that in some embodiments, the device is not a portable communication device, but rather a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
In the following discussion, a computing device including a display and a touch-sensitive surface is described. However, it should be understood that the computing device may include one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
The device supports various applications, such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk authoring applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photograph management applications, digital camera applications, digital video recorder applications, web browsing applications, digital music player applications, and/or digital video player applications.
Various applications that may be executed on the device may use at least one common physical user interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface, as well as corresponding information displayed on the device, may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture of the device (such as the touch-sensitive surface) may support the variety of applications with user interfaces that are intuitive and transparent to the user.
The user interface may include one or more soft keyboard embodiments. Soft keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard, such as those described in U.S. patent application No. 11/459,606, "Keyboards For Portable Electronic Devices," filed July 24, 2006, and U.S. patent application No. 11/459,615, "Touch Screen Keyboards For Portable Electronic Devices," filed July 24, 2006, the contents of which are incorporated herein by reference in their entirety. A keyboard embodiment may include a reduced number of icons (or soft keys) relative to the number of keys of an existing physical keyboard, such as that of a typewriter. This may make it easier for the user to select one or more icons in the keyboard, and thus one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, the displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user, for example, based on the respective user's word usage history (lexicography, slang, individual usage). Some keyboard embodiments may be adjusted to reduce the probability of user error when selecting one or more icons, and thus one or more symbols, when using the soft keyboard embodiments.
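One way to tailor a keyboard to a respective user's word usage history, as described above, is to rank candidate words by how often that user has typed them. The ranking rule below is an assumption chosen for illustration and is not the method of the cited applications:

```python
from collections import Counter

class AdaptiveKeyboard:
    """Illustrative sketch of an adaptive soft keyboard: suggestions
    for a typed prefix are drawn from the user's own word usage
    history, with the most frequently used words offered first."""

    def __init__(self):
        self.history = Counter()  # per-user word usage counts

    def record(self, word):
        # Update the usage history as the user types words.
        self.history[word.lower()] += 1

    def suggestions(self, prefix, limit=3):
        matches = [w for w in self.history if w.startswith(prefix.lower())]
        # Most frequently used words first; ties broken alphabetically.
        return sorted(matches, key=lambda w: (-self.history[w], w))[:limit]

kb = AdaptiveKeyboard()
for w in ["folder", "folder", "folder", "follow", "font"]:
    kb.record(w)
top = kb.suggestions("fo")
```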
Attention is now directed to embodiments of portable devices with touch-sensitive displays. Fig. 1A and 1B are block diagrams illustrating a portable multifunction device 100 with a touch-sensitive display 112 in accordance with some embodiments. For convenience, the touch-sensitive display 112 is sometimes referred to as a "touch screen," and may also be known as or called a touch-sensitive display system. Device 100 may include memory 102 (which may include one or more computer-readable storage media), a memory controller 122, one or more processing units (CPUs) 120, a peripheral interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103.
It should be understood that device 100 is only one example of a portable multifunction device, and that device 100 may have more or fewer components than those shown, may combine two or more components, or may have a different configuration or arrangement of components. The various components shown in Fig. 1A and 1B may be implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
Memory 102 may include high-speed random access memory, and may also include non-volatile memory, such as one or more disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as CPU 120 and peripheral interface 118, may be controlled by memory controller 122.
The peripheral interface 118 may be used to couple input and output peripheral devices of the device to the CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in the memory 102 for performing various functions for the device 100 and for processing data.
In some embodiments, peripheral interface 118, CPU 120, and memory controller 122 may be implemented on a single chip, such as chip 104. In some other embodiments, the peripheral interface 118, the CPU 120, and the memory controller 122 may be implemented on separate chips.
RF (radio frequency) circuitry 108 receives and transmits RF signals, also referred to as electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communication networks and other communication devices via the electromagnetic signals. RF circuitry 108 may include well-known circuitry for performing these functions, including, but not limited to: an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. RF circuitry 108 may communicate by wireless communication with networks, such as the Internet (also referred to as the World Wide Web (WWW)), an intranet, and/or a wireless network, such as a cellular telephone network, a wireless Local Area Network (LAN), and/or a Metropolitan Area Network (MAN), and with other devices. The wireless communication may use any of a variety of communication standards, protocols, and technologies, including, but not limited to: Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High-Speed Downlink Packet Access (HSDPA), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between the user and device 100. Audio circuitry 110 receives audio data from peripheral interface 118, converts the audio data to electrical signals, and transmits the electrical signals to speaker 111. The speaker 111 converts the electrical signal into sound waves audible to humans. The audio circuit 110 also receives electrical signals converted from sound waves by the microphone 113. The audio circuitry 110 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 118 for processing. Audio data may be retrieved from memory 102 and/or RF circuitry 108 and/or transferred to memory 102 and/or RF circuitry 108 through peripheral interface 118. In some embodiments, audio circuit 110 also includes a headphone jack (e.g., 212 in fig. 2). The headphone jack provides an interface between the audio circuit 110 and removable audio input/output peripherals such as output-only headphones or headphones that are both output (e.g., monaural or binaural headphones) and input (e.g., microphones).
I/O subsystem 106 couples input/output peripheral devices on device 100, such as touch screen 112 and other input control devices 116, to peripheral interface 118. The I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. Other input or control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and the like. In some alternative embodiments, the input controller(s) 160 may be coupled to (or not coupled to) any of the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208 in fig. 2) may include an up/down button for volume control of the speaker 111 and/or microphone 113. The one or more buttons may include a push button (e.g., 206 in fig. 2). A quick press of the push button may unlock the touch screen 112 or begin a process of unlocking the device on the touch screen using gestures, as described in U.S. patent application No. 11/322,549, filed December 23, 2005, which is incorporated herein by reference in its entirety. A longer press of the push button (e.g., 206) may power the device 100 on or off. The user may be able to customize the functionality of one or more of the buttons. Touch screen 112 is used to implement virtual buttons or soft buttons and one or more soft keyboards.
The touch sensitive display 112 provides an input interface and an output interface between the device and the user. Display controller 156 receives electrical signals from touch screen 112 and/or transmits electrical signals to touch screen 112. Touch screen 112 displays visual output to a user. The visual output may include graphics, text, icons, video, and any combination of the foregoing (collectively, "graphics"). In some embodiments, some or all of the visual output may correspond to a user interface object.
Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or interruption of the contact) on touch screen 112 and convert the detected contact into interaction with user interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on touch screen 112. In one exemplary embodiment, a point of contact between touch screen 112 and the user corresponds to a finger of the user.
Touch screen 112 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies may be used in other embodiments. Touch screen 112 and display controller 156 may detect contact and any movement or interruption thereof using any of a number of touch sensing technologies now known or later developed, including, but not limited to: capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In one exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone and iPod Touch from Apple Inc. of Cupertino, California.
The touch sensitive display in some embodiments of touch screen 112 may be similar to the multi-touch sensitive touch pad described in the following U.S. patents: 6,323,846 (Westerman et al), 6,570,557 (Westerman et al), and/or 6,677,932 (Westerman), and/or U.S. patent publication 2002/0015024A1, each of which is incorporated herein by reference in its entirety. However, touch screen 112 displays visual output from portable device 100, while touch sensitive touchpads do not provide visual output.
The touch sensitive display in some embodiments of touch screen 112 may be a touch sensitive display as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, "Multipoint Touch Surface Controller," filed on 5/2/2006; (2) U.S. patent application Ser. No. 10/840,862, "Multipoint Touchscreen", filed 5/6/2004; (3) U.S. patent application Ser. No. 10/903,964, "Gestures For Touch Sensitive Input Devices," filed on 7.30.2004; (4) U.S. patent application Ser. No. 11/048,264, "Gestures For Touch Sensitive Input Devices," filed 1/31/2005; (5) U.S. patent application Ser. No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed 1/18/2005; (6) U.S. patent application Ser. No. 11/228,758, "Virtual Input Device Placement On A Touch Screen User Interface," filed 9.16.2005; (7) U.S. patent application Ser. No. 11/228,700, "Operation Of A Computer With A Touch Screen Interface," filed 9/16/2005; (8) U.S. patent application Ser. No. 11/228,737, "Activating Virtual Keys Of A Touch-Screen Virtual Keyboard," filed 9/16/2005; and (9) U.S. patent application Ser. No. 11/367,749, filed 3/2006, "Multi-Functional Hand-Held Device". All of these applications are incorporated by reference herein in their entirety.
Touch screen 112 may have a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of about 168 dpi. The user may make contact with touch screen 112 using any suitable object or appendage, such as a stylus, finger, or the like. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures that are less accurate than stylus-based inputs due to the larger contact area of the finger on the touch screen. In some embodiments, the device converts the finger-based coarse input into a precise pointer/cursor position or command for performing the action desired by the user.
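The paragraph above describes converting coarse finger-based input into a precise pointer/cursor position. One plausible way to do this (an illustrative assumption, not the device's actual algorithm) is to reduce the finger's contact area to a single point, for example its centroid:

```python
def contact_centroid(touched_points):
    """Reduce a coarse finger contact -- a set of touched (x, y)
    sensor points covering the finger's larger contact area -- to a
    single precise cursor position: the centroid of the contact."""
    n = len(touched_points)
    x = sum(p[0] for p in touched_points) / n
    y = sum(p[1] for p in touched_points) / n
    return (x, y)

# A fingertip covering a small patch of the sensor grid:
patch = [(10, 10), (11, 10), (10, 11), (11, 11)]
cursor = contact_centroid(patch)
```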
In some embodiments, in addition to the touch screen, device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from touch screen 112, or an extension of the touch-sensitive surface formed by the touch screen.
In some embodiments, device 100 may include a physical or virtual wheel (e.g., a click wheel) as control device 116. The user may navigate among and interact with one or more graphical objects (e.g., icons) displayed in touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button. User commands and navigation commands provided by the user via the click wheel may be processed by the input controller 160 as well as one or more of the modules and/or sets of instructions in memory 102. For a virtual click wheel, the click wheel and click wheel controller may be part of touch screen 112 and display controller 156, respectively. For a virtual click wheel, the click wheel may be an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of the portable multifunction device and operated by user contact with the touch screen.
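The click-wheel measurement described above, in which movement is quantified as the angular displacement of the contact point about the wheel's center point, can be sketched as follows; the sign convention and degree units are assumptions for the example:

```python
import math

def angular_displacement(center, p_from, p_to):
    """Angular displacement, in degrees, of a contact point moving
    from p_from to p_to about the click wheel's center point.
    Positive values indicate counter-clockwise movement."""
    a0 = math.atan2(p_from[1] - center[1], p_from[0] - center[0])
    a1 = math.atan2(p_to[1] - center[1], p_to[0] - center[0])
    # Normalize to (-180, 180] so small back-and-forth moves stay small.
    delta = math.degrees(a1 - a0)
    return (delta + 180) % 360 - 180

# Dragging a quarter turn around a wheel centered at the origin:
deg = angular_displacement((0, 0), (1, 0), (0, 1))
```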
The device 100 also includes a power system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)), and any other components associated with the generation, management, and distribution of power in a portable device.
The device 100 may also include one or more optical sensors 164. Fig. 1A and 1B show an optical sensor coupled to an optical sensor controller 158 in the I/O subsystem 106. The optical sensor 164 may include a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) phototransistor. The optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light into data representing an image. In conjunction with an imaging module 143 (also referred to as a camera module), the optical sensor 164 may capture still images or video. In some embodiments, the optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display can be used as a viewfinder for still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that an image of the user can be obtained for videoconferencing while the user views the other videoconference participants on the touch screen display. In some embodiments, the position of the optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 can be used with the touch screen display for both videoconferencing and still and/or video image acquisition.
The device 100 may also include one or more proximity sensors 166. Fig. 1A and 1B show a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 may be coupled to the input controller 160 in the I/O subsystem 106. The proximity sensor 166 may perform as described in the following U.S. patent applications: 11/241,839, "Proximity Detector In Handheld Device"; 11/240,788, "Proximity Detector In Handheld Device"; 11/620,702, "Using Ambient Light Sensor To Augment Proximity Sensor Output"; 11/586,862, "Automated Response To And Sensing Of User Activity In Portable Devices"; and 11/638,251, "Methods And Systems For Automatic Configuration Of Peripherals," which are incorporated by reference herein in their entirety. In some embodiments, the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
The device 100 may also include one or more accelerometers 168. Fig. 1A and 1B show an accelerometer 168 coupled to the peripheral interface 118. Alternatively, the accelerometer 168 may be coupled to the input controller 160 in the I/O subsystem 106. The accelerometer 168 may perform as described in U.S. patent publication No. 20050190059, "Acceleration-based Theft Detection System for Portable Electronic Devices," and U.S. patent publication No. 20060017692, "Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer," both of which are incorporated herein by reference in their entirety. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers. Device 100 optionally includes, in addition to accelerometer(s) 168, a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information concerning the position and orientation (e.g., portrait or landscape) of device 100.
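Choosing between a portrait and a landscape view from accelerometer data, as mentioned above, can be done by comparing the gravity components along the device's axes. This decision rule is an illustrative assumption, not the method of the cited publications:

```python
def display_orientation(ax, ay):
    """Pick portrait vs. landscape from accelerometer readings: the
    gravity vector's components along the device's x (short) and y
    (long) axes, in g.  Whichever axis gravity is mostly aligned
    with indicates how the device is being held."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

# Device held upright: gravity mostly along the y axis.
upright = display_orientation(ax=0.1, ay=-0.98)
# Device turned on its side: gravity mostly along the x axis.
sideways = display_orientation(ax=-0.97, ay=0.05)
```

A real implementation would also filter out transient motion (e.g., shakes) before switching views, which this sketch omits.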
In some embodiments, the software components stored in memory 102 include an operating system 126, a communication module (or instruction set) 128, a contact/motion module (or instruction set) 130, a graphics module (or instruction set) 132, a text input module (or instruction set) 134, a Global Positioning System (GPS) module (or instruction set) 135, and an application (or instruction set) 136. Further, as shown in fig. 1A, 1B, and 3, in some embodiments, memory 102 stores device/global internal state 157. The device/global internal state 157 includes one or more of the following: active application state, indicating the currently active application (if any); display status, indicating applications, views, and other information occupying various areas of the touch screen display 112; sensor status, including information obtained from the various sensors of the device and the input control device 116; and location information related to the location and/or attitude of the device.
Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.), and facilitates communication between the various hardware and software components.
The communication module 128 facilitates communication with other devices through one or more external ports 124 and also includes various software components for processing data received through the RF circuitry 108 and/or external ports 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted to be coupled to other devices directly or indirectly through a network (e.g., the internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, similar to, and/or compatible with the 30-pin connector used on iPod (trademark of Apple Inc.) devices.
The contact/motion module 130 may detect contact with the touch screen 112 (in conjunction with the display controller 156) and other touch sensitive devices (e.g., a touch pad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to the detection of contact, such as determining whether contact has occurred (e.g., detecting a finger down event), determining whether there is movement of the contact and tracking movement across the touch-sensitive surface (e.g., detecting one or more finger drag events), and determining whether the contact has ceased (e.g., detecting a finger up event or a break in contact). The contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact (which is represented by a series of contact data) may include determining a speed (magnitude), a velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to a single contact (e.g., one finger contact) or to multiple simultaneous contacts (e.g., "multi-touch"/multi-finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 detect contact on the touch pad. In some embodiments, the contact/motion module 130 and the controller 160 detect contact on the click wheel.
The contact/motion module 130 may detect gestures input by the user. Different gestures on the touch-sensitive surface have different contact patterns. Thus, gestures may be detected by detecting a particular contact pattern. For example, detecting a finger click (tap) gesture includes: a finger down event is detected followed by a finger up (e.g., lift) event at the same location (or substantially the same location) as the location of the finger down event (e.g., at the icon location). As another example, detecting a finger swipe (swipe) gesture on a touch surface includes: a finger down event is detected, followed by one or more finger drag events, and then followed by a finger up (e.g., lift) event.
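The tap-versus-swipe distinction described above can be sketched as a small classifier over a stream of finger events. This is an illustrative sketch only, assuming a simplified `(kind, x, y)` event tuple; the function and event names are not the contact/motion module's actual interface.

```python
# Hedged sketch of pattern-based gesture detection: a tap is a finger-down
# event followed by a finger-up event at (substantially) the same location;
# a swipe additionally contains one or more finger-drag events.

def classify_gesture(events, tap_radius=10):
    """Classify a sequence of (kind, x, y) contact events.

    kind is one of "down", "drag", "up". Returns "tap", "swipe", or None.
    """
    if not events or events[0][0] != "down" or events[-1][0] != "up":
        return None  # a complete gesture must start down and end up
    _, x0, y0 = events[0]
    _, x1, y1 = events[-1]
    moved = any(kind == "drag" for kind, _, _ in events[1:-1])
    near = abs(x1 - x0) <= tap_radius and abs(y1 - y0) <= tap_radius
    if near and not moved:
        return "tap"     # finger lifted where it touched down
    if moved:
        return "swipe"   # one or more drag events in between
    return None
```

The `tap_radius` tolerance reflects the "substantially the same location" language above; a real implementation would also consider timing thresholds.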
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the brightness of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including but not limited to: text, web pages, icons (such as user interface objects including soft keys), digital images, video, animations, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic may be assigned a corresponding code. Graphics module 132 receives one or more codes from an application or the like that specify graphics to be displayed, along with (if needed) coordinate data and other graphics attribute data, and then generates screen image data for output to display controller 156.
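The code-to-graphic lookup described above can be illustrated with a minimal registry: graphics are stored under assigned codes, and requests pairing a code with coordinate data are resolved into output for the display controller. All class and method names here are assumptions for illustration, not the graphics module's actual API.

```python
# Minimal sketch of the graphics-code registry described in the text.

class GraphicsModule:
    def __init__(self):
        self._graphics = {}  # assigned code -> stored graphic data

    def register(self, code, graphic):
        """Store a graphic under its assigned code."""
        self._graphics[code] = graphic

    def render(self, requests):
        """Resolve (code, coords) requests from an application into a list
        of draw commands to hand to the display controller."""
        commands = []
        for code, coords in requests:
            graphic = self._graphics[code]  # look up graphic by its code
            commands.append({"graphic": graphic, "at": coords})
        return commands
```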
Text input module 134 (which may be a component of graphics module 132) provides a soft keyboard for inputting text into various applications (e.g., contacts 137, email 140, IM 141, browser 147, and any other application requiring text input).
The GPS module 135 determines the location of the device and provides this information for use by various applications (e.g., to the phone 138 for use in location-based dialing, to the camera 143 as picture/video metadata, and to applications that provide location-based services such as weather gadgets, local yellow pages gadgets, and map/navigation gadgets).
The application 136 may include the following modules (or instruction sets), or a subset or superset thereof:
a contacts module 137 (sometimes referred to as an address book or contact list);
a telephone module 138;
video conferencing module 139;
an email client module 140;
An Instant Messaging (IM) module 141;
exercise support module 142;
a camera module 143 for still and/or video images;
an image management module 144;
video player module 145;
music player module 146;
browser module 147;
calendar module 148;
A gadget module 149, which may include one or more of the following: weather gadgets 149-1, stock gadgets 149-2, calculator gadget 149-3, alarm gadget 149-4, dictionary gadget 149-5, and other gadgets obtained by the user, and user-created gadgets 149-6;
a gadget creator module 150 for making user-created gadgets 149-6;
search module 151;
a video and music player module 152 that incorporates a video player module 145 and a music player module 146;
a memo module 153;
map module 154; and/or
An online video module 155.
Examples of other applications 136 that may be stored in the memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
The contacts module 137 may be used in conjunction with the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134 to manage an address book or contact list (e.g., in the application internal state 192 of the contacts module 137 stored in the memory 102 or the memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), email address(es), physical address(es), or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or email addresses to initiate and/or facilitate communications by telephone 138, video conference 139, email 140, or IM 141; and so forth.
Telephone module 138 may be used in conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134 to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in address book 137, modify telephone numbers that have been entered, dial a corresponding telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As described above, wireless communication may use any of a variety of communication standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephony module 138, videoconferencing module 139 includes executable instructions for initiating, conducting, and terminating a videoconference between a user and one or more other participants according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, email client module 140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 144, the email client module 140 makes it very easy to create and send emails with still or video images captured with the camera module 143.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, instant messaging module 141 includes executable instructions for entering a sequence of characters corresponding to an instant message, modifying previously entered characters, transmitting the corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephone-based instant messages, or using XMPP, SIMPLE, or IMPS for internet-based instant messages), receiving instant messages, and viewing received instant messages. In some embodiments, transmitted and/or received instant messages may include graphics, photographs, audio files, video files, and/or other attachments, as supported by MMS and/or Enhanced Messaging Service (EMS). As used herein, "instant messaging" refers to both telephone-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module 146, exercise support module 142 includes executable instructions for: creating an workout (e.g., with time, distance, and/or calorie burn targets); communicate with an exercise sensor (exercise device); receiving exercise sensor data; calibrating a sensor for detecting exercise; selecting and playing music for exercise; and displaying, storing and transmitting the exercise data.
In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions for: capturing still images or video (including video streams) and storing them in the memory 102, modifying characteristics of the still images or video, or deleting the still images or video from the memory 102.
In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, marking, deleting, presenting (e.g., in a digital slide presentation or album), and storing still and/or video images.
In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, and speaker 111, video player module 145 includes executable instructions for displaying, rendering, or otherwise playing back video (e.g., on touch screen 112 or on a display externally connected via external port 124).
In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, music player module 146 includes executable instructions that allow a user to download and play back recorded music and other sound files stored in one or more file formats (such as MP3 or AAC files). In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with RF circuitry 108, touch screen 112, display system controller 156, touch module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions for browsing the internet (including searching, linking, receiving, and displaying web pages or portions of web pages, as well as attachments and other files linked to web pages) according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, email client module 140, and browser module 147, calendar module 148 includes executable instructions for creating, displaying, modifying, and storing calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) according to user instructions.
In conjunction with the RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the gadget modules 149 are mini-applications that may be downloaded and used by a user (e.g., weather gadget 149-1, stock gadget 149-2, calculator gadget 149-3, alarm gadget 149-4, and dictionary gadget 149-5) or created by a user (e.g., user-created gadget 149-6). In some embodiments, a gadget includes an HTML (hypertext markup language) file, a CSS (cascading style sheets) file, and a JavaScript file. In some embodiments, a gadget includes an XML (extensible markup language) file and a JavaScript file (e.g., Yahoo! widgets).
In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, gadget creator module 150 may be used by a user to create a gadget (e.g., to transform a user-specified portion of a web page into a gadget).
In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions for searching for text, music, sound, images, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) according to user indications.
In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, memo module 153 includes executable instructions to create and manage memos, to-do lists, and the like, according to user instructions.
In conjunction with the RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data regarding shops and other points of interest at or near a particular location; and other location-based data) as directed by a user.
In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, email client module 140, and browser module 147, online video module 155 includes instructions that allow a user to access, browse, receive (e.g., by streaming and/or download), and play back particular online videos (e.g., on the touch screen or on a display externally connected via external port 124), send emails with links to particular online videos, and manage online videos in one or more file formats, such as H.264. In some embodiments, the instant messaging module 141, rather than the email client module 140, is used to send links to particular online videos. Additional description of online video applications can be found in the following U.S. patent applications: U.S. provisional patent application Ser. No. 60/936,562, filed June 20, 2007, "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," and U.S. patent application Ser. No. 11/968,067, filed December 31, 2007, "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," the contents of which are incorporated herein by reference in their entirety.
Each of the above-described modules and applications corresponds to a set of instructions for performing one or more of the functions described above as well as the methods described in the present disclosure (e.g., the computer-implemented methods described herein, as well as other information processing methods). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. For example, the video player module 145 may be combined with the music player module 146 into a single module (e.g., the video and music player module 152 of fig. 1B). In some embodiments, memory 102 may store a subset of the modules and data structures described above. In addition, the memory 102 may store other modules and data structures not described above.
In some embodiments, device 100 is a device that performs operations of a predefined set of functions on the device exclusively through a touch screen and/or touchpad. By using a touch screen and/or a touch pad as the primary input control device for operation of the device 100, the number of physical input control devices (such as push buttons, dials, etc.) on the device 100 may be reduced.
The predefined set of functions that may be performed exclusively by the touch screen and/or the touch pad include navigation between user interfaces. In some embodiments, the touchpad, when touched by a user, navigates the device 100 from any user interface that may be displayed on the device 100 to a home page, home screen, or root menu. In such embodiments, the touch pad may be referred to as a "menu button". In some other embodiments, the menu buttons may be physical push buttons or other physical input control devices, rather than touch pads.
FIG. 1C is a block diagram illustrating exemplary components for event processing according to some embodiments. In some embodiments, memory 102 (in FIGS. 1A and 1B) or 370 (FIG. 3) includes event sorter 170 (e.g., in operating system 126) and corresponding applications 136-1 (e.g., any of the aforementioned applications 137-151, 155, 380-390).
The event classifier 170 receives event information and determines the application 136-1 and the application view 191 in the application 136-1 to which the event information is delivered. The event classifier 170 includes an event monitor 171 and an event dispatcher module 174. In some embodiments, the application 136-1 includes an application internal state 192 that indicates the current application view(s) displayed on the touch-sensitive display 112 when the application is active or executing. In some embodiments, the device/global internal state 157 is used by the event classifier 170 to determine which application or applications are currently active, and the application internal state 192 is used by the event classifier 170 to determine the application view 191 to which to deliver event information.
In some embodiments, the application internal state 192 includes additional information, such as one or more of the following: restoration information to be used when the application 136-1 resumes execution, user interface state information indicating information that the application 136-1 is displaying or is ready to display, a state queue that enables the user to return to a previous state or view of the application 136-1, and a redo/undo queue of previous actions performed by the user.
Event monitor 171 receives event information from peripheral interface 118. The event information includes information about sub-events (e.g., user touches on the touch sensitive display 112 as part of a multi-touch gesture). The peripheral interface 118 transmits information it receives from the I/O subsystem 106 or sensors, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (via audio circuitry 110). The information received by the peripheral interface 118 from the I/O subsystem 106 includes information from the touch-sensitive display 112 or touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripheral interface 118 at predetermined intervals. In response, the peripheral interface 118 sends event information. In other embodiments, the peripheral interface 118 transmits event information only when an important event occurs (e.g., an input exceeding a predetermined noise threshold and/or longer than a predetermined duration is received).
In some embodiments, event classifier 170 also includes hit view determination module 172 and/or active event recognizer determination module 173.
Hit view determination module 172 provides a software process for determining where sub-events occur in one or more views when touch sensitive display 112 displays more than one view. The view is made up of controls and other elements that are visible to the user on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes referred to herein as application views or user interface windows, in which information is displayed and in which touch-based gestures occur. The application views (of a respective application) in which a touch is detected may correspond to program levels within the application's programmatic or view hierarchy. For example, the lowest-level view in which a touch is detected may be called the hit view, and the set of events that are recognized as proper inputs may be determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information regarding sub-events of touch-based gestures. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies the lowest level view in the hierarchy that should handle the sub-event as the hit view. In most cases, a hit view is the view of the lowest level in which the initiating sub-event (i.e., the first sub-event in the sequence of sub-events that forms an event or potential event) occurs. Once a hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source that caused it to be identified as a hit view.
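The hit-view rule above amounts to a depth-first walk of the view tree that returns the deepest view whose bounds contain the initial touch. The following sketch illustrates this under an assumed, simplified view structure; the class and field names are illustrative, not the module's actual interface.

```python
# Illustrative sketch of hit-view determination: the lowest-level view in
# the hierarchy that contains the touch point is the hit view.

class View:
    def __init__(self, name, x, y, w, h, subviews=()):
        self.name, self.x, self.y, self.w, self.h = name, x, y, w, h
        self.subviews = list(subviews)

    def contains(self, px, py):
        return (self.x <= px < self.x + self.w
                and self.y <= py < self.y + self.h)

def hit_view(view, px, py):
    """Return the lowest-level view containing (px, py), or None."""
    if not view.contains(px, py):
        return None
    for sub in view.subviews:  # prefer a deeper subview over its parent
        found = hit_view(sub, px, py)
        if found is not None:
            return found
    return view  # no subview contains the point, so this view is the hit view
```

Once identified, the hit view would then receive all sub-events related to the same touch, per the paragraph above.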
The active event recognizer determination module 173 determines which view or views within the view hierarchy should receive a particular sequence of sub-events. In some embodiments, the active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, the active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive the particular sequence of sub-events. In other embodiments, even if touch sub-events are entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain actively involved views.
Event dispatcher module 174 dispatches event information to an event recognizer (e.g., event recognizer 180). In embodiments that include an active event recognizer determination module 173, the event dispatcher module 174 delivers event information to the event recognizers determined by the active event recognizer determination module 173. In some embodiments, the event dispatcher module 174 stores event information in an event queue, which is retrieved by the corresponding event receiver module 182.
In some embodiments, operating system 126 includes event classifier 170. Alternatively, application 136-1 includes event classifier 170. In other embodiments, the event sorter 170 is a separate module or is part of another module stored in the memory 102 (such as the contact/motion module 130).
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for processing touch events that occur within a respective view of the user interface of the application. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, the corresponding application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module, such as a user interface suite (not shown), or a higher-level object from which the application 136-1 inherits methods and other properties. In some embodiments, the respective event handlers 190 include one or more of the following: the data updater 176, the object updater 177, the GUI updater 178, and/or event data 179 received from the event sorter 170. Event handler 190 may utilize or call data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of application views 191 include one or more corresponding event handlers 190. Also, in some embodiments, one or more of the data updater 176, the object updater 177, and the GUI updater 178 are included in a respective application view 191.
The corresponding event identifier 180 receives event information (e.g., event data 179) from the event classifier 170 and identifies events based on the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event identifier 180 further includes at least a subset of: metadata 183 and event delivery instructions 188 (which may include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about sub-events (e.g., touches or touch movements). Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When a sub-event relates to movement of a touch, the event information may also include the speed and direction of the sub-event. In some embodiments, the event includes a rotation of the device from one orientation to another (e.g., a rotation from portrait to landscape, or vice versa), and the event information includes corresponding information about a current orientation of the device (also referred to as a device pose).
The event comparator 184 compares the event information with predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. The event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), such as event 1 (187-1), event 2 (187-2), and so forth. In some embodiments, sub-events in an event 187 include, for example, touch start, touch end, touch move, touch cancel, and multi-touch. In one example, the definition of event 1 (187-1) is a double click on a displayed object. The double click includes, for example, a first touch on the displayed object for a predetermined phase (touch start), a first liftoff for a predetermined phase (touch end), a second touch on the displayed object for a predetermined phase (touch start), and a second liftoff for a predetermined phase (touch end). In another example, the definition of event 2 (187-2) is a drag on a displayed object. The drag includes, for example, a touch (or contact) on the displayed object for a predetermined phase, movement of the touch across the touch-sensitive display 112, and liftoff of the touch (touch end). In some embodiments, an event also includes information for one or more associated event handlers 190.
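The sequence-based definitions above can be sketched as a table mapping event names to expected sub-event sequences, with a comparator that checks a received sequence against each definition. This is a deliberate simplification, assuming exact sequence matching and omitting the timing phases; the names are illustrative, not the event comparator's actual API.

```python
# Simplified sketch of event definitions as predefined sub-event sequences,
# e.g. a double tap is touch-begin/touch-end twice in a row. Timing
# thresholds ("predetermined phase") are omitted for clarity.

EVENT_DEFINITIONS = {
    "double_tap": ["touch_begin", "touch_end", "touch_begin", "touch_end"],
    "drag": ["touch_begin", "touch_move", "touch_end"],
}

def match_event(sub_events):
    """Return the name of the first definition the sequence matches, or None."""
    for name, pattern in EVENT_DEFINITIONS.items():
        if sub_events == pattern:
            return name
    return None
```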
In some embodiments, the event definitions 187 include definitions of events for respective user interface objects. In some embodiments, event comparator 184 performs hit testing for determining a user interface object associated with a sub-event. For example, in an application view on touch-sensitive display 112 in which three user interface objects are displayed, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user interface objects (if any) are associated with the touch (sub-event). If each display object is associated with a respective event handler 190, the event comparator uses the results of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and object that triggered the hit test.
In some embodiments, the definition of each event 187 also includes a delay action that delays the delivery of event information until it has been determined whether the sequence of sub-events corresponds to the event type of the event recognizer.
When the respective event recognizer 180 determines that the sequence of sub-events does not match any of the events in the event definition 186, the respective event recognizer 180 enters an event impossible, event failed, or event end state, after which the respective event recognizer 180 ignores subsequent sub-events of the touch based gesture. In this case, the other event recognizers (if any) that remain active for hit views continue to track and process sub-events of the in-progress touch-based gesture.
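The failure behavior above can be pictured as a small per-recognizer state machine: sub-events are consumed one at a time against the recognizer's definition, and the first mismatch moves the recognizer into a terminal failed state in which subsequent sub-events are ignored. This is an illustrative sketch under assumed names, not the recognizer's actual implementation.

```python
# Hedged sketch of recognizer state transitions: possible -> recognized on a
# complete match, possible -> failed on the first mismatch; once failed, the
# recognizer ignores subsequent sub-events of the gesture.

class EventRecognizer:
    def __init__(self, name, pattern):
        self.name, self.pattern = name, pattern
        self.index = 0            # position within the expected sequence
        self.state = "possible"   # "possible" | "recognized" | "failed"

    def feed(self, sub_event):
        if self.state == "failed":
            return self.state  # subsequent sub-events are ignored
        if self.pattern[self.index] != sub_event:
            self.state = "failed"
        else:
            self.index += 1
            if self.index == len(self.pattern):
                self.state = "recognized"
        return self.state
```

Other recognizers that remain in the "possible" state for the hit view would continue tracking the same sub-event stream, as the paragraph above describes.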
In some embodiments, the respective event recognizer 180 includes metadata 183 with configurable attributes, flags (flags), and/or lists that indicate how the event delivery system should perform sub-event delivery to the effectively involved event recognizer. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate how event recognizers may interact with each other. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate whether sub-events are delivered to different levels in a view or program hierarchy.
In some embodiments, the respective event recognizer 180 activates an event handler 190 associated with the event when one or more sub-events of the event are recognized. In some embodiments, the respective event identifier 180 delivers event information associated with the event to the event handler 190. Activating event handler 190 is distinct from sending (or deferring the sending of) sub-events to the corresponding hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event and event handler 190 associated with the flag captures the flag and performs a predefined process.
In some embodiments, the event delivery instructions 188 include sub-event delivery instructions that deliver event information about sub-events without activating the event handler. Instead, the sub-event delivery instruction delivers event information to an event handler associated with a series of sub-events or views that are effectively involved. An event handler associated with a series of sub-events or with a view effectively involved receives the event information and performs a predetermined procedure.
In some embodiments, the data updater 176 creates and updates data used in the application 136-1. For example, the data updater 176 updates the phone number used in the contacts module 137, or stores video files used in the video player module 145. In some embodiments, the object updater 177 creates and updates objects used in the application 136-1. For example, the object updater 177 creates a new user interface object or updates the position of a user interface object. The GUI updater 178 updates the GUI. For example, the GUI updater 178 prepares display information and sends it to the graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, the data updater 176, the object updater 177, and the GUI updater 178 are included in a single module of the respective application 136-1 or application view 191. In other embodiments, the data updater 176, the object updater 177, and the GUI updater 178 are included in two or more software modules.
It should be appreciated that the foregoing discussion of event handling of user touches on touch-sensitive displays also applies to other forms of user inputs for operating the multifunction device 100 with input devices, not all of which are initiated on touch screens, e.g., mouse movements and mouse button presses, with or without single or multiple keyboard presses or holds; taps, drags, scrolls, etc., on touch pads; stylus inputs; movements of the device; verbal instructions; detected eye movements; biometric inputs; and/or any combination thereof, all of which may be used as inputs corresponding to sub-events that define an event to be recognized.
Fig. 2 illustrates a portable multifunction device 100 with a touch screen 112 in accordance with some embodiments. The touch screen may display one or more graphics within a user interface (UI) 200. In this embodiment, as well as in other embodiments described below, a user may select one or more of the graphics by, for example, contacting the graphics with one or more fingers 202 (not drawn to scale in the figures) or one or more styluses (not drawn to scale in the figures). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the contact may include a gesture, such as one or more taps, one or more swipes (left to right, right to left, upward and/or downward), and/or a rolling (right to left, left to right, upward and/or downward) of a finger that has made contact with the device 100. In some embodiments, inadvertent contact with a graphic may not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that sweeps over an application icon may not select the corresponding application.
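The selection rule above (a tap selects, a swipe passing over an icon does not) can be illustrated with a short sketch. The threshold value and function names here are invented for the example, not taken from the device:

```python
# Hedged illustration of the rule that only a tap selects an icon, while an
# inadvertent swipe across the icon does not. A contact is classified by how
# far the finger travels between touch-down and touch-up.

TAP_MAX_DISTANCE = 10  # maximum finger travel (in points) for a tap; illustrative

def classify_gesture(down, up):
    """Classify a contact by its travel between touch-down and touch-up points."""
    dx, dy = up[0] - down[0], up[1] - down[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return "tap" if distance <= TAP_MAX_DISTANCE else "swipe"

def icon_selected(down, up):
    # Only a tap selects; a swipe that merely sweeps over the icon does not.
    return classify_gesture(down, up) == "tap"
```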
The device 100 may also include one or more physical buttons, such as a "home screen" or menu button 204. As previously described, menu button 204 may be used to navigate to any application 136 in the set of applications that may be executed on device 100. Alternatively, in some embodiments, the menu buttons are implemented as soft keys in a GUI displayed on touch screen 112.
In one embodiment, device 100 includes touch screen 112, menu button 204, push button 206 for turning device power on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headphone jack 212, and docking/charging external port 124. Push button 206 may be used to turn the device power on/off by depressing the button and holding it depressed for a predefined time interval; to lock the device by depressing the button and releasing it before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlocking process. In an alternative embodiment, the device 100 may also accept verbal input through the microphone 113 for activating or deactivating some functions.
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. The device 300 need not be portable. In some embodiments, device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child learning toy), a gaming system, or a control device (e.g., a home or industrial controller). The device 300 generally includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication bus 320 may include circuitry (sometimes referred to as a chipset) that interconnects and controls communications between system components. The device 300 includes an input/output interface 330 that includes a display 340, which is typically a touch screen display. The input/output interface 330 may also include a keyboard and/or mouse (or other pointing device) 350 and a touchpad 355. Memory 370 includes high-speed random access memory such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and may include non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 may optionally include one or more storage devices located remotely from CPU(s) 310. In some embodiments, memory 370 stores programs, modules, and data structures similar to those stored in memory 102 of portable multifunction device 100 (fig. 1) or a subset thereof. Further, the memory 370 may store additional programs, modules, and data structures that are not present in the memory 102 of the portable multifunction device 100. 
For example, memory 370 of device 300 may store drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (fig. 1) may not store these modules.
Each of the above elements in fig. 3 may be stored in one or more of the aforementioned memory devices. Each of the above-described modules corresponds to a set of instructions for performing the functions described above. The above-described modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 may store a subset of the modules and data structures described above. Further, the memory 370 may store additional modules and data structures not described above.
Attention is now directed to an embodiment of a user interface ("UI") that may be implemented on the portable multifunction device 100.
Fig. 4A and 4B illustrate exemplary user interfaces for application menus on the portable multifunction device 100 in accordance with some embodiments. A similar user interface may be implemented on device 300. In some embodiments, user interface 400A includes the following elements, or a subset or superset thereof:
signal strength indicator(s) 402 for wireless communication(s), such as cellular signals and Wi-Fi signals;
time 404;
Bluetooth indicator 405;
battery status indicator 406;
a tray 408 with icons of frequently used applications, such as:
o Phone 138, which may include an indicator 414 of the number of missed calls or voicemail messages;
o Email client 140, which may include an indicator 410 of the number of unread emails;
o Browser 147; and
o Music player 146; and
icons of other applications, such as:
o IM 141;
o Image management 144;
o Camera 143;
o Video player 145;
o Weather 149-1;
o Stocks 149-2;
o Exercise support 142;
o Calendar 148;
o Calculator 149-3;
o Alarm clock 149-4;
o Dictionary 149-5; and
o User-created widget 149-6.
In some embodiments, user interface 400B includes the following elements, or a subset or superset thereof:
402, 404, 405, 406, 141, 148, 144, 143, 149-3, 149-2, 149-1, 149-4, 410, 414, 138, 140, and 147, as described above;
map 154;
memo 153;
Settings 412 that provide access to settings of the device 100 and its various applications 136, as described further below;
video and music player module 152, also referred to as iPod (trademark of apple inc) module 152; and
The online video module 155, also referred to as the YouTube (trademark of Google corporation) module 155.
Fig. 4C illustrates an exemplary user interface on a device (e.g., device 300, fig. 3) having a touch-sensitive surface 451 (e.g., tablet or touchpad 355, fig. 3) separate from a display 450 (e.g., touch screen display 112). While many of the examples below will be given with reference to inputs on a touch screen display 112 (where the touch sensitive surface is combined with the display), in some embodiments the device detects inputs on a touch sensitive surface separate from the display, as shown in fig. 4C. In some embodiments, the touch-sensitive surface (e.g., 451 in fig. 4C) has a primary axis (e.g., 452 in fig. 4C) that corresponds to the primary axis (e.g., 453 in fig. 4C) on the display (e.g., 450). According to these embodiments, the device detects contact (e.g., 460 and 462 in fig. 4C) with the touch-sensitive surface 451 at locations corresponding to respective locations on the display (e.g., 460 corresponds to 468 and 462 corresponds to 470 in fig. 4C). In this way, when the touch-sensitive surface is separated from the display, user inputs (e.g., contacts 460 and 462 and movements thereof) detected by the device on the touch-sensitive surface (e.g., 451 in fig. 4C) are used by the device to manipulate a user interface on the display (e.g., 450 in fig. 4C) of the multifunction device. It should be appreciated that similar approaches may be used for other user interfaces described herein.
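The correspondence described above, where a contact on a separate touch-sensitive surface manipulates the matching location on the display, amounts to scaling coordinates along each primary axis. The following sketch is an assumption-laden illustration (the dimensions and function name are invented here):

```python
# Sketch of mapping a contact on a separate touch-sensitive surface (e.g.,
# surface 451) to the corresponding location on the display (e.g., 450) by
# scaling along each primary axis.

def surface_to_display(point, surface_size, display_size):
    """Map a touch-surface coordinate to the corresponding display coordinate."""
    sx, sy = surface_size
    dx, dy = display_size
    x, y = point
    return (x * dx / sx, y * dy / sy)

# A contact at the center of the touch surface manipulates the center of the display.
assert surface_to_display((50, 40), (100, 80), (200, 160)) == (100.0, 80.0)
```

With this mapping, moving a contact on the separate surface moves the manipulated point proportionally on the display, as in the contacts 460/462 corresponding to locations 468/470.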
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that in some embodiments one or more of the finger inputs may be replaced by input from another input device (e.g., mouse-based input or stylus input). For example, a swipe gesture may be replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture may be replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detecting a contact and then ceasing to detect the contact). Similarly, when multiple user inputs are detected simultaneously, it should be understood that multiple computer mice may be used simultaneously, or that a mouse and finger contacts may be used simultaneously.
Attention is now directed to embodiments of a user interface ("UI") and associated processes that may be implemented on a multifunction device having a display and a touch-sensitive surface, such as device 300 or portable multifunction device 100.
Fig. 5A-5 LLLL illustrate an exemplary user interface for creating and managing folders containing one or more of the selectable user interface objects in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes of fig. 6A to 6E, 7A to 7C, 8A to 8C, 9A to 9B, 10A to 10B, 11A to 11C, 12A to 12E, and 13A to 13E.
Turning now to fig. 5A, fig. 5A illustrates a multifunction device (e.g., 100 in fig. 5A-5N and 5P-5LLLL) having a touch screen display (e.g., 112 in fig. 5A-5N and 5P-5LLLL). In some embodiments, the multifunction device 100 further includes a speaker (e.g., 111 in fig. 5A-5N, 5P-5LLLL), a microphone (e.g., 113 in fig. 5A-5N, 5P-5LLLL), one or more optical sensors (e.g., 164 in fig. 5A-5N, 5P-5LLLL), a proximity sensor (e.g., 166 in fig. 5A-5N, 5P-5LLLL), and one or more accelerometers (e.g., 168 in fig. 5A-5N, 5P-5LLLL), as described in more detail above.
In some embodiments, the multifunction device 100 displays a plurality of notification icons, such as signal strength indicator(s) (e.g., 402 in fig. 5A-5N, 5P-5LLLL) for wireless communication(s) such as cellular signals and Wi-Fi signals; a time indicator (e.g., 404 in fig. 5A-5N, 5P-5LLLL); a Bluetooth indicator (e.g., 405 in fig. 5A-5N, 5P-5LLLL); and a battery status indicator (e.g., 406 in fig. 5A-5N, 5P-5LLLL). According to some embodiments, the multifunction device also displays a plurality of selectable user interface objects (e.g., application icons 5002 and folder icons 5004 in fig. 5A-5LLLL). In some embodiments, one or more of the selectable user interface objects are displayed in a tray (e.g., 5006 in fig. 5A-5N, 5P-5LLLL), also sometimes referred to as a dock. In some embodiments, selectable user interface objects (e.g., application icons and/or folder icons) outside of the tray are part of a plurality of sets/pages of selectable user interface objects, where each set/page of selectable user interface objects includes a different plurality of selectable user interface objects. However, in some embodiments, the tray 5006 does not change when the multifunction device switches from a first set/page of selectable user interface objects to a second set/page of selectable user interface objects.
Attention is now directed to fig. 5A, which includes a plurality of selectable user interface objects, including a plurality of action icons 5002 and a plurality of folder icons 5004. For example, in FIG. 5A, the action icons 5002 include a plurality of application icons (e.g., photo application icon 5002-1, clock application icon 5002-2, browser application icon 5002-3, card application icon 5002-4, weather application icon 5002-5, memo application icon 5002-6, text application icon 5002-7, map application icon 5002-8, stock application icon 5002-9, camera application icon 5002-12, racing application icon 5002-13, email application icon 5002-14, phone application icon 5002-15, and iPod application icon 5002-16), a bookmark icon 5002-10 for activating a web browser and displaying a bookmarked web page, and a document icon 5002-11 for activating a document viewing/editing application to display a document associated with the document icon 5002-11.
The folder icons in FIG. 5A (e.g., 5004-1-a and 5004-2 in FIG. 5A) are icons that may be activated to display a folder view. In some embodiments, the folder icons 5004 each include multiple reduced-scale representations of selectable objects associated with the folder (e.g., reduced-scale representations "x1", "x2", "x3", "x4", "x5", and "x6" for folder icon 5004-1-a, and reduced-scale representations "z1", "z2", "z3", "z4", "z5", and "z6" for folder icon 5004-2). It should be appreciated that, according to some embodiments, displaying the folder view includes displaying an area including a plurality of selectable user interface icons (e.g., action icons 5002). In fig. 5A, the device is in a normal operation mode. In other words, selecting one of the action icons will activate an application (e.g., launch an application that is not currently running on the device or display a view of an application that is currently running on the device). In some embodiments, the device detects a request to enter a user interface reconfiguration mode. For example, in fig. 5A, the device detects contact 5008 with the card application icon 5002-4 for more than a predetermined period of time (e.g., 2 seconds) and, in response, the device enters the user interface reconfiguration mode, as shown in fig. 5B.
In fig. 5B, the device has entered the user interface reconfiguration mode. In some embodiments, the selectable user interface objects (e.g., 5002 and 5004) display a visual indication that the device is in the user interface reconfiguration mode. For example, as shown in fig. 5B, the selectable user interface objects oscillate gently as if they were floating on water (e.g., each respective selectable user interface object oscillates on the display about a respective average position of the selectable user interface object). Additionally, in some embodiments, while in the user interface reconfiguration mode, at least some of the selectable user interface objects are associated with an object removal flag (e.g., 5010 in fig. 5B), and a selectable user interface object associated with the object removal flag (e.g., weather application icon 5002-5 in fig. 5B) is removed (e.g., deleted) from the user interface when the device detects activation of the object removal flag (e.g., tap gesture 5011 in fig. 5B).
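The mode switch described above (a contact held on an icon beyond a threshold enters the reconfiguration mode, while a shorter contact leaves the mode unchanged) can be sketched as follows; the function name and return values are hypothetical:

```python
# Illustrative sketch of the mode switch described above: holding a contact
# on an icon longer than a threshold (e.g., 2 seconds per the example) enters
# the user interface reconfiguration mode; a shorter contact does not.

HOLD_THRESHOLD = 2.0  # seconds; value taken from the example above

def mode_after_contact(duration, mode="normal"):
    """Return the device mode after a contact held for `duration` seconds."""
    if mode == "normal" and duration >= HOLD_THRESHOLD:
        return "reconfiguration"  # icons begin to oscillate, removal flags appear
    return mode
```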
Additionally, in some embodiments, one or more of the folder icons change appearance when the device enters a user interface reconfiguration mode. For example, in FIG. 5A, when the device is in the normal operation mode, the folder icon 5004-1-a is displayed with a first plurality of reduced-size representations (e.g., "x1", "x2", "x3", "x4", "x5", "x 6") of the selectable user interface object in the folder icon 5004-1-a in FIG. 5A, while in FIG. 5B, after the device enters the user interface reconfiguration mode, a second plurality of reduced-size representations (e.g., "x4", "x5", "x6" and "x 7") of the selectable user interface object are displayed within the folder icon (e.g., folder icon 5004-1-B in FIG. 5B). Additionally, in some embodiments, one or more of the folder icons (e.g., 5004-1-B in FIG. 5B) includes a notification flag 5012 indicating that an application associated with one of the selectable user interface objects in the folder has a notification. In some embodiments, one of the scaled-down representations (e.g., "x7" in folder icon 5004-1-b) has its own notification flag 5014 indicating that the application associated with that scaled-down representation has a notification. Typically, the notification is an indication that the application requires the attention of the device user (e.g., because a new message has arrived, or a new event has occurred, an update is available, etc.).
Attention is now directed to fig. 5B-5F, which illustrate exemplary user interfaces for creating a new folder in accordance with some embodiments. In fig. 5B, the device detects a request to move a respective selectable user interface object to the edge of the screen. In this example, the request includes a contact 5016-a on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the "racing" action icon 5002-13 and a subsequent movement 5018 of the contact to the edge of the touch-sensitive surface (e.g., to contact location 5016-b on touch screen 112, as shown in FIG. 5C). In fig. 5C, the action icon 5002-13 has been moved to the edge of the display (e.g., touch screen 112) and remains at the edge of the display for more than a predetermined time threshold (e.g., 1 second). In response to detecting that the action icon 5002-13 has remained at the edge of the display (e.g., touch screen 112) for more than the predetermined time threshold, the device navigates to the next set/page of selectable user interface objects (e.g., as shown in fig. 5D).
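The page-navigation rule above can be modeled in a few lines. This is a minimal sketch under stated assumptions (the threshold value comes from the example; the function name and page numbering are invented):

```python
# Sketch of the rule above: while dragging an icon, keeping it at the display
# edge longer than a threshold advances to the next page of selectable user
# interface objects; otherwise the current page is kept.

EDGE_HOLD_THRESHOLD = 1.0  # seconds, per the example above

def page_after_edge_hold(current_page, page_count, hold_time):
    """Advance to the next page when an icon rests at the edge long enough."""
    if hold_time >= EDGE_HOLD_THRESHOLD and current_page + 1 < page_count:
        return current_page + 1
    return current_page
```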
In some embodiments, the device maintains multiple pages of selectable user interface objects while the device is in the normal operation mode. In some of these embodiments, when the device enters the user interface reconfiguration mode, the device creates an additional page that includes a newly created empty folder. For example, in FIG. 5D there are no action icons 5002 and no filled folders, so the device displays a folder icon 5004-3 for an empty folder (e.g., a folder that does not contain any selectable user interface objects). In some embodiments, the folder icon (e.g., 5004-3) for an empty folder has a different appearance from the folder icon for a filled folder (e.g., a folder containing one or more selectable user interface objects).
In FIG. 5D, the device detects a request to move the selectable user interface object 5002-13 to the folder icon 5004-3 for the empty folder. In the example shown in fig. 5D, the request includes movement 5020 of the contact 5016 from a contact position 5016-b near the edge of the display (e.g., touch screen 112) to a contact position (e.g., 5016-c in fig. 5E) near the folder icon 5004-3 for the newly created empty folder. In response to the request to move the selectable user interface object, the device moves the selectable user interface object 5002-13 from the position near the edge of the display (e.g., touch screen 112, as in FIG. 5D) to a position proximate to or overlapping the folder icon 5004-3 (or an activation region for the folder icon 5004-3) for the newly created empty folder.
In some embodiments, the device detects termination of an input requesting movement of the selectable user interface object 5002-13 (e.g., lifting of the contact 5016-c from the touch screen 112) when the selectable user interface object 5002-13 is proximate to or overlapping the folder icon 5004-3 for the empty folder. In response to detecting termination of the input, i.e., lifting of the contact from the touch-sensitive surface (e.g., touch screen 112), the device adds a selectable user interface object 5002-13 to the folder associated with folder icon 5004-3, as illustrated in FIG. 5F. In some embodiments, the device changes the appearance of the folder icon when the device adds a selectable user interface object to the folder associated with the folder icon 5004-3. For example, in FIG. 5F, the folder icon 5004-3 displays a scaled down representation of the selectable user interface object 5002-13 added to the folder associated with the folder icon 5004-3.
In some embodiments, the device always displays an empty folder while in the user interface reconfiguration mode. For example, in FIG. 5F, once the previously empty folder (e.g., the folder associated with folder icon 5004-3) has been populated (e.g., after the selectable user interface object 5002-13 has been added to the folder, as shown in FIGS. 5C-5E), the device creates a subsequent new empty folder and displays the folder icon (e.g., 5004-4 in FIG. 5F) associated with the subsequent new empty folder on the display (e.g., touch screen 112). Thus, the user can create a new empty folder simply by filling the previously empty folder. In some embodiments, when the device returns to the normal operation mode, any folder icon associated with an empty folder (e.g., folder icon 5004-4) is no longer displayed by the device. For example, in fig. 5F, the device detects a request to return to the normal operation mode (e.g., a press input 5022 on the home screen button 204 in fig. 5F). In response to the request to return to the normal operation mode, the device returns to the normal operation mode and ceases to display the folder icon for the empty folder (e.g., 5004-4 in FIG. 5F) on the display (e.g., touch screen 112), as shown in fig. 5G.
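The invariant described above, that an empty folder is always available in reconfiguration mode but hidden in normal operation, can be modeled with a small sketch. This is an assumption-laden illustration, not the patented implementation; the folder names and auto-naming scheme are hypothetical:

```python
# Minimal model of the invariant above: filling the empty folder spawns a new
# empty folder, and empty folders are hidden outside reconfiguration mode.

def folders_after_drop(folders, target, icon):
    """Drop `icon` into `target`, then ensure at least one empty folder remains."""
    folders = {name: list(items) for name, items in folders.items()}
    folders[target].append(icon)
    if not any(len(items) == 0 for items in folders.values()):
        folders[f"Folder {len(folders) + 1}"] = []  # hypothetical auto-name
    return folders

def visible_in_normal_mode(folders):
    """Empty folders are not displayed when the device leaves reconfiguration mode."""
    return {name: items for name, items in folders.items() if items}
```

For example, dropping a "racing" icon into the one empty folder yields a filled folder plus a fresh empty one; returning to normal mode then hides the empty folder.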
Attention is now directed to fig. 5H-5L, which illustrate exemplary user interfaces for creating new folders in accordance with some embodiments. In some embodiments, the device enters a user interface reconfiguration mode (e.g., as described in more detail above with reference to fig. 5A). In some embodiments, when the device enters the user interface reconfiguration mode, a new folder creation element (e.g., new folder creation area 5024 in fig. 5H) is displayed on a display (e.g., touch screen 112). In some embodiments, when the device enters user interface reconfiguration mode, selectable user interface icons on the display (e.g., touch screen 112) move toward each other (e.g., decrease the amount of empty space between selectable user interface objects along at least one axis, such as a vertical axis), thereby making room for the newly created folder creation element(s). In some embodiments, the device detects a request to add a new folder (e.g., a tap gesture 5026 at a location on the touch screen 112 that corresponds to the new folder creation area 5024). In response to a request to create a new folder, the device creates the new folder and displays a folder icon (e.g., 5004-5 in FIG. 5I) on a display (e.g., touch screen 112). In some embodiments, a folder icon (e.g., 5004-5 in FIG. 5I) for the newly created folder is displayed at the first available location in accordance with a predefined arrangement of selectable user interface objects.
In some embodiments, the new folder creation element is represented as an area (e.g., area 5028 in fig. 5I) that contains a folder creation icon (e.g., 5030 in fig. 5I) that is visually similar to a new folder icon. In some embodiments, the device detects a request to create a folder. For example, as shown in fig. 5I, the device detects a contact 5032 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the folder creation icon 5030 on the display (e.g., touch screen 112) and a subsequent movement 5034 of the contact away from the folder creation area. In some embodiments, the device displays an animation of a folder icon moving away from the folder creation element in accordance with the movement 5034 of the contact. In other words, it appears as if a new folder icon (e.g., 5004-6 in FIG. 5J) has been dragged out of the folder creation area 5028. In response to detecting the request to create a new folder, the device creates the new folder and associates the new folder with the folder icon 5004-6 dragged out of the folder creation area.
In some embodiments, the device receives a request to add one or more of the selectable user interface objects (e.g., action icons 5002-6) to one of the newly created folders. For example, in FIG. 5J, the device detects contact 5035 with action icons 5002-6 and subsequent movement 5036 of the contact to one of the newly created folders 5004-5. In some embodiments, in response to a request to add a selectable user interface object to a newly created folder, the device adds the selectable user interface object to the folder and removes the selectable user interface object from the display. In some embodiments, after adding the selectable user interface object to the folder, the device modifies a folder icon associated with the folder to which the selectable user interface object is added. For example, in FIG. 5K, the device has added action icon 5002-6 to the folder associated with folder icon 5004-5, and folder icon 5004-5 has been updated to display a scaled down representation of action icon 5002-6 added to the folder (e.g., "N" in folder icon 5004-5).
In some embodiments, when the device returns to the normal operation mode, any folder icons associated with empty folders (e.g., folder icon 5004-6) are no longer displayed by the device. For example, in fig. 5K, the device detects a request to return to the normal operation mode (e.g., press input 5038 on home screen button 204 in fig. 5K). In response to the request to return to the normal operation mode, the device returns to the normal operation mode and ceases to display the folder icon for the empty folder (e.g., 5004-6 in FIG. 5K) on the display (e.g., touch screen 112), as shown in fig. 5L. However, it should be appreciated that, according to some embodiments, any folder icons representing folders to which one or more selectable user interface objects have been added continue to be displayed. For example, in FIG. 5L, after the device has returned to the normal operation mode, folder icon 5004-5 continues to be displayed. Additionally, in some embodiments, when the device returns to the normal operation mode, the selectable user interface objects are rearranged so as to eliminate any gaps in the arrangement. For example, in FIG. 5K, the selectable user interface objects are in a first arrangement with a gap at the location previously occupied by the memo application icon 5002-6 (e.g., as shown in FIG. 5J), while in FIG. 5L the selectable user interface objects have been rearranged to eliminate the gap.
Attention is now directed to fig. 5M-5O, which illustrate exemplary user interfaces for creating a new folder in accordance with some embodiments. In some embodiments, while in the user interface reconfiguration mode, the device receives a folder creation request that corresponds to movement of one of the selectable user interface objects toward another one of the selectable user interface objects. For example, in fig. 5M, the device detects a contact (e.g., 5040-a) on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of a first action icon (e.g., card application icon 5002-4) on the display (e.g., touch screen 112) and detects subsequent movement of the contact (e.g., movement on touch screen 112 from a first location 5040-a in fig. 5M to a second location 5040-b in fig. 5N), which corresponds to movement of the first action icon 5002-4 to the second action icon 5002-13, as shown in fig. 5N. In some embodiments, the device displays an indication that a folder is about to be created (e.g., by highlighting the second action icon 5002-13, as shown in FIG. 5N). In some embodiments, the device creates a folder that includes the first action icon and the second action icon after detecting termination of the input (e.g., detecting liftoff of contact 5040-b). In some embodiments, the device creates a folder that includes the first action icon and the second action icon upon detecting that the input meets predefined folder creation criteria (e.g., the contact pauses for more than a predetermined period of time while the first action icon 5002-4 is adjacent to or on top of the second action icon 5002-13).
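The folder creation criteria described above, that the dragged icon rest on the other icon long enough, can be sketched as a simple predicate. The threshold and overlap radius here are invented for the illustration:

```python
# Sketch of the folder-creation criteria above: dragging a first icon onto a
# second icon and pausing there longer than a threshold creates a folder
# containing both icons. Geometry and timing values are illustrative.

def creates_folder(drag_pos, target_pos, pause_time,
                   overlap_radius=20, pause_threshold=0.5):
    """True when the dragged icon rests on the target icon long enough."""
    dx = drag_pos[0] - target_pos[0]
    dy = drag_pos[1] - target_pos[1]
    on_target = (dx * dx + dy * dy) ** 0.5 <= overlap_radius
    return on_target and pause_time >= pause_threshold
```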
In some embodiments, in conjunction with creating a folder, the device displays a new folder icon associated with the folder. In some embodiments, the new folder icon includes scaled-down representations of the user interface objects added to the folder. In some embodiments, the folder is named based on descriptors of the first selectable user interface object (e.g., action icon 5002-4) and/or the second selectable user interface object (e.g., action icon 5002-13). For example, FIG. 5O illustrates the device receiving input including a request to create a folder that includes a first action icon (e.g., card application icon 5002-4) and a second action icon (e.g., racing application icon 5002-13), the input including a contact 5044 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the first action icon 5002-4 on the display (e.g., touch screen 112) and a subsequent movement 5046 of the contact 5044 to a location proximate to (or on top of) the second action icon 5002-13. In response to this input, the device creates a new folder and displays a "game" folder icon 5004-7 for the new folder, the folder icon including scaled-down representations (e.g., "O" and "R", respectively) of the first and second selectable user interface objects. As another example, fig. 5O illustrates the device receiving input including a request to create a folder that includes a first action icon (e.g., car racing application icon 5002-17) and a second action icon (e.g., aviation racing application icon 5002-18), the input including a contact 5048 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the first action icon 5002-17 on the display (e.g., touch screen 112) and a subsequent movement 5050 of the contact 5048 to a location proximate to (or on top of) the second action icon 5002-18.
In response to this input, the device creates a new folder and displays a "racing game" folder icon 5004-8 for the new folder, the folder icon including scaled-down representations (e.g., "r1" and "r2", respectively) of the first and second selectable user interface objects. As another example, fig. 5O also shows that the device receives input including a request to create a folder that includes a first action icon (e.g., email application icon 5002-14) and a second action icon (e.g., phone application icon 5002-15), the input including a contact 5052 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the first action icon 5002-14 on the display (e.g., touch screen 112) and a subsequent movement 5054 of the contact 5052 to a location proximate to (or on top of) the second action icon 5002-15. In response to this input, the device creates a new folder and displays a "communication" folder icon 5004-9 for the new folder, the folder icon including scaled-down representations (e.g., "E" and "P", respectively) of the first and second selectable user interface objects.
As another example, fig. 5O also shows that the device receives input including a request to create a folder that includes a first action icon (e.g., camera application icons 5002-12) and a second action icon (e.g., stock application icons 5002-9), the input including a contact 5056 on a touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the first action icons 5002-12 on a display (e.g., touch screen 112) and a subsequent movement 5058 of the contact 5056 to a location proximate to (or on top of) the second action icons 5002-9. In response to this input, the device creates a new folder and displays "photography" folder icons 5004-10 for the new folder, including scaled-down representations (e.g., "C" and "S", respectively) of the first selectable user interface object and the second selectable user interface object. As another example, fig. 5O also shows that the device receives input including a request to create a folder that includes a first action icon (e.g., stock application icons 5002-9) and a second action icon (e.g., camera application icons 5002-12), the input including a contact 5060 on a touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the first action icon 5002-9 on a display (e.g., touch screen 112) and a subsequent movement 5062 of the contact 5060 to a location proximate to (or on top of) the second action icon 5002-12. In response to this input, the device creates a new folder and displays a "utility" folder icon 5004-11 for the new folder, the folder icon including scaled down representations (e.g., "S" and "C", respectively) of the first selectable user interface object and the second selectable user interface object.
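The descriptor-based auto-naming described above (producing names such as "game", "racing game", "communication", "photography", and "utility") can be modeled as choosing a descriptor shared by the two icons being combined. The following is a minimal, hedged sketch in Python; the descriptor lists, the function name, and the fallback rule are illustrative assumptions, not details taken from the embodiments.

```python
def auto_name_folder(first_descriptors, second_descriptors):
    """Name a new folder from descriptors of its first two icons.

    Prefer the most specific descriptor common to both icons;
    otherwise fall back to the first icon's own first descriptor.
    (Hypothetical rule, for illustration only.)
    """
    common = [d for d in first_descriptors if d in second_descriptors]
    if common:
        return common[0]
    return first_descriptors[0]

# Hypothetical descriptor lists for the icons combined in FIG. 5O:
card_game = ["card game", "game"]
racing_game = ["racing game", "game"]
car_racing = ["car racing game", "racing game", "game"]
air_racing = ["air racing game", "racing game", "game"]

print(auto_name_folder(card_game, racing_game))   # "game" folder icon 5004-7
print(auto_name_folder(car_racing, air_racing))   # "racing game" folder icon 5004-8
```

With these lists, the more specific shared descriptor "racing game" wins for the two racing applications, while the card and racing applications share only the broader descriptor "game", matching the folder names shown in FIG. 5O.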
Attention is now directed to fig. 5P-5R, which illustrate an exemplary user interface for renaming new folders in accordance with some embodiments. In some embodiments, the new folder is automatically named after it is created, as described above with reference to FIG. 5O. In some embodiments, the device displays a name confirmation dialog (e.g., 5064 in fig. 5P) immediately after creating the folder. In response to detecting a confirmation input (e.g., tap gesture 5066 in fig. 5P at a location corresponding to the "confirm" icon on touch screen 112), the device confirms the creation of the folder and the automatically generated name, as shown in fig. 5S. In response to detecting a cancel input (e.g., tap gesture 5068 in fig. 5P at a location corresponding to the "cancel" icon on touch screen 112), the device cancels creation of the folder, as shown in fig. 5M. In response to detecting a rename input (e.g., tap gesture 5070 in FIG. 5P at a location corresponding to the "rename" icon on touch screen 112), the device displays a dialog 5072 for changing the name of the newly created folder (e.g., from "game" to "entertainment" as shown in FIG. 5Q) and confirms the name change (e.g., upon detecting tap gesture 5074 at a location corresponding to the location of the "confirm" icon on touch screen 112). The device then displays a folder icon (e.g., 5004-7 in FIG. 5R) with the new name for the newly created folder.
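The three possible responses to the dialog of FIGS. 5P-5R (confirm, cancel, rename) can be sketched as a small state object. This is an illustrative model only; the class and method names are hypothetical and not part of the described embodiments.

```python
class NewFolderDialog:
    """Models the three responses to the name confirmation dialog 5064."""

    def __init__(self, auto_name):
        self.name = auto_name   # automatically generated name, e.g. "game"
        self.created = None     # undecided until the user responds

    def confirm(self):          # tap gesture 5066: keep the folder and its name
        self.created = True

    def cancel(self):           # tap gesture 5068: folder creation is undone
        self.created = False

    def rename(self, new_name): # tap gesture 5070, then dialog 5072
        self.name = new_name
        self.created = True

dialog = NewFolderDialog("game")
dialog.rename("entertainment")        # as in FIG. 5Q
print(dialog.name, dialog.created)    # entertainment True
```

Cancel leaves `created` as `False`, corresponding to the return to the pre-folder state of FIG. 5M.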
Attention is now directed to fig. 5S-5Y, with fig. 5S-5Y showing an exemplary user interface for displaying a folder view, in accordance with some embodiments. In some embodiments, the device displays a folder view of the folder associated with a folder icon (e.g., 5004-7) in response to detecting a request to activate the folder icon (e.g., tap gesture 5076 in FIG. 5S). In response to detecting the request, the device displays a folder view (e.g., as shown in any one of FIGS. 5T, 5U, 5V-5W, or 5X-5Y) on the display (e.g., touch screen 112). In some embodiments, the device automatically displays the folder view after creating the folder (e.g., transitioning directly from FIG. 5N to any of FIGS. 5T, 5U, 5V-5W, or 5X-5Y), or automatically displays the folder view after renaming the newly created folder (e.g., transitioning directly from FIG. 5Q to any of FIGS. 5T, 5U, 5V-5W, or 5X-5Y).
In fig. 5T, folder view 5078 includes an overlay that overlays at least a portion of touch screen 112 to obscure (e.g., hide or de-emphasize) selectable user interface objects displayed on touch screen 112. In some embodiments, selectable user interface objects that are not in the folder are at least partially faded, thereby drawing attention to the folder view (e.g., 5078 in FIG. 5T) while providing contextual feedback by indicating the placement of selectable user interface objects outside the folder view (e.g., 5078 in FIG. 5T). In some embodiments, the folder view (e.g., 5078 in FIG. 5T) includes selectable user interface objects (e.g., 5002-4 and 5002-13) that are added to the folder associated with the new folder icon 5004-7.
In fig. 5U, folder view 5080 includes a semi-transparent overlay that overlays all or substantially all of touch screen 112, obscures selectable user interface objects displayed on touch screen 112 and draws attention to the folder view (e.g., 5080 in fig. 5U) while providing contextual feedback by indicating the placement of selectable user interface objects outside the folder (e.g., including the location of folder icons 5004-7 for the folder within the placement). The folder view (e.g., 5080 in FIG. 5U) includes selectable user interface objects (e.g., 5002-4 and 5002-13) that are added to the folders associated with the new folder icon 5004-7.
Attention is now directed to fig. 5V-5Y, which illustrate an exemplary user interface for displaying an animated transition to a folder, in accordance with some embodiments. In some embodiments, the device displays a transitional animation that transitions from displaying a folder icon to displaying a folder view. For example, in FIG. 5V, the device displays an animation in response to receiving a request to display a folder view (e.g., tap gesture 5076 at a location on touch screen 112 corresponding to the location of folder icon 5004-7 in FIG. 5S). The exemplary animation in FIG. 5V includes displaying a plurality of selectable user interface objects (e.g., 5002-7, 5002-8, 5002-10, 5002-11, etc.) dispersing off of the display (e.g., touch screen 112) by moving toward the edges of the display. In conjunction with the dispersal of the plurality of selectable user interface objects, the device displays a folder view 5082 that includes the selectable user interface objects (e.g., 5002-4, 5002-13 in FIG. 5W) associated with the folder represented by the selected folder icon (e.g., 5004-7 in FIG. 5S), with the folder view enlarging to fill the touch screen 112 as shown in FIG. 5W.
As another example of an animation transition, in fig. 5X, the device displays an animation in response to receiving a request to display a folder view (e.g., detecting a tap gesture 5076 on folder icon 5004-7 in fig. 5S). The exemplary animation in fig. 5X includes dividing the wallpaper into a first portion 5084 and a second portion 5086 and moving the second portion away from the first portion (e.g., as shown in fig. 5Y). In some embodiments, the first portion has an edge 5088, the edge 5088 having a profile complementary to a profile of an edge 5090 of the second portion. For example, in fig. 5X, an edge 5088 of the first portion 5084 is complementary to an edge 5090 of the second portion 5086.
It will be appreciated that in some embodiments, the first portion moves away from the second portion, or the first and second portions move away from each other. In fig. 5Y, a folder view 5092 is displayed in the area between the first portion 5084 and the second portion 5086. In conjunction with movement of the first portion 5084 and the second portion 5086, the device displays on a display (e.g., touch screen 112) selectable user interface objects (e.g., 5002-4, 5002-13 in FIG. 5Y) associated with a folder represented by a selected folder icon (e.g., 5004-7 in FIG. 5S) within a folder view (e.g., 5092 in FIG. 5Y). In some embodiments, the animation includes displaying the wallpaper splitting to reveal the selectable user interface objects in the folder (e.g., 5002-4, 5002-13 in FIG. 5Y), as if the wallpaper were a sliding door that slides open to reveal, from behind the wallpaper, the selectable user interface objects associated with the folder. In some embodiments, the selected folder icon 5004-7 associated with the folder view 5092 continues to be displayed while the folder view 5092 is displayed, as shown in FIG. 5Y. In some embodiments, the folder icon 5004-7 is visually distinguished from other selectable user interface objects (e.g., 5002-1, 5002-2, 5002-3, 5002-5, 5002-6, 5002-7, 5002-8, 5002-9, 5004-1-b, 5002-10, 5002-11, 5002-12 in FIG. 5Y). In some embodiments, the contour of the edge 5088 of the first portion or the contour of the edge of the second portion is adjusted such that the contours of the edges are no longer complementary. For example, in fig. 5X, the edge 5088 of the first portion 5084 and the edge 5090 of the second portion 5086 are complementary, having a cut-out notch 5094. However, continuing with the present example, after the portions have been moved apart from one another as shown in fig. 5Y, the edge 5088 of the first portion 5084 still has a cut-out notch 5094, while the edge 5090 of the second portion 5086 is straight, and therefore the edges are no longer complementary. In some embodiments, the cut-out notch 5094 provides a visual indication of the location of the selected folder icon (5004-7 in fig. 5X and 5Y) within the arrangement of selectable user interface objects, as shown in fig. 5Y.
In some embodiments, the device detects a folder view exit input (e.g., detects a tap gesture 5096 on touch screen 112 at a location corresponding to a location outside of folder view 5092 in fig. 5Y), and in response to the folder view exit input, the device stops displaying the folder view (e.g., as shown in fig. 5S). In some embodiments, the device detects a folder renaming input (e.g., detects a tap gesture 5098 on a folder renaming region, such as a button, a text entry region, or the name of the folder), and in response to the folder renaming input, the device provides a renaming interface (e.g., a soft keyboard that slides up from the bottom of the touch screen) that may be used to rename the folder.
Attention is now directed to fig. 5Y-5CC, which illustrate an exemplary user interface for canceling creation of a new folder in accordance with some embodiments. In some embodiments, if a cancel input is received, the folder creation operation is canceled. In some embodiments, the cancel input includes removing one of the selectable user interface objects from the folder immediately after the new folder has been created with one or more selectable user interface objects (e.g., action icons 5002-4 and 5002-13). In some embodiments, the device detects input corresponding to a request to remove one of the selectable user interface objects from the folder. For example, in FIG. 5Y, the device detects input including a contact 5100 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to one of the selectable user interface objects (5002-4) in the folder and a subsequent movement 5102 of the contact (e.g., movement from a first contact location 5100-a inside folder view 5092 on touch screen 112 in FIG. 5Y to a second contact location 5100-b outside folder view 5092 on touch screen 112 in FIG. 5Z). In response to detecting the input, the device removes the selectable user interface object from the folder and deletes the folder. For example, in FIG. 5Z, selectable user interface object 5002-4 is outside of the folder, and in FIG. 5AA, selectable user interface object 5002-4 is displayed outside of the folder after termination of the contact is detected.
In some embodiments, the folder icon is also updated to reflect changes in folder content. For example, in FIG. 5Y, the folder icon 5004-7 includes reduced scale representations (e.g., "O" and "R") of the two selectable user interface objects (5002-4 and 5002-13) within the folder, while in FIG. 5AA, after one of the selectable user interface objects (e.g., 5002-4) has been removed from the folder, only the reduced scale representation (e.g., "R") of the selectable user interface object (e.g., 5002-13) that is still in the folder is displayed in the folder icon (e.g., 5004-7 in FIG. 5AA).
In some embodiments, because the folder creation operation has been canceled, the device stops displaying the folder icon and redisplays the remaining selectable user interface object (e.g., 5002-13) outside of the folder view (e.g., as shown in FIG. 5CC). In some embodiments, an animated transition is displayed that shows the folder icon (e.g., 5004-7) becoming the remaining selectable user interface object (e.g., selectable user interface object 5002-13 that remained in the folder), as shown in FIG. 5BB, wherein an intermediate stage 5104 of the animation (e.g., an animation frame between the folder icon and the remaining selectable user interface object 5002-13) is displayed on the touch screen 112. In some embodiments, the remaining selectable user interface object replaces the folder icon on the touch screen. For example, in FIG. 5AA, folder icon 5004-7 is displayed in the fourth row, first column of the arrangement of selectable user interface objects, while in FIG. 5CC, the remaining selectable user interface object 5002-13 is displayed in the fourth row, first column of the arrangement of selectable user interface objects.
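The cancel-by-removal behavior of FIGS. 5Y-5CC reduces to a simple rule: when a just-created, unconfirmed folder loses an icon and only one icon remains, delete the folder and let the remaining icon take over the folder icon's grid slot. The sketch below is a hedged model; the data shapes (a slot-to-item mapping) and names are hypothetical.

```python
def cancel_on_removal(grid, slot, removed_icon):
    """grid maps a slot index to either an icon name or a folder (a list).

    Removing an icon from a freshly created two-icon folder cancels the
    folder; the one remaining icon replaces the folder icon at the same
    slot, as icon 5002-13 replaces folder icon 5004-7 in FIGS. 5AA-5CC.
    """
    folder = grid[slot]
    folder.remove(removed_icon)
    if len(folder) == 1:           # creation is canceled
        grid[slot] = folder.pop()  # remaining icon takes the folder's slot
    return grid

grid = {12: ["cards", "racing"]}             # slot 12 holds the new folder (hypothetical layout)
print(cancel_on_removal(grid, 12, "cards"))  # {12: 'racing'}
```

A folder that still holds two or more icons after the removal would be left in place unchanged, which matches the post-confirmation behavior described later with reference to FIGS. 5DD-5JJ.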
Attention is now directed to fig. 5DD-5JJ, which illustrate an exemplary user interface for deleting folders according to some embodiments. In some embodiments, after creating a new folder (e.g., as described in more detail above with reference to fig. 5M-5O), the device automatically displays a folder view of the folder. For example, in FIG. 5DD, the device displays a folder view 5106 that includes two selectable user interface objects (e.g., 5002-4 and 5002-13). In some embodiments, when the folder view is displayed, the device also displays a folder renaming area for renaming the folder. For example, in fig. 5DD, the device displays the folder view 5106 with a folder renaming area 5108 in which the device has received renaming input (e.g., text input from a physical keyboard, keypad, soft keyboard, or other alphanumeric character input device) to change the name of the folder from "game" to "play". In response to the renaming input, the device changes the name of the folder and changes the appearance of the folder icon (e.g., 5004-7) in accordance with the renaming input (e.g., changing "game" in FIG. 5DD to "play" in FIG. 5EE after receiving the renaming input).
In some embodiments, the folder view is displayed by dividing the wallpaper background into a first portion (e.g., 5108) and a second portion (e.g., 5110) and shifting the first portion (e.g., 5108) of the wallpaper background away from the second portion (e.g., 5110) of the wallpaper background to display the folder view 5106 in an area between the first portion 5108 and the second portion 5110.
In some embodiments, after creation of the folder has been confirmed (e.g., by renaming the folder, opening and closing the folder, adding additional selectable user interface objects to the folder, etc.), the folder is not deleted when a single item is removed from the folder. Rather, in some embodiments, the folder is deleted by the device only after all items have been removed from the folder. For example, in fig. 5EE, the device detects a request to remove a selectable user interface object (e.g., 5002-4) from the folder view (e.g., detects a contact 5112 and a subsequent movement 5114 of the contact 5112 on the touch screen 112 toward a location outside of the folder view 5106 and into the first portion 5108 of the desktop background). In response to the request to remove the selectable user interface object 5002-4 from the folder view 5106, the device removes the selectable user interface object 5002-4 from the folder view 5106 and displays the selectable user interface object 5002-4 outside of the folder view 5106, as illustrated in FIG. 5FF. Continuing with the present example, in FIG. 5FF, the device detects a request to remove the last selectable user interface object (e.g., 5002-13) from the folder view, including detecting a contact 5116 and a subsequent movement 5118 of the contact (e.g., movement from a first contact location 5116-a inside the folder view 5106 on the touch screen 112 in FIG. 5FF to a second contact location 5116-b outside the folder view 5106 on the touch screen 112 in FIG. 5GG). In some embodiments, the last selectable user interface object (e.g., 5002-13 in fig. 5GG) is removed from the folder in response to detecting termination of the input corresponding to the request.
In some embodiments, when the device detects termination of the input (e.g., lifting of the contact), the last selectable user interface object is displayed on a display (e.g., touch screen 112) according to the location of the contact 5116-b.
In some embodiments, upon removing the last selectable user interface object (e.g., 5002-13 in FIG. 5GG) from the folder view (e.g., 5106 in FIG. 5GG), the scaled-down representation (e.g., "R") of the selectable user interface object is removed from the folder icon. For example, in FIG. 5HH, the folder icon 5004-7 does not include any scaled down representation of a selectable user interface object (e.g., because the folder associated with the folder icon does not contain any selectable user interface objects).
In some embodiments, once the last selectable user interface object (e.g., 5002-13 in fig. 5GG) has been removed from the folder view 5106, as shown in fig. 5GG, the folder is deleted and the associated folder view ceases to be displayed. For example, in fig. 5II, the device has stopped displaying the folder view (e.g., 5106 in fig. 5GG) and the folder icon (e.g., 5004-7 in fig. 5GG) associated with the folder. In some embodiments, the device displays an animation of the folder icon (e.g., 5004-7) disappearing. For example, in FIG. 5HH, the device displays folder icon 5004-7 without any scaled-down representation of a selectable user interface object and begins shrinking the folder icon 5004-7, as indicated by the arrows in FIG. 5HH. Continuing with this animation, in FIG. 5II, the folder icon is no longer displayed at all. After ceasing to display the folder icon, in some embodiments, the device rearranges the selectable user interface objects, thereby eliminating the gap left in the predefined arrangement of selectable user interface icons by deleting the folder icon. For example, in fig. 5JJ, selectable user interface object 5002-4 associated with the card application is moved to the left to fill in the void left by the folder icon (e.g., 5004-7 in fig. 5HH).
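For a confirmed folder, the deletion rule above is simply "delete only when empty, then compact the grid". The following hedged sketch models the home screen as a flat, left-to-right list, so removing an element naturally shifts later icons to fill the gap; the names and data shapes are hypothetical.

```python
def remove_icon(folder, icon, grid):
    """Remove `icon` from a confirmed `folder` that occupies a slot in `grid`.

    The icon is re-placed outside the folder; the folder itself is
    deleted only once its last item is gone, and removing it from the
    list shifts later icons left to fill the gap (FIGS. 5EE-5JJ).
    """
    folder["items"].remove(icon)
    grid.append(icon)              # dropped icon now lives on the home screen
    if not folder["items"]:        # last item removed: delete the empty folder
        grid.remove(folder)        # remaining icons close the gap
    return grid

play = {"name": "play", "items": ["cards", "racing"]}
grid = [play, "camera", "stocks"]
remove_icon(play, "cards", grid)   # folder survives with one item (FIG. 5FF)
remove_icon(play, "racing", grid)  # folder is now empty and disappears (FIG. 5II)
print(grid)                        # ['camera', 'stocks', 'cards', 'racing']
```

Contrast this with the pre-confirmation behavior sketched earlier, where removing even one of two icons cancels the folder outright.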
Attention is now directed to fig. 5KK-5PP, with fig. 5KK-5PP showing an exemplary user interface for adding selectable user interface objects to a folder in accordance with some embodiments. In FIG. 5KK, the device displays a plurality of selectable user interface objects including a plurality of action icons (e.g., 5002-1, 5002-2, 5002-3, 5002-5, 5002-6, 5002-7, 5002-8, 5002-9, 5002-10, 5002-11, 5002-12, 5002-14, 5002-15, and 5002-16) and a plurality of folder icons (e.g., 5004-1-b, 5004-7, and 5004-2). In some embodiments, the device detects an input or a beginning of an input (e.g., contact 5120-a on touch screen 112 in FIG. 5KK) corresponding to a request to move a respective selectable user interface object (e.g., 5002-9) on the display (e.g., touch screen 112). In some embodiments, one or more of the other selectable user interface objects (e.g., action icons 5002 and folder icons 5004) have default activation areas (e.g., 5122-1-a, 5122-2-a, 5122-3-a, 5122-4-a, 5122-5-a, 5122-6-a, 5122-7-a, 5122-8-a, 5122-9-a, 5122-10-a, 5122-11-a, 5122-12-a, 5122-13-a, 5122-14-a, 5122-15-a, 5122-16-a, 5122-17-a), wherein each activation area for a respective selectable user interface object is used to perform an action associated with the respective selectable user interface object. In some embodiments, the respective activation region for a respective action icon is associated with an action of creating a folder that includes the respective action icon. In some embodiments, the respective activation region for a respective folder icon is associated with an action of adding a selectable user interface object to the folder associated with the respective folder icon. In some embodiments, in response to detecting an input on the touch-sensitive surface corresponding to movement of the first selectable user interface object (e.g., 5002-9 in FIG. 5KK), one or more of the respective activation regions changes from a default size to an adjusted size (e.g., 5122-1-b, 5122-2-b, 5122-3-b, 5122-4-b, 5122-5-b, 5122-6-b, 5122-7-b, 5122-8-b, 5122-9-b, 5122-10-b, 5122-11-b, 5122-12-b, 5122-13-b, 5122-14-b, 5122-15-b, 5122-16-b, 5122-17-b in FIG. 5LL). In some embodiments, the adjusted size of a respective activation region (e.g., 5122-13-b in FIG. 5LL) is determined based on the distance from the respective activation region (e.g., 5122-13-a in FIG. 5KK) to the first respective selectable user interface object (e.g., 5002-9 in FIG. 5KK) on the display (e.g., touch screen 112).
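One way to realize the distance-dependent activation regions of FIGS. 5KK-5LL is to scale each region's radius by its distance from the dragged icon. The sketch below is a hedged illustration only: the constants, the circular-region simplification, and the choice that farther regions grow larger are all hypothetical design decisions not specified by the description above.

```python
import math

def adjusted_radius(region_center, dragged_center,
                    default_radius=22.0, growth=0.1, max_radius=60.0):
    """Return the adjusted activation-region radius for one icon.

    The default radius is enlarged in proportion to the distance
    between the region and the dragged icon, capped at max_radius.
    All constants are hypothetical.
    """
    distance = math.hypot(region_center[0] - dragged_center[0],
                          region_center[1] - dragged_center[1])
    return min(default_radius + growth * distance, max_radius)

print(adjusted_radius((100, 0), (0, 0)))    # 32.0: nearby icon, modest growth
print(adjusted_radius((600, 400), (0, 0)))  # 60.0: distant icon, capped
```

The opposite policy (shrinking distant regions) would also fit the text, since the description only says the adjusted size depends on the distance; the cap keeps regions from overlapping their neighbors under either policy.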
As one example of adding a selectable user interface object to a folder, the device detects an input corresponding to a request to move the selectable user interface object to an activation region associated with a folder icon. For example, the device detects a contact 5120 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to a respective selectable user interface object 5002-9 and detects a subsequent movement 5121 of the contact across the touch-sensitive surface (e.g., movement on touch screen 112 from a first contact location 5120-a in FIG. 5KK to a second contact location 5120-b in FIG. 5LL to a third contact location 5120-c in FIG. 5MM). In response to detecting the input, the device moves the respective selectable user interface object 5002-9 across the display (e.g., touch screen 112) in accordance with the movement of contact 5120, as shown in figs. 5KK-5MM. In some embodiments, the device automatically rearranges the selectable user interface objects as the respective selectable user interface object 5002-9 moves across the display (e.g., touch screen 112). In some embodiments, the device does not rearrange the selectable user interface objects until a predetermined condition has been met (e.g., contact 5120 is no longer detected on touch screen 112). For example, in figs. 5LL-5MM, even though the respective selectable user interface object 5002-9 has been moved across the display (e.g., touch screen 112), the other selectable user interface objects are not immediately rearranged.
In some embodiments, the device detects termination of the input (e.g., liftoff of contact 5120-c in FIG. 5MM) while the respective selectable user interface object 5002-9 is at least partially within the activation region (e.g., 5122-13-b) for one of the other selectable user interface objects (e.g., folder icon 5004-7). In some embodiments, in response to detecting termination of the input, the device adds the respective selectable user interface object 5002-9 to the folder associated with the selectable user interface object (e.g., folder icon 5004-7). In some embodiments, after the respective selectable user interface object (e.g., 5002-9) has been added to the folder associated with the selectable user interface object (e.g., folder icon 5004-7), the device modifies the folder icon (e.g., 5004-7 in FIG. 5NN) to include a scaled-down representation of the action icon (e.g., "S" in folder icon 5004-7 in FIG. 5NN), as shown in FIG. 5NN. In some embodiments, after action icon 5002-9 has been added to the folder associated with folder icon 5004-7, the device rearranges the selectable user interface objects on the display (e.g., touch screen 112) to fill any gaps in the arrangement, as shown in FIG. 5OO, where the gap left by the movement of action icon 5002-9 into the folder is filled.
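Dropping an icon onto a folder icon's activation region, as described above, has three effects: the icon joins the folder, the folder icon's miniature representations are refreshed, and the vacated grid position is closed up. The hedged sketch below models this; using each icon's initial letter as its scaled-down representation, and the icon names themselves, are assumptions for illustration only.

```python
def add_to_folder(folder, icon, grid):
    """Move `icon` from the home-screen `grid` into `folder`.

    Mirrors FIGS. 5MM-5OO: the folder gains the icon, the folder
    icon shows an extra scaled-down representation (here, the icon's
    first letter), and later grid icons shift to close the gap.
    """
    grid.remove(icon)                 # removing from the list closes the gap
    folder["items"].append(icon)
    folder["thumbnails"] = [name[0].upper() for name in folder["items"]]
    return folder["thumbnails"]

# Hypothetical names chosen so initials match the "O" and "R" of FIG. 5KK:
game = {"items": ["othello", "racer"], "thumbnails": ["O", "R"]}
grid = ["stocks", "maps"]
print(add_to_folder(game, "stocks", grid))  # ['O', 'R', 'S'], as in FIG. 5NN
print(grid)                                 # ['maps']
```

A real implementation would also cap the number of miniatures shown on the icon; the description does not specify that limit, so it is omitted here.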
In fig. 5OO, another input detected by the device includes a contact 5124 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of action icon 5002-8 on the display (e.g., touch screen 112) and a subsequent movement 5126 of the contact (e.g., movement on touch screen 112 from a first contact location 5124-a in fig. 5OO to a second contact location 5124-b in fig. 5PP). In some embodiments, one or more of the other selectable user interface objects (e.g., action icons 5002 and folder icons 5004) have activation regions (e.g., 5128-1, 5128-2, 5128-3, 5128-4, 5128-5, 5128-6, 5128-7, 5128-8, 5128-9, 5128-10, 5128-11, 5128-12, 5128-13, 5128-14, 5128-15), wherein each activation region for a respective selectable user interface object is for performing an action associated with the respective selectable user interface object. In some embodiments, the respective activation regions (e.g., 5128-1, 5128-2, 5128-3, 5128-4, 5128-5, 5128-6, 5128-8, 5128-9, 5128-10, 5128-12, or 5128-13) for the respective action icons are associated with an action of creating a folder that includes the respective action icon. In some embodiments, the respective activation region (e.g., 5128-7, 5128-11, or 5128-15) for a respective folder icon is associated with an action of adding the respective selectable user interface object (e.g., 5002-8) to the folder associated with the respective folder icon. In some embodiments, the size of an activation region is determined based on the distance of the activation region from the respective selectable user interface object (e.g., 5002-8). In some embodiments, the activation regions are not displayed on the display (e.g., touch screen 112).
In response to detecting the input (e.g., movement 5126 of contact 5124 on touch screen 112 in figs. 5OO-5PP), the device moves the respective selectable user interface object 5002-8 across the display (e.g., touch screen 112) in accordance with the movement of contact 5124, as shown in figs. 5OO-5PP. In some embodiments, the device does not rearrange the selectable user interface objects until a predetermined condition has been met, as described in more detail above with reference to figs. 5LL-5MM. In some embodiments, the device automatically rearranges the selectable user interface objects as the respective selectable user interface object 5002-8 moves across the display (e.g., touch screen 112), as shown in fig. 5PP. For example, in figs. 5OO-5PP, even though the respective selectable user interface object 5002-8 is still moving across the display (e.g., touch screen 112), the other selectable user interface objects have already been rearranged to fill the void left by the movement of the respective selectable user interface object 5002-8 across the display (e.g., touch screen 112).
In some embodiments, the device detects additional movement 5130 of the contact (e.g., movement from the second contact location 5124-b in FIG. 5PP to a contact location within the activation region 5128-11 for the respective folder icon 5004-7). In some embodiments, the device detects termination of the input (e.g., liftoff of contact 5124 from touch screen 112) while the respective selectable user interface object 5002-8 is at least partially within the activation region (e.g., 5128-11) for one of the other selectable user interface objects (e.g., folder icon 5004-7). In some embodiments, in response to detecting termination of the input, the device adds the respective selectable user interface object 5002-8 to the folder associated with the selectable user interface object (e.g., folder icon 5004-7). In some embodiments, after the respective selectable user interface object (e.g., 5002-8) has been added to the folder associated with the selectable user interface object (e.g., folder icon 5004-7), the device modifies the folder icon (e.g., 5004-7 in FIG. 5QQ) to include a scaled-down representation of the action icon (e.g., "M" in folder icon 5004-7 in FIG. 5QQ), as shown in FIG. 5QQ.
Attention is now directed to fig. 5QQ-5SS, with fig. 5QQ-5SS showing an exemplary user interface for moving folder icons according to some embodiments. In some embodiments, folder icons may be moved around on the display (e.g., touch screen 112) in response to a folder repositioning input on the touch-sensitive surface (e.g., touch screen 112) while the device is in the user interface reconfiguration mode. For example, in FIG. 5QQ, the device detects an input including a contact 5132 at a location on the touch-sensitive surface (e.g., touch screen 112) corresponding to the location of the respective folder icon 5004-7, and detects a subsequent movement 5134 of contact 5132 across the touch-sensitive surface (e.g., movement across touch screen 112 from a first contact location 5132-a in FIG. 5QQ to a second contact location 5132-b in FIG. 5RR). In response to detecting the input, the device moves the respective folder icon 5004-7 to a new location on the display (e.g., touch screen 112) in accordance with the input, as shown in FIG. 5RR. In some embodiments, upon detecting a predetermined condition (e.g., termination of the input, or a pause in movement for more than a predetermined period of time), the device rearranges the selectable user interface objects to make room for the respective folder icon (e.g., 5004-7) that was moved in response to detecting the input, as shown in fig. 5SS.
Attention is now directed to fig. 5QQ-5VV, fig. 5QQ-5VV showing an exemplary user interface for rearranging selectable user interface objects in a folder, according to some embodiments. In FIG. 5SS, the device detects a folder view display input (e.g., a tap gesture 5136 at a location on touch screen 112 that corresponds to the location of folder icon 5004-7). In response to the folder view display input, the device displays a folder view (e.g., 5138 in FIGS. 5TT-5UU) that includes the selectable user interface objects (e.g., action icons 5002-4, 5002-13, 5002-9, 5002-8) associated with the folder represented by folder icon 5004-7. In some embodiments, the selectable user interface objects within the folder view (e.g., 5138) have a predetermined spatial arrangement and may be rearranged based on a detected rearrangement input. For example, in fig. 5TT, the device detects a rearrangement input (e.g., a contact 5140 and a subsequent movement 5142 of the contact 5140 across the touch screen 112). In response to detecting the rearrangement input, the device moves one or more respective selectable user interface objects within the folder view from a first position within the spatial arrangement of the folder view to a second position within the spatial arrangement of the folder view in accordance with the rearrangement input. For example, in fig. 5TT, the device detects a contact 5140 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of stock application icon 5002-9 on the display (e.g., touch screen 112) and detects a subsequent movement 5142 of the contact 5140 to a location on the touch-sensitive surface (e.g., touch screen 112) corresponding to the leftmost position in the spatial arrangement on the display. In response to this movement, the device moves the stock application icon 5002-9 to the leftmost position within the spatial arrangement of the folder view 5138, as shown in FIG. 5UU.
Additionally, in some embodiments, the folder icon (e.g., 5004-7) associated with the folder view (e.g., 5138) is updated to reflect modifications to the spatial arrangement of icons within the folder view (e.g., 5138). For example, in FIG. 5TT (e.g., before the spatial arrangement of selectable user interface objects within the folder view has changed), the device displays the scaled-down representations of the selectable user interface objects in the folder view 5138 (e.g., for the card application icon 5002-4, the racing application icon 5002-13, the stock application icon 5002-9, and the map application icon 5002-8) in a first order (e.g., "O", "R", "S", "M" in a left-to-right, top-to-bottom order) corresponding to the spatial arrangement of the selectable user interface objects. In contrast, in FIG. 5UU (e.g., after the spatial arrangement has changed), the device displays the scaled-down representations of the selectable user interface objects in folder view 5138 (e.g., for the stock application icon 5002-9, the card application icon 5002-4, the racing application icon 5002-13, and the map application icon 5002-8) in a second order (e.g., "S", "O", "R", "M" in a left-to-right, top-to-bottom order) corresponding to the new spatial arrangement of the selectable user interface objects.
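The folder icon update of FIGS. 5TT-5UU follows directly if the miniature representations are always recomputed from the folder's current item order. A hedged sketch; the icon names are hypothetical, chosen only so that their initials match the "O", "R", "S", "M" of the figures.

```python
def move_within_folder(folder, icon, new_index):
    """Move `icon` to `new_index` inside the folder view and recompute
    the folder icon's miniature representations from the new
    left-to-right, top-to-bottom order (FIGS. 5TT-5UU)."""
    folder["items"].remove(icon)
    folder["items"].insert(new_index, icon)
    folder["thumbnails"] = [name[0].upper() for name in folder["items"]]
    return folder["thumbnails"]

game = {"items": ["othello", "racer", "stocks", "maps"]}
# Drag the stock application icon to the leftmost position, as in FIG. 5TT:
print(move_within_folder(game, "stocks", 0))  # ['S', 'O', 'R', 'M']
```

Deriving the thumbnails from the item list on every change keeps the folder icon and the folder view consistent without any separate synchronization step.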
In some embodiments, in response to a folder view exit input, the device stops displaying the folder view. For example, in fig. 5UU, the device detects a tap gesture 5144 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to a location on the display (e.g., touch screen 112) outside of the folder view. In response to detecting the tap gesture 5144, the device stops displaying the folder view, as shown in fig. 5VV. In some embodiments, the device displays an animation of the folder view closing (e.g., the background wallpaper closing over the selectable user interface objects within folder view 5138) on the display (e.g., touch screen 112).
Attention is now directed to figs. 5VV-5BBB, which show an exemplary user interface for removing selectable user interface objects from folders, in accordance with some embodiments. In some embodiments, the device detects a folder view display input (e.g., a tap gesture 5146 in FIG. 5VV at a location on touch screen 112 corresponding to folder icon 5004-1-b). In response to detecting the folder view display input, the device displays a folder view 5148 including the contents of the folder (e.g., action icons 5002-19, 5002-20, 5002-21, 5002-22, 5002-23, 5002-24, and 5002-25 on touch screen 112), as shown in FIG. 5WW.
In some embodiments, one or more of the selectable user interface objects includes a notification flag (e.g., 5150 in FIG. 5WW) indicating that an application (e.g., application-7) associated with the selectable user interface object (e.g., action icons 5002-25) requires the attention of the user of the device. In some embodiments, one or more additional notification flags (e.g., 5012 and/or 5014) are also displayed on the folder icon (e.g., 5004-1-b) associated with the folder containing the selectable user interface object (e.g., action icons 5002-25). In some embodiments, the additional notification flags are updated as the notification flag on the selectable user interface object is updated (e.g., when notification flag 5150 appears, disappears, or changes, thereby indicating that the number of notifications has changed).
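The relationship between per-application notification flags and the folder icon's flag can be sketched as a simple aggregation: the folder's flag mirrors the total pending count across the applications inside, and is not drawn when that count is zero. This is an assumed model for illustration; the names are hypothetical and not from the patent.

```python
def folder_notification_flag(app_flags):
    """app_flags: mapping of action-icon id -> pending notification count.
    Returns the aggregate count to show on the folder icon, or None
    when no flag should be drawn."""
    total = sum(app_flags.values())
    return total if total > 0 else None

# application-7 (icon 5002-25) has 2 pending notifications:
print(folder_notification_flag({"5002-25": 2, "5002-19": 0}))  # 2
```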
In some embodiments, the device detects a selectable user interface object removal input and, in response to detecting the selectable user interface object removal input, the device removes the selectable user interface object from the folder view. For example, in FIG. 5WW, the device detects a contact 5152 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of action icons 5002-19 on the display (e.g., touch screen 112) and a subsequent movement 5154 of the contact 5152 across the touch-sensitive surface (e.g., touch screen 112) toward a location on the display (e.g., touch screen 112) that corresponds to a portion of the display outside of the folder view 5148. In some embodiments, the input is a quick gesture that does not specify a particular location outside of the folder (e.g., the gesture is a flick gesture, or a quick tap-and-drag gesture that does not include a pause outside of the folder view), and the device moves the selectable user interface object to an automatically determined location outside of the folder view on the display (e.g., touch screen 112). Continuing the example above, in response to detecting the contact 5152 and the subsequent movement 5154 of the contact, the device removes action icons 5002-19 from folder view 5148, closes the folder view, and displays action icons 5002-19 at a first open position in the arrangement of selectable user interface objects on the display (e.g., touch screen 112). In this example, action icons 5002-19 are displayed at the lower right corner of the three-by-four array of selectable user interface objects on the display (e.g., touch screen 112 in FIG. 5XX).
In some embodiments, when a selectable user interface object has been removed from a folder associated with a folder icon, the device updates the folder icon associated with the folder. For example, in FIG. 5WW, the folder icon 5004-1-b associated with the displayed folder view 5148 includes four scaled-down representations (e.g., "x4", "x5", "x6", and "x7") of selectable user interface objects contained within the folder associated with the folder icon 5004-1-b. In some embodiments, empty space in the folder icon indicates that the folder view includes space to add more selectable user interface objects, as shown by folder icon 5004-1-b in FIG. 5WW. However, after removing a selectable user interface object (e.g., action icons 5002-19) from the folder associated with the folder icon (e.g., 5004-1-b), the device changes the appearance of the folder icon (e.g., 5004-1-b in FIG. 5XX) to indicate that the selectable user interface object (e.g., 5002-19) has been removed from the folder. For example, in FIG. 5XX, folder icon 5004-1-b shows scaled-down representations (e.g., "x2", "x3", "x4", "x5", "x6", and "x7") that have been rearranged to indicate that there is one less selectable user interface object in the folder associated with folder icon 5004-1-b.
In some embodiments, the device detects an action icon selection input (e.g., tap gesture 5155 in FIG. 5XX at a location corresponding to action icons 5002-12 on touch screen 112) while the device is in the user interface reconfiguration mode, and in response to the action icon selection input while the device is in the user interface reconfiguration mode, the device does not activate an application (e.g., a camera application) associated with the action icon (e.g., camera application icons 5002-12). In some embodiments, the device detects a folder view display input (e.g., tap gesture 5156 in FIG. 5XX at a location corresponding to folder icon 5004-1-b). In response to detecting the folder view display input, the device displays a folder view 5158 on the display (e.g., touch screen 112) that includes the contents of the folder (e.g., action icons 5002-20, 5002-21, 5002-22, 5002-23, 5002-24, and 5002-25 on touch screen 112), as shown in FIG. 5YY.
In some embodiments, while the device is in the user interface reconfiguration mode, the device detects an input corresponding to a request to activate an application associated with one of the action icons (e.g., a tap gesture 5156 corresponding to a request to activate the application-7 application, which corresponds to the application-7 application icons 5002-25). However, in accordance with some embodiments, in response to detecting the input, the device does not activate the application while the device is in the user interface reconfiguration mode.
In some embodiments, the device detects a selectable user interface object removal input and, in response to detecting the selectable user interface object removal input, the device removes the selectable user interface object from the folder view. In some embodiments, when the selectable object removal input satisfies a predefined condition, the selectable user interface object is positioned within the arrangement of selectable user interface objects in accordance with the selectable object removal input. For example, in fig. 5YY, the device detects a contact 5162 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of action icons 5002-22 on the display (e.g., touch screen 112) and a subsequent movement of the contact 5162 across the touch-sensitive surface (e.g., movement across touch screen 112 from contact location 5162-a within folder view 5158 in fig. 5YY to a contact location in fig. 5ZZ that corresponds to a portion of touch screen 112 outside of folder view 5158). In some embodiments, the device detects the contact (e.g., at contact location 5162-b) on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to a location on the display (e.g., touch screen 112) outside of folder view 5158 for more than a threshold period of time, and in response, the device stops displaying the folder view, as shown in FIG. 5AAA.
In some embodiments, after ceasing to display the folder view, the device continues to detect movement 5166 of the contact 5162 (e.g., movement from the second contact location 5162-b on touch screen 112 in FIG. 5AAA to a third contact location 5162-c on touch screen 112 in FIG. 5BBB that corresponds to a location within the arrangement of selectable user interface objects on touch screen 112). In response to the continued movement 5166, the device moves the selectable user interface object (e.g., action icons 5002-22) on the display (e.g., touch screen 112) in accordance with the movement of the contact. In some embodiments, a selectable user interface object (e.g., action icon 5002) removed from a folder in this manner is placed in accordance with the selectable user interface object removal input. Continuing the example from above, the device detects termination of the input at a location on the display (e.g., touch screen 112) within the arrangement of selectable user interface objects (e.g., liftoff of contact 5162 from touch screen 112 at contact location 5162-c in FIG. 5BBB). As shown in FIG. 5BBB, the contact 5162-c and action icons 5002-22 are located between two of the other selectable user interface objects (e.g., 5002-6 and 5004-7), and upon detecting termination of the input (e.g., liftoff of contact 5162-c in FIG. 5BBB), the device displays the selectable user interface object at the location within the arrangement of selectable user interface objects indicated by the input (e.g., the application-4 action icons 5002-22 are displayed between the memo action icons 5002-6 and the games folder icon 5004-7 on touch screen 112 in FIG. 5CCC).
In some embodiments, when a selectable user interface object has been removed from a folder associated with a folder icon, the device updates the folder icon associated with the folder. For example, in FIG. 5YY, the folder icon 5004-1-b associated with the displayed folder view 5158 displays six scaled-down representations (e.g., "x2", "x3", "x4", "x5", "x6", and "x7") of selectable user interface objects contained within the folder associated with the folder icon 5004-1-b. However, after a selectable user interface object (e.g., action icons 5002-22) is removed from the folder associated with the folder icon (e.g., 5004-1-b), the device changes the appearance of the folder icon (e.g., from 5004-1-b in FIGS. 5XX-5ZZ to 5004-1-b in FIG. 5AAA), thereby indicating that the selectable user interface object (e.g., 5002-22) has been removed from the folder. For example, in FIG. 5AAA, folder icon 5004-1-b shows the remaining scaled-down representations (e.g., only "x2", "x3", "x5", "x6", and "x7") rearranged to indicate that there is one less selectable user interface object in the folder associated with folder icon 5004-1-b. Additionally, in some embodiments, the device stops displaying the scaled-down representation of the removed selectable user interface object within the folder icon. For example, the scaled-down representation "x4" is no longer displayed in folder icon 5004-1-b in FIG. 5CCC because action icons 5002-22 have been removed from the folder.
Attention is now directed to figs. 5CCC-5EEE, which show an exemplary user interface for navigating between multiple pages of selectable user interface objects, in accordance with some embodiments. In some embodiments, the device detects a folder view display input (e.g., a tap gesture 5168 in FIG. 5CCC at a location on touch screen 112 that corresponds to the location of folder icon 5004-2 on touch screen 112). In response to detecting the folder view display input, the device displays a folder view 5170 including the contents of the folder (e.g., action icons 5002-26, 5002-27, 5002-28, 5002-29, 5002-30, 5002-31, 5002-32, 5002-33, 5002-34, 5002-35, 5002-36, and 5002-37) on the display (e.g., touch screen 112), as shown in FIG. 5EEE.
In some embodiments, when the folder view display input is a request to display a folder view for a folder icon (e.g., 5004-2) in a tray (e.g., 5006 in FIG. 5CCC) of the user interface, the device displays an animation of the folder view expanding from the tray. For example, in FIG. 5CCC, the device detects a tap gesture 5168 on the folder icon 5004-2 in the tray 5006 and displays a folder view 5170 of the folder in FIG. 5EEE. In some embodiments, the device displays a transitional animation prior to displaying the folder view, as shown in FIG. 5DDD. For example, in response to detecting the tap gesture 5168, the device divides the wallpaper background into a first portion 5172 and a second portion 5174 and displays an animation of the wallpaper background sliding apart (e.g., the second portion 5174 moves away from the first portion 5172), thereby revealing selectable user interface objects (e.g., 5002-34, 5002-35, 5002-36, and 5002-37 in FIG. 5DDD) that appear to be located beneath the wallpaper background. At the end of the animation, the contents of the folder, or a portion of the contents, are displayed in folder view 5170 on the display (e.g., touch screen 112).
In some embodiments, the folder includes more selectable user interface objects than can be displayed in the folder view at one time (e.g., 5170 in FIG. 5EEE). In some embodiments, the folder has a maximum number of selectable user interface objects that can be added to the folder, where the maximum number is based on the maximum number of selectable user interface objects that can be displayed in the folder view for the folder. For example, in FIG. 5EEE, only 12 selectable user interface objects could otherwise be added to the folder associated with folder view 5170. However, in some embodiments, the folder view contains multiple "pages" or groups of selectable user interface objects, and the folder can accommodate additional selectable user interface objects by displaying, as part of one or more subsequent groups, selectable user interface objects that do not fit in the first group of selectable user interface objects (e.g., action icons 5002-26, 5002-27, 5002-28, 5002-29, 5002-30, 5002-31, 5002-32, 5002-33, 5002-34, 5002-35, 5002-36, and 5002-37). For example, in FIG. 5EEE, the device detects a next-page input that includes a contact 5176 and subsequent movement 5178 of the contact across the touch-sensitive surface (e.g., touch screen 112). In response to detecting the next-page input, the device displays a second group of selectable user interface objects (e.g., action icons 5002-38, 5002-39, 5002-40, 5002-41, 5002-42, and 5002-43 in FIG. 5FFF) within folder view 5170 for the folder. In other words, the folder includes eighteen selectable user interface objects, with twelve selectable user interface objects in a first page and six selectable user interface objects in a second page.
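The paging scheme just described (eighteen icons split into a page of twelve and a page of six) amounts to chunking the folder's contents into fixed-size pages. The following Python sketch illustrates this under the assumption of twelve icons per page, as in figs. 5EEE-5FFF; it is not the patent's implementation.

```python
def paginate(icons, per_page=12):
    """Split a folder's icons into fixed-size pages of at most `per_page`."""
    return [icons[i:i + per_page] for i in range(0, len(icons), per_page)]

pages = paginate([f"icon-{n}" for n in range(18)])
print(len(pages))     # 2
print(len(pages[0]))  # 12
print(len(pages[1]))  # 6
```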
Attention is now directed to figs. 5GGG-5MMM, which show an exemplary user interface for removing selectable user interface objects from folders, in accordance with some embodiments. In some embodiments, the device detects a folder view display input (e.g., a tap gesture 5168 in FIG. 5CCC at a location corresponding to the folder icon 5004-2 on touch screen 112). In response to detecting the folder view display input, the device displays a folder view 5182 including the contents of the folder (e.g., action icons 5002-26, 5002-27, 5002-28, 5002-29, 5002-30, 5002-31, 5002-32, 5002-33, 5002-34, 5002-35, 5002-36, and 5002-37), as shown in FIG. 5GGG.
In some embodiments, the folder view (e.g., 5182) occupies all or substantially all of the display (e.g., touch screen 112); in some of these embodiments, the device displays a selectable user interface object removal area (e.g., 5184 in FIG. 5GGG). In some embodiments, in response to detecting a removal request corresponding to a request to move a respective selectable user interface object into selectable user interface object removal area 5184, the device removes the respective selectable user interface object from the folder. For example, in FIG. 5GGG, the device detects a contact 5186 and movement 5188 of the contact (e.g., movement from a first contact location 5186-a in FIG. 5GGG corresponding to the location of a respective selectable user interface object 5002-32 on touch screen 112 to a second contact location 5186-b in FIG. 5HHH on touch screen 112 corresponding to a location proximate to or within selectable user interface object removal area 5184). Continuing the present example, the device moves the respective selectable user interface object (e.g., action icons 5002-32) to selectable user interface object removal area 5184. In some embodiments, in response to detecting termination of the input (e.g., liftoff of contact 5186-b in FIG. 5HHH), the device stops displaying the respective selectable user interface object (e.g., action icons 5002-32) in folder view 5182 and automatically rearranges the selectable user interface objects within folder view 5182 so as to eliminate any gaps in the arrangement of selectable user interface objects. For example, in FIG. 5III, the selectable user interface objects have been rearranged so as to fill the gap that remained in FIG. 5HHH after the respective selectable user interface object (e.g., action icons 5002-32) was removed from the folder view.
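The removal-area behavior can be sketched as a hit test on the drop location followed by repacking the remaining icons (a flat list closes the gap automatically, as in fig. 5III). This Python model is illustrative only; the rectangle coordinates and names are assumptions, not taken from the patent.

```python
def point_in_rect(x, y, rect):
    """rect is (left, top, width, height); edges follow half-open convention."""
    left, top, width, height = rect
    return left <= x < left + width and top <= y < top + height

def drop_icon(folder_icons, icon, drop_point, removal_area):
    """If the drag ends inside the removal area, take the icon out of the
    folder; the remaining icons repack, leaving no gap in the arrangement."""
    if point_in_rect(drop_point[0], drop_point[1], removal_area):
        return [i for i in folder_icons if i != icon]
    return list(folder_icons)

# Hypothetical removal area along the bottom of a 320x480 screen:
removal_area = (0, 440, 320, 40)
print(drop_icon(["5002-31", "5002-32", "5002-33"], "5002-32", (10, 450), removal_area))
```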
In some embodiments, the display of the folder view is automatically stopped after a selectable user interface object (e.g., action icons 5002-32) is removed from the folder (e.g., the device automatically switches from the user interface shown in FIG. 5III to the user interface shown in FIG. 5JJJ). In some embodiments, the device detects an exit folder view input (e.g., a tap gesture 5190 on the "exit folder" icon in the upper right corner of folder view 5182 in fig. 5III). In response to detecting the exit folder view input, the device stops displaying the folder view and redisplays the home screen, as shown in fig. 5JJJ. In some embodiments, selectable user interface objects (e.g., 5002-32) that have been removed from the folder are displayed on the home screen, as shown in FIG. 5JJJ.
In some embodiments, the device detects a folder view display input (e.g., a tap gesture 5192 in fig. 5JJJ at a location corresponding to folder icon 5004-2). In response to detecting the folder view display input, the device redisplays the folder view 5182 including the modified contents of the folder (e.g., action icons 5002-26, 5002-27, 5002-28, 5002-29, 5002-30, 5002-31, 5002-33, 5002-34, 5002-35, 5002-36, and 5002-37), as shown in FIG. 5KKK. In some embodiments, instead of or in addition to the selectable user interface object removal area described above, the device displays object modification targets associated with one or more of the selectable user interface objects. For example, in FIG. 5KKK, each of the selectable user interface objects has an object modification target associated with it (e.g., action icons 5002-37 have a respective object modification target 5194).
In some embodiments, when the device detects a request to activate an object modification target for a respective selectable user interface object (e.g., a tap gesture 5196 at a location on touch screen 112 corresponding to the location of the object modification target for action icons 5002-37), the device displays an object modification dialog. For example, in fig. 5LLL, the device displays a pop-up dialog 5198 that provides the user with options for modifying the selectable user interface object (e.g., delete action icons 5002-37, remove action icons 5002-37 from the folder, or cancel the object modification operation). In this example, in response to an input corresponding to a request to delete the selectable user interface object (e.g., a tap gesture 5200 on the delete button), the device deletes the selectable user interface object (e.g., removes action icons 5002-37 from the folder associated with the folder view and from the device entirely, so that they are not displayed on the home screen or in any other folder view, as shown in fig. 5MMM). In some embodiments, when the selectable user interface object is deleted, the application associated with the selectable user interface object is deleted from the device. In this example, in response to an input corresponding to a request to cancel the object modification operation (e.g., a tap gesture 5202 on the cancel icon), the device stops displaying the object modification dialog 5198 without modifying the selectable user interface object (e.g., action icons 5002-37), thereby returning to the user interface displayed in fig. 5KKK. In this example, in response to an input corresponding to a request to remove the selectable user interface object from the folder (e.g., a tap gesture 5204 on the remove button), the device removes the selectable user interface object from the folder without removing the selectable user interface object from the device (e.g., removes action icons 5002-37 from the folder associated with the folder view and displays action icons 5002-37 on the home screen, as shown in fig. 5NNN).
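The three dialog outcomes described above (delete, remove from folder, cancel) can be summarized in a small state-transition sketch. This Python model is illustrative only; the action names and data model are hypothetical, not from the patent.

```python
def handle_object_modification(action, folder, home_screen, icon):
    """'delete' removes the icon (and, in some embodiments, its application)
    from the device; 'remove' moves it from the folder to the home screen;
    'cancel' (or any other action) changes nothing."""
    folder, home_screen = list(folder), list(home_screen)
    if action == "delete":
        folder = [i for i in folder if i != icon]
    elif action == "remove":
        folder = [i for i in folder if i != icon]
        home_screen.append(icon)
    return folder, home_screen

print(handle_object_modification("remove", ["5002-36", "5002-37"], ["5002-1"], "5002-37"))
```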
Attention is now directed to figs. 5NNN-5OOO, which illustrate an exemplary user interface for displaying a folder view while in a normal operation mode, in accordance with some embodiments. In some embodiments, the device detects an input corresponding to a request to exit the user interface reconfiguration mode and return to the normal operation mode. For example, in fig. 5NNN, the device detects a request to return to the normal operation mode (e.g., a press input 5206 on home screen button 204 in fig. 5NNN). In response to the request to return to the normal operation mode, the device returns to the normal operation mode, such that the selectable user interface objects (e.g., action icons 5002 and folder icons 5004) in figs. 5OOO-5PPP can no longer be rearranged (although, in some embodiments, the arrangement of selectable user interface objects can be scrolled in one or two dimensions).
In some embodiments, selection of a respective action icon 5002 results in activation of an application associated with the respective action icon while the device is in the normal operation mode. For example, in fig. 5OOO, in response to detecting selection of the photo action icon (e.g., detecting a tap gesture 5208 at a location on the touch-sensitive surface corresponding to photo action icons 5002-1), the device displays the photo application (e.g., launches the photo application if it is not running, or simply displays the photo application if it is already running). Conversely, in some embodiments, in response to detecting selection of a respective folder icon 5004, a folder view for the folder is displayed. For example, in fig. 5OOO, after detecting selection of the games folder icon (e.g., detecting a tap gesture 5210 at a location on the touch-sensitive surface corresponding to games folder icons 5004-7), the device displays a folder view 5212 for the games folder. In some embodiments, selection of a respective action icon 5002 within the folder view (e.g., folder view 5212) results in activation of an application associated with the respective action icon while the device is in the normal operation mode. For example, in response to detecting selection of the stock action icon (e.g., detecting a tap gesture 5214 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to stock action icons 5002-9), the device displays the stock application on the display (e.g., touch screen 112) (e.g., launches the stock application if it is not running, or simply displays the stock application if it is already running).
Attention is now directed to figs. 5QQQ-5TTT, which show an exemplary user interface for displaying a folder view while in a normal mode of operation, in accordance with some embodiments. In fig. 5QQQ, the device displays a plurality of selectable user interface objects (e.g., action icons 5002 and folder icons 5004) on the home screen. In some embodiments, the device detects an input requesting selection of a folder icon. For example, in FIG. 5QQQ, the device detects a contact 5216 on folder icons 5004-12. The folder icons 5004-12 include a plurality of scaled-down representations of selectable icons associated with the folder (e.g., scaled-down representations "y1", "y2", "y3", "y4", "y5", "y6", "y7", "y8", and "y9"). As explained below, in some embodiments, the scaled-down representations of the selectable icons included in folder icons 5004-12 correspond to a subset of the selectable icons within the folder corresponding to folder icons 5004-12.
Figs. 5QQQ-5TTT illustrate an animated transition between the home screen (e.g., an application launch user interface) shown in fig. 5QQQ and the folder view corresponding to folder icons 5004-12 shown in fig. 5TTT. Specifically, in this example, the plurality of selectable user interface objects are displayed in a respective arrangement in the home screen, and the animation expands the respective arrangement such that the folder icons 5004-12 are enlarged and move toward the center of the display (e.g., in FIGS. 5QQQ-5RRR, 5RRR-5SSS, and 5SSS-5TTT, the device progressively expands the arrangement such that the folder icons 5004-12 are enlarged and move toward the center of touch screen 112). During the animation, the device stops displaying the enlarged representations of the selectable user interface objects other than folder icons 5004-12 as they move "off" of the display. After folder icons 5004-12 have been enlarged, the display of the enlarged folder icons 5004-12 is replaced with the folder view, as illustrated in FIG. 5TTT. In some embodiments, in response to a request to exit the folder view shown in fig. 5TTT (e.g., in response to activation of home screen button 204), the device reverses the animated transition shown in figs. 5QQQ-5TTT (e.g., by stepping back from fig. 5TTT to fig. 5SSS, to fig. 5RRR, and to fig. 5QQQ).
Attention is now directed to figs. 5TTT-5VVV, which illustrate an exemplary user interface for displaying a page indicator icon 5217 in the folder view and updating its appearance in accordance with the currently displayed page of the folder view.
As shown in fig. 5TTT, the folder view associated with folder icons 5004-12 (opened via the transition of figs. 5QQQ-5SSS) displays a first page including a first subset of the selectable icons in the folder. In some embodiments, the folder view includes space to simultaneously display no more than a particular number of selectable icons, the particular number being less than the total number of selectable icons within the folder. For example, the folder associated with folder icons 5004-12 includes action icons (e.g., application icons) 5002-51, 5002-52, 5002-53, 5002-54, 5002-55, 5002-56, 5002-57, 5002-58, and 5002-59 displayed in a first page (FIG. 5TTT) of the folder view, and also includes at least action icons 5002-60, 5002-61, 5002-62, 5002-63, 5002-64, 5002-65, 5002-66, 5002-67, and 5002-68 displayed in a second page (FIG. 5VVV) of the folder view. In this example, the folder view includes space to display nine or fewer selectable icons in each page.
In some embodiments, the folder view associated with folder icons 5004-12 also displays page indicator icons 5217, the page indicator icons 5217 indicating the number of distinct pages among which the selectable icons in the folder are divided. For example, in fig. 5TTT, the folder view displays three page indicators (e.g., page indicator 5217-1 corresponding to a first page of selectable icons in the folder, page indicator 5217-2 corresponding to a second page of selectable icons in the folder, and page indicator 5217-3 corresponding to a third page of selectable icons in the folder), which indicate that the selectable icons in the folder are divided among three pages. In some embodiments, the page indicator for the currently selected page is displayed differently from the one or more page indicators for the unselected pages. For example, in fig. 5TTT, page indicator 5217-1, corresponding to the currently selected page (e.g., the first page), is displayed with a first appearance (e.g., as a filled circle), while page indicators 5217-2 and 5217-3, corresponding to the unselected pages, are displayed with a second appearance (e.g., as unfilled circles), to enable a user to determine his or her location within the multiple pages.
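Rendering the indicator row just described is a one-liner: one dot per page, with the selected page drawn differently. The Python sketch below uses filled/unfilled circle characters purely for illustration; it is not the patent's implementation.

```python
def page_indicators(page_count, current_page):
    """One dot per page; the currently selected page is drawn filled."""
    return "".join("●" if i == current_page else "○" for i in range(page_count))

print(page_indicators(3, 0))  # first page selected, as in fig. 5TTT: ●○○
print(page_indicators(3, 1))  # second page selected, as in fig. 5VVV: ○●○
```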
In some embodiments, the device detects an input corresponding to a request to display a next page of the folder view. For example, in fig. 5TTT, the device detects a contact 5218 at location 5218-a on a predefined area of the folder view (e.g., in an area that does not correspond to a request to initiate an action associated with an action icon). In some embodiments, the input corresponding to the request to display the next page of the folder view includes a predefined gesture. For example, in fig. 5TTT, while contact 5218 is continuously detected, the device detects a gesture 5220 across the display. As shown in fig. 5UUU, in some embodiments, the device displays an animated transition from the first page to the second page in accordance with gesture 5220 (e.g., the device displays the pages "scrolling" or "sliding" between the first page and the second page). As further shown in fig. 5UUU, the device detects the release (or liftoff) of contact 5218 at location 5218-b, and in response to detecting the liftoff of contact 5218, the device displays the selectable icons of the second page, as shown in fig. 5VVV.
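Navigation between pages reduces to incrementing or decrementing a page index, clamped to the folder's page range. A minimal sketch, assuming zero-based page indices (names are hypothetical):

```python
def next_page(current_page, page_count):
    """Advance one page on a swipe; stop at the last page."""
    return min(current_page + 1, page_count - 1)

def previous_page(current_page):
    """Go back one page on a reverse swipe; stop at the first page."""
    return max(current_page - 1, 0)

print(next_page(0, 3))  # swiping from the first of three pages lands on page 1
```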
Figs. 5VVV-5YYY illustrate an animated transition between the second page of the folder view and an application view. In some embodiments, the device detects an input corresponding to a request to initiate an action associated with a selectable user interface object. For example, in FIG. 5VVV, a tap gesture including contact 5222 is detected on action icons 5002-66, which is interpreted by the device as a request to launch "application y16". Before this input is detected, in this example, the plurality of selectable user interface objects are displayed in a respective arrangement in the second page. Upon launching "application y16", as shown in FIG. 5WWW, the animation expands the respective arrangement such that action icons 5002-66 are enlarged and move toward the center of the display. As shown in FIG. 5XXX, during the animation, the device stops displaying the enlarged representations of the selectable user interface objects other than action icons 5002-66 as they move "off" of the display. After action icons 5002-66 have been enlarged, the display of the enlarged action icons 5002-66 is replaced with the application view 5224, as illustrated in fig. 5YYY. In some embodiments, in response to a request to exit the application view 5224 shown in fig. 5YYY (e.g., in response to activation of home screen button 204), the device reverses the animated transition shown in figs. 5VVV-5YYY (e.g., by stepping back from fig. 5YYY to fig. 5XXX, to fig. 5WWW, and to fig. 5VVV).
Attention is now directed to figs. 5ZZZ-5CCCC, which illustrate an exemplary user interface for removing a respective icon from a folder. Fig. 5ZZZ is similar to fig. 5VVV. However, in fig. 5ZZZ, the device detects an input 5226 corresponding to a request to enter a user interface reconfiguration mode. In some embodiments, the input corresponding to the request to enter the user interface reconfiguration mode is distinguished from a request to activate an action icon based on predefined criteria. For example, when a contact is detected on an action icon for more than a predetermined amount of time (e.g., 0.5, 1, or 2 seconds), the device interprets the contact as a request to enter the user interface reconfiguration mode; and when a contact is detected on an action icon for less than the predetermined amount of time (e.g., the device detects a tap gesture instead of a long-press gesture), the device launches the application (e.g., as shown in figs. 5VVV-5YYY) instead of entering the user interface reconfiguration mode (e.g., as shown in figs. 5ZZZ-5CCCC).
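The tap/long-press distinction above is a duration threshold on the contact. A minimal sketch, using 0.5 seconds (one of the example values cited above) as the assumed threshold:

```python
LONG_PRESS_SECONDS = 0.5  # one of the example thresholds cited above

def classify_contact(duration_seconds):
    """A long press enters the reconfiguration mode; a shorter tap launches
    the application associated with the action icon."""
    if duration_seconds >= LONG_PRESS_SECONDS:
        return "enter_reconfiguration_mode"
    return "launch_application"

print(classify_contact(0.1))  # launch_application
print(classify_contact(1.0))  # enter_reconfiguration_mode
```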
In fig. 5AAAA, the device has entered the user interface reconfiguration mode. In some embodiments, the selectable user interface objects (e.g., action icons 5002) display a visual indication that the device is in the user interface reconfiguration mode. For example, as shown in fig. 5AAAA, the action icons are gently shaken as if they were floating on water (e.g., each respective action icon oscillates about a respective average position of the action icon on the display). Additionally, in some embodiments, at least some of the action icons are associated with an object removal flag (e.g., 5010 in FIG. 5AAAA) while in the user interface reconfiguration mode, and an action icon associated with the object removal flag is removed (e.g., deleted) from the page of the folder view when the device detects activation of the object removal flag. In some embodiments, deleting the action icon also deletes the application associated with the action icon. In some embodiments, deleting the action icon deletes the icon without deleting the application associated with the action icon.
In some embodiments, the device detects an input corresponding to a request to move a respective icon from a page of the folder view to a respective location on the display that is outside of the folder. For example, in fig. 5AAAA, the device detects contact 5226 at 5226-a and, while continuously detecting contact 5226, detects a gesture 5230 (e.g., a drag gesture) corresponding to a request to move action icons 5002-66 to the bottom region of the display. As shown in fig. 5BBBB, action icons 5002-66 are moved (e.g., dragged) to a bottom region of the display, which in this example is a predetermined icon removal region. The contact 5226 is released at 5226-b (e.g., liftoff of contact 5226 is detected), causing action icons 5002-66 to "fall" into the icon removal region 5231. In some embodiments, the action icon is removed from the folder view in accordance with a determination that the respective location is within the predetermined icon removal region. In some embodiments, removing the action icon from the folder view results in the action icon being placed at a different location within the user interface. For example, as shown in FIG. 5CCCC, action icons 5002-66 are now located on the home screen while the device is still in the user interface reconfiguration mode.
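The removal-region determination above amounts to a hit test on the drop location. A minimal sketch follows; the `Rect` type, the display dimensions, and the region geometry are illustrative assumptions only:

```python
# Hypothetical sketch: decide whether a dragged icon, released at point (x, y),
# falls inside the predetermined icon removal region at the bottom of the
# display, and if so move it from the folder page to the home screen.

from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom

# Assumed 320x480-point display with a removal strip along the bottom edge.
ICON_REMOVAL_REGION = Rect(left=0, top=440, right=320, bottom=480)

def on_drag_release(x, y, icon, folder_page, home_screen):
    """On liftoff at (x, y), remove `icon` from the folder page if the release
    location is within the icon removal region."""
    if ICON_REMOVAL_REGION.contains(x, y):
        folder_page.remove(icon)
        home_screen.append(icon)  # the icon now lives on the home screen
```

A release inside the strip (e.g., at y = 460) removes the icon from the folder; a release elsewhere leaves the folder page unchanged.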
Turning now to fig. 5ZZZ and 5DDDD-5FFFF, which illustrate an exemplary user interface for changing the page on which a respective icon is located within a folder view in response to detecting movement of the icon into a page-change region (e.g., previous-page region 5233-1 or next-page region 5233-2). Fig. 5DDDD is similar to fig. 5AAAA, but rather than contact 5226 moving from 5226-a to 5226-b and releasing action icons 5002-66 in the icon removal region 5231, contact 5226 moves to 5226-c (FIG. 5DDDD), and after detecting contact 5226 at location 5226-c (e.g., within page-change region 5233-2) for more than a predetermined amount of time (e.g., 0.1, 0.2, 0.3, 0.4, or 0.5 seconds), the device displays the next page of the folder view (e.g., the third page of the folder view), as shown in fig. 5EEEE; and when the device detects liftoff of contact 5226 while displaying the next page of the folder view, action icons 5002-66 move to the next page (e.g., the third page of the folder view). In some embodiments, the device rearranges the other action icons 5002 (i.e., action icons other than action icons 5002-66) in the second page in response to movement of action icons 5002-66. In some embodiments, which page is displayed changes in accordance with which page-change region action icons 5002-66 are moved over. For example, page-change region 5233-2 is the next-page-change region, and the device therefore transitions from displaying the second page to displaying the third page (fig. 5EEEE-5FFFF) in response to detecting movement of the icon into page-change region 5233-2; page-change region 5233-1 is the previous-page-change region, and the device would transition from displaying the second page to displaying the first page of the folder view in response to detecting movement of the icon into page-change region 5233-1.
In this example, the third page of the folder view includes action icons 5002-69 and 5002-70 that were originally present in the third page of the folder view.
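The page-change behavior described above can be sketched as a dwell check plus a clamped page step. The function name, dwell threshold, and region labels below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch: while an icon is being dragged, holding it inside a
# page-change region for more than a dwell threshold advances the folder view
# to the previous or next page, clamped to the valid page range.

DWELL_THRESHOLD_S = 0.3  # the disclosure suggests e.g. 0.1-0.5 seconds

def page_after_dwell(current_page, num_pages, region, dwell_s):
    """Return the page to display after the dragged icon has dwelled for
    `dwell_s` seconds in `region` ("previous", "next", or None)."""
    if region is None or dwell_s <= DWELL_THRESHOLD_S:
        return current_page  # no region, or not held long enough: no change
    if region == "next":
        return min(current_page + 1, num_pages - 1)
    if region == "previous":
        return max(current_page - 1, 0)
    return current_page
```

With three pages (indices 0-2), dwelling in the next-page region while on the second page (index 1) yields the third page (index 2), mirroring the FIG. 5DDDD-5EEEE transition.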
Fig. 5FFFF also shows the release of contact 5226 (e.g., liftoff of contact 5226 at location 5226-c), which in this example is interpreted by the device as a request to drop action icons 5002-66 in the third page of the folder view. As shown in fig. 5GGGG, while still in the user interface reconfiguration mode, the device dynamically rearranges the action icons 5002 in the third page of the folder view to accommodate the addition of action icons 5002-66.
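The dynamic rearrangement on drop can be modeled as inserting the icon at a target slot and shifting later icons back. This is an illustrative sketch only; the function name and slot-clamping behavior are assumptions:

```python
# Hypothetical sketch: when a dragged icon is dropped onto a page, insert it
# at the target slot and shift the remaining icons to later slots, mirroring
# the rearrangement described for FIG. 5GGGG.

def drop_icon(page_icons, icon, slot):
    """Return a new arrangement with `icon` inserted at index `slot`
    (clamped to the valid range), shifting later icons one slot back."""
    slot = max(0, min(slot, len(page_icons)))
    return page_icons[:slot] + [icon] + page_icons[slot:]
```

For example, dropping an icon at slot 1 of a three-icon page shifts the second and third icons each one position later.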
In some embodiments, while in the reconfiguration mode, the device detects an input corresponding to a request to exit the reconfiguration mode. For example, in response to detecting activation of the home screen button 204 in fig. 5GGGG by contact 5234, the device exits the reconfiguration mode, returns to the normal mode of operation (e.g., non-reconfiguration mode of operation), and displays a folder view, as shown in FIG. 5HHHH.
Attention is now directed to fig. 5HHHH-5JJJJ, which illustrate an exemplary user interface for an animated transition from a respective page of the folder view to the home screen.
In some embodiments, while displaying a third page (or a second page, etc.) of the folder view, the device detects an input corresponding to a request to close the folder view (e.g., activation of a home screen button). In response to detecting the input corresponding to the request to close the folder view, the device stops displaying the folder view and instead displays the home screen. For example, in FIG. 5HHHH, the device detects that home screen button 204 is activated by contact 5236. In response to the request to close the folder view, in FIG. 5IIII, the device displays a page of the home screen that includes folder icon 5004-12 corresponding to the folder view. In some embodiments, when the home screen is displayed, the folder icon for the folder includes scaled-down representations of the action icons from the page of the folder view that was most recently displayed. In some examples, the device displays an animated transition between FIG. 5HHHH and 5IIII that is similar to the reverse of the animation displayed in FIG. 5QQQ-5TTT, except that the folder view has a different appearance because the third page of the folder view contains different scaled-down representations, and thus the appearance of the folder icon, as it is reduced in size, includes the scaled-down representations shown in FIG. 5IIII. For example, FIG. 5HHHH shows a third page of the folder view that includes action icons 5002-69, 5002-70, and 5002-66. Folder icon 5004-12 includes scaled-down representations of action icons 5002-69, 5002-70, and 5002-66 upon navigating away from the third page of the folder view to the home screen (FIG. 5IIII).
In some embodiments, as shown in FIG. 5JJJJ, the device then transitions to displaying, in folder icon 5004-12, a reduced-scale representation of the first page of action icons in the folder instead of a reduced-scale representation of the third page of action icons in the folder.
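The choice of which page's icons the folder icon miniaturizes can be sketched as follows. The thumbnail-slot count and function name are illustrative assumptions; the disclosure does not fix these values:

```python
# Hypothetical sketch: the folder icon's thumbnail shows reduced-scale
# representations of the icons from a chosen folder page (the most recently
# displayed page at first, optionally transitioning to the first page later),
# limited to the number of thumbnail slots available in the folder icon.

THUMBNAIL_SLOTS = 9  # assumed 3x3 grid of miniature icons in the folder icon

def folder_icon_thumbnails(pages, displayed_page_index):
    """Return the icon ids to render, reduced in scale, inside the folder
    icon, taken from the page at `displayed_page_index`."""
    page = pages[displayed_page_index]
    return page[:THUMBNAIL_SLOTS]
```

For example, after closing the folder from its third page, the folder icon would first show `folder_icon_thumbnails(pages, 2)` and could then transition to `folder_icon_thumbnails(pages, 0)`.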
Attention is now directed to FIG. 5JJJJ-5LLLL, which illustrate an exemplary user interface for navigating to different pages of selectable user interface objects. In some embodiments, the device detects an input corresponding to a request to display a different page of a plurality of distinct, separately displayed pages of selectable user interface objects. For example, FIG. 5JJJJ shows that gesture 5238 is detected beginning at 5238-a. FIG. 5KKKK shows that gesture 5238 ends at 5238-b (e.g., liftoff of the contact is detected). As also shown in FIG. 5KKKK, in response to the gesture, which in this example is a swipe gesture, the device displays an animated transition between the third page and the second page; optionally, if the transition is not complete when liftoff of contact 5238 is detected, the device completes the animated transition, thereby ceasing to display the third page and instead displaying the second page (as shown in FIG. 5LLLL).
Fig. 6A-6E are flowcharts illustrating a method 600 of creating a new folder, according to some embodiments. The method 600 is performed at a multifunction device (e.g., device 300, fig. 3, or portable multifunction device 100, fig. 1) having a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display (e.g., 112 in FIGS. 5A-5N and 5P-5LLLL) and the touch-sensitive surface is located on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some of the operations in method 600 may be combined and/or the order of some of the operations may be changed.
As described below, method 600 provides an intuitive way to create a new folder. The method reduces the cognitive burden on a user when creating a new folder, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling users to create new folders faster and more efficiently conserves power and increases the time between battery charges.
The device displays (602) a plurality of selectable user interface objects on the display (e.g., in fig. 5M, the device displays a plurality of action icons 5002 and a plurality of folder icons 5004 on the touch screen 112). In some embodiments, the selectable user interface objects are (604) action icons 5002. For example, the action icons 5002 may include one or more activatable icons representing software programs (e.g., photo application icon 5002-1, clock application icon 5002-2, browser application icon 5002-3, card application icon 5002-4, weather application icon 5002-5, memo application icon 5002-6, text application icon 5002-7, map application icon 5002-8, stock application icon 5002-9, camera application icon 5002-12, racing application icon 5002-13, email application icon 5002-14, phone application icon 5002-15, and iPod application icon 5002-16), files (e.g., document action icon 5002-11), bookmarks (e.g., bookmark action icon 5002-10), and the like. Likewise, a first folder icon 5004-1-b is associated with a first folder and a second folder icon 5004-2 is associated with a second folder.
In response to detecting an input corresponding to a selection of a respective selectable user interface object, the device activates (606) an application associated with the respective selectable user interface object. In some embodiments, the application is activated only when the device is in a normal operating mode. For example, selecting card application icon 5002-4 in FIG. 5A (e.g., while the device is in the normal operating mode) will launch a card game application. It should be appreciated that when an action icon 5002 represents a file, detecting an input corresponding to selection of the action icon causes the device to display an application for viewing and/or editing the file. For example, if the device detects an input (e.g., a tap gesture) corresponding to a selection of a bookmark icon (e.g., 5002-10 in FIG. 5A) associated with a bookmarked web page, the device will launch a web browser and navigate to the bookmarked web page. As another example, if the device detects an input corresponding to a selection of a document icon (e.g., 5002-11 in fig. 5A) associated with a spreadsheet document, the device will launch a spreadsheet editing/viewing application and display the spreadsheet document within it. In some embodiments, the action icon is an application launch icon, and selecting the action icon launches the application if the application is not currently running, or displays the application if the application is currently running but hidden from view. In other words, in response to detecting selection of the action icon, the device displays a view of the application; if the application is not running when the input is detected, the device must first launch the application, whereas if the application is already running when the input is detected, the device may simply display a view of the application without launching it.
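The launch-or-display dispatch in the paragraph above can be sketched briefly. This is an illustrative model; the function name and the string return values are assumptions used only for demonstration:

```python
# Hypothetical sketch: selecting an application launch icon either launches
# the application (if it is not currently running) or simply redisplays its
# view (if it is running but hidden from view).

def select_action_icon(app, running_apps):
    """Handle selection of the launch icon for `app`, given the set of
    currently running applications (mutated in place on launch)."""
    if app not in running_apps:
        running_apps.add(app)      # not running: launch it first
        return f"launched {app}"
    return f"displayed {app}"      # already running: just show its view
```

Selecting the same icon twice in a row therefore launches the application once and thereafter merely brings its view back on screen.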
The device detects (608) a first input. For example, as shown in fig. 5M, the device detects finger contact 5040 at a location (e.g., first location 5040-a in fig. 5M) on the touch-sensitive surface corresponding to a first object (e.g., card application icon 5002-4) and detects subsequent movement 5042 of the finger contact across the touch-sensitive surface (e.g., to second location 5040-b in fig. 5N). In some embodiments, the first input is a gesture (e.g., a finger drag gesture) detected (610) on the touch-sensitive surface. In some embodiments, the touch-sensitive surface is distinct from the display. In some embodiments, the touch-sensitive surface is combined with a display into touch screen 112.
In response to detecting the first input, the device moves (612) a first object of the plurality of selectable user interface objects across the display (e.g., touch screen 112) to a location on the display (e.g., touch screen 112) that is proximate to a second object of the plurality of selectable user interface objects. For example, in FIG. 5N, the card application icon 5002-4 has moved from a previous position to a current position (e.g., 5002-4 in FIG. 5N) that is proximate to the second object (e.g., the racing application icon 5002-13 in FIG. 5N). In some embodiments, the location is on or at least partially overlaps the second object or an activation area for the second object, as shown in FIG. 5N, where the card application icons 5002-4 partially overlap the racing application icons 5002-13. In some embodiments, the location is on or at least partially overlaps with an activation region for the second object, as described in more detail below with reference to method 900.
The device detects (614) that the first input meets the predefined folder creation criteria while the first object is proximate to the second object. In some embodiments, detecting that the first input meets the predefined folder creation criteria includes detecting (616) termination of the first input when the first object is proximate to the second object. For example, in FIG. 5N, the device detects a lift of contact 5040-b when the card application icon 5002-4 overlaps with the racing application icon 5002-13. As another example, in fig. 5N, the device detects a pause in contact 5040 when the card application icons 5002-4 overlap with the racing application icons 5002-13 for more than a predetermined period of time (e.g., 0.5 seconds, 1 second, 1.5 seconds, or any reasonable period of time). In some embodiments, the device (618) is in a user interface reconfiguration mode (i.e., not in a normal operating mode) when the first input is detected, as described in more detail above. Additionally, it should be appreciated that in some embodiments, the selectable user interface object cannot be moved (e.g., repositioned within an arrangement of selectable user interface objects) while the device is in the normal operating mode.
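The two alternative folder-creation criteria above (termination of the input while the objects are proximate, or a sufficiently long pause) can be sketched as a single predicate. The function name and threshold are illustrative assumptions:

```python
# Hypothetical sketch: the predefined folder creation criteria are met when
# the dragged first object is proximate to (overlapping) the second object
# and either the input terminates (liftoff) or the contact pauses over the
# second object for more than a predetermined period.

PAUSE_THRESHOLD_S = 0.5  # the disclosure suggests e.g. 0.5, 1, or 1.5 seconds

def meets_folder_creation_criteria(overlapping, input_terminated, pause_s):
    """Return True when the first input satisfies the predefined folder
    creation criteria while the first object is proximate to the second."""
    if not overlapping:
        return False  # criteria can only be met while the objects are proximate
    return input_terminated or pause_s > PAUSE_THRESHOLD_S
```

For instance, a liftoff while the card icon overlaps the racing icon satisfies the criteria immediately, as does a one-second pause without liftoff.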
Operations 622-636 are performed in response to detecting that the first input meets the predefined folder creation criteria while the first object is proximate to the second object (620). In some embodiments, the device stops (622) displaying the first object and the second object in response to detecting that the first input meets the predefined folder creation criteria while the first object is proximate to the second object (e.g., as shown in fig. 5N and 5S). For example, in FIG. 5S, after the device has detected that the first input meets the predefined folder creation criteria (e.g., by detecting that the input is terminated or that the input is paused for more than a predetermined period of time), the card application icon 5002-4 and the racing application icon 5002-13 previously displayed in FIG. 5N are no longer displayed in FIG. 5S. Instead, folder icon 5004-7 is displayed, representing a folder containing the card application icon and the racing application icon. In some embodiments, ceasing to display the second object includes displaying an animation of the second object transforming into a folder icon corresponding to the folder. For example, in FIG. 5S, folder icon 5004-7 has replaced racing application icon 5002-13 from FIG. 5N. In some embodiments, a scaled-down representation of the first object and/or the second object is displayed in the folder icon, as described in more detail below with reference to fig. 11A-11C. For example, in FIG. 5S, the folder icon includes scaled-down representations (e.g., "O" and "S") of card application icon 5002-4 and racing application icon 5002-13, respectively.
The device creates (624) a folder containing the first object and the second object. In some embodiments, creating the folder includes displaying (626) a folder icon representing the folder (e.g., as described in more detail below with reference to method 1100). For example, the device creates a folder including card application icon 5002-4 and racing application icon 5002-13 and displays folder icon 5004-7, as shown in FIG. 5S. In some embodiments, the folder icon has different properties than other selectable objects (e.g., action icons such as application icons, bookmark icons, document icons, etc.), as described in more detail below with reference to method 700. In some embodiments, the folder icon is displayed (628) on the display (e.g., touch screen 112) at a location previously occupied by the second selectable object. For example, in FIG. 5N, racing application icon 5002-13 is the last selectable user interface object in the arrangement of selectable user interface objects (e.g., from left to right, top to bottom), while in FIG. 5S, folder icon 5004-7, which contains racing application icon 5002-13, is displayed as the last selectable user interface object in the arrangement. In some embodiments, displaying the folder icon includes displaying (630) an animation of the second selectable object transforming (e.g., morphing) into the folder icon. In some embodiments, existing folders cannot be combined in this way. For example, even if the device detects an input corresponding to dragging a first folder (e.g., 5004-2 in FIG. 5N) over the top of a second folder (e.g., 5004-1-b in FIG. 5N), the device does not add the first folder to the second folder.
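The arrangement bookkeeping for folder creation (remove both objects; place the folder icon in the second object's slot) can be sketched as follows. This is an illustrative model only; the function name is an assumption:

```python
# Hypothetical sketch: creating a folder removes the first and second objects
# from the arrangement and places the new folder icon at the location
# previously occupied by the second object, as in FIG. 5N -> FIG. 5S.

def create_folder(arrangement, first, second, folder_icon):
    """Return the new arrangement after folding `first` and `second` into
    `folder_icon`, which takes the second object's slot."""
    out = []
    for obj in arrangement:
        if obj == second:
            out.append(folder_icon)  # folder icon takes the second object's slot
        elif obj != first:
            out.append(obj)          # the dragged first object is removed
        # first object itself is skipped: it now lives inside the folder
    return out
```

For example, dragging `"card"` onto `"racing"` in `["photo", "card", "clock", "racing"]` yields `["photo", "clock", "F"]`, with `"F"` standing in for folder icon 5004-7.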
In some embodiments, after creating the folder, the device automatically displays (632) a folder view including the contents of the folder. For example, in response to detecting that the first input meets the predefined folder creation criteria, the device automatically moves from the user interface shown in FIG. 5N to the user interface shown in FIG. 5Y, displaying a folder view (e.g., region 5092 containing card application icon 5002-4 and racing application icon 5002-13, as shown in FIG. 5Y).
In some embodiments, before detecting the first input, the device displays the plurality of user interface objects in a first arrangement on the display. For example, in fig. 5X, selectable user interface objects (e.g., action icons 5002 and folder icons 5004) are displayed in a two-dimensional grid arrangement, each having a respective location within the grid. In some of these embodiments, after creating the folder, the device displays (634) a folder view (e.g., 5092 in FIG. 5Y) on the display (e.g., touch screen 112). In some embodiments, displaying the folder view includes displaying the first object and the second object within the folder view. It should be appreciated that a respective folder view is the portion of the display (e.g., touch screen 112) that includes the folder content. For example, area 5078 in FIG. 5T, shaded area 5092 in FIG. 5Y-5Z, shaded area 5106 in FIG. 5DD-5GG, shaded area 5138 in FIG. 5TT-5UU, shaded area 5148 in FIG. 5WW, shaded area 5158 in FIG. 5YY-5ZZ, shaded area 5170 in FIG. 5EEE-5FFF, and shaded area 5212 in FIG. 5PPP are all folder views that occupy part of the display (e.g., touch screen 112). In some embodiments, the folder view occupies substantially all of the display (e.g., touch screen 112). For example, the area in FIG. 5U and the area 5182 in FIGS. 5GGG-5III, 5KKK, and 5MMM are folder views that occupy substantially all (or all) of the display (e.g., touch screen 112).
In some embodiments, the device displays an animated transition from the first arrangement to the folder view (e.g., as described in more detail below with reference to method 1200). In some embodiments, displaying the folder view includes displaying (636) the folder icon (e.g., 5004-7 in fig. 5Y) and at least a portion of the first arrangement, thereby indicating a location of the folder icon within the first arrangement. For example, in FIG. 5Y, the selectable user interface objects (e.g., 5002-1, 5002-2, 5002-3, 5002-5, 5002-6, 5002-7, 5002-8, 5004-1-b, 5002-10, 5002-11, and 5002-12) displayed above the folder icon (e.g., in FIG. 5X) continue to be displayed above folder icon 5004-7, while the selectable user interface objects (e.g., 5002-14, 5002-15, 5002-16, 5004-2) previously displayed below the folder icon are no longer displayed below folder icon 5004-7.
In some embodiments, while the device is in the normal operating mode, the selectable user interface objects include (638) one or more action icons (e.g., 5002 in fig. 5OOO) and one or more folder icons (e.g., 5004 in fig. 5OOO). In some of these embodiments, the device detects (640) a second input. In some embodiments, in response to detecting the second input: when the second input corresponds to a request to select a respective action icon, the device activates (642) an application associated with the respective action icon; and when the second input corresponds to a request to select a folder icon, the device displays a folder view including content of a folder associated with the folder icon. For example, in fig. 5OOO, a tap gesture 5210 on the touch-sensitive surface (e.g., touch screen display 112) at a location corresponding to the location of an action icon (e.g., photo application icon 5002-9) on the display activates an application (e.g., a photo application) associated with the action icon; and in response to detecting a tap gesture on the touch-sensitive surface at a location corresponding to the location of a folder icon (e.g., folder icon 5004-7 in fig. 5OOO) on the display, the device displays a folder view 5212 that includes the content (e.g., action icons 5002-9, 5002-4, 5002-13, and 5002-8) of the folder associated with folder icon 5004-7.
In some embodiments, after creating the folder, the device automatically displays (644) a folder view including the contents of the folder and displays an object removal area. For example, in fig. 5Y, the object removal area is a first portion 5084 of the display (e.g., touch screen 112). As another example, in fig. 5GGG-5HHH, object removal area 5184 is a separately identified portion of the display (e.g., touch screen 112). As another example, in FIG. 5WW, both portions of the display (e.g., touch screen 112) above and below folder view 5148 are object removal areas. In some embodiments, the folder view includes an object removal area (e.g., folder view 5182 with object removal area 5184 in fig. 5GGG-5HHH). In some embodiments, the object removal area is outside of the folder view (e.g., object removal area 5084 is outside of folder view 5092 in fig. 5Y). In some embodiments, the object removal area is always displayed (e.g., the object removal area is part of a home screen that is displayed at the time the folder view is displayed). In some embodiments, the object removal area is displayed only while in icon reconfiguration mode (e.g., in fig. 5S, the removal area is not displayed, while in fig. 5Y, the object removal area 5084 is displayed above folder view 5092). In some embodiments, the object removal area is displayed only when the device is in icon reconfiguration mode and a request to move an object is currently detected (e.g., in fig. 5GGG, object removal area 5184 is displayed only while the device detects movement of contact 5186 across the touch-sensitive surface).
In some embodiments, while the folder view is displayed, the device detects (646) a second input corresponding to a request to move the respective selectable user interface object into the object removal area, and in response to detecting the second input, the device removes (648) the respective selectable user interface object from the folder. For example, in FIG. 5Y, the device detects a contact 5100-a at a location on the touch-sensitive surface (e.g., touch screen 112) corresponding to card application icons 5002-4 and a subsequent movement 5102 of the contact across the touch-sensitive surface (e.g., touch screen 112) into object removal area 5084 that is a first portion of the display (e.g., touch screen 112). Continuing with the present example, in response to detecting this gesture, the device removes the card application icons 5002-4 from the folder and from the folder view 5092, as shown in FIG. 5Z, and returns the card application icons 5002-4 to the first arrangement of selectable user interface objects, as shown in FIG. 5 AA.
In some embodiments, the plurality of user interface objects are displayed (650) in a first arrangement on the display (e.g., touch screen 112) prior to detection of the first input, and when the second input corresponds to a request to move a respective selectable user interface object to the object removal area and termination of the second input is detected: the device stops displaying the folder view; and displays (652) the respective selectable user interface object at a predetermined location in the first arrangement (e.g., at the end of the first arrangement or at a first open location within the first arrangement). In other words, in some embodiments, the second input corresponds to a flick gesture that includes a contact on the touch-sensitive surface at a location corresponding to the location of the respective selectable user interface object and a lateral movement on the touch-sensitive surface (e.g., touch screen 112) corresponding to movement on the display toward the object removal area. For example, in fig. 5VV, the device displays a first arrangement of selectable user interface objects (e.g., action icons 5002 and folder icons 5004), and in fig. 5WW, the device detects a gesture that includes a contact 5152 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of application-1 application icon 5002-19 on the display and a movement 5154 of the contact to a location on the touch-sensitive surface corresponding to a location on the display outside of folder view 5148. In this example, in response to detecting the gesture, the device stops displaying the folder view and displays application-1 application icon 5002-19 in the first arrangement, as shown in FIG. 5XX.
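The "predetermined location" placement rule above (first open slot, otherwise the end of the arrangement) can be sketched as follows. The `None`-marks-an-open-slot representation and the function name are illustrative assumptions:

```python
# Hypothetical sketch: when an icon is flicked out of the folder view, it is
# placed at a predetermined location in the first arrangement -- the first
# open slot if one exists, otherwise appended at the end.

def return_to_arrangement(arrangement, icon):
    """Place `icon` in `arrangement` (a list of slots where None marks an
    open slot) and return the mutated arrangement."""
    for i, slot in enumerate(arrangement):
        if slot is None:
            arrangement[i] = icon  # fill the first open location
            return arrangement
    arrangement.append(icon)       # no open slot: place at the end
    return arrangement
```

For example, an arrangement with a gap receives the removed icon in that gap; a full arrangement grows by one slot at the end.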
In some embodiments, before detecting the first input, the device displays (650) the plurality of user interface objects in a first arrangement on the display. When the device continues to detect the second input on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the object removal area on the display for more than a predetermined amount of time: the device stops displaying the folder view; the device detects termination of the second input at a respective location on the touch-sensitive surface corresponding to a location on the display within the first arrangement; and in response to detecting termination of the second input, the device displays (656) the respective selectable user interface object at the respective position in the first arrangement. In other words, in some embodiments, the second input corresponds to a tap and drag gesture that includes movement of the contact into an area on the touch-sensitive surface (e.g., touch screen 112) that corresponds to the object removal area on the display. For example, in FIG. 5XX, the selectable user interface objects (e.g., action icons 5002 and folder icons 5004) are arranged in a first arrangement, and in FIG. 5YY, the device detects a second input (e.g., a gesture including contact 5162 and movement 5164 of the contact on touch screen 112) corresponding to movement of application-4 application icon 5002-22 out of folder view 5158 into object removal area 5084. In this example, in response to detecting a pause of the contact (e.g., at 5162-b in FIG. 5ZZ) in the object removal area, the device stops displaying folder view 5158, displays the first arrangement (e.g., as shown in FIG. 5AAA), and detects a subsequent movement (e.g., 5166) of the contact within the first arrangement on the touch screen 112.
In this example, when the device detects termination of the input (e.g., liftoff of contact 5162 at 5162-c from touch screen 112 in FIG. 5BBB), the device displays application-4 application icon 5002-22 at a location in the first arrangement corresponding to the location of application-4 application icon 5002-22 at the time termination of the input was detected, as shown in FIG. 5CCC.
In some embodiments, after creating the folder, the device detects (658) a second input corresponding to a request to remove a respective selectable user interface object of the first object and the second object from the folder; and when the second input is detected before creation of the folder has been confirmed, the device deletes (660) the folder and redisplays the first object and the second object. For example, in FIG. 5Y, the first input detected by the device after creating the folder (which was created by dragging card application icon 5002-4 on top of racing application icon 5002-13) is an input corresponding to a request to remove card application icon 5002-4 from the folder (e.g., contact 5100 and movement 5102 of the contact on the touch screen 112). In this example, the input is received before creation of the folder has been confirmed, and thus, after card application icon 5002-4 has been removed from the folder, the device deletes folder 5004-7 in FIG. 5AA and redisplays racing application icon 5002-13 on the display (e.g., touch screen 112), as shown in FIG. 5CC.
In some embodiments, the first object is the respective object that is moved out of the folder, and deleting (662) the folder and redisplaying the first object and the second object includes: displaying the first object at a location on the display (e.g., touch screen 112) determined based on the second input; and displaying the second object on the display at the location previously occupied by the folder icon of the folder. Continuing the example described above with reference to fig. 5AA and 5CC, in fig. 5AA, the device displays folder icon 5004-7 at the left end of the fourth row of selectable user interface objects on the display (e.g., touch screen 112), and in fig. 5CC, racing application icon 5002-13 has replaced folder icon 5004-7 at the left end of the fourth row of selectable user interface objects. In some embodiments, displaying the second object includes displaying (664) an animation of the folder icon transforming (e.g., morphing) into the second object. For example, in FIG. 5BB, the device displays an animation 5104 of racing application icon 5002-13 expanding to fill the space previously occupied by folder icon 5004-7 (e.g., as shown in FIG. 5AA).
Conversely, when the second input is detected after creation of the folder has been confirmed, the device displays (666) the respective object outside the folder while continuing to display the folder. For example, once creation of a folder has been confirmed, as shown in FIG. 5DD, removing a single application icon from the folder (e.g., removing the card application icons 5002-4 in FIG. 5EE in response to detecting contact 5112 and movement 5114 of the contact) does not result in deletion of the folder. Instead, in this example, the folder (e.g., 5004-7 in FIG. 5FF) continues to be displayed. In other words, in some embodiments, after folder creation has been confirmed, the folder is deleted only when the last icon is removed from the folder, and in that case the folder collapses (e.g., rather than morphing back into one of the action icons).
In some embodiments, creation of the folder is confirmed (668) when the device detects an input corresponding to a request to perform an action to manipulate the folder. Such actions include, but are not limited to: opening (670) the folder, closing (672) the folder, moving (674) the folder, renaming (676) the folder, adding (678) an additional selectable user interface object to the folder, entering (680) the user interface reconfiguration mode, and leaving (682) the user interface reconfiguration mode. In some embodiments, creation of a folder is confirmed when the device receives any other predefined input corresponding to a request to manipulate the folder. In other words, creation of a folder is confirmed by any action indicating that creation of the folder was intentional rather than accidental.
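The provisional-folder behavior described in the preceding paragraphs can be sketched as follows. This is an illustrative model, not the patent's implementation; all class, method, and action names are assumed. A folder created by dragging one icon onto another is deleted again if an item is removed before any confirming action (open, close, move, rename, add an item, enter or leave the reconfiguration mode) has been detected; after confirmation, the folder is deleted only when its last item is removed.

```python
# Actions that confirm folder creation (per operations 668-682 above).
CONFIRMING_ACTIONS = {
    "open", "close", "move", "rename", "add_item",
    "enter_reconfiguration", "leave_reconfiguration",
}

class ProvisionalFolder:
    """Hypothetical model of a newly created, not-yet-confirmed folder."""

    def __init__(self, first_item, second_item):
        self.items = [first_item, second_item]
        self.confirmed = False

    def perform(self, action):
        # Any action that manipulates the folder confirms its creation,
        # indicating the creation was intentional rather than accidental.
        if action in CONFIRMING_ACTIONS:
            self.confirmed = True

    def remove(self, item):
        """Remove an item; return (icons to redisplay outside the folder, folder status)."""
        self.items.remove(item)
        if not self.confirmed:
            # Before confirmation, removing either item deletes the folder
            # and redisplays both objects (the remaining object takes the
            # folder icon's former position, per operation 662).
            remaining = self.items[:]
            self.items = []
            return [item] + remaining, "deleted"
        if not self.items:
            # After confirmation, the folder is deleted only when the
            # last item is removed.
            return [item], "deleted"
        return [item], "kept"
```

For example, removing either icon from an unconfirmed folder of two game icons deletes the folder and redisplays both icons, while removing one icon after `perform("open")` leaves the folder in place.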
Note that the details of the other processes described herein with reference to methods 700, 800, 900, 1000, 1100, 1200, 1300 (e.g., fig. 7A-7C, 8A-8C, 9A-9B, 10A-10B, 11A-11C, 12A-12E, 13A-13E, and 5A-5 LLLL) also apply to method 600 described above in a similar manner. For example, the selectable user interface objects (e.g., action icon 5002 and folder icon 5004) described with reference to fig. 6A-6E may have one or more of the characteristics of the various selectable user interface objects/icons/items (e.g., action icon 5002 and folder icon 5004) described herein with reference to any of methods 700, 800, 900, 1000, 1100, 1200, or 1300. For brevity, these details are not repeated here.
Fig. 7A-7C are flowcharts illustrating a method 700 of managing folder icons and action icons, according to some embodiments. The method 700 is performed at a multifunction device (e.g., device 300, fig. 3, or portable multifunction device 100, fig. 1) having a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some of the operations in method 700 may be combined and/or the order of some of the operations may be changed.
As described below, the method 700 provides an intuitive way to manage folder icons and action icons. The method reduces the cognitive burden on the user in managing the folder icons and action icons, thereby creating a more efficient human-machine interface. For battery operated computing devices, enabling users to manage folder icons and action icons faster and more efficiently saves power and increases the time between battery charges.
The device simultaneously displays (702) one or more action icons (e.g., application icons and other activatable icons other than folder icons) and one or more folder icons on the display. The device has a normal operating mode for activating applications (e.g., as shown in FIGS. 5A, 5L, and 5OOO-5PPP) and a user interface reconfiguration mode for rearranging the action icons and folder icons on the display (e.g., as shown in FIGS. 5B-5K, 5M-5N, and 5P-5NNN). In some embodiments, while the device is in the user interface reconfiguration mode, the selectable user interface objects (e.g., action icons 5002 and folder icons 5004) oscillate about respective average positions in order to indicate that the device is in the user interface reconfiguration mode and that the selectable user interface objects can be moved around on the display.
The device detects (704) a first input (e.g., tap gesture 5155 in FIG. 5XX, tap gesture 5156 in FIG. 5XX, tap gesture 5208 in FIG. 5OOO, or tap gesture 5210 in FIG. 5OOO). In some embodiments, when the first input is (706) a tap gesture on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to a location of a respective folder icon on the display (e.g., tap gesture 5156 in FIG. 5XX at a location corresponding to folder icon 5004-1-b, or tap gesture 5210 in FIG. 5OOO at a location corresponding to folder icon 5004-7 on touch screen 112), the first input corresponds to a request to select the respective folder icon. In some embodiments, when the first input is (708) a tap gesture on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to a location of a respective action icon on the display (e.g., tap gesture 5155 in FIG. 5XX at a location corresponding to camera application icon 5002-12, or tap gesture 5208 in FIG. 5OOO at a location corresponding to photo application icon 5002-1 on touch screen 112), the first input corresponds to a request to select the respective action icon.
In response to detecting the first input, operations 712-718 and 744-766 are performed (710).
In some embodiments, the device determines whether the first input is a request to select a folder icon or a request to select an action icon. When the first input is a request to select (712) an action icon, the device performs the operations discussed in more detail below with reference to operations 744-766. Conversely, when the first input is a request to select (714) a folder icon, the device performs the operations discussed in more detail below with reference to operations 716-742.
When the device determines (716) that the first input corresponds to a request to select a respective folder icon of the one or more folder icons, the device displays (718) content of a folder associated with the respective folder icon, regardless of whether the device is in the normal operating mode or the user interface reconfiguration mode. For example, in FIG. 5XX, while the device is in the user interface reconfiguration mode, the device detects a tap gesture 5156 at a location corresponding to the folder icon 5004-1-b and, in response to detecting the tap gesture 5156, the device displays a folder view 5158 for the folder associated with the folder icon 5004-1-b, as shown in FIG. 5YY. Likewise, in FIG. 5OOO, while the device is in the normal operating mode, the device detects a tap gesture 5210 at a location corresponding to the folder icons 5004-7 and, in response to detecting the tap gesture 5210, the device displays a folder view 5212 for the folder associated with the folder icons 5004-7, as shown in FIG. 5PPP.
In some embodiments, the folder icon may also be moved in response to a folder repositioning input while the device is in the user interface reconfiguration mode. For example, in FIG. 5QQ, the device detects a contact 5132 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of folder icons 5004-7 on the display (e.g., touch screen 112) and a subsequent movement 5134 of the contact across the touch-sensitive surface (e.g., touch screen 112). In response to detecting the input, the device moves the folder icons 5004-7 to locations within the arrangement of selectable user interface objects determined based on the input on the display (e.g., touch screen 112), as shown in FIG. 5SS. In some embodiments, the folder icon cannot be repositioned while the device is in the normal operating mode. In other words, the same gesture performed while the device is in the normal operating mode will not cause the device to reposition the folder icon within the arrangement of selectable user interface objects.
In some embodiments, displaying the content of the folder associated with the respective folder icon includes displaying (720) a plurality of action icons (e.g., in FIG. 5YY the device displays action icons 5002-20, 5002-21, 5002-22, 5002-23, 5002-24, and 5002-25 in folder view 5158, and in FIG. 5PPP the device displays action icons 5002-9, 5002-4, 5002-13, and 5002-8 in folder view 5212). In some of these embodiments, the device detects (724) selection of a respective action icon of the plurality of action icons. In some of these embodiments, operations 728-734 are performed (726) in response to detecting the selection of the respective action icon. In some embodiments, the device determines whether the device is in the normal operating mode or the user interface reconfiguration mode. In some of these embodiments, when the device is in (728) the normal operating mode, the device activates (730) an application associated with the respective action icon (e.g., in FIG. 5PPP, when the device detects a tap gesture 5214 at a location corresponding to a stock application icon 5002-9, the device responds by activating the stock application associated with the stock application icon 5002-9); and, when the device is in (732) the user interface reconfiguration mode, the device continues (734) to display the respective action icon without activating the application associated with the respective action icon. For example, in FIG. 5YY, the device detects a tap gesture 5161 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of application-7 application icons 5002-25 on the display (e.g., touch screen 112), and in response, the device does not activate any application associated with application-7 application icons 5002-25.
In some embodiments, the action icons within the folder are moved in response to an action icon movement input while the device is in the user interface reconfiguration mode. For example, in FIG. 5TT, while the device is in the user interface reconfiguration mode, the device detects an action icon movement input that includes a contact 5140 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the stock application icons 5002-9 on the display (e.g., touch screen 112) and a subsequent movement 5142 of the contact to the location on the touch-sensitive surface (e.g., touch screen 112) corresponding to the location on the display (e.g., touch screen 112) within the folder view 5138, as shown in FIG. 5TT. In response to the action icon movement input, the device moves the stock application icons 5002-9 to a new location within the arrangement of selectable user interface objects within the folder view 5138, the new location based on movement of the input on the touch-sensitive surface (e.g., touch screen 112), as shown in FIG. 5UU.
In some embodiments, the first input corresponds to a request to select a respective folder icon, and the device is in the user interface reconfiguration mode when the first input is detected. In some of these embodiments, the device displays (738) the contents of the folder while remaining (736) in the user interface reconfiguration mode. In some of these embodiments, after displaying the contents of the folder, the device detects (740) a second input; and in response to detecting the second input, the device stops (742) displaying the folder view. For example, in FIG. 5SS, the device detects a first input (e.g., a tap gesture 5136) on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the folder icon 5004-7, and in response to detecting the tap gesture, the device displays a folder view (e.g., 5138 in FIGS. 5TT-5UU) that includes content (e.g., action icons 5002-9, 5002-4, 5002-13, and 5002-8) of the folder associated with the folder icon 5004-7 on the display (e.g., touch screen 112). While the folder view 5138 is displayed, the device detects a second input (e.g., a tap gesture 5144) on the touch-sensitive surface (e.g., touch screen 112) at a location outside of the folder view 5138 (e.g., a tap gesture in an area of the touch screen 112 that is below the folder view), and in response to detecting the second input, the device stops displaying the folder view on the display (e.g., touch screen 112), as shown in FIG. 5VV.
The device determines (744) that the first input corresponds to a request to select a respective action icon of the one or more action icons (e.g., a tap gesture 5155 in FIG. 5XX at a location corresponding to camera application icon 5002-12 or a tap gesture 5208 in FIG. 5OOO at a location corresponding to photo application icon 5002-1). In some embodiments, the device determines whether the device is in a normal operating mode or a user interface reconfiguration mode. When the device is in (746) the normal mode of operation, the device performs operations 750-756, as discussed in more detail below, in response to detecting the first input. Conversely, when the device is in (748) the user interface reconfiguration mode, the device performs operations 758-766, discussed in more detail below, in response to detecting the first input.
In some embodiments, operations 752-756 are performed when the device is in (750) the normal operating mode. The device activates (752) an application associated with the respective action icon. For example, in FIG. 5OOO, the device detects a tap gesture 5208 at a location on the touch-sensitive surface (e.g., touch screen 112) corresponding to the location of photo application icon 5002-1 on the display (e.g., touch screen 112), and in response to detecting tap gesture 5208, the device activates the photo application associated with photo application icon 5002-1. In some embodiments, while the device is in the normal operating mode, the function of the respective action icon associated with a respective application is to activate (754) the respective application (e.g., the action icon is an application launch icon); and, while in the normal operating mode, the action icons and the folder icons cannot be rearranged (756) on the display (e.g., the relative positions of the selectable user interface objects within the arrangement of selectable user interface objects are fixed). It should be appreciated that while the arrangement of selectable user interface objects may be scrolled, paged through, or otherwise translated across the display (e.g., touch screen 112), these operations do not result in any rearrangement of the selectable user interface objects, as the relative positions of the selectable user interface objects with respect to one another remain unchanged when the device performs these operations.
Operations 760-766 are performed when the device is in (758) the user interface reconfiguration mode. The device continues (760) to display the respective action icon without activating the application associated with the respective action icon. For example, in FIG. 5XX, the device detects a tap gesture 5155 at a location corresponding to the camera application icons 5002-12, and in response to detecting the tap gesture 5155, the device does not activate the camera application associated with the camera application icons 5002-12, but merely continues to display the arrangement of selectable user interface objects, as shown in FIG. 5XX. In some embodiments, an action icon may also be moved by a tap-and-drag gesture. For example, in FIG. 5XX, if the device were to detect a subsequent movement of contact 5155 across the touch-sensitive surface (e.g., touch screen 112), then, in response to detecting the movement, the device would move the action icons 5002-12 across the display (e.g., touch screen 112) in accordance with the movement.
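The dispatch described by operations 710-760 can be summarized in a short sketch. This is an assumed model (the function and string values are illustrative, not from the patent): tapping a folder icon opens its folder view in either mode, while tapping an action icon launches its application only in the normal operating mode.

```python
def handle_tap(icon_kind, mode):
    """Dispatch a tap on an icon.

    icon_kind: "folder" or "action"; mode: "normal" or "reconfiguration".
    """
    if icon_kind == "folder":
        # Folder contents are displayed regardless of the current mode
        # (operations 716-718).
        return "display_folder_contents"
    if mode == "normal":
        # In the normal operating mode, an action icon launches its
        # application (operation 752).
        return "activate_application"
    # In the reconfiguration mode, the action icon is simply kept on
    # screen without activating anything (operation 760); a tap-and-drag
    # would instead move it.
    return "continue_displaying_icon"
```

The asymmetry is the point of method 700: folder icons respond uniformly to selection, whereas action icons are mode-sensitive.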
In some embodiments, when the device is in the user interface reconfiguration mode, the respective action icon is prevented (762) from being used to activate an application (e.g., a request to select the camera application icons 5002-12 does not activate the associated camera application). In some embodiments, when the device is in the user interface reconfiguration mode, one or more of the action icons includes (764) a delete area for deleting the action icon, while none of the folder icons includes a delete area for deleting the folder icon. For example, in FIG. 5B the device displays the object removal flags 5010 associated with multiple action icons (e.g., 5002-4, 5002-5, 5002-6, 5002-7, 5002-10, 5002-11, and 5002-13), but does not show object removal flags associated with any folder icons (e.g., 5004-1-b or 5004-2). In some embodiments, the device detects a delete input (e.g., tap gesture 5011 in FIG. 5B) that includes a selection of the respective delete area for a respective action icon, and deletes the respective action icon (e.g., weather application icon 5002-5). In some embodiments, one or more of the action icons does not include a delete area. For example, in FIG. 5B, a plurality of action icons (e.g., action icons 5002-1, 5002-2, 5002-3, 5002-9, 5002-12, 5002-14, 5002-15, and 5002-16) are associated with applications (e.g., phone, email, camera, web browser) that provide access to basic features of the device and thus cannot be deleted while the device is in the user interface reconfiguration mode. In some embodiments, while in the user interface reconfiguration mode, the action icons and folder icons may be rearranged (766) on the display (e.g., touch screen 112) in response to detected inputs, as described in more detail above.
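The delete-area rule above reduces to a small predicate. This is a hedged sketch under stated assumptions: the `CORE_APPS` set is illustrative (the patent names phone, email, camera, and web browser as examples of basic-feature applications), and the function name is invented.

```python
# Illustrative set of applications providing access to basic device
# features; these action icons get no delete area (assumption drawn from
# the FIG. 5B example above).
CORE_APPS = {"phone", "email", "camera", "web browser"}

def shows_delete_area(icon_kind, app_name=None):
    """Whether an icon shows a delete area in the reconfiguration mode."""
    if icon_kind == "folder":
        # Folder icons never include a delete area (operation 764).
        return False
    # Action icons show one unless they launch a basic-feature app.
    return app_name not in CORE_APPS
```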
Note that the details of the other processes described herein with reference to methods 600, 800, 900, 1000, 1100, 1200, 1300 (e.g., fig. 6A-6E, 8A-8C, 9A-9B, 10A-10B, 11A-11C, 12A-12E, 13A-13E, and 5A-5 LLLL) also apply in a similar manner to method 700 described above. For example, the selectable user interface objects (e.g., action icon 5002 and folder icon 5004) described with reference to fig. 7A-7C may have one or more of the characteristics of the various selectable user interface objects/icons/items (e.g., action icon 5002 and folder icon 5004) described herein with reference to any of methods 600, 800, 900, 1000, 1100, 1200, or 1300. For brevity, these details are not repeated here.
Fig. 8A-8C are flowcharts illustrating a method 800 of naming a newly created folder, according to some embodiments. The method 800 is performed at a multifunction device (e.g., device 300, fig. 3, or portable multifunction device 100, fig. 1) having a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some of the operations in method 800 may be combined and/or the order of some of the operations may be changed.
As described below, method 800 provides an intuitive way to name new folders. The method reduces the cognitive burden of the user when naming the new folder, thereby creating a more efficient human-machine interface. For battery operated computing devices, enabling users to name new folders faster and more efficiently saves power and increases the time between battery charges.
The device receives (802) input corresponding to a request to create a folder containing a first item and a second item (e.g., the items may be applications, software programs, or files corresponding to icons or other selectable user interface objects on a display). In some embodiments, the folder initially contains only the first item and the second item (i.e., the first item and the second item are the first two items in the folder). In some embodiments, the request to create a folder containing the first item and the second item includes (804) an input corresponding to a request to move the first item closer to the second item or a request to drag the first item on top of the second item (e.g., as described in more detail above with reference to method 600). For example, in FIG. 5M, the device detects a contact 5040 at a location on the touch-sensitive surface (e.g., a first contact location 5040-a on the touch screen 112 in FIG. 5M) corresponding to a location of the card application icon 5002-4 on the display (e.g., the touch screen 112) and a subsequent movement 5042 of the contact across the touch-sensitive surface (e.g., movement of the contact across the touch screen 112 to a second location 5040-b in FIG. 5N). In response to detecting the input, the device moves the card application icon 5002-4 close to (or on top of) the racing application icon 5002-13, as shown in FIG. 5N. In this example, the device detects that predefined folder-creation criteria have been met (e.g., termination of contact 5040-b in FIG. 5N), creates a folder that includes the card application icon 5002-4 and the racing application icon 5002-13, and displays a folder icon 5004-7 for the folder on the display (e.g., touch screen 112), as shown in FIGS. 5O and 5S.
As another example, the device may receive a first input selecting a first item, a second input selecting a second item, and a third input corresponding to a command to perform a new folder creation operation (e.g., a click on a "new folder" command in a file browser or context menu, etc.).
Operations 808-844 are performed (806) in response to receiving the input. The device creates (808) a folder containing the first item and the second item. The device determines (810) a first plurality of descriptors (e.g., "games," "card games") associated with the first item. The device also determines (812) a second plurality of descriptors (e.g., "games," "action games," "racing games") associated with the second item. In some embodiments, the device determines whether there is a common descriptor shared by the first plurality of descriptors and the second plurality of descriptors. Continuing the example from above, both the card application and the racing application are associated with the common descriptor "game". When a common descriptor is present (814), the device performs operations 824-844, described in more detail below.
In some embodiments, when there is no (816) common descriptor, the device determines (818) that the first plurality of descriptors and the second plurality of descriptors do not share any common descriptors; in that case, the device automatically generates (820) a folder name for the folder based on a descriptor in the first plurality of descriptors, and displays (822) an icon for the folder with the automatically generated folder name on the display. In other words, the folder name for the folder is determined based on the descriptors of only one of the two items that were initially added to the folder. In some embodiments, the descriptor is a descriptor of the first item to be added to the folder (e.g., if more than one item is selected before the folder has been created). In some embodiments, the descriptor is a descriptor of the first item added to the folder. For example, in FIG. 5O, the camera application associated with the camera application icon 5002-12 is associated with descriptors such as "photography" and "camera management," while the stock application associated with the stock application icon 5002-9 has descriptors such as "utility" and "financial management." In this example, the two application icons are associated with applications that do not share any descriptors. Thus, in this example, the folder created by dragging the camera application icon 5002-12 close to the stock application icon 5002-9 is named the "photo" folder 5004-10, while the folder created by dragging the stock application icon 5002-9 close to the camera application icon 5002-12 is named the "utility" folder 5004-11. In other words, in some embodiments, when a folder is created using two items that do not share any common descriptors, the name of the resulting folder depends on the order in which the items (e.g., the icons associated with the applications) are selected.
In some embodiments, the device determines (824) that the first plurality of descriptors and the second plurality of descriptors share at least a first common descriptor. For example, in FIG. 5O, the card application associated with the card application icon 5002-4 is associated with descriptors such as "game" and "card game," and the racing application associated with the racing application icon 5002-13 has descriptors such as "game," "action game," and "racing game." In this example, the two application icons are associated with applications that share only a single descriptor (e.g., "game"), and thus that descriptor is selected as the first common descriptor. In some embodiments, the first plurality of descriptors and the second plurality of descriptors share a set of one or more descriptors, the descriptors in the shared set of one or more descriptors have different levels of specificity, and the most specific descriptor of the shared set of one or more descriptors is selected (826) by the device as the first common descriptor. For example, in FIG. 5O, the car racing application associated with the car racing application icon 5002-17 is associated with descriptors such as "game," "action game," and "racing game," and the aviation racing application associated with the aviation racing application icon 5002-18 has descriptors such as "game," "action game," and "racing game." In this example, the two application icons are associated with applications that share multiple descriptors, and thus the most specific shared descriptor (e.g., "racing game") is selected as the first common descriptor.
In some embodiments, the first plurality of descriptors includes a plurality of tags previously assigned to the first item; the second plurality of descriptors includes a plurality of tags previously assigned to the second item; and the first common descriptor is (828) a tag included in both the first plurality of tags and the second plurality of tags. In some embodiments, the tags are assigned to the items by a user of the device and apply only to locally stored items. In some embodiments, the tags are assigned at a remote server and sent to the device by the remote server.
In some embodiments, the first plurality of descriptors includes a first category hierarchy; the second plurality of descriptors includes a second category hierarchy; and the first common descriptor is (830) a category (e.g., "game" > "card game" or "game" > "action game" > "racing game") included in both the first category hierarchy and the second category hierarchy. In some embodiments, the first category hierarchy is (832) a set of categories to which the first item is assigned within an application database (e.g., a database of applications in an application library), and the second category hierarchy is a set of categories to which the second item is assigned within the application database. In some embodiments, the application database is (834) a database of applications in an application library. For example, in some embodiments, the descriptors are based at least in part on category names for applications in a dedicated application library for the mobile device (e.g., an application store for the Apple iPhone). In some embodiments, these category names are supplemented with additional tags that indicate additional information about the item (e.g., the name of the creator of the item, the date/time the item was created, etc.).
When the first plurality of descriptors and the second plurality of descriptors share at least a first common descriptor, the device automatically generates (836) a folder name for the folder based on the first common descriptor. After generating the folder name, the device displays (838) an icon on the display for the folder with the automatically generated folder name. Continuing with the present example from above, folders created by dragging the card application icon 5002-4 closer to the racing application icon 5002-13 are named "game" folders 5004-7, as shown in FIGS. 5O and 5S.
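The naming rule of method 800 can be sketched as a small function. This is an illustrative model, not the patent's implementation: specificity is represented by list order (an assumption; the patent derives it from a category hierarchy), and the no-common-descriptor fallback uses the first item's first descriptor, per operations 816-822 above.

```python
def folder_name(first_descriptors, second_descriptors):
    """Generate a folder name from two items' descriptor lists.

    Each argument is assumed ordered from least to most specific
    (e.g., ["game", "action game", "racing game"]).
    """
    common = [d for d in first_descriptors if d in second_descriptors]
    if common:
        # Use the most specific descriptor shared by both items
        # (operation 826).
        return common[-1]
    # No shared descriptor: fall back to a descriptor of the first item
    # (operations 818-820), so the name depends on selection order.
    return first_descriptors[0]
```

With the examples above: the card and racing applications share only "game", so the folder is named "game"; the car racing and aviation racing applications share three descriptors, so the most specific, "racing game", wins.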
In some embodiments, the device simultaneously displays (840) icons for folders and folder views for folders showing folder content. For example, in response to detecting an input (e.g., contact 5040 and movement 5042 in FIG. 5M) that causes the device to create a folder (e.g., folders 5004-7 in FIG. 5S), the device automatically displays a folder view 5092 in FIG. 5Y for the folder. In other words, the device automatically transitions from the user interface shown in FIG. 5M (where the folder creation input is detected) to the user interface shown in FIG. 5S (where the folder is created and the folder icons 5004-7 are displayed) and the user interface shown in FIG. 5Y (where the folder view 5092 for the folder is displayed) without any further intervention from the user.
In some embodiments, in response to receiving the input, the device displays (842) a notification that the folder has been created, where the notification includes instructions for renaming the folder. For example, in FIG. 5P, after creating the folder, the device displays a naming confirmation dialog 5064 that provides the user with options for confirming folder creation, renaming the folder, and canceling folder creation, as discussed in more detail above. In some embodiments, in response to receiving the input, the device displays (844) a text input field for renaming the folder. For example, in FIG. 5P, if the device detects an input corresponding to a request to rename the folder (e.g., tap gesture 5070 on a rename button), the device displays a dialog 5072 for changing the name of the newly created folder (e.g., from "game" to "entertainment," as shown in FIG. 5Q). As another example, in response to detecting an input (e.g., tap gesture 5098 in FIG. 5Y) corresponding to a request to activate a folder rename button, the device displays a folder rename area 5108 that includes a text input field, as shown in FIG. 5DD.
In some embodiments, the device detects (846) additional input and, in response to detecting the additional input, the device adds (848) a third item to the folder. In some of these embodiments, the device determines (850) a third plurality of descriptors associated with a third item. In some of these embodiments, the device selects (852) a second descriptor shared by the first plurality of descriptors, the second plurality of descriptors, and the third plurality of descriptors. In some of these embodiments, the device automatically generates (854) a new folder name for the folder based on the second descriptor. In other words, in these embodiments, the name of the folder changes when the contents of the folder change. For example, if the card application icon 5002-4 is to be added to the folder associated with the racing game folder icon 5004-8, the device will modify the name of the racing game folder icon 5004-8 to a new name (e.g., from "racing game" to "game"), where the new name is based at least in part on the descriptors shared by all three items within the folder (e.g., the card application icon 5002-4, the car racing application icon 5002-17, and the aviation racing application icon 5002-18). Conversely, in some embodiments, the folder name is fixed at the time the folder is created, and thus adding a new item to the folder does not change the folder name.
Note that the details of the other processes described herein with respect to methods 600, 700, 900, 1000, 1100, 1200, 1300 (e.g., fig. 6A-6E, 7A-7C, 9A-9B, 10A-10B, 11A-11C, 12A-12E, 13A-13E, and 5A-5 LLLL) also apply in a similar manner to method 800 described above. For example, the items (e.g., action icon 5002 and folder icon 5004) described with reference to fig. 8A-8C may have one or more of the characteristics of the various selectable user interface objects/icons/items (e.g., action icon 5002 and folder icon 5004) described herein with reference to any of methods 600, 700, 900, 1000, 1100, 1200, or 1300. For brevity, these details are not repeated here.
Fig. 9A-9B are flowcharts illustrating a method 900 of adjusting an activation region for a selectable user interface object in response to an icon management input, according to some embodiments. The method 900 is performed at a multifunction device (e.g., device 300, fig. 3, or portable multifunction device 100, fig. 1) having a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some of the operations in method 900 may be combined and/or the order of some of the operations may be changed.
As described below, the method 900 provides an intuitive way to manage icons. The method reduces the cognitive burden on the user in managing icons, thereby creating a more efficient human-machine interface. For battery operated computing devices, enabling a user to manage icons faster and more efficiently saves power and increases the time between battery charges.
The device displays (902) a plurality of icons (e.g., selectable user interface objects such as application icon 5002 and/or folder icon 5004 on touch screen 112) on the display. A first icon (e.g., an action icon or a folder icon) of the plurality of icons is displayed (904) at a first location on the display (e.g., touch screen 112). A second icon (e.g., an action icon or a folder icon) of the plurality of icons that is distinct from the first icon has (906) an activation region of a default size at a second location on the display (e.g., touch screen 112) that is distinct from the first location. For example, in fig. 5KK, a first icon (e.g., stock application icon 5002-9) is displayed on the right side of the second row of selectable user interface objects. In this example, the second icon 5004-7 initially has a default activation region (e.g., 5122-13-a in fig. 5KK). In some embodiments, each icon (e.g., action icon 5002 and folder icon 5004) has an activation region (e.g., activation region 5122 in fig. 5KK) of a default size. In some embodiments, the activation region 5122 is hidden (e.g., the activation region is not shown on the display).
The device detects (908) an input corresponding to a request to move the first icon. For example, as shown in fig. 5KK, the device detects finger contact 5120 at a location on the touch-sensitive surface corresponding to the first icon (e.g., stock application icon 5002-9 in fig. 5KK) and movement 5121 of the finger contact across the touch-sensitive surface (e.g., from first location 5120-a in fig. 5KK to second location 5120-b in fig. 5LL to third location 5120-c in fig. 5MM on touch screen 112). After detecting the input, the device changes (910) the size of the activation region for the second icon from the default size based on the distance from the first location to the location of the second icon. For example, the device changes the size of the activation region 5122-13 for the second icon (e.g., the game folder icon 5004-7) from the default activation region 5122-13-a in fig. 5KK to the enlarged activation region 5122-13-b in fig. 5LL-5MM. In some embodiments, the size of the activation region is changed in response to detecting contact on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the first icon on the display (e.g., the device changes activation region 5122-13-a to activation region 5122-13-b in response to simply detecting contact 5120 in fig. 5KK). In some embodiments, the size of the activation region is changed in response to detecting movement of the contact away from a location on the touch-sensitive surface (e.g., touch screen 112) corresponding to the location of the first icon on the display (e.g., the device changes the activation region 5122-13-a to the activation region 5122-13-b only after at least some movement 5121 of the contact 5120 is detected, as shown in fig. 5LL).
In some embodiments, the size of the activation region of the second icon is also changed (912) based at least in part on the type of the second icon. In some embodiments, when the second icon is a folder icon, the activation region has (914) a first size (e.g., because the first operation is to add the first icon to the folder represented by the second icon). In some of these embodiments, when the second icon is an action icon, the activation region has a second size that is smaller than the first size (e.g., because the first operation is to create a folder that includes the first icon and the second icon). In other words, the activation regions are adjusted so that it is easiest to move an icon, next easiest to add an icon to a pre-existing folder, and most difficult to create a new folder from two action icons.
In some embodiments, for one or more respective icons of the plurality of icons other than the first icon, the device changes (916) the size of the respective activation region for the respective icon from the respective default size based on a distance from the first location to the respective location of the respective icon (e.g., in response to detecting contact at a location on the touch-sensitive surface corresponding to the first icon, or in response to movement of the contact away from the location on the touch-sensitive surface corresponding to the first icon). For example, in fig. 5LL, the size of the activation region for photo application icon 5002-1 has increased from the default size 5122-1-a to the larger size 5122-1-b based on the distance from the first location (e.g., the location of contact 5120-a in fig. 5KK) to the corresponding location of photo application icon 5002-1 in fig. 5LL. Likewise, in some embodiments, the default activation regions (e.g., 5122-5-a, 5122-6-a, 5122-9-a, 5122-10-a, 5122-14-a, 5122-16-a, 5122-17-a, respectively) for a plurality of other icons (e.g., selectable user interface objects 5002-1, 5002-2, 5002-6, 5002-7, 5002-14, 5002-15, 5002-16, 5004-2 in fig. 5KK-5LL) have been adjusted as shown in fig. 5LL (e.g., to adjusted activation regions 5122-5-b, 5122-6-b, 5122-9-b, 5122-10-b, 5122-14-b, 5122-16-b, 5122-17-b, respectively).
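A minimal sketch of operations 908-916: the activation region grows with distance from the drag's start point, and (per operations 912-914) a folder icon gets a larger region than an action icon. The circular-region model, growth rate, and folder multiplier are assumed constants for illustration, not values from the patent.

```python
import math

def activation_radius(default_radius, drag_origin, icon_center, icon_type,
                      growth=0.25, folder_factor=1.5):
    """Return the (hidden) activation-region radius for one icon.

    `growth` and `folder_factor` are illustrative constants only.
    """
    distance = math.hypot(icon_center[0] - drag_origin[0],
                          icon_center[1] - drag_origin[1])
    radius = default_radius + growth * distance  # farther icons get bigger targets
    if icon_type == "folder":
        radius *= folder_factor  # dropping into an existing folder is made easier
    return radius

# An icon far from the drag origin gets a larger region than a nearby one,
# and a folder icon gets a larger region than an action icon at the same spot.
near = activation_radius(20, (0, 0), (30, 40), "action")    # distance 50
far  = activation_radius(20, (0, 0), (300, 400), "action")  # distance 500
print(near, far)  # 32.5 145.0
```

The distance-dependent term implements operation 916's per-icon resizing; the type-dependent factor implements operation 912.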
In response to detecting the input, the device moves (918) the first icon across the display (e.g., touch screen 112) away from the first location. For example, in fig. 5LL, the device displays the stock application icon 5002-9 moving away from its first location (e.g., the location of the stock application icon 5002-9 in fig. 5KK). In some embodiments, when the first icon is at least partially within the activation region of the second icon for more than a predetermined period of time, the device displays (920) an indication of an action to be performed upon termination of the input (e.g., an animation indicating that a new folder is to be created, or an animation indicating that the first icon is to be added to a folder represented by the second icon). For example, when the second icon is a folder icon, the device may highlight the folder icon to indicate that the first icon is to be added to the folder. As another example, when the second icon is an action icon, the device may highlight the action icon to indicate that a folder including the first icon and the second icon is to be created.
The device detects (922) that the input meets predefined trigger criteria (e.g., liftoff of the finger contact is detected). In some embodiments, detecting that the input meets the predefined trigger criteria includes detecting (924) termination of the first input. For example, termination of the first input is detected when the device detects liftoff of the contact 5120-c from the touch-sensitive surface (e.g., touch screen 112).
Operations 928-946 are performed (926) in response to detecting that the input meets the predefined trigger criteria. In some embodiments, the device determines whether the first icon is at least partially within the activation region of the second icon. When the first icon is at least partially within the activation region of the second icon (928), the device performs operations 932-938, as discussed in more detail below. Conversely, when the first icon is not at least partially within (930) the activation region of the second icon, the device performs operations 940-946, as discussed in more detail below.
When the device determines (932) that the first icon is at least partially within an activation region (e.g., 5122-13-b in fig. 5MM) of the second icon, the device performs (934) a first operation associated with the second icon. In some embodiments, the first operation includes creating (936) a folder that includes the first icon and the second icon (e.g., as described in more detail above with reference to method 600). For example, if the device detects termination of the input while the stock application icon 5002-9 is at least partially within the activation region for another action icon, the device will create a folder that includes the stock application icon 5002-9 and the other action icon, as described in more detail above with reference to fig. 5M-5N. In some embodiments, the first operation includes adding (938) the first icon to a folder represented by the second icon. For example, in fig. 5MM, the device detects the termination of the input (e.g., the lifting of contact 5120-c), and in response to detecting the termination of the input, the device adds the stock application icon 5002-9 to the folder represented by the game folder icon 5004-7, as shown in fig. 5NN.
When the device determines (940) that the first icon is outside the activation region of the second icon, the device performs (942) a second operation that is distinct from the first operation. In some embodiments, the second operation includes rearranging (944) the plurality of icons on the display (e.g., touch screen 112) such that the first icon is proximate to the location of the second icon on the display. For example, in fig. 5OO, the device detects the input (e.g., contact 5124 and movement 5126 of the contact) and detects that a predefined trigger criterion has been met (e.g., contact 5124 has paused for more than a predetermined period of time) at a location 5124-b outside of the activation region 5128-11 of the second icon (e.g., the game folder icon 5004-7 in fig. 5OO). Continuing this example, in response to determining that the predefined criteria have been met, the device rearranges the icons (e.g., such that all icons in the third row of the arrangement are shifted one position to the left). In some embodiments, the second operation includes returning (946) the first icon to the first location on the display. For example, if the device detects termination of contact 5120-c in fig. 5MM while the stock application icon 5002-9 is outside of the activation region 5122-13-b for the game folder icon 5004-7, the device will return the stock application icon 5002-9 to its previous location (e.g., the location of the stock application icon 5002-9 in fig. 5LL).
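The branch across operations 928-946 amounts to a hit test between the dragged icon and the second icon's activation region, followed by a dispatch on the second icon's type. A sketch under assumed axis-aligned (x, y, width, height) rectangles; the function and label names are hypothetical:

```python
def rects_overlap(a, b):
    """True if two (x, y, w, h) rectangles intersect at all
    ("at least partially within")."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def on_input_end(dragged_rect, activation_rect, second_icon_type):
    """Choose the operation performed once the trigger criteria are met."""
    if rects_overlap(dragged_rect, activation_rect):
        if second_icon_type == "folder":
            return "add to folder"      # operation 938
        return "create new folder"      # operation 936
    return "rearrange or snap back"     # operations 944/946

print(on_input_end((95, 0, 40, 40), (100, 0, 60, 60), "folder"))  # add to folder
print(on_input_end((0, 0, 40, 40), (100, 0, 60, 60), "folder"))   # rearrange or snap back
```

The overlap test deliberately treats any intersection as a hit, matching the "at least partially within" language of operations 928-932.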
It should be appreciated that, in some embodiments, one advantage of adjusting the size of the activation region for an icon on the display (e.g., touch screen 112) based on its distance from the icon being moved is that it provides a larger target for icons that are farther away. Typically, a touch gesture that traverses a longer distance includes faster movement than a touch gesture (or portion of a touch gesture) that traverses a shorter distance. In addition, touch gestures that include faster movements are typically less accurate than touch gestures that include slower movements. Thus, by increasing the size of the activation regions for icons farther from the start point of the touch gesture, the device compensates for the reduced accuracy of touch gestures that traverse longer distances across the touch-sensitive surface (e.g., touch screen 112), thereby improving the human-machine interface.
Note that the details of the other processes described herein with respect to methods 600, 700, 800, 1000, 1100, 1200, and 1300 (e.g., fig. 6A-6E, 7A-7C, 8A-8C, 10A-10B, 11A-11C, 12A-12E, 13A-13E, and 5A-5LLLL) also apply in a similar manner to method 900 described above. For example, the icons (e.g., action icon 5002 and folder icon 5004) described with reference to fig. 9A-9B may have one or more of the characteristics of the various selectable user interface objects/icons/items (e.g., action icon 5002 and folder icon 5004) described herein with reference to any of methods 600, 700, 800, 1000, 1100, 1200, or 1300. For the sake of brevity, these details are not repeated here.
Fig. 10A-10B are flowcharts illustrating a method 1000 for reconfiguring icons on a display in response to an icon management input, according to some embodiments. The method 1000 is performed at a multifunction device (e.g., the device 300 of fig. 3, or the portable multifunction device 100 of fig. 1) having a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some of the operations in method 1000 may be combined and/or the order of some of the operations may be changed.
As described below, the method 1000 provides an intuitive way of managing icons. The method reduces the cognitive burden on the user in managing icons, thereby creating a more efficient human-machine interface. For battery operated computing devices, enabling a user to manage icons faster and more efficiently saves power and increases the time between battery charges.
The device displays (1002) a plurality of icons on the display (e.g., touch screen 112) in a first arrangement (e.g., action icons 5002 and folder icons 5004 in fig. 5KK). The device detects (1004) an input corresponding to a request to move a first icon of the plurality of icons from a first location on the display (e.g., touch screen 112) to a second location on the display. For example, the device detects a contact 5120 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the stock application icon 5002-9 on the display (e.g., touch screen 112) and a subsequent movement 5121 of the contact (e.g., movement on touch screen 112 from a first location 5120-a in fig. 5KK to a second location 5120-b in fig. 5LL to a third location 5120-c in fig. 5MM) corresponding to a request to move the stock application icon 5002-9 on the display (e.g., touch screen 112) from its initial location in fig. 5KK to a location proximate to the game folder icon 5004-7 in fig. 5KK.
Operations 1008-1034 are performed (1006) in response to detecting the input. The device moves (1008) the first icon from the first location to the second location. For example, as shown in fig. 5KK-5MM, the device moves the stock application icon 5002-9 across the display (e.g., touch screen 112) toward the game folder icon 5004-7. In some embodiments, in response to detecting the input, the device displays (1010) a residual image of the first icon at the first location. In these embodiments, the residual image of the first icon is visually distinct from the first icon. In some embodiments, the residual image of the first icon is a grayed-out, semi-transparent, translucent, reduced-contrast, or ghost image of the first icon. In some embodiments, in response to detecting the input, the device displays (1012) an empty space at the first location (e.g., as shown in fig. 5LL-5MM, an empty space is displayed at the right end of the second row of icons).
The device maintains (1014) the position of each respective icon of the plurality of icons other than the first icon until the auto-reconfiguration criteria have been met. For example, in fig. 5LL-5MM, the device continues to display all icons except the stock application icon 5002-9 at the same locations on the display (e.g., touch screen 112), albeit with an empty space in the arrangement of icons where the stock application icon 5002-9 was previously located.
The device determines (1016) that the auto-reconfiguration criteria have been met (e.g., at some later point in time). In some embodiments, the auto-reconfiguration criteria are met when a predetermined period of time (e.g., 0.5 seconds, 1 second, 2 seconds, etc.) has elapsed (1018) since the input (or the beginning of the input) was detected. In some embodiments, the predetermined period of time is measured from the beginning of the input. In some embodiments, the predetermined period of time is measured from the end of the input. In some embodiments, the auto-reconfiguration criteria are met when a predetermined period of time has elapsed (1020) and the first icon is not currently located on the display (e.g., touch screen 112) at a position within the activation region of another icon. In other words, in these embodiments, in addition to the predetermined period of time having elapsed, the first icon must be displayed on the display (e.g., touch screen 112) at a location outside of the activation regions of the other icons. In some embodiments, the auto-reconfiguration criteria are met when the device detects (1022) termination of the input. For example, in fig. 5MM, the device detects the lifting of contact 5120-c, and in response, the device adds the stock application icon 5002-9 to the folder associated with the game folder icon 5004-7, as shown in fig. 5NN, and then rearranges the icons, as shown in fig. 5OO. In some embodiments, the auto-reconfiguration criteria are met while the first icon is still moving (1024) on the display. In other words, in some embodiments, the predetermined period of time elapses while the input continues to be detected by the device (e.g., before liftoff of the contact from the touch-sensitive surface is detected).
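The alternative criteria in operations 1018-1024 can be collected into a single predicate. The 1-second delay below is just one of the example values the text gives, and the function and parameter names are illustrative assumptions:

```python
def auto_reconfigure(now, input_start_time, input_terminated,
                     inside_activation_region, delay=1.0):
    """True once the icons should be rearranged to fill the gap.

    Mirrors operations 1018-1024: either the input has ended, or the
    predetermined period has elapsed while the dragged icon sits outside
    every other icon's activation region.
    """
    if input_terminated:  # operation 1022: liftoff always triggers
        return True
    elapsed = now - input_start_time
    return elapsed >= delay and not inside_activation_region  # 1018 + 1020

print(auto_reconfigure(2.0, 0.0, False, False))  # True: delay elapsed, outside regions
print(auto_reconfigure(0.3, 0.0, False, False))  # False: too soon
print(auto_reconfigure(2.0, 0.0, False, True))   # False: hovering over a folder
```

Note how the predicate can become true while the drag is still in progress, which is exactly the case operation 1024 describes.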
In response to determining that the auto-reconfiguration criteria have been met, the device moves (1026) one or more icons of the plurality of icons other than the first icon to form a second arrangement that is distinct from the first arrangement. For example, the folder icon 5004-1-b moves from the third row in the arrangement of fig. 5NN to the second row in the arrangement of fig. 5OO, the action icons 5002-10, 5002-11, and 5002-12 move to the left, and the game folder icon 5004-7 moves up from the fourth row in the arrangement of fig. 5NN to the third row in the arrangement of fig. 5OO. In some embodiments, the second arrangement includes (1028) a second icon, distinct from the first icon, at the first location. For example, in fig. 5OO, the device displays the folder icon 5004-1-b at the location previously occupied by the stock application icon 5002-9 in fig. 5KK. In some embodiments, the first icon (e.g., stock application icon 5002-9) has been moved to a different location within the plurality of icons. In some embodiments, the first icon (e.g., stock application icon 5002-9) has been removed from the plurality of icons (e.g., by being added to the folder associated with folder icon 5004-7).
In some embodiments, the second arrangement is formed (1030) after (and in response to) detecting that the input meets predefined trigger criteria (e.g., detecting termination of the input), and forming the second arrangement includes displaying (1032) the icons on the display in a predefined arrangement (e.g., a two-dimensional grid or other regularly spaced arrangement on the touch screen 112). In some embodiments, when the auto-reconfiguration criteria have been met, the icons are dynamically reconfigured (1034) as the first icon moves around the display (e.g., so as to avoid overlap between the first icon and other icons on the touch screen 112). In other words, in some embodiments, when the second location of the first icon at least partially overlaps the respective initial position of a second icon, the device moves the second icon from its respective initial position to a respective new position in order to accommodate display of the first icon at the second location. For example, rearranging the icons includes swapping the position of the first icon with the positions of other icons as the first icon moves around the display (e.g., while contact continues to be detected on the touch screen 112).
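Closing the gap left by a moved icon (e.g., fig. 5NN to fig. 5OO, where later icons shift one slot left and up a row) reduces to filtering the icon out of the reading-order list and re-flowing the grid. A sketch with an assumed 4-column layout:

```python
def close_gap(arrangement, removed_icon, cols=4):
    """Remove one icon and re-flow the rest into rows, preserving
    reading order (cols=4 is an assumed grid width, not from the text)."""
    remaining = [icon for icon in arrangement if icon != removed_icon]
    return [remaining[i:i + cols] for i in range(0, len(remaining), cols)]

icons = ["A", "B", "C", "D", "E", "F", "G", "H", "I"]
for row in close_gap(icons, "B"):
    print(row)
# ['A', 'C', 'D', 'E']
# ['F', 'G', 'H', 'I']
```

Removing "B" shifts every later icon one slot earlier, so "E" wraps up from the second row to the first, analogous to the game folder icon moving up a row in fig. 5OO.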
Note that the details of the other processes described herein with respect to methods 600, 700, 800, 900, 1100, 1200, and 1300 (e.g., fig. 6A-6E, 7A-7C, 8A-8C, 9A-9B, 11A-11C, 12A-12E, 13A-13E, and 5A-5LLLL) also apply in a similar manner to method 1000 described above. For example, the icons (e.g., action icon 5002 and folder icon 5004) described with reference to fig. 10A-10B may have one or more of the characteristics of the various selectable user interface objects/icons/items (e.g., action icon 5002 and folder icon 5004) described herein with reference to any of methods 600, 700, 800, 900, 1100, 1200, or 1300. For the sake of brevity, these details are not repeated here.
Fig. 11A-11C are flowcharts illustrating a method 1100 for updating a dynamic folder icon to provide a visual indication of the content of a folder associated with the dynamic folder icon, in accordance with some embodiments. The method 1100 is performed at a multifunction device (e.g., the device 300 of fig. 3, or the portable multifunction device 100 of fig. 1) having a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some of the operations in method 1100 may be combined and/or the order of some of the operations may be changed.
As described below, method 1100 provides an intuitive way of managing folders. The method reduces the cognitive burden on the user in managing folders, thereby creating a more efficient human-machine interface. For battery operated computing devices, enabling users to manage folders faster and more efficiently saves power and increases the time between battery charges.
The device displays (1102) a dynamic folder icon (e.g., game folder icon 5004-7 in fig. 5SS). The dynamic folder icon (e.g., 5004-7 in fig. 5SS) includes a visual indication of the current content of the folder associated with the dynamic folder icon (e.g., "O", "R", "S", and "M" in the game folder icon 5004-7 in fig. 5SS). In some embodiments, the dynamic folder icon indicates (1104) a plurality of action icons contained within the folder associated with the dynamic folder icon. For example, the game folder icon 5004-7 in fig. 5SS includes scaled-down representations of four selectable user interface objects within the folder associated with the folder icon (e.g., the boxes containing "O", "R", "S", and "M" in the game folder icon 5004-7).
In some embodiments, the content has (1106) a spatial arrangement within the folder, and the dynamic folder icon (e.g., 5004-7 in fig. 5TT) provides a visual indication of the spatial arrangement of the content within the folder. In some embodiments, the content includes a plurality of action icons (e.g., 5002-4, 5002-13, 5002-9, and 5002-8 in fig. 5TT), the spatial arrangement is a predefined grid of action icons (e.g., the one-by-four grid of action icons shown in folder view 5138 of fig. 5TT), and each of the plurality of action icons has (1108) one or more horizontally adjacent action icons. In some embodiments, the dynamic folder icon (e.g., 5004-7 in fig. 5TT) includes scaled-down representations of at least a subset of the plurality of action icons (e.g., the boxes containing "O", "R", "S", and "M" in the game folder icon 5004-7), and the scaled-down representations are arranged (1110) such that, for the plurality of scaled-down representations, the scaled-down representation of each respective action icon is horizontally adjacent to the scaled-down representation of one of that respective action icon's adjacent action icons. In some embodiments, the folder includes a plurality of action icons having a first scale, and displaying the dynamic folder icon includes displaying (1112) scaled-down representations of one or more of the plurality of action icons within the dynamic folder icon at a second scale.
For example, in fig. 5TT, in response to a folder display input (e.g., tap gesture 5136 in fig. 5SS), the device displays a folder view 5138 for the folder associated with the game folder icon 5004-7. As shown in fig. 5TT, the folder view includes four selectable user interface objects: a card application icon 5002-4, a racing application icon 5002-13, a stock application icon 5002-9, and a map application icon 5002-8, which have a rectilinear spatial arrangement from left to right. In this example, the game folder icon 5004-7 provides a visual indication of the spatial arrangement of the action icons by displaying scaled-down representations of the content in an order based on the order of the action icons within the folder view. Specifically, the racing application icon 5002-13 is between the card application icon 5002-4 (on the left) and the stock application icon 5002-9 (on the right), and the scaled-down representation of the racing application icon (e.g., "R" in the game folder icon 5004-7 in fig. 5TT) is between the scaled-down representation of the card application icon (e.g., "O" in the game folder icon 5004-7 in fig. 5TT) and the scaled-down representation of the stock application icon (e.g., "S" in the game folder icon 5004-7 in fig. 5TT).
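Operations 1106-1112 keep the miniature's order in sync with the folder's contents, so horizontally adjacent icons stay adjacent in the preview. In the simplest reading, sketched below, the preview is just the first few items of the folder's reading-order list; the preview size of 4 matches the one-by-four example but is otherwise an assumption.

```python
def dynamic_folder_preview(folder_contents, max_items=4):
    """Scaled-down representations mirror the folder's spatial order."""
    return folder_contents[:max_items]

game_folder = ["O", "R", "S", "M"]           # cards, racing, stocks, maps
print(dynamic_folder_preview(game_folder))   # ['O', 'R', 'S', 'M']

# Repositioning the stock icon to the front (as in fig. 5UU) updates the
# preview order too, because the preview is derived from the same list.
game_folder = ["S", "O", "R", "M"]
print(dynamic_folder_preview(game_folder))   # ['S', 'O', 'R', 'M']
```

Deriving the preview from the folder's own ordering, rather than storing it separately, guarantees the adjacency property of operation 1110 without extra bookkeeping.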
The device detects (1114) an input corresponding to a request to modify content in the folder. For example, in fig. 5TT, the device detects a contact 5140 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the stock application icon 5002-9 on the display (e.g., touch screen 112) and a subsequent movement 5142 of the contact to a location on the touch-sensitive surface (e.g., touch screen 112) corresponding to a location on the display (e.g., touch screen 112) within the folder view 5138. In some embodiments, the folder is an automatically created folder, and detecting input corresponding to a request to modify content in the folder includes detecting (1116) input associated with a respective application such that a respective action icon associated with the respective application is added to the folder, removed from the folder, or moved within the folder (e.g., the folder is a "recently added applications" folder and the input is the addition of an application to the device, or the folder is a "most frequently used applications" folder and the input is application usage that makes an application one of the most frequently used applications). For example, when a user launches an application, the action icon associated with that application is moved into a "recently used applications" folder.
Operations 1120-1136 are performed (1118) in response to detecting the input. The device modifies (1120) the contents of the folder and updates the dynamic folder icon (e.g., 5004-7 in fig. 5UU) to include a visual indication of the spatial arrangement of the modified content within the folder. It should be appreciated that modification of content in a folder can include repositioning content within the folder, deleting content from the folder, and/or adding content to the folder.
In some embodiments, the device detects a repositioning input corresponding to a request to reposition the first action icon within the folder, and in response to detecting the repositioning input, the device repositions the first action icon within the folder in accordance with the repositioning input and repositions the scaled-down representation of the first action icon within the dynamic folder icon in accordance with the repositioning of the first action icon within the folder. For example, in fig. 5TT, the device detects a repositioning input that includes a contact 5140 on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the stock application icon 5002-9 on the display (e.g., touch screen 112) and a subsequent movement 5142 of the contact to a location on the touch-sensitive surface (e.g., touch screen 112) corresponding to a location on the display (e.g., touch screen 112) within the folder view 5138. In this example, in response to detecting the repositioning input, the device moves the stock application icon 5002-9 to a new position within the folder view 5138, as shown in fig. 5UU (e.g., such that the card application icon 5002-4 is displayed between the stock application icon 5002-9 and the racing application icon 5002-13). Additionally, the device rearranges the scaled-down representations within the game folder icon 5004-7, as shown in fig. 5UU, so that the scaled-down representations provide a visual indication of the updated spatial arrangement of the content within the folder. Specifically, the scaled-down representation of the card application icon (e.g., "O" in the game folder icon 5004-7 in fig. 5UU) is between the scaled-down representation of the stock application icon (e.g., "S" in the game folder icon 5004-7 in fig. 5UU) and the scaled-down representation of the racing application icon (e.g., "R" in the game folder icon 5004-7 in fig. 5UU).
In some embodiments, the device detects an icon removal input corresponding to a request to remove a first action icon from the folder, and in response to detecting the icon removal input, the device removes (1124) the first action icon from the folder and removes the scaled-down representation of the first action icon from the dynamic folder icon. For example, in fig. 5EE, the device detects an icon removal input that includes a contact 5112 at a location corresponding to the card application icon 5002-4 and a subsequent movement 5114 of the contact out of the folder view 5106 into the first portion 5108 of the background, which serves as an object removal area. In response to this icon removal input, the device removes the card application icon 5002-4 from the folder view 5106, as shown in fig. 5FF. Additionally, in this example, while the device previously displayed a scaled-down representation of the card application icon in the folder icon 5004-7 associated with the folder view 5106 (e.g., "O" in the game folder icon 5004-7 in fig. 5EE), the device ceases to display the scaled-down representation of the card application icon (e.g., "O" is no longer displayed in the game folder icon 5004-7 in fig. 5FF).
In some embodiments, the device detects an icon addition input corresponding to a request to add a first action icon to the folder, and in response to detecting the icon addition input, the device adds (1126) the first action icon to the folder and adds a scaled-down representation of the first action icon to the dynamic folder icon. For example, in fig. 5OO-5PP, the device detects an icon addition input that includes a contact 5124-a on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to the location of the map application icon 5002-8 on the display (e.g., touch screen 112) in fig. 5OO, and a subsequent movement (e.g., 5126 and 5130) of the contact on the touch-sensitive surface (e.g., touch screen 112) to a location corresponding to the location of the activation region 5128-11 of the game folder icon 5004-7 on the display (e.g., touch screen 112). In response to this icon addition input, the device adds the map application icon 5002-8 to the folder associated with the game folder icon 5004-7, as shown in fig. 5TT. Further, in this example, although the device did not previously display any scaled-down representation of the map application icon in the folder icon 5004-7 (e.g., as shown in fig. 5OO-5PP), after (or in conjunction with) adding the map application icon 5002-8 to the folder, the device displays a scaled-down representation of the map application icon within the game folder icon 5004-7 (e.g., "M" is displayed in fig. 5QQ).
In some embodiments, when the contents of the folder include an action icon whose corresponding application has a notification, the dynamic folder icon changes (1128). In some embodiments, the dynamic folder icon is changed by displaying (1130) a status indicator on the dynamic folder icon. For example, in FIG. 5B, folder icon 5004-1-b displays a notification badge 5012 indicating that one of the applications associated with the application icons within the folder represented by folder icon 5004-1-b has a notification. In some embodiments, the dynamic folder icon is changed by displaying (1132) a status indicator on a scaled-down version of an action icon within the dynamic folder icon. For example, in FIG. 5B, folder icon 5004-1-b displays a notification badge 5014 on a scaled-down representation (e.g., "x7") within the folder icon, which indicates that the application associated with the scaled-down representation has a notification (e.g., if the application associated with "x7" is an email application, the notification would typically indicate that new mail has arrived).
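The folder-level badge described above can be derived by aggregating per-application notification counts. A hedged sketch, with made-up function and key names not taken from the patent:

```python
# The folder icon's badge (cf. badge 5012 in FIG. 5B) reflects the
# notifications of the applications inside the folder; each scaled-down
# representation can also carry its own badge (cf. badge 5014 on "x7").
def folder_badge_count(notification_counts):
    """Total notifications across all applications in the folder."""
    return sum(notification_counts.values())

counts = {"x7": 2, "x8": 0, "x9": 1}   # "x7" might be an email app with new mail
assert folder_badge_count(counts) == 3  # folder-level badge
assert counts["x7"] > 0                 # per-representation badge for "x7"
```

Deriving the folder badge from the per-application counts means the dynamic folder icon updates automatically whenever any contained application gains or clears a notification.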
In some embodiments, the appearance of the dynamic folder icon changes when the device is in the user interface reconfiguration mode. In some embodiments, the folder has limited space (e.g., 9 slots, 12 slots, 16 slots, or any other reasonable number of slots) for displaying selectable user interface objects (e.g., selectable user interface objects such as application icons and/or file icons), and the appearance of the dynamic folder icon indicates (1136) whether the folder has space to display any additional selectable user interface objects in the user interface reconfiguration mode (e.g., by displaying room for reduced-scale representations of additional action icons).
For example, in FIG. 5A, when the device is in a normal operating mode, dynamic folder icon 5004-1-a displays scaled-down representations (e.g., "x1", "x2", "x3", "x4", "x5", and "x6") representing content within the folder (e.g., action icons 5002). However, in this example, there are more than six action icons 5002 within the folder, and thus the device displays reduced-scale representations of only the first six action icons within the dynamic folder icon. Continuing with the present example, when the device enters the user interface reconfiguration mode (e.g., in response to detecting the press and hold gesture 5008 in FIG. 5A), the device changes the dynamic folder icon (e.g., from dynamic folder icon 5004-1-a in FIG. 5A to dynamic folder icon 5004-1-b in FIG. 5B), moving the reduced-scale representations within the dynamic folder icon up to reveal an additional reduced-scale representation (e.g., "x7" in folder icon 5004-1-b) while ceasing to display the first three reduced-scale representations (e.g., "x1", "x2", and "x3") within folder icon 5004-1-b. In this example, the dynamic folder icon also displays two empty spaces immediately following the new reduced-scale representation. In addition to indicating that there are more than six items in the folder associated with folder icon 5004-1-b, changing the dynamic folder icon by showing two empty spaces next to the new reduced-scale representation also provides an indication that there is space within the folder to display additional action icons.
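The preview change described in this example can be sketched as a small grid computation. This is an illustrative assumption (a 3x3 slot grid with two visible rows), not the patent's stated geometry:

```python
# Sketch of the mode-dependent preview: in normal mode the first rows of
# scaled-down representations are shown; in reconfiguration mode the grid
# scrolls up one row so the overflow item ("x7") and the empty slots
# after it become visible (cf. FIGS. 5A-5B).
def visible_preview(items, reconfiguring, cols=3, visible_rows=2):
    rows = [items[i:i + cols] for i in range(0, len(items), cols)]
    if reconfiguring and len(rows) > visible_rows:
        rows = rows[1:]                       # scroll up: hide the first row
    rows = rows[:visible_rows]
    # pad the last visible row with empty slots (potential drop targets)
    if rows and len(rows[-1]) < cols:
        rows[-1] = rows[-1] + [None] * (cols - len(rows[-1]))
    return rows

items = ["x1", "x2", "x3", "x4", "x5", "x6", "x7"]
assert visible_preview(items, False) == [["x1", "x2", "x3"], ["x4", "x5", "x6"]]
assert visible_preview(items, True) == [["x4", "x5", "x6"], ["x7", None, None]]
```

The `None` entries correspond to the two empty spaces that signal room for additional action icons.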
Note that the details of the other processes described herein with respect to methods 600, 700, 800, 900, 1000, 1200, 1300 (e.g., fig. 6A-6E, 7A-7C, 8A-8C, 9A-9B, 10A-10B, 12A-12E, 13A-13E, and 5A-5 LLLL) also apply in a similar manner to method 1100 described above. For example, the selectable user interface objects (e.g., action icon 5002 and folder icon 5004) described with reference to fig. 11A-11C may have one or more of the characteristics of the various selectable user interface objects/icons/items (e.g., action icon 5002 and folder icon 5004) described herein with reference to any of methods 600, 700, 800, 900, 1000, 1200, or 1300. For the sake of brevity, these details are not repeated here.
Figs. 12A-12E are flowcharts illustrating a method 1200 for providing context information in conjunction with displaying the contents of a folder, according to some embodiments. Method 1200 is performed at a multifunction device (e.g., device 300 of fig. 3, or portable multifunction device 100 of fig. 1) having a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some of the operations in method 1200 may be combined and/or the order of some of the operations may be changed.
As described below, method 1200 provides an intuitive way of displaying the contents of a folder. The method reduces the cognitive burden on the user in managing folders by providing contextual information in conjunction with displaying the contents of the folders, thereby creating a more efficient human-machine interface. For battery operated computing devices, enabling users to manage folders faster and more efficiently saves power and increases the time between battery charges.
The device displays (1202) a folder icon (e.g., folder icons 5004-7 in fig. 5S) on top of a wallpaper background on a display (e.g., touch screen 112), the folder icon corresponding to a folder containing content including one or more selectable user interface objects (e.g., application icons, bookmark icons, and/or document icons). In some embodiments, the device displays (1204) one or more additional selectable user interface objects (e.g., action icons 5002-1, 5002-2, 5002-3, 5002-5, 5002-6, 5002-7, 5002-8, 5002-9, 5002-10, 5002-11, 5002-12, 5002-14, 5002-15, 5002-16, and folder icons 5004-1-b and 5004-2) before detecting the first input.
The device detects (1206) a first input (e.g., a tap gesture 5076 in fig. 5S) corresponding to a request to display content of a folder (e.g., a folder associated with folder icons 5004-7 in fig. 5S).
Operations 1210-1226 are performed (1208) in response to detecting a first input (e.g., tap gesture 5076 in fig. 5S). The device divides (1210) the wallpaper background into a first portion (e.g., 5084 in fig. 5X-5 AA) and a second portion (5086 in fig. 5X-5 AA). In some embodiments, the one or more additional selectable user interface objects include: a first set of additional selectable user interface objects (e.g., action icons 5002-1, 5002-2, 5002-3, 5002-5, 5002-6, 5002-7, 5002-8, 5002-9, 5002-10, 5002-11, 5002-12, and folder icons 5004-1-b) having predefined locations on a first portion 5084 of the wallpaper background; and a second set of additional selectable user interface objects (e.g., action icons 5002-14, 5002-15, 5002-16, and folder icon 5004-2 in fig. 5X-5 AA) having predefined locations on the second portion 5086 of the wallpaper background (1204).
In some embodiments, a folder icon (e.g., 5004-7 in fig. 5X-5 AA) is displayed (1212) within a first portion of the wallpaper (e.g., 5084 in fig. 5X-5 AA). In some embodiments, when the folder icon (e.g., 5004-2 in fig. 5 CCC-5 FFF) is one of a plurality of selectable user interface objects in a dock (e.g., 5006 in fig. 5 CCC-5 FFF) of a display (e.g., touch screen 112), the first portion (e.g., 5172 in fig. 5 DDD-5 FFF) includes the dock of the display. In contrast, in some embodiments, when the folder icon (e.g., 5004-7 in fig. 5X-5 AA) is one of the plurality of selectable user interface objects outside of the dock (e.g., 5006 in fig. 5X-5 AA) of the display (e.g., touch screen 112), the second portion (e.g., 5086 in fig. 5X-5 AA) includes the dock (e.g., 5006 in fig. 5X-5 AA) of the display. In other words, according to these embodiments, when the folder icon is in the dock, the wallpaper is split above the folder icon, and when the folder icon is in the navigation area above the dock, the wallpaper is split below the folder icon. In some embodiments, the navigation area (e.g., a home screen with a plurality of selectable user interface objects) has a plurality of pages (e.g., the pages can be navigated in response to detecting a horizontal swipe gesture) and the dock remains in a fixed position even when the device scrolls through an arrangement of selectable user interface objects in the navigation area.
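The rule above (split above a dock folder icon, below a navigation-area folder icon) can be sketched as a one-line decision. The coordinate convention and pixel values below are assumptions for illustration only:

```python
# Sketch of choosing the wallpaper split line so that the folder icon
# always remains in the first portion: split above the icon when it is
# in the dock, below it otherwise (cf. FIGS. 5X-5AA and 5CCC-5FFF).
# y grows downward; the dock occupies the bottom of the screen.
def split_y(icon_top, icon_bottom, dock_top):
    in_dock = icon_top >= dock_top
    return icon_top if in_dock else icon_bottom

DOCK_TOP = 900                               # hypothetical pixel coordinate
assert split_y(920, 980, DOCK_TOP) == 920    # dock icon: split above it
assert split_y(300, 360, DOCK_TOP) == 360    # navigation-area icon: split below
```

Keeping the folder icon on the first-portion side of the split line is what lets the cut-out shape (described below for some embodiments) point back at the icon that opened the folder view.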
The device moves (1214) the second portion away from the first portion. For example, in FIGS. 5X-5Y, the second portion 5086 is moved away from the first portion 5084. As another example, in FIG. 5DDD, the second portion 5174 is moved away from the first portion 5172. In some embodiments, the first portion is also moved away from the second portion. For example, in FIGS. 5VV-5WW, the device moves both the upper portion of the wallpaper background and the lower portion of the wallpaper background away from each other. In some embodiments, moving the first portion away from the second portion includes moving (1216) a first set of additional selectable user interface objects in accordance with movement of the first portion of the wallpaper background, and moving a second set of additional selectable user interface objects in accordance with movement of the second portion of the wallpaper background. As shown, for example, in FIGS. 5X-5Y, as the second portion 5086 is moved away from the first portion 5084, the selectable user interface objects in the second portion 5086 (e.g., 5002-14, 5002-15, 5002-16, and 5004-2 in FIG. 5X) are moved off of the display (e.g., touch screen 112).
In some embodiments, dividing the wallpaper background includes dividing (1218) the wallpaper along a dividing line such that a contour of a first edge (e.g., 5088 in FIG. 5X) of the first portion is complementary to a contour of a second edge (e.g., 5090 in FIG. 5X) of the second portion (e.g., the first edge of the first portion and the second edge of the second portion fit together like pieces of a puzzle), and moving the second portion away from the first portion includes adjusting the contour of the first edge and/or adjusting the contour of the second edge such that the contour of the first edge ceases to be complementary to the contour of the second edge (e.g., after the portions are moved apart, the first portion and the second portion no longer fit together like pieces of a puzzle). For example, as shown in FIG. 5X, the contour of first edge 5088 is initially complementary to the contour of second edge 5090, whereas in FIG. 5Y, the contour of first edge 5088 is no longer complementary to the contour of second edge 5090. In some embodiments, the folder icon is displayed within the first portion of the wallpaper; and the contour of the first edge includes (1220) a cut-out shape (e.g., 5094 in FIGS. 5X-5Y) defined by the dividing line. In some embodiments, the cut-out shape provides a visual indication of the location of the folder icon within the first portion. For example, in FIGS. 5X-5Y, the cut-out shape 5094 points to the folder icon 5004-7 associated with the folder view 5092.
The device displays (1222) the contents of the folder in an area between the first portion and the second portion. For example, the contents of the folder include a plurality of application icons (e.g., 5002-4 and 5002-13 in FIG. 5Y, or 5002-26, 5002-27, 5002-28, 5002-29, 5002-30, 5002-31, 5002-32, 5002-33, 5002-34, 5002-35, 5002-36, and 5002-37 in FIG. 5 EEE) and the application icons are presented as if they were underneath the wallpaper (e.g., as if the wallpaper were a sliding door that has been opened to present the application icons behind the wallpaper), as shown in FIGS. 5X-5Y and 5 CCC-5 EEE.
In some embodiments, the size of the region between the first portion and the second portion is determined (1224) based on the number of selectable user interface objects within the folder. For example, in FIG. 5Y, the folder includes only two selectable user interface objects, and thus folder view 5092 occupies significantly less than half of the display. As another example, in FIG. 5EEE, where the folder includes at least twelve selectable user interface objects, the folder view 5170 occupies more than half of the display. Thus, the folder view (e.g., 5092 or 5170) occupies only as much space on the display (e.g., touch screen 112) as it needs, thereby leaving more room on the display (e.g., touch screen 112) to show context information (e.g., other selectable user interface objects outside of the folder view).
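Operation (1224) amounts to sizing the folder view by the rows its contents require. A sketch under assumed dimensions (grid width, row height, and padding are illustrative, not from the patent):

```python
# Size the folder view to the number of icon rows needed: two icons fit
# in one row (cf. small folder view 5092 in FIG. 5Y), while twelve icons
# need four rows (cf. large folder view 5170 in FIG. 5EEE).
import math

def folder_view_height(icon_count, cols=3, row_height=96, padding=24):
    rows = max(1, math.ceil(icon_count / cols))
    return rows * row_height + 2 * padding

assert folder_view_height(2) == 1 * 96 + 48      # small folder: one row
assert folder_view_height(12) == 4 * 96 + 48     # large folder: four rows
assert folder_view_height(12) > folder_view_height(2)
```

Because the height grows only with the content, the remaining screen area stays available for the contextual objects outside the folder view.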
In some embodiments, prior to detecting the first input, the device displays one or more additional selectable user interface objects (e.g., action icons 5002-1, 5002-2, 5002-3, 5002-5, 5002-6, 5002-7, 5002-8, 5002-9, 5002-10, 5002-11, 5002-12, and folder icon 5004-1-b in FIGS. 5X-5AA); and, in response to detecting the first input, the device visually distinguishes (1226) the folder icon from the one or more additional selectable user interface objects (e.g., by highlighting, or by changing the brightness, contrast, hue, saturation, color, etc. of the additional selectable user interface objects as compared to that of the folder icon). In some embodiments, the initial brightness, contrast, hue, saturation, color, etc. of the additional selectable user interface objects (e.g., before the first input is detected) is the same as that of the folder icon. For example, in FIGS. 5Y-5AA, the device displays the folder icon 5004-7 with normal opacity and saturation, while the other selectable user interface objects (e.g., action icons 5002-1, 5002-2, 5002-3, 5002-5, 5002-6, 5002-7, 5002-8, 5002-9, 5002-10, 5002-11, 5002-12, and folder icon 5004-1-b in FIGS. 5X-5AA) are displayed with reduced opacity and saturation.
In some embodiments, operations 1230-1232 are performed when the device is in (1228) the normal mode of operation when the first input is detected. In some of these embodiments, the device detects (1230) a second input corresponding to activation of a respective selectable user interface object in the folder; and, in response to detecting the second input, the device performs (1232) an operation associated with the respective selectable user interface object. For example, in fig. 5PPP, the device displays a folder view 5212 when the device is in a normal operating mode. In this example, the device detects a second input (e.g., a tap gesture 5214) at a location corresponding to the stock application icons 5002-9, and in response to the second input, the device activates the stock application (e.g., launches or displays a view for the stock application).
In some embodiments, operations 1236-1238 are performed when the device is (1234) in the user interface reconfiguration mode when the first input is detected. In some of these embodiments, the device detects (1236) a second input corresponding to a request to move a respective selectable user interface object within the folder; and in response to detecting the second input, the device moves (1238) the respective selectable user interface object within the folder. For example, in FIG. 5TT, the device displays folder view 5138 while the device is in the user interface reconfiguration mode. In this example, the device detects a second input (e.g., contact 5140 at a location on touch screen 112 corresponding to the location of stock application icon 5002-9 and movement 5142 of the contact across touch screen 112), and in response to the second input, the device moves the stock application icon 5002-9 to a new location within folder view 5138 in accordance with the input, as shown in FIG. 5SS.
In some embodiments, operations 1242-1244 are performed while the contents of the folder are displayed (1240) in the area between the first portion and the second portion: the device detects (1242) an input corresponding to a selection of the first portion or the second portion. In some embodiments, in response to detecting the input, the device ceases (1244) to display the contents of the folder (e.g., by moving the first portion and the second portion together to collapse the region). For example, in FIG. 5UU, while the device is displaying a folder view 5138 including content (e.g., action icons 5002-9, 5002-4, 5002-13, and 5002-8), the device detects a second input (e.g., a tap gesture 5144) on the touch-sensitive surface (e.g., touch screen 112) at a location corresponding to a location on the display (e.g., touch screen 112) outside of folder view 5138, and in response to detecting the second input, the device ceases to display folder view 5138, as shown in FIG. 5VV.
In some embodiments, the device enters (1246) a user interface reconfiguration mode and, while the contents of the folder are displayed (1248) in the area between the first portion and the second portion, the device detects (1250) an input corresponding to a request to move a respective selectable user interface object from the area between the first portion and the second portion into the first portion or the second portion. In some of these embodiments, in response to detecting the input, the device removes (1252) the respective selectable user interface object from the folder. In some embodiments, the folder view ceases to be displayed (e.g., by moving the first and second portions together to collapse the region) in response to the selectable user interface object being moved out of the folder. For example, in FIG. 5YY, while the device is displaying the folder view 5148, the device detects an input corresponding to a request to move a selectable user interface object out of the folder view 5148. Specifically, the device detects a contact 5162 at a location (e.g., first location 5162-a in FIG. 5YY) on the touch-sensitive surface (e.g., touch screen 112) corresponding to the location of the application-4 application icon 5002-22 on the display (e.g., touch screen 112), and a subsequent movement 5164 of the contact to a location (e.g., second location 5162-b as shown in FIG. 5ZZ) on the touch-sensitive surface (e.g., touch screen 112) corresponding to a location on the display (e.g., touch screen 112) outside of the folder view. In this example, after detecting that a pause in the movement has continued for more than a predetermined period of time, the device ceases to display the folder view, as shown in FIG. 5AAA. Subsequently, the selectable user interface object (e.g., application-4 application icon 5002-22) is removed from the folder and displayed outside the folder view in the arrangement of selectable user interface objects, as shown in FIG. 5CCC.
In some embodiments, a first portion of the contents of the folder is displayed (1254) in the area between the first portion and the second portion (e.g., folder view 5170 in FIG. 5EEE). In some of these embodiments, the device detects (1256) a next-portion input corresponding to a request to display a next portion of the contents of the folder; and in response to detecting the next-portion input, the device displays (1258) a second portion of the contents of the folder in the area between the first portion and the second portion (e.g., detecting a flick gesture to the left or right causes a next page or a previous page of application icons to be displayed in the area). For example, in FIG. 5EEE, the device displays a first portion of the contents of the folder in folder view 5170 that includes a first plurality of selectable user interface objects (e.g., 5002-26, 5002-27, 5002-28, 5002-29, 5002-30, 5002-31, 5002-32, 5002-33, 5002-34, 5002-35, 5002-36, and 5002-37 in FIG. 5EEE). In FIG. 5FFF, in response to detecting a swipe gesture (e.g., contact 5176 and movement 5178 of the contact to the left at a location on touch screen 112 corresponding to a location within folder view 5170), the device displays a second portion of the contents of the folder within folder view 5170 on the display (e.g., touch screen 112) that includes a second plurality of selectable user interface objects (e.g., 5002-38, 5002-39, 5002-40, 5002-41, 5002-42, and 5002-43 in FIG. 5FFF).
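The paging behavior just described can be sketched as a clamped page-index update; the clamping behavior at the first and last pages is an assumption for illustration:

```python
# Sketch of next-portion navigation within the folder view: a left swipe
# shows the next portion of the folder's contents, a right swipe the
# previous one, clamped to the valid page range (cf. FIGS. 5EEE-5FFF).
def page_after_swipe(current, page_count, swipe):
    """swipe: 'left' shows the next page, 'right' the previous one."""
    delta = 1 if swipe == "left" else -1
    return min(max(current + delta, 0), page_count - 1)

assert page_after_swipe(0, 2, "left") == 1    # FIG. 5EEE -> FIG. 5FFF
assert page_after_swipe(0, 2, "right") == 0   # already at the first page
assert page_after_swipe(1, 2, "left") == 1    # already at the last page
```

A continuously scrolling embodiment (described next) would instead adjust a pixel offset rather than a discrete page index.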
In some embodiments, a first portion of the contents of the folder is displayed (1260) in the area between the first portion and the second portion. In some of these embodiments, the device detects (1262) a scroll input corresponding to a request to scroll the contents of the folder; and in response to detecting the scroll input, the device scrolls (1264) the contents of the folder laterally on the display (e.g., touch screen 112) in the area between the first portion and the second portion to display a second portion of the contents. In other words, in some embodiments, in response to detecting a scroll input, the device continuously scrolls a list or array of selectable user interface objects instead of switching between distinct portions of the contents of the folder (e.g., in response to a flick gesture comprising a contact 5176 and a subsequent movement 5178 of the contact on touch screen 112 at a location corresponding to a location within the folder view, as shown in FIG. 5EEE).
Note that the details of the other processes described herein with respect to methods 600, 700, 800, 900, 1000, 1100, 1300 (e.g., fig. 6A-6E, 7A-7C, 8A-8C, 9A-9B, 10A-10B, 11A-11C, 13A-13E, and 5A-5 LLLL) also apply in a similar manner to method 1200 described above. For example, the selectable user interface objects (e.g., action icon 5002 and folder icon 5004) described with reference to fig. 12A-12E may have one or more of the characteristics of the various selectable user interface objects/icons/items (e.g., action icon 5002 and folder icon 5004) described herein with reference to any of methods 600, 700, 800, 900, 1000, 1100, or 1300. For the sake of brevity, these details are not repeated here.
Figs. 13A-13E are flowcharts illustrating a method 1300 for displaying and navigating a multi-page folder, according to some embodiments. The method 1300 is performed at an electronic device (e.g., the device 300 of fig. 3, or the portable multifunction device 100 of fig. 1) having a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some of the operations in method 1300 may be combined and/or the order of some of the operations may be changed.
As described below, the method 1300 provides an intuitive way of displaying and navigating a multi-page folder. The method reduces the cognitive burden on the user when navigating the user interface, particularly when navigating folders, thereby creating a more efficient human-machine interface. For battery operated computing devices, enabling a user to navigate folders faster and more efficiently saves power and increases the time between battery charges.
The device concurrently displays (1302) a plurality of selectable user interface objects including one or more folder icons on a display. For example, FIG. 5QQQ illustrates an exemplary user interface in which selectable user interface objects, such as action icon 5002 and folder icon 5004, are displayed on touch screen 112.
The device detects (1304) a first input (e.g., a tap gesture on a respective folder icon) corresponding to a request to select the respective folder icon for a respective folder. For example, FIG. 5QQQ illustrates the detection of a contact 5216, which corresponds to a request to select folder icon 5004-12. The respective folder includes a first number of selectable icons that are divided among a plurality of distinct, separately displayed pages, including a first page and a second page. For example, the folder associated with folder icon 5004-12 (FIG. 5QQQ) includes: a first page (shown in FIG. 5TTT) comprising selectable action icons 5002-51 through 5002-59, a second page (shown in FIG. 5VVV) comprising selectable action icons 5002-60 through 5002-68, and a third page (shown in FIG. 5FFFF) comprising selectable action icons 5002-69 and 5002-70.
In response to detecting the first input, the device displays a folder view (e.g., a first page of the folder, shown in FIG. 5TTT) for the respective folder. In some embodiments, the plurality of selectable user interface objects (shown in FIG. 5QQQ) are displayed (1308) in a respective arrangement before the first input is detected. In such embodiments, displaying the folder view for the respective folder in response to detecting the first input includes zooming in on the respective folder icon (1310). Zooming in on the respective folder icon includes expanding the respective arrangement such that the respective folder icon expands and moves toward the center of the display (e.g., as shown in FIGS. 5RRR-5SSS). After zooming in on the respective folder icon, displaying the folder view for the respective folder in response to detecting the first input includes displaying (1312) the folder view in place of the enlarged respective folder icon (e.g., as shown in FIG. 5TTT).
In some embodiments, the folder view includes space for simultaneously displaying (1314) no more than a second number of selectable icons, the second number being less than the first number of selectable icons. For example, in the example shown in FIG. 5TTT, the folder view includes space for simultaneously displaying no more than nine selectable icons (e.g., action icons 5002). In some embodiments, the second number is determined at least in part by a user setting. For example, the user may set the icon size to be small, medium, or large, and the second number is determined according to the icon size. The folder view includes (1316) a first page displaying a first subset of the selectable icons in the folder (e.g., selectable action icons 5002-51 through 5002-59 in FIG. 5TTT). In some embodiments, the first subset of selectable icons is displayed in a first arrangement in the first page of the folder view. In such embodiments, prior to detecting the first input, the respective folder icon includes a plurality of scaled-down representations of the first subset of selectable icons displayed in the first arrangement. For example, as shown in FIG. 5QQQ, the folder icon 5004-12 includes scaled-down representations "y1", "y2", "y3", "y4", "y5", "y6", "y7", "y8", and "y9", which have a one-to-one correspondence with the following applications, respectively: "application y1" (5002-51), "application y2" (5002-52), "application y3" (5002-53), "application y4" (5002-54), "application y5" (5002-55), "application y6" (5002-56), "application y7" (5002-57), "application y8" (5002-58), and "application y9" (5002-59), each of which is a respective application whose user interface is displayed in response to detecting activation of the corresponding selectable icon.
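Dividing a folder's first number of selectable icons among pages holding no more than the second number each is a simple chunking operation, sketched here with nine icons per page to match the folder view of FIG. 5TTT:

```python
# Divide a folder's icons among distinct, separately displayed pages of
# at most `per_page` icons each (cf. operations 1304-1316).
def paginate(icons, per_page=9):
    return [icons[i:i + per_page] for i in range(0, len(icons), per_page)]

icons = [f"5002-{n}" for n in range(51, 71)]      # 20 icons: 5002-51..5002-70
pages = paginate(icons)
assert len(pages) == 3                            # cf. FIGS. 5TTT/5VVV/5FFFF
assert pages[0][0] == "5002-51" and len(pages[0]) == 9
assert len(pages[2]) == 2                         # 5002-69 and 5002-70
```

With `per_page` derived from the user's icon-size setting, the same folder contents would simply repaginate into a different number of pages.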
In some embodiments, in response to detecting the first input, the device ceases to display (1320) the plurality of selectable user interface objects.
While displaying the first page of the folder view, the device detects (1322) a second input (e.g., a horizontal swipe gesture on the first page within the folder view) corresponding to a request to display a second page of the folder view. For example, gesture 5220 (FIG. 5TTT) is an example of a request to page to the right. In some embodiments, the first input is a tap gesture (1324) on the respective folder icon and the second input is a swipe gesture within the folder view. In some embodiments, displaying the folder view includes displaying (1326) two or more page indicator icons (e.g., page indicator icons 5217, FIG. 5TTT) that provide information about the number of distinct pages among which the selectable icons in the folder are divided. After the first input is detected and before the second input is detected, the two or more page indicator icons have a first appearance. In some embodiments, the page indicator icons indicate (1328) the position, within the sequence of pages of selectable icons in the respective folder, of the page currently displayed in the folder view (e.g., when the first page of the folder view is displayed, the leftmost page indicator is filled in, as shown in FIG. 5TTT; when the second page of the folder view is displayed, the second page indicator from the left is filled in, as shown in FIG. 5VVV; and when the third page of the folder view is displayed, the third page indicator from the left is filled in, as shown in FIG. 5HHHH).
In some embodiments, in response to detecting the second input, the device updates (1330) the two or more page indicator icons to have a second appearance that is different from the first appearance. In some embodiments, a first appearance of the page indicator icon indicates (1332) that a first page is currently displayed in the folder view, and a second appearance of the page indicator icon indicates that a second page is currently displayed in the folder view.
In response to detecting the second input, the device stops (1334) displaying the first page of the folder view and displaying the second page of the folder view for the corresponding folder. The second page of the folder view includes a second subset of selectable icons that is different from the first subset of selectable icons. For example, as shown in fig. 5TTT to 5VVV, the display of the first page is changed to the display of the second page. The appearance of the page indicator icon 5217 is also updated to reflect the change from the display of the first page to the display of the second page.
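The page indicator behavior in operations 1326-1334 reduces to rendering one dot per page, with the current page's dot filled in. A minimal sketch:

```python
# Sketch of page indicator icons (cf. indicator 5217): one dot per page,
# the dot for the currently displayed page filled in, the rest hollow.
def page_indicators(page_count, current):
    return ["filled" if i == current else "hollow" for i in range(page_count)]

assert page_indicators(3, 0) == ["filled", "hollow", "hollow"]  # FIG. 5TTT
assert page_indicators(3, 1) == ["hollow", "filled", "hollow"]  # FIG. 5VVV
assert page_indicators(3, 2) == ["hollow", "hollow", "filled"]  # FIG. 5HHHH
```

Regenerating the indicator list after each page change is what produces the "second appearance" described for operation (1330).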
In some embodiments, when the second page of the folder view is displayed, the device detects (1336) a third input corresponding to a request to close the folder view (e.g., activating a home button with a press input of contact 5236, as shown in FIG. 5HHHH, or a tap gesture on a portion of the touch screen display associated with closing the folder view). In response to detecting (1338) the third input, the device stops (1340) displaying the folder view. In some embodiments, the device displays a distinct user interface, different from the folder view, after ceasing to display the folder view. For example, in some embodiments, the device displays a home screen with the respective folder icon. The device displays (1342) reduced-scale representations of the second subset of selectable icons in the respective folder icon for the respective folder (e.g., "y19", "y20", and "y16" in folder icon 5004-12 of FIG. 5IIII, corresponding to "application y19", "application y20", and "application y16", respectively, in FIG. 5HHHH). After displaying the reduced-scale representations of the second subset of selectable icons in the respective folder icon, the device displays (1344) a replacement of the reduced-scale representations of the second subset of selectable icons with reduced-scale representations of the first subset of selectable icons in the respective folder icon (e.g., by sliding the reduced-scale representations of the second subset of selectable icons out of the folder icon in a first direction and sliding the reduced-scale representations of the first subset of selectable icons into the folder icon in the first direction). For example, FIG. 5JJJJ illustrates the result of replacing the reduced-scale representations of "y19", "y20", and "y16" with reduced-scale representations of the icons in the first page of the folder in folder icon 5004-12.
In some embodiments, while displaying the second page of the folder view, the device detects (1346) a fourth input corresponding to a request to move a respective icon from the second page of the folder view to a respective location on the display (e.g., while in an icon reconfiguration mode initiated by a long press on the respective icon). In response to detecting (1348) the fourth input (e.g., movement of contact 5226 from location 5226-c in FIG. 5DDDD to location 5226-d in FIG. 5EEEE), the device moves (1350) the respective icon from the second page to a different page of the folder view in accordance with a determination that the respective location is within a predetermined page change region (e.g., page change region 5233-2 as shown in FIG. 5EEEE). On the other hand, in accordance with a determination that the respective location is within a predetermined icon removal area distinct from the page change region (e.g., movement of contact 5226 from location 5226-a in FIG. 5AAAA to location 5226-b in FIG. 5BBBB, which is located in icon removal area 5231), the device removes (1354) the respective icon from the folder view. In some embodiments, the icon removal area is located (1356) at or near the bottom of the folder view. In some embodiments, in addition to removing the respective icon from the folder view, the respective icon is added to the plurality of selectable user interface objects that include the respective folder icon.
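The branch in operations 1348-1356 is a hit test on the drop location. The region geometry below is an assumption for illustration (the patent only states that the removal area lies at or near the bottom of the folder view):

```python
# Sketch of routing a dragged icon by the region its drop point falls in:
# a band near the bottom acts as the icon removal area (cf. area 5231),
# vertical strips at the edges act as page change regions (cf. 5233-2),
# and anywhere else rearranges the icon within the current page.
def classify_drop(x, y, view):
    if y >= view["bottom"] - view["removal_band"]:
        return "remove"            # icon removal area near the bottom
    if x <= view["left"] + view["edge"] or x >= view["right"] - view["edge"]:
        return "page-change"       # page change region at either edge
    return "rearrange"

view = {"left": 0, "right": 320, "top": 100, "bottom": 500,
        "removal_band": 60, "edge": 30}
assert classify_drop(160, 480, view) == "remove"
assert classify_drop(310, 300, view) == "page-change"
assert classify_drop(160, 300, view) == "rearrange"
```

Checking the removal band before the edge strips resolves the corner cells where both regions would otherwise overlap.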
In some embodiments, the plurality of selectable user interface objects is (1358) a first page of selectable user interface objects in a plurality of distinct, separately displayed pages of selectable user interface objects. In some embodiments, the device receives (1360) a fifth input corresponding to a request to display a different page of the plurality of distinct, separately displayed pages of selectable user interface objects (e.g., a horizontal finger swipe gesture on the touch screen or touch-sensitive surface at a location corresponding to the selectable user interface objects of the first page, such as horizontal movement of contact 5238 from location 5238-a in FIG. 5JJJJ to location 5238-b in FIG. 5KKKK and a subsequent liftoff of contact 5238 from touch screen 112). In response to receiving (1362) the fifth input, the device ceases (1364) to display the selectable user interface objects of the first page (e.g., 5002-4, 5002-2, 5002-3, 5002-5, 5002-6, 5002-22, 5004-7, 5002-7, 5004-1-b, 5002-10, 5002-11, 5002-12, 5002-13, and 5004-12) and, optionally, displays (1366) selectable user interface objects of a second page (e.g., icons 5002-80 through 5002-93, as shown in FIG. 5LLLL) that are different from the selectable user interface objects of the first page.
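The multi-page behavior above — a horizontal swipe ceases display of one page's objects and displays a different page's objects — can be sketched as a paged container. All names here are illustrative, and the swipe gesture itself is abstracted to a direction:

```python
class PagedHomeScreen:
    """Hypothetical model of distinct, separately displayed pages of objects."""

    def __init__(self, pages):
        self.pages = pages   # list of lists of selectable user interface objects
        self.current = 0     # index of the currently displayed page

    def visible_objects(self):
        return self.pages[self.current]

    def swipe(self, direction):
        # direction: +1 for next page (leftward swipe), -1 for previous page.
        # Steps 1364/1366: displaying the target page implies ceasing to
        # display the current page; out-of-range swipes are ignored.
        target = self.current + direction
        if 0 <= target < len(self.pages):
            self.current = target
        return self.visible_objects()
```

For example, starting on a first page `["5002-4", "5004-12"]`, one swipe shows the second page's objects and a reverse swipe returns to the first page.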
Note that the details of the other processes described herein with respect to methods 600, 700, 800, 900, 1000, 1100, and 1200 (e.g., FIGS. 6A-6E, 7A-7C, 8A-8C, 9A-9B, 10A-10B, 11A-11C, 12A-12E, and 5A-5LLLL) also apply in a similar manner to method 1300 described above. For example, the selectable user interface objects (e.g., action icon 5002 and folder icon 5004) described with reference to FIGS. 13A-13E may have one or more of the characteristics of the various selectable user interface objects/icons/items (e.g., action icon 5002 and folder icon 5004) described herein with reference to any of methods 600, 700, 800, 900, 1000, 1100, or 1200. For the sake of brevity, these details are not repeated here.
Fig. 14 illustrates a functional block diagram of an electronic device 1400 configured in accordance with the principles of the various described embodiments, in accordance with some embodiments. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. Those skilled in the art will appreciate that the functional blocks depicted in fig. 14 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Accordingly, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in fig. 14, the electronic device 1400 includes a display unit 1402 configured to display a graphical user interface, an input unit 1404 configured to receive input (e.g., a touch-sensitive surface unit configured to receive contact); and a processing unit 1408 coupled to the display unit 1402 and the input unit 1404. In some embodiments, processing unit 1408 includes a detection unit 1410, a display implementation unit 1412, a stop unit 1414, an update unit 1416, a scaling unit 1418, a movement unit 1420, and a removal unit 1422.
The processing unit 1408 is configured to: enable (e.g., with the display implementation unit 1412) simultaneous display of a plurality of selectable user interface objects, including one or more folder icons, on the display unit 1402; detect (e.g., with the detection unit 1410) a first input corresponding to a request to select a respective folder icon for a respective folder, the respective folder including a first number of selectable icons divided between a plurality of distinct, separately displayed pages including a first page and a second page; and, in response to detecting the first input, enable (e.g., with the display implementation unit 1412) display of a folder view. The folder view includes space to simultaneously display no more than a second number of selectable icons, the second number being less than the first number; and the folder view displays the first page, which includes a first subset of the selectable icons in the folder. While display of the first page of the folder view is enabled, the processing unit 1408 is configured to detect (e.g., with the detection unit 1410) a second input corresponding to a request to display the second page of the folder view; and, in response to detecting the second input, cease (e.g., with the stop unit 1414) to enable display of the first page of the folder view and enable (e.g., with the display implementation unit 1412) display of the second page of the folder view for the respective folder, wherein the second page of the folder view includes a second subset of selectable icons different from the first subset of selectable icons.
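The paging constraint above — a folder holds a first number of icons but its view can show no more than a second, smaller number at once — amounts to dividing the folder's icons into pages of fixed capacity. A minimal sketch, where the capacity value is an assumption rather than a number from the patent:

```python
def paginate(icons, capacity):
    """Split a folder's selectable icons into distinct, separately displayed
    pages of at most `capacity` icons each (capacity = the "second number")."""
    if capacity < 1:
        raise ValueError("capacity must be positive")
    return [icons[i:i + capacity] for i in range(0, len(icons), capacity)]
```

For a folder of 20 icons with an assumed per-page capacity of 9, this yields three pages: two full pages and a final page holding the remaining two icons.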
In some embodiments, displaying the folder view includes displaying two or more page indicator icons that provide information about the number of distinct pages among which the selectable icons in the folder are divided; after detecting the first input and before detecting the second input, the two or more page indicator icons have a first appearance; and the processing unit 1408 is further configured to update (e.g., with the updating unit 1416) the two or more page indicator icons to have a second appearance different from the first appearance in response to detecting the second input.
In some embodiments, the page indicator icon indicates the position of the page currently displayed in the folder view in the page sequence of selectable icons in the respective folder.
In some embodiments, the first appearance of the page indicator icon indicates that the first page is currently displayed in the folder view and the second appearance of the page indicator icon indicates that the second page is currently displayed in the folder view.
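The page indicators described above reduce to one marker per page, with the marker for the currently displayed page distinguished (the "first appearance" versus "second appearance"). A hedged sketch with a hypothetical function name:

```python
def indicator_state(page_count, current_page):
    """Return one boolean per page indicator icon; True marks the indicator
    for the page currently displayed in the folder view (its highlighted
    appearance), False marks the others."""
    return tuple(i == current_page for i in range(page_count))
```

Navigating from the first page to the second page of a three-page folder thus changes the indicators from the first appearance to the second.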
In some embodiments, a first subset of selectable icons is displayed in a first arrangement in a first page of a folder view; and prior to detecting the first input, the respective folder icon includes a plurality of scaled-down representations of the first subset of selectable icons displayed in the first arrangement.
In some embodiments, the processing unit 1408 is further configured to:
detecting (e.g., with the detecting unit 1410) a third input, received by the input unit, corresponding to a request to close the folder view while the second page of the folder view is displayed; and, in response to detecting the third input: ceasing (e.g., with the stop unit 1414) to enable display of the folder view; and enabling (e.g., with the display-effecting unit 1412) display of the reduced-scale representations of the second subset of selectable icons in the respective folder icon for the respective folder.
In some embodiments, the processing unit 1408 is further configured to, after effecting (e.g., with the display effecting unit 1412) the display of the reduced-scale representation of the second subset of selectable icons in the respective folder icon, effect (e.g., with the display effecting unit 1412) the display of the reduced-scale representation of the first subset of selectable icons in the respective folder icon in place of the reduced-scale representation of the second subset of selectable icons.
In some embodiments, prior to detecting the first input, the plurality of selectable user interface objects is displayed in a respective arrangement; and enabling display of the folder view for the respective folder in response to detecting the first input includes: enlarging the respective folder icon (e.g., with the zoom unit 1418), wherein enlarging the respective folder icon includes enlarging the respective arrangement such that the respective folder icon is enlarged and moved toward the center of the display unit; and, after enlarging the respective folder icon, enabling (e.g., with the display effecting unit 1412) display of the folder view in place of the enlarged respective folder icon.
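The open-folder transition above combines translation toward the display center with enlargement. A sketch of one plausible realization, assuming linear interpolation over an animation progress parameter (the patent does not specify the easing):

```python
def zoom_toward_center(pos, center, scale, t):
    """Interpolate a folder icon's position and scale at animation progress
    t in [0, 1]: the icon moves toward the display center while enlarging
    from scale 1.0 to the target scale. Linear easing is an assumption."""
    x = pos[0] + (center[0] - pos[0]) * t
    y = pos[1] + (center[1] - pos[1]) * t
    s = 1.0 + (scale - 1.0) * t
    return (x, y, s)
```

At t = 1 the icon sits at the display center at full target scale, at which point the folder view replaces the enlarged icon.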
In some embodiments, the processing unit 1408 is further configured to stop (e.g., with the stopping unit 1414) displaying the plurality of selectable user interface objects in response to detecting the first input.
In some embodiments, the processing unit 1408 is further configured to: detect (e.g., with the detection unit 1410) a fourth input, received by the input unit, corresponding to a request to move a respective icon from the second page of the folder view to a respective location on the display unit while the second page of the folder view is displayed; and, in response to detecting the fourth input: move (e.g., with the moving unit 1420) the respective icon from the second page to a different page of the folder view in accordance with a determination that the respective location is within the predetermined page change region; and remove (e.g., with the removal unit 1422) the respective icon from the folder view in accordance with a determination that the respective location is within a predetermined icon removal area distinct from the page change region.
In some embodiments, the page-changing region is located at or near the right side of the folder view.
In some embodiments, the icon removal area is located at or near the bottom of the folder view.
In some embodiments, the plurality of selectable user interface objects is a first page of selectable user interface objects in a plurality of distinct, separately displayed pages of selectable user interface objects; and the processing unit 1408 is further configured to: detect (e.g., with the detection unit 1410) a fifth input, received by the input unit, corresponding to a request to display a different page of selectable user interface objects in the plurality of distinct, separately displayed pages; and, in response to receiving the fifth input, cease (e.g., with the stop unit 1414) to enable display of the selectable user interface objects of the first page; and enable (e.g., with the display implementation unit 1412) display of selectable user interface objects of a second page that are different from the selectable user interface objects of the first page.
In some embodiments, the first input is a tap gesture on a corresponding folder icon and the second input is a swipe gesture within the folder view.
The operations in the information processing method described above are optionally implemented by running one or more functional modules in an information processing apparatus such as a general-purpose processor (e.g., as described above with respect to fig. 1A and 3) or a dedicated chip.
The operations described above with reference to FIGS. 6A-6E, 7A-7C, 8A-8C, 9A-9B, 10A-10B, 11A-11C, 12A-12E, and 13A-13E are, optionally, implemented by the components depicted in FIGS. 1A-1C and/or FIG. 14. For example, detection operations 1304 and 1322 and display operations 1306 and 1334 are, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on the touch-sensitive display 112, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether a first contact at a first location on the touch-sensitive surface (or a rotation of the device) corresponds to a predefined event or sub-event, such as selection of an object on the user interface or rotation of the device from one orientation to another. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally uses or invokes data updater 176 or object updater 177 to update the application internal state 192. In some implementations, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, one of ordinary skill in the art will readily understand how other processes can be implemented based on the components depicted in FIGS. 1A-1C.
The invention has been described with reference to specific embodiments for purposes of illustration. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.
Claims (22)
1. A method, comprising:
at an electronic device having a display:
simultaneously displaying a plurality of selectable user interface objects including one or more folder icons on the display;
detecting a first input corresponding to a request to select a respective folder icon for a respective folder, the respective folder comprising a set of selectable icons divided among a plurality of different pages including a first page and a second page;
in response to detecting the first input, displaying a folder view for the respective folder, wherein the folder view displays a first page comprising a first subset of selectable icons from the set of selectable icons;
detecting a second input corresponding to a request to move a first selectable icon into a page change region in the folder view;
in response to detecting the second input, displaying a second page in the folder view in place of the first page, the second page including a second subset of selectable icons from the set of selectable icons, the second subset of selectable icons being different from the first subset of selectable icons;
detecting termination of the second input while the second page is displayed; and
in response to detecting termination of the second input, adding the first selectable icon to the second page.
2. The method of claim 1, wherein displaying the second page in the folder view in place of the first page is performed in accordance with a determination that the first selectable icon is within the page-change region for a predetermined amount of time.
3. The method of any of claims 1-2, wherein the page change region is a next page change region corresponding to a request to transition to a next page within the folder view.
4. The method of claim 3, wherein the folder view further comprises a previous page change region that is different from the next page change region, wherein the previous page change region corresponds to a request to transition to a previous page within the folder view.
5. The method of claim 4, wherein the next page-changing region is adjacent to a right side of the folder view and the previous page-changing region is adjacent to a left side of the folder view.
6. The method of any of claims 1-2, further comprising: dynamically rearranging the arrangement of the second subset of the selectable icons in response to adding the first selectable icon to the second page.
7. The method of any of claims 1-2, further comprising:
detecting a third input corresponding to a request to close the folder view for the respective folder, and
in response to detecting the third input, terminating display of the folder view for the respective folder and returning to concurrently displaying the plurality of selectable user interface objects including one or more folder icons.
8. A computer-readable storage medium comprising one or more programs configured to be executed by one or more processors of an electronic device with a display, the one or more programs comprising instructions for:
simultaneously displaying a plurality of selectable user interface objects including one or more folder icons on the display;
detecting a first input corresponding to a request to select a respective folder icon for a respective folder, the respective folder comprising a set of selectable icons divided among a plurality of different pages including a first page and a second page;
in response to detecting the first input, displaying a folder view for the respective folder, wherein the folder view displays a first page comprising a first subset of selectable icons from the set of selectable icons;
detecting a second input corresponding to a request to move a first selectable icon into a page change region in the folder view;
in response to detecting the second input, displaying a second page in the folder view in place of the first page, the second page including a second subset of selectable icons from the set of selectable icons, the second subset of selectable icons being different from the first subset of selectable icons;
detecting termination of the second input while the second page is displayed; and
in response to detecting termination of the second input, adding the first selectable icon to the second page.
9. The computer-readable storage medium of claim 8, wherein displaying the second page in place of the first page in the folder view is performed in accordance with a determination that the first selectable icon is within the page-change region for a predetermined amount of time.
10. The computer-readable storage medium of any of claims 8-9, wherein the page-change region is a next-page-change region corresponding to a request to transition to a next page within the folder view.
11. The computer-readable storage medium of claim 10, wherein the folder view further comprises a previous page change region that is different from the next page change region, wherein the previous page change region corresponds to a request to transition to a previous page within the folder view.
12. The computer-readable storage medium of claim 11, wherein the next page-changing region is adjacent to a right side of the folder view and the previous page-changing region is adjacent to a left side of the folder view.
13. The computer readable storage medium of any one of claims 8-9, wherein the one or more programs further comprise instructions for:
dynamically rearranging the arrangement of the second subset of the selectable icons in response to adding the first selectable icon to the second page.
14. The computer readable storage medium of any one of claims 8-9, wherein the one or more programs include instructions for:
detecting a third input corresponding to a request to close the folder view for the respective folder, and
in response to detecting the third input, terminating display of the folder view for the respective folder and returning to concurrently displaying the plurality of selectable user interface objects including one or more folder icons.
15. An electronic device, comprising:
a display;
one or more processors; and
a memory for storing one or more programs configured to be executed by the one or more processors, the one or more programs comprising instructions for:
simultaneously displaying a plurality of selectable user interface objects including one or more folder icons on the display;
detecting a first input corresponding to a request to select a respective folder icon for a respective folder, the respective folder comprising a set of selectable icons divided among a plurality of different pages including a first page and a second page;
in response to detecting the first input, displaying a folder view for the respective folder, wherein the folder view displays a first page comprising a first subset of selectable icons from the set of selectable icons;
detecting a second input corresponding to a request to move a first selectable icon into a page change region in the folder view;
in response to detecting the second input, displaying a second page in the folder view in place of the first page, the second page including a second subset of selectable icons from the set of selectable icons, the second subset of selectable icons being different from the first subset of selectable icons;
detecting termination of the second input while the second page is displayed; and
in response to detecting termination of the second input, adding the first selectable icon to the second page.
16. The electronic device of claim 15, wherein displaying the second page in the folder view in place of the first page is performed in accordance with a determination that the first selectable icon is within the page-change region for a predetermined amount of time.
17. The electronic device of any of claims 15-16, wherein the page-change region is a next-page-change region corresponding to a request to transition to a next page within the folder view.
18. The electronic device of claim 17, wherein the folder view further comprises a previous page change region that is different from the next page change region, wherein the previous page change region corresponds to a request to transition to a previous page within the folder view.
19. The electronic device of claim 18, wherein the next page-changing region is adjacent to a right side of the folder view and the previous page-changing region is adjacent to a left side of the folder view.
20. The electronic device of any of claims 15-16, wherein the one or more programs include instructions for:
dynamically rearranging the arrangement of the second subset of the selectable icons in response to adding the first selectable icon to the second page.
21. The electronic device of any of claims 15-16, wherein the one or more programs include instructions for:
detecting a third input corresponding to a request to close the folder view for the respective folder, and
in response to detecting the third input, terminating display of the folder view for the respective folder and returning to concurrently displaying the plurality of selectable user interface objects including one or more folder icons.
22. An electronic device, comprising:
a display; and
means for performing the method of any one of claims 1-2.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010125835.5A CN111339032B (en) | 2013-06-09 | 2014-05-30 | Device, method and graphical user interface for managing folders with multiple pages |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361832897P | 2013-06-09 | 2013-06-09 | |
US61/832,897 | 2013-06-09 | ||
US14/142,648 US10788976B2 (en) | 2010-04-07 | 2013-12-27 | Device, method, and graphical user interface for managing folders with multiple pages |
US14/142,648 | 2013-12-27 | ||
PCT/US2014/040414 WO2014200735A1 (en) | 2013-06-09 | 2014-05-30 | Device, method, and graphical user interface for managing folders with multiple pages |
CN202010125835.5A CN111339032B (en) | 2013-06-09 | 2014-05-30 | Device, method and graphical user interface for managing folders with multiple pages |
CN201480001676.0A CN104704494A (en) | 2013-06-09 | 2014-05-30 | Device, method, and graphical user interface for managing folders with multiple pages |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201480001676.0A Division CN104704494A (en) | 2013-06-09 | 2014-05-30 | Device, method, and graphical user interface for managing folders with multiple pages |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111339032A CN111339032A (en) | 2020-06-26 |
CN111339032B true CN111339032B (en) | 2023-09-29 |
Family
ID=51059624
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010125835.5A Active CN111339032B (en) | 2013-06-09 | 2014-05-30 | Device, method and graphical user interface for managing folders with multiple pages |
CN201480001676.0A Pending CN104704494A (en) | 2013-06-09 | 2014-05-30 | Device, method, and graphical user interface for managing folders with multiple pages |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201480001676.0A Pending CN104704494A (en) | 2013-06-09 | 2014-05-30 | Device, method, and graphical user interface for managing folders with multiple pages |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP2909707A1 (en) |
JP (1) | JP6097835B2 (en) |
KR (1) | KR101670572B1 (en) |
CN (2) | CN111339032B (en) |
AU (2) | AU2014100582A4 (en) |
HK (1) | HK1210842A1 (en) |
WO (1) | WO2014200735A1 (en) |
Families Citing this family (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9483763B2 (en) | 2014-05-29 | 2016-11-01 | Apple Inc. | User interface for payments |
CN108196759B (en) | 2014-09-04 | 2021-11-19 | 华为技术有限公司 | Icon control method and terminal |
CN104573552B (en) * | 2014-12-29 | 2018-01-19 | 广东欧珀移动通信有限公司 | A kind of method and device of hiden application icon |
CN106406924B (en) * | 2015-07-31 | 2020-02-07 | 深圳超多维科技有限公司 | Control method and device for starting and quitting picture of application program and mobile terminal |
CN105677195B (en) * | 2015-12-29 | 2019-02-01 | 宇龙计算机通信科技(深圳)有限公司 | A kind of application management system and method for intelligent terminal |
CN105677158A (en) * | 2016-01-14 | 2016-06-15 | 上海斐讯数据通信技术有限公司 | Method for dissolving folder and mobile terminal |
WO2017143482A1 (en) * | 2016-02-22 | 2017-08-31 | 康志强 | Smart-watch software display method and system |
CN105892801A (en) * | 2016-03-25 | 2016-08-24 | 乐视控股(北京)有限公司 | Desktop icon processing method and terminal |
CN105892802A (en) * | 2016-03-25 | 2016-08-24 | 乐视控股(北京)有限公司 | Desktop icon arrangement method and terminal |
CN105843789A (en) * | 2016-04-01 | 2016-08-10 | 乐视控股(北京)有限公司 | Rich text monitoring method and device |
CN106354373B (en) * | 2016-09-28 | 2020-04-21 | 维沃移动通信有限公司 | Icon moving method and mobile terminal |
CN106354374A (en) * | 2016-09-30 | 2017-01-25 | 维沃移动通信有限公司 | Icon moving method and mobile terminal |
WO2018098944A1 (en) * | 2016-11-30 | 2018-06-07 | 华为技术有限公司 | Application search method and terminal |
CN108399002B (en) * | 2017-02-07 | 2020-10-16 | 阿里巴巴集团控股有限公司 | Folder switching method and device |
CN106886600B (en) * | 2017-02-28 | 2021-12-10 | 深圳传音控股股份有限公司 | File management method and terminal |
DK180127B1 (en) | 2017-05-16 | 2020-05-26 | Apple Inc. | Devices, methods, and graphical user interfaces for moving user interface objects |
WO2018212998A1 (en) * | 2017-05-16 | 2018-11-22 | Apple Inc. | Devices, methods, and graphical user interfaces for moving user interface objects |
KR20220138007A (en) * | 2017-05-16 | 2022-10-12 | 애플 인크. | Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects |
US11036387B2 (en) | 2017-05-16 | 2021-06-15 | Apple Inc. | Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects |
WO2018222248A1 (en) * | 2017-06-02 | 2018-12-06 | Apple Inc. | Method and device for detecting planes and/or quadtrees for use as a virtual substrate |
CN108573009B (en) * | 2017-07-03 | 2021-05-25 | 北京金山云网络技术有限公司 | A file search method and device |
CN108415957B (en) * | 2017-11-06 | 2022-06-07 | 北京京东尚科信息技术有限公司 | Method and device for self-defined navigation of webpage |
CN108171034A (en) * | 2017-12-20 | 2018-06-15 | 维沃移动通信有限公司 | A kind of method and terminal for protecting privacy |
DK180081B1 (en) | 2018-06-01 | 2020-04-01 | Apple Inc. | Access to system user interfaces on an electronic device |
CN109582427B (en) * | 2018-12-06 | 2022-03-11 | 深圳市优创亿科技有限公司 | TFT screen display method of wearable device |
JP7163755B2 (en) * | 2018-12-14 | 2022-11-01 | 京セラドキュメントソリューションズ株式会社 | display input device |
CN109656439A (en) | 2018-12-17 | 2019-04-19 | 北京小米移动软件有限公司 | Display methods, device and the storage medium of prompt operation panel |
JP6664597B1 (en) * | 2018-12-28 | 2020-03-13 | ヤフー株式会社 | Information display program, information display device, and information display method |
CN109828705B (en) * | 2018-12-28 | 2021-06-08 | 南京维沃软件技术有限公司 | Icon display method and terminal equipment |
CN111435277B (en) * | 2019-01-15 | 2022-04-19 | Oppo广东移动通信有限公司 | Method, device, terminal and storage medium for displaying content |
CN110399074B (en) * | 2019-07-18 | 2021-11-16 | 上海幻电信息科技有限公司 | Picture splicing method and device, mobile terminal and storage medium |
CN110618969B (en) * | 2019-08-29 | 2022-08-02 | 维沃移动通信有限公司 | Icon display method and electronic equipment |
CN111045559A (en) * | 2019-10-18 | 2020-04-21 | 宇龙计算机通信科技(深圳)有限公司 | Method, device, electronic equipment and medium for arranging application icons |
US11455085B2 (en) | 2020-03-10 | 2022-09-27 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with user interface objects corresponding to applications |
KR20210114298A (en) * | 2020-03-10 | 2021-09-23 | 삼성전자주식회사 | Electronic device for folder operation and method for operating thereof |
JP7615768B2 (en) | 2020-04-06 | 2025-01-17 | 京セラドキュメントソリューションズ株式会社 | Display input device and image forming device |
KR102491989B1 (en) * | 2020-04-10 | 2023-01-26 | 애플 인크. | User interfaces for enabling an activity |
DK202070633A1 (en) | 2020-04-10 | 2021-11-12 | Apple Inc | User interfaces for enabling an activity |
CN111638825B (en) * | 2020-05-27 | 2022-09-09 | 维沃移动通信有限公司 | Display control method and device and electronic equipment |
CN112068720A (en) * | 2020-09-12 | 2020-12-11 | 贺运涛 | Multi-key mouse multi-mode layout mode and implementation method |
CN114356166A (en) * | 2020-09-27 | 2022-04-15 | 华为技术有限公司 | Application icon display method and related equipment |
CN112269513A (en) * | 2020-10-30 | 2021-01-26 | 维沃移动通信有限公司 | Interface display method and device and electronic equipment |
CN112464125B (en) * | 2020-12-07 | 2025-01-14 | 北京小米松果电子有限公司 | Page display method and device, electronic device, and storage medium |
JP7643095B2 (en) | 2021-03-09 | 2025-03-11 | 富士フイルムビジネスイノベーション株式会社 | Information processing device and program |
CN115469781B (en) * | 2021-04-20 | 2023-09-01 | 华为技术有限公司 | Graphic interface display method, electronic device, medium and program product |
CN113268182B (en) * | 2021-05-19 | 2023-06-16 | 维沃移动通信有限公司 | Application icon management method and electronic device |
CN113312133B (en) * | 2021-06-17 | 2022-06-24 | 浙江齐安信息科技有限公司 | Operation method, system and storage medium |
CN115705124A (en) * | 2021-08-04 | 2023-02-17 | Oppo广东移动通信有限公司 | Application folder control method, device, terminal device and storage medium |
CN113900558A (en) * | 2021-09-28 | 2022-01-07 | 深圳传音控股股份有限公司 | Icon area management method, intelligent terminal and storage medium |
CN115794272B (en) * | 2021-11-03 | 2023-06-27 | 华为技术有限公司 | A display method and electronic device |
US12265687B2 (en) | 2022-05-06 | 2025-04-01 | Apple Inc. | Devices, methods, and graphical user interfaces for updating a session region |
CN114816167B (en) * | 2022-05-23 | 2022-11-08 | 荣耀终端有限公司 | Application icon display method, electronic device and readable storage medium |
CN115543169A (en) * | 2022-10-14 | 2022-12-30 | 维沃移动通信有限公司 | Identification display method and device, electronic equipment and readable storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0803811D0 (en) * | 2008-02-29 | 2008-04-09 | Samsung Electronics Co Ltd | Mobile telephone and other apparatus with a gui |
US7589750B1 (en) * | 2006-03-15 | 2009-09-15 | Adobe Systems, Inc. | Methods and apparatus for arranging graphical objects |
CN102033710A (en) * | 2010-04-07 | 2011-04-27 | 苹果公司 | Method for managing file folder and related equipment |
CN102221931A (en) * | 2011-06-28 | 2011-10-19 | 鸿富锦精密工业(深圳)有限公司 | Touch electronic device and function chart shifting method thereof |
CN102298502A (en) * | 2011-09-26 | 2011-12-28 | 鸿富锦精密工业(深圳)有限公司 | Touch type electronic device and icon page-switching method |
CN102364438A (en) * | 2011-10-10 | 2012-02-29 | 宇龙计算机通信科技(深圳)有限公司 | Application program display and classification method, terminal and mobile terminal |
KR20120050883A (en) * | 2010-11-11 | 2012-05-21 | 김경중 | Application program |
CN102830911A (en) * | 2012-07-30 | 2012-12-19 | 广东欧珀移动通信有限公司 | Method for rapidly dragging application program to switch pages |
KR20130011437A (en) * | 2011-07-21 | 2013-01-30 | 삼성전자주식회사 | Method and apparatus for managing icon in portable terminal |
CN102981704A (en) * | 2012-11-09 | 2013-03-20 | 广东欧珀移动通信有限公司 | Icon placement method and mobile terminal of display interface |
CN102999249A (en) * | 2012-10-17 | 2013-03-27 | 广东欧珀移动通信有限公司 | A user interface management method and system for a mobile terminal with a touch screen |
CN103116440A (en) * | 2013-01-23 | 2013-05-22 | 深圳市金立通信设备有限公司 | Method and terminal for icon to move on terminal |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3859005A (en) | 1973-08-13 | 1975-01-07 | Albert L Huebner | Erosion reduction in wet turbines |
US4826405A (en) | 1985-10-15 | 1989-05-02 | Aeroquip Corporation | Fan blade fabrication system |
CA2318815C (en) | 1998-01-26 | 2004-08-10 | Wayne Westerman | Method and apparatus for integrating manual input |
US7020714B2 (en) | 2000-04-06 | 2006-03-28 | Rensselaer Polytechnic Institute | System and method of source based multicast congestion control |
US7688306B2 (en) | 2000-10-02 | 2010-03-30 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
US7218226B2 (en) | 2004-03-01 | 2007-05-15 | Apple Inc. | Acceleration-based theft detection system for portable electronic devices |
US6677932B1 (en) | 2001-01-28 | 2004-01-13 | Finger Works, Inc. | System and method for recognizing touch typing under limited tactile feedback conditions |
US6570557B1 (en) | 2001-02-10 | 2003-05-27 | Finger Works, Inc. | Multi-touch system and method for emulating modifier keys via fingertip chords |
JP4612902B2 (en) * | 2006-07-04 | 2011-01-12 | キヤノン株式会社 | File display device, control method therefor, and program |
US8601370B2 (en) * | 2007-01-31 | 2013-12-03 | Blackberry Limited | System and method for organizing icons for applications on a mobile device |
EP1956472A1 (en) * | 2007-01-31 | 2008-08-13 | Research In Motion Limited | System and method for organizing icons for applications on a mobile device |
JP5419486B2 (en) * | 2009-02-10 | 2014-02-19 | キヤノン株式会社 | Data processing apparatus, data processing method, and program |
US20100251085A1 (en) * | 2009-03-25 | 2010-09-30 | Microsoft Corporation | Content and subfolder navigation control |
US8881061B2 (en) * | 2010-04-07 | 2014-11-04 | Apple Inc. | Device, method, and graphical user interface for managing folders |
US8799815B2 (en) * | 2010-07-30 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for activating an item in a folder |
KR101708821B1 (en) * | 2010-09-30 | 2017-02-21 | 엘지전자 주식회사 | Mobile terminal and method for controlling thereof |
US9886188B2 (en) * | 2011-08-25 | 2018-02-06 | International Business Machines Corporation | Manipulating multiple objects in a graphic user interface |
- 2014
- 2014-05-30 WO PCT/US2014/040414 patent/WO2014200735A1/en active Application Filing
- 2014-05-30 AU AU2014100582A patent/AU2014100582A4/en not_active Ceased
- 2014-05-30 CN CN202010125835.5A patent/CN111339032B/en active Active
- 2014-05-30 JP JP2015532193A patent/JP6097835B2/en active Active
- 2014-05-30 KR KR1020147036624A patent/KR101670572B1/en active Active
- 2014-05-30 EP EP14734674.6A patent/EP2909707A1/en not_active Ceased
- 2014-05-30 CN CN201480001676.0A patent/CN104704494A/en active Pending
- 2014-05-30 AU AU2014274537A patent/AU2014274537A1/en not_active Abandoned
- 2015
- 2015-11-20 HK HK15111515.6A patent/HK1210842A1/en unknown
Non-Patent Citations (2)
Title |
---|
Specifying a visual file system in Z; J. Hughes; IEEE; full text *
Research on mis-operations based on cognitive psychology, taking touch-screen phone interface operation as an example; Xu Lin; China Masters' Theses Full-text Database (Electronic Journals); full text *
Also Published As
Publication number | Publication date |
---|---|
JP2015528619A (en) | 2015-09-28 |
KR20150021964A (en) | 2015-03-03 |
HK1210842A1 (en) | 2016-05-06 |
AU2014100582A4 (en) | 2014-07-03 |
EP2909707A1 (en) | 2015-08-26 |
CN104704494A (en) | 2015-06-10 |
JP6097835B2 (en) | 2017-03-15 |
CN111339032A (en) | 2020-06-26 |
KR101670572B1 (en) | 2016-10-28 |
WO2014200735A1 (en) | 2014-12-18 |
AU2014274537A1 (en) | 2015-01-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12236079B2 (en) | Device, method, and graphical user interface for managing folders with multiple pages | |
CN111339032B (en) | Device, method and graphical user interface for managing folders with multiple pages | |
US12164745B2 (en) | Device, method, and graphical user interface for managing folders | |
AU2019219816B2 (en) | Device, method, and graphical user interface for managing folders |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||