GESTURE-BASED NAVIGATION AMONG CONTENT ITEMS
1 Assignment
0 Petitions
Abstract
In any context where a user can view multiple different content items, switching among content items is provided using an array mode. In a full-frame mode, one content item is visible and active, but other content items may also be open. In response to user input, the display can be switched to an array mode, in which all of the content items are visible in a scrollable array. Selecting a content item in array mode can return the display to the full-frame mode, with the selected content item becoming visible and active. Smoothly animated transitions between the full-frame and array modes, and a gesture-based interface for controlling the transitions, can also be provided.
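The full-frame/array mode switching the abstract describes can be sketched as a small state model. This is an illustrative sketch only; the class and method names are assumptions, not taken from the patent.

```python
# Hypothetical model of the two display modes: full-frame mode shows one
# active content item; array mode shows every open item; selecting an
# item in array mode returns to full-frame mode with that item active.

from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    FULL_FRAME = auto()
    ARRAY = auto()


@dataclass
class ContentBrowser:
    items: list                   # all open content items
    active_index: int = 0         # item shown in full-frame mode
    mode: Mode = Mode.FULL_FRAME

    def enter_array_mode(self):
        """Show all open items in a scrollable array."""
        self.mode = Mode.ARRAY

    def select(self, index):
        """Selecting an item in array mode returns to full-frame mode
        with the selected item active."""
        if self.mode is Mode.ARRAY and 0 <= index < len(self.items):
            self.active_index = index
            self.mode = Mode.FULL_FRAME

    @property
    def visible_items(self):
        if self.mode is Mode.FULL_FRAME:
            return [self.items[self.active_index]]
        return list(self.items)   # array mode: every open item is visible
```

The animated transitions and the gestures driving them sit on top of this model; the claims below spell out the gesture side.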
104 Citations
27 Claims
1. A method comprising:
    at a computing device with a touch-sensitive surface and a display:
        displaying a first user interface region at a first scale factor on the display;
        concurrently detecting a plurality of contacts on the touch-sensitive surface; and
        while continuing to detect at least one respective contact of the plurality of contacts on the touch-sensitive surface:
            detecting a first gesture based on movement of the plurality of contacts relative to each other on the touch-sensitive surface;
            in response to detecting the first gesture, displaying the first user interface region at a second scale factor lower than the first scale factor;
            detecting a second gesture based on movement of the at least one respective contact on the touch-sensitive surface; and
            in response to detecting the second gesture, displaying a respective portion of a second user interface region on the display, wherein the second user interface region is different from the first user interface region and the respective portion of the second user interface region was not displayed prior to detecting the second gesture.
View Dependent Claims (2, 3, 4, 5, 6, 7)
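Claim 1's first gesture is defined by the contacts moving relative to each other, i.e. a pinch. A minimal sketch of that classification, assuming a simple mean-pairwise-distance test and an illustrative zoom factor (neither is specified by the claim):

```python
# Classify a pinch from relative contact movement and, in response,
# lower the scale factor. Threshold and zoom factor are assumptions.

import math


def mean_pairwise_distance(points):
    """Average distance between every pair of contact points."""
    pairs = [(a, b) for i, a in enumerate(points) for b in points[i + 1:]]
    return sum(math.dist(a, b) for a, b in pairs) / len(pairs)


def scale_after_gesture(start_contacts, end_contacts, scale, zoom_out=0.5):
    """If the contacts moved toward each other (a pinch), display the
    region at a second scale factor lower than the first."""
    if mean_pairwise_distance(end_contacts) < mean_pairwise_distance(start_contacts):
        return scale * zoom_out
    return scale
```

The second gesture of claim 1 (movement of a still-down contact) would then pan a different, previously hidden user interface region into view.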
8. A computer program product comprising a computer-readable storage medium encoded with program instructions that, when executed by a processor of a computing device with a touch-sensitive surface and a display device, cause the processor to execute a method comprising:
    displaying, on the display device, a first user interface region for an application program at a first scale factor within a display area of the application program, wherein the first user interface region fills the display area;
    concurrently detecting a plurality of contacts on the touch-sensitive surface; and
    while continuing to detect at least one respective contact of the plurality of contacts on the touch-sensitive surface:
        detecting a first gesture based on movement of the plurality of contacts relative to each other on the touch-sensitive surface;
        in response to detecting the first gesture, displaying the first user interface region at a second scale factor lower than the first scale factor such that the first user interface region does not fill the display area, and displaying at least a portion of a second user interface region at the second scale factor within the display area;
        detecting a second gesture based on movement of the at least one respective contact on the touch-sensitive surface; and
        in response to detecting the second gesture, displaying the second user interface region at the first scale factor such that the second user interface region fills the display area.
View Dependent Claims (9, 10)
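The geometric consequence in claim 8 — at the lower scale factor the first region no longer fills the display area, so part of a neighbouring region shows — can be illustrated with a toy one-dimensional layout (the 1-D, left-aligned geometry is a simplifying assumption):

```python
# At scale 1.0 the first region fills the display area; at a lower
# scale it leaves room, so a portion of the second region is visible.

def visible_regions(display_width, scale, gap=0.0):
    """Return (first_region_width, visible_width_of_second_region)."""
    first = display_width * scale
    second_visible = max(0.0, display_width - first - gap)
    return first, second_visible
```

After the second gesture, the roles flip: the second region is drawn at the first scale factor and fills the display area.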
11. A method comprising:
    at a computing device with a touch-sensitive surface and a display:
        displaying a first user interface region at a first scale factor on the display;
        concurrently detecting a plurality of contacts on the touch-sensitive surface; and
        while continuing to detect at least one respective contact of the plurality of contacts on the touch-sensitive surface:
            detecting a first gesture based on movement of the plurality of contacts relative to each other on the touch-sensitive surface;
            in response to detecting the first gesture, displaying the first user interface region at a second scale factor lower than the first scale factor, wherein the first user interface region is displayed as one region in a single row of user interface regions displayed at the second scale factor;
            detecting a second gesture based on movement of the at least one respective contact on the touch-sensitive surface; and
            in response to detecting the second gesture, scrolling the single row of user interface regions displayed at the second scale factor such that a second user interface region becomes visible, wherein the second user interface region is different from the first user interface region.
View Dependent Claims (12, 13, 14, 15, 16, 17, 18, 19)
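Claim 11's second gesture scrolls the single row of regions so that a different region comes into view. A sketch of that visibility computation, assuming equally sized regions (the claim does not require this):

```python
# Which regions of a single horizontal row are at least partially
# visible after scrolling the row by `offset` pixels.

def regions_in_view(offset, region_width, viewport_width, count):
    """Return indices of regions intersecting the viewport."""
    visible = []
    for i in range(count):
        left = i * region_width - offset
        right = (i + 1) * region_width - offset
        if right > 0 and left < viewport_width:
            visible.append(i)
    return visible
```

Scrolling far enough that a second region's index appears in the result corresponds to the claim's "second user interface region becomes visible".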
20. A computer system comprising:
    a display;
    a touch-sensitive surface;
    a processor coupled to the touch-sensitive surface; and
    a computer-readable storage medium coupled to the processor, the computer-readable storage medium being encoded with program instructions that, when executed by the processor, cause the processor to:
        display a first user interface region in a first user interface mode;
        detect a first gesture on the touch-sensitive surface;
        in response to detecting the first gesture, switch from the first user interface mode to a second user interface mode;
        while in the second user interface mode, detect a second gesture that includes movement of a respective contact on the touch-sensitive surface;
        in response to detecting the second gesture, perform an operation within a user interface displayed in the second user interface mode; and
        in response to detecting an end of the second gesture:
            exit the second user interface mode in the event that the respective contact is a contact that was used to perform the first gesture and the respective contact was continuously detected on the touch-sensitive surface throughout the first gesture and the second gesture, and
            remain in the second user interface mode after detecting the end of the second gesture in the event that the respective contact is not a contact that was continuously detected on the touch-sensitive surface throughout the first gesture and the second gesture.
View Dependent Claims (21, 22, 23, 24, 25, 26, 27)
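The exit rule in claim 20 turns on whether the contact that ended the second gesture was held down continuously through both gestures. A minimal sketch of that decision, with illustrative names:

```python
# After the second gesture ends, exit the second user-interface mode
# only if the ending contact was continuously detected through both
# the first and second gestures; otherwise remain in that mode.

def in_second_mode_after_gesture_end(in_second_mode,
                                     contact_continuous_since_first_gesture):
    """Return True if the device is still in the second mode."""
    if not in_second_mode:
        return False
    # A contact held down through both gestures behaves as one compound
    # gesture, so lifting it ends the mode; a fresh contact does not.
    return not contact_continuous_since_first_gesture
```

This distinguishes a single fluid pinch-then-drag-then-lift (a transient look at the second mode) from deliberately entering the mode and interacting with new touches.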
Specification