DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR INTEGRATING RECOGNITION OF HANDWRITING GESTURES WITH A SCREEN READER
Abstract
While an electronic device with a display and a touch-sensitive surface is in a screen reader accessibility mode, the device displays an application launcher screen including a plurality of application icons. A respective application icon corresponds to a respective application stored in the device. The device detects a sequence of one or more gestures on the touch-sensitive surface that correspond to one or more characters. A respective gesture that corresponds to a respective character is a single finger gesture that moves across the touch-sensitive surface along a respective path that corresponds to the respective character. The device determines whether the detected sequence of one or more gestures corresponds to a respective application icon of the plurality of application icons, and, in response to determining that the detected sequence of one or more gestures corresponds to the respective application icon, performs a predefined operation associated with the respective application icon.
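The abstract's core step — matching a sequence of recognized handwritten characters against the application icons on the launcher screen — can be illustrated with a minimal sketch. The prefix-matching rule, the icon names, and all function names below are assumptions for illustration, not the patent's implementation:

```python
# Hypothetical sketch of the abstract's matching step: characters recognized
# from single-finger handwriting gestures are compared against the names of
# the application icons on the launcher screen. Prefix matching is assumed.

def icons_matching_gestures(recognized_characters, application_icons):
    """Return the application icons whose names begin with the characters
    recognized from the gesture sequence (case-insensitive)."""
    prefix = "".join(recognized_characters).lower()
    return [name for name in application_icons if name.lower().startswith(prefix)]

launcher_icons = ["Mail", "Maps", "Music", "Notes"]

# Drawing "m" then "a" narrows the candidates to Mail and Maps; a predefined
# operation (e.g., announcing or activating the match) could then follow.
print(icons_matching_gestures(["m", "a"], launcher_icons))
```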
30 Claims
1-20. (canceled)
21. A method, comprising:
while an electronic device with a display and a touch-sensitive surface is in a screen reader accessibility mode:
displaying at least a portion of a web page on the display, the web page including a plurality of user interface elements;
detecting a first navigation gesture on the touch-sensitive surface;
in response to detecting the first navigation gesture on the touch-sensitive surface, navigating to a first set of one or more user interface elements of the plurality of user interface elements that corresponds to a current navigable unit type, wherein the current navigable unit type is set to a first navigable unit type selected from a plurality of predefined navigable unit types;
detecting a navigation setting gesture on the touch-sensitive surface that corresponds to a respective character, wherein the navigation setting gesture that corresponds to the respective character is a single finger gesture that moves across the touch-sensitive surface along a respective path that corresponds to the respective character;
determining whether the navigation setting gesture corresponds to a second navigable unit type of the plurality of predefined navigable unit types;
in response to determining that the navigation setting gesture corresponds to the second navigable unit type of the plurality of predefined navigable unit types:
changing the current navigable unit type from the first navigable unit type to the second navigable unit type; and
outputting audible accessibility information indicating that the second navigable unit type has been selected;
after changing the current navigable unit type from the first navigable unit type to the second navigable unit type, detecting a second navigation gesture on the touch-sensitive surface; and
in response to detecting the second navigation gesture on the touch-sensitive surface, navigating to a second set of one or more user interface elements of the plurality of user interface elements that corresponds to the second navigable unit type.
Dependent claims: 22, 23, 24, 25, 26, 27, 28
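The flow claimed above — a drawn character changes the current navigable unit type, audible feedback confirms the change, and subsequent navigation gestures traverse elements of the newly selected type — can be sketched as a small state machine. All names, the character-to-unit-type mapping, and the element representation are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical sketch of the claimed navigation flow. A navigation setting
# gesture (a single-finger drawn character) selects a navigable unit type;
# navigation gestures then move among page elements of that type.

UNIT_TYPES = {"h": "heading", "l": "link", "w": "word"}  # drawn char -> unit type

class ScreenReaderNavigator:
    def __init__(self, elements):
        self.elements = elements            # list of (unit_type, text) pairs
        self.current_unit_type = "heading"  # the first navigable unit type
        self.position = -1                  # index of the current element

    def handle_setting_gesture(self, character):
        """Change the current navigable unit type if the drawn character
        matches a predefined type; return the audible announcement text."""
        unit_type = UNIT_TYPES.get(character)
        if unit_type is None:
            return None                     # gesture matched no unit type
        self.current_unit_type = unit_type
        return f"{unit_type} selected"      # audible accessibility information

    def navigate(self):
        """Advance to the next element of the current navigable unit type."""
        for i in range(self.position + 1, len(self.elements)):
            if self.elements[i][0] == self.current_unit_type:
                self.position = i
                return self.elements[i][1]
        return None                         # no further element of this type

page = [("heading", "Top stories"), ("link", "World"), ("heading", "Sports")]
nav = ScreenReaderNavigator(page)
print(nav.navigate())                   # first heading on the page
print(nav.handle_setting_gesture("l"))  # drawn "l" selects the link unit type
print(nav.navigate())                   # next link after the current position
```

The dictionary lookup stands in for the claim's determining step, and the returned strings stand in for the audible accessibility output; a real screen reader would synthesize speech instead.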
29. An electronic device, comprising:
a display; a touch-sensitive surface; one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for:
while the electronic device is in a screen reader accessibility mode:
displaying at least a portion of a web page on the display, the web page including a plurality of user interface elements;
detecting a first navigation gesture on the touch-sensitive surface;
in response to detecting the first navigation gesture on the touch-sensitive surface, navigating to a first set of one or more user interface elements of the plurality of user interface elements that corresponds to a current navigable unit type, wherein the current navigable unit type is set to a first navigable unit type selected from a plurality of predefined navigable unit types;
detecting a navigation setting gesture on the touch-sensitive surface that corresponds to a respective character, wherein the navigation setting gesture that corresponds to the respective character is a single finger gesture that moves across the touch-sensitive surface along a respective path that corresponds to the respective character;
determining whether the navigation setting gesture corresponds to a second navigable unit type of the plurality of predefined navigable unit types;
in response to determining that the navigation setting gesture corresponds to the second navigable unit type of the plurality of predefined navigable unit types:
changing the current navigable unit type from the first navigable unit type to the second navigable unit type; and
outputting audible accessibility information indicating that the second navigable unit type has been selected;
after changing the current navigable unit type from the first navigable unit type to the second navigable unit type, detecting a second navigation gesture on the touch-sensitive surface; and
in response to detecting the second navigation gesture on the touch-sensitive surface, navigating to a second set of one or more user interface elements of the plurality of user interface elements that corresponds to the second navigable unit type.
30. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device with a display and a touch-sensitive surface, cause the device to:
while the electronic device is in a screen reader accessibility mode:
display at least a portion of a web page on the display, the web page including a plurality of user interface elements;
detect a first navigation gesture on the touch-sensitive surface;
in response to detecting the first navigation gesture on the touch-sensitive surface, navigate to a first set of one or more user interface elements of the plurality of user interface elements that corresponds to a current navigable unit type, wherein the current navigable unit type is set to a first navigable unit type selected from a plurality of predefined navigable unit types;
detect a navigation setting gesture on the touch-sensitive surface that corresponds to a respective character, wherein the navigation setting gesture that corresponds to the respective character is a single finger gesture that moves across the touch-sensitive surface along a respective path that corresponds to the respective character;
determine whether the navigation setting gesture corresponds to a second navigable unit type of the plurality of predefined navigable unit types;
in response to determining that the navigation setting gesture corresponds to the second navigable unit type of the plurality of predefined navigable unit types:
change the current navigable unit type from the first navigable unit type to the second navigable unit type; and
output audible accessibility information indicating that the second navigable unit type has been selected;
after changing the current navigable unit type from the first navigable unit type to the second navigable unit type, detect a second navigation gesture on the touch-sensitive surface; and
in response to detecting the second navigation gesture on the touch-sensitive surface, navigate to a second set of one or more user interface elements of the plurality of user interface elements that corresponds to the second navigable unit type.
Specification