Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface
First Claim
1. A method, comprising:
- at an accessible electronic device with a touch-sensitive surface and a display:
displaying a plurality of user interface elements on the display, wherein a current focus is on a first user interface element;
detecting a first finger gesture directly contacting the touch-sensitive surface, wherein the first finger gesture is remote from a location on the touch-sensitive surface that corresponds to a second user interface element; and
in response to detecting the first finger gesture:
changing the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements; and
outputting audible accessibility information associated with the second user interface element.
Abstract
A method is performed by an accessible electronic device with a display and a touch-sensitive surface. The method includes: displaying a plurality of user interface elements on the display, wherein a current focus is on a first user interface element; detecting a first finger gesture on the touch-sensitive surface, wherein the first finger gesture is independent of contacting a location on the touch-sensitive surface that corresponds to a second user interface element; and, in response to detecting the first finger gesture: changing the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements; and outputting accessibility information associated with the second user interface element.
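As an illustration only (not the patented implementation, whose details live in the specification), the behavior the abstract describes — a gesture that is independent of any element's on-screen location, yet moves the current focus to the next element and produces audible accessibility information for it — can be sketched as follows. All class, method, and field names here are hypothetical:

```python
class AccessibleScreen:
    """Minimal model of the abstract's focus-and-announce behavior."""

    def __init__(self, elements):
        self.elements = list(elements)  # the plurality of user interface elements
        self.focus_index = 0            # current focus starts on the first element

    def handle_flick_gesture(self):
        """Handle a finger gesture whose contact point is independent of any
        element's location: regardless of where the touch lands, focus moves
        from the current element to the next one, and the accessibility
        information for the newly focused element is returned (in a real
        device it would be spoken aloud via text-to-speech)."""
        self.focus_index = (self.focus_index + 1) % len(self.elements)
        focused = self.elements[self.focus_index]
        return f"{focused['label']}, {focused['role']}"


screen = AccessibleScreen([
    {"label": "Back", "role": "button"},
    {"label": "Chapter One", "role": "heading"},
])
announcement = screen.handle_flick_gesture()  # focus moves to the second element
```

Note that the handler never inspects the touch coordinates: the gesture changes focus without the user having to hit the second element's location, which is the point of the claimed accessibility technique.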
27 Claims
1. A method, comprising:
- at an accessible electronic device with a touch-sensitive surface and a display:
displaying a plurality of user interface elements on the display, wherein a current focus is on a first user interface element; detecting a first finger gesture directly contacting the touch-sensitive surface, wherein the first finger gesture is remote from a location on the touch-sensitive surface that corresponds to a second user interface element; and
in response to detecting the first finger gesture: changing the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements; and outputting audible accessibility information associated with the second user interface element. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. An accessible electronic device, comprising:
a touch-sensitive surface; a display; one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for: displaying a plurality of user interface elements on the display, wherein a current focus is on a first user interface element; detecting a first finger gesture directly contacting the touch-sensitive surface, wherein the first finger gesture is remote from a location on the touch-sensitive surface that corresponds to a second user interface element; and
in response to detecting the first finger gesture: changing the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements; and outputting audible accessibility information associated with the second user interface element. - View Dependent Claims (12, 13, 14, 15, 16, 17, 18, 19)
11. A non-transitory computer readable storage medium having stored therein instructions, which when executed by an accessible electronic device with a touch-sensitive surface and a display, cause the device to:
display a plurality of user interface elements on the display, wherein a current focus is on a first user interface element; detect a first finger gesture directly contacting the touch-sensitive surface, wherein the first finger gesture is remote from a location on the touch-sensitive surface that corresponds to a second user interface element; and
in response to detecting the first finger gesture: change the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements; and output audible accessibility information associated with the second user interface element. - View Dependent Claims (20, 21, 22, 23, 24, 25, 26, 27)
Specification