CONTEXT BASED GESTURE DELINEATION FOR USER INTERACTION IN EYES-FREE MODE
Abstract
Techniques are disclosed for facilitating the use of an electronic device having a user interface that is sensitive to a user's gestures. An “eyes-free” mode is provided in which the user can control the device without looking at the device display. Once the eyes-free mode is engaged, the user can control the device by performing gestures that are detected by the device, wherein a gesture is interpreted by the device without regard to a specific location where the gesture is made. The eyes-free mode can be used, for example, to look up a dictionary definition of a word in an e-book or to navigate through and select options from a hierarchical menu of settings on a tablet. The eyes-free mode advantageously allows a user to interact with the user interface in situations where the user has little or no ability to establish concentrated visual contact with the device display.
20 Claims

1. A device comprising:

a touch sensor for detecting gestures made by a user; and

a user interface including (a) a content delivery mode in which digital content is delivered to the user, and in which the user interface is configured to respond to a first set of command gestures, and (b) a manual mode in which delivery of the digital content is paused, and in which the user interface is configured to respond to a second set of command gestures, the second set of command gestures including more command gestures than the first set of command gestures,

wherein the first and second sets of command gestures are responded to without regard to a particular location on the touch sensor where a particular gesture is detected.

Dependent claims: 2-11
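The mechanism claimed above can be illustrated with a minimal sketch. The gesture names, command names, and class structure below are assumptions for illustration only; the claim does not specify any of them. What the sketch shows is the claimed behavior: each mode has its own command-gesture set, the manual-mode set is larger, and dispatch keys only on the gesture type, never on the coordinates where it was detected.

```python
# Minimal sketch (hypothetical gesture/command names) of the two-mode
# gesture interface of claim 1.

DELIVERY_GESTURES = {"pause": "enter_manual_mode"}  # first (smaller) set
MANUAL_GESTURES = {                                 # second (larger) set
    "swipe_left": "previous_item",
    "swipe_right": "next_item",
    "double_tap": "select",
    "resume": "enter_delivery_mode",
}

class GestureUI:
    def __init__(self):
        self.mode = "delivery"  # content delivery mode by default

    def handle(self, gesture, x=None, y=None):
        # x/y are accepted but deliberately ignored: gestures are
        # interpreted without regard to location on the touch sensor.
        table = DELIVERY_GESTURES if self.mode == "delivery" else MANUAL_GESTURES
        command = table.get(gesture)
        if command == "enter_manual_mode":
            self.mode = "manual"      # pause content delivery
        elif command == "enter_delivery_mode":
            self.mode = "delivery"    # resume content delivery
        return command

ui = GestureUI()
ui.handle("pause", x=10, y=300)       # same effect at any coordinates
assert ui.mode == "manual"
assert ui.handle("double_tap") == "select"
```

Note that the manual-mode table intentionally contains more entries than the delivery-mode table, mirroring the claim limitation that the second set includes more command gestures than the first.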
12. A mobile electronic device comprising:

a touch sensitive display for displaying digital content and detecting gestures made by a user;

a speaker;

a text-to-speech module; and

a user interface including (a) a reading mode in which the text-to-speech module converts the displayed digital content into an audio signal that is played using the speaker, and (b) a manual mode in which the playing of the audio signal generated by the text-to-speech module is paused,

wherein:

the user interface is configured to respond to a first set of command gestures detected by the touch sensitive display while in the reading mode,

the user interface is configured to respond to a second set of command gestures detected by the touch sensitive display while in the manual mode,

the first set of command gestures includes fewer command gestures than the second set of command gestures,

the first and second sets of command gestures are responded to without regard to a particular location on the touch sensitive display where a particular gesture is detected, and

a transition command gesture is included in both the first and second set of command gestures, the transition command gesture being configured to toggle the user interface back-and-forth between the reading and manual modes.

Dependent claims: 13-16
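The distinctive limitation of claim 12 is the transition command gesture that belongs to both gesture sets and toggles between the reading and manual modes, pausing or resuming the text-to-speech playback as it does so. The sketch below illustrates that toggle; the specific gesture name and the other members of each set are hypothetical, not taken from the claim.

```python
# Hypothetical sketch of claim 12's transition gesture: a single gesture
# that is a member of BOTH gesture sets and toggles the interface
# back-and-forth between reading (aural) and manual modes.

TRANSITION = "two_finger_tap"  # assumed gesture; the claim does not name one

READING_SET = {TRANSITION, "volume_up", "volume_down"}           # fewer gestures
MANUAL_SET = {TRANSITION, "swipe_left", "swipe_right", "double_tap"}

class EyesFreeUI:
    def __init__(self):
        self.mode = "reading"
        self.tts_playing = True  # text-to-speech audio playing via speaker

    def handle(self, gesture):
        active = READING_SET if self.mode == "reading" else MANUAL_SET
        if gesture not in active:
            return self.mode        # not a command in this mode: ignore
        if gesture == TRANSITION:   # toggle between the two modes
            if self.mode == "reading":
                self.mode, self.tts_playing = "manual", False  # pause audio
            else:
                self.mode, self.tts_playing = "reading", True  # resume audio
        return self.mode

ui = EyesFreeUI()
assert ui.handle(TRANSITION) == "manual" and not ui.tts_playing
assert ui.handle(TRANSITION) == "reading" and ui.tts_playing
```

Because `TRANSITION` appears in both sets, the same gesture works regardless of the current mode, which is what lets a user flip modes without ever looking at the screen.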
17. A non-transitory computer readable medium encoded with instructions that, when executed by at least one processor, cause an eyes-free control process to be carried out, the control process comprising:

providing a touch sensitive user interface having a reading mode and a manual mode;

aurally presenting digital content in the reading mode;

pausing the aural presentation in the manual mode;

responding to a first set of command gestures when the user interface is in the reading mode; and

responding to a second set of command gestures when the user interface is in the manual mode,

wherein:

the first and second sets of command gestures are responded to without regard to a particular location on a touch sensor where a particular gesture is detected, and

a transition command gesture is included in both the first and second set of command gestures, the transition command gesture being configured to toggle the user interface back-and-forth between the reading and manual modes.

Dependent claims: 18-20
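The "without regard to a particular location" limitation that runs through all three independent claims can be checked as a simple property: the same gesture must yield the same response at every point on the touch sensor. The function and gesture sets below are assumptions used only to demonstrate the property.

```python
# Property check (hypothetical gesture sets and coordinates) of the
# location-independence limitation: the response to a gesture is
# identical no matter where on the touch sensor it is detected.

def respond(mode, gesture, x, y):
    # x and y are accepted but never consulted.
    reading_set = {"toggle", "faster", "slower"}       # assumed first set
    manual_set = {"toggle", "next", "back", "select"}  # assumed second set
    active = reading_set if mode == "reading" else manual_set
    return gesture if gesture in active else None

responses = {respond("manual", "select", x, y)
             for x in (0, 50, 100) for y in (0, 400, 800)}
assert responses == {"select"}  # identical at all nine sample locations
```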
Specification