Haptic/voice-over navigation assistance
First Claim
1. A method comprising:
displaying, by a user device that includes a touch display, a user interface;
receiving, by the user device, a user input on the touch display and in a first area of the user interface;
determining, by the user device, whether the user input in the first area is included in a second area of the user interface;
determining, by the user device, a type of a file associated with the first area;
formulating, by the user device, a name of the file based on the type of the file and path information associated with the file;
generating, by the user device, audio based on the type of the file and the path information associated with the file;
outputting, by the user device, one or more vibrations in response to determining that the user input is included in the second area of the user interface; and
outputting, by the user device, the audio that includes a vocalization of the name of the file associated with the first area in response to determining that the user input is included in the second area of the user interface.
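The claimed method steps can be illustrated with a minimal sketch. This is not the patent's implementation; the `Rect` helper, `formulate_name`, and `handle_touch` names are assumptions, and the vibration and speech outputs (which on a real device would go through a platform haptics or text-to-speech API) are reduced to return values.

```python
# Illustrative sketch of the claimed method: hit-test a touch against a
# first area and a second (navigational assistance) area, formulate a
# spoken name from the file's type and path information, and signal
# whether to vibrate and what to vocalize. All names are hypothetical.
from dataclasses import dataclass
from pathlib import PurePosixPath


@dataclass(frozen=True)
class Rect:
    """Axis-aligned rectangle standing in for a user-interface area."""
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h


def formulate_name(file_path: str) -> str:
    """Formulate a name from the file's type and its path information."""
    p = PurePosixPath(file_path)
    file_type = p.suffix.lstrip(".") or "unknown"
    return f"{p.stem}, {file_type} file, in {p.parent}"


def handle_touch(px: int, py: int, first_area: Rect, second_area: Rect,
                 file_path: str):
    """Return (vibrate, spoken_text) for a touch at (px, py).

    Vibration and vocalization occur only when the input in the first
    area is also included in the second area of the user interface.
    """
    if not first_area.contains(px, py):
        return (False, None)
    if second_area.contains(px, py):  # navigational assistance area
        return (True, formulate_name(file_path))
    return (False, None)
```

For example, a touch at (5, 5) that falls inside both areas would vibrate and vocalize the formulated file name, while a touch inside only the first area would do neither.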
Abstract
A method includes displaying a user interface on a touch display, receiving a user input on the touch display, determining whether the user input is associated with a navigational assistance area of the user interface, outputting one or more vibrations when it is determined that the user input is associated with the navigational assistance area of the user interface, and outputting an auditory navigational cue that corresponds to a name associated with the navigational assistance area when it is determined that the user input is associated with the navigational assistance area of the user interface.
20 Claims
1. A method comprising:
displaying, by a user device that includes a touch display, a user interface;
receiving, by the user device, a user input on the touch display and in a first area of the user interface;
determining, by the user device, whether the user input in the first area is included in a second area of the user interface;
determining, by the user device, a type of a file associated with the first area;
formulating, by the user device, a name of the file based on the type of the file and path information associated with the file;
generating, by the user device, audio based on the type of the file and the path information associated with the file;
outputting, by the user device, one or more vibrations in response to determining that the user input is included in the second area of the user interface; and
outputting, by the user device, the audio that includes a vocalization of the name of the file associated with the first area in response to determining that the user input is included in the second area of the user interface.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 20
9. A device comprising:
a touch display; and
one or more processors to:
display a user interface on the touch display;
detect a user input on the touch display and in a first area of the user interface;
determine whether the user input in the first area is included in a second area of the user interface;
determine a type of a file associated with the first area;
formulate a name of the file based on the type of the file and path information associated with the file;
generate audio based on the type of the file and the path information associated with the file;
output one or more vibrations in response to a determination that the user input is included in the second area of the user interface; and
output the audio that includes a vocalization of the name of the file associated with the first area in response to a determination that the user input is included in the second area of the user interface.
Dependent claims: 10, 11, 12, 13, 14, 15
16. A non-transitory computer-readable medium comprising:
instructions that, when executed by at least one processor, cause the at least one processor to:
detect a user input in a first area of a user interface displayed by a touch screen;
determine whether the user input in the first area is included in a second area of the user interface;
determine a type of a file associated with the first area;
formulate a name of the file based on the type of the file and path information associated with the file;
generate audio based on the type of the file and the path information associated with the file;
output one or more vibrations in response to a determination that the user input is included in the second area of the user interface; and
output the audio that includes a vocalization of the name of the file associated with the first area in response to a determination that the user input is included in the second area of the user interface.
Dependent claims: 17, 18, 19
Specification