Synchronized audio feedback for non-visual touch interface system and method
First Claim
1. A computer-implemented user interface method for managing user inputs, the method comprising:
receiving, from a touchscreen, a location of a reference input corresponding to a first touch on the touchscreen at the location of the reference input and a location of a subsequent input corresponding to a first touch on the touchscreen at the location of the subsequent input;
receiving, from the touchscreen, the location of the reference input corresponding to a second touch on the touchscreen at the location of the reference input and the location of the subsequent input corresponding to a second touch on the touchscreen at the location of the subsequent input;
identifying, by a controller, a horizontal distance between the location of the reference input and the location of the subsequent input along a horizontal axis and a vertical distance between the location of the reference input and the location of the subsequent input along a vertical axis;
identifying, by the controller, a horizontal direction between the location of the reference input and the location of the subsequent input along the horizontal axis and a vertical direction between the location of the reference input and the location of the subsequent input along the vertical axis;
determining, by the controller, a first time duration between the first touch on the touchscreen at the location of the reference input and the first touch on the touchscreen at the location of the subsequent input and a second time duration between the second touch on the touchscreen at the location of the reference input and the second touch on the touchscreen at the location of the subsequent input, the second time duration being different than the first time duration;
associating, by the controller, the horizontal distance, the horizontal direction, the vertical distance, the vertical direction and the first time duration with a first gesture and the horizontal distance, the horizontal direction, the vertical distance, the vertical direction and the second time duration with a second gesture that is different than the first gesture;
associating, by the controller, the first gesture with a first instruction to be performed by the controller and the second gesture with a second instruction to be performed by the controller; and
determining, by the controller, an audio output based on a predictive pattern and having a first characteristic that is determined based on the horizontal distance and the horizontal direction and a second characteristic that is determined based on the vertical distance and the vertical direction.
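The claim's core computation, deriving a per-axis offset, direction, and inter-touch duration from a reference/subsequent touch pair and mapping the same offset to two different gestures by duration, can be sketched as follows. This is a rough illustration only; the `Touch` type, the direction labels, and the 0.5 s quick/slow threshold are our own assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Touch:
    x: float  # horizontal coordinate on the touchscreen
    y: float  # vertical coordinate
    t: float  # timestamp in seconds

def pair_features(reference: Touch, subsequent: Touch):
    """Quantities recited in the claim for one reference/subsequent pair:
    horizontal and vertical distance, direction along each axis, and the
    time duration between the two touches."""
    dx = subsequent.x - reference.x
    dy = subsequent.y - reference.y
    return (abs(dx), "right" if dx >= 0 else "left",
            abs(dy), "up" if dy >= 0 else "down",
            subsequent.t - reference.t)

def gesture_for(features, quick_threshold=0.5):
    """Map the same spatial offset to two different gestures depending
    only on the time duration (the 0.5 s threshold is hypothetical)."""
    h_dist, h_dir, v_dist, v_dir, duration = features
    tempo = "quick" if duration < quick_threshold else "slow"
    return f"{tempo}-{h_dir}-{v_dir}"

# Same offset, different durations -> two different gestures, as in claim 1.
first = gesture_for(pair_features(Touch(0, 0, 0.0), Touch(80, 40, 0.2)))
second = gesture_for(pair_features(Touch(0, 0, 1.0), Touch(80, 40, 1.9)))
```

Here `first` and `second` share identical distances and directions yet classify differently, which is the distinction the claim draws between the first and second gestures.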
Abstract
The disclosed systems and methods are directed to interfaces, and more particularly, to an audio feedback system for a touch interface. This touch interface allows a user to interact with a touchscreen without a priori knowledge of where items are located on the touchscreen, or even of the relative orientation of that touchscreen. For instance, the interface may operate responsive to the relative offset between touches.
26 Citations
19 Claims
1. A computer-implemented user interface method for managing user inputs (recited in full above under First Claim). - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A computer-implemented user interface comprising:
a touchscreen configured to detect a first touch at a reference location, a first touch at a subsequent location, a second touch at the reference location and a second touch at the subsequent location;
a controller configured to:
determine a horizontal distance between the reference location and the subsequent location along a horizontal axis and a vertical distance between the reference location and the subsequent location along a vertical axis,
determine a horizontal direction between the reference location and the subsequent location along the horizontal axis and a vertical direction between the reference location and the subsequent location along the vertical axis,
determine a first time duration between the first touch at the reference location and the first touch at the subsequent location and a second time duration between the second touch at the reference location and the second touch at the subsequent location, the second time duration being different than the first time duration,
associate the horizontal distance, the horizontal direction, the vertical distance and the vertical direction with a gesture,
associate the gesture with an instruction to be performed by the controller,
determine a first audio output based on a predictive pattern, the horizontal distance, the horizontal direction, the vertical distance, the vertical direction and the first time duration, and
determine a second audio output that is different than the first audio output based on the predictive pattern, the horizontal distance, the horizontal direction, the vertical distance, the vertical direction and the second time duration; and
a speaker configured to output the first audio output in response to the first touch at the subsequent location being detected and to output the second audio output in response to the second touch at the subsequent location being detected. - View Dependent Claims (11, 12)
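One plausible realization of the controller's audio determination is sketched below: pitch derived from the horizontal distance and direction, gain from the vertical, and tone length from the time duration, so that touch pairs differing only in timing yield audibly different outputs for the speaker. All three mappings, and the specific constants, are our assumptions; the claim leaves the mapping open.

```python
def audio_output(h_dist, h_dir, v_dist, v_dir, duration):
    """Hypothetical mapping from claim-10 features to audio parameters:
    horizontal offset sets pitch, vertical offset sets gain, and the
    inter-touch duration sets tone length (all constants are guesses)."""
    # Signed horizontal offset in "semitones" (25 px per semitone, assumed).
    semitones = (h_dist if h_dir == "right" else -h_dist) / 25.0
    freq_hz = 440.0 * 2.0 ** (semitones / 12.0)
    # Signed vertical offset nudges gain around 0.5, clamped to [0, 1].
    gain = max(0.0, min(1.0, 0.5 + (v_dist if v_dir == "up" else -v_dist) / 400.0))
    length_s = min(1.0, duration)  # a quicker pair yields a shorter tone
    return freq_hz, gain, length_s

# Same offset, two durations -> two distinct audio outputs, as the claim requires.
a = audio_output(100.0, "right", 100.0, "up", 0.2)
b = audio_output(100.0, "right", 100.0, "up", 0.9)
```

Because only the duration differs, `a` and `b` agree in pitch and gain but differ in tone length, giving the two distinguishable outputs the speaker is configured to emit.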
13. A non-transitory computer-readable medium having stored thereon sequences of instructions, the sequences of instructions including instructions which, in response to execution by a processor, cause the processor to perform the steps of:
receiving a location of a reference input corresponding to a first touch at the location of the reference input from a touchscreen;
receiving a location of a first subsequent input corresponding to a first touch at the location of the first subsequent input from the touchscreen;
receiving the location of the reference input corresponding to a second touch at the location of the reference input from the touchscreen;
receiving the location of the first subsequent input corresponding to a second touch at the location of the first subsequent input from the touchscreen;
receiving a location of a second subsequent input from the touchscreen;
determining a first horizontal distance between the location of the reference input and the location of the first subsequent input along a horizontal axis and a first vertical distance between the location of the reference input and the location of the first subsequent input along a vertical axis;
determining a second horizontal distance between the location of the first subsequent input and the location of the second subsequent input along the horizontal axis and a second vertical distance between the location of the first subsequent input and the location of the second subsequent input along the vertical axis;
determining a first horizontal direction between the location of the reference input and the location of the first subsequent input along the horizontal axis and a first vertical direction between the location of the reference input and the location of the first subsequent input along the vertical axis;
determining a second horizontal direction between the location of the first subsequent input and the location of the second subsequent input along the horizontal axis and a second vertical direction between the location of the first subsequent input and the location of the second subsequent input along the vertical axis;
determining a first time duration between the first touch on the touchscreen at the location of the reference input and the first touch on the touchscreen at the location of the first subsequent input and a second time duration between the second touch on the touchscreen at the location of the reference input and the second touch on the touchscreen at the location of the first subsequent input, the second time duration being different than the first time duration;
associating the first horizontal distance, the first vertical distance, the second horizontal distance, the second vertical distance, the first horizontal direction, the first vertical direction, the second horizontal direction, the second vertical direction and the first time duration with a first gesture;
associating the first horizontal distance, the first vertical distance, the second horizontal distance, the second vertical distance, the first horizontal direction, the first vertical direction, the second horizontal direction, the second vertical direction and the second time duration with a second gesture;
associating the first gesture with a first instruction to be performed by the processor;
associating the second gesture with a second instruction to be performed by the processor; and
determining audio output based on the gesture. - View Dependent Claims (14, 15)
16. A computer-implemented user interface method for managing user inputs, the method comprising:
receiving, from a touchscreen, a location of a reference input corresponding to a first touch on the touchscreen at the location of the reference input;
receiving, from the touchscreen, a location of a first subsequent input corresponding to a first touch on the touchscreen at the location of the first subsequent input;
receiving, from the touchscreen, the location of the reference input corresponding to a second touch on the touchscreen at the location of the reference input;
receiving, from the touchscreen, the location of the first subsequent input corresponding to a second touch on the touchscreen at the location of the first subsequent input;
receiving, from the touchscreen, a location of a second subsequent input;
identifying, by a controller, a first horizontal distance between the reference input and the first subsequent input along a horizontal axis and a first vertical distance between the location of the reference input and the location of the first subsequent input along a vertical axis;
identifying, by the controller, a second horizontal distance between the first subsequent input and the second subsequent input along the horizontal axis and a second vertical distance between the location of the first subsequent input and the location of the second subsequent input along the vertical axis;
identifying, by the controller, a first horizontal direction between the reference input and the first subsequent input along the horizontal axis and a first vertical direction between the location of the reference input and the location of the first subsequent input along the vertical axis;
identifying, by the controller, a second horizontal direction between the first subsequent input and the second subsequent input along the horizontal axis and a second vertical direction between the location of the first subsequent input and the location of the second subsequent input along the vertical axis;
determining, by the controller, a first time duration between the first touch on the touchscreen at the location of the reference input and the first touch on the touchscreen at the location of the first subsequent input and a second time duration between the second touch on the touchscreen at the location of the reference input and the second touch on the touchscreen at the location of the first subsequent input, the second time duration being different than the first time duration;
associating, by the controller, the first horizontal distance, the first vertical distance, the second horizontal distance, the second vertical distance, the first horizontal direction, the first vertical direction, the second horizontal direction, the second vertical direction and the first time duration of the received inputs with a first gesture;
associating, by the controller, the first horizontal distance, the first vertical distance, the second horizontal distance, the second vertical distance, the first horizontal direction, the first vertical direction, the second horizontal direction, the second vertical direction and the second time duration of the received inputs with a second gesture that is different than the first gesture;
associating, by the controller, the first gesture with a first instruction to be performed by the controller;
associating, by the controller, the second gesture with a second instruction to be performed by the controller; and
determining, by the controller, an audio output based on a predictive pattern, the first horizontal distance, the first vertical distance, the second horizontal distance, the second vertical distance, the first horizontal direction, the first vertical direction, the second horizontal direction and the second vertical direction. - View Dependent Claims (17, 18, 19)
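The two-segment geometry recited here (reference input, then a first and a second subsequent input) reduces to computing per-segment offsets. A minimal sketch, with direction labels of our own choosing:

```python
def segment_features(p0, p1, p2):
    """Per-segment horizontal/vertical distances and directions for a
    three-touch path: reference -> first subsequent -> second subsequent.
    Points are (x, y) touchscreen coordinates."""
    feats = []
    for (x0, y0), (x1, y1) in ((p0, p1), (p1, p2)):
        dx, dy = x1 - x0, y1 - y0
        feats.append((abs(dx), "right" if dx >= 0 else "left",
                      abs(dy), "up" if dy >= 0 else "down"))
    return feats

# An L-shaped path: right along the horizontal axis, then straight down.
features = segment_features((0, 0), (100, 0), (100, -80))
```

The returned pair of feature tuples corresponds to the first and second distances/directions that the controller associates (together with the time duration) with a gesture.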
Specification