Gestures for touch sensitive input devices
Abstract
Methods and systems for processing touch inputs are disclosed. The invention in one respect includes reading data from a multipoint sensing device such as a multipoint touch screen where the data pertains to touch input with respect to the multipoint sensing device, and identifying at least one multipoint gesture based on the data from the multipoint sensing device.
3372 Citations
100 Claims
1. A computer implemented method for processing touch inputs, said method comprising:
reading data from a touch screen, the data pertaining to touch input with respect to the touch screen, and the touch screen having a multipoint capability; and
identifying at least one multipoint gesture based on the data from the touch screen.
(Dependent claims: 2, 3, 4, 5, 6, 7, 8)
9. A gestural method, comprising:
detecting multiple touches at different points on a touch sensitive surface at the same time; and
segregating the multiple touches into at least two separate gestural inputs occurring simultaneously, each gestural input having a different function.
(Dependent claims: 10)
11. A gestural method, comprising:
concurrently detecting a plurality of gestures performed with reference to a touch sensing device; and
producing a different command for each of the gestures that have been detected.
(Dependent claims: 12, 13, 14)
15. A gestural method, comprising:
displaying a graphical image on a display screen;
detecting a plurality of touches at the same time on a touch sensitive device; and
linking the detected multiple touches to the graphical image presented on the display screen.
(Dependent claims: 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28)
29. A method of invoking a user interface element on a display via a multipoint touch screen of a computing system, said method comprising:
detecting and analyzing the simultaneous presence of two or more objects in contact with said touch screen;
based at least in part on said analyzing, selecting a user interface tool, from a plurality of available tools, to display on a display for interaction by a user of said computing system; and
controlling the interface tool based at least in part on the further movement of said objects in relation to said touch screen.
(Dependent claims: 30, 31)
32. A touch-based method, comprising:
detecting a user input that occurs over a multipoint sensing device, the user input including one or more inputs, each input having a unique identifier;
during the user input, classifying the user input as a tracking or selecting input when the user input includes one unique identifier, or as a gesture input when the user input includes at least two unique identifiers;
performing tracking or selecting during the user input when the user input is classified as a tracking or selecting input; and
performing one or more control actions during the user input when the user input is classified as a gesture input.
(Dependent claims: 33, 34, 35, 36, 37, 92, 93)
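The classification step in claim 32 — one unique touch identifier means tracking/selecting, two or more means a gesture — can be sketched as follows. Function and return-value names are illustrative, not from the patent:

```python
def classify_input(touch_ids):
    """Classify a user input by the number of unique touch identifiers:
    a single identifier is a tracking/selecting input, two or more make
    the input a gesture (per claim 32)."""
    if len(set(touch_ids)) >= 2:
        return "gesture"
    return "tracking_or_selecting"

# one finger dragging -> tracking_or_selecting; two fingers -> gesture
classify_input([1])
classify_input([1, 2])
```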
38. A touch-based method, comprising:
outputting a GUI on a display;
detecting a user input on a touch sensitive device;
analyzing the user input for characteristics indicative of tracking, selecting or gesturing;
categorizing the user input as a tracking, selecting or gesturing input;
performing tracking or selecting in the GUI when the user input is categorized as a tracking or selecting input; and
performing control actions in the GUI when the user input is categorized as a gesturing input, the actions being based on the particular gesturing input.
(Dependent claims: 94, 95, 96, 97, 98, 99, 100)
39. A touch-based method, comprising:
capturing an initial touch image;
determining the touch mode based on the touch image;
capturing the next touch image;
determining if the touch mode changed between the initial and next touch images;
if the touch mode changed, setting the next touch image as the initial touch image and determining the touch mode based on the new initial touch image; and
if the touch mode stayed the same, comparing the touch images and performing a control function based on the comparison.
(Dependent claims: 40)
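The loop in claim 39 reads as a small state machine over successive touch images. A minimal sketch, assuming a touch image is simply a list of contact points and the touch mode is the number of contacts (both are simplifications, not the patent's definitions):

```python
def touch_mode(image):
    # simplified: the mode is just the number of contacts in the touch image
    return len(image)

def process_stream(images, control):
    """Walk successive touch images; call `control(initial, current)` while
    the touch mode is unchanged, and restart from the current image when
    the mode changes (the claim-39 loop)."""
    if not images:
        return
    initial = images[0]
    mode = touch_mode(initial)
    for nxt in images[1:]:
        if touch_mode(nxt) != mode:
            # mode changed: the next image becomes the new initial image
            initial, mode = nxt, touch_mode(nxt)
        else:
            # mode unchanged: compare images and perform a control function
            control(initial, nxt)
```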
41. A computer implemented method for processing touch inputs, said method comprising:
reading data from a touch screen, the data pertaining to touch input with respect to the touch screen, and the touch screen having a multipoint capability;
converting the data to a collection of features;
classifying the features;
grouping the features into one or more feature groups;
calculating key parameters of the feature groups; and
associating the feature groups to user interface elements on a display.
(Dependent claims: 46)
42. The computer implemented method as recited in claim 41, wherein said method further comprises:
recognizing when at least one of the feature groups indicates performance of a gesture relative to its associated user interface element.

43. The computer implemented method as recited in claim 41, wherein said method further comprises:
providing user feedback when at least one of the feature groups indicates performance of a gesture relative to its associated user interface element.

44. The computer implemented method as recited in claim 41, wherein said method further comprises:
implementing an action when at least one of the feature groups indicates performance of a gesture relative to its associated user interface element.

45. The computer implemented method as recited in claim 44, wherein said method further comprises:
providing user feedback in conjunction with the action.
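A toy rendering of the claim 41 pipeline (read data, convert to features, group the features, calculate key parameters per group). The one-dimensional clustering and the choice of centroid as the key parameter are illustrative simplifications, not the patent's method:

```python
def process_touch_data(points):
    """Sketch of the claim-41 pipeline over raw (x, y) touch points."""
    # converting the data to a collection of features
    features = [{"x": x, "y": y} for (x, y) in points]
    # grouping the features: naive clustering on x, features closer than
    # 50 units join the previous group (threshold is an assumption)
    groups = []
    for f in sorted(features, key=lambda f: f["x"]):
        if groups and f["x"] - groups[-1][-1]["x"] < 50:
            groups[-1].append(f)
        else:
            groups.append([f])
    # calculating key parameters (here: centroid and size) per group
    params = []
    for g in groups:
        cx = sum(f["x"] for f in g) / len(g)
        cy = sum(f["y"] for f in g) / len(g)
        params.append({"centroid": (cx, cy), "count": len(g)})
    return params
```

A real implementation would then associate each group with the user interface element under its centroid.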
47. A computer implemented method, comprising:
outputting a graphical image;
receiving a multitouch gesture input over the graphical image; and
changing the graphical image based on and in unison with the multitouch gesture input.
(Dependent claims: 48, 49)
50. A touch based method, comprising:
receiving a gestural input over a first region;
generating a first command when the gestural input is received over the first region;
receiving the same gestural input over a second region; and
generating a second command when the same gestural input is received over the second region, the second command being different than the first command.
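Claim 50's region-dependent dispatch — the same gesture producing different commands in different regions — might look like the sketch below. The region table, gesture names, and command strings are all hypothetical:

```python
def command_for_gesture(gesture, point, regions):
    """Return the command mapped to `gesture` in whichever region contains
    `point`; the same gesture can mean different things per region."""
    for (x0, y0, x1, y1), commands in regions:
        if x0 <= point[0] < x1 and y0 <= point[1] < y1:
            return commands.get(gesture)
    return None

# hypothetical layout: a swipe adjusts volume on the left half of the
# screen but seeks playback on the right half
REGIONS = [((0, 0, 100, 100), {"swipe_up": "VOLUME_UP"}),
           ((100, 0, 200, 100), {"swipe_up": "SEEK_FORWARD"})]
```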
51. A method for recognizing multiple gesture inputs, the method comprising:
receiving a multitouch gestural stroke on a touch sensitive surface, the multitouch gestural stroke maintaining continuous contact on the touch sensitive surface;
recognizing a first gesture input during the multitouch gestural stroke; and
recognizing a second gesture input during the multitouch gestural stroke.
(Dependent claims: 52)
53. A computer implemented method, comprising:
detecting a plurality of touches on a touch sensing device;
forming one or more touch groups with the plurality of touches;
monitoring the movement of and within each of the touch groups; and
generating control signals when the touches within the touch groups are moved or when the touch groups are moved in their entirety.
54. A method for recognizing a zoom gesture made on a multipoint touch screen computing system, comprising:
detecting the relative locations of a first object and a second object at the same time;
detecting a change in the relative locations of said first and second objects; and
generating a zoom signal in response to said detected change.
(Dependent claims: 55, 56, 57, 58, 59, 60)
61. The method as recited in claim 54, wherein the amount of zooming varies according to the distance between the two objects.
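The zoom family (claims 54 through 61) keys off the change in distance between two tracked objects, with the amount of zooming varying with that distance. A sketch, where the dead-band threshold and the shape of the returned signal are illustrative assumptions:

```python
import math

def zoom_signal(prev_pair, cur_pair, threshold=2.0):
    """Emit a zoom signal when the distance between two tracked objects
    changes; the factor varies with the distance ratio. Each pair is
    ((x1, y1), (x2, y2))."""
    d0 = math.dist(*prev_pair)
    d1 = math.dist(*cur_pair)
    if abs(d1 - d0) < threshold:
        return None                              # no meaningful change
    return {"type": "zoom_in" if d1 > d0 else "zoom_out",
            "factor": d1 / d0}                   # amount tracks the spread
```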
62. A method for recognizing a pan gesture made on a multipoint touch screen, comprising:
detecting the presence of at least a first object and a second object at the same time;
monitoring the position of said at least first and second objects when the objects are moved together across the touch screen; and
generating a pan signal when the position of said at least first and second objects changes relative to an initial position.
(Dependent claims: 63)
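The pan detection of claims 62 and 63 can be approximated by tracking the centroid of the touching objects. Centroid tracking is an assumption here; the claim only requires monitoring the objects' positions:

```python
def pan_signal(prev_points, cur_points):
    """Emit a pan delta when two or more objects move together, measured
    as the displacement of their centroid between two frames."""
    def centroid(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
    (x0, y0), (x1, y1) = centroid(prev_points), centroid(cur_points)
    dx, dy = x1 - x0, y1 - y0
    if dx == 0 and dy == 0:
        return None                      # objects have not moved
    return {"type": "pan", "dx": dx, "dy": dy}
```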
64. A method for recognizing a rotate gesture made on a multipoint touch screen, comprising:
detecting the presence of at least a first object and a second object at the same time;
detecting a rotation of said at least first and second objects; and
generating a rotate signal in response to said detected rotation of said at least first and second objects.
(Dependent claims: 65)
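One common way to realize the rotation detection of claims 64 and 65 is to watch the angle of the line joining the two objects. The angle-difference approach and the 1-degree dead band are illustrative choices, not the patent's:

```python
import math

def rotate_signal(prev_pair, cur_pair, threshold_deg=1.0):
    """Emit a rotate signal from the change in angle of the line joining
    two tracked objects. Each pair is ((x1, y1), (x2, y2))."""
    def angle(pair):
        (x0, y0), (x1, y1) = pair
        return math.atan2(y1 - y0, x1 - x0)
    delta = math.degrees(angle(cur_pair) - angle(prev_pair))
    if abs(delta) < threshold_deg:
        return None                      # below the rotation dead band
    return {"type": "rotate", "degrees": delta}
```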
66. A computer implemented method for initiating floating controls via a touch screen, the method comprising:
detecting the presence of an object on the touch screen;
recognizing the object; and
generating a user interface element on the touch screen in the vicinity of the object based on the recognized object.
(Dependent claims: 67, 68, 69, 70, 71)
72. A computer implemented method for initiating zooming targets via a touch screen, the method comprising:
displaying an image on a GUI; and
enlarging the image for a period of time when the presence of an object is detected over the image.
(Dependent claims: 73, 74, 75, 76, 77, 78)
79. A computer implemented method of initiating a page turn via a touch screen, the method comprising:
displaying a page from a multitude of pages in a GUI presented on the touch screen;
detecting the presence of an object in a predetermined region on the touch screen over the page; and
generating a page turn signal when the object is translated on the touch screen in the predetermined region.
(Dependent claims: 80, 81, 82)
83. A computer implemented method of initiating inertia, the method comprising:
displaying an image on a GUI;
detecting a stroke on a touch sensitive surface;
noting the speed and direction of the stroke;
moving the image or features embedded in the image in accordance with the speed and direction of the stroke; and
slowing the motion of the image or features embedded in the image in accordance with inertia principles when the stroke is no longer detected.
(Dependent claims: 84)
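The inertia behavior of claims 83 and 84 — motion continuing after the stroke lifts, then decaying — is commonly modeled with per-frame friction. The exponential decay and the cut-off speed below are illustrative assumptions, shown in one dimension:

```python
def inertia_positions(position, velocity, friction=0.9, min_speed=0.5):
    """After the stroke lifts, keep moving in the stroke's direction and
    decay the speed each frame until it falls below `min_speed`.
    Returns the sequence of positions visited while coasting."""
    positions = []
    while abs(velocity) >= min_speed:
        position += velocity
        velocity *= friction          # friction slows the motion each frame
        positions.append(position)
    return positions
```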
85. A method of simulating a keyboard, comprising:
providing a display and a touch screen positioned over the display;
displaying a keyboard on the display, the keyboard including at least a first and a second key;
detecting the presence of a first object over the first key and a second object over the second key at the same time; and
generating a single control function in response to the detection of said first object over the first key and said second object over said second key.
(Dependent claims: 86, 87, 88)
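The keyboard claims (85 through 88) turn two simultaneous key presses into one control function, as a chord rather than two independent key events. A sketch; the chord table and command names are hypothetical:

```python
def chord_function(pressed_keys, chords):
    """Map a set of simultaneously pressed virtual keys to a single
    control function, or None when no chord matches."""
    return chords.get(frozenset(pressed_keys))

# hypothetical chord table: Shift+A types an uppercase A, Ctrl+C copies
CHORDS = {frozenset({"shift", "a"}): "TYPE_UPPER_A",
          frozenset({"ctrl", "c"}): "COPY"}
```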
89. A computer implemented method comprising:
presenting a virtual wheel;
detecting at least a first finger over the virtual wheel;
associating the finger to the virtual wheel; and
generating control signals when the finger is moved about the virtual wheel.
(Dependent claims: 90, 91)
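For the virtual wheel of claims 89 through 91, one plausible realization measures the angle a finger sweeps around the wheel's center; the caller would convert that angle into scroll or volume control signals. The angle-based measurement is an assumption, not the patent's stated mechanism:

```python
import math

def wheel_delta(center, prev_finger, cur_finger):
    """Degrees swept by a finger moving about a virtual wheel centered at
    `center`; positive values are counter-clockwise sweeps."""
    def angle(p):
        return math.atan2(p[1] - center[1], p[0] - center[0])
    return math.degrees(angle(cur_finger) - angle(prev_finger))
```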
Specification