Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
First Claim
1. A method for receiving an affirmative gesture formed on or about a sensor panel, comprising:
detecting one or more images at the sensor panel generated from a hand formed in a shape of an OK sign;
determining that the one or more images are arranged in a pattern corresponding to a predetermined OK gesture;
determining a centering parameter from the one or more images;
associating the OK gesture with a user interface (UI) element coincident with the centering parameter; and
performing an affirmative action in accordance with the UI element.
2 Assignments
0 Petitions
Abstract
“Real-world” gestures such as hand or finger movements/orientations that are generally recognized to mean certain things (e.g., an “OK” hand signal generally indicates an affirmative response) can be interpreted by a touch or hover sensitive device to more efficiently and accurately effect intended operations. These gestures can include, but are not limited to, “OK gestures,” “grasp everything gestures,” “stamp of approval gestures,” “circle select gestures,” “X to delete gestures,” “knock to inquire gestures,” “hitchhiker directional gestures,” and “shape gestures.” In addition, gestures can be used to provide identification and allow or deny access to applications, files, and the like.
60 Claims
1. A method for receiving an affirmative gesture formed on or about a sensor panel, comprising:
detecting one or more images at the sensor panel generated from a hand formed in a shape of an OK sign;
determining that the one or more images are arranged in a pattern corresponding to a predetermined OK gesture;
determining a centering parameter from the one or more images;
associating the OK gesture with a user interface (UI) element coincident with the centering parameter; and
performing an affirmative action in accordance with the UI element.
Dependent claims: 2.
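The last three steps of claim 1 can be sketched as follows. This is a minimal illustration only: the `Blob` type, the rectangle-based hit test, and all names here are assumptions, and the pattern-matching step that actually classifies the contact images as an OK sign is elided.

```python
from dataclasses import dataclass

@dataclass
class Blob:
    """One contact image detected at the sensor panel (illustrative)."""
    x: float
    y: float
    area: float

def centering_parameter(blobs):
    """Area-weighted centroid of the detected contact images."""
    total = sum(b.area for b in blobs)
    cx = sum(b.x * b.area for b in blobs) / total
    cy = sum(b.y * b.area for b in blobs) / total
    return cx, cy

def element_at(ui_elements, point):
    """Return the first UI element whose bounds contain the point, else None."""
    px, py = point
    for name, (x0, y0, x1, y1) in ui_elements.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return name
    return None
```

Once the images have been classified as an OK gesture, the affirmative action would be dispatched to whichever element `element_at` returns for the centering parameter.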
3. A method for receiving a grouping gesture formed on or about a sensor panel, comprising:
detecting one or more images at the sensor panel generated from a hand changing from a palm-down, outstretched shape to a clenched shape;
determining that the one or more images are arranged in space and time in a pattern and sequence corresponding to a predetermined grasp everything gesture;
determining a circumferential boundary from the one or more images;
associating the grasp everything gesture with user interface (UI) elements within the circumferential boundary; and
performing a grouping action in accordance with the UI elements within the circumferential boundary.
Dependent claims: 4.
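One plausible reading of the boundary and grouping steps in claim 3 is sketched below, with the claimed circumferential boundary stood in for by a circle centered at the contact centroid. The function names and the circle-based boundary are illustrative assumptions, not the patent's definition.

```python
import math

def circumferential_boundary(points):
    """Circle centered at the contact centroid that encloses every
    contact point (a simple stand-in for the claimed boundary)."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radius = max(math.hypot(x - cx, y - cy) for x, y in points)
    return (cx, cy), radius

def elements_within(ui_centers, center, radius):
    """UI elements whose center points fall inside the boundary."""
    cx, cy = center
    return [name for name, (x, y) in ui_centers.items()
            if math.hypot(x - cx, y - cy) <= radius]
```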
5. A method for receiving an affirmative gesture formed on or about a sensor panel, comprising:
detecting one or more images at the sensor panel generated from a clenched fist;
determining that the one or more images are arranged in a pattern corresponding to a predetermined stamp of approval gesture;
determining a centering parameter from the one or more images;
associating the stamp of approval gesture with a user interface (UI) element coincident with the centering parameter; and
performing an affirmative action in accordance with the UI element.
Dependent claims: 6.
7. A method for receiving a selection gesture formed on or about a sensor panel, comprising:
detecting an image at the sensor panel generated from a single finger;
tracking movement of the image over time;
determining that the movement of the image is arranged in space and time in a pattern and sequence corresponding to a predetermined circle select gesture;
determining a circumferential boundary from the movement of the image;
associating the circle select gesture with user interface (UI) elements within the circumferential boundary; and
performing a selecting action in accordance with the UI elements within the circumferential boundary.
Dependent claims: 8.
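A minimal sketch of the circle-select steps in claim 7: a traced path counts as a loop when its endpoints meet, and UI elements inside the traced boundary are found with a ray-casting containment test. Both heuristics, and the tolerance value, are assumptions for illustration.

```python
import math

def is_closed(path, tol=10.0):
    """A traced path counts as a circle-select loop if its endpoints meet."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    return math.hypot(x1 - x0, y1 - y0) <= tol

def inside(path, px, py):
    """Ray-casting point-in-polygon test against the traced path."""
    hit = False
    n = len(path)
    for i in range(n):
        x0, y0 = path[i]
        x1, y1 = path[(i + 1) % n]
        if (y0 > py) != (y1 > py):
            # x-coordinate where the edge crosses the horizontal ray
            x_cross = x0 + (py - y0) * (x1 - x0) / (y1 - y0)
            if px < x_cross:
                hit = not hit
    return hit
```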
9. A method for receiving a deletion gesture formed on or about a sensor panel, comprising:
detecting a first image at the sensor panel generated from a single finger and representative of a first touch;
tracking movement of the first image over time;
detecting a second image at the sensor panel generated from the same finger and representative of a second touch;
tracking movement of the second image over time;
determining that the movement of the first and second images are arranged in space and time in a pattern and sequence corresponding to a predetermined X to delete gesture;
determining an intersection of the movements of the first and second images;
associating the X to delete gesture with a user interface (UI) element coincident with the intersection; and
performing a deleting action in accordance with the UI element coincident with the intersection.
Dependent claims: 10.
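The intersection step in claim 9 can be illustrated with standard segment intersection, here under the simplifying assumption that each stroke is reduced to the chord between its first and last tracked positions:

```python
def stroke_intersection(p1, p2, p3, p4):
    """Intersection of two stroke chords (each stroke reduced to the
    segment between its first and last tracked positions)."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # parallel strokes never form an X
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None  # lines cross only outside the drawn strokes
```

The returned point would then be hit-tested against UI element bounds to find the element coincident with the intersection.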
11. A method for receiving an inquiry gesture formed on or about a sensor panel, comprising:
detecting one or more substantially linearly arranged first images at the sensor panel generated from one or more knuckles in a clenched fist and representative of a first knock;
detecting one or more substantially linearly arranged second images at the sensor panel generated from the same knuckles and representative of a second knock;
determining that the first and second images are arranged in space and time in a pattern and sequence corresponding to a predetermined knock to inquire gesture;
determining a centering parameter from the first or second images;
associating the knock to inquire gesture with a user interface (UI) element coincident with the centering parameter; and
performing an inquiry action in accordance with the UI element coincident with the centering parameter.
Dependent claims: 12.
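Two checks implicit in claim 11 can be sketched simply: whether the knuckle images are "substantially linearly arranged", and whether the second knock follows the first quickly enough. The tolerances and the collinearity criterion are illustrative assumptions.

```python
import math

def roughly_collinear(points, tol=2.0):
    """Knuckle images count as substantially linearly arranged when
    every interior point lies near the line through the endpoints."""
    if len(points) < 3:
        return True
    (x0, y0), (x1, y1) = points[0], points[-1]
    length = math.hypot(x1 - x0, y1 - y0)
    for x, y in points[1:-1]:
        # perpendicular distance from (x, y) to the endpoint line
        dist = abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / length
        if dist > tol:
            return False
    return True

def is_double_knock(t_first, t_second, max_gap=0.6):
    """Two knocks form the gesture when the second follows quickly enough."""
    return 0.0 < t_second - t_first <= max_gap
```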
13. A method for receiving a directional gesture formed on or about a sensor panel, comprising:
detecting one or more images at the sensor panel generated from a hand with thumb extended and all other fingers curled;
determining that the one or more images are arranged in a pattern corresponding to a predetermined hitchhiker gesture;
determining a directional parameter from the one or more images;
associating the hitchhiker gesture with a user interface (UI) element; and
performing a directional action in accordance with the UI element.
Dependent claims: 14.
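The directional parameter of claim 13 could, for example, be derived from the vector between the fist's contact centroid and the extended thumb's contact, quantized to four directions. The vector construction and the y-up coordinate convention are assumptions for illustration.

```python
def directional_parameter(fist_center, thumb_tip):
    """Quantize the fist-to-thumb vector into one of four directions
    (assumes y increases upward)."""
    dx = thumb_tip[0] - fist_center[0]
    dy = thumb_tip[1] - fist_center[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```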
15. A method for receiving a shape gesture formed on or about a sensor panel, comprising:
detecting one or more images at the sensor panel generated from a hand formed in a particular shape;
determining that the one or more images are arranged in a pattern corresponding to a predetermined shape gesture;
associating the shape gesture with a user interface (UI) element; and
performing an action associated with the shape gesture upon the UI element.
Dependent claims: 16, 17, 18, 19.
20. A method for receiving an identification gesture formed on or about a sensor panel, comprising:
detecting one or more images at the sensor panel;
determining that the one or more images represent an attempted identification gesture;
associating the attempted identification gesture with a user interface (UI) element;
determining whether the movement of the one or more images is arranged in space and time in a pattern and sequence corresponding to a user authorized to access the UI element; and
granting or denying access to the UI element in accordance with the determination of whether the attempted identification gesture corresponds to a user authorized to access the UI element.
21. A computer-readable medium comprising program code for receiving an affirmative gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more images at the sensor panel generated from a hand formed in a shape of an OK sign;
determining that the one or more images are arranged in a pattern corresponding to a predetermined OK gesture;
determining a centering parameter from the one or more images;
associating the OK gesture with a user interface (UI) element coincident with the centering parameter; and
performing an affirmative action in accordance with the UI element.
Dependent claims: 22.
23. A computer-readable medium comprising program code for receiving a grouping gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more images at the sensor panel generated from a hand changing from a palm-down, outstretched shape to a clenched shape;
determining that the one or more images are arranged in space and time in a pattern and sequence corresponding to a predetermined grasp everything gesture;
determining a circumferential boundary from the one or more images;
associating the grasp everything gesture with user interface (UI) elements within the circumferential boundary; and
performing a grouping action in accordance with the UI elements within the circumferential boundary.
Dependent claims: 24.
25. A computer-readable medium comprising program code for receiving an affirmative gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more images at the sensor panel generated from a clenched fist;
determining that the one or more images are arranged in a pattern corresponding to a predetermined stamp of approval gesture;
determining a centering parameter from the one or more images;
associating the stamp of approval gesture with a user interface (UI) element coincident with the centering parameter; and
performing an affirmative action in accordance with the UI element.
Dependent claims: 26.
27. A computer-readable medium comprising program code for receiving a selection gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting an image at the sensor panel generated from a single finger;
tracking movement of the image over time;
determining that the movement of the image is arranged in space and time in a pattern and sequence corresponding to a predetermined circle select gesture;
determining a circumferential boundary from the movement of the image;
associating the circle select gesture with user interface (UI) elements within the circumferential boundary; and
performing a selecting action in accordance with the UI elements within the circumferential boundary.
Dependent claims: 28.
29. A computer-readable medium comprising program code for receiving a deletion gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting a first image at the sensor panel generated from a single finger and representative of a first touch;
tracking movement of the first image over time;
detecting a second image at the sensor panel generated from the same finger and representative of a second touch;
tracking movement of the second image over time;
determining that the movement of the first and second images are arranged in space and time in a pattern and sequence corresponding to a predetermined X to delete gesture;
determining an intersection of the movements of the first and second images;
associating the X to delete gesture with a user interface (UI) element coincident with the intersection; and
performing a deleting action in accordance with the UI element coincident with the intersection.
Dependent claims: 30.
31. A computer-readable medium comprising program code for receiving an inquiry gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more substantially linearly arranged first images at the sensor panel generated from one or more knuckles in a clenched fist and representative of a first knock;
detecting one or more substantially linearly arranged second images at the sensor panel generated from the same knuckles and representative of a second knock;
determining that the first and second images are arranged in space and time in a pattern and sequence corresponding to a predetermined knock to inquire gesture;
determining a centering parameter from the first or second images;
associating the knock to inquire gesture with a user interface (UI) element coincident with the centering parameter; and
performing an inquiry action in accordance with the UI element coincident with the centering parameter.
Dependent claims: 32.
33. A computer-readable medium comprising program code for receiving a directional gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more images at the sensor panel generated from a hand with thumb extended and all other fingers curled;
determining that the one or more images are arranged in a pattern corresponding to a predetermined hitchhiker gesture;
determining a directional parameter from the one or more images;
associating the hitchhiker gesture with a user interface (UI) element; and
performing a directional action in accordance with the UI element.
Dependent claims: 34.
35. A computer-readable medium comprising program code for receiving a shape gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more images at the sensor panel generated from a hand formed in a particular shape;
determining that the one or more images are arranged in a pattern corresponding to a predetermined shape gesture;
associating the shape gesture with a user interface (UI) element; and
performing an action associated with the shape gesture upon the UI element.
Dependent claims: 36, 37, 38, 39.
40. A computer-readable medium comprising program code for receiving an identification gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more images at the sensor panel;
determining that the one or more images represent an attempted identification gesture;
associating the attempted identification gesture with a user interface (UI) element;
determining whether the movement of the one or more images is arranged in space and time in a pattern and sequence corresponding to a user authorized to access the UI element; and
granting or denying access to the UI element in accordance with the determination of whether the attempted identification gesture corresponds to a user authorized to access the UI element.
41. A method for receiving a security gesture formed on or about a sensor panel, comprising:
detecting one or more images at the sensor panel generated from two fingers and two thumbs of two hands touching near a center of the sensor panel;
determining that the one or more images correspond to a predetermined unlock gesture; and
performing an unlocking action.
Dependent claims: 42.
43. A method for receiving a security gesture formed on or about a sensor panel, comprising:
detecting four images at the sensor panel generated from two fingers and two thumbs of two hands, each image touching near a different corner of the sensor panel;
determining that the four images correspond to a predetermined lock gesture; and
performing a locking action.
Dependent claims: 44.
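The geometric checks behind the unlock gesture of claim 41 (four touches near the panel center) and the lock gesture of claim 43 (each touch near a different corner) can be sketched together. The tolerance and the panel coordinate model are illustrative assumptions.

```python
import math

def _near(p, q, tol):
    return math.hypot(p[0] - q[0], p[1] - q[1]) <= tol

def classify_security_gesture(touches, width, height, tol=40.0):
    """'unlock' when all four touches cluster near the panel center,
    'lock' when each touch lands near a different corner, else None."""
    if len(touches) != 4:
        return None
    center = (width / 2.0, height / 2.0)
    if all(_near(t, center, tol) for t in touches):
        return "unlock"
    corners = [(0, 0), (width, 0), (0, height), (width, height)]
    used = set()
    for t in touches:
        hit = next((i for i, c in enumerate(corners)
                    if i not in used and _near(t, c, tol)), None)
        if hit is None:
            return None
        used.add(hit)  # each corner may satisfy only one touch
    return "lock"
```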
45. A method for receiving a hand edge gesture formed on or about a sensor panel, comprising:
detecting one or more images at a first location on the sensor panel generated from a side of a palm and pinky finger rotating or translating to a second location on the sensor panel;
determining that the one or more images are arranged in space and time in a pattern and sequence corresponding to a predetermined hand edge gesture; and
performing a predetermined action in accordance with the detected hand edge gesture.
Dependent claims: 46, 47, 48, 49.
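Distinguishing the rotating and translating variants of the hand edge gesture in claim 45 can be sketched by comparing the elongated edge image's centroid and orientation between the first and second locations. The `(centroid, angle)` summary of the image and the tolerances are assumptions for illustration.

```python
import math

def classify_hand_edge(first, second, angle_tol=0.2, dist_tol=5.0):
    """first/second: (centroid, orientation_radians) of the elongated
    palm-edge image at the start and end of the motion."""
    (c0, a0), (c1, a1) = first, second
    rotated = abs(a1 - a0) > angle_tol
    translated = math.hypot(c1[0] - c0[0], c1[1] - c0[1]) > dist_tol
    if rotated and translated:
        return "rotate+translate"
    if rotated:
        return "rotate"
    if translated:
        return "translate"
    return None  # the edge did not move enough to count as a gesture
```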
50. A method for receiving a framing gesture formed on or about a sensor panel, comprising:
detecting a plurality of images at the sensor panel generated from the palms and pinky fingers of two hands touching the sensor panel in an approximate upside-down U shape;
determining that the plurality of images correspond to a predetermined framing gesture;
computing a frame area based on a location of the plurality of detected images; and
performing an action within the frame area.
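The frame-area computation in claim 50 can be approximated as the axis-aligned rectangle spanned by the detected contact images; this bounding-box stand-in is an assumption, not the patent's definition of the frame.

```python
def frame_area(images):
    """Axis-aligned rectangle (x_min, y_min, x_max, y_max) spanned by
    the detected contact images of the two framing hands."""
    xs = [x for x, _ in images]
    ys = [y for _, y in images]
    return (min(xs), min(ys), max(xs), max(ys))
```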
51. A computer-readable medium comprising program code for receiving a security gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more images at the sensor panel generated from two fingers and two thumbs of two hands touching near a center of the sensor panel;
determining that the one or more images correspond to a predetermined unlock gesture; and
performing an unlocking action.
Dependent claims: 52.
53. A computer-readable medium comprising program code for receiving a security gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting four images at the sensor panel generated from two fingers and two thumbs of two hands, each image touching near a different corner of the sensor panel;
determining that the four images correspond to a predetermined lock gesture; and
performing a locking action.
Dependent claims: 54.
55. A computer-readable medium comprising program code for receiving a hand edge gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting one or more images at a first location on the sensor panel generated from a side of a palm and pinky finger rotating or translating to a second location on the sensor panel;
determining that the one or more images are arranged in space and time in a pattern and sequence corresponding to a predetermined hand edge gesture; and
performing a predetermined action in accordance with the detected hand edge gesture.
Dependent claims: 56, 57, 58, 59.
60. A computer-readable medium comprising program code for receiving a framing gesture formed on or about a sensor panel, the program code for causing performance of a method comprising:
detecting a plurality of images at the sensor panel generated from the palms and pinky fingers of two hands touching the sensor panel in an approximate upside-down U shape;
determining that the plurality of images correspond to a predetermined framing gesture;
computing a frame area based on a location of the plurality of detected images; and
performing an action within the frame area.
Specification