3-D MOTION CONTROL FOR DOCUMENT DISCOVERY AND RETRIEVAL
First Claim
1. A document processing method comprising:
- in memory, associating each of a plurality of hand gestures that are detectable with a three-dimensional sensor with a respective one of a plurality of item processing tasks, the three-dimensional sensor being associated with a touch sensitive display device, at least one touch gesture that is detectable with the touch sensitive display device being associated with at least one of the plurality of item processing tasks;
- displaying a set of graphic objects on the touch sensitive display device, each graphic object being associated with a respective item;
- with the three-dimensional sensor, detecting a hand gesture;
- identifying the respective one of the item processing tasks that is associated with the detected hand gesture; and
- implementing the identified one of the item processing tasks on the displayed graphic objects, comprising causing at least a subset of the displayed graphic objects to respond on the display device based on attributes of the respective items.
1 Assignment
0 Petitions
Abstract
A processing method includes associating, in memory, each of a plurality of hand gestures that are detectable with a three-dimensional sensor with a respective one of a plurality of item processing tasks. A plurality of graphic objects is displayed on a touch-sensitive display device, each graphic object being associated with a respective item. With the three-dimensional sensor, a hand gesture is detected. The item processing task associated with the detected hand gesture is identified and implemented on the displayed graphic objects, causing at least a subset of the displayed graphic objects to respond on the display device based on attributes of the respective items. Item processing tasks may also be implemented through predefined touch gestures.
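The abstract describes an in-memory mapping from detectable gestures to item processing tasks, followed by detection, identification, and implementation of the mapped task. A minimal, hypothetical sketch of that pipeline follows; the class and function names (`GestureTaskRegistry`, `filter_recent`, and so on) are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch of the claimed gesture-to-task pipeline.
# All names here are illustrative, not taken from the patent.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class GraphicObject:
    """A displayed graphic object associated with a respective item."""
    item_id: str
    attributes: dict = field(default_factory=dict)


class GestureTaskRegistry:
    """Associates detectable gestures with item processing tasks in memory."""

    def __init__(self) -> None:
        self._tasks: Dict[str, Callable[[List[GraphicObject]], List[GraphicObject]]] = {}

    def associate(self, gesture: str, task) -> None:
        self._tasks[gesture] = task

    def dispatch(self, detected_gesture: str, objects: List[GraphicObject]):
        task = self._tasks[detected_gesture]  # identify the associated task
        return task(objects)                  # implement it on the displayed objects


# Example task: respond with only the objects whose items match an attribute.
def filter_recent(objects):
    return [o for o in objects if o.attributes.get("recent")]


registry = GestureTaskRegistry()
registry.associate("swipe-left", filter_recent)  # 3-D hand gesture
registry.associate("tap", filter_recent)         # touch gesture mapped to the same task

docs = [GraphicObject("a", {"recent": True}), GraphicObject("b", {"recent": False})]
subset = registry.dispatch("swipe-left", docs)
```

Note that mapping both a hand gesture and a touch gesture to the same task mirrors the last sentence of the abstract: the same item processing task is reachable from either input modality.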
32 Citations
22 Claims
1. A document processing method comprising:
- in memory, associating each of a plurality of hand gestures that are detectable with a three-dimensional sensor with a respective one of a plurality of item processing tasks, the three-dimensional sensor being associated with a touch sensitive display device, at least one touch gesture that is detectable with the touch sensitive display device being associated with at least one of the plurality of item processing tasks;
- displaying a set of graphic objects on the touch sensitive display device, each graphic object being associated with a respective item;
- with the three-dimensional sensor, detecting a hand gesture;
- identifying the respective one of the item processing tasks that is associated with the detected hand gesture; and
- implementing the identified one of the item processing tasks on the displayed graphic objects, comprising causing at least a subset of the displayed graphic objects to respond on the display device based on attributes of the respective items.

Dependent claims: 2-19.
20. An interactive user interface for processing items comprising:
- a touch-sensitive display device;
- a three-dimensional sensor for detection of hand gestures adjacent the display device;
- instructions stored in memory for:
  - associating each of a plurality of hand gestures that are detectable with a three-dimensional sensor with a respective one of a plurality of item processing tasks, at least one of the item processing tasks being associated with a touch gesture detectable by the touch-sensitive display device,
  - displaying a set of graphic objects on the display, each graphic object representing a respective item,
  - with the three-dimensional sensor, detecting a hand gesture,
  - identifying the respective one of the item processing tasks that is associated with the detected hand gesture, and
  - implementing the identified one of the item processing tasks on the displayed graphic objects, comprising causing at least a subset of the displayed graphic objects to respond on the display device based on attributes of the respective items; and
- a processor in communication with the memory and display for executing the instructions.

Dependent claim: 21.
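Claim 20 recites a system in which two input sources (the touch-sensitive display and the three-dimensional sensor) feed one common gesture-to-task mapping executed by a processor. A hedged sketch of such an event loop follows; the event format and all names (`run_interface`, the `(source, gesture)` tuples) are assumptions for illustration only.

```python
# Illustrative event loop for the claimed interface: a touch surface and a
# 3-D sensor both feed a common gesture-to-task mapping. The event format
# and function names are hypothetical, not from the patent.
from typing import Callable, Dict, Iterator, List, Tuple


def run_interface(events: Iterator[Tuple[str, str]],
                  tasks: Dict[str, Callable[[list], list]],
                  objects: list) -> list:
    """Consume (source, gesture) events from either sensor and apply the
    associated item processing task to the displayed objects."""
    for source, gesture in events:
        task = tasks.get(gesture)  # identify the task, if any, for this gesture
        if task is not None:
            objects = task(objects)  # update what the display would show
    return objects


# One task reachable from both the 3-D sensor and the touch display.
tasks = {"circle": sorted, "double-tap": sorted}
events = [("3d-sensor", "circle"), ("touch", "unknown")]
result = run_interface(iter(events), tasks, [3, 1, 2])
```

Unrecognized gestures are simply ignored, so the loop tolerates sensor noise without changing the displayed set.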
22. A method for using 2D and 3D motion control on a common user interface comprising:
- providing for receiving a 2D gesture on a graphic object displayed on a tactile user interface from a user's hand;
- using a 3D sensor, capturing an orientation of the user's hand;
- with a processor, calculating a location of the user from the hand orientation; and
- performing at least one of:
  - repositioning the graphic object on the tactile user interface, based on the detected hand orientation, such that the graphic objects are viewable by the user in a correct orientation to the user's location, and
  - creating a workspace boundary around the graphic objects of each of a plurality of users such that each user's hand gestures are used for implementing an item processing task on the displayed graphic objects within the respective boundary.
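The steps of claim 22 (inferring the user's location from a captured hand orientation, then either reorienting objects toward that user or restricting gestures to a per-user workspace boundary) can be sketched in a few lines. The geometry below is a simplifying assumption (a flat tabletop display, with the user located opposite the direction the hand points), and all names are hypothetical.

```python
# Hedged sketch of claim 22's orientation and boundary steps.
# Assumes a flat tabletop display; the user is taken to sit in the
# direction opposite to where the captured hand vector points.
import math
from typing import Tuple


def user_bearing(hand_vector: Tuple[float, float]) -> float:
    """Calculate the user's direction (radians) from the hand orientation:
    the reverse of the vector in which the hand points across the display."""
    dx, dy = hand_vector
    return math.atan2(-dy, -dx)


def reorient(object_angle: float, bearing: float) -> float:
    """Reposition step: rotate a graphic object so it is viewable in the
    correct orientation for the located user."""
    return bearing  # a full system would animate from object_angle to bearing


def in_workspace(point: Tuple[float, float],
                 center: Tuple[float, float],
                 radius: float) -> bool:
    """Boundary step: accept a gesture only inside this user's workspace."""
    return math.dist(point, center) <= radius


bearing = user_bearing((0.0, -1.0))   # hand points toward -y, so user is at +y
angle = reorient(0.0, bearing)        # object rotated to face that user
inside = in_workspace((1.0, 1.0), (0.0, 0.0), 2.0)
```

A circular boundary is the simplest choice here; the claim itself leaves the shape of each user's workspace open.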
Specification