System and method for gesture based control system
First Claim
1. A method of controlling a computer display comprising:
detecting a physical control gesture made by a user from gesture data received via an optical detector, wherein the gesture data is absolute three-space location data of an instantaneous state of the user at a point in time and space, the detecting comprising aggregating the gesture data, and identifying the physical control gesture using only the gesture data;
translating the control gesture to an executable command;
updating the computer display in response to the executable command.
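The three claimed steps (detecting, translating, updating) can be sketched as a minimal pipeline. The `GestureData` type, the pose labels, and the pose-to-command table below are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GestureData:
    """Absolute three-space location data for a tracked hand at one instant."""
    pose: str                             # recognized hand-pose label
    position: tuple[float, float, float]  # x, y, z in sensor coordinates

# Hypothetical mapping from identified gestures to executable commands.
COMMANDS = {
    "open-palm": "STOP_PLAYBACK",
    "fist": "GRAB_ELEMENT",
    "point": "MOVE_CURSOR",
}

def detect_gesture(frames: list[GestureData]) -> str:
    """Aggregate per-frame gesture data and identify the control gesture.

    A real detector would fuse multiple camera views over time; here we
    simply take the pose reported by the most recent frame.
    """
    return frames[-1].pose

def translate(gesture: str) -> str:
    """Translate the control gesture to an executable command."""
    return COMMANDS[gesture]

def update_display(command: str) -> str:
    """Update the computer display in response to the command (stub)."""
    return f"display <- {command}"

frames = [GestureData("point", (0.1, 0.2, 0.9)),
          GestureData("fist", (0.1, 0.2, 0.8))]
print(update_display(translate(detect_gesture(frames))))  # display <- GRAB_ELEMENT
```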
5 Assignments
0 Petitions
Abstract
The system provides a gestural interface to various visually presented elements, presented on a display screen or screens. A gestural vocabulary includes ‘instantaneous’ commands, in which forming one or both hands into the appropriate ‘pose’ results in an immediate, one-time action; and ‘spatial’ commands, in which the operator either refers directly to elements on the screen by way of literal ‘pointing’ gestures or performs navigational maneuvers by way of relative or “offset” gestures. The system contemplates the ability to identify the user's hands in the form of a glove or gloves with certain indicia provided thereon, or any suitable means for providing recognizable indicia on a user's hands or body parts. A system of cameras can detect the position, orientation, and movement of the user's hands and translate that information into executable commands.
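The abstract's split between ‘instantaneous’ pose commands and ‘spatial’ pointing or “offset” gestures might be dispatched roughly as follows. The record layout, gesture kinds, and handler strings are illustrative assumptions, not taken from the specification:

```python
def dispatch(gesture: dict) -> str:
    """Route a recognized gesture to an instantaneous or spatial handler.

    `gesture` is a hypothetical record: {"kind": ..., "pose": ...} for
    one-time pose commands, {"kind": ..., "ray": (x, y)} for literal
    pointing, or {"kind": ..., "delta": (dx, dy)} for offset navigation.
    """
    if gesture["kind"] == "instantaneous":
        # Forming the pose triggers an immediate, one-time action.
        return f"fire-once:{gesture['pose']}"
    if gesture["kind"] == "spatial":
        # Literal pointing: the gesture refers directly to a screen element.
        x, y = gesture["ray"]
        return f"select-element-at:({x}, {y})"
    if gesture["kind"] == "offset":
        # Relative navigation: move the view by a displacement, not to a point.
        dx, dy = gesture["delta"]
        return f"pan-view-by:({dx}, {dy})"
    raise ValueError(f"unknown gesture kind: {gesture['kind']}")

print(dispatch({"kind": "instantaneous", "pose": "open-palm"}))  # fire-once:open-palm
print(dispatch({"kind": "spatial", "ray": (640, 360)}))          # select-element-at:(640, 360)
```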
737 Citations
107 Claims
1. A method of controlling a computer display comprising:

detecting a physical control gesture made by a user from gesture data received via an optical detector, wherein the gesture data is absolute three-space location data of an instantaneous state of the user at a point in time and space, the detecting comprising aggregating the gesture data, and identifying the physical control gesture using only the gesture data;

translating the control gesture to an executable command;

updating the computer display in response to the executable command.
2. A method comprising:

detecting poses and motion of an object from gesture data received via an optical detector, wherein the gesture data is absolute three-space location data of an instantaneous state of the poses and motion at a point in time and space, the detecting comprising aggregating the gesture data, and identifying the poses and motion using only the gesture data;

translating the poses and motion into a control signal using a gesture notation; and

controlling a computer application using the control signal.
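Claim 2's "gesture notation" suggests a symbolic encoding that sits between the detected poses and the control signal. A toy notation, assuming one character per finger ('^' extended, '-' curled), might be translated like this; the encoding and signal names are invented for illustration and are not the notation defined in the specification:

```python
# Toy gesture notation: one character per finger, thumb..pinky.
# '^' = extended, '-' = curled. Invented for illustration only.
NOTATION_TO_SIGNAL = {
    "^^^^^": "CTRL_OPEN_HAND",   # all fingers extended
    "-----": "CTRL_FIST",        # all fingers curled
    "-^---": "CTRL_POINT",       # index finger extended
}

def encode(finger_states: list[bool]) -> str:
    """Encode per-finger extension flags as a notation string."""
    return "".join("^" if extended else "-" for extended in finger_states)

def to_control_signal(finger_states: list[bool]) -> str:
    """Translate a detected pose into a control signal via the notation."""
    return NOTATION_TO_SIGNAL[encode(finger_states)]

print(to_control_signal([False, True, False, False, False]))  # CTRL_POINT
```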
3. A method comprising:

automatically detecting a gesture of a body from gesture data received via an optical detector, wherein the gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and space, the detecting comprising aggregating the gesture data, and identifying the gesture using only the gesture data;

translating the gesture to a gesture signal; and

controlling a component coupled to a computer in response to the gesture signal.

Dependent claims: 4–107.
Specification