Three dimensional user interface session control using depth sensors
First Claim
1. A method for interactive control, comprising:
processing, by a computer executing a non-tactile three dimensional (3D) user interface, 3D coordinates of a hand positioned within a field of view of a sensing device coupled to the computer in order to detect gestures made by the hand;
in response to a first gesture detected by the computer, transitioning the 3D user interface from a first state in which the 3D user interface is disengaged to a second state in which the 3D user interface tracks but does not respond to the detected gestures;
in response to a second gesture detected by the computer, subsequent to the first gesture, transitioning the 3D user interface from the second state to a third state in which user control of the 3D user interface is engaged; and
in response to a third gesture detected by the computer, subsequent to the second gesture, accepting and executing a command indicated by the third gesture.
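The claim above describes a three-state session state machine, disengaged, then tracking, then engaged, advanced by successive gestures. A minimal sketch of that control flow in Python follows; the state names and the gesture labels "focus" and "engage" are illustrative assumptions, not terms used in the patent.

```python
from enum import Enum, auto

class UIState(Enum):
    DISENGAGED = auto()  # first state: the 3D user interface is disengaged
    TRACKING = auto()    # second state: tracks the hand, does not respond
    ENGAGED = auto()     # third state: user control is engaged

class SessionController:
    """Hypothetical session controller for the three-gesture sequence."""

    def __init__(self):
        self.state = UIState.DISENGAGED
        self.executed = []  # commands accepted while engaged

    def on_gesture(self, gesture):
        if self.state is UIState.DISENGAGED and gesture == "focus":
            # first gesture: begin tracking without responding
            self.state = UIState.TRACKING
        elif self.state is UIState.TRACKING and gesture == "engage":
            # second gesture: engage user control
            self.state = UIState.ENGAGED
        elif self.state is UIState.ENGAGED:
            # third gesture: accept and execute the indicated command
            self.executed.append(gesture)
        return self.state
```

Note that a gesture made out of order (e.g. "engage" while disengaged) leaves the state unchanged, matching the claim's requirement that the second and third gestures be subsequent to the first and second respectively.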
Abstract
A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
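The abstract describes a gesture composed of a motion in one direction along a selected axis followed by a motion in the opposite direction along the same axis (e.g. a push-pull toward a depth sensor). A hedged sketch of how completion of such a gesture might be detected from a stream of 3D hand coordinates; the choice of the depth axis (index 2) and the displacement threshold are assumptions, not values from the patent.

```python
def detect_push_pull(coords, axis=2, threshold=0.05):
    """Return True once a displacement of at least `threshold` along `axis`
    is followed by a displacement of at least `threshold` in the opposite
    direction along the same axis.

    `coords` is a sequence of (x, y, z) hand positions; `threshold` is an
    assumed minimum displacement in the sensor's distance units.
    """
    vals = [c[axis] for c in coords]
    start = vals[0]
    direction = 0      # 0 until the first motion is established, then +1/-1
    extreme = start    # farthest point reached during the first motion
    for v in vals[1:]:
        if direction == 0:
            if abs(v - start) >= threshold:
                direction = 1 if v > start else -1
                extreme = v
        elif (v - extreme) * direction > 0:
            extreme = v            # first motion still advancing
        elif (extreme - v) * direction >= threshold:
            return True            # opposite motion completed the gesture
    return False
```

For example, a hand moving 0.2 units toward the sensor and then back completes the gesture, while a one-way motion does not.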
15 Claims
1. A method for interactive control, comprising:
processing, by a computer executing a non-tactile three dimensional (3D) user interface, 3D coordinates of a hand positioned within a field of view of a sensing device coupled to the computer in order to detect gestures made by the hand;
in response to a first gesture detected by the computer, transitioning the 3D user interface from a first state in which the 3D user interface is disengaged to a second state in which the 3D user interface tracks but does not respond to the detected gestures;
in response to a second gesture detected by the computer, subsequent to the first gesture, transitioning the 3D user interface from the second state to a third state in which user control of the 3D user interface is engaged; and
in response to a third gesture detected by the computer, subsequent to the second gesture, accepting and executing a command indicated by the third gesture.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
11. Apparatus for interactive control, comprising:
a sensing device, configured to output an indication of three dimensional (3D) coordinates of a hand positioned within a field of view of the sensing device; and
a computer configured to execute a non-tactile 3D user interface and to detect, based on the indication output by the sensing device, gestures made by the hand, comprising at least first, second and third gestures,
wherein in response to the first gesture, the computer transitions the 3D user interface from a first state in which the 3D user interface is disengaged to a second state in which the 3D user interface tracks but does not respond to the detected gestures,
and in response to the second gesture detected by the computer, subsequent to the first gesture, the computer transitions the 3D user interface from the second state to a third state in which user control of the 3D user interface is engaged,
and in response to the third gesture detected by the computer, subsequent to the second gesture, the computer accepts and executes a command indicated by the third gesture.
- View Dependent Claims (12, 13, 14, 15)
Specification