Three-dimensional user interface session control
Abstract
A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
35 Claims
1. A method, comprising:
receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture comprising a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis; and
transitioning the non-tactile 3D user interface from a first state to a second state upon detecting completion of the gesture.
(Dependent claims: 2-10)
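Claim 1 turns on recognizing a back-and-forth motion along one axis: a first displacement in one direction, then a return displacement in the opposite direction along the same axis. A minimal sketch of such a detector is below; the function name, the travel threshold, and the coordinate convention are illustrative assumptions, not taken from the patent.

```python
def detect_push_pull(coords, axis=2, min_travel=0.05):
    """Detect a motion along `axis` followed by a second motion in the
    opposite direction along the same axis (hypothetical thresholds).

    coords: sequence of (x, y, z) positions from the sensing device.
    Returns True when the hand travels at least `min_travel` away from
    its starting position and then at least `min_travel` back.
    """
    values = [c[axis] for c in coords]
    if len(values) < 3:
        return False
    # Extremum of the first motion: the point farthest from the start
    # along the selected axis.
    start = values[0]
    idx = max(range(len(values)), key=lambda i: abs(values[i] - start))
    first_travel = values[idx] - start
    # Second motion: displacement from the extremum to the final sample.
    second_travel = values[-1] - values[idx]
    return (abs(first_travel) >= min_travel
            and abs(second_travel) >= min_travel
            and first_travel * second_travel < 0)  # opposite directions
```

A "push-pull" along the z-axis, for example, would register as `first_travel` positive and `second_travel` negative; a one-way sweep fails the reversal test and would not trigger the state transition.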
11. A method, comprising:
associating, in a computer executing a non-tactile three dimensional (3D) user interface, multiple regions, comprising at least first and second regions, within a field of view of a sensing device coupled to the computer with respective states of the non-tactile 3D user interface, comprising at least first and second states associated respectively with the first and second regions;
conveying visual feedback to a user of the computer on a display having a vertical orientation;
receiving a set of multiple 3D coordinates representing a vertical hand movement from the first region to the second region; and
responsively to the vertical hand movement, transitioning the non-tactile 3D user interface from the first state to the second state.
(Dependent claims: 12-14)
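Claim 11 associates regions of the field of view with interface states and transitions on a vertical hand movement between regions. The sketch below illustrates the idea; the region boundaries, state names, and normalized-coordinate convention are hypothetical, not specified by the claim.

```python
# Hypothetical mapping: normalized vertical (y) ranges within the
# field of view, each associated with a UI state.
REGIONS = {
    "lower": (0.0, 0.5),
    "upper": (0.5, 1.0),
}
STATE_FOR_REGION = {"lower": "unlocked", "upper": "locked"}

def region_of(y):
    """Return the name of the region containing vertical coordinate y."""
    for name, (lo, hi) in REGIONS.items():
        if lo <= y < hi:
            return name
    return None

def transition(coords):
    """Return the new UI state if the hand moved from one region to
    another over the coordinate sequence, else None."""
    start = region_of(coords[0][1])
    end = region_of(coords[-1][1])
    if start and end and start != end:
        return STATE_FOR_REGION[end]
    return None
```

A movement that starts and ends inside the same region yields no transition, which matches the claim's requirement that the hand move *from* the first region *to* the second.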
15. An apparatus, comprising:
a three dimensional (3D) optical sensor having a field of view and coupled to a computer executing a non-tactile three dimensional (3D) user interface; and
an illumination element that, when illuminated, is configured to be visible to a user when the user is positioned within the field of view of the 3D optical sensor, so as to convey visual feedback to the user indicating the user's position relative to the field of view.
(Dependent claims: 16-19)
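The illumination element of claim 15 gives the user feedback about whether they stand inside the sensor's field of view. One way to decide when to light such an element is a simple cone test against the sensor's optical axis; the half-angle, range limit, and coordinate frame below are illustrative assumptions only.

```python
import math

def led_visible(position, fov_deg=60.0, max_range=3.0):
    """Hypothetical sketch: True when `position` (x, y, z in meters,
    sensor at the origin looking along +z) lies inside a conical field
    of view, i.e. when the illumination element should be lit.
    """
    x, y, z = position
    if z <= 0 or z > max_range:
        return False  # behind the sensor or beyond its working range
    # Angle between the optical axis (+z) and the line to the user.
    angle = math.degrees(math.atan2(math.hypot(x, y), z))
    return angle <= fov_deg / 2.0
```

In practice the element could also blink or dim near the cone's edge to warn the user before they leave the field of view, which is one reading of the "position relative to the field of view" feedback the claim describes.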
20. An apparatus, comprising:
a sensing device; and
a computer executing a non-tactile three dimensional (3D) user interface and configured to receive, from the sensing device, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of the sensing device, the gesture comprising a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis, and to transition the non-tactile 3D user interface from a first state to a second state upon detecting completion of the gesture.
(Dependent claims: 21-29)
30. An apparatus, comprising:
a sensing device;
a display having a vertical orientation; and
a computer coupled to drive the display to convey visual feedback to a user of the computer while executing a non-tactile three dimensional (3D) user interface, and configured to associate multiple regions, comprising at least first and second regions, within a field of view of the sensing device with respective states of the non-tactile 3D user interface, comprising at least first and second states associated respectively with the first and second regions, to receive a set of multiple 3D coordinates representing a vertical hand movement from the first region to the second region, and responsively to the vertical hand movement, to transition the non-tactile 3D user interface from the first state to the second state.
(Dependent claims: 31-33)
34. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer executing a non-tactile three dimensional (3D) user interface, cause the computer to receive, from a sensing device, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of the sensing device, the gesture comprising a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis, and to transition the non-tactile 3D user interface from a first state to a second state upon detecting completion of the gesture.
35. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer coupled to drive a display having a vertical orientation to convey visual feedback to a user of the computer and executing a non-tactile three dimensional (3D) user interface, cause the computer to associate multiple regions, comprising at least first and second regions, within a field of view of a sensing device with respective states of the non-tactile 3D user interface, comprising at least first and second states associated respectively with the first and second regions, to receive a set of multiple 3D coordinates representing a vertical hand movement from the first region to the second region, and responsively to the vertical hand movement, to transition the non-tactile 3D user interface from the first state to the second state.