Sessionless pointing user interface
Abstract
A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
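The pipeline summarized in the abstract — receive a sequence of 3D maps, detect a gesture directed toward a known device, actuate that device — can be sketched as a small polling loop. This is an illustrative sketch only: `read_map`, `detect_pointing_gesture`, and `actuate` are hypothetical stand-ins for the sensor API and gesture detector, which the patent does not specify at this level.

```python
# Hypothetical sketch of the claimed pipeline: read 3D maps from a
# depth sensor, detect a pointing gesture toward a known device, and
# actuate that device. None of these names come from the patent.

def run_interface(sensor, devices, detect_pointing_gesture, actuate):
    """Poll the sensor and actuate whichever device the user points at.

    sensor  -- object with a read_map() method returning a 3D map (or None)
    devices -- mapping of device id -> identified 3D location
    """
    maps = []
    while True:
        depth_map = sensor.read_map()
        if depth_map is None:          # sensor stream ended
            break
        maps.append(depth_map)
        target = detect_pointing_gesture(maps, devices)
        if target is not None:
            actuate(target)            # e.g. switch a lamp or TV
            maps.clear()               # start a fresh gesture window
```

Passing the detector and actuator in as parameters keeps the loop agnostic to any particular sensor or device protocol.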
10 Claims
1. A method, comprising:
identifying, by a computer coupled to a three-dimensional (3D) sensing device and a display, different, respective locations of multiple controllable devices other than the display;
receiving, by the computer from the 3D sensing device, a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer;
detecting in the maps a gaze direction of the user that is directed toward a given device among the multiple controllable devices;
defining a region in space with an apex at the head of the user and a base encompassing the given device;
defining an interaction zone within the region;
defining an angle threshold;
defining a minimum time period;
analyzing the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the multiple controllable devices, wherein the gesture comprises extending an elbow associated with the hand at an angle greater than or equal to the angle threshold, extending the hand toward the device and pausing the hand for the minimum time period; and
actuating the given device responsively to the gesture.
View Dependent Claims (2, 3, 4, 5)
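The pointing gesture recited in claim 1 — the elbow extended to at least an angle threshold while the hand pauses for a minimum time period — could be tested roughly as below. Joint positions are assumed to have been extracted from the 3D maps upstream; the default thresholds, the stillness tolerance, and all function names are illustrative assumptions, not terms from the patent.

```python
import math

def elbow_angle(shoulder, elbow, wrist):
    """Angle at the elbow (degrees) from three 3D joint positions.

    180 degrees means a fully extended arm. Extracting the joints
    from the 3D maps is assumed to happen upstream; not shown here.
    """
    v1 = [s - e for s, e in zip(shoulder, elbow)]
    v2 = [w - e for w, e in zip(wrist, elbow)]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def is_point_select(frames, angle_threshold=150.0, min_pause_s=0.5,
                    still_radius=0.03):
    """Check the claimed gesture: elbow extended past the angle
    threshold while the hand pauses for at least the minimum time.

    frames -- list of (timestamp_s, shoulder, elbow, wrist, hand)
              tuples, newest last; positions are (x, y, z) in metres.
    still_radius is a hypothetical tolerance for a "paused" hand.
    """
    if not frames:
        return False
    t_end, *_, hand_end = frames[-1]
    for t, shoulder, elbow, wrist, hand in reversed(frames):
        if elbow_angle(shoulder, elbow, wrist) < angle_threshold:
            return False               # arm not extended enough
        if math.dist(hand, hand_end) > still_radius:
            return False               # hand moved: not a pause
        if t_end - t >= min_pause_s:
            return True                # held still long enough
    return False
```

Walking the frames newest-first means the check stops as soon as the pause window is covered, without scanning the whole history.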
6. An apparatus, comprising:
a three-dimensional sensing device;
a display; and
a computer configured to identify different, respective locations of multiple controllable devices other than the display,
to receive from the three-dimensional sensing device a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer,
to detect in the maps a gaze direction of the user that is directed toward a given device among the multiple controllable devices,
to define a region in space with an apex at the head of the user and a base encompassing the given device,
to define an interaction zone within the region,
to analyze the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the multiple controllable devices, and
to actuate the given device responsively to the gesture,
wherein the computer is configured to define an angle threshold and a minimum time period, and
wherein the gesture detected by the computer comprises extending an elbow associated with the hand at an angle greater than or equal to the angle threshold, extending the hand toward the given device, and pausing the hand for the minimum time period.
View Dependent Claims (7, 8, 9)
10. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer executing a non-tactile user interface and coupled to a display, cause the computer to:
identify different, respective locations of multiple controllable devices other than the display,
receive from a three-dimensional sensing device that is coupled to the computer a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer,
detect in the maps a gaze direction of the user that is directed toward a given device among the multiple controllable devices,
define a region in space with an apex at the head of the user and a base encompassing the given device,
define an interaction zone within the region,
analyze the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the multiple controllable devices, and
actuate the given device responsively to the gesture,
wherein the instructions cause the computer to define an angle threshold and a minimum time period, and
wherein the gesture detected by the computer comprises extending an elbow associated with the hand at an angle greater than or equal to the angle threshold, extending the hand toward the given device, and pausing the hand for the minimum time period.
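The region in space recited throughout the claims — a cone with its apex at the user's head and its base encompassing the given device — and the interaction zone within it can be modeled geometrically. A minimal sketch, assuming a hypothetical base radius and near/far zone bounds (the patent does not fix these values):

```python
import math

def in_pointing_cone(head, device, point, base_radius=0.5):
    """True if `point` lies inside a cone with apex at `head` whose
    base (radius `base_radius`, hypothetical) encompasses `device`.

    All arguments are (x, y, z) positions in metres.
    """
    axis = [d - h for d, h in zip(device, head)]
    axis_len = math.sqrt(sum(a * a for a in axis))
    rel = [p - h for p, h in zip(point, head)]
    # Distance of the point along the cone's axis (0 at the apex).
    along = sum(a * r for a, r in zip(axis, rel)) / axis_len
    if along <= 0 or along > axis_len:
        return False                   # behind the head or past the base
    # Radial distance from the axis, compared with the cone's radius
    # at this depth (radius grows linearly from apex to base).
    radial_sq = sum(r * r for r in rel) - along * along
    allowed = base_radius * along / axis_len
    return radial_sq <= allowed * allowed

def in_interaction_zone(head, device, point, near=0.3, far=0.8, **kw):
    """Interaction zone: the slice of the cone between `near` and `far`
    metres from the head (both distances are illustrative choices)."""
    dist = math.dist(head, point)
    return near <= dist <= far and in_pointing_cone(head, device, point, **kw)
```

Under this model, a hand position would satisfy the claimed "gesture within the defined interaction zone" only when it falls both inside the cone and inside the distance band.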
Specification