Sessionless pointing user interface
First Claim
1. A method, comprising:
identifying, by a computer coupled to a three-dimensional (3D) sensing device and a display, different, respective locations of one or more controllable devices other than the display and the computer;
receiving, by the computer from the 3D sensing device, a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer;
detecting in the maps a gaze direction of the user that is directed toward a given device among the one or more controllable devices;
defining a region in space with an apex at the head of the user and a base encompassing the given device;
defining an interaction zone within the region;
analyzing the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the one or more controllable devices; and
actuating the given device responsively to the gesture.
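The claimed region is a cone with its apex at the user's head and its base encompassing the target device, with an interaction zone defined inside it. A minimal geometric sketch of that membership test follows; the function name, base radius, and zone bounds are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def hand_in_interaction_zone(head, device, hand,
                             base_radius=0.5,
                             zone_near=0.3, zone_far=0.8):
    """Test whether the hand lies inside a cone with its apex at the
    user's head and a base of radius base_radius centered on the device,
    restricted to an interaction zone spanning zone_near..zone_far of
    the head-to-device distance. Positions are 3D points (meters), as
    would be extracted from the sequence of depth maps."""
    head, device, hand = map(np.asarray, (head, device, hand))
    axis = device - head                       # cone axis, head -> device
    length = np.linalg.norm(axis)              # head-to-device distance
    axis_unit = axis / length

    rel = hand - head
    t = np.dot(rel, axis_unit)                 # hand's distance along the axis
    if not (zone_near * length <= t <= zone_far * length):
        return False                           # outside the interaction zone

    # The cone's radius grows linearly from 0 at the apex (head) to
    # base_radius at the device; the hand must lie within that radius.
    radius_at_t = base_radius * (t / length)
    radial = np.linalg.norm(rel - t * axis_unit)
    return radial <= radius_at_t
```

For example, with the head at the origin and the device 3 m away along the z-axis, a hand held near the axis at mid-distance falls inside the zone, while a hand raised 1 m off-axis does not.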
Abstract
A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
20 Claims
1. A method, comprising:
identifying, by a computer coupled to a three-dimensional (3D) sensing device and a display, different, respective locations of one or more controllable devices other than the display and the computer;
receiving, by the computer from the 3D sensing device, a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer;
detecting in the maps a gaze direction of the user that is directed toward a given device among the one or more controllable devices;
defining a region in space with an apex at the head of the user and a base encompassing the given device;
defining an interaction zone within the region;
analyzing the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the one or more controllable devices; and
actuating the given device responsively to the gesture.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. An apparatus, comprising:
a display;
a three-dimensional (3D) sensing device; and
a computer coupled to the display and the 3D sensing device and configured to identify different, respective locations of one or more controllable devices other than the display and the computer, to receive from the 3D sensing device a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer, to detect in the maps a gaze direction of the user that is directed toward a given device among the one or more controllable devices, to define a region in space with an apex at the head of the user and a base encompassing the given device, to define an interaction zone within the region, to analyze the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the one or more controllable devices, and to actuate the given device responsively to the gesture.
View Dependent Claims (10, 11, 12, 13, 14, 15, 16)
17. A computer software product comprising a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a computer coupled to a display, cause the computer to identify different, respective locations of one or more controllable devices other than the display and the computer, to receive from a 3D sensing device a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer, to detect in the maps a gaze direction of the user that is directed toward a given device among the one or more controllable devices, to define a region in space with an apex at the head of the user and a base encompassing the given device, to define an interaction zone within the region, to analyze the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the one or more controllable devices, and to actuate the given device responsively to the gesture.
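The gaze-detection step common to all three independent claims, finding which of the identified device locations the user's gaze is directed toward, can be sketched as an angular test between the estimated gaze vector and each head-to-device vector. The function name, angular threshold, and data layout here are assumptions for illustration:

```python
import math

def select_gazed_device(head, gaze_dir, device_locations, max_angle_deg=15.0):
    """Return the id of the controllable device whose identified location
    the gaze is directed toward, or None if no device is within the
    angular threshold.

    head: (x, y, z) head position from the 3D maps.
    gaze_dir: unit-length gaze direction vector estimated from head pose.
    device_locations: mapping of device id -> (x, y, z) location.
    """
    best_id, best_angle = None, max_angle_deg
    for dev_id, loc in device_locations.items():
        to_dev = tuple(l - h for l, h in zip(loc, head))
        norm = math.sqrt(sum(c * c for c in to_dev))
        if norm == 0:
            continue  # device coincides with the head; skip
        # Angle between the gaze ray and the head-to-device direction.
        cos_a = sum(g * d / norm for g, d in zip(gaze_dir, to_dev))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best_id, best_angle = dev_id, angle
    return best_id
```

With a lamp straight ahead and a television 45 degrees off to the side, a forward gaze selects the lamp; a gaze directed at neither device returns None, and no device is actuated.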
Specification