Sessionless pointing user interface
First Claim
1. A method, the method comprising:
- identifying, by a computer with a display, a controlled device that is controlled by the computer, wherein the controlled device is other than the computer display;
- receiving, by the computer, a sequence of three-dimensional maps including at least an arm, an elbow and a hand of a user of the computer;
- detecting, in the maps, a gaze direction of the user;
- analyzing the maps to detect a pointing gesture performed by the arm and the hand of the user toward the controlled device, wherein analyzing the maps comprises defining a pyramid-shaped region having an apex meeting the user and a base encompassing the controlled device, and defining an interaction region contained within the pyramid-shaped region; and
- actuating the controlled device responsively to the pointing gesture on condition that the elbow is extended in the pointing gesture toward the controlled device at an angle no less than a predefined angular threshold, and on condition that the hand is positioned within the interaction region and moved within the interaction region toward the controlled device and the gaze direction.
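The claim's "pyramid shaped region having an apex meeting the user and a base encompassing the device" can be approximated geometrically as a cone whose apex sits at the user and whose half-angle is wide enough to cover the device. A minimal sketch of such a membership test, assuming joint and device positions are already extracted from the three-dimensional maps (the function and parameter names here are illustrative, not from the patent):

```python
import math

def _sub(a, b):
    """Component-wise difference a - b of two 3-D points."""
    return tuple(x - y for x, y in zip(a, b))

def _dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def _norm(u):
    return math.sqrt(_dot(u, u))

def in_pointing_region(apex, device_center, device_radius, point):
    """Approximate the claimed pyramid-shaped region as a cone with its
    apex at the user and a base encompassing the device, then test
    whether a given hand position lies inside it."""
    axis = _sub(device_center, apex)
    axis_len = _norm(axis)
    # Half-angle chosen so the cone's base encompasses the device.
    half_angle = math.atan2(device_radius, axis_len)
    v = _sub(point, apex)
    v_len = _norm(v)
    # Reject the apex itself and points beyond the device.
    if v_len == 0 or v_len > axis_len:
        return False
    cos_angle = _dot(v, axis) / (v_len * axis_len)
    return math.acos(max(-1.0, min(1.0, cos_angle))) <= half_angle
```

An interaction region "contained within the pyramid shaped region", as the claim recites, could then be modeled as the same test with a smaller half-angle or a restricted depth range.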
Abstract
A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
17 Claims
1. A method, as recited in the First Claim above. - View Dependent Claims (2, 3, 4, 5, 6)
7. An apparatus, comprising:
- a three-dimensional sensing device; and
- a computer configured to identify a controlled device that is controlled by the computer, wherein the controlled device is other than a computer display, to receive from the three-dimensional sensing device a sequence of three-dimensional maps including at least an arm, an elbow and a hand of a user of the computer, to detect, in the maps, a gaze direction of the user, to analyze the maps to detect a pointing gesture performed by the arm and the hand of the user toward the controlled device and to define a pyramid-shaped region having an apex that meets the user and a base encompassing the controlled device and an interaction region contained within the pyramid-shaped region, and to actuate the controlled device responsively to the pointing gesture on condition that the elbow is extended in the pointing gesture toward the controlled device at an angle no less than a predefined angular threshold, and on condition that the hand is positioned within the interaction region and is moved within the interaction region toward the controlled device and the gaze direction. - View Dependent Claims (8, 9, 10, 11, 12)
13. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer, cause the computer to identify a controlled device that is controlled by the computer, wherein the controlled device is other than a computer display, to receive a sequence of three-dimensional maps including at least an arm, an elbow and a hand of a user of the computer, to detect, in the maps, a gaze direction of the user, to analyze the maps to detect a pointing gesture performed by the arm and the hand of the user toward the controlled device and to define a pyramid-shaped region having an apex that meets the user and a base encompassing the controlled device and an interaction region contained within the pyramid-shaped region, and to actuate the controlled device responsively to the pointing gesture on condition that the elbow is extended in the pointing gesture toward the controlled device at an angle no less than a predefined angular threshold, and on condition that the hand is positioned within the interaction region and moved within the interaction region toward the controlled device and the gaze direction.
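The actuation conditions that all three independent claims share — an elbow extended at no less than an angular threshold, a hand moving toward the controlled device, and a gaze directed at the device — can be sketched as a single decision function. This is a minimal illustration, assuming joint positions and a gaze vector already segmented from the three-dimensional maps; the function name, `head` parameter, and default thresholds are hypothetical, and the interaction-region membership check is omitted for brevity:

```python
import math

def _vec(a, b):
    """Vector from 3-D point a to 3-D point b."""
    return tuple(q - p for p, q in zip(a, b))

def _dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def _norm(u):
    return math.sqrt(_dot(u, u))

def _angle_deg(u, v):
    """Angle between two vectors in degrees."""
    c = _dot(u, v) / (_norm(u) * _norm(v))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def should_actuate(shoulder, elbow, hand, prev_hand, head,
                   device_center, gaze_dir,
                   elbow_threshold_deg=150.0, gaze_tolerance_deg=30.0):
    """Return True only when the elbow is extended at no less than the
    angular threshold, the hand has moved toward the device between
    frames, and the gaze direction points at the device."""
    # Elbow extension: angle at the elbow between upper arm and forearm
    # (approximately 180 degrees when the arm points straight at the device).
    elbow_angle = _angle_deg(_vec(elbow, shoulder), _vec(elbow, hand))
    if elbow_angle < elbow_threshold_deg:
        return False
    # Hand motion: the hand must be closer to the device than in the
    # previous frame of the map sequence.
    if _norm(_vec(hand, device_center)) >= _norm(_vec(prev_hand, device_center)):
        return False
    # Gaze: the gaze vector must lie within a tolerance of the
    # head-to-device direction.
    return _angle_deg(gaze_dir, _vec(head, device_center)) <= gaze_tolerance_deg
```

In a real pipeline each condition would be smoothed over several frames of the map sequence rather than evaluated on a single pair of frames.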