Gaze detection in a 3D mapping environment
Abstract
A method includes receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system and extracting, from the 3D maps, 3D coordinates of a head of the user. Based on the 3D coordinates of the head, a direction of a gaze performed by the user is identified, along with an interactive item presented in the direction of the gaze on a display coupled to the computerized system. An indication that the user is moving a limb of the body in a specific direction is extracted from the 3D maps, and the identified interactive item is repositioned on the display responsively to the indication.
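The core geometric step the abstract describes is finding which on-screen item lies in the direction of the gaze. A minimal sketch of that step is a ray-plane intersection: the head's 3D coordinates give the ray origin, the estimated gaze direction gives the ray, and the display is modeled as a plane. All function names, parameters, and the coordinate conventions below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def gaze_screen_intersection(head_pos, gaze_dir, screen_origin, screen_normal):
    """Intersect a gaze ray (head position + gaze direction) with the
    display plane. All quantities are 3D vectors in the sensor's frame.

    Returns the 3D intersection point, or None if the gaze is parallel
    to the screen or points away from it.
    """
    denom = np.dot(gaze_dir, screen_normal)
    if abs(denom) < 1e-9:
        return None  # gaze ray parallel to the screen plane
    t = np.dot(screen_origin - head_pos, screen_normal) / denom
    if t <= 0:
        return None  # screen plane is behind the user
    return head_pos + t * gaze_dir
```

In a full system the returned point would be converted to display pixel coordinates and hit-tested against the bounding boxes of the interactive items; that mapping depends on the screen's measured pose, which is assumed known here.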
14 Claims
1. An apparatus, comprising:
a sensing device configured to receive a sequence of three dimensional (3D) maps of at least a part of a body of a user, including a head of the user, and to receive a two dimensional (2D) image of the user, the 2D image including an eye of the user; and
a computer coupled to the sensing device and configured to extract, from the 3D maps, 3D coordinates of the head of the user and to identify, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user, to identify an interactive item presented in the direction of the gaze on a display coupled to the computer, to extract from the 3D maps an indication that the user is moving a limb of the body in a specific direction, and to move the identified interactive item on the display responsively to the indication.
- View Dependent Claims (2, 3, 4, 5, 12)
6. A method, comprising:
receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system;
extracting, from the 3D maps, 3D coordinates of a head of the user;
receiving a two dimensional (2D) image of the user, the image including an eye of the user;
identifying, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user;
identifying an interactive item presented in the direction of the gaze on a display coupled to the computerized system;
extracting from the 3D maps an indication that the user is moving a limb of the body in a specific direction; and
moving the identified interactive item on the display responsively to the indication.
- View Dependent Claims (7, 8, 9, 10, 11, 13)
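The last two steps of the method claim, extracting a limb-motion indication from the 3D map sequence and moving the selected item responsively, can be sketched as follows. This is a simplified illustration under stated assumptions: the limb has already been segmented from each 3D map into a single 3D position per frame, and the motion-to-display mapping is a hypothetical linear gain. None of these helper names or thresholds appear in the patent.

```python
import numpy as np

def limb_motion_direction(limb_positions, min_travel=0.05):
    """Estimate the dominant direction of limb motion from a sequence of
    3D limb positions (one per 3D map frame), in meters.

    Returns a unit direction vector, or None if the net travel is below
    `min_travel` (treated as noise rather than a deliberate gesture).
    """
    pts = np.asarray(limb_positions, dtype=float)
    disp = pts[-1] - pts[0]  # net displacement over the frame sequence
    travel = np.linalg.norm(disp)
    if travel < min_travel:
        return None
    return disp / travel

def move_item(item_xy, direction_3d, gain=200.0):
    """Map the horizontal and vertical components of the 3D motion
    direction onto a 2D display offset for the selected item."""
    dx, dy = direction_3d[0], direction_3d[1]
    # Screen y typically grows downward, so upward limb motion (+y in
    # the sensor frame) decreases the screen y coordinate.
    return (item_xy[0] + gain * dx, item_xy[1] - gain * dy)
```

A rightward hand sweep, for example, yields a direction of roughly `(1, 0, 0)` and shifts the gaze-selected item to the right on the display.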
14. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer, cause the computer to receive a sequence of three-dimensional (3D) maps of at least a part of a body of a user of the computer, including a head of the user, and to receive a two dimensional (2D) image of the user, the 2D image including an eye of the user, to extract, from the 3D maps, 3D coordinates of the head of the user, to identify, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user, to identify an interactive item presented in the direction of the gaze on a display coupled to the computer, to extract from the 3D maps an indication that the user is moving a limb of the body in a specific direction, and to move the identified interactive item on the display responsively to the indication.