PORTABLE EYE TRACKING DEVICE
Abstract
A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, a movement sensor, and a control unit. The frame may be a frame adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The movement sensor may be configured to detect movement of the frame. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the at least one image sensor, and receive information from the movement sensor.
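The abstract enumerates the device's parts and the control unit's duties. A minimal sketch of how those components might be grouped in code (all class and field names here are illustrative, not from the patent):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OpticsHoldingMember:
    """Holds an illuminator and an image sensor aimed at one eye (illustrative)."""
    illuminator_on: bool = False
    last_image: bytes = b""

@dataclass
class PortableEyeTracker:
    """Frame-mounted tracker: optics members, a movement sensor, a control unit."""
    optics_members: List[OpticsHoldingMember] = field(default_factory=list)
    frame_movement: tuple = (0.0, 0.0, 0.0)  # latest movement-sensor reading

    def control_step(self) -> None:
        # Control-unit duties per the abstract: drive selective illumination,
        # then collect image data from each member's sensor.
        for member in self.optics_members:
            member.illuminator_on = True      # selective illumination
            member.last_image = b"\x00"       # stand-in for a captured frame
            member.illuminator_on = False

tracker = PortableEyeTracker(optics_members=[OpticsHoldingMember()])
tracker.control_step()
```

The grouping of illuminator and sensor into one member mirrors the claim language; a real device would replace the stand-in bytes with sensor frames.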
20 Claims
1. A method for calibrating a gaze detection system, wherein the method comprises:
displaying a virtual object in a virtual reality environment on a display of a wearable device worn by a user;
determining that the user is interacting with the virtual object;
determining a gaze direction of the user relative to the virtual object with an eye tracking device of the wearable device;
calibrating the eye tracking device based at least in part on the gaze direction.
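Claim 1's four steps amount to an interaction-driven calibration loop: the displayed object's known position serves as ground truth for a gaze sample taken while the user interacts with it. A minimal sketch under assumed names and a simple additive-offset correction model, which the claim itself does not specify:

```python
# Sketch of claim 1: calibrate while the user interacts with a known object.
# Gaze directions are modeled as 2-D display-plane vectors (an assumption).

def direction_to(object_pos, eye_pos=(0.0, 0.0)):
    """True gaze direction implied by the object's position (assumed model)."""
    return (object_pos[0] - eye_pos[0], object_pos[1] - eye_pos[1])

def calibrate(measured_gaze, object_pos):
    """Return an additive offset mapping the measured gaze onto the true direction."""
    true = direction_to(object_pos)
    return (true[0] - measured_gaze[0], true[1] - measured_gaze[1])

object_pos = (1.0, 2.0)      # virtual object shown on the display
user_interacting = True      # e.g. the user has picked the object up
measured_gaze = (0.9, 2.2)   # raw sample from the eye tracking device

if user_interacting:
    offset = calibrate(measured_gaze, object_pos)

# Applying the offset maps the raw sample onto the object's direction.
corrected = (measured_gaze[0] + offset[0], measured_gaze[1] + offset[1])
```

A production calibration would fit a richer mapping over many samples; the single-sample offset here only illustrates the claimed sequence of steps.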
2. The method for calibrating a gaze detection system of claim 1, wherein:
the method further comprises receiving an input; and
calibrating the eye tracking device occurs in response to the input.
3. The method for calibrating a gaze detection system of claim 2, wherein the input comprises:
an input from a hand held controller.
4. The method for calibrating a gaze detection system of claim 2, wherein the input comprises:
an interaction by the user with the virtual object that moves the virtual object closer to a particular portion of a virtual body of the user in the virtual reality environment.
5. The method for calibrating a gaze detection system of claim 2, wherein the input comprises:
the gaze direction remaining consistent for a predetermined amount of time.
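Claim 5's "gaze direction remaining consistent for a predetermined amount of time" is a dwell test. A sketch with assumed thresholds (the angle tolerance and dwell duration are illustrative parameters, not values from the patent):

```python
import math

DWELL_SECONDS = 0.5       # assumed "predetermined amount of time"
ANGLE_TOLERANCE = 0.05    # radians; assumed threshold for "consistent"

def angle_between(a, b):
    """Angle between two 2-D gaze direction vectors."""
    dot = a[0] * b[0] + a[1] * b[1]
    na, nb = math.hypot(*a), math.hypot(*b)
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def dwell_satisfied(samples):
    """samples: list of (timestamp_seconds, gaze_vector) tuples.
    True if every gaze sample stays within tolerance of the first one
    and the samples span at least DWELL_SECONDS."""
    if not samples:
        return False
    t0, g0 = samples[0]
    for _, g in samples:
        if angle_between(g0, g) > ANGLE_TOLERANCE:
            return False
    return samples[-1][0] - t0 >= DWELL_SECONDS

steady = [(0.0, (1.0, 0.0)), (0.3, (1.0, 0.01)), (0.6, (1.0, 0.02))]
jittery = [(0.0, (1.0, 0.0)), (0.3, (0.0, 1.0))]
```

With these thresholds, `dwell_satisfied(steady)` holds while `dwell_satisfied(jittery)` does not, so the dwell result can serve as the claimed calibration-triggering input.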
6. The method for calibrating a gaze detection system of claim 1, wherein:
the virtual object comprises a salient feature; and
determining the gaze direction of the user relative to the virtual object comprises determining the gaze direction of the user relative to the salient feature.
7. The method for calibrating a gaze detection system of claim 1, wherein determining that the user is interacting with the virtual object comprises:
determining that the user has picked up the virtual object in the virtual reality environment.
8. The method for calibrating a gaze detection system of claim 1, wherein determining that the user is interacting with the virtual object comprises:
determining that the user has moved the virtual object in the virtual reality environment.
9. The method for calibrating a gaze detection system of claim 8, wherein the determining that the user has moved the virtual object comprises:
determining that the user has moved the virtual object closer to a particular portion of a virtual body of the user in the virtual reality environment.
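Claim 9's test, that the object has moved closer to a portion of the user's virtual body, reduces to comparing distances before and after the move. A sketch with illustrative 3-D positions (the body-part coordinates are assumptions, not from the patent):

```python
import math

def moved_closer(object_before, object_after, body_part_pos):
    """True if the user's move brought the virtual object nearer to the
    given portion of the user's virtual body (e.g. the virtual face)."""
    return (math.dist(object_after, body_part_pos)
            < math.dist(object_before, body_part_pos))

virtual_face = (0.0, 1.7, 0.0)   # assumed position of the body portion
before = (1.0, 1.0, 2.0)         # object position before the interaction
after = (0.2, 1.6, 0.4)          # object position after the user moved it

trigger_calibration = moved_closer(before, after, virtual_face)
```

Because the user naturally looks at an object being brought toward the body, this condition is a plausible moment to take the calibration sample of claims 1 and 2.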
10. A non-transitory machine-readable medium having instructions stored thereon for calibrating a gaze detection system, wherein the instructions are executable by at least one processor to at least:
display a virtual object in a virtual reality environment on a display of a wearable device worn by a user;
determine that the user is interacting with the virtual object;
determine a gaze direction of the user relative to the virtual object with an eye tracking device of the wearable device;
calibrate the eye tracking device based at least in part on the gaze direction.
11. The non-transitory machine-readable medium of claim 10, wherein the instructions are further executable to at least:
receive an input, wherein calibrating the eye tracking device occurs in response to the input.
12. The non-transitory machine-readable medium of claim 11, wherein the input comprises:
the gaze direction remaining consistent for a predetermined amount of time.
13. The non-transitory machine-readable medium of claim 10, wherein:
the virtual object comprises a salient feature; and
determining the gaze direction of the user relative to the virtual object comprises determining the gaze direction of the user relative to the salient feature.
14. The non-transitory machine-readable medium of claim 10, wherein determining that the user is interacting with the virtual object comprises:
determining that the user has picked up the virtual object in the virtual reality environment.
15. The non-transitory machine-readable medium of claim 10, wherein determining that the user is interacting with the virtual object comprises:
determining that the user has moved the virtual object in the virtual reality environment.
16. A system for determining a gaze direction of a user, wherein the system comprises:
a wearable device having a display and an eye tracking device;
one or more processors configured to at least:
display a virtual object in a virtual reality environment on the display;
determine that the user is interacting with the virtual object;
determine a gaze direction of the user relative to the virtual object with the eye tracking device;
calibrate the eye tracking device based at least in part on the gaze direction.
17. The system of claim 16, wherein the one or more processors are further configured to at least:
receive an input, wherein calibrating the eye tracking device occurs in response to the input.
18. The system of claim 16, wherein:
the virtual object comprises a salient feature; and
determining the gaze direction of the user relative to the virtual object comprises determining the gaze direction of the user relative to the salient feature.
19. The system of claim 16, wherein determining that the user is interacting with the virtual object comprises:
determining that the user has picked up the virtual object in the virtual reality environment.
20. The system of claim 16, wherein determining that the user is interacting with the virtual object comprises:
determining that the user has moved the virtual object in the virtual reality environment.
Specification