Portable eye tracking device
Abstract
A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, a movement sensor, and a control unit. The frame may be a frame adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The movement sensor may be configured to detect movement of the frame. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the image sensors, and receive information from the movement sensor.
20 Claims
1. A method for calibrating a gaze detection system, wherein the method comprises:
displaying a plurality of multipurpose virtual objects as part of an interactive virtual experience in a virtual reality environment on a display of a wearable device worn by a user;
selecting a multipurpose virtual object of the plurality of multipurpose virtual objects based at least in part on a determination that the user is anticipated to interact with the multipurpose virtual object during the interactive virtual experience;
detecting a user interaction of the user with the multipurpose virtual object during the interactive virtual experience;
determining that the user interaction of the user with the multipurpose virtual object causes a change to the multipurpose virtual object; and
responsive to determining that the user interaction of the user with the multipurpose virtual object causes the change to the multipurpose virtual object, initiating, during the interactive virtual experience, a calibration process of the gaze detection system, the calibration process comprising:
determining a gaze direction of the user relative to the multipurpose virtual object with the gaze detection system of the wearable device; and
calibrating the gaze detection system based at least in part on the gaze direction.
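The flow claim 1 recites can be sketched as three small steps: pick the object the user is anticipated to interact with, and once an interaction changes that object, treat the measured gaze toward it as a calibration sample. The function names and the per-axis offset model below are assumptions for illustration only.

```python
def select_calibration_object(objects, interaction_likelihood):
    """Pick the virtual object the user is most likely to interact with.
    `interaction_likelihood` maps an object to a score (an assumption)."""
    return max(objects, key=interaction_likelihood)

def calibration_offset(measured_gaze, object_direction):
    """Per-axis correction: where the tracker thinks the user is looking
    minus where the interacted-with object actually is."""
    return tuple(m - o for m, o in zip(measured_gaze, object_direction))

def apply_calibration(measured_gaze, offset):
    """Correct a raw gaze reading with a previously computed offset."""
    return tuple(m - d for m, d in zip(measured_gaze, offset))
```

A simple per-axis offset is only one possible calibration model; real trackers typically fit a richer mapping from many samples.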
2. The method for calibrating a gaze detection system of claim 1, wherein:
the method further comprises receiving an input; and
calibrating the gaze detection system occurs in response to the input.
3. The method for calibrating a gaze detection system of claim 2, wherein the input comprises:
an input from a hand held controller.
4. The method for calibrating a gaze detection system of claim 2, wherein the input comprises:
an interaction by the user with the multipurpose virtual object that moves the multipurpose virtual object closer to a particular portion of a virtual body of the user in the virtual reality environment.
5. The method for calibrating a gaze detection system of claim 2, wherein the input comprises:
the gaze direction remaining consistent for a predetermined amount of time.
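One plausible reading of claim 5's "gaze direction remaining consistent for a predetermined amount of time" is a dwell detector that fires once the gaze stays within an angular tolerance for long enough. The thresholds and the normalized 2-D direction format below are illustrative assumptions.

```python
import math

def detect_dwell(samples, tolerance=0.05, min_duration=0.5):
    """samples: list of (timestamp_seconds, (x, y)) gaze directions.
    Returns True if some run of samples stays within `tolerance` of the
    run's first sample for at least `min_duration` seconds."""
    start = 0
    for i in range(len(samples)):
        t, g = samples[i]
        if math.dist(g, samples[start][1]) > tolerance:
            start = i  # gaze moved: restart the dwell window here
        elif t - samples[start][0] >= min_duration:
            return True
    return False
```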
6. The method for calibrating a gaze detection system of claim 1, wherein:
the multipurpose virtual object comprises a salient feature; and
determining the gaze direction of the user relative to the multipurpose virtual object comprises determining the gaze direction of the user relative to the salient feature.
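Calibrating against a salient feature of the object, as in claim 6, comes down to geometry: the angular error between the measured gaze ray and the ray from the eye to the feature is what calibration would drive toward zero. A minimal sketch, with all names assumed:

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def angular_error(gaze_dir, eye_pos, feature_pos):
    """Angle (radians) between the measured gaze direction and the
    direction from the eye to the object's salient feature."""
    target = normalize(tuple(f - e for f, e in zip(feature_pos, eye_pos)))
    g = normalize(gaze_dir)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(g, target))))
    return math.acos(dot)
```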
7. The method for calibrating a gaze detection system of claim 1, wherein determining that the user interaction of the user with the multipurpose virtual object causes the change to the multipurpose virtual object comprises:
determining that the user has picked up the multipurpose virtual object in the virtual reality environment.
8. The method for calibrating a gaze detection system of claim 1, wherein determining that the user interaction of the user with the multipurpose virtual object causes the change to the multipurpose virtual object comprises:
determining that the user has moved the multipurpose virtual object in the virtual reality environment.
9. The method for calibrating a gaze detection system of claim 8, wherein the determining that the user has moved the multipurpose virtual object comprises:
determining that the user has moved the multipurpose virtual object closer to a particular portion of a virtual body of the user in the virtual reality environment.
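Claims 4 and 9 trigger on the object being moved closer to a portion of the user's virtual body (e.g. bringing a virtual cup toward the face). A minimal distance-based check under that reading; the function name and `min_delta` threshold are assumptions:

```python
import math

def moved_closer(old_pos, new_pos, body_part_pos, min_delta=0.0):
    """True if the object's new position is nearer to the body part than
    its old position, by more than `min_delta`. Positions are 3-tuples."""
    before = math.dist(old_pos, body_part_pos)
    after = math.dist(new_pos, body_part_pos)
    return before - after > min_delta
```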
10. The method for calibrating a gaze detection system of claim 1, wherein the multipurpose virtual object is used in the virtual reality environment for one or more purposes in addition to the calibration process.
11. The method for calibrating a gaze detection system of claim 1, wherein the calibration process is a recalibration process.
12. The method for calibrating a gaze detection system of claim 1, wherein a first purpose of the multipurpose virtual object is for the calibration process of the gaze detection system and a second purpose of the multipurpose virtual object is for the interactive virtual experience other than the calibration process.
13. A non-transitory machine-readable medium having instructions stored thereon for calibrating a gaze detection system, wherein the instructions are executable by at least one processor to at least:
display a plurality of multipurpose virtual objects as part of an interactive virtual experience in a virtual reality environment on a display of a wearable device worn by a user;
select a multipurpose virtual object of the plurality of multipurpose virtual objects based at least in part on a determination that the user is anticipated to interact with the multipurpose virtual object during the interactive virtual experience;
detect a user interaction of the user with the multipurpose virtual object during the interactive virtual experience;
determine that the user interaction of the user with the multipurpose virtual object causes a change to the multipurpose virtual object; and
responsive to determining that the user interaction of the user with the multipurpose virtual object causes the change to the multipurpose virtual object, initiate, during the interactive virtual experience, a calibration process of the gaze detection system, the calibration process comprising:
determining a gaze direction of the user relative to the multipurpose virtual object with the gaze detection system of the wearable device; and
calibrating the gaze detection system based at least in part on the gaze direction.
14. The non-transitory machine-readable medium of claim 13, wherein the instructions are further executable to at least:
receive an input, wherein calibrating the gaze detection system occurs in response to the input.
15. The non-transitory machine-readable medium of claim 14, wherein the input comprises:
the gaze direction remaining consistent for a predetermined amount of time.
16. The non-transitory machine-readable medium of claim 13, wherein:
the multipurpose virtual object comprises a salient feature; and
determining the gaze direction of the user relative to the multipurpose virtual object comprises determining the gaze direction of the user relative to the salient feature.
17. The non-transitory machine-readable medium of claim 13, wherein determining that the user interaction of the user with the multipurpose virtual object causes the change to the multipurpose virtual object comprises:
determining that the user has picked up the multipurpose virtual object in the virtual reality environment.
18. A system for determining a gaze direction of a user, wherein the system comprises:
a wearable device having a display and a gaze detection system; and
one or more processors configured to at least:
display a plurality of multipurpose virtual objects as part of an interactive virtual experience in a virtual reality environment on the display;
select a multipurpose virtual object of the plurality of multipurpose virtual objects based at least in part on a determination that the user is anticipated to interact with the multipurpose virtual object during the interactive virtual experience;
detect a user interaction of the user with the multipurpose virtual object during the interactive virtual experience;
determine that the user interaction of the user with the multipurpose virtual object causes a change to the multipurpose virtual object;
in response to determining that the user interaction of the user with the multipurpose virtual object causes the change to the multipurpose virtual object, initiate, during the interactive virtual experience, a calibration process of the gaze detection system, the calibration process comprising:
determining a gaze direction of the user relative to the multipurpose virtual object with the gaze detection system; and
calibrating the gaze detection system based at least in part on the gaze direction.
19. The system of claim 18, wherein:
the multipurpose virtual object comprises a salient feature; and
determining the gaze direction of the user relative to the multipurpose virtual object comprises determining the gaze direction of the user relative to the salient feature.
20. The system of claim 18, wherein determining that the user interaction of the user with the multipurpose virtual object causes the change to the multipurpose virtual object comprises:
determining that the user has picked up the multipurpose virtual object in the virtual reality environment.
Specification