Portable eye tracking device
Abstract
A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, and a control unit. The frame may be adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the at least one image sensor, and calibrate at least one illuminator, at least one image sensor, or an algorithm of the control unit.
21 Claims
1. A wearable device comprising:
at least one illuminator configured to illuminate at least a portion of at least one eye of a user when activated; and
at least one image sensor configured to provide first image data representative of images of the at least one eye of the user; and
one or more processors configured to:
activate, selectively, the at least one illuminator;
receive the first image data from the at least one image sensor;
calibrate, based at least in part on the first image data, an algorithm executable by the one or more processors to determine a gaze direction of the user;
determine whether the algorithm has reduced in accuracy;
receive second image data from at least two image sensors in response to determining the algorithm has reduced in accuracy; and
determine a gaze direction for the user, based at least in part on the algorithm and at least one of the first image data or the second image data.
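As an informal illustration only, the processor behavior recited in claim 1 can be sketched in Python. Every name below, the single-gain "algorithm", and the 2-degree threshold are invented for illustration and are not taken from the patent or its specification.

```python
# Hedged sketch of the claim-1 control flow: calibrate a gaze algorithm from
# first image data, test whether it has reduced in accuracy, and fall back to
# second image data from at least two sensors. The "algorithm" here is a toy:
# a single gain applied to a measured pupil offset.

ACCURACY_LIMIT_DEG = 2.0  # assumed tolerable mean gaze error, in degrees

def estimate_gaze(pupil_offset_deg: float, gain: float) -> float:
    """Toy gaze algorithm: gaze angle = gain * measured pupil offset."""
    return gain * pupil_offset_deg

def calibrate(samples: list[tuple[float, float]]) -> float:
    """Fit the gain from (pupil_offset, known_gaze) calibration pairs using
    least squares through the origin (cf. claim 7: calibration changes a
    parameter value used in the algorithm)."""
    den = sum(x * x for x, _ in samples)
    return sum(x * y for x, y in samples) / den if den else 1.0

def accuracy_reduced(samples: list[tuple[float, float]], gain: float) -> bool:
    """Determine whether the calibrated algorithm has reduced in accuracy,
    by comparing mean estimation error against the assumed threshold."""
    err = sum(abs(estimate_gaze(x, gain) - y) for x, y in samples) / len(samples)
    return err > ACCURACY_LIMIT_DEG

def determine_gaze(first: list[float], second: list[float], gain: float) -> float:
    """Use the two-sensor (second) data when it was collected, else the
    first image data, matching claim 1's 'at least one of the first image
    data or the second image data'."""
    offsets = second or first
    return sum(estimate_gaze(x, gain) for x in offsets) / len(offsets)
```

The fallback in `determine_gaze` mirrors the claim's conditional: second image data exists only when reduced accuracy was detected, so its presence selects the two-sensor path.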
2. The wearable device of claim 1, further comprising:
a camera facing away from the user to capture scene image data representing images of at least a portion of a user's field of view.
3. The wearable device of claim 2, wherein the one or more processors are further configured to:
determine a gaze target area based at least in part on the scene image data.
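Claims 2 and 3 pair the eye tracker with an outward-facing scene camera and determine a gaze target area within its images. One common way to do this (a sketch of the general technique, not necessarily the patented method) is a pinhole projection of the gaze angles into scene-camera pixel coordinates; the focal length and resolution below are assumed values:

```python
# Hypothetical mapping from a gaze direction to a pixel in the scene image,
# assuming the gaze angles are expressed relative to the scene-camera axis.
import math

FOCAL_PX = 500.0          # assumed scene-camera focal length, in pixels
WIDTH, HEIGHT = 640, 480  # assumed scene-camera resolution

def gaze_to_pixel(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Project gaze yaw/pitch (degrees) onto the scene image plane via a
    pinhole model: offset from the image center grows with tan(angle)."""
    x = WIDTH / 2 + FOCAL_PX * math.tan(math.radians(yaw_deg))
    y = HEIGHT / 2 - FOCAL_PX * math.tan(math.radians(pitch_deg))
    return round(x), round(y)
```

A straight-ahead gaze lands at the image center; a gaze target area could then be taken as a neighborhood of the returned pixel.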
4. The wearable device of claim 2, wherein the one or more processors are further configured to:
determine, automatically, a pattern or item within the scene image data; and
calibrate, based at least in part on the pattern or item within the scene image data, the algorithm.
5. The wearable device of claim 1, further comprising a positioning device; and
wherein the one or more processors being configured to determine whether the algorithm has reduced in accuracy is based on data from the positioning device.
6. The wearable device of claim 1, wherein the one or more processors are further configured to calibrate the at least one illuminator or the at least one image sensor.
7. The wearable device of claim 1, wherein calibrating the algorithm comprises changing a value of a parameter used in the algorithm.
8. A method comprising:
illuminating, with at least one illuminator, at least a portion of at least one eye of a user;
collecting, with at least one image sensor, first image data representative of an image of the at least one eye of the user while the at least one eye of the user is illuminated;
calibrating, based at least in part on the first image data, at least one illuminator, at least one image sensor, or an algorithm executable to determine a gaze direction of the user;
determining, using a processor, that the algorithm has reduced in accuracy;
collecting, with at least two image sensors, second image data representative of a set of images of the at least one eye of the user while the at least one eye of the user is illuminated in response to determining the algorithm has reduced in accuracy; and
determining a gaze direction of the user, based on at least the first image data or the second image data.
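The ordered steps of the claim-8 method can be walked through with hypothetical stand-in objects. The `Illuminator`, `Sensor`, and `Algorithm` classes below are invented for illustration (the accuracy determination is stubbed out as a boolean argument) and carry no detail from the specification:

```python
# Minimal walk-through of the claim-8 method steps, in order, with toy stand-ins.

class Illuminator:
    def __init__(self): self.active = False
    def on(self): self.active = True           # illuminating the eye

class Sensor:
    def __init__(self, reading): self.reading = reading
    def capture(self): return self.reading     # one image-data sample

class Algorithm:
    def __init__(self): self.gain = 1.0
    def calibrate(self, sample, truth):
        # cf. claim 18: calibration alters a parameter value of the algorithm
        self.gain = truth / sample if sample else 1.0
    def gaze(self, sample): return self.gain * sample

def track(illuminator, sensors, algo, truth, accuracy_ok):
    illuminator.on()                           # step 1: illuminate
    first = sensors[0].capture()               # step 2: first image data
    algo.calibrate(first, truth)               # step 3: calibrate
    if not accuracy_ok:                        # step 4: accuracy reduced?
        second = [s.capture() for s in sensors[:2]]   # step 5: two sensors
        sample = sum(second) / len(second)
    else:
        sample = first
    return algo.gaze(sample)                   # step 6: gaze from first/second data
```

When accuracy is acceptable, the gaze comes from the single-sensor first image data; otherwise the second, two-sensor data is collected and used.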
9. The method of claim 8, further comprising:
collecting surrounding image data representative of a field of view of a user.
10. The method of claim 9, further comprising:
determining, based at least in part on the surrounding image data and the gaze direction, a gaze target.
11. The method of claim 9, further comprising:
determining whether the surrounding image data contains a pattern or item.
12. The method of claim 11, further comprising:
calibrating the algorithm based at least in part on the determination that the surrounding image data contains a pattern or item.
13. The method of claim 9, further comprising:
calibrating the algorithm based at least in part on the surrounding image data.
14. The method of claim 8, further comprising:
collecting position data or motion data from a sensor on a wearable device; and
wherein determining that the algorithm has reduced in accuracy is based on at least the position data or the motion data.
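Claims 5, 14, and 19 tie the reduced-accuracy determination to positioning or motion data, for instance detecting that the worn frame has moved since calibration. A hedged sketch of that idea, with an assumed threshold and units not drawn from the patent:

```python
# Hypothetical slip detector: flag possible decalibration when cumulative
# frame movement since the last calibration exceeds a tolerance.

SLIP_THRESHOLD_MM = 3.0  # assumed tolerance for frame movement, millimeters

def accuracy_reduced_by_motion(displacements_mm: list[float]) -> bool:
    """True when total measured displacement since calibration suggests the
    gaze algorithm may have reduced in accuracy."""
    return sum(abs(d) for d in displacements_mm) > SLIP_THRESHOLD_MM
```

A positive result here could then trigger the two-sensor second image data collection recited in the independent claims.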
15. A non-transitory computer-readable medium having instructions stored thereon executable by a computing device to cause the computing device to perform operations comprising:
illuminate, with at least one illuminator, at least one portion of one eye of a user;
collect, with at least one image sensor, first image data corresponding to an image of the at least one portion of one eye of the user while illuminated;
calibrate, based at least in part on the image data, at least one illuminator, at least one image sensor, or an algorithm executable by the computing device to determine a gaze direction of the user;
determine whether the algorithm has reduced in accuracy;
collect, with at least two image sensors, second image data corresponding to a set of images of the at least one portion of one eye of the user while illuminated; and
determine a gaze direction for the user based at least in part on at least the first image data or the second image data.
16. The non-transitory computer-readable medium of claim 15, wherein the instructions are further executable to:
collect, by a scene camera, scene image data corresponding to a user's field of view.
17. The non-transitory computer-readable medium of claim 16, wherein calibrating the algorithm is further based on the scene image data.
18. The non-transitory computer-readable medium of claim 15, wherein calibrating the algorithm comprises altering a parameter value of the algorithm.
19. The non-transitory computer-readable medium of claim 15, wherein the instructions are further executable to:
collect position data or motion data from a sensor; and
wherein determining whether the algorithm has reduced in accuracy is based at least in part on the position data or the motion data.
20. The non-transitory computer-readable medium of claim 16, wherein the instructions are further executable to:
determine, automatically, a pattern or item within the scene image data; and
calibrate, based at least in part on the pattern or item within the scene image data, the algorithm.
21. The wearable device of claim 5, wherein the data from the positioning device comprises motion data relating to a movement of the wearable device.
Specification