PORTABLE EYE TRACKING DEVICE
Abstract
A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, a movement sensor, and a control unit. The frame may be a frame adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The movement sensor may be configured to detect movement of the frame. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the image sensors, and receive information from the movement sensor.
20 Claims
1. A portable eye tracker device comprising:
a frame adapted for wearing by a user;
at least one optics holding member, wherein the optics holding member comprises:
at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user; and
at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user;
a movement sensor configured to detect movement of the frame; and
a control unit configured to:
control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user;
receive the image data from the image sensors; and
receive information from the movement sensor.
2. The portable eye tracker device of claim 1, wherein the control unit is further configured to:
determine a gaze target area for the user based at least in part on the image data.
3. The portable eye tracker device of claim 1, wherein the movement sensor comprises:
a gyroscope.
4. The portable eye tracker device of claim 1, wherein the control unit is further configured to:
determine a gaze target area for the user based at least in part on the image data and information received from the movement sensor.
5. The portable eye tracker device of claim 1, further comprising:
a scene camera facing away from the user and configured to capture image data representing images of at least a portion of a view of the user.
6. The portable eye tracker device of claim 5, wherein the control unit is further configured to:
determine a gaze target area within the image data representing images of at least a portion of the user's field of view, wherein the gaze target area comprises less than ten percent of the image data in an image from the scene camera; and
control the scene camera to adjust at least one of focus or light sensitivity based on the gaze target area.
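For illustration only (this is not language from the claims), the sub-ten-percent gaze target area of claim 6 can be sketched as a crop window centred on an estimated gaze point, which a camera controller could then use to restrict focus or metering. Every name and constant below is a hypothetical assumption.

```python
import math

def gaze_target_area(gaze_x, gaze_y, frame_w, frame_h, max_fraction=0.10):
    """Return a (left, top, width, height) window around the gaze point that
    covers less than max_fraction of the scene-camera frame."""
    # Pick side lengths whose product stays strictly below the area threshold,
    # preserving the frame's aspect ratio.
    scale = math.sqrt(max_fraction * 0.99)
    w = int(frame_w * scale)
    h = int(frame_h * scale)
    # Clamp so the window never leaves the frame.
    left = min(max(gaze_x - w // 2, 0), frame_w - w)
    top = min(max(gaze_y - h // 2, 0), frame_h - h)
    return left, top, w, h

# A 1080p scene camera with the gaze at frame centre:
box = gaze_target_area(960, 540, 1920, 1080)
```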
7. The portable eye tracker device of claim 6, wherein:
all illuminators emit the same wavelength of light.
8. The portable eye tracker device of claim 1, wherein the control unit is further configured to:
control a first subset of the at least one illuminator to selectively illuminate at least a portion of at least one eye of the user;
receive a first set of image data from at least one image sensor of the first subset, wherein the first set of image data represents images of at least a portion of at least one eye of the user captured when at least one eye of the user is exposed to light from the first subset of the at least one illuminator;
determine that a gaze target area cannot be determined based on the first set of image data;
control a second subset of the at least one illuminator to selectively illuminate at least a portion of at least one eye of the user;
receive a second set of image data from at least one image sensor of the second subset, wherein the second set of image data represents images of at least a portion of at least one eye of the user captured when at least one eye of the user is exposed to light from the second subset of the at least one illuminator; and
determine the gaze target area for the user based at least in part on the second set of image data.
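Claim 8's fallback between illuminator subsets amounts to a try-in-order loop: illuminate with one subset, attempt gaze estimation, and move to the next subset on failure. The sketch below is a minimal illustration with hypothetical capture and estimation callbacks, not the patented implementation.

```python
def determine_gaze(illuminator_subsets, capture, estimate_gaze):
    """Try each illuminator subset in preference order until gaze is found.

    capture(subset): image data taken while `subset` is lit (hypothetical).
    estimate_gaze(image): a gaze target area, or None when the image is
    unusable, e.g. because of glare from that subset (hypothetical).
    """
    for subset in illuminator_subsets:
        image = capture(subset)
        gaze = estimate_gaze(image)
        if gaze is not None:
            return gaze
    return None  # no subset yielded a usable image

# Simulated hardware: the first subset causes glare, the second works.
frames = {"A": "glare", "B": "clean"}
result = determine_gaze(
    ["A", "B"],
    capture=lambda s: frames[s],
    estimate_gaze=lambda img: (0.4, 0.6) if img == "clean" else None,
)
```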
9. The portable eye tracker device of claim 1, wherein:
the portable eye tracker device further comprises a positioning device configured to determine a position of the portable eye tracker device; and
the control unit is further configured to determine a gaze target area for the user based at least in part on the image data and information received from the positioning device.
10. The portable eye tracker device of claim 1, wherein the control unit is further configured to:
receive image data from at least one image sensor;
determine a level of ambient light from the image data; and
adjust a brightness or exposure time of at least one illuminator based at least in part on the level of ambient light.
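Claim 10's ambient-light adjustment can be pictured as closed-loop exposure scaling: estimate ambient light from the mean pixel level and scale the illuminator's exposure time toward a target brightness. The constants and helper name below are assumptions for illustration, not values from the patent.

```python
def adjust_illumination(pixels, target_brightness=128, base_exposure_ms=4.0):
    """Scale illuminator exposure time from the mean pixel level of a frame.

    pixels: 8-bit grayscale samples from an eye-facing image sensor.
    Returns a new exposure time in milliseconds, clamped to sensor limits.
    """
    ambient = sum(pixels) / len(pixels)  # crude ambient-light estimate
    # Brighter ambient light -> shorter exposure; darker -> longer.
    exposure = base_exposure_ms * target_brightness / max(ambient, 1)
    return min(max(exposure, 0.5), 16.0)

bright = adjust_illumination([250] * 100)  # strong ambient light
dark = adjust_illumination([20] * 100)     # dim ambient light
```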
11. The portable eye tracker device of claim 1, further comprising:
a display viewable by the user.
12. The portable eye tracker device of claim 1, wherein the control unit is further configured to:
receive image data from at least one image sensor representing images including areas besides the user's eyes.
13. A method of determining a gaze direction for a user, comprising:
activating at least one illuminator on a frame worn by a user to selectively illuminate at least a portion of at least one of the user's eyes;
receiving image data representing images of at least a portion of at least one of the user's eyes from at least one image sensor on the frame;
receiving information from a movement sensor configured to detect movement of the frame; and
determining a gaze target area for the user based at least in part on the image data and information from the movement sensor.
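One plausible reading of claim 13's combination of image data with movement-sensor data is compensating the gaze estimate for frame rotation reported by a gyroscope between image capture and readout. The sketch below assumes hypothetical angle conventions and is not the patent's disclosed method.

```python
def fuse_gaze(image_gaze_deg, gyro_rate_dps, dt_s):
    """Compensate an image-based gaze estimate for movement of the frame.

    image_gaze_deg: (yaw, pitch) gaze angles from the eye images, degrees.
    gyro_rate_dps: (yaw, pitch) angular velocity of the frame, deg/s.
    dt_s: seconds elapsed since the image was captured.
    Subtracting the frame's rotation over dt_s keeps the gaze target stable
    while the head moves between capture and readout.
    """
    yaw, pitch = image_gaze_deg
    rate_yaw, rate_pitch = gyro_rate_dps
    return yaw - rate_yaw * dt_s, pitch - rate_pitch * dt_s

# Head turning right at 20 deg/s with 100 ms latency shifts gaze back 2 degrees.
corrected = fuse_gaze((10.0, -5.0), (20.0, 0.0), 0.1)
```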
14. The method of claim 13, wherein the movement sensor comprises:
a gyroscope.
15. The method of claim 13, further comprising:
receiving images captured by a scene camera representing images of at least a portion of a view of the user.
16. The method of claim 15, further comprising:
determining a gaze target area within the image data representing images of at least a portion of the user's field of view, wherein the gaze target area comprises less than ten percent of the image data in an image from the scene camera; and
controlling the scene camera to adjust at least one of focus or light sensitivity based on the gaze target area.
17. A non-transitory machine-readable medium having instructions thereon for determining a gaze direction for a user, the instructions executable by a processor for at least:
activating at least one illuminator on a frame worn by a user to selectively illuminate at least a portion of at least one of the user's eyes;
receiving image data representing images of at least a portion of at least one of the user's eyes from at least one image sensor on the frame;
receiving information from a movement sensor configured to detect movement of the frame; and
determining a gaze target area for the user based at least in part on the image data and information from the movement sensor.
18. The non-transitory machine-readable medium of claim 17, wherein the movement sensor comprises:
a gyroscope.
19. The non-transitory machine-readable medium of claim 17, wherein the instructions are further executable for at least:
receiving images captured by a scene camera representing images of at least a portion of a view of the user.
20. The non-transitory machine-readable medium of claim 19, wherein the instructions are further executable for at least:
determining a gaze target area within the image data representing images of at least a portion of the user's field of view, wherein the gaze target area comprises less than ten percent of the image data in an image from the scene camera; and
controlling the scene camera to adjust at least one of focus or light sensitivity based on the gaze target area.
Specification