Portable eye tracking device
First Claim
1. A portable eye tracker device comprising:
a frame adapted for wearing by a user;
at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user;
at least one image sensor configured to capture first image data representing images of at least a portion of at least one eye of the user;
a scene camera facing away from the user and configured to capture second image data representing images of at least a portion of a view of the user; and
a control unit configured to:
control the at least one illuminator for selective illumination of at least a portion of the at least one eye of the user;
receive the first image data from the at least one image sensor;
determine a gaze target area for the user based at least in part on the first image data;
determine a distance between an object located within the gaze target area and the portable eye tracker device based at least in part on the second image data;
cause a video stream from the scene camera to be provided, the video stream comprising:
a first video feed of an entire field of view of the scene camera, the first video feed being processed at a first quality; and
a second video feed of a sub-portion of the entire field of view of the scene camera, the second video feed being processed at a second quality that is greater than the first quality, and the sub-portion including the object within the gaze target area; and
control the scene camera to adjust light sensitivity of the second video feed based at least in part on the distance between the object within the gaze target area and the portable eye tracker device.
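The dual-feed arrangement in claim 1 can be illustrated with a minimal sketch. This is not the patented implementation, only an assumed reading: the entire field of view is processed at a lower first quality (simulated here by naive downsampling), while the sub-portion containing the gazed-at object is kept at full resolution. The function name `make_feeds` and the plain list-of-rows frame representation are hypothetical.

```python
def make_feeds(frame, roi, downscale=4):
    """Illustrative sketch: derive a low-quality feed of the entire field
    of view and a high-quality feed of the gaze sub-portion from one
    scene-camera frame.

    frame: list of rows (each a list of pixel values);
    roi: (top, left, height, width) of the sub-portion containing the
    object within the gaze target area.
    """
    # First feed: whole frame, crudely downsampled (stand-in for "first quality").
    low = [row[::downscale] for row in frame[::downscale]]
    # Second feed: sub-portion only, kept at full resolution ("second quality").
    top, left, height, width = roi
    high = [row[left:left + width] for row in frame[top:top + height]]
    return low, high

frame = [[0] * 640 for _ in range(480)]
low, high = make_feeds(frame, roi=(100, 200, 64, 64))
print(len(low), len(low[0]), len(high), len(high[0]))  # 120 160 64 64
```

In practice the two feeds would come from an encoder configured with different bitrates or resolutions, but the shape of the output is the same: one coarse full-field stream plus one detailed region-of-interest stream.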
Abstract
A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, a movement sensor, and a control unit. The frame may be a frame adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The movement sensor may be configured to detect movement of the frame. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the image sensors, and receive information from the movement sensor.
20 Claims
1. A portable eye tracker device comprising:
a frame adapted for wearing by a user;
at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user;
at least one image sensor configured to capture first image data representing images of at least a portion of at least one eye of the user;
a scene camera facing away from the user and configured to capture second image data representing images of at least a portion of a view of the user; and
a control unit configured to:
control the at least one illuminator for selective illumination of at least a portion of the at least one eye of the user;
receive the first image data from the at least one image sensor;
determine a gaze target area for the user based at least in part on the first image data;
determine a distance between an object located within the gaze target area and the portable eye tracker device based at least in part on the second image data;
cause a video stream from the scene camera to be provided, the video stream comprising:
a first video feed of an entire field of view of the scene camera, the first video feed being processed at a first quality; and
a second video feed of a sub-portion of the entire field of view of the scene camera, the second video feed being processed at a second quality that is greater than the first quality, and the sub-portion including the object within the gaze target area; and
control the scene camera to adjust light sensitivity of the second video feed based at least in part on the distance between the object within the gaze target area and the portable eye tracker device.
2. The portable eye tracker device of claim 1, wherein the portable eye tracker device further comprises:
a movement sensor configured to detect movement of the frame.
3. The portable eye tracker device of claim 2, wherein the control unit is further configured to:
determine the gaze target area for the user further based at least in part on information received from the movement sensor.
4. The portable eye tracker device of claim 1, wherein:
all illuminators emit the same wavelength of light.
5. The portable eye tracker device of claim 1, wherein the control unit is further configured to:
control a first subset of the at least one illuminator to selectively illuminate at least a portion of at least one eye of the user;
receive a first set of image data from the at least one image sensor, wherein the first set of image data represents images of at least the portion of the at least one eye of the user captured when the at least one eye of the user is exposed to light from the first subset of the at least one illuminator;
determine that the gaze target area cannot be determined based on the first set of image data;
control a second subset of the at least one illuminator to selectively illuminate at least a different portion of the at least one eye of the user;
receive a second set of image data from the at least one image sensor, wherein the second set of image data represents images of at least the different portion of the at least one eye of the user captured when the at least one eye of the user is exposed to light from the second subset of the at least one illuminator; and
wherein determining the gaze target area for the user based at least in part on the first image data comprises determining the gaze target area for the user based at least in part on the second set of image data.
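The fallback behaviour in claim 5 (try one illuminator subset, and if the resulting image data cannot yield a gaze target area, switch to another subset) can be sketched as a simple loop. This is an assumed illustration, not the claimed control unit: `capture` is a hypothetical callback that illuminates with a given subset, images the eye, and returns a gaze estimate or `None` when the data is unusable (for example, glints washed out by eyewear glare).

```python
def estimate_gaze(capture, illuminator_subsets):
    """Sketch of the claimed fallback: try each illuminator subset in
    turn until the captured image data yields a gaze target area."""
    for subset in illuminator_subsets:
        gaze = capture(subset)   # illuminate with this subset, image the eye
        if gaze is not None:
            return gaze          # gaze target area determined
    return None                  # no subset produced usable image data

# Toy usage: the first subset fails, the second succeeds.
responses = {("L1", "L2"): None, ("L3", "L4"): (0.4, 0.6)}
gaze = estimate_gaze(lambda s: responses[s], [("L1", "L2"), ("L3", "L4")])
print(gaze)  # (0.4, 0.6)
```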
6. The portable eye tracker device of claim 1, wherein:
the portable eye tracker device further comprises a positioning device configured to determine a position of the portable eye tracker device; and
the control unit is further configured to determine the gaze target area for the user based at least in part on the first image data and information received from the positioning device.
7. The portable eye tracker device of claim 1, wherein the control unit is further configured to:
determine a level of ambient light from the first image data; and
adjust a brightness or exposure time of the at least one illuminator based at least in part on the level of ambient light.
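Claim 7's ambient-light adjustment can be sketched as a small control rule. The scaling factors below are illustrative assumptions, not values from the patent: in bright ambient light the eye-tracking illuminator needs more power for usable glint contrast, and the exposure time is shortened to avoid washing out the image.

```python
def adjust_illuminator(mean_pixel, base_power=0.5, base_exposure_ms=4.0):
    """Sketch (values are illustrative): scale illuminator brightness up
    and exposure time down as ambient light, estimated from the mean
    pixel value of the eye image data, increases."""
    ambient = mean_pixel / 255.0                          # 0 = dark, 1 = very bright
    power = min(1.0, base_power * (1.0 + ambient))        # more IR power in sunlight
    exposure_ms = base_exposure_ms * (1.0 - 0.5 * ambient)  # shorter exposure
    return power, exposure_ms

print(adjust_illuminator(0))    # (0.5, 4.0)  -- dark room
print(adjust_illuminator(255))  # (1.0, 2.0)  -- bright ambient light
```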
8. The portable eye tracker device of claim 1, further comprising:
a display viewable by the user.
9. The portable eye tracker device of claim 1, wherein:
the first image data from at least one image sensor comprises images including areas besides eyes of the user.
10. The portable eye tracker device of claim 1, wherein:
controlling the scene camera to adjust at least one of focus or light sensitivity of the scene camera based on the gaze target area comprises focusing the scene camera on the gaze target area.
11. The portable eye tracker device of claim 10, wherein the control unit is further configured to:
capture an image which includes the gaze target area.
12. The portable eye tracker device of claim 11, wherein the control unit is further configured to:
crop the image which includes the gaze target area.
13. The portable eye tracker device of claim 1, wherein:
the control unit is further configured to compute a gaze vector corresponding to the gaze target area, the gaze vector simulating a perspective of a gaze of the user within the gaze target area; and
controlling the scene camera to adjust the light sensitivity is further based on focusing the scene camera on the gaze target area using the gaze vector.
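One common way gaze vectors feed a distance estimate (useful for the focus and sensitivity adjustments in claims 1 and 13) is vergence: the two eyes' gaze directions converge on the fixated object, and the convergence angle gives its distance. The claims do not specify this method; the sketch below is an assumed illustration for a symmetric fixation straight ahead.

```python
import math

def vergence_distance(ipd_m, left_angle_rad, right_angle_rad):
    """Sketch: estimate the distance to a gazed-at object from the
    inward rotation of each eye's gaze vector. ipd_m is the
    interpupillary distance in metres; angles are measured inward from
    straight ahead. Assumes a symmetric, mid-plane fixation."""
    # For a symmetric fixation the rays meet at d = (ipd / 2) / tan(angle).
    angle = (left_angle_rad + right_angle_rad) / 2.0
    return (ipd_m / 2.0) / math.tan(angle)

# 63 mm interpupillary distance, each eye rotated 3 degrees inward:
d = vergence_distance(0.063, math.radians(3.0), math.radians(3.0))
print(round(d, 3))  # 0.601 (metres)
```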
14. The portable eye tracker device of claim 1, wherein causing the video stream from the scene camera to be provided comprises causing the video stream of the scene camera to be provided to a display that is separate from the portable eye tracker device.
15. The portable eye tracker device of claim 1, wherein determining the gaze target area for the user is further based at least in part on the second image data obtained from the scene camera.
16. A method of determining a gaze direction for a user, comprising:
activating a first subset of illuminators on a frame worn by a user to selectively illuminate at least a first portion of at least one eye of the user;
receiving, from at least one image sensor on the frame, a first set of image data representing images of at least the first portion of the at least one eye of the user exposed to light from the first subset of illuminators;
determining that a gaze target area cannot be determined based on the first set of image data;
activating a second subset of illuminators on the frame worn by the user to selectively illuminate at least a second portion of the at least one eye of the user or a third portion of a different eye of the user;
receiving, from the at least one image sensor or from a different image sensor on the frame, a second set of image data representing images of at least the second portion or the third portion exposed to light from the second subset of illuminators;
determining the gaze target area for the user based at least in part on the second set of image data;
identifying an object located within the gaze target area based at least in part on other image data obtained from a scene camera on the frame, the scene camera configured to capture the other image data that represents images of at least a portion of a view of the user;
causing a video stream from the scene camera to be provided, the video stream comprising:
a first video feed of an entire field of view of the scene camera, the first video feed being processed at a first quality; and
a second video feed of a sub-portion of the entire field of view of the scene camera, the second video feed being processed at a second quality that is greater than the first quality, and the sub-portion including the object within the gaze target area; and
controlling the scene camera to adjust light sensitivity of the second video feed based at least in part on the gaze target area.
17. The method of claim 16, wherein the method further comprises:
receiving information from a movement sensor configured to detect movement of the frame; and
wherein determining the gaze target area is further based at least in part on information from the movement sensor.
18. The method of claim 16, further comprising:
receiving images captured by the scene camera representing images of at least a portion of a view of the user.
19. A non-transitory machine-readable medium having instructions thereon for determining a gaze direction for a user, the instructions executable by a processor for at least:
activating at least one illuminator on a frame worn by a user to selectively illuminate at least a portion of at least one eye of the user;
receiving image data representing images of at least the portion of the at least one eye of the user from at least one image sensor on the frame;
determining a level of ambient light from the image data;
adjusting a brightness or an exposure time of the at least one illuminator based at least in part on the level of ambient light;
determining a gaze target area for the user based at least in part on the image data;
determining a distance between an object located within the gaze target area and the frame based at least in part on other image data obtained from a scene camera on the frame, the scene camera configured to capture the other image data that represents images of at least a portion of a view of the user;
causing a video stream from the scene camera on the frame to be provided, the video stream comprising:
a first video feed of an entire field of view of the scene camera, the first video feed being processed at a first quality; and
a second video feed of a sub-portion of the entire field of view of the scene camera, the second video feed being processed at a second quality that is greater than the first quality, and the sub-portion including the object within the gaze target area; and
controlling the scene camera to adjust light sensitivity of the second video feed based at least in part on the distance between the object within the gaze target area and the frame.
20. The non-transitory machine-readable medium of claim 19, wherein:
the instructions are further executable for at least receiving information from a movement sensor configured to detect movement of the frame; and
determining the gaze target area for the user is further based at least in part on the information received from the movement sensor.
Specification