PORTABLE EYE TRACKING DEVICE
Abstract
A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, and a control unit. The frame may be adapted for wearing by a user and include an eyeglass frame having a nose bridge. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The optics holding member may be coupled with the nose bridge. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, and receive the image data from the at least one image sensor.
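The device architecture in the abstract (frame, optics holding member with illuminators and image sensors, control unit) can be sketched as a minimal data model. This is an illustrative sketch only; all class and method names here are assumptions, not terms from the patent:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Illuminator:
    """Selectively illuminates at least a portion of the user's eye."""
    on: bool = False


@dataclass
class ImageSensor:
    """Captures image data representing images of at least a portion of the eye."""
    def capture(self) -> bytes:
        return b""  # placeholder frame; a real sensor would return pixel data


@dataclass
class OpticsHoldingMember:
    """Holds illuminator(s) and image sensor(s); coupled with the nose bridge."""
    illuminators: List[Illuminator] = field(default_factory=list)
    sensors: List[ImageSensor] = field(default_factory=list)


@dataclass
class ControlUnit:
    """Controls the illuminators and receives image data from the sensors."""
    optics: OpticsHoldingMember

    def acquire_frame(self) -> bytes:
        for ill in self.optics.illuminators:
            ill.on = True          # selective illumination of the eye
        frame = self.optics.sensors[0].capture()
        for ill in self.optics.illuminators:
            ill.on = False         # illumination ends once the frame is captured
        return frame
```

The control unit mediates between illumination and capture, matching the abstract's division of responsibilities.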
21 Claims
1. (canceled)

2. A method for causing an action to be performed by a device, comprising:

causing an illuminator to illuminate at least a portion of an eye of a user;

receiving image data representing images of at least a portion of the eye of the user;

determining at least one gaze direction of the user based at least in part on the image data;

determining whether at least one of the gaze direction or the image data meets a condition; and

causing a first action to be performed by a device based at least in part on a determination that at least one of the gaze direction or the image data meets the condition.
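The steps recited in claim 2 can be sketched as a single control-flow function. This is a hedged sketch of the claimed sequence, not an implementation from the patent; the callables (`estimate_gaze`, `condition`, `first_action`) are hypothetical stand-ins for device-specific logic:

```python
def run_gaze_step(illuminator, sensor, estimate_gaze, condition, first_action):
    """One pass of the claimed method: illuminate, capture, estimate gaze,
    test a condition, and trigger the first action when the condition is met."""
    illuminator.turn_on()                       # cause the illuminator to illuminate the eye
    image_data = sensor.read()                  # receive image data of at least a portion of the eye
    gaze_direction = estimate_gaze(image_data)  # determine gaze direction from the image data
    if condition(gaze_direction, image_data):   # does the gaze direction or image data meet the condition?
        first_action()                          # cause the first action to be performed by the device
    return gaze_direction
```

A device would typically run this step in a loop at the sensor's frame rate.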
3. The method for causing an action to be performed by a device of claim 2, wherein the device is selected from a group consisting of:

a television;

a computer;

a tablet; and

a game machine.
4. The method for causing an action to be performed by a device of claim 2, wherein causing the first action to be performed by the device is further based at least in part on:
actuation of a physical button.
5. The method for causing an action to be performed by a device of claim 2, wherein causing the first action to be performed by the device is further based at least in part on:
an audible input, including a sound or at least one word spoken by the user.
6. The method for causing an action to be performed by a device of claim 2, wherein the method further comprises:

receiving image data representative of the eye of the user not being present; and

causing a second action to be performed by a device based at least in part on the image data representing an image of the eye of the user not being present.
7. The method for causing an action to be performed by a device of claim 2, wherein determining whether at least one of the gaze direction or the image data meets the condition comprises:
determining if the image data is representative of images of the eye of the user being present.
8. The method for causing an action to be performed by a device of claim 2, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises:
determining that the gaze direction is fixated.
9. The method for causing an action to be performed by a device of claim 2, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises:
determining that the gaze direction is representative of a saccade of the eye of the user.
10. The method for causing an action to be performed by a device of claim 2, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises:
determining that the gaze direction is in a particular direction.
11. The method for causing an action to be performed by a device of claim 2, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises:
determining that the gaze direction has moved in a particular pattern.
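Claims 8 through 11 enumerate gaze conditions: fixation, saccade, a particular direction, and a movement pattern. One common way to realize such tests is dispersion- and velocity-based event detection over a window of gaze samples. The sketch below is illustrative only; the thresholds and function names are assumptions, not values from the patent:

```python
import math


def is_fixated(samples, max_dispersion=0.02):
    """Fixation test: gaze points stay within a small spatial dispersion."""
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    return (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion


def is_saccade(samples, dt, min_velocity=1.0):
    """Saccade test: velocity between consecutive samples exceeds a threshold."""
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        if math.hypot(x1 - x0, y1 - y0) / dt >= min_velocity:
            return True
    return False


def is_in_direction(gaze, target, tolerance=0.05):
    """Particular-direction test: gaze lies within a tolerance of a target direction."""
    return math.hypot(gaze[0] - target[0], gaze[1] - target[1]) <= tolerance


def matches_pattern(samples, waypoints, tolerance=0.05):
    """Pattern test: gaze passes through each waypoint in order."""
    i = 0
    for g in samples:
        if i < len(waypoints) and is_in_direction(g, waypoints[i], tolerance):
            i += 1
    return i == len(waypoints)
```

Any of these predicates could serve as the `condition` in the method of claim 2, with gaze coordinates normalized to the device's field of view.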
12. A system for causing an action to be performed by a device, wherein the system comprises:

an illuminator;

an image sensor; and

a device configured to at least:

cause the illuminator to illuminate at least a portion of an eye of a user;

receive image data from the image sensor representing images of at least a portion of the eye of the user;

determine at least one gaze direction of the user based at least in part on the image data;

determine whether at least one of the gaze direction or the image data meets a condition; and

cause a first action to be performed by the device based at least in part on a determination that at least one of the gaze direction or the image data meets the condition.
13. The system for causing an action to be performed by a device of claim 12, wherein causing the first action to be performed by the device is further based at least in part on a selection from a group consisting of:

actuation of a physical button; and

an audible input, including a sound or at least one word spoken by the user.
14. The system for causing an action to be performed by a device of claim 12, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises a selection from a group consisting of:

determining that the gaze direction is fixated; and

determining that the gaze direction is in a particular direction.
15. The system for causing an action to be performed by a device of claim 12, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises:
determining that the gaze direction is representative of a saccade of the eye of the user.
16. The system for causing an action to be performed by a device of claim 12, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises:
determining that the gaze direction has moved in a particular pattern.
17. A non-transitory machine readable medium having instructions stored thereon for causing an action to be performed by a device, the instructions executable by at least one processor to at least:

cause an illuminator to illuminate at least a portion of an eye of a user;

receive image data from an image sensor representing images of at least a portion of the eye of the user;

determine at least one gaze direction of the user based at least in part on the image data;

determine whether at least one of the gaze direction or the image data meets a condition; and

cause a first action to be performed by a device based at least in part on a determination that at least one of the gaze direction or the image data meets the condition.
18. The non-transitory machine readable medium of claim 17, wherein causing the first action to be performed by the device is further based at least in part on a selection from a group consisting of:

actuation of a physical button; and

an audible input, including a sound or at least one word spoken by the user.
19. The non-transitory machine readable medium of claim 17, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises a selection from a group consisting of:

determining that the gaze direction is fixated; and

determining that the gaze direction is in a particular direction.
20. The non-transitory machine readable medium of claim 17, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises:
determining that the gaze direction is representative of a saccade of the eye of the user.
21. The non-transitory machine readable medium of claim 17, wherein determining whether at least one of the gaze direction or the image data meets a condition comprises:
determining that the gaze direction has moved in a particular pattern.
Specification