Systems and methods of eye tracking control on mobile device
First Claim
1. A method comprising:
controlling, at an eye tracking device, one or more light sources located within the eye tracking device to synchronize emission of light from the one or more light sources with a timing of image capture by a camera with a depth of field approximately in the range 40 cm to 80 cm, wherein the controlling further includes filtering the light from the light sources using an infrared pass filter so that only light in the 800-900 nm range enters the camera, and wherein the camera has a field of view that matches the field of emission of the one or more light sources;
receiving, at the eye tracking device, an image of a portion of a user captured by the camera, the image including reflections caused by light emitted on the user from the one or more light sources located within the eye tracking device;
detecting one or more eye features associated with an eye of the user using the reflections;
determining point of regard information using the one or more eye features, the point of regard information indicating a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken; and
sending the point of regard information to an application capable of performing a subsequent operation using the point of regard information.
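The claim recites a four-stage pipeline: synchronize IR illumination with capture, receive an image, detect eye features from the reflections, and compute a point of regard to hand to an application. A minimal sketch of that flow in Python; every name and value here (`control_light_sources`, `detect_eye_features`, the 4x4 frame, the 1920x1080 display) is invented for illustration and does not come from the patent.

```python
from dataclasses import dataclass

@dataclass
class PointOfRegard:
    x: float  # display coordinate, pixels
    y: float

def control_light_sources(frame_index: int) -> None:
    """Pulse the IR light sources so emission coincides with capture."""
    # In hardware this would toggle LED drivers in step with the camera
    # trigger; here it is a no-op placeholder.
    pass

def capture_image(frame_index: int):
    # Placeholder 4x4 grayscale frame containing one bright corneal
    # reflection (a "glint") caused by the IR light sources.
    img = [[10] * 4 for _ in range(4)]
    img[1][2] = 250  # the glint
    return img

def detect_eye_features(img):
    """Return (row, col) of the brightest pixel as a stand-in glint."""
    return max(
        ((r, c) for r in range(len(img)) for c in range(len(img[0]))),
        key=lambda rc: img[rc[0]][rc[1]],
    )

def determine_point_of_regard(glint) -> PointOfRegard:
    # A real tracker maps glint/pupil geometry through a calibration
    # model; here we simply scale image coordinates to a nominal display.
    r, c = glint
    return PointOfRegard(x=c / 4 * 1920, y=r / 4 * 1080)

def track_one_frame(frame_index: int) -> PointOfRegard:
    control_light_sources(frame_index)       # synchronize emission
    img = capture_image(frame_index)         # receive image
    glint = detect_eye_features(img)         # detect eye features
    return determine_point_of_regard(glint)  # determine point of regard

por = track_one_frame(0)  # would then be sent to the application
```

The mapping from a single glint to screen coordinates is deliberately naive; it only illustrates how the four claimed steps compose.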
6 Assignments
0 Petitions
Abstract
Methods and systems to facilitate eye tracking control on mobile devices are provided. An image of a portion of a user is received at an eye tracking device, where the image includes reflections caused by light emitted on the user from one or more light sources located within the eye tracking device. One or more eye features associated with an eye of the user are detected using the reflections. Point of regard information is determined using the one or more eye features, where the point of regard information indicates a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken. The point of regard information is sent to an application capable of performing a subsequent operation using the point of regard information.
111 Citations
18 Claims
1. A method comprising: (reproduced in full above as First Claim). Dependent claims: 2, 3, 4, 5, 6.
7. A non-transitory machine-readable storage medium storing instructions which, when executed by one or more processors, cause the one or more processors to perform operations comprising:
controlling one or more light sources located within an eye tracking device to synchronize emission of light from the one or more light sources with a timing of image capture by a camera with a depth of field approximately in the range 40 cm to 80 cm, wherein the controlling further includes filtering the light from the light sources using an infrared pass filter so that only light in the 800-900 nm range enters the camera, and wherein the camera has a field of view that matches the field of emission of the one or more light sources;
receiving an image of a portion of a user captured by the camera, the image including reflections caused by light emitted on the user from the one or more light sources located within the eye tracking device;
detecting one or more eye features associated with an eye of the user using the reflections;
determining point of regard information using the one or more eye features, the point of regard information indicating a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken; and
sending the point of regard information to an application capable of performing a subsequent operation using the point of regard information.
Dependent claims: 8, 9.
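The "controlling ... to synchronize emission ... with a timing of image capture" step can be pictured as strobing the IR sources only while the camera shutter is open. A hedged sketch of that scheduling in Python; the frame rate and exposure values are illustrative, not taken from the patent.

```python
def emission_windows(fps: float, exposure_ms: float, n_frames: int):
    """Compute (on, off) times in milliseconds for an IR strobe
    synchronized to the camera: the light sources are lit exactly
    while each frame's shutter is open."""
    period_ms = 1000.0 / fps  # time between successive frame starts
    return [
        (i * period_ms, i * period_ms + exposure_ms)
        for i in range(n_frames)
    ]

# Example: a 30 fps camera with an 8 ms exposure per frame.
windows = emission_windows(fps=30.0, exposure_ms=8.0, n_frames=3)
```

Synchronizing emission to exposure keeps the glints bright relative to ambient light while minimizing LED duty cycle, which is one plausible reading of why the claim ties the two together.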
10. An eye tracking device comprising:
one or more light sources configured to emit light, wherein the one or more light sources further comprise an infrared pass filter that filters the light to be within an 800-900 nm range; and
one or more hardware processors in communication with the one or more light sources, the one or more hardware processors configured to:
control the one or more light sources to synchronize emission of light from the one or more light sources with a timing of image capture by a camera with a depth of field approximately in the range 40 cm to 80 cm, wherein the camera has a field of view that matches the field of emission of the one or more light sources;
receive an image of a portion of a user captured by the camera, the image including reflections caused by light emitted on the user from the one or more light sources;
detect one or more eye features associated with an eye of the user using the reflections;
determine point of regard information using the one or more eye features, the point of regard information indicating a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken; and
send the point of regard information to an application capable of performing a subsequent operation using the point of regard information.
Dependent claims: 11, 12, 13, 14, 15.
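The device claim's "detect one or more eye features ... using the reflections" is, at its simplest, locating the bright corneal glints that the IR light sources produce in the image. A toy sketch using plain thresholding; the threshold value and sample frame are invented for illustration, and a production tracker would use far more robust detection.

```python
def find_glints(img, threshold=200):
    """Return (row, col) coordinates of pixels bright enough to be
    corneal reflections of the IR light sources. With a narrow IR
    pass filter, the glints dominate the scene, so a fixed threshold
    is a workable first approximation."""
    return [
        (r, c)
        for r, row in enumerate(img)
        for c, v in enumerate(row)
        if v >= threshold
    ]

# Invented 4x4 grayscale frame with two glints (one per light source).
frame = [
    [12, 15, 11, 14],
    [13, 240, 16, 12],
    [11, 14, 235, 13],
    [15, 12, 13, 11],
]
glints = find_glints(frame)
```

The number and arrangement of detected glints, together with the pupil center, are the "eye features" that downstream gaze estimation consumes.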
16. A system comprising:
a mobile computing device; and
an eye tracking device coupled to and in communication with the mobile computing device, the eye tracking device being configured to:
control one or more light sources located within the eye tracking device to synchronize emission of light from the one or more light sources with a timing of image capture by a camera with a depth of field approximately in the range 30 cm to 60 cm, wherein to control the one or more light sources the eye tracking device is further configured to filter the light from the one or more light sources using an infrared pass filter so that only light in the 800-900 nm range enters the camera, and wherein the camera has a field of view that matches the field of emission of the one or more light sources;
receive an image of a portion of a user captured by the camera, the image including reflections caused by light emitted on the user from the one or more light sources located within the eye tracking device;
detect one or more eye features associated with an eye of the user using the reflections;
determine point of regard information using the one or more eye features, the point of regard information indicating a location on a display of the mobile computing device at which the user was looking when the image of the portion of the user was taken; and
send the point of regard information to an application on the mobile computing device capable of performing a subsequent operation using the point of regard information.
Dependent claims: 17, 18.
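The "determine point of regard" step in the system claim typically depends on a per-user calibration that maps pupil-glint geometry to coordinates on the mobile display. A minimal sketch assuming a two-point linear calibration; the calibration offsets, screen size, and function names are all invented and not part of the patent.

```python
def point_of_regard(pupil_glint, calib, screen_w, screen_h):
    """Map a normalized pupil-glint offset to display coordinates by
    linear interpolation between two calibration extremes. 'calib'
    holds the offsets observed while the user fixated the top-left
    and bottom-right corners of the display."""
    (x0, y0), (x1, y1) = calib
    gx, gy = pupil_glint
    u = (gx - x0) / (x1 - x0)  # 0.0 at left edge, 1.0 at right edge
    v = (gy - y0) / (y1 - y0)  # 0.0 at top edge, 1.0 at bottom edge
    return (u * screen_w, v * screen_h)

# Invented calibration: offsets seen at the two display corners.
calib = ((-0.2, -0.1), (0.2, 0.1))
# A centered pupil-glint offset should map to the screen center.
por = point_of_regard((0.0, 0.0), calib, 1920, 1080)
```

Real systems fit richer models (polynomial or 3D geometric) over more calibration points, but the linear case shows why calibration data is needed before any on-screen location can be reported to an application.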
Specification