Systems and methods of eye tracking data analysis
Abstract
Methods and systems to facilitate eye tracking data analysis are provided. Point of regard information from a first client device of a first user is received, where the point of regard information is determined by the first client device by detecting one or more eye features associated with an eye of the first user. The point of regard information is stored. A request to access the point of regard information is received, and the point of regard information is sent in response to the request, where the point of regard information is used in a subsequent operation.
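The flow described in the abstract (a server receives point of regard information from a first client, stores it, and returns it when a second party requests access) can be sketched as a minimal in-memory service. This is an illustrative sketch only; the class and field names (`PorSample`, `PorStore`, `resource_id`) are assumptions, not terms from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class PorSample:
    # One point-of-regard sample: on-screen coordinates plus an identifier
    # for the study resource that was displayed when it was captured.
    x: float
    y: float
    resource_id: str

@dataclass
class PorStore:
    # Hypothetical server-side store keyed by the first user's identifier.
    _by_user: dict = field(default_factory=dict)

    def receive(self, user_id: str, sample: PorSample) -> None:
        # Store point of regard information received from a client device.
        self._by_user.setdefault(user_id, []).append(sample)

    def request(self, user_id: str) -> list:
        # Handle a request to access the stored information, returning it
        # for use in a subsequent operation (e.g. data analysis).
        return list(self._by_user.get(user_id, []))
```

A second user (an analyst) would call `request` to retrieve the samples alongside any associated study data.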
19 Claims
1. A method comprising:
displaying a resource associated with a study, the resource comprising a material subject to eye tracking analysis;
receiving, at a computing device, an image depicting a portion of a user, the image captured while displaying the resource and including reflections caused by light reflected from the user;
detecting one or more eye features associated with an eye of the user, including detecting the one or more eye features using the reflections;
determining, using the one or more eye features, point of regard information associated with the resource, the point of regard information indicating a location on a display of the computing device at which the user was looking when the image of the portion of the user was taken;
capturing study data while displaying the resource, the study data comprising data selected from the group consisting of screen capture, blink rate, and mouse data;
presenting a calibration interface after displaying the resource, the calibration interface including one or more calibration points to calculate an accuracy of user calibration for determining the point of regard information;
determining, based on the accuracy of user calibration, an accuracy value for the resource associated with the point of regard information; and
sending the point of regard information in association with the study data and the accuracy value to a server, the server configured to provide the point of regard information and the associated study data and accuracy value to a second user for data analysis.
Dependent claims: 2, 3, 4, 5, 6, 7
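The client-side steps of claim 1 can be sketched in two pieces: mapping detected eye features to an on-screen point of regard, and bundling that point with the captured study data and accuracy value before sending to the server. The affine calibration mapping below is one common model, assumed for illustration; the claim does not prescribe a particular mapping, and all function and parameter names here are hypothetical.

```python
def point_of_regard(feature_x, feature_y, coeffs):
    # Map a detected eye feature (e.g. a pupil-to-reflection vector) to a
    # screen location using an assumed affine calibration model with
    # coefficients (a0, a1, a2, b0, b1, b2) learned during calibration.
    a0, a1, a2, b0, b1, b2 = coeffs
    screen_x = a0 + a1 * feature_x + a2 * feature_y
    screen_y = b0 + b1 * feature_x + b2 * feature_y
    return screen_x, screen_y

def study_record(por, screen_capture, blink_rate, mouse_data, accuracy):
    # Package the point of regard information with the study data and the
    # accuracy value, as in the final "sending" step of the claim.
    return {
        "point_of_regard": por,
        "study_data": {
            "screen_capture": screen_capture,
            "blink_rate": blink_rate,
            "mouse_data": mouse_data,
        },
        "accuracy": accuracy,
    }
```

With coefficients `(0, 100, 0, 0, 0, 100)`, a normalized feature vector `(0.5, 0.5)` maps to the screen point `(50.0, 50.0)`.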
8. A non-transitory machine-readable storage medium storing instructions which, when executed by one or more processors, cause the one or more processors to perform operations comprising:
displaying a resource associated with a study, the resource comprising a material subject to eye tracking analysis;
receiving an image depicting a portion of a user, the image captured while displaying the resource and including reflections caused by light with which the user is illuminated and which is reflected from the user;
detecting one or more eye features associated with an eye of the user, including detecting the one or more eye features using the reflections;
determining, using the one or more eye features, point of regard information associated with the resource, the point of regard information indicating a location on a display of the computing device at which the user was looking when the image of the portion of the user was taken;
capturing study data while displaying the resource, the study data comprising data selected from the group consisting of screen capture, blink rate, and mouse data;
presenting a calibration interface after displaying the resource, the calibration interface including one or more calibration points to calculate an accuracy of user calibration for determining the point of regard information;
determining, based on the accuracy of user calibration, an accuracy value for the resource associated with the point of regard information; and
sending the point of regard information in association with the study data and the accuracy value to a server, the server configured to provide the point of regard information and the associated study data and accuracy value to a second user for data analysis.
Dependent claims: 9, 10, 11, 12, 13
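The post-study calibration check recited in the claims (presenting known calibration points and calculating an accuracy of user calibration) can be sketched as follows. The claims do not define how the accuracy value is computed; mean on-screen error between each calibration point and the tracker's estimated gaze point is an assumed metric, and the function name is illustrative.

```python
import math

def calibration_accuracy(calibration_points, estimated_points):
    # For each known calibration point shown after the resource, compare it
    # with the point of regard the tracker estimated while the user looked
    # at it, and return the mean Euclidean error (assumed accuracy metric).
    errors = [
        math.hypot(ex - cx, ey - cy)
        for (cx, cy), (ex, ey) in zip(calibration_points, estimated_points)
    ]
    return sum(errors) / len(errors)
```

A lower value indicates better calibration; the resulting accuracy value would accompany the point of regard information sent to the server, letting the second user weight each recording during analysis.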
14. A system, comprising:
a memory that stores instructions; and
one or more processors configured by the instructions to perform operations comprising:
displaying a resource associated with a study, the resource comprising a material subject to eye tracking analysis;
receiving an image depicting a portion of a user, the image captured while displaying the resource and including reflections caused by light with which the user is illuminated and which is reflected from the user;
detecting one or more eye features associated with an eye of the user, including detecting the one or more eye features using the reflections;
determining, using the one or more eye features, point of regard information associated with the resource, the point of regard information indicating a location on a display of the computing device at which the user was looking when the image of the portion of the user was taken;
capturing study data while displaying the resource, the study data comprising data selected from the group consisting of screen capture, blink rate, and mouse data;
presenting a calibration interface after displaying the resource, the calibration interface including one or more calibration points to calculate an accuracy of user calibration for determining the point of regard information;
determining, based on the accuracy of user calibration, an accuracy value for the resource associated with the point of regard information; and
sending the point of regard information in association with the study data and the accuracy value to a server, the server configured to provide the point of regard information and the associated study data and accuracy value to a second user for data analysis.
Dependent claims: 15, 16, 17, 18, 19
Specification