Detecting hand-eye coordination in real time by combining camera eye tracking and wearable sensing
First Claim
1. A computer-implemented method, comprising:
determining an eye gaze path of an individual during a given period of time, wherein said determining the gaze path comprises measuring, via captured video data, (i) any eye movements made by the individual during the given period of time and (ii) any head movements made by the individual during the given period of time;
measuring, via data derived from one or more wearable sensors, any hand movements made by the individual during the given period of time;
performing a spatio-temporal analysis of the determined eye gaze path and the measured hand movements to compute a hand-eye coordination value attributable to the individual, wherein the spatio-temporal analysis comprises:
comparing (i) temporal values associated with movement along the determined eye gaze path and (ii) temporal values associated with the measured hand movement;
defining a given projected plane as an optical projection plane in the three-dimensional space, wherein said defining comprises maintaining one dimension as a constant and varying the other two dimensions; and
comparing (i) one or more spatial trajectories associated with the determined eye gaze path on the given projected plane and (ii) one or more spatial trajectories associated with the measured hand movements on the given projected plane, wherein said comparing spatial trajectories comprises measuring one or more deviations, with respect to a set of one or more reference points, between (a) the one or more spatial trajectories associated with the measured hand movements on the given projected plane and (b) one or more gaze saccades within the determined eye gaze path; and
outputting the computed hand-eye coordination value to at least one user;
wherein the steps are carried out by at least one computing device.
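As an illustrative sketch only (not part of the patent, and not the patentee's implementation), the projection and deviation steps recited above might look like the following in Python. All function and parameter names are hypothetical, and the full projected gaze path stands in here for the claim's gaze saccades:

```python
import numpy as np

def project_to_plane(points_3d, constant_axis=2, constant_value=0.0):
    """Project 3-D trajectory points onto a projection plane by holding one
    dimension constant and varying the other two, as the claim describes."""
    pts = np.atleast_2d(np.array(points_3d, dtype=float))
    pts[:, constant_axis] = constant_value
    varying = [a for a in range(3) if a != constant_axis]
    return pts[:, varying]

def coordination_value(gaze_path, hand_path, reference_points):
    """Toy coordination score: mean deviation, with respect to a set of
    reference points, between the projected hand trajectory and the
    projected gaze path (smaller deviation -> higher score)."""
    gaze_2d = project_to_plane(gaze_path)
    hand_2d = project_to_plane(hand_path)
    deviations = []
    for ref in reference_points:
        ref_2d = project_to_plane(ref)[0]
        gaze_off = np.linalg.norm(gaze_2d - ref_2d, axis=1).mean()
        hand_off = np.linalg.norm(hand_2d - ref_2d, axis=1).mean()
        deviations.append(abs(gaze_off - hand_off))
    # Map mean deviation into (0, 1]: identical trajectories score 1.0.
    return 1.0 / (1.0 + float(np.mean(deviations)))
```

The claim leaves the scoring function open; the reciprocal mapping above is one arbitrary choice that makes the output a bounded "hand-eye coordination value".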
Abstract
Methods, systems, and computer program products for detecting hand-eye coordination in real time by combining camera eye tracking and wearable sensing are provided herein. A computer-implemented method includes determining an eye gaze path of an individual during a given period of time by measuring (i) any eye movements made by the individual during the given period of time and (ii) any head movements made by the individual during the given period of time; measuring any hand movements made by the individual during the given period of time; performing a spatio-temporal analysis of the determined eye gaze path and the measured hand movements to compute a hand-eye coordination value attributable to the individual; and outputting the computed hand-eye coordination value to at least one user.
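The abstract's first step, determining a gaze path from both eye and head movements, could be sketched as follows. This is a hedged illustration under a simple additive yaw/pitch model; the patent does not specify this combination, and all names are hypothetical:

```python
import numpy as np

def gaze_path(eye_angles_deg, head_angles_deg):
    """Combine per-frame eye-in-head angles with head-pose angles (both as
    [yaw, pitch] in degrees, estimated from captured video) into unit gaze
    direction vectors. Simple additive model -- an assumption, not the
    patent's method."""
    combined = np.asarray(eye_angles_deg, float) + np.asarray(head_angles_deg, float)
    yaw = np.radians(combined[:, 0])
    pitch = np.radians(combined[:, 1])
    # One unit gaze vector per frame, in camera coordinates.
    return np.stack([np.sin(yaw) * np.cos(pitch),
                     np.sin(pitch),
                     np.cos(yaw) * np.cos(pitch)], axis=1)
```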
18 Claims
1. A computer-implemented method, comprising:
determining an eye gaze path of an individual during a given period of time, wherein said determining the gaze path comprises measuring, via captured video data, (i) any eye movements made by the individual during the given period of time and (ii) any head movements made by the individual during the given period of time;
measuring, via data derived from one or more wearable sensors, any hand movements made by the individual during the given period of time;
performing a spatio-temporal analysis of the determined eye gaze path and the measured hand movements to compute a hand-eye coordination value attributable to the individual, wherein the spatio-temporal analysis comprises:
comparing (i) temporal values associated with movement along the determined eye gaze path and (ii) temporal values associated with the measured hand movement;
defining a given projected plane as an optical projection plane in the three-dimensional space, wherein said defining comprises maintaining one dimension as a constant and varying the other two dimensions; and
comparing (i) one or more spatial trajectories associated with the determined eye gaze path on the given projected plane and (ii) one or more spatial trajectories associated with the measured hand movements on the given projected plane, wherein said comparing spatial trajectories comprises measuring one or more deviations, with respect to a set of one or more reference points, between (a) the one or more spatial trajectories associated with the measured hand movements on the given projected plane and (b) one or more gaze saccades within the determined eye gaze path; and
outputting the computed hand-eye coordination value to at least one user;
wherein the steps are carried out by at least one computing device.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
13. A computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a device to cause the device to:
determine an eye gaze path of an individual during a given period of time, wherein said determining the gaze path comprises measuring, via captured video data, (i) any eye movements made by the individual during the given period of time and (ii) any head movements made by the individual during the given period of time;
measure, via data derived from one or more wearable sensors, any hand movements made by the individual during the given period of time;
perform a spatio-temporal analysis of the determined eye gaze path and the measured hand movements to compute a hand-eye coordination value attributable to the individual, wherein the spatio-temporal analysis comprises:
comparing (i) temporal values associated with movement along the determined eye gaze path and (ii) temporal values associated with the measured hand movement;
defining a given projected plane as an optical projection plane in the three-dimensional space, wherein said defining comprises maintaining one dimension as a constant and varying the other two dimensions; and
comparing (i) one or more spatial trajectories associated with the determined eye gaze path on the given projected plane and (ii) one or more spatial trajectories associated with the measured hand movements on the given projected plane, wherein said comparing spatial trajectories comprises measuring one or more deviations, with respect to a set of one or more reference points, between (a) the one or more spatial trajectories associated with the measured hand movements on the given projected plane and (b) one or more gaze saccades within the determined eye gaze path; and
output the computed hand-eye coordination value to at least one user.
- View Dependent Claims (14, 15, 16)
17. A system comprising:
a memory; and
at least one processor operably coupled to the memory and configured for:
determining an eye gaze path of an individual during a given period of time, wherein said determining the gaze path comprises measuring, via captured video data, (i) any eye movements made by the individual during the given period of time and (ii) any head movements made by the individual during the given period of time;
measuring, via data derived from one or more wearable sensors, any hand movements made by the individual during the given period of time;
performing a spatio-temporal analysis of the determined eye gaze path and the measured hand movements to compute a hand-eye coordination value attributable to the individual, wherein the spatio-temporal analysis comprises:
comparing (i) temporal values associated with movement along the determined eye gaze path and (ii) temporal values associated with the measured hand movement;
defining a given projected plane as an optical projection plane in the three-dimensional space, wherein said defining comprises maintaining one dimension as a constant and varying the other two dimensions; and
comparing (i) one or more spatial trajectories associated with the determined eye gaze path on the given projected plane and (ii) one or more spatial trajectories associated with the measured hand movements on the given projected plane, wherein said comparing spatial trajectories comprises measuring one or more deviations, with respect to a set of one or more reference points, between (a) the one or more spatial trajectories associated with the measured hand movements on the given projected plane and (b) one or more gaze saccades within the determined eye gaze path; and
outputting the computed hand-eye coordination value to at least one user.
18. A computer-implemented method, comprising:
providing, to a user, a task which requires hand-eye coordination by the user;
commencing the task at a first time value and ending the task at a second time value;
determining an eye gaze path of the user during a temporal period that includes the first time value and the second time value, wherein said determining the gaze path comprises measuring, via captured video data, (i) any eye movements made by the user during the temporal period and (ii) any head movements made by the user during the temporal period;
measuring, via data derived from one or more wearable sensors, any hand movements made by the user during the temporal period;
performing a spatio-temporal analysis of the determined eye gaze path and the measured hand movements to compute a hand-eye coordination value for the user, with respect to the task, wherein the spatio-temporal analysis comprises:
comparing (i) temporal values associated with movement along the determined eye gaze path and (ii) temporal values associated with the measured hand movement;
defining a given projected plane as an optical projection plane in the three-dimensional space, wherein said defining comprises maintaining one dimension as a constant and varying the other two dimensions; and
comparing (i) one or more spatial trajectories associated with the determined eye gaze path on the given projected plane and (ii) one or more spatial trajectories associated with the measured hand movements on the given projected plane, wherein said comparing spatial trajectories comprises measuring one or more deviations, with respect to a set of one or more reference points, between (a) the one or more spatial trajectories associated with the measured hand movements on the given projected plane and (b) one or more gaze saccades within the determined eye gaze path; and
outputting the computed hand-eye coordination value to the user;
wherein the steps are carried out by at least one computing device.
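Claim 18 differs from claim 1 chiefly in bounding the analysis by a task that starts at a first time value and ends at a second. As an illustrative, hypothetical sketch (names are not from the patent), restricting timestamped samples to that temporal period might look like:

```python
def task_window(samples, t_start, t_end):
    """Keep only (timestamp, value) samples recorded between the first time
    value (task start) and the second time value (task end), inclusive.
    The same window would be applied to both the gaze samples and the
    wearable hand-sensor samples before the spatio-temporal analysis."""
    return [(t, v) for (t, v) in samples if t_start <= t <= t_end]
```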
Specification