Method for detecting, evaluating, and analyzing look sequences
First Claim
1. A method for detecting, evaluating, and analyzing look sequences of a test person in an environment using a look detection system comprising the steps of:
detecting a field of view of the test person with a first test camera, said camera pointing forward and being firmly held on the head of the test person so as to provide the test person an unimpeded view of the environment, said field of view being recorded in frames of a field of view video;
detecting a pupil image of at least one of the pupils in the eyes of the test person with a second camera, said second camera being also firmly held on the head of the test person, said pupil image being recorded in frames of an eye video;
recording each frame of the eye video at the same time as a corresponding frame of the field of view video is recorded;
automatically determining pupil coordinates for a center of said pupil image in each frame of the eye video;
instructing the test person to look at a respective given object in the environment at at least one respective given time, a respective given frame of the eye video corresponding to each respective given time when the test person looks at said respective given object, said respective given object being located at a respective given visual point in the field of view shown in each corresponding frame of the field of view video;
determining location coordinates for said respective given visual point in each corresponding frame of the field of view video;
establishing a correlation function between the pupil coordinates in given frames of the eye video and the location coordinates of said respective given visual point in the field of view shown in each corresponding frame of the field of view video; and
extrapolating from pupil coordinates determined from a selected frame of the eye video the location coordinates of the respective selected visual point in the corresponding frame of the field of view video using said correlation function, the location coordinates of said respective selected visual point indicating where in the environment the test person was looking at a time other than a given time.
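The "establishing a correlation function" and "extrapolating" steps of the claim can be sketched as a least-squares fit from calibration points. This is only an illustrative sketch, assuming an affine form for the correlation function; the claim does not prescribe a particular functional form, and the helper names `fit_correlation` and `apply_correlation` are hypothetical:

```python
import numpy as np

def fit_correlation(pupil_xy, view_xy):
    """Fit an affine correlation function mapping pupil coordinates in
    given frames of the eye video to the location coordinates of the
    given visual points in the corresponding field-of-view frames.

    pupil_xy, view_xy: (N, 2) arrays from N given frames in which the
    test person looked at a given object at a known visual point.
    Returns a (3, 2) coefficient matrix.
    """
    pts = np.asarray(pupil_xy, dtype=float)
    A = np.column_stack([pts, np.ones(len(pts))])      # rows [px, py, 1]
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(view_xy, float), rcond=None)
    return coeffs

def apply_correlation(coeffs, pupil_xy):
    """Extrapolate field-of-view coordinates for any frame (including
    times other than the given times) from its pupil coordinates."""
    pts = np.atleast_2d(np.asarray(pupil_xy, dtype=float))
    A = np.column_stack([pts, np.ones(len(pts))])
    return A @ coeffs
```

With four or more non-collinear calibration points the affine fit is overdetermined, and `lstsq` returns the least-squares solution; a higher-order polynomial could be substituted for the affine form without changing the structure of the sketch.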
Abstract
In a method for detecting, evaluating, and analyzing look sequences of a test person using a look detection system, the field of view of the test person is detected by a first camera, which is directed forward and firmly held on the person's head, and is recorded in a field of view video. The movement of the pupils is detected by a second camera, also firmly held on the person's head, and is recorded in an eye video. The pupil coordinates for each frame of the eye video are determined automatically by measuring the contrast of the pupils relative to the surrounding areas with an image recognition program and establishing the centroid of the dark area, which corresponds to the pupil center and yields the pupil coordinates. A correlation function is determined between the pupil coordinates in the eye video and the coordinates of the corresponding visual point in the field of view video. Subsequently, the coordinates of the corresponding visual point in the field of view video are extrapolated from the pupil coordinates for each frame of the eye video.
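The centroid step described in the abstract can be sketched briefly. This is a minimal illustration, assuming a grayscale eye-video frame and a fixed contrast threshold separating the dark pupil from its brighter surroundings; the function name `pupil_center` and the threshold value are illustrative, not taken from the patent:

```python
import numpy as np

def pupil_center(eye_frame, threshold=50):
    """Estimate pupil coordinates as the centroid of the dark pupil area.

    eye_frame: 2-D grayscale array (one frame of the eye video).
    threshold: assumed gray-level cutoff below which a pixel is treated
               as belonging to the pupil.
    Returns (x, y) pupil coordinates, or None if no dark area is found.
    """
    dark = np.asarray(eye_frame) < threshold   # binary mask of the dark area
    ys, xs = np.nonzero(dark)                  # coordinates of dark pixels
    if xs.size == 0:
        return None                            # no pupil detected in this frame
    return xs.mean(), ys.mean()                # centroid = pupil center
```

In practice an image recognition program would also reject spurious dark regions (eyelashes, shadows), but the centroid of the thresholded area captures the operation the abstract describes.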
67 Citations
18 Claims
1. A method for detecting, evaluating, and analyzing look sequences of a test person in an environment using a look detection system comprising the steps of:
detecting a field of view of the test person with a first test camera, said camera pointing forward and being firmly held on the head of the test person so as to provide the test person an unimpeded view of the environment, said field of view being recorded in frames of a field of view video;
detecting a pupil image of at least one of the pupils in the eyes of the test person with a second camera, said second camera being also firmly held on the head of the test person, said pupil image being recorded in frames of an eye video;
recording each frame of the eye video at the same time as a corresponding frame of the field of view video is recorded;
automatically determining pupil coordinates for a center of said pupil image in each frame of the eye video;
instructing the test person to look at a respective given object in the environment at at least one respective given time, a respective given frame of the eye video corresponding to each respective given time when the test person looks at said respective given object, said respective given object being located at a respective given visual point in the field of view shown in each corresponding frame of the field of view video;
determining location coordinates for said respective given visual point in each corresponding frame of the field of view video;
establishing a correlation function between the pupil coordinates in given frames of the eye video and the location coordinates of said respective given visual point in the field of view shown in each corresponding frame of the field of view video; and
extrapolating from pupil coordinates determined from a selected frame of the eye video the location coordinates of the respective selected visual point in the corresponding frame of the field of view video using said correlation function, the location coordinates of said respective selected visual point indicating where in the environment the test person was looking at a time other than a given time.
- View Dependent Claims (2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 15, 17, 18)
3. A method for detecting, evaluating, and analyzing look sequences of a test person in an environment using a look detection system, comprising the steps of:
detecting a field of view of the test person with a first test camera, said first camera pointing forward and being firmly held on the head of the test person, said field of view being recorded in frames of a field of view video;
detecting a pupil image of at least one of the pupils in the eyes of the test person with a second camera, said second camera being also firmly held on the head of the test person, said pupil image being recorded in frames of an eye video;
recording each frame of the eye video at the same time as a corresponding frame of the field of view video;
automatically determining pupil coordinates for a centroid of said pupil image in each frame of the eye video;
instructing the test person to look at a respective given object in the environment at at least one respective given time, a respective given frame of the eye video corresponding to each respective given time when the test person looks at said respective given object, said respective given object being located at a respective given visual point in the field of view shown in each corresponding frame of the field of view video;
recording at least one look sequence, said look sequence providing a plurality of respective given frames of an eye video and corresponding frames of a field of view video;
determining location coordinates for said respective given visual point in each corresponding frame of the field of view video;
associating the pupil coordinates in each given frame of the eye video with location coordinates of the respective visual point where the respective given object is located in the corresponding frame of the field of view video in a look sequence;
storing the associated pupil coordinates and location coordinates as a data set;
establishing a correlation function between said pupil coordinates in said given frames of the eye video and said location coordinates of the respective visual point in said corresponding frames from said data set; and
extrapolating from pupil coordinates determined from a selected frame of the eye video the location coordinates of the respective selected visual point in the corresponding frame of the field of view video using said correlation function, the location coordinates of said respective selected visual point indicating where in the environment the test person was looking at a time in the look sequence other than a given time.
- View Dependent Claims (14, 16)
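The associating and storing steps of claim 3 amount to pairing pupil coordinates with visual-point location coordinates at the given frames of a look sequence and keeping the pairs as one data set. A minimal sketch under those assumptions; the function name `build_calibration_set` and the dict-based frame indexing are illustrative, not from the patent:

```python
import numpy as np

def build_calibration_set(pupil_by_frame, location_by_frame, given_frames):
    """Associate pupil coordinates with visual-point location coordinates
    for the given frames of a look sequence and store them as a data set.

    pupil_by_frame:    dict mapping frame index -> (x, y) pupil coordinates
    location_by_frame: dict mapping frame index -> (x, y) location coordinates
    given_frames:      frame indices at which the test person was instructed
                       to look at the given object
    Returns an (N, 2, 2) array: N associated (pupil, location) pairs.
    """
    data_set = [
        (pupil_by_frame[f], location_by_frame[f])
        for f in given_frames
        if f in pupil_by_frame and f in location_by_frame  # skip frames missing either measurement
    ]
    return np.array(data_set)
```

The stored data set then feeds the correlation-function step: its first column (pupil coordinates) and second column (location coordinates) are exactly the paired samples a least-squares fit would consume.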
Specification