Gaze tracking system
Abstract
A gaze tracking system (1, 3, 8) is provided which obtains images of an operator (9) and processes those images to associate them with coordinates of where the operator's gaze is directed.
To identify images in which an operator is looking in the same direction, images obtained by a camera (8) are first processed using the Retinex algorithm. Image patches from the processed image are then compared with stored feature images, and an initial classification is determined based upon the correspondence between areas of the processed image and the stored feature images. This initial classification is then further processed so that each image is assigned a single classification. The gaze locations identified by the different classifications are then adjusted so that classifications assigned to adjacent images identify points closer together and classifications never assigned to adjacent images identify points further apart.
-
43 Claims
-
1. Apparatus for associating gaze data representative of the location of gaze of an operator with classifications of images of an operator looking at different locations, comprising:
-
a receiver operable to receive a video stream defining a sequence of images representative of an operator at different points in time;
a classification unit operable to assign one of a number of classifications to images in the video stream received by said receiver, said classification unit being operable to assign the same classifications to images of an operator looking at the same locations; and
a calibration unit comprising:
a classification store configured to store data identifying the pairs of different classifications assigned to pairs of images in a video stream received by said receiver representative of an operator at different times separated by less than a preset time period;
a gaze conversion store configured to store data associating each of said number of classifications with gaze data representative of a gaze location; and
an update unit operable to update gaze data stored in said gaze conversion store by updating said gaze data such that gaze data for pairs of different classifications identified by data stored in said classification store identify gaze locations which are closer together and gaze data for pairs of different classifications not identified by data stored in said classification store identify gaze locations which are further apart. (Dependent claims: 2-16.)
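The update unit of claim 1 can be read as an attract/repel embedding over classifications: pairs recorded in the classification store (seen in frames close together in time) are pulled toward each other, and all other pairs are pushed apart. A minimal sketch of that rule, assuming 2-D gaze coordinates; the function name, step size, and data layout are illustrative, not from the patent:

```python
# Hypothetical sketch of the claim-1 update unit: each classification maps to
# a 2-D gaze location; co-occurring pairs attract, all other pairs repel.
import math

def update_gaze_store(gaze, cooccurring_pairs, step=0.1):
    """gaze: dict classification -> [x, y]; cooccurring_pairs: set of
    frozensets of classifications seen within the preset time period."""
    classes = list(gaze)
    new_gaze = {c: list(gaze[c]) for c in classes}
    for i, a in enumerate(classes):
        for b in classes[i + 1:]:
            ax, ay = gaze[a]
            bx, by = gaze[b]
            dx, dy = bx - ax, by - ay
            dist = math.hypot(dx, dy) or 1e-9
            # Attract if the pair appears in the classification store,
            # otherwise repel.
            sign = 1.0 if frozenset((a, b)) in cooccurring_pairs else -1.0
            fx, fy = sign * step * dx / dist, sign * step * dy / dist
            new_gaze[a][0] += fx; new_gaze[a][1] += fy
            new_gaze[b][0] -= fx; new_gaze[b][1] -= fy
    return new_gaze

gaze = {"A": [0.0, 0.0], "B": [1.0, 0.0], "C": [2.0, 0.0]}
pairs = {frozenset(("A", "B"))}   # A and B were seen in nearby frames
gaze = update_gaze_store(gaze, pairs)
# A and B move toward each other; C is pushed away from both
```

Iterating this update is what lets the calibration converge without the operator ever being asked to fixate known targets.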
-
-
17. A method of associating gaze data representative of the location of gaze of an operator with classifications of images of an operator looking at different locations, comprising:
-
receiving a video stream defining a sequence of images representative of an operator at different points in time;
assigning one of a number of classifications to images in the received video stream, wherein the same classifications are assigned to images of an operator looking at the same locations;
storing data identifying the pairs of different classifications assigned to pairs of images in a received video stream representative of an operator at different times separated by less than a preset time period;
storing data associating each of said number of classifications with gaze data representative of a gaze location; and
updating stored gaze data such that gaze data for pairs of different classifications identified by stored data is updated to identify gaze locations which are closer together and gaze data for pairs of different classifications not identified by stored data is updated to identify gaze locations which are further apart. (Dependent claims: 18-35.)
-
-
36. Apparatus for classifying images comprising:
-
a receiver operable to receive image data defining images;
a patch generation module operable to generate, for a number of points in an image defined by image data received by said receiver, image patches, said image patches being derived by processing image data defining portions of said image which include said points, a plurality of different image patches being derived for each of said points;
a patch comparison module operable to compare stored data with image patches generated by said patch generation module to identify stored data which most closely corresponds to said generated patches; and
a classification unit for generating a classification of an image defined by image data received by said receiver, wherein said classification comprises data identifying points in an image and for each of said points, data identifying stored data determined to most closely correspond to an image patch derived by said patch generation module from image data defining a portion of the image including said point.
-
-
37. A method of classifying images comprising:
-
receiving image data defining images;
generating, for a number of points in an image defined by received image data, image patches, said image patches being derived by processing image data defining portions of said image which include said points, a plurality of different image patches being derived for each of said points;
comparing stored data with said generated image patches to identify stored data which most closely corresponds to said generated patches; and
generating a classification of an image defined by received image data, wherein said classification comprises data identifying points in an image and for each of said points, data identifying stored data determined to most closely correspond to an image patch generated from image data defining a portion of the image including said point.
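Claims 36 and 37 describe a per-point, patch-bank classification: several patches are derived around each sample point, each is matched against stored data, and the classification records which stored entry matched best at each point. A minimal sketch under illustrative assumptions (greyscale pixel lists, square patches at two scales, sum-of-squared-differences matching; none of these specifics come from the patent):

```python
# Hypothetical sketch of the claim-37 method: for each sample point, extract
# patches at several scales around it, match each against a bank of stored
# reference patches, and record the best-matching reference index per point.

def extract_patch(image, x, y, half):
    """Flatten the (2*half+1)^2 window centred on (x, y), clamping at edges."""
    h, w = len(image), len(image[0])
    return [image[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
            for dy in range(-half, half + 1)
            for dx in range(-half, half + 1)]

def match(patch, bank):
    """Index of the stored patch with the smallest sum of squared differences."""
    return min(range(len(bank)),
               key=lambda i: sum((p - q) ** 2 for p, q in zip(patch, bank[i])))

def classify(image, points, bank, scales=(1, 2)):
    """Claim-37 style classification: for each point, the indices of the
    stored data best matching each patch derived for that point.
    bank: dict mapping scale -> list of flattened reference patches."""
    return [((x, y), [match(extract_patch(image, x, y, s), bank[s])
                      for s in scales])
            for (x, y) in points]
```

The output pairs each point with the identities of its best-matching stored patches, which is exactly the data structure the classification of claim 36 calls for.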
-
-
38. Apparatus for comparing colour images independently of the illumination of a subject object in said images, comprising:
-
a receiver operable to receive image data defining images, said image data comprising colour data for a plurality of pixels representative of the colours of a subject object appearing in said images;
a processing unit operable to derive, from image data received by said receiver, a colour reflectance image, said colour reflectance image comprising colour data for said plurality of pixels representative of the contribution to the colour of a subject object in an image not arising due to the colour of the illumination of the subject object in said image, said processing unit being operable to derive said colour reflectance image such that the ratios of colour data of pixels in said colour reflectance image for a said subject object are independent of the colour of the illumination of said subject object in said image; and
a comparator operable to compare generated colour reflectance images with stored image data to determine the correspondence between said stored data and said generated images;
wherein said correspondence is determined on the basis of the comparison of the ratios of colour data for pixels in a said generated colour reflectance image and corresponding ratios of colour data for pixels of said stored image data.
-
-
39. A method of comparing colour images independently of the illumination of a subject object in said images, comprising:
-
receiving image data defining images, said image data comprising colour data for a plurality of pixels representative of the colours of a subject object appearing in said images;
processing received image data to derive a colour reflectance image, said colour reflectance image comprising colour data for said plurality of pixels representative of the contribution to the colour of a subject object in an image not arising due to the colour of the illumination of the subject object in said image, said processing being such as to derive said colour reflectance image such that the ratios of colour data of pixels in said colour reflectance image for a said subject object are independent of the colour of the illumination of said subject object in said image; and
comparing generated colour reflectance images with stored image data to determine the correspondence between said stored data and said generated images;
wherein said correspondence is determined on the basis of the comparison of the ratios of colour data for pixels in a said generated colour reflectance image and corresponding ratios of colour data for pixels of said stored image data.
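The invariance property claims 38 and 39 rely on is that, after deriving the reflectance image, ratios of colour data no longer depend on the illuminant colour. The patent's processing is Retinex-style; the sketch below instead uses a grey-world normalisation (dividing each channel by its mean) as a minimal stand-in that exhibits the same property for a uniform illuminant, with illustrative function names and a sum-of-squared-differences correspondence measure:

```python
# Hypothetical sketch of the claim-39 idea: normalise each colour channel so
# that a pixel-independent illuminant scale factor per channel divides out,
# then compare images through the normalised (ratio-preserving) values.
# Grey-world normalisation stands in for the patent's Retinex processing.

def reflectance(image):
    """image: list of (r, g, b) pixels. Divide each channel by its mean so a
    per-channel illuminant scale factor cancels out."""
    n = len(image)
    means = [sum(px[c] for px in image) / n for c in range(3)]
    return [tuple(px[c] / means[c] for c in range(3)) for px in image]

def correspondence(img_a, img_b):
    """Sum of squared differences between reflectance images: near zero when
    the scenes match up to per-channel illumination scaling."""
    ra, rb = reflectance(img_a), reflectance(img_b)
    return sum((a - b) ** 2
               for pa, pb in zip(ra, rb)
               for a, b in zip(pa, pb))

scene = [(10, 20, 30), (40, 50, 60)]
tinted = [(2 * r, 0.5 * g, 3 * b) for r, g, b in scene]  # recoloured light
# correspondence(scene, tinted) is near zero: the tint divides out
```

This is why the comparator can match images of the same face taken under differently coloured lighting, which is essential if calibration is to run continuously during normal use.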
-
-
40. Apparatus for associating data indicative of the orientation of a person's head with classifications of images of a person's head comprising:
-
a receiver operable to receive a video stream defining a sequence of images of a person's head;
a classification unit operable to assign one of a number of classifications to images in the video stream received by said receiver, said classification unit being operable to assign the same classifications to images of a person's head in substantially the same orientations; and
a calibration unit comprising:
a classification store configured to store data identifying pairs of different classifications assigned to pairs of images in a video stream received by said receiver representative of a person's head at different times separated by less than a reference time period;
a conversion store configured to store data associating each of said number of classifications with data representative of at least one head orientation; and
an update unit operable to update data stored in said conversion store by updating said data such that data for pairs of different classifications identified by data stored in said classification store identify head orientations which are closer together and data for pairs of different classifications not identified by data stored in said classification store identify head orientations which are further apart.
-
-
41. A method of associating data indicative of the orientation of a person's head with classifications of images of a person's head comprising:
-
receiving a video stream defining a sequence of images of a person's head;
assigning one of a number of classifications to images in the received video stream, wherein the same classifications are assigned to images of a person's head in substantially similar orientations;
storing pair data identifying pairs of different classifications assigned to pairs of images in a received video stream representative of a person's head at different times separated by less than a reference time period;
storing association data associating each of said number of classifications with data representative of at least one head orientation; and
updating said association data such that association data for pairs of different classifications identified by stored pair data identify head orientations which are closer together and association data for pairs of different classifications not identified by stored pair data identify head orientations which are further apart.
-
-
42. Apparatus for associating data indicative of the viewing direction of a person's eyes with classifications of images of a person's eyes comprising:
-
a receiver operable to receive a video stream defining a sequence of images of a person's eyes;
a classification unit operable to assign one of a number of classifications to images in the video stream received by said receiver, said classification unit being operable to assign the same classifications to images of a person's eyes in substantially the same viewing directions; and
a calibration unit comprising:
a classification store configured to store data identifying pairs of different classifications assigned to pairs of images in a video stream received by said receiver representative of a person's eyes at different times separated by less than a reference time period;
a conversion store configured to store data associating each of said number of classifications with data representative of at least one viewing direction; and
an update unit operable to update data stored in said conversion store by updating said data such that data for pairs of different classifications identified by data stored in said classification store identify viewing directions which are closer together and data for pairs of different classifications not identified by data stored in said classification store identify viewing directions which are further apart.
-
-
43. A method of associating data indicative of the viewing directions of a person's eyes with classifications of images of a person's eyes comprising:
-
receiving a video stream defining a sequence of images of a person's eyes;
assigning one of a number of classifications to images in the received video stream, wherein the same classifications are assigned to images of a person's eyes in substantially similar viewing directions;
storing pair data identifying pairs of different classifications assigned to pairs of images in a received video stream representative of a person's eyes at different times separated by less than a reference time period;
storing association data associating each of said number of classifications with data representative of at least one viewing direction; and
updating said association data such that association data for pairs of different classifications identified by stored pair data identify viewing directions which are closer together and association data for pairs of different classifications not identified by stored pair data identify viewing directions which are further apart.
-
Specification