EYE GAZE USER INTERFACE AND CALIBRATION METHOD
Abstract
A software-controlled user interface and calibration method for an eye gaze controlled device, designed to accommodate the angular accuracy versus time averaging tradeoffs of eye gaze direction sensors. The method scales from displaying a small number to a large number of different eye gaze target symbols at any given time, yet can still transmit a large array of different symbols to outside devices with minimal user training. At least part of the method may be implemented by way of a virtual window onto the surface of a virtual cylinder carrying eye-gaze-sensitive symbols; the cylinder can be rotated by eye gaze, bringing various groups of symbols into view, after which individual symbols are selected by continued gazing. Specific examples of the use of this interface and method on an eyeglasses-like head-mountable, vision-controlled device are disclosed, along with various operation examples including sending and receiving text messages, control of robotic devices, and control of remote vehicles.
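The abstract's "virtual window onto the surface of a virtual cylinder" can be pictured as symbols placed at fixed angles around a cylinder, with only those inside the window's angular span visible at once. The sketch below is illustrative only; the geometry, names, and 90-degree window width are assumptions, not details from the patent.

```python
# Hypothetical sketch of a virtual window onto a symbol-bearing cylinder.
# Rotating the cylinder (changing rotation_deg) brings other symbol
# groups into view, as described in the abstract.

def visible_symbols(symbols, rotation_deg, window_deg=90.0):
    """symbols: list of (symbol, angle_deg) placed around the cylinder.
    Returns the symbols whose angle falls inside the window centered
    on rotation_deg, wrapping around 360 degrees."""
    half = window_deg / 2.0
    out = []
    for sym, ang in symbols:
        # Shortest signed angular distance from the window center.
        diff = (ang - rotation_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= half:
            out.append(sym)
    return out
```

Eye-gaze-driven rotation would then simply increment `rotation_deg` while the user's gaze dwells near a window edge.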
20 Claims
1. A method of controlling a head-mountable, vision-controlled device for transmitting and receiving information from a human user with at least one eye, said device having a virtual display and at least one eye position tracking sensor, said method comprising:

displaying a plurality of visual targets on said virtual display, said plurality of visual targets each comprising a visible element embedded within an eye position zone with an area that is equal to or larger than said visible element;

using said at least one eye position tracking sensor to determine when said at least one eye is on average gazing in a direction that is within the eye position zone of at least one of said plurality of visible elements for a time period exceeding a hovering time interval, signaling when said hovering time interval has been exceeded, and, if said gaze remains within the eye position zone of said at least one of said plurality of visible elements for a time period exceeding a keypress time interval, registering that a virtual key corresponding to said at least one of said plurality of visible targets has been pressed by said user; and

using said virtual key presses to control either transmitting or receiving said information.

(Dependent claims 2-16 not reproduced here.)
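The dwell-based selection in claim 1 (hover signal after a hovering time interval, key press after a longer keypress time interval, both measured while averaged gaze stays inside a target's eye position zone) can be sketched as follows. The zone geometry, timing constants, class names, and event format are all assumptions for illustration, not taken from the patent.

```python
# Hedged sketch of claim 1's two-stage dwell selection. All values and
# names here are illustrative assumptions.

HOVER_TIME = 0.3     # seconds of dwell before the hover cue fires
KEYPRESS_TIME = 0.8  # seconds of dwell to register a virtual key press

class GazeKey:
    """A visible element embedded in a larger circular eye position zone."""
    def __init__(self, symbol, x, y, zone_radius):
        self.symbol = symbol
        self.x, self.y = x, y
        self.zone_radius = zone_radius  # zone area >= visible element

    def contains(self, gx, gy):
        return (gx - self.x) ** 2 + (gy - self.y) ** 2 <= self.zone_radius ** 2

def track(keys, gaze_samples):
    """gaze_samples: iterable of (t, x, y) time-averaged gaze fixes.
    Yields ("hover", symbol) then ("press", symbol) events."""
    current, enter_t, hovered = None, None, False
    for t, gx, gy in gaze_samples:
        hit = next((k for k in keys if k.contains(gx, gy)), None)
        if hit is not current:             # gaze moved to a different zone
            current, enter_t, hovered = hit, t, False
            continue
        if current is None:
            continue
        dwell = t - enter_t
        if not hovered and dwell >= HOVER_TIME:
            hovered = True                 # signal hover, e.g. highlight key
            yield ("hover", current.symbol)
        if dwell >= KEYPRESS_TIME:
            yield ("press", current.symbol)
            enter_t, hovered = t, False    # re-arm for repeat presses
```

Because the zone is larger than the visible element, the averaged gaze fix may wander off the symbol itself without canceling the dwell, which is how the claim accommodates sensor accuracy limits.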
17. A method of calibrating the eye position tracking sensor of a head-mountable, vision-controlled device for transmitting and receiving information from a human user with at least one eye, said device having a virtual display and at least one eye position tracking sensor, said method comprising:

displaying, over a plurality of time sequences, different visual eye calibration targets on said virtual display, each different visual eye calibration target being displayed at a known location on said virtual display, and each comprising a visible element embedded within an eye position zone with an area that is equal to or larger than said visible element;

for at least some of said time sequences, instructing said user to gaze at said different visual eye calibration targets, and determining each time when said at least one eye of said user is gazing at each of said plurality of visual eye calibration targets;

for said each time, using said at least one eye position tracking sensor to obtain eye calibration data pertaining to the appearance of said at least one eye during said each time, and recording said eye calibration data in memory; and

using said eye calibration data as a calibration reference to subsequently determine the eye gaze direction of said at least one eye when said at least one eye is gazing at one or more visual targets on said virtual display.

(Dependent claims 18-20 not reproduced here.)
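The calibration flow of claim 17 (record raw sensor readings while the user fixates targets at known display locations, then use that data as a reference for later gaze estimation) can be sketched with a simple per-axis linear model fit by least squares. The linear gain-and-offset model and every name below are illustrative assumptions; a real system might use a richer mapping.

```python
# Hypothetical sketch of claim 17's calibration: fit a per-axis linear
# map from raw sensor coordinates to known display coordinates, then
# return it as the calibration reference for later gaze estimation.

def fit_axis(raw, known):
    """Least-squares fit of known ~ gain * raw + offset for one axis."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_k = sum(known) / n
    cov = sum((r - mean_r) * (k - mean_k) for r, k in zip(raw, known))
    var = sum((r - mean_r) ** 2 for r in raw)
    gain = cov / var
    return gain, mean_k - gain * mean_r

def calibrate(samples):
    """samples: list of ((raw_x, raw_y), (disp_x, disp_y)) pairs recorded
    while the user gazed at each calibration target."""
    raws, knowns = zip(*samples)
    gx, ox = fit_axis([r[0] for r in raws], [k[0] for k in knowns])
    gy, oy = fit_axis([r[1] for r in raws], [k[1] for k in knowns])
    def to_display(raw_x, raw_y):   # the stored calibration reference
        return gx * raw_x + ox, gy * raw_y + oy
    return to_display
```

Each calibration target's known location supplies one `(raw, display)` pair; the more targets shown over the plurality of time sequences, the better conditioned the fit.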
Specification