Apparatus and method for eye tracking interface
Abstract
An eye tracking interface system for generating communication and control functions as a result of pre-defined eye gestures is disclosed. The system includes a detecting device adapted to detect bio-electromagnetic signals generated by eye movements. A first processor receives the detected bio-electromagnetic signals and generates tokens corresponding to said pre-defined eye gestures. A second processor receives the tokens and generates command signals based on a protocol correlating tokens to desired command signals. Thereafter, a user interface responds to said command signals and provides communication and control functions.
70 Claims
1. An eye tracking interface system for controlling task-performing functions comprising:

a detecting device adapted to detect bio-electromagnetic signals generated by eye movements;

a first processor adapted to receive said detected bio-electromagnetic signals and, in response thereto, assigning tokens from a set of symbolic tokens corresponding to eye movement classifications, thereby producing a plurality of tokens representative of said bio-electromagnetic signals;

a second processor adapted to receive said plurality of tokens, recognize patterns of tokens, and generate command signals based on said recognized patterns;

where said command signals control task-performing functions.

Dependent claims: 2–41.
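The first processor's role in claim 1 can be sketched as a classifier that maps a detected eye-movement sample to a symbolic token. This is a minimal illustration only: the token names, the displacement inputs, and the threshold are assumptions, not values from the patent.

```python
# Hypothetical first-processor stage: classify one eye-movement sample
# (horizontal/vertical displacement) and assign a symbolic token.
# Token names and threshold are illustrative assumptions.

def assign_token(dx, dy, threshold=1.0):
    """Map a horizontal/vertical displacement to a symbolic token."""
    if abs(dx) < threshold and abs(dy) < threshold:
        return "FIXATE"
    if abs(dx) >= abs(dy):
        return "RIGHT" if dx > 0 else "LEFT"
    return "UP" if dy > 0 else "DOWN"

movements = [(-3.0, 0.0), (3.0, 0.5), (0.2, 0.1), (0.0, 2.0)]
tokens = [assign_token(dx, dy) for dx, dy in movements]
print(tokens)  # ['LEFT', 'RIGHT', 'FIXATE', 'UP']
```

A second stage would then recognize patterns in this token stream and emit command signals, which is what separates the two processors in the claim.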
42. In a system having a plurality of channels, each channel detecting electro-oculograph (EOG) signals corresponding to horizontal and vertical movements of the eye, a signal processor comprising:

a filter responsive to said EOG signals to provide filtered EOG signals;

a position estimator responsive to said filtered EOG signals that uses an estimated position value to produce first smoothed EOG signals;

a slope estimator responsive to said first smoothed EOG signals that uses an estimated velocity value to produce second smoothed EOG signals;

said signal processor when in a calibration mode, classifying information from at least said second smoothed EOG signals by determining a set of one or more parameters corresponding to a given eye movement, where said set of parameters corresponds to a token; and

said signal processor when in interaction mode, assigning a token from a plurality of tokens when EOG signals resulting from eye movements correspond to the classification attributed to said assigned token.

Dependent claims: 43–46.
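The cascade of a position estimator and a slope estimator in claim 42 resembles a classic alpha-beta tracking filter, where a position estimate and a velocity estimate are jointly updated from each new sample. The sketch below is that analogy only; the gains, sample interval, and test signal are illustrative assumptions, not parameters from the patent.

```python
# Alpha-beta smoother as an illustration of the claimed position
# estimator (first smoothed signal) and slope/velocity estimator
# (second smoothed signal). Gains and dt are assumed values.

def alpha_beta_smooth(samples, dt=1.0, alpha=0.5, beta=0.1):
    """Return smoothed position and velocity estimates per sample."""
    x, v = samples[0], 0.0         # estimated position and velocity
    positions, velocities = [], []
    for z in samples[1:]:
        x_pred = x + v * dt        # predict position from current estimates
        r = z - x_pred             # innovation: measurement minus prediction
        x = x_pred + alpha * r     # first smoothed signal (position)
        v = v + (beta / dt) * r    # second smoothed signal (slope/velocity)
        positions.append(x)
        velocities.append(v)
    return positions, velocities

# A clean ramp with slope 1.0 per sample: the velocity estimate should
# converge toward 1.0 and the position estimate should track the ramp.
ramp = [float(i) for i in range(60)]
pos, vel = alpha_beta_smooth(ramp)
```

The point of the two-stage structure is that saccades (step-like position changes with high slope) and smooth pursuit (sustained low slope) produce distinguishable signatures in the two smoothed outputs, which the calibration mode can then parameterize into token classes.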
47. A method for providing communication or control functions as a result of eye movements, comprising the steps of:

detecting bio-electromagnetic signals generated by eye movements;

processing said detected bio-electromagnetic signals by assigning, in response to said detected bio-electromagnetic signals, symbolic tokens corresponding to eye movement classifications, thereby generating tokens corresponding to said bio-electromagnetic signals;

generating command signals based on a protocol correlating patterns of tokens to a desired command signal; and

providing communication or control functions in response to said command signals.

Dependent claims: 48–61.
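The "protocol correlating patterns of tokens to a desired command signal" in claim 47 can be sketched as a lookup table of token sequences. The specific patterns and command names below are illustrative assumptions.

```python
# Hypothetical protocol: each known token pattern maps to one command
# signal; the token stream is scanned for matches. Patterns and
# command names are assumed, not taken from the patent.

PROTOCOL = {
    ("LEFT", "RIGHT", "LEFT"): "SELECT",
    ("UP", "UP"): "SCROLL_UP",
    ("DOWN", "DOWN"): "SCROLL_DOWN",
}

def generate_commands(tokens):
    """Emit a command for every protocol pattern found in the stream."""
    commands = []
    for i in range(len(tokens)):
        for pattern, command in PROTOCOL.items():
            if tuple(tokens[i:i + len(pattern)]) == pattern:
                commands.append(command)
    return commands

stream = ["LEFT", "RIGHT", "LEFT", "UP", "UP"]
print(generate_commands(stream))  # ['SELECT', 'SCROLL_UP']
```

Keeping the protocol as data rather than code is one natural reading of the claim: the same token stream can drive different command sets by swapping the correlation table.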
62. In a system having a plurality of channels, each channel detecting electro-oculograph (EOG) signals corresponding to horizontal and vertical movements of each eye such that said eye movements can provide control and communication, a signal processing method comprising the steps of:

filtering said EOG signals;

processing said EOG signals using an estimated position value;

processing said EOG signals using an estimated velocity value;

classifying said filtered and processed EOG signals by determining a set of one or more parameters corresponding to a given eye movement, where said set of parameters corresponds to a token, such that a plurality of tokens are classified during a calibration mode; and

assigning a token from said plurality of tokens when EOG signals resulting from eye movements fall within the classification attributed to the selected token during interaction mode.

Dependent claim: 63.
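The calibration/interaction split in claim 62 can be illustrated as: calibration derives one parameter set per token from labeled samples, and interaction assigns a token when a new measurement falls within a class. The statistics used (mean amplitude plus a tolerance band) and all values are illustrative assumptions.

```python
# Hypothetical calibration: derive a (mean, tolerance) parameter set
# per token from user-provided samples; interaction mode then assigns
# a token when a measurement falls within a class. All values assumed.

from statistics import mean, pstdev

def calibrate(samples_by_token, k=3.0):
    """Calibration mode: one (mean, tolerance) parameter set per token."""
    classes = {}
    for token, samples in samples_by_token.items():
        m = mean(samples)
        tol = max(k * pstdev(samples), 0.1)  # floor on band width, assumed
        classes[token] = (m, tol)
    return classes

def assign(value, classes):
    """Interaction mode: the token whose class contains value, else None."""
    for token, (m, tol) in classes.items():
        if abs(value - m) <= tol:
            return token
    return None

# Assumed calibration amplitudes (e.g. microvolts) per eye movement.
calib = {"LEFT": [-5.1, -4.9, -5.0], "RIGHT": [4.8, 5.2, 5.0]}
classes = calibrate(calib)
print(assign(5.05, classes))  # RIGHT
print(assign(0.0, classes))   # None
```

Returning `None` for out-of-class measurements matches the claim's logic: a token is assigned only when the signal falls within a calibrated classification.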
64. A method for determining the 3D position and velocity of a gaze point comprising the steps of:

detecting signals associated with eye movements from each eye;

processing said detected signals from each eye using triangulation to determine the gaze point in 3D and the time rate of change of said gaze point.

Dependent claim: 65.
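Triangulating a 3D gaze point from two eyes amounts to intersecting two gaze rays; since measured rays rarely intersect exactly, a standard approach is the midpoint of the shortest segment between them, with velocity obtained by differencing successive gaze points. The eye positions and gaze directions below are illustrative assumptions.

```python
# Sketch of claim 64: midpoint-of-closest-approach triangulation of
# two per-eye gaze rays, plus finite-difference gaze velocity.
# Eye separation and gaze directions are assumed example values.

def sub(u, v): return tuple(a - b for a, b in zip(u, v))
def add(u, v): return tuple(a + b for a, b in zip(u, v))
def scale(u, s): return tuple(a * s for a in u)
def dot(u, v): return sum(a * b for a, b in zip(u, v))

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1+t*d1 and p2+t*d2."""
    w = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b              # near zero when rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = add(p1, scale(d1, t1))        # closest point on ray 1
    q2 = add(p2, scale(d2, t2))        # closest point on ray 2
    return scale(add(q1, q2), 0.5)

def gaze_velocity(g_prev, g_now, dt):
    """Time rate of change of the gaze point by finite difference."""
    return scale(sub(g_now, g_prev), 1.0 / dt)

# Eyes 6 cm apart, both rays aimed at the point (0, 0, 100).
left, right = (-3.0, 0.0, 0.0), (3.0, 0.0, 0.0)
g = triangulate(left, (3.0, 0.0, 100.0), right, (-3.0, 0.0, 100.0))
print(g)  # (0.0, 0.0, 100.0)
```

The degenerate case (near-parallel rays, i.e. gaze at great distance) makes `denom` vanish; a practical implementation would guard that case, which is also what the disengagement idea in claim 66 exploits.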
66. A method for disengaging an eye tracking interface comprising the steps of:

detecting signals associated with eye movements;

calibrating to determine a normative fixated distance for a user's gaze at an object;

processing said detected signals to determine the distance of said user's fixated gaze during an interaction period;

causing the eye tracking interface to disengage when said distance differs from said normative fixated distance.
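The disengagement rule in claim 66 reduces to a simple comparison once a normative fixation distance has been calibrated. The tolerance value below is an illustrative assumption; the patent specifies only that the distances "differ".

```python
# Hypothetical disengagement check for claim 66: disengage when the
# measured fixation distance drifts outside an assumed tolerance of
# the calibrated normative distance.

def should_disengage(measured_cm, normative_cm, tolerance_cm=10.0):
    """True when the gaze distance differs enough from the calibrated norm."""
    return abs(measured_cm - normative_cm) > tolerance_cm

normative = 60.0                           # from a calibration step, assumed
print(should_disengage(63.0, normative))   # False: still engaged
print(should_disengage(120.0, normative))  # True: user looked past the display
```

The design rationale is that a user looking past the interaction surface (e.g. at the room beyond a screen) changes vergence distance markedly, giving a natural "hands-off" signal without an explicit gesture.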
67. A method for electro-oculographic (EOG) eye tracking or control while a user is looking through an optical device comprising:

calibrating an EOG eye tracking system by a user providing predetermined eye movements with reference to calibrating points in said user's field of view as said user looks through said optical device;

generating parameter classifications based on said user-provided eye movements, so that during an interaction mode, actual eye movements are detected and tokens corresponding to said classifications are generated.

Dependent claims: 68 and 69.
70. A method for electro-oculographic (EOG) eye tracking or control while a user is looking through an optical device comprising:

calibrating an EOG eye tracking system with the naked eye to determine naked eye calibration parameters;

transforming said naked eye calibration parameters using the optical characteristics of said optical device;

where said transformed calibration parameters are used for eye tracking or control with said optical device.
Specification