Eye tracking as a method to improve the user interface
Abstract
The present disclosure is directed to a method for managing a user interface. The method may include the step of detecting a gaze of a user within a display. The method also includes the step of correlating the gaze of the user to an item displayed on the display. A further step of the method entails receiving an input from the user related to the item.
20 Claims
1. A method for managing a user interface, comprising:

detecting a gaze of a user within a display via an eye tracking sensor, the eye tracking sensor measuring eye movement or eye motion;

correlating the gaze of the user to a first location associated with an item displayed on the display;

receiving an input from the user related to adjusting the item relative to a second location or adjusting a function associated with the item relative to the second location; and

communicating the input to at least one of an on-board aircraft communication system and an off-board aircraft communication system, wherein the adjusting of the item or the adjusting of the function is only performed if the input and the gaze correspond to the second location associated with the item displayed on the display.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11.
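The gate in claim 1 — perform the adjustment only when both the input and the gaze correspond to the target location — can be sketched as a small function. Everything below (the function name, rectangular screen regions, pixel coordinates) is an illustrative assumption, not the patent's implementation:

```python
def gaze_gated_adjust(gaze_xy, target_region, adjust):
    """Run adjust() only when the user's gaze falls inside target_region.

    gaze_xy: (x, y) display coordinates reported by the eye tracking sensor.
    target_region: (left, top, right, bottom) of the "second location".
    adjust: callback performing the item or function adjustment.
    Returns True if the adjustment was performed, False otherwise.
    """
    x, y = gaze_xy
    left, top, right, bottom = target_region
    if left <= x <= right and top <= y <= bottom:
        adjust()  # in the claim, the input is also routed to an aircraft
        return True  # communication system; that step is omitted here
    return False
```

In a real cockpit system the callback would also forward the confirmed input to the on-board or off-board communication system; the sketch only shows the gaze check itself.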
12. A user interface system, comprising:

an eye tracking sensor, the eye tracking sensor configured to detect a gaze of a user viewing a display and to measure eye movement or eye motion;

a processor in communication with the eye tracking sensor and in communication with at least one of an on-board aircraft communication system and an off-board aircraft communication system, the processor configured to receive the gaze of the user, the processor further configured to correlate the gaze of the user to a first area associated with an item displayed on the display; and

an input device, the input device in communication with the processor, the input device configured to receive an input from the user related to adjusting the item relative to a second area or adjusting a function associated with the item relative to the second area, wherein the adjusting the item or the adjusting the function associated with the item is only performed by the processor if the input and the gaze of the user correspond to the second area associated with the item.

Dependent claims: 13, 14, 15, 16, 17.
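Claim 12 describes three cooperating components: a sensor, a processor, and an input device. A minimal object model, with all class and method names assumed for illustration only, might look like:

```python
class EyeTrackingSensor:
    """Detects the user's gaze and measures eye movement (claim 12)."""

    def __init__(self):
        self._gaze = (0.0, 0.0)

    def report(self, x, y):
        # In practice this would be driven by camera frames from the tracker.
        self._gaze = (x, y)

    def gaze(self):
        return self._gaze


class Processor:
    """Correlates the gaze to display areas and gates incoming inputs."""

    def __init__(self, sensor, items):
        self.sensor = sensor
        self.items = items  # item name -> (left, top, right, bottom) area

    def area_under_gaze(self):
        """Return the name of the item whose area contains the gaze, if any."""
        x, y = self.sensor.gaze()
        for name, (left, top, right, bottom) in self.items.items():
            if left <= x <= right and top <= y <= bottom:
                return name
        return None

    def handle_input(self, target_item, adjust):
        """Perform the adjustment only when gaze and input agree on the area."""
        if self.area_under_gaze() == target_item:
            adjust(target_item)
            return True
        return False
```

The input device of claim 12 is represented here only by the `target_item` argument passed to `handle_input`; a fuller sketch would wrap it in its own class that forwards user inputs to the processor.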
18. An input processing method, comprising:

receiving a voice command from a user related to an item displayed on a display;

detecting a gaze of the user within the display;

measuring an eye movement;

correlating the gaze of the user to a first location associated with the item displayed on the display according to the eye movement; and

confirming the voice command and (i) adjusting the item relative to a second location or (ii) adjusting a function associated with the item relative to the second location when the second location associated with the item and the gaze correspond to the voice command.

Dependent claims: 19, 20.
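Claim 18 confirms a voice command by checking that the user is looking at the item the command names before adjusting it. A hedged sketch of that confirmation step, with the function name, region layout, and callback all assumed for illustration:

```python
def confirm_voice_command(command_item, gaze_xy, item_regions, adjust):
    """Confirm a voice command against the user's gaze (claim 18 sketch).

    command_item: item name recognized from the voice command.
    gaze_xy: (x, y) gaze coordinates correlated from the eye movement.
    item_regions: mapping of item name -> (left, top, right, bottom) region.
    adjust: callback that performs the item or function adjustment.
    Returns True only when the gaze falls on the commanded item's region.
    """
    region = item_regions.get(command_item)
    if region is None:
        return False  # the command names no known item on the display
    left, top, right, bottom = region
    x, y = gaze_xy
    if left <= x <= right and top <= y <= bottom:
        adjust(command_item)
        return True
    return False
```

Using gaze as the confirmation channel means a misrecognized command (or one spoken about an item the user is not looking at) is silently rejected rather than acted on, which is the safety property the claim's wording appears to target.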
Specification