SYSTEMS AND METHODS FOR BIOMECHANICALLY-BASED EYE SIGNALS FOR INTERACTING WITH REAL AND VIRTUAL OBJECTS
Abstract
Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system can be included within unobtrusive headwear that performs eye tracking and controls screen display. The system can also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
40 Claims
1. A method for providing a graphical user interface to determine intent of a user based at least in part on movement of the user's eye or eyes relative to an electronic display using a detector, comprising:
identifying when the gaze of at least one eye is directed at an object on the display;
identifying a saccadic movement of the at least one eye from the object towards a target location including a first icon on the display corresponding to an action;
confirming that the gaze of the at least one eye fixates at the target location at completion of the saccadic movement; and
performing the action on the object.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)
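The sequence recited in claim 1 (gaze on object, saccade toward an icon, fixation at the icon, then action) can be sketched as a small state machine over a stream of gaze samples. This is a minimal illustration, not the patent's implementation: the velocity threshold, fixation radius, sample rate, and pixel-coordinate gaze format are all assumptions.

```python
import math

# Assumed parameters (not specified in the claims):
SACCADE_VELOCITY_PX_S = 1000.0  # gaze speed above this counts as a saccade
FIXATION_RADIUS_PX = 40.0       # gaze within this radius of a point counts as "on" it
FIXATION_MIN_SAMPLES = 3        # consecutive on-target samples to confirm fixation

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def detect_selection(samples, object_pos, icon_pos, dt=1.0 / 120):
    """Run the claim-1 sequence over (x, y) gaze samples taken at interval dt.
    Returns True once the full sequence is observed:
    gaze on object -> saccade toward the icon -> fixation at the icon."""
    state = "await_gaze_on_object"
    settled = 0
    prev = None
    for p in samples:
        if state == "await_gaze_on_object":
            if _dist(p, object_pos) <= FIXATION_RADIUS_PX:
                state = "await_saccade"
        elif state == "await_saccade" and prev is not None:
            velocity = _dist(p, prev) / dt                    # crude px/s estimate
            toward_icon = _dist(p, icon_pos) < _dist(prev, icon_pos)
            if velocity > SACCADE_VELOCITY_PX_S and toward_icon:
                state = "await_fixation"
        elif state == "await_fixation":
            if _dist(p, icon_pos) <= FIXATION_RADIUS_PX:
                settled += 1
                if settled >= FIXATION_MIN_SAMPLES:
                    return True   # the action on the object would fire here
            else:
                settled = 0       # gaze drifted off target; restart the dwell count
        prev = p
    return False
```

Requiring a post-saccade fixation before acting is what distinguishes a deliberate eye signal from an incidental glance passing over the icon.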
15. A system for providing a graphical user interface to determine intent of a user based at least in part on movement of the user's eye or eyes, comprising:
a detector configured to monitor movement of at least one eye of the user;
an electronic display operatively associated with the detector;
a processing unit operatively coupled to the detector and electronic display to:
identify when the gaze of the at least one eye is directed at an object on the display;
identify a saccadic movement of the at least one eye from the object towards a target location including a first icon on the display corresponding to an action;
confirm that the gaze of the at least one eye fixates at the target location at completion of the saccadic movement; and
perform the action on the object.
- View Dependent Claims (16)
17-35. (canceled)
36. A system for providing a graphical user interface to determine intent of a user based on movement of the user's eye or eyes, comprising:
a detector configured to monitor movement of at least one eye of a user;
an electronic display operatively associated with the detector;
a processing unit operatively coupled to the detector and electronic display to:
identify a first saccadic movement of the at least one eye towards a first target location including a first icon on the display corresponding to an action;
replace the first icon on the display with a plurality of second icons at a plurality of second target locations different than the first target location;
confirm that the gaze of the at least one eye fixates at the first target location after the first saccadic movement; and
thereafter monitor the at least one eye to identify whether the at least one eye performs a second saccadic movement towards one of the plurality of second target locations.
- View Dependent Claims (37, 38, 39)
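Claim 36 describes a two-stage menu: a first saccade to an icon replaces that icon with several second icons, and a later second saccade selects among them. The sketch below illustrates only that state change; the icon names, positions, and landing radius are hypothetical, and the claim's explicit fixation confirmation at the first target location is folded into the landing test for brevity.

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

class EyeMenu:
    """Illustrative two-stage eye-signal menu in the style of claim 36."""

    LANDING_RADIUS_PX = 40.0  # assumed tolerance around each icon

    def __init__(self, first_icon_pos, second_icons):
        self.first_icon_pos = first_icon_pos
        self.second_icons = dict(second_icons)   # name -> (x, y) screen position
        self.expanded = False                    # True once second icons are shown

    def on_saccade_to(self, landing):
        """Feed the landing point of each detected saccade.
        Returns the name of the selected second icon, or None."""
        if not self.expanded:
            if _dist(landing, self.first_icon_pos) <= self.LANDING_RADIUS_PX:
                # First saccade landed on the first icon: replace it with
                # the plurality of second icons at their own locations.
                self.expanded = True
            return None
        for name, pos in self.second_icons.items():
            if _dist(landing, pos) <= self.LANDING_RADIUS_PX:
                return name   # second saccade landed on this second icon
        return None
```

Placing the second icons at locations different from the first forces a distinct second saccade, which again separates an intentional selection from a lingering gaze.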
40-61. (canceled)
Specification