Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects
First Claim
1. A method for providing a graphical user interface based at least in part on movement of a user's eye or eyes relative to an electronic display using a detector, the method comprising:
identifying, by a processing unit, when a gaze of at least one eye is directed at an object on the display;
identifying, by the processing unit, an initiation of a change in the gaze based on a saccadic movement of the at least one eye away from the object and towards a target location including a first icon on the display corresponding to an action, wherein identifying the saccadic movement comprises determining a velocity and direction of the eye movement and, based on the velocity and direction of eye movement, predicting that the destination for the eye movement is the target location;
confirming, by the processing unit, that the gaze of the at least one eye fixates at the target location at completion of the saccadic movement;
performing, by the processing unit, the action on the object; and
replacing, by the processing unit, the first icon with a second icon corresponding to the first icon and overlapping a portion of the display previously occupied by the first icon.
Abstract
Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system can be included within unobtrusive headwear that performs eye tracking and controls screen display. The system can also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
17 Claims
1. A method for providing a graphical user interface based at least in part on movement of a user's eye or eyes relative to an electronic display using a detector, the method comprising:
identifying, by a processing unit, when a gaze of at least one eye is directed at an object on the display;
identifying, by the processing unit, an initiation of a change in the gaze based on a saccadic movement of the at least one eye away from the object and towards a target location including a first icon on the display corresponding to an action, wherein identifying the saccadic movement comprises determining a velocity and direction of the eye movement and, based on the velocity and direction of eye movement, predicting that the destination for the eye movement is the target location;
confirming, by the processing unit, that the gaze of the at least one eye fixates at the target location at completion of the saccadic movement;
performing, by the processing unit, the action on the object; and
replacing, by the processing unit, the first icon with a second icon corresponding to the first icon and overlapping a portion of the display previously occupied by the first icon.
Dependent claims: 2-11.
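The saccade-identification step of claim 1 (determining the velocity and direction of the eye movement, then predicting its destination) can be illustrated with a minimal sketch. The velocity threshold and the main-sequence constant relating peak velocity to amplitude are illustrative assumptions for this example, not values taken from the patent.

```python
import math

# Illustrative constants (assumptions, not from the patent).
VELOCITY_THRESHOLD = 30.0   # deg/s: above this, movement is treated as saccadic
MAIN_SEQ_SLOPE = 0.022      # deg of amplitude per deg/s of peak velocity (rough)

def detect_saccade(samples, dt):
    """Return (onset_index, peak_velocity, unit_direction) or None.

    samples: list of (x, y) gaze positions in degrees; dt: sample period in s.
    """
    onset = None
    peak_v = 0.0
    direction = (0.0, 0.0)
    for i in range(1, len(samples)):
        dx = samples[i][0] - samples[i - 1][0]
        dy = samples[i][1] - samples[i - 1][1]
        v = math.hypot(dx, dy) / dt
        if v > VELOCITY_THRESHOLD:
            if onset is None:
                onset = i - 1            # first above-threshold sample pair
            if v > peak_v:
                peak_v = v
                norm = math.hypot(dx, dy)
                direction = (dx / norm, dy / norm)
    if onset is None:
        return None
    return onset, peak_v, direction

def predict_landing(samples, dt):
    """Predict the saccade's destination from its onset position, direction,
    and an amplitude estimated from peak velocity (main-sequence relation).
    A UI would test whether this point falls within the target icon's bounds."""
    hit = detect_saccade(samples, dt)
    if hit is None:
        return None
    onset, peak_v, (ux, uy) = hit
    amplitude = MAIN_SEQ_SLOPE * peak_v
    x0, y0 = samples[onset]
    return (x0 + amplitude * ux, y0 + amplitude * uy)
```

Predicting the landing point before the saccade completes is what lets the system act on the target icon immediately once fixation there is confirmed, rather than waiting out a dwell period.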
12. A system for providing a graphical user interface based at least in part on movement of a user's eye or eyes, the system comprising:
a detector configured to monitor movement of at least one eye of the user;
an electronic display operatively associated with the detector; and
a processing unit operatively coupled to the detector and electronic display to:
identify when a gaze of the at least one eye is directed at an object on the electronic display;
identify an initiation of a change in the gaze based on a saccadic movement of the at least one eye away from the object and towards a target location including a first icon on the display corresponding to an action, wherein identifying the saccadic movement further comprises determining a velocity and direction of the eye movement and, based on the velocity and direction of eye movement, predicting that the destination for the eye movement is the target location;
confirm that the gaze of the at least one eye fixates at the target location at completion of the saccadic movement;
perform the action on the object; and
replace the first icon with a second icon corresponding to the first icon and overlapping a portion of the display previously occupied by the first icon.
Dependent claims: 13.
14. A system for providing a graphical user interface based on movement of a user's eye or eyes, the system comprising:
a detector configured to monitor movement of at least one eye of a user;
an electronic display operatively associated with the detector; and
a processing unit operatively coupled to the detector and electronic display to:
identify an initiation of a change in a gaze based on a first saccadic movement of the at least one eye away from an object and towards a first target location including a first icon on the display, wherein identifying the saccadic movement further comprises determining a velocity and direction of the eye movement and, based on the velocity and direction of eye movement, predicting that the destination for the eye movement is the first target location;
replace the first icon on the display with a plurality of second icons at a plurality of second target locations different than the first target location;
replace the first icon with a third icon overlapping a portion of the display previously occupied by the first icon;
confirm that the gaze of the at least one eye fixates at the first target location after the first saccadic movement; and
thereafter monitor the at least one eye to identify whether the at least one eye performs a second saccadic movement towards one of the plurality of second target locations.
Dependent claims: 15-17.
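The two-stage interaction of claim 14 (a first saccade to a menu icon, replacement of that icon with several second icons, then a second saccade that selects one of them) can be sketched as a small state machine. The class, method names, and fixation tolerance below are hypothetical; display updates (replacing icons on screen) are noted only in comments.

```python
import math
from enum import Enum, auto

class State(Enum):
    AWAIT_FIRST_FIXATION = auto()
    AWAIT_SECOND_SACCADE = auto()
    DONE = auto()

class TwoSaccadeSelector:
    """Hypothetical controller for the claim-14 sequence: confirm fixation at
    the first icon, then watch for a second saccade to one of the second icons."""

    def __init__(self, first_target, second_targets, radius=1.0):
        self.first_target = first_target      # (x, y) of the first icon, deg
        self.second_targets = second_targets  # list of (x, y) second-icon centers
        self.radius = radius                  # fixation tolerance, deg (assumed)
        self.state = State.AWAIT_FIRST_FIXATION
        self.selection = None                 # index of the chosen second icon

    def _near(self, gaze, target):
        return math.hypot(gaze[0] - target[0], gaze[1] - target[1]) <= self.radius

    def on_fixation(self, gaze):
        """Feed each confirmed fixation position after a detected saccade."""
        if self.state is State.AWAIT_FIRST_FIXATION:
            if self._near(gaze, self.first_target):
                # Here the display would swap the first icon for the
                # plurality of second icons (and the overlapping third icon).
                self.state = State.AWAIT_SECOND_SACCADE
        elif self.state is State.AWAIT_SECOND_SACCADE:
            for idx, target in enumerate(self.second_targets):
                if self._near(gaze, target):
                    self.selection = idx
                    self.state = State.DONE
                    break
        return self.state
```

Keeping the selection logic as an explicit state machine mirrors the claim's ordering: the second icons only become selectable after fixation at the first target location is confirmed.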
Specification