SYSTEMS AND METHODS FOR BIOMECHANICALLY-BASED EYE SIGNALS FOR INTERACTING WITH REAL AND VIRTUAL OBJECTS
Abstract
Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system can be included within unobtrusive headwear that performs eye tracking and controls screen display. The system can also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
Claims

1-30. (canceled)
31. A method for providing a graphical user interface to determine intent of a user based at least in part on movement of the user's one or both eyes using a detector, comprising:

identifying, with the detector, when the user's one or both eyes are directed at an object at an object location;

identifying, with the detector, a first saccade of the user's one or both eyes from the object location towards a target at a target location;

identifying, with the detector, one or more corrective saccades of the user's one or both eyes moving closer towards the target location;

confirming, with the detector, that at least one of the corrective saccades is completed within a pre-determined distance from the target location; and

performing an action related to one or more of the object, the object location, the target, and the target location.

Dependent claims: 32, 33.
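As a concrete illustration of the selection sequence in claim 31, the sketch below checks that gaze starts on the object, that the first saccade and each corrective saccade land progressively closer to the target, and that the final landing falls within a pre-determined distance. The `TARGET_RADIUS` value and the `(x, y)`-in-degrees representation of saccade landing points are illustrative assumptions, not specifics from the patent:

```python
import math

TARGET_RADIUS = 1.5  # deg; stands in for the claimed "pre-determined distance"

def distance(a, b):
    """Euclidean distance between two (x, y) gaze points in degrees."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def detect_selection(landings, object_pos, target_pos):
    """Return True when gaze begins on the object and, after a saccade plus
    zero or more corrective saccades, settles within TARGET_RADIUS of the
    target.

    `landings` is a list of (x, y) saccade endpoints in degrees, beginning
    with the gaze point on the viewed object.
    """
    if distance(landings[0], object_pos) > TARGET_RADIUS:
        return False  # gaze was not directed at the object to begin with
    prev = landings[0]
    for landing in landings[1:]:
        if distance(landing, target_pos) >= distance(prev, target_pos):
            return False  # each (corrective) saccade must move closer
        prev = landing
    # At least the last saccade must complete within the pre-determined distance.
    return distance(prev, target_pos) <= TARGET_RADIUS
```

In practice the landing points would come from an eye-tracking pipeline that segments raw gaze samples into saccades (for example, by thresholding angular velocity); only the confirmation logic is sketched here.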
34. A method for providing a graphical user interface to determine intent of a user based at least in part on movement of the user's one or both eyes using a detector, comprising:

identifying, with the detector, when the user's one or both eyes are directed at a first object at a first object location;

identifying, with the detector, a saccade of the user's one or both eyes from the first object location towards a target object at a target location on a display;

confirming, with the detector, that one of a predicted landing location and an actual landing location of the saccade is within a pre-determined distance from the target location; and

performing an action that includes replacing the target object with an icon at the target location related to one of the first object and the first object location.

Dependent claims: 35-42.
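Claim 34 allows the landing location to be *predicted* before the saccade completes. One common way to do this (not necessarily the method claimed) exploits the "main sequence" relationship between a saccade's peak angular velocity and its amplitude. The slope constant, threshold, and function names below are illustrative assumptions:

```python
import math

# Main-sequence fit: amplitude (deg) ~ peak_velocity / slope.
# The slope value is an illustrative assumption, not from the patent.
MAIN_SEQUENCE_SLOPE = 55.0  # (deg/s) of peak velocity per deg of amplitude
TARGET_RADIUS = 2.0         # deg; the claimed "pre-determined distance"

def predict_landing(start, direction, peak_velocity):
    """Predict a saccade's landing point from its start point (x, y in deg),
    unit direction vector, and peak angular velocity (deg/s)."""
    amplitude = peak_velocity / MAIN_SEQUENCE_SLOPE
    return (start[0] + amplitude * direction[0],
            start[1] + amplitude * direction[1])

def should_replace_with_icon(start, direction, peak_velocity, target_pos):
    """True when the predicted landing falls within TARGET_RADIUS of the
    target, i.e. when the display may swap the target object for an icon."""
    landing = predict_landing(start, direction, peak_velocity)
    return math.hypot(landing[0] - target_pos[0],
                      landing[1] - target_pos[1]) <= TARGET_RADIUS
```

Predicting the landing mid-flight lets the display change content during saccadic suppression, so the icon is already in place when the eyes arrive.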
43. A method for providing a graphical user interface to determine intent of a user based at least in part on movement of the user's head and one or both of the user's eyes using a head movement detector and an eye movement detector, comprising:

identifying, with the head movement detector, when the user's head moves at a head velocity;

identifying at substantially the same time, with the eye movement detector, when the one or both of the user's eyes move at an eye velocity;

identifying, based at least in part on the head velocity and the eye velocity, a vestibulo-ocular movement of the one or both of the user's eyes;

confirming that the one or both of the user's eyes are directed at a viewed object at a viewed object location based at least in part on the identified vestibulo-ocular movement of the one or both of the user's eyes; and

performing an action related to one or more of the viewed object and the viewed object location.

Dependent claims: 44-51.
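During a vestibulo-ocular movement the eyes counter-rotate to stabilize gaze on a viewed object, so eye velocity is roughly equal and opposite to head velocity (a gain near 1). A minimal classifier along the lines of claim 43 could compare the two velocities per sample; all threshold values here are illustrative assumptions:

```python
def is_vestibulo_ocular(head_velocity, eye_velocity,
                        min_head_speed=5.0, gain_tolerance=0.25):
    """Classify a concurrent head/eye velocity sample (deg/s, same axis)
    as a vestibulo-ocular movement (VOR).

    During VOR the eye counter-rotates against the head, so the gain
    (-eye_velocity / head_velocity) sits near 1. The minimum head speed
    and gain tolerance are illustrative assumptions, not from the patent.
    """
    if abs(head_velocity) < min_head_speed:
        return False  # head is not moving enough to drive VOR
    gain = -eye_velocity / head_velocity  # positive when counter-rotating
    return abs(gain - 1.0) <= gain_tolerance
```

When a sample is classified as VOR, gaze is stable in world coordinates, which is what lets the system confirm that the eyes remain directed at the viewed object while the head moves.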
52. A method for providing a graphical user interface to determine intent of a user based at least in part on movements of the user's head and one or both of the user's eyes using a head movement detector and an eye movement detector, comprising:

identifying, with the eye movement detector, an initial viewed object at an initial viewed object location;

identifying, with the head movement detector, when the user's head moves at a head velocity;

identifying at substantially the same time, with the eye movement detector, when the one or both of the user's eyes move at an eye velocity;

identifying, based on the eye velocity, a first saccadic movement of the one or both of the user's eyes;

confirming, either upon or before completion of the head movement and the first saccadic eye movement, that the one or both of the user's eyes are directed at a target object at a target object location; and

performing an action related to one or more of the initial viewed object, the initial viewed object location, the target object, and the target object location.

Dependent claims: 53-60.
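Claim 52 requires picking a saccade out of eye movement that occurs *while* the head is also moving, i.e. separating it from the compensatory counter-rotation of the vestibulo-ocular reflex. A coarse way to sketch that distinction is to subtract the expected compensatory velocity and threshold the residual; the threshold values and the three-way labeling are illustrative assumptions:

```python
def classify_eye_movement(head_velocity, eye_velocity,
                          saccade_threshold=100.0, vor_tolerance=0.25):
    """Coarse classifier for a concurrent head/eye velocity sample (deg/s).

    Returns 'saccade' when eye speed far exceeds the counter-rotation a
    perfect VOR would produce, 'vor' when it matches it, and 'other'
    otherwise. All thresholds are illustrative assumptions.
    """
    compensatory = -head_velocity          # velocity a perfect VOR would produce
    residual = eye_velocity - compensatory # eye movement beyond compensation
    if abs(residual) >= saccade_threshold:
        return 'saccade'
    if head_velocity != 0 and abs(eye_velocity / head_velocity + 1.0) <= vor_tolerance:
        return 'vor'
    return 'other'
```

Samples labeled 'saccade' during a head movement correspond to the claimed first saccadic movement, after which the system can confirm that gaze has reached the target object.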
Specification