Eye-wearable device user interface and method
Abstract
A software-controlled user interface and method for a head-mountable device equipped with at least one display and connectivity to at least one touchpad. The method can scale from displaying a small number to a large number of different eye-gaze target symbols at any given time, yet still transmit a large array of different symbols to outside devices. At least part of the method may be implemented by way of a virtual window onto the surface of a virtual cylinder, with touchpad- or eye-gaze-sensitive symbols that can be rotated by touchpad touch or eye gaze, thus bringing various groups of symbols into view to be selected. Specific examples of the use of this interface and method on a head-mountable device are disclosed, along with various operation examples, including sending and receiving text messages, moving about in virtual or augmented reality, control of robotic devices, and control of remote vehicles.
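The abstract's "virtual window onto the surface of a virtual cylinder" can be sketched as follows: symbols are spaced at equal angles around a cylinder, and only those whose angle falls inside a fixed viewing window are displayed; rotating the cylinder brings other groups of symbols into view. This is a minimal illustrative model under assumed names (`SymbolRing`, `window_degrees`), not the patent's implementation.

```python
class SymbolRing:
    """Illustrative model of symbols arranged around a virtual cylinder,
    of which only a windowed arc is visible at any given time."""

    def __init__(self, symbols, window_degrees=90.0):
        self.symbols = symbols
        self.window = window_degrees   # angular width of the visible window
        self.offset = 0.0              # current rotation, in degrees

    def rotate(self, degrees):
        """Rotate the cylinder, e.g. in response to a touchpad swipe or gaze."""
        self.offset = (self.offset + degrees) % 360.0

    def visible(self):
        """Return the symbols whose angular position lies inside the window,
        treating the window as centred on angle 0."""
        step = 360.0 / len(self.symbols)
        shown = []
        for i, sym in enumerate(self.symbols):
            angle = (i * step + self.offset) % 360.0
            if angle <= self.window / 2 or angle >= 360.0 - self.window / 2:
                shown.append(sym)
        return shown
```

With eight symbols and a 90-degree window, three symbols are visible at a time, and each 45-degree rotation swaps one group out for another.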
20 Claims
1. A method of controlling a head-mountable device for transmitting and receiving information from a human user with at least one eye, said device having a virtual display, at least one eye position tracking sensor, and connectivity to at least one touchpad;
said device comprising a frame configured to attach to the head of a user;

wherein said connectivity to at least one touchpad is achieved by physically connecting said at least one touchpad to said frame, thereby becoming a head mounted touchpad, said method comprising:

using at least one processor to display a plurality of visual targets on said virtual display, said plurality of visual targets each comprising a visual element area embedded within a visual element position zone with an area that is equal to or larger than said visual element area, thus producing a plurality of visual element areas, each visual element area and visual element position zone mapping to a touch sensitive touch element on a touchpad position zone;

wherein at least some of said plurality of visual targets further comprise a visual element embedded within an eye position zone with an area that is equal to or larger than said visual element;

using any of said at least one touchpad or said at least one eye position tracking sensor to register that a virtual key corresponding to said at least one of said plurality of visual targets has been pressed by said user by using said at least one eye position tracking sensor to determine when said at least one eye is on average continuously gazing in a direction that is within an eye position zone of at least one of said plurality of visual elements for a time period exceeding a hovering time interval, signaling when said hovering time interval has been exceeded, and if said gaze remains within the eye position zone of said at least one of said plurality of visual elements, without looking away, for a time period exceeding a keypress time interval, also registering that an eye gaze controlled virtual key corresponding to said at least one of said plurality of visual targets has been pressed by said user; and

using said virtual key presses to control either transmitting or receiving said information to at least one computerized device outside of said head-mountable device's said frame.

View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
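The two-stage dwell selection recited in claim 1 (a gaze held inside a target's eye position zone past a hovering interval triggers a signal; a gaze held past a longer keypress interval, without looking away, registers the virtual key press) can be sketched as a small state machine over time-stamped gaze samples. The interval values, zone representation, and function names below are illustrative assumptions, not taken from the patent.

```python
HOVER_S = 0.3   # hovering time interval, seconds (assumed value)
PRESS_S = 0.8   # keypress time interval, seconds (assumed value)

def run_dwell(samples, zone):
    """samples: iterable of (timestamp, x, y) gaze samples;
    zone: (x0, y0, x1, y1) eye position zone of one visual target.
    Returns events: 'hover' once the hovering interval is exceeded,
    'press' once the keypress interval is exceeded without looking away."""
    events = []
    enter_t = None           # time the gaze entered the zone
    hovered = pressed = False
    for t, x, y in samples:
        inside = zone[0] <= x <= zone[2] and zone[1] <= y <= zone[3]
        if not inside:
            # looking away resets the dwell
            enter_t, hovered, pressed = None, False, False
            continue
        if enter_t is None:
            enter_t = t
        dwell = t - enter_t
        if not hovered and dwell > HOVER_S:
            hovered = True
            events.append('hover')   # signal hovering interval exceeded
        if hovered and not pressed and dwell > PRESS_S:
            pressed = True
            events.append('press')   # register eye-gaze-controlled key press
    return events
```

Looking away before the keypress interval elapses resets the dwell, so a brief glance only ever produces the hover signal, never an unintended key press.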
13. A method of controlling a head-mountable, vision-controlled device for transmitting and receiving information from a human user with at least one eye, said device having a virtual display, at least one eye position tracking sensor, and connectivity to at least one touchpad;
said device comprising a frame configured to attach to the head of a user; wherein said connectivity to at least one touchpad is achieved by physically connecting said at least one touchpad to said frame, thereby becoming a head mounted touchpad, or wherein connectivity to at least one hardware or virtual touchpad is achieved by a wireless or wired connection to at least one touchpad not physically attached to said frame;
said method comprising:

using at least one processor to display a plurality of visual targets on said virtual display, said plurality of visual targets each comprising a visual element area embedded within a visual element position zone with an area that is equal to or larger than said visual element area, thus producing a plurality of visual element areas, each visual element area and visual element position zone mapping to a touch sensitive touch element on a hardware or virtual touchpad position zone;

using said at least one touchpad to determine when at least one user touch is on average touching an area that is within a visual element position zone of at least one of said plurality of visual element areas for a time period exceeding a hovering time interval, signaling when said hovering time interval has been exceeded, then using said at least one touchpad to register that a virtual key corresponding to said at least one of said plurality of visual targets has been pressed by said user;

wherein at least some of said plurality of visual targets further comprise a visual element embedded within an eye position zone with an area that is equal to or larger than said visual element area;

further using said at least one eye position tracking sensor to determine when said at least one eye is on average gazing in a direction that is within an eye position zone of at least one of said plurality of visual elements for a time period exceeding a hovering time interval, signaling when said hovering time interval associated with said gaze has been exceeded, and if said gaze remains within the eye position zone of said at least one of said plurality of visual elements, without looking away, for a time period exceeding a keypress time interval, also registering that a virtual key corresponding to said at least one of said plurality of visual targets has been pressed by said user; and

using said virtual key presses to control either transmitting or receiving said information to at least one computerized device outside of said head-mountable device's said frame.

View Dependent Claims (14, 15, 16, 17, 18)
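Claim 13's mapping of visual element position zones onto touchpad position zones can be illustrated with a simple grid hit test: the touchpad surface is divided into row-by-column zones, each corresponding to one visual target on the display. The grid layout, dimensions, and names below are assumptions for illustration only, not the patent's mapping.

```python
def touch_to_key(touch_x, touch_y, pad_w, pad_h, rows, cols, keys):
    """Map a touchpad coordinate to the virtual key whose touchpad position
    zone contains it. keys is a row-major list of length rows * cols;
    pad_w and pad_h are the touchpad dimensions in the same units as the
    touch coordinates."""
    # Scale the touch point into grid indices, clamping the far edges so a
    # touch exactly on the boundary still falls in the last zone.
    col = min(int(touch_x / pad_w * cols), cols - 1)
    row = min(int(touch_y / pad_h * rows), rows - 1)
    return keys[row * cols + col]
```

Because each visual element position zone is as large as or larger than the visual element area it contains, the touch zones tile the pad with no dead space between targets.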
19. A head-mountable device for transmitting and receiving information from a human user with at least one eye, said device having a virtual display, at least one eye position tracking sensor, at least one processor, and connectivity to at least one touchpad;
said device further comprising a frame configured to attach to the head of a user; said display, mounted in said device's said frame, further comprising at least one transparent lens or other transparent device to allow visual information from an outside environment to also be seen by at least one eye of said user; wherein said connectivity to at least one touchpad is achieved by a touchpad that is physically attached to said frame, thereby forming a head mounted touchpad;

said device further comprising:

at least one processor configured to display a plurality of visual targets on said virtual display, said plurality of visual targets each comprising a visual element area embedded within a visual element position zone with an area that is equal to or larger than said visual element area, thus producing a plurality of visual element areas, each visual element area and visual element position zone mapping to a touch sensitive touch element on a touchpad position zone;

said at least one processor also configured to associate at least some of said plurality of visual targets with a visual element embedded within an eye position zone with an area that is equal to or larger than said visual element area;

said at least one processor configured to accept input from any of said at least one touchpad and said at least one eye position tracking sensor to determine when any of at least one user touch and gaze is on average directed at an area that is within a visual element position zone of at least one of said plurality of visual element areas for a time period exceeding a hovering time interval, and signal when said hovering time interval has been exceeded;

said at least one processor also configured to use said at least one eye position tracking sensor to determine when said at least one eye is on average gazing in a direction that is within an eye position zone of at least one of said plurality of visual elements for a time period exceeding a hovering time interval, signaling when said hovering time interval has been exceeded, and if said gaze remains within the eye position zone of said at least one of said plurality of visual elements, without looking away, for a time period exceeding a keypress time interval, also registering that a virtual key corresponding to said at least one of said plurality of visual targets has been pressed by said user;

said at least one processor further configured to accept input from any of said at least one touchpad or said at least one eye position tracking sensor to register that a virtual key corresponding to said at least one of said plurality of visual targets has been pressed by said user; and use said virtual key presses to control either transmitting or receiving said information to at least one computerized device outside of said head-mountable device's said frame.

View Dependent Claims (20)
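Claim 19 has the processor accept input from either the touchpad or the eye position tracking sensor to register the same virtual key press. One minimal way to sketch that "any of" behavior is to merge time-stamped press events from both modalities and take whichever completes first; the event shapes and names here are assumptions, not the patent's design.

```python
def first_press(touch_events, gaze_events):
    """Each event is (timestamp, key). Return (timestamp, key, source) for
    the earliest registered press across both input modalities, or None if
    neither modality has produced a press."""
    candidates = [(t, k, 'touch') for t, k in touch_events]
    candidates += [(t, k, 'gaze') for t, k in gaze_events]
    return min(candidates) if candidates else None
```

Treating both modalities as interchangeable event sources lets a user mix touchpad presses and gaze dwells freely within a single input session.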
Specification