SYSTEM FOR GAZE INTERACTION
Abstract
A method and system for assisting a user interacting with a graphical user interface combine gaze based input with gesture based user commands. By pairing the two, a user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner. The same combination can also serve as a complement or an alternative to touch input on a computer device that does have a touch-screen, for example where the touch-screen is arranged ergonomically unfavourably for the user, or positioned such that it is more comfortable for the user to interact through gaze and gesture than through the touch-screen.
265 Citations
20 Claims
1. A control module for generating gesture based commands during user interaction with an information presentation area, wherein said control module is configured to:
- acquire user input from input means adapted to detect user generated gestures and gaze data signals from a gaze tracking module; and
- determine at least one user generated gesture based control command based on said user input;
- determine a gaze point area on said information presentation area including the user's gaze point based on at least the gaze data signals; and
- execute at least one user action manipulating a view presented on said graphical information presentation area based on said determined gaze point area and at least one user generated gesture based control command, wherein said user action is executed with said determined gaze point area as a starting point.

Dependent claims: 2, 3, 4, 18, 19, 20
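The control-module steps recited in the independent claims (acquire input, determine a gesture based command, determine a gaze point area, execute an action starting at that area) can be sketched as a small pipeline. Everything below — the class names, the gesture-to-command table, and the fixed gaze-area radius — is a hypothetical illustration for readability, not something specified by the patent:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    # gaze point on the information presentation area, in pixels
    x: float
    y: float

@dataclass
class GestureEvent:
    kind: str  # e.g. "tap", "swipe_left", "pinch" (hypothetical names)

class ControlModule:
    """Sketch of the claimed steps: combine gaze data with gesture input."""

    # hypothetical mapping from detected gestures to control commands
    COMMANDS = {"tap": "select", "swipe_left": "pan_left", "pinch": "zoom"}

    def __init__(self, gaze_radius: float = 50.0):
        self.gaze_radius = gaze_radius  # assumed extent of the gaze point area

    def determine_command(self, gesture: GestureEvent) -> str:
        # step: determine a gesture based control command from user input
        return self.COMMANDS.get(gesture.kind, "none")

    def determine_gaze_point_area(self, gaze: GazeSample):
        # step: determine a gaze point area including the user's gaze point
        return (gaze.x, gaze.y, self.gaze_radius)

    def execute(self, gesture: GestureEvent, gaze: GazeSample) -> str:
        # step: execute the user action with the gaze point area as starting point
        command = self.determine_command(gesture)
        x, y, r = self.determine_gaze_point_area(gaze)
        return f"{command} starting at ({x:.0f}, {y:.0f}), area radius {r:.0f}"

module = ControlModule()
action = module.execute(GestureEvent("pinch"), GazeSample(320.0, 240.0))
print(action)  # → zoom starting at (320, 240), area radius 50
```

The point of the split is that gaze supplies the *where* (the role of the touch location on a touch-screen) while the gesture supplies the *what*, which is how the claims achieve touch-screen like interaction without a touch-screen.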
5. A method for generating gesture based control commands during user interaction with an information presentation area associated with a computer device, said method comprising:
- acquiring user input corresponding to user generated gestures and gaze data signals; and
- determining at least one user generated gesture based control command based on said user input;
- determining a gaze point area on said information presentation area including the user's gaze point based on at least the gaze data signals; and
- executing at least one user action manipulating a view presented on said information presentation area based on said determined gaze point area and at least one user generated gesture based control command, wherein said user action is executed with said determined gaze point area as a starting point.

Dependent claims: 6, 7, 8, 9, 10, 11
12. A wireless transmit/receive unit, WTRU, associated with an information presentation area and comprising input means adapted to detect user generated gestures and a gaze tracking module adapted to detect gaze data of a viewer of said information presentation area, said WTRU further comprising a control module configured to:
- acquire user input from said input means and gaze data signals from said gaze tracking module;
- determine at least one user generated gesture based control command based on said user input;
- determine a gaze point area on said information presentation area including the user's gaze point based on at least the gaze data signals; and
- execute at least one user action manipulating a view presented on said information presentation area based on said determined gaze point area and at least one user generated gesture based control command, wherein said user action is executed with said determined gaze point area as a starting point.
13. A system for user interaction with an information presentation area, said system comprising:
- input means adapted to detect user generated gestures;
- a gaze tracking module adapted to detect gaze data of a viewer of said information presentation area;
- a control module configured to:
  - acquire user input from said input means and gaze data signals from said gaze tracking module;
  - determine at least one user generated gesture based control command based on said user input;
  - determine a gaze point area on said information presentation area including the user's gaze point based on at least the gaze data signals; and
  - execute at least one user action manipulating a view presented on said graphical information presentation area based on said determined gaze point area and at least one user generated gesture based control command, wherein said user action is executed with said determined gaze point area as a starting point.
14. A computer device associated with an information presentation area, said computer device comprising:
- input means adapted to detect user generated gestures;
- a gaze tracking module adapted to detect gaze data of a viewer of said information presentation area;
- a control module configured to:
  - acquire user input from input means adapted to detect user generated gestures and gaze data signals from a gaze tracking module;
  - determine at least one user generated gesture based control command based on said user input;
  - determine a gaze point area on said information presentation area including the user's gaze point based on at least the gaze data signals; and
  - execute at least one user action manipulating a view presented on said information presentation area based on said determined gaze point area and at least one user generated gesture based control command, wherein said user action is executed with said determined gaze point area as a starting point.
15. A handheld portable device including an information presentation area and comprising input means adapted to detect user generated gestures and a gaze tracking module adapted to detect gaze data of a viewer of said information presentation area, said device further comprising a control module configured to:
- acquire user input from said input means and gaze data signals from said gaze tracking module;
- determine at least one user generated gesture based control command based on said user input;
- determine a gaze point area on said information presentation area including the user's gaze point based on at least the gaze data signals; and
- execute at least one user action manipulating a view presented on said information presentation area based on said determined gaze point area and at least one user generated gesture based control command, wherein said user action is executed with said determined gaze point area as a starting point.
16. A system for user interaction with a wearable head mounted information presentation area, said system comprising:
- input means adapted to be worn on a wrist, a hand, or at least a finger, said input means being configured to detect user generated gestures and adapted to wirelessly communicate with a control module;
- a gaze tracking module adapted to detect gaze data of a viewer of said information presentation area; and
wherein said control module is configured to:
  - acquire user input from said input means and gaze data signals from said gaze tracking module;
  - determine at least one user generated gesture based control command based on said user input;
  - determine a gaze point area on said information presentation area including the user's gaze point based on at least the gaze data signals; and
  - execute at least one user action manipulating a view presented on said graphical information presentation area based on said determined gaze point area and at least one user generated gesture based control command, wherein said user action is executed with said determined gaze point area as a starting point.
17. A system for user interaction with an information presentation area, said system comprising:
- input means adapted to detect user generated gestures, said input means comprising at least one touchpad arranged on a steering device of a vehicle or adapted to be integrated in a steering device of a vehicle;
- a gaze tracking module adapted to detect gaze data of a viewer of said information presentation area;
- a control module configured to:
  - acquire user input from said input means and gaze data signals from said gaze tracking module;
  - determine at least one user generated gesture based control command based on said user input;
  - determine a gaze point area on said information presentation area including the user's gaze point based on at least the gaze data signals; and
  - execute at least one user action manipulating a view presented on said graphical information presentation area based on said determined gaze point area and at least one user generated gesture based control command, wherein said user action is executed with said determined gaze point area as a starting point.
Specification