Interaction and management of devices using gaze detection
First Claim
1. A method, comprising:
receiving, by a device, user gaze information;
in response to the user gaze information indicating that a user is gazing at a first portion of a graphical interface:
activating a first set of speech inputs recognizable by the device; and
associating a first set of inputs with the first portion of the graphical interface; and
in response to the gaze information indicating that the user is not gazing at any portion of the device, providing information to the user via another device.
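The claimed method can be sketched as a simple dispatch on gaze information. This is an illustrative sketch only, not the patent's implementation; the `Device` class, region name `"first_portion"`, and the example speech grammar are invented for illustration.

```python
# Hypothetical grammar of speech inputs associated with the first portion
# of the graphical interface.
FIRST_PORTION_SPEECH_INPUTS = {"open", "close", "scroll down"}

class Device:
    """Minimal stand-in for a device with a speech recognizer."""
    def __init__(self, name):
        self.name = name
        self.active_speech_inputs = set()
        self.input_bindings = {}

def handle_gaze(gaze_region, device, other_device):
    """Dispatch on received user gaze information, per the claimed method.

    Returns the device that should interact with the user.
    """
    if gaze_region == "first_portion":
        # Activate the set of speech inputs recognizable by the device and
        # associate that set with this portion of the graphical interface.
        device.active_speech_inputs = FIRST_PORTION_SPEECH_INPUTS
        device.input_bindings["first_portion"] = FIRST_PORTION_SPEECH_INPUTS
        return device
    if gaze_region is None:
        # The user is not gazing at any portion of the device, so
        # information is provided via another device instead.
        return other_device
    return device
```

The key design point in the claim is that gaze acts as a gate: the speech grammar is only active while the user's gaze selects the relevant portion of the interface, which narrows what the recognizer must distinguish.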
Abstract
User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
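The abstract's central idea — a gaze-selected interaction set, which may be a speech grammar, a gesture set, or a set of physiological states, and which may be keyed to an area the user is *not* viewing — can be sketched as a lookup table. All region names and set contents below are assumptions for illustration, not from the patent.

```python
# Hypothetical interaction sets, one per gaze context. The patent describes
# these generally: speech grammars, gesture-recognition movements, and
# physiological states for a health-parameter detector.
INTERACTION_SETS = {
    "thermostat_widget": {"kind": "speech",
                          "inputs": {"warmer", "cooler", "set temperature"}},
    "photo_panel":       {"kind": "gesture",
                          "inputs": {"swipe_left", "swipe_right", "pinch"}},
    "not_viewing_road":  {"kind": "physiological",
                          "inputs": {"heart_rate", "eyelid_closure"}},
}

def select_interaction_set(point_of_focus, not_viewing=None):
    """Select the active interaction set from gaze information.

    Per the abstract, a set may be chosen either from what the user is
    focusing on or from an area the user is not looking at.
    """
    if point_of_focus in INTERACTION_SETS:
        return INTERACTION_SETS[point_of_focus]
    if not_viewing is not None:
        key = "not_viewing_" + not_viewing
        if key in INTERACTION_SETS:
            return INTERACTION_SETS[key]
    return None  # no set active; user inputs are not interpreted
```

For example, a focus on `"photo_panel"` would activate the gesture set, while gaze away from `"road"` could activate driver-monitoring physiological inputs.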
19 Claims
1. A method, comprising:
receiving, by a device, user gaze information;
in response to the user gaze information indicating that a user is gazing at a first portion of a graphical interface:
activating a first set of speech inputs recognizable by the device; and
associating a first set of inputs with the first portion of the graphical interface; and
in response to the gaze information indicating that the user is not gazing at any portion of the device, providing information to the user via another device.
View Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 16, 17, 18
9. A method, comprising:
receiving, by a device, user gaze information;
in response to the user gaze information indicating that a user is gazing at a first portion of a graphical interface, activating a first set of gesture inputs recognizable by the device; and
in response to the gaze information indicating that the user is not gazing at any portion of the device, providing information to the user via another device.
View Dependent Claims: 10, 11, 12, 13, 19
14. A device, comprising:
a gaze detector adapted to generate gaze information identifying a user line of sight, a user point of focus, or an area that the user is not viewing; and
a user-interface controller coupled to or integrated into the device, the user-interface controller adapted to:
in response to the gaze information indicating that the user is gazing at a portion of the device, activate a set of audio inputs recognizable by the device, recognize an audio command uttered by the user, and provide an audio response to the audio command via the device; and
in response to the gaze information indicating that the user is not gazing at any portion of the device, provide information to the user via another device.
View Dependent Claims: 15
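The claimed device pairs a gaze detector with a user-interface controller, and the abstract notes that multiple devices can share one detector's gaze information. A minimal sketch of that composition, with class and method names invented for illustration:

```python
class GazeDetector:
    """Publishes gaze information to subscribed controllers.

    Because the detector broadcasts its output, a second device can
    subscribe instead of carrying its own gaze detector, as the abstract
    describes.
    """
    def __init__(self):
        self.subscribers = []

    def publish(self, gaze_info):
        for controller in self.subscribers:
            controller.on_gaze(gaze_info)

class UIController:
    """Activates audio inputs while the user gazes at this device."""
    def __init__(self, device_name, fallback=None):
        self.device_name = device_name
        self.fallback = fallback          # another device for off-gaze output
        self.audio_inputs_active = False
        self.output_device = device_name

    def on_gaze(self, gaze_info):
        if gaze_info.get("target") == self.device_name:
            # User is gazing at this device: recognize audio commands and
            # respond via this device.
            self.audio_inputs_active = True
            self.output_device = self.device_name
        else:
            # User is not gazing at any portion of this device: deactivate
            # audio inputs and provide information via another device.
            self.audio_inputs_active = False
            self.output_device = self.fallback
```

The publish/subscribe split mirrors the claim's structure: gaze sensing and input interpretation are separate components, coupled to or integrated into the device.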
Specification