Interaction and management of devices using gaze detection
Abstract
User gaze information, which may include a user's line of sight, point of focus, or an area the user is not looking at, is determined from the positioning of the user's body, head, eyes, and irises. The gaze information is used to select a context and an interaction set for the user. Interaction sets may include grammars for a speech-recognition system, movements for a gesture-recognition system, physiological states for a user-health-parameter detection system, or other possible inputs. When the user focuses on a selected object or area, the interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based on areas the user is not viewing. Multiple devices can share gaze information, so a device does not require its own gaze detector.
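The abstract's core mechanism — using the current gaze target to pick which interaction set (for example, a speech-recognition grammar) interprets the next input — can be sketched as below. All names (`INTERACTION_SETS`, `select_interaction_set`, `interpret`) and the example commands are illustrative assumptions, not from the patent.

```python
# Sketch of gaze-driven interaction-set selection. Each gaze target
# activates a grammar (a set of recognizable commands); the key None
# models "not looking at any tracked object", since the abstract says
# interaction sets may also be selected from areas the user is NOT viewing.

INTERACTION_SETS = {
    "thermostat": {"warmer", "cooler", "hold"},
    "tv": {"volume up", "volume down", "mute"},
    None: {"help", "where am i"},          # user is looking away
}

def select_interaction_set(gaze_target):
    """Return the active grammar for the object the user is gazing at."""
    return INTERACTION_SETS.get(gaze_target, INTERACTION_SETS[None])

def interpret(gaze_target, utterance):
    """Accept the utterance only if the active grammar contains it."""
    grammar = select_interaction_set(gaze_target)
    return utterance if utterance in grammar else None
```

So the same utterance is accepted or rejected depending on where the user is looking: `interpret("tv", "mute")` returns `"mute"`, while `interpret("thermostat", "mute")` returns `None`.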
Claims (20)
1. A device, comprising:

   a gaze detector adapted to generate gaze information; and

   a user-interface controller adapted to:

      in response to the gaze information indicating that a user is gazing at the device, recognize an audio command uttered by the user and provide a response to the audio command via the device; and

      in response to the gaze information indicating that the user is not gazing at the device, recognize an audio command uttered by the user and route response information to a second device for presentation to the user.

   (Dependent claims 2-8 not shown.)
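The routing behavior recited in claim 1 reduces to a simple branch: when the gaze information says the user is looking at the device, the response is presented locally; otherwise the response information is routed to a second device. A minimal sketch follows; the `GazeInfo` shape, the device labels, and the stand-in response string are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class GazeInfo:
    gazing_at_device: bool   # produced by the gaze detector

def handle_audio_command(gaze: GazeInfo, command: str):
    """User-interface controller branch from claim 1 (sketch).

    Returns a (destination, response) pair: the response is presented
    on this device when the user is gazing at it, and routed to a
    second device otherwise.
    """
    response = f"response to '{command}'"   # stand-in for real recognition
    if gaze.gazing_at_device:
        return ("this_device", response)    # present locally
    return ("second_device", response)      # route for presentation elsewhere
```

Note that in both branches the audio command is still recognized; only the destination of the response changes with gaze.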
9. A method, comprising:

   receiving, by a device, user gaze information;

   in response to the user gaze information indicating that a user is gazing at a first portion of a graphical interface:

      activating a first set of speech inputs recognizable by the device; and

      associating the first set of inputs with the first portion of the graphical interface; and

   in response to the gaze information indicating that the user is not gazing at any portion of the device:

      activating a second set of speech inputs recognizable by the device; and

      routing response information to a selected one of alternative devices.

   (Dependent claims 10-18 not shown.)
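Claim 9 keys on *which portion* of the graphical interface the user is gazing at, which implies hit-testing a gaze point against screen regions and switching speech-input sets accordingly. The sketch below illustrates that under assumed names and coordinates; the region rectangles, panel names, and command sets are all hypothetical.

```python
# Sketch of claim 9: map a gaze point to a portion of the graphical
# interface and activate the speech-input set associated with it.
# Regions are (x0, y0, x1, y1) rectangles in screen coordinates.

REGIONS = {
    "media_panel":  ((0, 0, 400, 300),   {"play", "pause", "next"}),
    "status_panel": ((400, 0, 800, 300), {"refresh", "details"}),
}
OFF_DEVICE_SET = {"send to phone", "send to tv"}   # second set of speech inputs

def active_speech_set(gaze_point):
    """Return the speech inputs activated by the current gaze point.

    gaze_point is an (x, y) pair, or None when the user is not gazing
    at any portion of the device (activating the second input set).
    """
    if gaze_point is None:
        return OFF_DEVICE_SET
    x, y = gaze_point
    for (x0, y0, x1, y1), speech_set in REGIONS.values():
        if x0 <= x < x1 and y0 <= y < y1:
            return speech_set
    return set()   # gazing at the device but outside any tracked portion
```

A gaze point inside the media panel activates its commands, while a `None` gaze point (looking away from the device entirely) activates the set whose responses would be routed to an alternative device.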
19. A method, comprising:

   receiving, by a device, user gaze information;

   in response to the user gaze information indicating that a user is gazing at a graphical interface on the device, providing responses to user inputs via the graphical interface; and

   in response to the gaze information indicating that the user is not gazing at the device, routing response information to the user via a selected one of alternative devices.

   (Dependent claim 20 not shown.)
Specification