Dynamic device interaction adaptation based on user engagement
First Claim
1. A computing device, comprising:
one or more processors;
a touchscreen for displaying graphical user interfaces and for receiving input gestures from a user to the computing device, wherein the computing device is a hand-held device;
a memory in communication with the one or more processors, the memory having computer-readable instructions stored thereupon which, when executed by the one or more processors, cause the computing device to:
receive image data from one or more sensors, wherein the image data indicates a gaze target of the user;
analyze the image data to determine a selected object based on the gaze target, wherein the selected object is the computing device when the gaze target is between a selection boundary and the computing device, and wherein the selected object is a remote device when the gaze target is between the selection boundary and the remote device;
in response to determining that the selected object is the computing device, selecting a first interaction model, wherein the first interaction model comprises displaying a first user interface of an application module on the touchscreen and causing the touchscreen to receive user gestures for providing input to the application module executing on the computing device; and
in response to determining that the selected object is a remote device in communication with the computing device, selecting a second interaction model, wherein the second interaction model causes the touchscreen to receive user gestures for providing input to the remote device, wherein the second interaction model causes a display of a second user interface that is configured and arranged according to functions of the remote device, wherein the selection boundary is adjusted in response to determining that the selected object is the computing device or the remote device.
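The final "wherein the selection boundary is adjusted" clause describes a hysteresis-style mechanism: once an object is selected, the boundary shifts away from it so small gaze movements do not flip the selection. A minimal sketch of one plausible reading, assuming a one-dimensional gaze position between the hand-held device (at 0.0) and the remote device (at 1.0); the names `GazeSelector` and `margin` are hypothetical, not from the claim:

```python
HANDHELD, REMOTE = "handheld", "remote"

class GazeSelector:
    def __init__(self, boundary=0.5, margin=0.15):
        self.boundary = boundary  # selection boundary between the two devices
        self.margin = margin      # how far the boundary shifts after a selection
        self.selected = HANDHELD

    def update(self, gaze_target: float) -> str:
        """Select the object on whose side of the boundary the gaze target falls."""
        if gaze_target < self.boundary:
            self.selected = HANDHELD
            # push the boundary toward the remote device, so the user must look
            # well past the midpoint before the selection switches away
            self.boundary = 0.5 + self.margin
        else:
            self.selected = REMOTE
            self.boundary = 0.5 - self.margin
        return self.selected
```

With these numbers, a gaze at 0.6 keeps the hand-held device selected after a prior local selection (0.6 < 0.65), whereas a fixed boundary at 0.5 would have flipped it — which is the jitter-suppression effect the boundary adjustment appears aimed at.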
1 Assignment
0 Petitions
Abstract
The techniques disclosed herein enable dynamic device interaction adaptation based on user engagement. In general, a computer can leverage sensors to determine a user's level of engagement with one or more devices. As the user transitions his or her attention to different devices, the computer can utilize different interaction models to assist the user in interacting with each device that meets a threshold level of engagement with the user. Individual interaction models can configure the computer to direct input gestures received by the computing device to a selected device, and adapt the computer's input modality to provide user interface controls that are optimized for the selected device. When a user is interacting with two or more devices, automatic transitions between different interaction models can help improve the user's interaction with each device.
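The abstract's core idea — each interaction model decides where touchscreen gestures are routed — can be sketched as follows. This is an illustrative reading, not the patent's implementation; all names (`FirstInteractionModel`, `route`, `select_model`) are hypothetical:

```python
class FirstInteractionModel:
    """Gestures drive an application module running on the computing device itself."""
    def route(self, gesture):
        return ("local_app", gesture)

class SecondInteractionModel:
    """Gestures are forwarded as input to the remote device."""
    def route(self, gesture):
        return ("remote_device", gesture)

def select_model(selected_object):
    """Pick an interaction model based on which object the user is engaged with."""
    if selected_object == "handheld":
        return FirstInteractionModel()
    return SecondInteractionModel()

# As the user's engagement shifts, the same touchscreen gestures retarget
# automatically, without the user explicitly changing modes:
model = select_model("handheld")
model.route("tap")     # delivered to the local application
model = select_model("remote")
model.route("swipe")   # delivered to the remote device
```

The design point is that the switch happens in the model-selection step, so the gesture-handling code itself never needs to know which device is currently engaged.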
79 Citations
19 Claims
1. A computing device, comprising:

one or more processors;
a touchscreen for displaying graphical user interfaces and for receiving input gestures from a user to the computing device, wherein the computing device is a hand-held device;
a memory in communication with the one or more processors, the memory having computer-readable instructions stored thereupon which, when executed by the one or more processors, cause the computing device to:
receive image data from one or more sensors, wherein the image data indicates a gaze target of the user;
analyze the image data to determine a selected object based on the gaze target, wherein the selected object is the computing device when the gaze target is between a selection boundary and the computing device, and wherein the selected object is a remote device when the gaze target is between the selection boundary and the remote device;
in response to determining that the selected object is the computing device, selecting a first interaction model, wherein the first interaction model comprises displaying a first user interface of an application module on the touchscreen and causing the touchscreen to receive user gestures for providing input to the application module executing on the computing device; and
in response to determining that the selected object is a remote device in communication with the computing device, selecting a second interaction model, wherein the second interaction model causes the touchscreen to receive user gestures for providing input to the remote device, wherein the second interaction model causes a display of a second user interface that is configured and arranged according to functions of the remote device, wherein the selection boundary is adjusted in response to determining that the selected object is the computing device or the remote device.

View Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13

14. A method, comprising:
receiving image data, at a computing device, from one or more sensors, wherein the image data indicates a gaze target of a user, the computing device comprising a touchscreen for displaying graphical user interfaces and for receiving input gestures from the user to the computing device, wherein the computing device is a hand-held device;
analyzing the image data to select the computing device or a remote device in communication with the computing device, wherein the computing device is selected when the gaze target is directed to the computing device, and wherein the remote device is selected when the gaze target is directed to the remote device, and wherein the selected object is the computing device when the gaze target is between a selection boundary and the computing device, and wherein the selected object is the remote device when the gaze target is between the selection boundary and the remote device;
in response to determining that the gaze target is directed to the computing device, selecting the computing device and a first interaction model, wherein the first interaction model comprises displaying a first user interface of an application module on the touchscreen and causing the touchscreen to receive user gestures for providing input to the application module executing on the computing device; and
in response to determining that the gaze target is directed to the remote device, selecting the remote device and a second interaction model, wherein the second interaction model causes the touchscreen to receive user gestures for providing input to the remote device, wherein the second interaction model causes a display of a second user interface that is configured and arranged according to functions of the remote device, wherein the selection boundary is adjusted in response to determining that the selected object is the computing device or the remote device.

View Dependent Claims: 15, 16, 17, 18

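A distinct element of the second interaction model is that the second user interface is "configured and arranged according to functions of the remote device" — the touchscreen layout adapts to whatever the selected device can do, rather than showing a fixed control set. A hedged sketch of that idea; `build_remote_ui` and the function lists are illustrative assumptions, not claimed structure:

```python
def build_remote_ui(remote_functions):
    """Return one on-screen control per function exposed by the remote device."""
    return [
        {"control": "button", "label": fn, "target": "remote"}
        for fn in remote_functions
    ]

# Different remote devices report different functions, so the same hand-held
# touchscreen renders a different second user interface for each:
tv_ui = build_remote_ui(["power", "volume_up", "volume_down", "channel"])
speaker_ui = build_remote_ui(["play", "pause", "volume_up"])
```

Under this reading, selecting a television versus a speaker yields differently sized and labeled control layouts from the same code path.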
19. A non-transitory computer-readable medium comprising instructions which, when executed by one or more processors of a computing device comprising a touchscreen for displaying graphical user interfaces and for receiving input gestures from a user, cause the computing device to:
receive image data from one or more sensors, wherein the image data indicates a gaze target of the user;
analyze the image data to determine a selected object based on the gaze target, wherein the selected object is the computing device when the gaze target is between a selection boundary and the computing device, and wherein the selected object is a remote device when the gaze target is between the selection boundary and the remote device;
in response to determining that the selected object is the computing device, selecting a first interaction model, wherein the first interaction model comprises displaying a first user interface of an application module on the touchscreen and causing the touchscreen to receive user gestures for providing input to the application module executing on the computing device; and
in response to determining that the selected object is a remote device in communication with the computing device, selecting a second interaction model, wherein the second interaction model causes the touchscreen to receive user gestures for providing input to the remote device, wherein the second interaction model causes a display of a second user interface that is configured and arranged according to functions of the remote device, wherein the selection boundary is adjusted in response to determining that the selected object is the computing device or the remote device, wherein the computing device is a hand-held device.

Specification