Contextual user interface
Abstract
Embodiments of the present invention analyze the context in which a user interacts with a computer interface and automatically optimize the interface for that context. The controller or control mode the user selects for interaction may define the context, in part. Examples of control modes include gesturing, audio control, use of companion devices, and use of dedicated control devices, such as game controllers and remote controls. Different input devices are designed for different tasks, yet a user will frequently attempt to perform a task using a control input that is not adapted for it. Embodiments of the present invention change the characteristics of the user interface to make it easier for the user to complete an intended task using the input device of the user's choice.
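To make the abstract concrete, the following is a minimal sketch in Python of selecting layout parameters from a detected control mode. It assumes mode detection happens upstream (e.g., from image or audio data), and every name in it (ControlMode, LayoutParams, MODE_LAYOUTS, optimize_interface) is illustrative rather than taken from the patent.

    # Minimal sketch of context-driven interface optimization. Assumes the
    # control mode was already detected upstream; all names are illustrative,
    # not taken from the patent.
    from dataclasses import dataclass
    from enum import Enum, auto

    class ControlMode(Enum):
        GESTURE = auto()               # mid-air hand gestures captured by a camera
        AUDIO = auto()                 # voice commands
        COMPANION_DEVICE = auto()      # e.g., a phone or tablet acting as a remote
        DEDICATED_CONTROLLER = auto()  # e.g., a game controller or remote control

    @dataclass
    class LayoutParams:
        target_size_px: int   # on-screen size of selectable objects
        label_style: str      # how objects are labeled for this input mode

    # Coarse gesture input favors large targets; audio control favors
    # speakable labels; pointer-like devices can use smaller targets.
    MODE_LAYOUTS = {
        ControlMode.GESTURE: LayoutParams(96, "icon"),
        ControlMode.AUDIO: LayoutParams(48, "speakable_keyword"),
        ControlMode.COMPANION_DEVICE: LayoutParams(32, "text"),
        ControlMode.DEDICATED_CONTROLLER: LayoutParams(48, "focus_ring"),
    }

    def optimize_interface(mode: ControlMode) -> LayoutParams:
        """Return layout parameters suited to the user's chosen control mode."""
        return MODE_LAYOUTS[mode]

The mapping encodes the design point of the abstract: an interface tuned for one input device (large targets for coarse gestures, speakable labels for voice) is regenerated when the user switches control modes.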
Claims (20)
1. A method of presenting an interface that is optimized for a present context, the method comprising:

determining a present context for a user interface by analyzing image data depicting an environment proximate to the user interface, wherein the present context is a user performing a gesture with the user's left hand to interact with an object depicted on the user interface, the gesture detected by analyzing the image data;

determining, by analyzing the image data, that the user's left hand is coming between a second user and the user interface while making the gesture; and

upon said determining that the user's left hand is coming between the second user and the user interface, automatically generating an updated user interface with interface objects relocated to encourage the user to make right-handed gestures instead of left-handed gestures.

Dependent claims: 2, 3, 4, 5, 6, 7 (not shown). (A code sketch of these claimed steps appears after the claims.)
8. One or more computer-storage hardware media having computer-executable instructions embodied thereon that, when executed by a computing device, perform a method of presenting an interface that is optimized for a present context, the method comprising:

determining a present context for a user interface by analyzing image data depicting an environment proximate to the user interface, wherein the present context is a user performing a gesture with the user's left hand to interact with an object depicted on the user interface, the gesture detected by analyzing the image data;

determining, by analyzing the image data, that the user's left hand is coming between a second user and the user interface while making the gesture; and

upon said determining that the user's left hand is coming between the second user and the user interface, automatically generating an updated user interface with interface objects relocated to encourage the user to make right-handed gestures instead of left-handed gestures.

Dependent claims: 9, 10, 11, 12, 13, 14 (not shown).
15. A computing system comprising:

a hardware processor; and

computer storage memory having computer-executable instructions stored thereon which, when executed by the processor, implement a method of presenting an interface that is optimized for a present context, the method comprising:

determining a present context for a user interface by analyzing image data depicting an environment proximate to the user interface, wherein the present context is a user performing a gesture with the user's left hand to interact with an object depicted on the user interface, the gesture detected by analyzing the image data;

determining, by analyzing the image data, that the user's left hand is coming between a second user and the user interface while making the gesture; and

upon said determining that the user's left hand is coming between the second user and the user interface, automatically generating an updated user interface with interface objects relocated to encourage the user to make right-handed gestures instead of left-handed gestures.

Dependent claims: 16, 17, 18, 19, 20 (not shown).
Specification