Low distraction interfaces
First Claim
1. A computer-implemented method for enabling a user to interface with an electronic device, comprising:
receiving image data from at least one image capture element of the electronic device;
analyzing the image data to determine at least one period during which a viewing direction of a user is in a direction at least substantially toward a display element of the electronic device;
analyzing the image data to determine a relative orientation angle between the user and a plane of the display element of the electronic device;
determining information to be displayed on the display element based at least upon a movement or gesture represented in the image data;
determining an adjusted rendering angle and elongation based at least in part on the relative orientation angle between the user and the plane of the display element;
displaying, during the at least one period, the information on the display element according to the adjusted rendering angle and the elongation so that the information appears in a normal aspect ratio and at a normal angle to the user;
analyzing additional image data received from the at least one image capture element of the electronic device to determine one or more gestures; and
displaying additional information based at least upon the one or more gestures and the relative orientation angle between the user and the plane of the display element.
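The "adjusted rendering angle and elongation" steps above can be illustrated with a minimal perspective-compensation sketch. Under a simple pinhole-style model (an assumption, not the patent's method), content on a display tilted by an angle θ away from the viewer's line of sight foreshortens by cos θ along the tilt axis, so pre-stretching by 1/cos θ and counter-rotating by the user's in-plane rotation makes the content appear at a normal aspect ratio and angle. The function name, parameters, and clamping value are all hypothetical:

```python
import math

def compensate_rendering(tilt_deg, in_plane_rotation_deg, max_elongation=4.0):
    """Compute a counter-rotation and stretch factor so content on a tilted
    display appears upright and in a normal aspect ratio to the viewer.

    tilt_deg: angle between the viewer's line of sight and the display normal.
    in_plane_rotation_deg: apparent rotation of the user relative to the screen axes.
    Foreshortening along the tilt axis is cos(tilt), so we pre-stretch by
    1/cos(tilt), clamped so grazing angles do not produce extreme stretches.
    """
    tilt = math.radians(tilt_deg)
    foreshortening = math.cos(tilt)
    if foreshortening <= 0:
        # Display is edge-on or facing away; clamp rather than divide by zero.
        elongation = max_elongation
    else:
        elongation = min(1.0 / foreshortening, max_elongation)
    # Counter-rotate the rendered content against the user's apparent rotation.
    adjusted_angle_deg = -in_plane_rotation_deg
    return adjusted_angle_deg, elongation
```

For example, a 60° tilt gives cos 60° = 0.5, so the sketch stretches content by a factor of 2 along the tilt axis.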
Abstract
A low distraction interface can be provided for an electronic device by monitoring information such as a gaze direction of a user. A device can be configured to only display information to a user when the user is looking substantially toward the device, and can be configured to present that information in a way that is of minimal distraction to others nearby. The user can control aspects of the display by looking away from the device, continuing to look at the device for a period of time, or otherwise providing input without physically interacting with the device. Such an approach can be beneficial for settings such as business meetings where a user might want to obtain information from the device but does not want to appear inconsiderate by picking up and checking information on the device.
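The behavior the abstract describes — showing information only while the user looks substantially toward the device, and treating continued gaze or looking away as input — can be sketched as a small state machine fed with per-frame gaze estimates. The class name, the angle threshold, and the dwell time below are assumed values for illustration only, not the patent's implementation:

```python
GAZE_THRESHOLD_DEG = 15.0  # gaze counts as "toward the display" inside this cone (assumed)
DWELL_ADVANCE_S = 2.0      # keep looking this long to advance to more detail (assumed)

class LowDistractionDisplay:
    """Toy model: the display is lit only while gaze is substantially toward it."""

    def __init__(self):
        self.visible = False
        self.gaze_start = None
        self.detail_level = 0

    def update(self, gaze_angle_deg, now):
        """Feed one gaze sample (angle between the gaze ray and the display normal)."""
        looking = gaze_angle_deg <= GAZE_THRESHOLD_DEG
        if looking:
            if self.gaze_start is None:
                # Gaze just arrived: light the display at the base detail level.
                self.gaze_start = now
                self.visible = True
            elif now - self.gaze_start >= DWELL_ADVANCE_S:
                # Continued gaze acts as input: advance to more detailed information.
                self.detail_level += 1
                self.gaze_start = now
        else:
            # Looking away hides the display and resets state, so bystanders
            # see nothing and the user never has to touch the device.
            self.visible = False
            self.gaze_start = None
            self.detail_level = 0
        return self.visible, self.detail_level
```

In this sketch a glance lights the display, a sustained look steps through progressively more detail, and a glance away blanks everything.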
31 Citations
17 Claims
1. A computer-implemented method for enabling a user to interface with an electronic device, comprising:
receiving image data from at least one image capture element of the electronic device;
analyzing the image data to determine at least one period during which a viewing direction of a user is in a direction at least substantially toward a display element of the electronic device;
analyzing the image data to determine a relative orientation angle between the user and a plane of the display element of the electronic device;
determining information to be displayed on the display element based at least upon a movement or gesture represented in the image data;
determining an adjusted rendering angle and elongation based at least in part on the relative orientation angle between the user and the plane of the display element;
displaying, during the at least one period, the information on the display element according to the adjusted rendering angle and the elongation so that the information appears in a normal aspect ratio and at a normal angle to the user;
analyzing additional image data received from the at least one image capture element of the electronic device to determine one or more gestures; and
displaying additional information based at least upon the one or more gestures and the relative orientation angle between the user and the plane of the display element.
View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A computer system, comprising:
at least one processor;
at least one display element;
at least one image capture element; and
at least one memory device including instructions operable to be executed by the at least one processor to perform a set of actions, enabling the computer system to:
receive image data from the at least one image capture element of the computer system;
analyze the image data to determine at least one period during which a viewing direction of a user is in a direction at least substantially toward the at least one display element of the computer system;
analyze the image data to determine a relative orientation angle between the user and a plane of the at least one display element of the computer system;
determine information to be displayed on the at least one display element based at least upon a movement or gesture represented in the image data;
determine an adjusted rendering angle and elongation based at least in part on the relative orientation angle between the user and the plane of the at least one display element;
display, during the at least one period, the information on the at least one display element according to the adjusted rendering angle and the elongation;
analyze additional image data received from the at least one image capture element of the computer system to determine one or more gestures; and
display additional information based at least upon the one or more gestures and the relative orientation angle between the user and the plane of the at least one display element.
View Dependent Claims (9, 10, 11, 12, 13, 14)
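The "analyze additional image data ... to determine one or more gestures" steps could, under one of many possible designs, reduce to classifying the motion of a tracked feature (a fingertip or head centroid) across frames. The function, its thresholds, and the gesture names are hypothetical illustration, not the claimed method:

```python
def classify_gesture(xs, ys, min_travel=0.2):
    """Classify a tracked feature's path into a simple swipe gesture.

    xs, ys: per-frame horizontal and vertical positions of the tracked
    feature in normalized 0..1 screen coordinates. Returns one of
    'swipe_left', 'swipe_right', 'swipe_up', 'swipe_down', or None.
    """
    if len(xs) < 2:
        return None
    dx = xs[-1] - xs[0]
    dy = ys[-1] - ys[0]
    if abs(dx) < min_travel and abs(dy) < min_travel:
        # Too little travel to count as a deliberate gesture.
        return None
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"
```

A recognized gesture would then select the information to display, with the relative orientation angle still governing how that information is rendered.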
15. A computing device including instructions that, upon being executed by a processor of the computing device, cause the computing device to:
determine at least one period during which a viewing direction of a user is in a direction that is at least substantially toward a display element of the computing device;
determine a relative orientation angle between the user and a plane of the display element of the computing device;
determine information to be displayed on the display element based at least upon a gesture;
determine an adjusted rendering angle and elongation based at least in part on the relative orientation angle between the user and the plane of the display element;
display, during the at least one period, the information on the display element according to the adjusted rendering angle and the elongation;
analyze additional image data received from at least one image capture element of the computing device to determine one or more gestures; and
display additional information based at least upon the one or more gestures and the relative orientation angle between the user and the plane of the display element.
View Dependent Claims (16, 17)
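The recurring "determine at least one period during which a viewing direction ... is at least substantially toward a display element" step implies segmenting noisy per-frame gaze estimates into stable looking intervals. One common way to do that (an assumption here, not the patent's method) is hysteresis: enter a period when the gaze angle drops below a tight threshold and leave only when it exceeds a looser one, so jitter near the boundary does not rapidly toggle the display. The function and thresholds are hypothetical:

```python
def viewing_periods(samples, enter_deg=15.0, exit_deg=25.0):
    """Extract periods during which gaze is substantially toward the display.

    samples: list of (timestamp, gaze_angle_deg) pairs, where the angle is
    between the gaze ray and the display normal. A period starts when the
    angle drops to enter_deg or below and ends when it rises above exit_deg.
    Returns a list of (start, end) timestamps.
    """
    periods = []
    start = None
    last_t = None
    for t, angle in samples:
        if start is None:
            if angle <= enter_deg:
                start = t
        elif angle > exit_deg:
            # Gaze clearly left the display: close out the current period.
            periods.append((start, last_t))
            start = None
        last_t = t
    if start is not None:
        # Gaze was still toward the display when the samples ended.
        periods.append((start, last_t))
    return periods
```

Display updates would then be gated to the returned intervals, matching the claims' "during the at least one period" limitation.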
Specification