Low distraction interfaces
Abstract
A low distraction interface can be provided for an electronic device by monitoring information such as a gaze direction of a user. A device can be configured to only display information to a user when the user is looking substantially toward the device, and can be configured to present that information in a way that is of minimal distraction to others nearby. The user can control aspects of the display by looking away from the device, continuing to look at the device for a period of time, or otherwise providing input without physically interacting with the device. Such an approach can be beneficial for settings such as business meetings where a user might want to obtain information from the device but does not want to appear inconsiderate by picking up and checking information on the device.
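The gaze-gated behavior described in the abstract can be sketched as a simple decision rule: information is shown only while the user's estimated gaze falls within a small angular cone around the device, and hidden otherwise. This is a hypothetical illustration, not the patented implementation; the names (`GAZE_CONE_DEG`, the gaze-angle samples) are assumptions standing in for whatever camera-based gaze estimator a real device would use.

```python
# Sketch of a gaze-gated display: show information only while the
# estimated gaze direction is substantially towards the device.
# GAZE_CONE_DEG is an illustrative tolerance, not a value from the patent.

GAZE_CONE_DEG = 15.0  # gaze within this half-angle counts as "towards" the device


def gaze_toward_device(gaze_angle_deg: float) -> bool:
    """True if the estimated gaze direction is substantially towards the device."""
    return abs(gaze_angle_deg) <= GAZE_CONE_DEG


def display_states(gaze_samples):
    """Map a sequence of gaze angles (degrees off-axis) to show/hide decisions."""
    return ["show" if gaze_toward_device(a) else "hide" for a in gaze_samples]
```

For example, `display_states([2.0, 10.0, 40.0, 5.0])` yields `["show", "show", "hide", "show"]`: the display ceases for the sample where the gaze is substantially away, without any physical interaction.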
25 Claims
1. A method of providing a low distraction interface on an electronic device, comprising:
capturing image information using at least one image capture element of the electronic device;
analyzing, using a processor of the electronic device, the image information to determine a gaze direction of a user with respect to the electronic device;
displaying a determined type of information when the gaze direction of the user is determined to be substantially towards the electronic device, the determined type of information including one or more elongated characters based at least in part on a particular orientation of the user relative to the electronic device;
displaying additional information of the determined type of information when the gaze direction of the user is determined to be substantially towards the electronic device for at least a threshold period of time, wherein displaying the additional information is based, at least in part, on detecting at least one of a head nod of the user, a head shake of the user, a hand gesture of the user, a blink of the user, or a movement of a facial feature of the user other than an eye;
ceasing the display of the determined type of information when the gaze direction of the user is determined to be substantially away from the electronic device, wherein the display of information on the electronic device is operable to be controlled by the user through changes in gaze direction and without a requirement for physical interaction between the user and the electronic device;
storing, at least temporarily, at least a portion of the image information, wherein the image information is captured over time; and
determining that the at least a portion of the image information corresponds to a feature of the user in response to at least one calibration for the electronic device to improve determination of the gaze direction of the user, the at least one calibration being based at least in part upon a previous calibration performed on the electronic device, the electronic device being calibrated based, at least in part, on identifying the feature of the user represented in the at least the portion of the image information and receiving a confirmation that the feature is correctly identified.
- View Dependent Claims (2, 3)
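The control flow of claim 1 can be read as a small state machine: a basic item is displayed while the user looks at the device, additional detail is revealed only after the gaze has dwelt for a threshold period and a non-contact cue (nod, shake, hand gesture, blink, or other facial movement) is detected, and the display is cleared when the gaze moves away. The sketch below is a hypothetical rendering of that flow; the class, the dwell threshold, and the cue names are all assumptions, not the claimed implementation.

```python
# Hypothetical state machine for the display logic of claim 1.
# States: "off" (gaze away), "basic" (gaze toward device),
# "detail" (dwell >= threshold AND a non-contact cue detected).
from typing import Optional

DWELL_THRESHOLD_S = 2.0  # illustrative dwell time before detail may appear
CUES = {"nod", "shake", "hand_gesture", "blink", "brow_raise"}


class LowDistractionDisplay:
    def __init__(self) -> None:
        self.dwell_s = 0.0
        self.state = "off"

    def update(self, gaze_toward: bool, dt_s: float,
               cue: Optional[str] = None) -> str:
        if not gaze_toward:
            # Ceasing the display when the gaze is substantially away.
            self.dwell_s = 0.0
            self.state = "off"
            return self.state
        self.dwell_s += dt_s
        if self.state == "off":
            self.state = "basic"  # gaze returned: show the determined type of information
        if self.dwell_s >= DWELL_THRESHOLD_S and cue in CUES:
            self.state = "detail"  # dwell plus a detected cue reveals additional information
        return self.state
```

A cue alone is not enough: a blink before the dwell threshold keeps the display in the basic state, matching the claim's requirement that additional information follow both the threshold gaze period and a detected gesture.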
4. A method of enabling a user to interface with an electronic device, comprising:
under control of one or more computing systems configured with executable instructions,
capturing one or more images using at least one image capture element;
analyzing the captured one or more images to determine one or more periods during which a gaze direction of at least one user is in a direction at least substantially towards a display element of the electronic device;
displaying information on the display element during the one or more periods when the gaze direction of the at least one user is in a direction at least substantially towards a display element of the electronic device, the displayed information including one or more elongated characters based at least in part on a particular orientation of the at least one user relative to the electronic device;
based at least in part upon a motion or aspect of the user detected in the captured one or more images, selecting the information to be displayed on the display element, wherein the at least one user is able to control the display of information on the display element of the electronic device without physically contacting the device;
displaying additional information on the display element based, at least in part, on detecting at least one of a head nod of the at least one user, a head shake of the at least one user, a hand gesture of the at least one user, a blink of the at least one user, or a movement of a facial feature of the at least one user other than an eye;
storing, at least temporarily, at least a portion of data represented in the one or more images, wherein the one or more images are captured over time; and
determining that the at least a portion of the data corresponds to a feature of the at least one user in response to at least one calibration for the electronic device to improve detection of the motion or aspect of the at least one user based, at least in part, on the at least the portion of the data, the at least one calibration being based at least in part upon a previous calibration performed on the electronic device.
- View Dependent Claims (5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18)
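The calibration limitation shared by these claims can be sketched as follows: the device records feature measurements from captured images, accepts a sample only after the user confirms the feature was correctly identified, and folds the confirmed samples into a correction that refines gaze or motion detection. A previous calibration seeds the correction, so later calibrations build on it rather than starting from scratch. Every name below is hypothetical; this is an illustrative sketch, not the patented calibration procedure.

```python
# Illustrative calibration sketch: confirmed feature samples refine a
# gaze-angle offset that is seeded from a previous calibration.


class GazeCalibrator:
    def __init__(self, previous_offset_deg: float = 0.0) -> None:
        # Seed from a previous calibration performed on the device.
        self.offset_deg = previous_offset_deg
        self._errors = []

    def add_confirmed_sample(self, measured_deg: float, true_deg: float) -> None:
        """Record a sample only after the user confirms the feature was correctly identified."""
        self._errors.append(true_deg - measured_deg)

    def recalibrate(self) -> float:
        """Blend the previous offset with the mean correction from confirmed samples."""
        if self._errors:
            mean_err = sum(self._errors) / len(self._errors)
            self.offset_deg = 0.5 * self.offset_deg + 0.5 * mean_err
        return self.offset_deg

    def corrected(self, measured_deg: float) -> float:
        """Apply the current calibration to a raw gaze measurement."""
        return measured_deg + self.offset_deg
```

With a previous offset of 2.0 degrees and two confirmed samples whose errors are +2.0 and +4.0 degrees, recalibration blends the old offset with the mean error to yield 2.5 degrees.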
19. A computing device, comprising:
a processor;
a display element;
at least one image capture element; and
a memory device including instructions operable to be executed by the processor to perform a set of actions, enabling the processor to:
capture one or more images using the at least one image capture element;
analyze the captured one or more images to determine one or more periods during which at least one user is looking in a direction at least substantially towards the display element of the computing device;
display information on the display element during the one or more periods when the at least one user is determined to be looking in the direction at least substantially towards the display element of the computing device, the displayed information including one or more elongated characters based at least in part on a particular orientation of the at least one user relative to the computing device;
based at least in part upon a motion or aspect of the at least one user detected in the captured one or more images, select the information to be displayed on the display element, wherein the at least one user is able to control the display of information on the display element of the computing device without physically contacting the computing device;
display additional information on the display element based, at least in part, on detecting at least one of a head nod of the at least one user, a head shake of the at least one user, a hand gesture of the at least one user, a blink of the at least one user, or a movement of a facial feature of the at least one user other than an eye;
store, at least temporarily, at least a portion of data represented in the one or more images, wherein the one or more images are captured over time; and
determine that the at least a portion of the data corresponds to a feature of the at least one user in response to at least one calibration for the computing device to improve detection of the motion or aspect of the at least one user based, at least in part, on the at least the portion of the data, the at least one calibration being based at least in part upon a previous calibration performed on the computing device.
- View Dependent Claims (20, 21, 22)
23. A non-transitory computer-readable storage medium storing processor-executable instructions for controlling a computing device, comprising:
program code for capturing one or more images using the at least one image capture element;
program code for analyzing the captured one or more images to determine one or more periods during which at least one user is looking in a direction at least substantially towards the display element of the computing device;
program code for displaying information on the display element during the one or more periods when the at least one user is determined to be looking in the direction at least substantially towards the display element of the computing device, the displayed information including one or more elongated characters based at least in part on a particular orientation of the at least one user relative to the computing device;
program code for, based at least in part upon a motion or aspect of the at least one user detected in the captured one or more images, selecting the information to be displayed on the display element, wherein the at least one user is able to control the display of information on the display element of the computing device without physically contacting the computing device;
program code for displaying additional information on the display element based, at least in part, on detecting at least one of a head nod of the at least one user, a head shake of the at least one user, a hand gesture of the at least one user, a blink of the at least one user, or a movement of a facial feature of the at least one user other than an eye;
program code for storing, at least temporarily, at least a portion of data represented in the one or more images, wherein the one or more images are captured over time; and
program code for determining that the at least a portion of the data corresponds to a feature of the at least one user in response to at least one calibration for the computing device to improve detection of the motion or aspect of the at least one user based, at least in part, on the at least the portion of the data, the at least one calibration being based at least in part upon a previous calibration performed on the computing device.
- View Dependent Claims (24, 25)