Detecting user focus on hinged multi-screen device
First Claim
1. A mobile computing device comprising:
a processor;
an accelerometer;
two or more display devices; and
a housing including the processor, the accelerometer, and the two or more display devices, the housing including a hinge between a pair of display devices of the two or more display devices, the hinge being configured to permit the pair of display devices to rotate between angular orientations from a face-to-face angular orientation to a back-to-back angular orientation;
the processor being configured to:
detect a current angular orientation between the pair of display devices indicating that the pair of display devices are facing away from each other;
detect a first touch input via a capacitive touch sensor of a first display device of the pair of display devices, a second touch input via a capacitive touch sensor of a second display device of the pair of display devices, and a third touch input via a capacitive touch sensor of a side of the mobile computing device;
determine that the first touch input is an intended touch input and the second touch input is an incidental touch input;
determine an orientation of the mobile computing device based on at least the first touch input, the second touch input, and the third touch input; and
determine which display device of the two or more display devices has a current user focus based on at least the determined orientation of the mobile computing device; and
perform a predetermined action based on the current user focus.
Abstract
A mobile computing device is provided that includes a processor, an accelerometer, two or more display devices, and a housing including the processor, the accelerometer, and the two or more display devices. The processor is configured to determine a current user focus indicating that a first display device of the pair of display devices is being viewed by the user and that a second display device of the pair of display devices is not being viewed by the user, detect a signature gesture input based on accelerometer data, received via the accelerometer, indicating that the mobile computing device has been rotated more than a threshold degree, determine that the current user focus has changed from the first display device to the second display device based on at least detecting the signature gesture input, and perform a predetermined action based on the current user focus.
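The signature-gesture flow in the abstract can be sketched as integrating the rotation rate and swapping the focused display once the accumulated rotation crosses a threshold. This is a minimal illustration, not the patent's implementation; the threshold value, sample period, and function names are all assumptions.

```python
# Hypothetical sketch of the abstract's signature-gesture detection:
# integrate rotation samples and flip focus once a "flip" threshold is
# exceeded. ROTATION_THRESHOLD_DEG is an assumed value, not from the patent.

ROTATION_THRESHOLD_DEG = 150.0  # assumed rotation needed to count as a flip

def detect_flip(angular_rate_deg_s, dt_s, threshold=ROTATION_THRESHOLD_DEG):
    """Return True once the integrated rotation exceeds the threshold."""
    total_deg = 0.0
    for rate in angular_rate_deg_s:
        total_deg += abs(rate) * dt_s  # accumulate |rate| * time step
        if total_deg > threshold:
            return True
    return False

def update_focus(current_focus, angular_rate_deg_s, dt_s=0.01):
    """Swap focus between display 1 and display 2 on a detected flip."""
    if detect_flip(angular_rate_deg_s, dt_s):
        return 2 if current_focus == 1 else 1
    return current_focus
```

In practice the rotation estimate would come from fused accelerometer (and possibly gyroscope) data rather than a raw rate list, but the thresholding step is the same.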
16 Claims
1. A mobile computing device comprising:
a processor;
an accelerometer;
two or more display devices; and
a housing including the processor, the accelerometer, and the two or more display devices, the housing including a hinge between a pair of display devices of the two or more display devices, the hinge being configured to permit the pair of display devices to rotate between angular orientations from a face-to-face angular orientation to a back-to-back angular orientation;
the processor being configured to:
detect a current angular orientation between the pair of display devices indicating that the pair of display devices are facing away from each other;
detect a first touch input via a capacitive touch sensor of a first display device of the pair of display devices, a second touch input via a capacitive touch sensor of a second display device of the pair of display devices, and a third touch input via a capacitive touch sensor of a side of the mobile computing device;
determine that the first touch input is an intended touch input and the second touch input is an incidental touch input;
determine an orientation of the mobile computing device based on at least the first touch input, the second touch input, and the third touch input; and
determine which display device of the two or more display devices has a current user focus based on at least the determined orientation of the mobile computing device; and
perform a predetermined action based on the current user focus.
View Dependent Claims (2, 3, 4, 5, 6, 7)
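The intended-versus-incidental touch logic of claim 1 can be illustrated with a simple heuristic: a large contact area looks like a palm gripping the back display, while a small one looks like a deliberate fingertip tap, and side touches corroborate a grip. The threshold values and function names below are assumptions for illustration, not the patent's method.

```python
# Illustrative sketch of claim 1's touch classification. The area threshold
# and grip heuristic are assumptions; the patent does not specify them.

GRIP_AREA_CM2 = 2.0  # assumed: contact areas above this look like a palm

def classify_touch(contact_area_cm2, grip_threshold=GRIP_AREA_CM2):
    """Heuristic: a large contact area is an incidental grip touch."""
    return "incidental" if contact_area_cm2 > grip_threshold else "intended"

def determine_focus(first_area_cm2, second_area_cm2, side_touch_count):
    """Pick the display receiving the intended touch; side touches on the
    device edge (third touch input) corroborate that the device is gripped."""
    first = classify_touch(first_area_cm2)
    second = classify_touch(second_area_cm2)
    gripped = side_touch_count >= 2  # fingers wrapped around the edge
    if first == "intended" and (second == "incidental" or gripped):
        return "first_display"
    if second == "intended" and (first == "incidental" or gripped):
        return "second_display"
    return "unknown"  # ambiguous; fall back to other orientation cues
```

A real implementation would also fold in the accelerometer-derived orientation, as the claim requires, before committing to a focus decision.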
8. A mobile computing device comprising:
a processor;
an accelerometer;
two or more display devices; and
a housing including the processor, the accelerometer, and the two or more display devices, the housing including a hinge between a pair of display devices of the two or more display devices, the hinge being configured to permit the pair of display devices to rotate between angular orientations from a face-to-face angular orientation to a back-to-back angular orientation;
the processor being configured to:
detect a current angular orientation between the pair of display devices indicating that the pair of display devices are facing away from each other;
detect a first ambient light level for a surrounding in front of a first display device of the pair of display devices via a first ambient light sensor and a second ambient light level for a surrounding in front of a second display device of the pair of display devices via a second ambient light sensor;
determine that the first ambient light level is above a threshold light level and that the second ambient light level is below the threshold light level;
based on at least determining that the first ambient light level is above the threshold light level, determine that the first display device has the current user focus indicating that the first display device of the pair of display devices is being viewed by the user; and
based on at least determining that the second ambient light level is below the threshold light level, determine that the second display device does not have the current user focus indicating that the second display device of the pair of display devices is not being viewed by the user; and
perform a predetermined action based on the current user focus.
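Claim 8's ambient-light test reduces to comparing two lux readings against a single threshold: a display facing the user (or the room) sees light, while one facing away, for example pressed against a palm or a table, sees almost none. The threshold value below is an assumption; the claim leaves it unspecified.

```python
# Sketch of claim 8's ambient-light focus test. LUX_THRESHOLD is an assumed
# value; the patent only requires "a threshold light level".

LUX_THRESHOLD = 10.0  # assumed: a covered or face-down screen reads near zero

def focus_from_ambient_light(first_lux, second_lux, threshold=LUX_THRESHOLD):
    """Return which display is viewed: the lit one has focus, the dark one
    does not, per the claim's above/below-threshold determination."""
    first_viewed = first_lux > threshold
    second_viewed = second_lux > threshold
    if first_viewed and not second_viewed:
        return "first_display"
    if second_viewed and not first_viewed:
        return "second_display"
    return "ambiguous"  # both lit or both dark; other sensors must decide
```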
9. A mobile computing device comprising:
a processor;
an accelerometer;
two or more display devices; and
a housing including the processor, the accelerometer, and the two or more display devices, the housing including a hinge between a pair of display devices of the two or more display devices, the hinge being configured to permit the pair of display devices to rotate between angular orientations from a face-to-face angular orientation to a back-to-back angular orientation;
the processor being configured to:
detect a current angular orientation between the pair of display devices indicating that the pair of display devices are facing away from each other;
detect a first depth image of a surrounding in front of a first display device of the pair of display devices via a first depth camera and a second depth image of a surrounding in front of a second display device of the pair of display devices via a second depth camera;
determine that the first depth image includes depth values within a depth threshold and that the second depth image does not include depth values within the depth threshold;
based on at least determining that the first depth image includes depth values within the depth threshold, determine that the first display device has a current user focus indicating that the first display device of the pair of display devices is being viewed by the user; and
based on at least determining that the second depth image does not include depth values within the depth threshold, determine that the second display device does not have the current user focus indicating that the second display device of the pair of display devices is not being viewed by the user; and
perform a predetermined action based on the current user focus.
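Claim 9's depth-camera test can be sketched as scanning each depth image for any pixel inside a near-field band where a viewer's face would plausibly sit. The band limits below are assumptions; the claim only requires depth values "within a depth threshold".

```python
# Sketch of claim 9's depth-image focus test. The depth band is an assumed
# range for a viewer's face; the patent leaves the threshold unspecified.

DEPTH_MIN_M, DEPTH_MAX_M = 0.2, 1.0  # assumed band where a viewer would be

def has_viewer(depth_image_m, lo=DEPTH_MIN_M, hi=DEPTH_MAX_M):
    """True if any depth pixel (in meters) falls inside the threshold band."""
    return any(lo <= d <= hi for row in depth_image_m for d in row)

def focus_from_depth(first_image, second_image):
    """The display whose depth camera sees something in range has focus."""
    first, second = has_viewer(first_image), has_viewer(second_image)
    if first and not second:
        return "first_display"
    if second and not first:
        return "second_display"
    return "ambiguous"  # neither or both cameras see a near object
```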
10. A computer-implemented method comprising:
detecting a current angular orientation between a pair of display devices of a mobile computing device, indicating that the pair of display devices are facing away from each other;
detecting a first touch input via a capacitive touch sensor of a first display device of the pair of display devices, a second touch input via a capacitive touch sensor of a second display device of the pair of display devices, and a third touch input via a capacitive touch sensor of a side of the mobile computing device;
determining that the first touch input is an intended touch input and the second touch input is an incidental touch input;
determining an orientation of the mobile computing device based on at least the first touch input, the second touch input, and the third touch input; and
determining which display device of the two or more display devices has a current user focus based on at least the determined orientation of the mobile computing device; and
performing a predetermined action based on the current user focus.
View Dependent Claims (11, 12, 13, 14, 15)
16. A mobile computing device comprising:
a processor;
two or more sensor devices including a first sensor and a second sensor, the first sensor consuming less power than the second sensor;
two or more display devices; and
a housing including the processor, the two or more sensor devices, and the two or more display devices, the housing including a hinge between a pair of display devices of the two or more display devices, the hinge being configured to permit the pair of display devices to rotate between angular orientations from a face-to-face angular orientation to a back-to-back angular orientation;
the processor being configured to:
detect a current angular orientation between the pair of display devices indicating that the pair of display devices are facing away from each other;
determine a current user focus indicating that a first display device of the pair of display devices is being viewed by the user, and that a second display device of the pair of display devices is not being viewed by the user, where to determine the current user focus, the processor is configured to:
receive a first signal from the first sensor and determine a candidate user focus based on the first signal at a first computed confidence level;
determine that the first computed confidence level does not exceed a predetermined threshold, and in response to this determination, power up the second sensor;
receive a second signal from the second sensor;
determine the candidate user focus based on the second signal and the first signal at a second computed confidence level; and
determine whether the second computed confidence level exceeds the predetermined threshold, and if so, determine that the current user focus is the candidate user focus; and
reduce a power usage of a display device that does not have the current user focus.
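The staged sensing of claim 16 — consult the low-power sensor first, and wake the higher-power sensor only when confidence is too low — can be sketched as follows. The confidence threshold, the sensor-reader interface, and the fusion rule are all assumptions made for illustration; the claim specifies only that both signals inform the second confidence level.

```python
# Sketch of claim 16's staged, power-aware focus determination. The reader
# callables, threshold value, and max-based fusion are assumptions.

CONFIDENCE_THRESHOLD = 0.8  # assumed "predetermined threshold"

def determine_focus(read_cheap, read_expensive,
                    threshold=CONFIDENCE_THRESHOLD):
    """Each reader returns (candidate_focus, confidence in [0, 1]).
    read_expensive is only called when the cheap sensor is inconclusive,
    so the higher-power sensor stays off in the common case."""
    candidate, confidence = read_cheap()
    if confidence > threshold:
        return candidate  # cheap sensor sufficed; second sensor never woken
    # Confidence too low: power up the second sensor and re-evaluate using
    # both signals (here, naively, by keeping the stronger candidate).
    candidate2, confidence2 = read_expensive()
    fused_confidence = max(confidence, confidence2)
    if fused_confidence > threshold:
        return candidate2 if confidence2 >= confidence else candidate
    return None  # still uncertain; leave the current focus unchanged
```

Once a focus is returned, the device would dim or sleep the unfocused display, which is the power saving the claim ultimately targets.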
Specification