Mobile device off-screen gesture area
Abstract
Methods and devices for receiving input and presenting a user interface on a device with two screens and an off-screen gesture area. The device may have an off-screen gesture area that accepts user input outside the display area. User interface inputs received in the off-screen gesture area may receive special handling and cause different display changes. Further, the device, having two screens, may receive user interface inputs that cross the seam between the two displays. To provide a display that acts like a single display area, the device can predict that motions may cross the seam and then interrelate separate inputs on separate screens. The interrelated inputs can cause display changes as if the inputs had been received as a single user interaction.
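The seam-crossing prediction described in the abstract could be sketched as a simple velocity extrapolation from recent touch samples. This is a minimal illustration only; the names, the lookahead horizon, and the assumption that the seam lies to the right of the first screen are all hypothetical and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float  # horizontal position in device coordinates
    y: float  # vertical position in device coordinates
    t: float  # timestamp in seconds

def predict_seam_crossing(samples, seam_x, horizon=0.15):
    """Extrapolate the gesture's most recent velocity to predict
    whether it will reach the seam within `horizon` seconds.

    Assumes the seam is at x == seam_x, to the right of the gesture.
    """
    if len(samples) < 2:
        return False  # not enough history to estimate velocity
    a, b = samples[-2], samples[-1]
    dt = b.t - a.t
    if dt <= 0:
        return False  # malformed timestamps; make no prediction
    vx = (b.x - a.x) / dt
    if vx <= 0:
        return False  # moving away from (or parallel to) the seam
    projected_x = b.x + vx * horizon
    return projected_x >= seam_x
```

A rightward swipe sampled at 100 px then 140 px over 20 ms projects well past a seam at 300 px and is flagged as a likely crossing, while the mirror-image leftward swipe is not.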
20 Claims
1. A method, comprising:

providing a device having at least first and second screens with a seam located between the first and the second screens, wherein the device includes an off-screen gesture area;

receiving a first user interface input in the off-screen gesture area that triggers a mode change in the device to select an object on a display area of one of the first and second screens, the display area being separate from the off-screen gesture area;

receiving a second user interface input across the seam in a second off-screen gesture area on the second screen;

determining whether the first and second user interface inputs should be considered as a single user interaction or two separate user interactions based at least on a prediction that the first user interface input will continue across the seam onto the second screen;

determining an application mode of the device; and

in response to the first and second user interface inputs, and the determined application mode, changing a display of the object in one or more of the first and second screens.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10.
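The decision flow of claim 1 — deciding whether the two inputs form a single interaction, then changing the object's display according to the application mode — might look like the following sketch. All names, mode strings, thresholds, and display effects here are hypothetical illustrations, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class Input:
    screen: int  # 1 or 2: which screen's off-screen gesture area
    y: float     # vertical position near the seam
    t: float     # timestamp in seconds

def classify(first: Input, second: Input, crossing_predicted: bool,
             max_gap: float = 0.25, max_dy: float = 40.0) -> str:
    """Decide whether two off-screen-area inputs are one interaction.

    Combines the seam-crossing prediction with timing and vertical
    proximity; the thresholds are illustrative only.
    """
    gap = second.t - first.t
    if (crossing_predicted
            and 0 <= gap <= max_gap
            and abs(second.y - first.y) <= max_dy):
        return "single"
    return "separate"

def update_display(interaction: str, app_mode: str, obj: dict) -> dict:
    """Change the selected object's display based on the interaction
    classification and the determined application mode."""
    if interaction == "single" and app_mode == "dual_screen":
        obj["spans_seam"] = True  # e.g. stretch the object across both screens
    elif interaction == "single":
        obj["screen"] = 2         # e.g. move the object to the second screen
    # two separate interactions leave the object's display unchanged
    return obj
```

For example, an input ending on screen 1 followed 100 ms later by one starting on screen 2 at a similar height, with a positive crossing prediction, classifies as "single" and updates the object; widely separated inputs classify as "separate" and leave it alone.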
11. A device, comprising:

a first screen;

a second screen;

a seam located between the first and the second screens;

an off-screen gesture area associated with the first screen;

a second off-screen gesture area associated with the second screen;

a memory; and

a processor in communication with the memory, the first screen, and the second screen, the processor operable to:

receive a first user interface input in the off-screen gesture area that triggers a mode change in the device to select an object on a display area of one of the first and second screens, the display area being separate from the off-screen gesture area;

receive a second user interface input across the seam in the second off-screen gesture area;

determine whether the first and second user interface inputs should be considered to be a single user interaction or two separate user interactions based at least on a prediction that the first user interface input will continue across the seam onto the second screen;

determine an application mode of the device; and

in response to the first and second user interface inputs, and the determined application mode, change a display of the object in one or more of the first screen or second screen.

Dependent claims: 12, 13, 14, 15.
16. A non-transitory computer readable medium having stored thereon computer-executable instructions, the computer-executable instructions causing a processor of a multi-screen device with a seam between a first screen and a second screen to execute a method for providing a user interface, the computer-executable instructions comprising:

instructions to receive a first user interface input in an off-screen gesture area of the first screen that triggers a mode change in the device to select an object on a display area of one of the first and second screens, the display area being separate from the off-screen gesture area;

instructions to receive a second user interface input across the seam in a second off-screen gesture area on the second screen;

instructions to determine whether the first user interface input and the second user interface input should be considered as a single user interaction or two separate user interactions based at least on a prediction that the first user interface input will continue across the seam onto the second screen;

instructions to determine an application mode of the device; and

in response to the first and second user interface inputs, and the determined application mode, instructions to change a display of the object in one or more of the first screen and second screen.

Dependent claims: 17, 18, 19, 20.
Specification