Gesture controls for multi-screen user interface
Abstract
Method and apparatus for controlling a computing device using gesture inputs. The computing device may be a handheld computing device with multiple displays. The displays may be capable of displaying a graphical user interface (GUI). The GUI may be a multi-screen GUI or a single-screen GUI, such that receipt of gesture inputs may result in the movement of a GUI from one display to another display or may result in maximization of a multi-screen GUI across multiple displays.
13 Claims
1. A method for controlling a plurality of displays of a handheld computing device, comprising:
providing a hand held device having two displays, wherein the displays are physically attached but movable with respect to each other, wherein when the hand held device is folded, the two displays face in substantially opposite directions and, when the hand held device is open, the displays face in a substantially same direction, wherein the first display includes a first gesture sensor and a first touch sensitive display within a boundary of a first bezel, and wherein the second display includes a second gesture sensor and a second touch sensitive display within a boundary of a second bezel, wherein the first and second gesture sensors accept touch input but do not display information on the first or second displays;

executing an application on said handheld computing device such that a screen of said application is displayed in the first touch sensitive display;

receiving a first gesture input at the first or second gesture sensor of said handheld computing device, wherein the first and second gesture sensors are physically separate from the first and second touch sensitive displays;

modifying the first and second touch sensitive displays in response to the first gesture input such that the screen is displayed in at least a second touch sensitive display of the plurality of touch sensitive displays, wherein if the first gesture input were received in the first or second touch sensitive display, a different response would occur, and wherein at least one of the following is true:

(a) the first gesture input is directional, the second touch sensitive display is located in a direction with respect to the first touch sensitive display corresponding to the direction of the first gesture, the screen occupies a single touch sensitive display and said modifying includes moving the screen from the first touch sensitive display to the second touch sensitive display, and the modifying includes revealing an underlying screen in the first touch sensitive display and obscuring another screen in the second touch sensitive display;

(b) the first gesture input is directional, the second touch sensitive display is located in a direction with respect to the first touch sensitive display corresponding to the direction of the first gesture, the application is expandable to occupy the first and second touch sensitive displays and the modifying includes expanding the application such that one or more screens corresponding to the application occupy both the first touch sensitive display and the second touch sensitive display, and the expanding includes scaling the application to occupy the first and second touch sensitive displays; or

(c) the modifying includes constrained movement of the screen linearly between the first touch sensitive display and the second touch sensitive display.

Dependent claims: 2, 3, 4, 5, 6.
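Option (a) of claim 1 amounts to a stack-transfer operation between two display surfaces: popping the visible screen off the source display (revealing what was under it) and pushing it onto the destination display (obscuring what was there). A minimal sketch of that logic, assuming hypothetical names (`Display`, `move_screen`) that are not from the patent:

```python
# Illustrative sketch of claim 1, option (a): moving a screen between
# two displays in response to a directional gesture. All names here
# are hypothetical, not taken from the patent.

class Display:
    """A touch-sensitive display holding a stack of application screens."""
    def __init__(self, name):
        self.name = name
        self.stack = []  # top of the stack is the visible screen

    def visible(self):
        return self.stack[-1] if self.stack else None

def move_screen(src, dst):
    """Move the visible screen from src to dst.

    Popping reveals the underlying screen on src; pushing obscures
    whatever was previously visible on dst."""
    screen = src.stack.pop()
    dst.stack.append(screen)

# Usage: a rightward drag on the (off-display) gesture sensor
# moves the "mail" screen from the left display to the right one.
left, right = Display("left"), Display("right")
left.stack = ["desktop", "mail"]
right.stack = ["browser"]
move_screen(left, right)
assert left.visible() == "desktop"   # underlying screen revealed
assert right.visible() == "mail"     # previous screen obscured
```

Option (b) would instead push one screen of the expanded application onto each display, and option (c) would animate the same transfer as a constrained linear movement between the two displays.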
7. A method for controlling a plurality of displays of a handheld computing device, comprising:
executing an application on a handheld computing device such that one or more screens of said application are displayed in a first touch sensitive display and a second touch sensitive display of said handheld computing device, wherein the hand held computing device has two touch sensitive displays including the first and second touch sensitive displays, wherein the touch sensitive displays are physically attached but movable with respect to each other, wherein when the hand held device is folded, the two touch sensitive displays face in substantially opposite directions and, when the hand held device is open, the touch sensitive displays face in a substantially same direction, wherein the first touch sensitive display includes a first gesture sensor and a touch sensitive display, within a boundary of a first bezel, and wherein the second touch sensitive display includes a second gesture sensor and a touch sensitive display, within a boundary of a second bezel, wherein the first and second gesture sensors accept touch input but do not display information on the first or second touch sensitive displays;

receiving a gesture input at the first or second gesture sensor of said handheld computing device, wherein the first and second gesture sensors are physically separate from the first and second touch sensitive displays; and

modifying the first and second touch sensitive displays in response to the gesture input such that the application is displayed in the first touch sensitive display of the handheld computing device and not in the second touch sensitive display, wherein if the gesture input were received in the first or second touch sensitive display, a different response would occur, wherein the gesture input is directional and the first touch sensitive display corresponds to the direction of the gesture, and wherein at least one of the following is true:

(a) said modifying includes hiding a node screen displayed in the second touch sensitive display prior to the modifying; or

(b) said modifying includes reducing the visible area of the application.

Dependent claims: 8, 9.
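Claim 7 is the inverse of claim 1's expansion: an application spanning both displays collapses into the single display indicated by the gesture's direction, hiding the screen on the other display (option (a)) and thereby reducing the application's visible area (option (b)). A minimal sketch, with hypothetical names (`minimize`, the `displays` dict) that are illustrative only:

```python
# Illustrative sketch of claim 7: collapsing a two-display application
# into the display matching the gesture direction. Names are
# hypothetical, not from the patent.

def minimize(displays, direction):
    """Collapse an app shown on both displays into one display.

    The display matching `direction` keeps its screen; the screen on
    the other display is hidden (popped) and returned, reducing the
    application's visible area to a single display."""
    keep, drop = ("first", "second") if direction == "left" else ("second", "first")
    hidden = displays[drop].pop()  # option (a): hide the node screen
    return hidden

# Usage: a leftward gesture on the gesture sensor collapses the app
# into the first (left) display.
displays = {"first": ["app:left-pane"], "second": ["app:right-pane"]}
hidden = minimize(displays, "left")
assert hidden == "app:right-pane"
assert displays == {"first": ["app:left-pane"], "second": []}
```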
10. A non-transitory computer readable medium that causes a processor to execute a method for controlling a plurality of displays of a handheld computing device, comprising:
providing a hand held computing device with a unitary display having two portions, wherein the two portions are logically separate, wherein a first portion is a first touch sensitive display and is associated with a first gesture sensor, and wherein a second portion is a second touch sensitive display and is associated with a second gesture sensor;

executing an application on said handheld computing device such that a screen of said application is displayed in the first touch sensitive display;

receiving a first gesture input at the first gesture sensor or the second gesture sensor of said handheld computing device, wherein the first and second gesture sensors are physically separate from the first and second touch sensitive displays but located on the display within a boundary of a bezel defining the unitary display, and wherein the first and second gesture sensors accept touch input but do not display information on the first or second touch sensitive displays;

in response to the first gesture input, modifying the first and second touch sensitive displays such that the screen is displayed in at least the second touch sensitive display, wherein if the first gesture input were received in the first touch sensitive display, a different response would occur.

Dependent claims: 11, 12, 13.
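A recurring limitation across all three independent claims is that the same gesture produces a *different response* depending on whether it lands on a dedicated gesture sensor or inside a touch sensitive display. That is, the input handler dispatches on the region of origin before interpreting the gesture. A minimal sketch of such dispatch for claim 10's two logical displays, assuming hypothetical region names (`sensor1`, `display1`) and a simple list-based display state:

```python
# Illustrative sketch of region-based gesture dispatch: the same
# directional drag moves a screen between the two logical displays
# when it originates on a gesture sensor, but produces a different
# response (here, scrolling) when it originates inside a display.
# All region and state names are hypothetical.

def handle_input(region, direction, state):
    """Route a directional drag based on where it originated."""
    if region.startswith("sensor"):
        # Gesture-sensor strip: move the visible screen between the
        # two logical displays in the drag's direction.
        if direction == "right" and state["d1"]:
            state["d2"].append(state["d1"].pop())
        elif direction == "left" and state["d2"]:
            state["d1"].append(state["d2"].pop())
        return "moved"
    # Same drag inside a touch sensitive display: different response.
    return "scrolled"

# Usage: the identical rightward drag behaves differently by region.
state = {"d1": ["app"], "d2": []}
assert handle_input("sensor1", "right", state) == "moved"
assert state["d2"] == ["app"]
assert handle_input("display1", "right", state) == "scrolled"
```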
Specification