MULTI-TOUCH INTERFACE GESTURES FOR KEYBOARD AND/OR MOUSE INPUTS
Abstract
A mouse-and-keyboard based user interface is updated based on gestures made on a touch screen that is displaying the mouse-and-keyboard based user interface. The user interface update process includes the steps of receiving one or more touch events in response to a gesture made on the touch screen, translating the touch events to a mouse-and-keyboard based command, transmitting the mouse-and-keyboard based command to an operating system, and receiving an updated display in response thereto.
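As an illustration of this four-step pipeline, the following TypeScript sketch models receiving a touch event, translating it, transmitting the result, and receiving an updated display. Every identifier here (TouchGesture, MouseKeyboardCommand, translate, handleGesture, sendToOS, UIFrame) is an assumption made for this sketch, not a name from the patent or any disclosed implementation.

```typescript
// Illustrative sketch only; all names are assumptions of this summary,
// not identifiers from the patent or a particular product.

type UIFrame = Uint8Array; // placeholder for the updated display received back

type GestureKind = "tap" | "press-and-hold" | "multi-finger-tap";

interface TouchGesture {
  kind: GestureKind;
  x: number; // touch-screen coordinates of the gesture
  y: number;
}

interface MouseKeyboardCommand {
  type: "left-click" | "right-click" | "show-keyboard";
  x?: number;
  y?: number;
}

// Step 2: translate received touch events to a mouse-and-keyboard based command.
// This mapping is chosen for illustration; claim 1 below, for example, pairs a
// press-and-hold with a tap before emitting a right-click.
function translate(gesture: TouchGesture): MouseKeyboardCommand {
  switch (gesture.kind) {
    case "tap":
      return { type: "left-click", x: gesture.x, y: gesture.y };
    case "press-and-hold":
      return { type: "right-click", x: gesture.x, y: gesture.y };
    case "multi-finger-tap":
      return { type: "show-keyboard" };
  }
}

// Steps 3-4: transmit the command to the operating system and receive an
// updated display in response thereto.
async function handleGesture(
  gesture: TouchGesture,
  sendToOS: (cmd: MouseKeyboardCommand) => Promise<UIFrame>
): Promise<UIFrame> {
  return sendToOS(translate(gesture));
}
```

A gesture recognizer would feed handleGesture once it has classified raw touch events; the classification itself is the subject of the independent claims below.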
Claims
1. A method of interpreting gestures made on a touch screen that is displaying a mouse-and-keyboard based user interface (UI), comprising:
receiving a press-and-hold event in combination with a tap event in response to gestures made on the touch screen;
translating the received events to a mouse right-click command;
transmitting the mouse right-click command to a guest operating system; and
receiving from the guest operating system, in response to the transmitted mouse right-click command, an updated mouse-and-keyboard based UI.
Dependent claims: 2-6.
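A minimal sketch of how claim 1's gesture could be recognized, assuming a generic touch API and an arbitrary 500 ms hold threshold; the claim specifies neither, and all identifiers below are illustrative.

```typescript
// Sketch of the claim 1 gesture: one finger pressed and held while a second
// finger taps is translated to a mouse right-click. Threshold and names are
// assumptions; the claim does not specify an implementation.

const HOLD_THRESHOLD_MS = 500; // assumed hold duration

interface HeldTouch { id: number; downTime: number; x: number; y: number; }

class RightClickRecognizer {
  private held: HeldTouch | null = null;

  constructor(
    private sendToOS: (cmd: { type: "right-click"; x: number; y: number }) => void
  ) {}

  onTouchDown(id: number, x: number, y: number, now: number): void {
    // Track the first finger down as the press-and-hold candidate.
    if (this.held === null) this.held = { id, downTime: now, x, y };
  }

  onTouchUp(id: number, now: number): void {
    if (this.held === null) return;
    if (id === this.held.id) {
      this.held = null; // the held finger lifted; the gesture is over
    } else if (now - this.held.downTime >= HOLD_THRESHOLD_MS) {
      // A second finger tapped while the first was held long enough:
      // emit a right-click at the held location.
      this.sendToOS({ type: "right-click", x: this.held.x, y: this.held.y });
    }
  }
}
```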
7. A method of interpreting gestures made on a touch screen that is displaying a mouse-and-keyboard based user interface (UI), comprising:
receiving a first press-and-hold event for a time period that is longer than a predetermined time period to cause a loupe that provides an enlarged view of a portion of the UI within the loupe to be displayed and, at the end of the time period, to cause a drag handle to be displayed in place of the loupe;
receiving a second press-and-hold event at a location of the drag handle; and
transmitting mouse down commands with locations that correspond to updated locations of the second press-and-hold event.
Dependent claims: 8-12.
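The loupe-and-drag-handle flow of claim 7 could be staged with timers, as in the hedged sketch below; the timing constants and UI hooks (showLoupe, showDragHandle, sendMouseDown) are illustrative assumptions, not patent language.

```typescript
// Sketch of claim 7: a long press shows a loupe, the loupe is later replaced
// by a drag handle, and holding the handle streams mouse-down commands at its
// updated locations. All constants and hook names are assumptions.

const LOUPE_DELAY_MS = 400;   // assumed: press-and-hold time before the loupe appears
const LOUPE_PERIOD_MS = 1200; // assumed: loupe lifetime before the drag handle replaces it

interface SelectionUI {
  showLoupe(x: number, y: number): void;
  hideLoupe(): void;
  showDragHandle(x: number, y: number): void;
}

// First press-and-hold: display the loupe, then swap in the drag handle at
// the end of the time period.
function onFirstPressAndHold(x: number, y: number, ui: SelectionUI): void {
  setTimeout(() => ui.showLoupe(x, y), LOUPE_DELAY_MS);
  setTimeout(() => {
    ui.hideLoupe();
    ui.showDragHandle(x, y);
  }, LOUPE_DELAY_MS + LOUPE_PERIOD_MS);
}

// Second press-and-hold on the drag handle: forward mouse-down commands whose
// locations track the finger, so the OS sees an ordinary selection drag.
function onDragHandleMove(
  x: number,
  y: number,
  sendMouseDown: (x: number, y: number) => void
): void {
  sendMouseDown(x, y);
}
```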
13. A method of interpreting gestures made on a touch screen that is displaying a mouse-and-keyboard based user interface (UI), comprising:
receiving a simultaneous multi-finger tap event to cause a virtual keyboard to be displayed;
determining that the virtual keyboard obscures a portion of the UI on which the multi-finger tap event was made; and
in response to said determining, causing the UI to be panned such that the portion of the UI is visible to the user.
Dependent claim: 14.
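Claim 13's obscuring check reduces to a point-in-rectangle test plus a pan. The sketch below assumes axis-aligned screen rectangles and invented helpers (showKeyboard, panUI), including an assumed sign convention for the pan.

```typescript
// Sketch of claim 13; helper names and the panning convention are assumptions.

interface Rect { x: number; y: number; width: number; height: number; }
interface Point { x: number; y: number; }

function contains(r: Rect, p: Point): boolean {
  return p.x >= r.x && p.x <= r.x + r.width &&
         p.y >= r.y && p.y <= r.y + r.height;
}

function onMultiFingerTap(
  tap: Point,
  keyboardRect: Rect,          // where the virtual keyboard will be drawn
  showKeyboard: () => void,
  panUI: (dy: number) => void  // negative dy scrolls the UI content upward (assumed)
): void {
  showKeyboard();
  // If the keyboard obscures the tapped portion of the UI, pan just enough
  // to bring that portion back above the keyboard's top edge.
  if (contains(keyboardRect, tap)) {
    panUI(keyboardRect.y - tap.y);
  }
}
```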
15. A computing device comprising:
a touch screen displaying a user interface (UI) having a cursor, and a virtual touch pad overlaid on the UI; and
a processing unit programmed to recognize gestures made on the touch screen and execute, in response thereto, a process associated with the gestures, wherein the gestures made on top of the virtual touch pad cause an updated UI with a new position of the cursor to be displayed.
Dependent claims: 16-18.
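Claim 15's virtual touch pad can be read as a relative-motion mapping from pad coordinates to cursor coordinates. The sketch below assumes that convention and a made-up gain factor, neither of which the claim specifies.

```typescript
// Sketch of the claim 15 device: a virtual touch pad overlaid on the UI whose
// gestures reposition the cursor. Relative mapping and the gain are assumptions.

class VirtualTouchPad {
  private lastX = 0;
  private lastY = 0;

  constructor(
    private moveCursor: (dx: number, dy: number) => void, // redraws the UI with the new cursor position
    private sensitivity = 1.5                              // assumed gain factor
  ) {}

  onTouchStart(x: number, y: number): void {
    this.lastX = x;
    this.lastY = y;
  }

  // Finger motion on the pad is translated to relative cursor motion, causing
  // an updated UI with a new cursor position to be displayed.
  onTouchMove(x: number, y: number): void {
    this.moveCursor(
      (x - this.lastX) * this.sensitivity,
      (y - this.lastY) * this.sensitivity
    );
    this.lastX = x;
    this.lastY = y;
  }
}
```

Relative mapping (as on a laptop touch pad) rather than absolute mapping is a design choice of this sketch; the claim requires only that pad gestures yield an updated UI with a new cursor position.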
Specification