Application programming interfaces for gesture operations
Abstract
At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application to provide gesture operations for a display of a device. A method for operating through an application programming interface (API) in this environment includes transferring a scaling transform call. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points.
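The two-point gestures described in the abstract reduce to simple geometry: the zoom factor is the ratio of the current finger spacing to the starting spacing, and the rotation is the change in the angle of the line joining the two touch points. A minimal sketch of that computation (illustrative only; the function name and tuple-based interface are assumptions, not the patented API):

```python
import math

def pinch_transforms(p1_start, p2_start, p1_now, p2_now):
    """Derive scale and rotation from two touch points (illustrative only).

    Each argument is an (x, y) tuple: the starting and current positions
    of two input points touching the display.
    """
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    # Zoom factor: ratio of current finger spacing to starting spacing.
    scale = dist(p1_now, p2_now) / dist(p1_start, p2_start)
    # Rotation: change in the angle of the line joining the two fingers.
    rotation = angle(p1_now, p2_now) - angle(p1_start, p2_start)
    return scale, rotation
```

For example, two fingers that double their spacing without turning yield a scale of 2.0 and a rotation of 0; fingers that keep their spacing but swing the joining line by 90 degrees yield a scale of 1.0 and a rotation of π/2 radians.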
29 Claims
1. A machine readable non-transitory medium storing one or more programs for execution by one or more processing units of a data processing system, the one or more programs including instructions for:
displaying two or more windows of an application;

receiving an event;

determining whether the event is a hand event that is based on a user input that includes one or more input points touching a display of the system or is a system level event that is not a hand event; and

in accordance with a determination that the event is a hand event based on a user input, routing the event to a window that received the user input and routing the event from the window to an appropriate control of the application by calling a mouse function or a gesture function for processing the event, wherein the window is a respective window of the two or more displayed windows of the application.

(Dependent claims 2-9.)
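The routing recited in claim 1 can be sketched as follows. All class and function names here (HandEvent, Window.route, dispatch, and so on) are hypothetical stand-ins, not identifiers from the patent; the sketch also assumes that a single-point hand event maps to the mouse function and a multi-point event to the gesture function:

```python
class Event:
    """Base class for all events delivered to the application."""

class SystemEvent(Event):
    """A system level event that is not a hand event."""

class HandEvent(Event):
    """An event based on one or more input points touching the display."""
    def __init__(self, points, window):
        self.points = points    # list of (x, y) input points
        self.window = window    # the window that received the user input

class Window:
    """A respective window of the application's displayed windows."""
    def __init__(self, control):
        self.control = control
    def route(self, event):
        # Route the event from the window to the appropriate control by
        # calling a mouse function or a gesture function for processing.
        if len(event.points) > 1:
            return self.control.gesture(event)
        return self.control.mouse(event)

def handle_system_event(event):
    # Placeholder path for events that are not hand events.
    return "system"

def dispatch(event):
    """Determine the event type; route hand events to their window."""
    if isinstance(event, HandEvent):
        return event.window.route(event)
    return handle_system_event(event)
```

In use, a one-finger touch delivered to a window reaches that window's control through its mouse function, a two-finger touch reaches it through its gesture function, and any non-hand event bypasses window routing entirely.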
10. A method, comprising:
at a data processing system with a display, one or more processing units, and memory storing one or more programs for execution by the one or more processing units:

displaying two or more windows of an application;

receiving an event;

determining whether the event is a hand event that is based on a user input that includes one or more input points touching the display or a system level event that is not a hand event; and

in accordance with a determination that the event is a hand event based on a user input, routing the event to a window that received the user input and routing the event from the window to an appropriate control of the application by calling a mouse function or a gesture function for processing the event, wherein the window is a respective window of the two or more displayed windows of the application.

(Dependent claims 11-20.)
21. An electronic device comprising:
a display integrated with an input panel;

one or more processing units coupled to the input panel; and

memory storing one or more programs for execution by the one or more processing units, the one or more programs including instructions for:

displaying two or more windows of an application;

receiving an event;

determining whether the event is a hand event that is based on a user input that includes one or more input points touching the display or is a system level event that is not a hand event; and

in accordance with a determination that the event is a hand event based on a user input, routing the event to a window that received the user input and routing the event from the window to an appropriate control of the application by calling a mouse function or a gesture function for processing the event, wherein the window is a respective window of the two or more displayed windows of the application.

(Dependent claims 22-29.)
Specification