Multi-touch object inertia simulation
Abstract
The inertia system provides a common platform and application-programming interface (API) for applications to extend the input received from various multi-touch hardware devices to simulate real-world behavior of application objects. To move naturally, application objects should exhibit physical characteristics such as elasticity and deceleration. When a user lifts all contacts from an object, the inertia system provides additional manipulation events to the application so that the application can handle the events as if the user was still moving the object with touch. The inertia system generates the events based on a simulation of the behavior of the objects. If the user moves an object into another object, the inertia system simulates the boundary characteristics of the objects. Thus, the inertia system provides more realistic movement for application objects manipulated using multi-touch hardware and the API provides a consistent feel to manipulations across applications.
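The deceleration and boundary behavior described in the abstract can be sketched as a simple per-tick simulation: after the user lifts all contacts, the object keeps moving with a decaying velocity, and a collision with a boundary reflects it with some energy loss. This is an illustrative model only, not the patented implementation; the class name, the exponential decay factor, and the clamp-and-reflect boundary rule are all assumptions.

```python
class InertiaSimulator:
    """Illustrative sketch of post-release inertia along one axis:
    decays velocity each tick and reflects off boundaries."""

    def __init__(self, position, velocity, deceleration=0.9,
                 bounds=(0.0, 1000.0), elasticity=0.5, min_speed=0.5):
        self.position = position
        self.velocity = velocity          # units per tick at release
        self.deceleration = deceleration  # multiplicative decay per tick
        self.lo, self.hi = bounds
        self.elasticity = elasticity      # fraction of speed kept on a bounce
        self.min_speed = min_speed        # below this, motion stops

    def step(self):
        """Advance one tick; return the position delta, or None once stopped.
        Each delta plays the role of one simulated manipulation event."""
        if abs(self.velocity) < self.min_speed:
            return None
        old = self.position
        self.position += self.velocity
        # Reflect off a boundary, losing energy according to elasticity.
        if self.position < self.lo or self.position > self.hi:
            self.position = max(self.lo, min(self.hi, self.position))
            self.velocity = -self.velocity * self.elasticity
        self.velocity *= self.deceleration
        return self.position - old
```

An application would consume `step()` once per frame and apply each delta to its object, exactly as it applies deltas from live touch input.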
26 Citations
22 Claims
1. A computing device for handling inputs from multi-input hardware, the multi-input hardware allowing a user to input the inputs by manipulating one or more physical input objects sensed by the multi-input hardware, the system comprising:

- a module available for manipulation processing of inputs for arbitrary applications running on the computing device, wherein the inputs are passed to the module, the inputs having been inputted by the multi-input hardware sensing changing locations of a physical input object being moved by the user, wherein the module translates the inputs to geometric manipulations that are received by an application associated with the inputs, the application displaying a graphic object on the display by applying the manipulations to the graphic object to change a location and/or an appearance of the graphic object;
- when the multi-input hardware stops providing the inputs corresponding to the physical input object, the module stops generating the manipulations and generates simulated geometric manipulations based on the inputs.

(Dependent claims: 2-9)
10. A device for handling touch input from multi-touch hardware, the device comprising:

- a hardware interface that, when the computer system is operating, will communicate with the multi-touch hardware to receive touch contact information and movements of contacts;
- one or more manipulation processors configured to manage interpretation of movement of each contact associated with a particular application object;
- an input transformation component configured to interpret a meaning of received movements of various contacts to produce manipulations of application objects, wherein the application applies the manipulations to the application object to move the application object;
- a simulation component configured to simulate continued movement of the application object after a user stops touching the object; and
- an application interface configured to communicate with the application to receive contact movement information and provide manipulation transforms to the application.

(Dependent claims: 11-15)
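One plausible reading of the manipulation processor recited above is a component that folds per-contact movements into a single transform for the associated application object. The sketch below is hypothetical: the class, method names, and the averaging rule are illustrative, and `contact_up` reporting a full release stands in for the hand-off to the simulation component.

```python
class ManipulationProcessor:
    """Hypothetical sketch of a per-object manipulation processor:
    contact movements in, translation manipulations out."""

    def __init__(self):
        self.contacts = {}  # contact id -> last known (x, y)

    def contact_down(self, cid, x, y):
        """Register a new contact at its initial location."""
        self.contacts[cid] = (x, y)

    def contact_move(self, cid, x, y):
        """Interpret one contact's movement as a translation of the
        object, averaged over all currently active contacts."""
        px, py = self.contacts[cid]
        self.contacts[cid] = (x, y)
        n = len(self.contacts)
        return ((x - px) / n, (y - py) / n)

    def contact_up(self, cid):
        """Remove a contact; report whether the user has fully released
        the object (the cue to start the inertia simulation)."""
        del self.contacts[cid]
        return not self.contacts
```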
16. A method performed by a computing device, the method comprising:

- executing a manipulation system available to process inputs for arbitrary applications on the computing device, each input comprising a location and a corresponding input time captured by an input device capable of concurrently sensing locations of two or more physical objects, the manipulation system interpreting the inputs to generate manipulations that are received and handled by the applications, each manipulation comprising a geometric transformation and/or translation, wherein one of the applications applies manipulations received thereby to a graphic object managed by the application to graphically transform and/or translate the graphic object; and
- determining that input from a physical object has ended, and in response the manipulation system, according to inputs of the physical object previously received by the manipulation system before the determining, automatically generating simulated manipulations that are received by the one of the applications, the simulated manipulations comprising respective geometric transformations and/or translations that are applied by the application to the graphic object to simulate continuation of the transformation and/or translation of the graphic object.

(Dependent claims: 17-20)
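Because each input in this claim carries both a location and an input time, the simulated manipulations can be seeded from a velocity estimated over the inputs received before the determination that input ended. A minimal sketch, assuming timestamped `(t, x, y)` samples and a plain finite difference; the function name and sampling scheme are illustrative, not the claimed method.

```python
def estimate_release_velocity(samples):
    """Estimate (vx, vy) in location units per time unit from recent
    (t, x, y) input samples, using a first/last finite difference.
    Illustrative only; a real system might fit or low-pass filter."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dt = t1 - t0
    if dt <= 0:
        return (0.0, 0.0)  # degenerate sample window: no motion inferred
    return ((x1 - x0) / dt, (y1 - y0) / dt)
```

The resulting velocity would initialize the inertia simulation that produces the simulated geometric translations delivered to the application.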
21. A method performed by a computing device comprising storage storing an operating system, a processor, a display, and a touch input surface, the method comprising:

- executing an application displaying a user interface element, wherein a user of the computing device moves a finger touching the touch input surface to interactively move the user interface element on the display, the moving implemented by the operating system passing touch inputs to the application and the application using the touch inputs to obtain computed manipulations that the application applies to the user interface element to cause the displayed movement thereof;
- responsive to determining that the user has stopped touching the touch input surface with the finger, notifying an inertia engine to simulate inertial movement according to at least some of the touch inputs; and
- after the finger has stopped touching the touch input surface, receiving, by the application, simulated movements, and applying, by the application, the simulated movements to the user interface element to simulate inertial movement of the user interface object.

(Dependent claim: 22)
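The application-side flow of this claim, in which live manipulations and post-release simulated movements travel through the same apply path, can be sketched as follows. Both `apply_movement` and the toy `SimpleInertia` engine are hypothetical stand-ins for the recited inertia engine; the decay constants are arbitrary.

```python
def apply_movement(element, dx, dy):
    """The application's single movement handler: it applies a delta to
    the element whether the delta came from touch or from simulation."""
    element["x"] += dx
    element["y"] += dy
    return element


class SimpleInertia:
    """Toy inertia engine: replays a decaying copy of the last live delta
    after the finger lifts. Illustrative only."""

    def __init__(self, dx, dy, decay=0.8, ticks=5):
        self.dx, self.dy = dx, dy
        self.decay, self.ticks = decay, ticks

    def deltas(self):
        """Yield the simulated movements, each smaller than the last."""
        dx, dy = self.dx, self.dy
        for _ in range(self.ticks):
            dx *= self.decay
            dy *= self.decay
            yield (dx, dy)
```

The point of the sketch is that the element keeps gliding past where the finger stopped, with no branch in the application's apply logic.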
Specification