Instantiable gesture objects
First Claim
1. A method comprising:
instantiating a gesture object for an application to handle gesture recognition for the application through native gesture functionality provided by a computing device;
associating the gesture object with interaction inputs and a target element specified by the application such that the interaction inputs directed to the target element are offloaded to the gesture object configured for the application, the target element representing a selectable element rendered by the computing device;
creating a recognizer on behalf of the application to facilitate gesture recognition through the native gesture functionality provided by the computing device;
feeding interaction input data for the interaction inputs to the recognizer to enable recognition of gestures corresponding to the application based on the interaction input data;
obtaining gesture event messages from the recognizer that are indicative of recognized gestures for the application;
processing raw gesture data described by the gesture event messages on behalf of the application using the gesture object; and
firing gesture events having processed gesture data to the associated target element in accordance with a content model for the application such that the recognized gestures conveyed via the gesture event messages are applied to the target element.
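The pipeline recited in claim 1 can be sketched in a dynamic scripting language. Everything below is an illustrative toy, assuming invented names (`GestureObject`, `Recognizer`, a stand-in target element) and a deliberately trivial recognition rule; none of it is taken from the patent or from any shipping API.

```javascript
// Toy recognizer: stands in for the native gesture functionality.
class Recognizer {
  constructor(onGesture) { this.onGesture = onGesture; this.points = []; }
  feed(point) {
    this.points.push(point);
    // Trivial recognition rule for illustration: rightward motion => "pan".
    if (this.points.length >= 2) {
      const dx = point.x - this.points[0].x;
      if (dx > 0) this.onGesture({ type: "pan", rawDx: dx });
    }
  }
}

class GestureObject {
  constructor(target, pointerIds) {
    this.target = target;          // target element the gestures apply to
    this.pointerIds = pointerIds;  // interaction inputs to consider
    // Create a recognizer on behalf of the application.
    this.recognizer = new Recognizer((msg) => this.process(msg));
  }
  feedInput(point) {
    // Only inputs associated with this gesture object are offloaded to it.
    if (this.pointerIds.includes(point.pointerId)) this.recognizer.feed(point);
  }
  process(msg) {
    // Process raw gesture data into a form the target element's content
    // model expects, then fire the event to the target.
    const processed = { type: msg.type, translationX: msg.rawDx };
    this.target.listeners.forEach((fn) => fn(processed));
  }
}

// Hypothetical target element with a minimal content model.
const element = { listeners: [], x: 0 };
element.listeners.push((ev) => { element.x += ev.translationX; });

const gesture = new GestureObject(element, [1]);
gesture.feedInput({ pointerId: 1, x: 10, y: 0 });
gesture.feedInput({ pointerId: 1, x: 30, y: 0 });
console.log(element.x); // element moved by the recognized pan distance
```

The application only touches the gesture object; recognizer creation, input feeding, and raw-data processing are offloaded to it, mirroring the division of labor in the claim.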
Abstract
Instantiable gesture object techniques are described in which native gesture functionality is abstracted to applications using a script-based recognition interface. Gesture objects may be instantiated for different interaction contexts at the direction of applications programmed using dynamic scripting languages. Gesture objects can be configured to designate particular touch contacts and/or other inputs to consider for gesture recognition and a target element of content to which corresponding recognized gestures are applicable. After creation, gesture objects manage gesture processing operations on behalf of the applications including creating recognizers with the native gesture system, feeding input data for processing, and transforming raw gesture data into formats appropriate for the application and/or a target element. Accordingly, script-based applications may use the gesture objects to offload processing tasks associated with gesture recognition and take advantage of native gesture functionality.
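The per-context instantiation the abstract describes can be sketched as follows. `NativeGestureSystem`, `GestureContext`, and the contact and gesture shapes are all assumptions for illustration, not the patented interface.

```javascript
// Stand-in for the native gesture system exposed by the operating system.
const NativeGestureSystem = {
  recognizers: [],
  createRecognizer(cb) { const r = { cb }; this.recognizers.push(r); return r; }
};

// One gesture object per interaction context: each designates the touch
// contacts it considers and the target element its gestures apply to.
class GestureContext {
  constructor(target, contacts) {
    this.target = target;
    this.contacts = new Set(contacts);
    this.recognizer = NativeGestureSystem.createRecognizer(
      (gestureType) => { this.target.gestures.push(gestureType); });
  }
  feed(contactId, gestureType) {
    // Contacts outside this context never reach its recognizer.
    if (this.contacts.has(contactId)) this.recognizer.cb(gestureType);
  }
}

const list = { gestures: [] };
const map  = { gestures: [] };
const listPan  = new GestureContext(list, [1]);   // contact 1 pans the list
const mapPinch = new GestureContext(map, [2, 3]); // contacts 2, 3 pinch the map

listPan.feed(1, "pan");
mapPinch.feed(2, "pinch");
mapPinch.feed(1, "pinch"); // contact 1 is not part of the map context; ignored

console.log(list.gestures, map.gestures); // [ 'pan' ] [ 'pinch' ]
```

Each target element gets its own gesture object, so a script application can run several interaction contexts against the same native gesture system without routing inputs itself.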
20 Claims
1. (Recited above as the First Claim; claims 2-11 depend from claim 1.)
12. A computing device comprising:
a processing system;
a gesture module operable via the processing system to provide native gesture recognition functionality for the computing device, the native gesture recognition functionality representing gesture recognition functionality made accessible to applications via an operating system of the computing device; and
a recognition interface operable via the processing system to handle gesture recognition for applications that use dynamic scripting language, including:
registering the applications for gesture recognition with the gesture module via gesture objects that are instantiated to represent multiple interaction contexts, each gesture object configured to specify one or more interaction inputs and a target element of application content for a corresponding one of the multiple interaction contexts;
supplying input data for the interaction inputs to the gesture module for processing on behalf of the applications;
transforming raw gesture data obtained from the gesture module that provides the native functionality for recognized gestures into coordinate systems employed by target elements specified by the gesture objects; and
communicating the transformed gesture data for use by the applications to manipulate display of the target elements in accordance with the recognized gestures, the transformed gesture data being usable by the applications in the dynamic scripting language.
(Claims 13-17 depend from claim 12.)
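The coordinate-transform step of claim 12 can be illustrated with a short sketch: raw gesture deltas arrive in screen units and are mapped into a target element's local coordinate system before being communicated back to script. The scale-only transform and all names here are assumptions for illustration.

```javascript
// Map a raw gesture delta (screen coordinates) into the coordinate system
// employed by a particular target element. A real transform would account
// for rotation and nesting; a uniform scale suffices to show the idea.
function toElementCoords(rawGesture, element) {
  return {
    type: rawGesture.type,
    dx: rawGesture.screenDx / element.scale,
    dy: rawGesture.screenDy / element.scale,
  };
}

// Hypothetical target element rendered at 2x zoom.
const zoomedPanel = { scale: 2, x: 0, y: 0 };
const raw = { type: "pan", screenDx: 40, screenDy: 20 };

const local = toElementCoords(raw, zoomedPanel);
zoomedPanel.x += local.dx; // script-side manipulation in local units
zoomedPanel.y += local.dy;
console.log(local.dx, local.dy); // 20 10
```

Because the recognition interface hands script the already-transformed data, the application manipulates the element in its own units without knowing how the element is scaled on screen.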
18. One or more computer-readable storage media storing instructions that, when executed via a computing device, cause the computing device to implement a dynamically compiled application configured to perform operations including:
directing creation of a gesture object for a selected target element of content for the application to register for gesture detection via a script-based recognition interface exposed by a rendering engine of the computing device;
obtaining gesture events fired on the target element by the gesture object that are indicative of gestures recognized based upon user interaction with the target element;
tracking a state of the target element;
applying the gesture events to the target element to manipulate display of the target element in accordance with the recognized gestures; and
updating the state of the target element based on the applying.
(Claims 19 and 20 depend from claim 18.)
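The application-side operations of claim 18 (direct creation of a gesture object for a target, obtain fired events, apply them, update tracked state) can be sketched as below; `createGestureObjectFor` and the event shapes are hypothetical stand-ins for the script-based recognition interface, not names from the patent.

```javascript
// Stand-in for the script-based recognition interface: returns an object
// whose fire() the "rendering engine" would call on a recognized gesture.
function createGestureObjectFor(element) {
  return {
    fire(event) { element.handlers.forEach((h) => h(event)); }
  };
}

// Hypothetical target element whose state the application tracks.
const photo = { handlers: [], state: { rotation: 0 } };

// Direct creation of a gesture object for the selected target element.
const gesture = createGestureObjectFor(photo);

// Obtain gesture events fired on the target, apply them to the display,
// and update the tracked state based on the applying.
photo.handlers.push((ev) => {
  if (ev.type === "rotate") {
    photo.state.rotation = (photo.state.rotation + ev.angle) % 360;
  }
});

gesture.fire({ type: "rotate", angle: 90 });
gesture.fire({ type: "rotate", angle: 45 });
console.log(photo.state.rotation); // 135
```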