USER INTERFACE WITH PHYSICS ENGINE FOR NATURAL GESTURAL CONTROL
First Claim
1. A method of providing input to a device, the method comprising the steps of:
providing a User Interface (UI) with behavior that simulates attributes associated with a physically embodied object, the attributes including inertia and friction;
accepting user input to modify the UI behavior; and
in response to the user input, generating an event that conforms to the modified UI behavior.
Abstract
A UI (user interface) for natural gestural control uses inertial physics coupled to gestures made on a gesture-pad ("GPad") by the user in order to provide an enhanced list and grid navigation experience which is both faster and more enjoyable to use than current list and grid navigation methods using conventional 5-way D-pad (directional pad) controllers. The UI makes use of the GPad's gesture detection capabilities, in addition to its ability to sense standard button presses, and allows end users to use either or both navigation mechanisms, depending on their preference and comfort level. End users can navigate the entire UI by using button presses only (as with conventional UIs), or they can use button presses in combination with gestures for a more fluid and enhanced browsing experience.
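The inertial-physics behavior the abstract describes can be illustrated with a minimal sketch: a flick gesture imparts velocity to a list, which then coasts under inertia and slows under friction. The names (`ScrollModel`, `FRICTION`, `flick`) and the specific decay model are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of inertia-plus-friction list scrolling.
# Constants and class names are illustrative, not from the patent.

FRICTION = 0.95   # per-step velocity decay, simulating friction
MIN_SPEED = 0.01  # below this speed the list is considered at rest

class ScrollModel:
    def __init__(self):
        self.position = 0.0  # list offset, in rows
        self.velocity = 0.0  # rows per time step

    def flick(self, speed):
        """A flick gesture imparts velocity, like spinning a physical wheel."""
        self.velocity += speed

    def step(self):
        """Advance one time step: coast under inertia, decay under friction."""
        self.position += self.velocity
        self.velocity *= FRICTION
        if abs(self.velocity) < MIN_SPEED:
            self.velocity = 0.0
        return self.position

model = ScrollModel()
model.flick(2.0)        # user flicks the GPad
for _ in range(10):
    model.step()        # list keeps coasting, slowing each step
```

A button press, by contrast, would simply set `position` directly, which is why both navigation mechanisms can coexist on the same model.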
Claims (20)
1. A method of providing input to a device, the method comprising the steps of:
providing a User Interface (UI) with behavior that simulates attributes associated with a physically embodied object, the attributes including inertia and friction;
accepting user input to modify the UI behavior; and
in response to the user input, generating an event that conforms to the modified UI behavior.
(Dependent claims: 2-8)
9. A method of navigating through a UI, the method comprising the steps of:
receiving a gesture input by a user; and
responding to the gesture by changing a feature being displayed on a display device in accordance with attributes associated with a physically embodied object.
(Dependent claims: 10-15)
16. A method for causing an action in response to user input, the method comprising the steps of:
accepting a gesture from a user on a touch sensitive surface;
determining a type of gesture that has been accepted by the touch sensitive surface using a sensor array and a single mechanical, momentary contact switch activated by the sensor array; and
performing an action in response to the type of gesture that has been accepted, the action at least in part simulating behavior of a physically embodied object.
(Dependent claims: 17-20)
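The gesture-type determination step of claim 16 can be sketched as a simple classifier over touch-sensor samples plus the state of the single mechanical switch. The thresholds and function names below are illustrative assumptions, not the patent's actual method.

```python
# Hypothetical sketch of gesture-type determination: classify a sequence of
# 1-D finger positions from the sensor array, plus the mechanical switch
# state, as a button press, tap, flick, or drag. Thresholds are illustrative.

TAP_MAX_TRAVEL = 5.0    # max finger travel (sensor units) for a tap
FLICK_MIN_SPEED = 1.0   # min release speed (units/sample) for a flick

def classify(samples, switch_pressed):
    """samples: finger positions over time from the sensor array.
    switch_pressed: True if the momentary contact switch was activated."""
    if switch_pressed:
        return "button_press"
    if len(samples) < 2:
        return "tap"
    travel = abs(samples[-1] - samples[0])
    speed = abs(samples[-1] - samples[-2])  # speed at release
    if travel <= TAP_MAX_TRAVEL:
        return "tap"
    if speed >= FLICK_MIN_SPEED:
        return "flick"
    return "drag"

print(classify([0.0, 4.0, 9.0, 15.0], False))  # fast, long motion -> "flick"
```

The action performed in response (for example, the inertial coasting of a flicked list) would then be selected from the returned gesture type, which is how the claim ties gesture detection to physically simulated behavior.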
Specification