PREDICTIVE GESTURING IN GRAPHICAL USER INTERFACE
Abstract
A computing system. The computing system includes a display presenting a user interface, and a gesture input configured to translate a user gesture into a command for controlling the computing system. The computing system also includes a gesture-predicting engine to predict a plurality of possible commands based on the beginning of the user gesture, and a rendering engine to indicate the plurality of possible commands via the user interface.
20 Claims
1. A surface computing system, comprising:
- a display presenting a user interface;
- a gesture input operatively aligned with the display and configured to translate a user gesture into a command for controlling the surface computing system;
- a gesture-predicting engine to predict a plurality of different possible commands based on a beginning of the user gesture; and
- a rendering engine to indicate the plurality of different possible commands via the user interface.
(Dependent claims: 2-9)
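The components recited in claim 1 can be sketched as a minimal Python model. All class names, the gesture vocabulary, and the prefix-matching logic below are illustrative assumptions, not details taken from the patent's specification:

```python
from dataclasses import dataclass, field

@dataclass
class GesturePredictingEngine:
    # Toy vocabulary mapping complete gestures to commands (illustrative only).
    vocabulary: dict = field(default_factory=lambda: {
        "swipe-right": "next_page",
        "swipe-left": "previous_page",
        "swipe-up": "scroll_up",
    })

    def predict(self, gesture_beginning: str) -> list:
        """Predict every command whose gesture starts with the observed beginning."""
        return [cmd for gesture, cmd in self.vocabulary.items()
                if gesture.startswith(gesture_beginning)]

@dataclass
class RenderingEngine:
    def indicate(self, commands: list) -> list:
        """Indicate the plurality of predicted commands via the user interface (stub)."""
        return ["hint: " + cmd for cmd in commands]
```

With this toy vocabulary, a partial gesture such as `"swipe-"` yields all three commands, while `"swipe-r"` narrows the prediction to a single command.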
10. A computing system, comprising:
- a display presenting a user interface;
- a gesture input configured to translate a user gesture into a command for controlling the computing system;
- a gesture-predicting engine to predict a plurality of possible commands based on a beginning of the user gesture; and
- a rendering engine to indicate the plurality of possible commands via the user interface.
(Dependent claims: 11-14)
15. A method of facilitating interaction with a user interface, comprising:
- analyzing a beginning of a user input gesture;
- identifying one or more possible gestures that begin with the beginning of the user input gesture; and
- rendering, for each possible gesture, a gesture hint indicating how that gesture is completed.
(Dependent claims: 16-20)
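The three steps of claim 15 can be sketched as follows. The gesture set, the hint format, and the assumption that upstream recognition already reduces raw input to a symbolic prefix are all hypothetical, not taken from the specification:

```python
# Illustrative gesture set; not from the patent.
GESTURES = ["swipe-right", "swipe-left", "pinch-in", "pinch-out"]

def analyze_beginning(partial_input: str) -> str:
    """Step 1: analyze the beginning of the user input gesture.
    Stubbed: we assume recognition already yields a symbolic prefix."""
    return partial_input

def identify_possible_gestures(prefix: str) -> list:
    """Step 2: identify gestures that begin with the analyzed beginning."""
    return [g for g in GESTURES if g.startswith(prefix)]

def render_gesture_hints(partial_input: str) -> dict:
    """Step 3: for each possible gesture, render a hint indicating
    how that gesture is completed (here, the remaining suffix)."""
    beginning = analyze_beginning(partial_input)
    return {g: "complete with '" + g[len(beginning):] + "'"
            for g in identify_possible_gestures(beginning)}
```

For example, a partial input of `"pinch-"` would produce one hint per candidate gesture, each showing the unfinished remainder of that gesture.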