Self-revelation aids for interfaces
Abstract
Systems and/or methods are provided that facilitate revealing assistance information associated with a user interface. An interface can obtain input information related to interactions between the interface and a user. In addition, the interface can output assistance information in situ with the user interface. Further, a decision component determines the in situ assistance information output by the interface based at least in part on the obtained input information.
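The abstract's decision logic hinges on a single signal: how long it has been since the user last interacted with a given interface element. A minimal sketch of that inference follows; all names and the threshold value are illustrative assumptions, not anything specified in the patent.

```python
import time

# Hypothetical threshold: treat an element as "unfamiliar" if the user has
# not touched it for 30 days. The value is an assumption for illustration.
UNFAMILIARITY_THRESHOLD_S = 30 * 24 * 3600

class InteractionTracker:
    """Records when a user last interacted with each UI element."""

    def __init__(self):
        self._last_seen = {}  # element id -> timestamp of previous interaction

    def is_unfamiliar(self, element_id, now=None):
        """Infer unfamiliarity from the period between the current and the
        previous interaction of the user with the same element."""
        now = time.time() if now is None else now
        previous = self._last_seen.get(element_id)
        self._last_seen[element_id] = now
        # Never-seen elements are treated as unfamiliar as well.
        return previous is None or (now - previous) > UNFAMILIARITY_THRESHOLD_S
```

In this sketch, a `True` result is what would trigger the interface to surface in situ assistance for that element.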
19 Claims
1. A computer-implemented method for providing gesture-based input to an application user interface, comprising:

associating a floating tool palette with the application user interface, the floating tool palette including at least one gestural tip that presents gesture functionality associated with the application user interface based at least in part on a period of time between a current interaction of a user with a particular element of the application user interface and a previous interaction of the user with the particular element, wherein the at least one gestural tip illustrates movement of a particular gesture sequence, and wherein the at least one gestural tip comprises a video or an animation of the particular gesture sequence;

capturing at least one gesture input in response to presentation of the at least one gestural tip, wherein the at least one gesture input is associated with the at least one gestural tip included in the floating tool palette; and

based at least in part on the captured at least one gesture input, forwarding event data to an application associated with the application user interface.

Dependent claims: 2, 3, 4, 5, 6, 7, 8.
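Claim 1's three steps form a simple pipeline: present a tip, capture a gesture input against that tip, and forward event data to the application. A sketch of that flow, where the class and method names, the callback, and the event-dict shape are all illustrative assumptions rather than anything the patent prescribes:

```python
from dataclasses import dataclass

@dataclass
class GesturalTip:
    """A tip that illustrates a gesture sequence via video or animation."""
    gesture_name: str
    media_uri: str  # video/animation showing the gesture's movement

class FloatingToolPalette:
    """Presents gestural tips, captures gestures input against them, and
    forwards event data to the application, mirroring claim 1's steps."""

    def __init__(self, forward_event):
        self._forward_event = forward_event  # callback into the application
        self._presented = {}  # gesture name -> GesturalTip

    def present_tip(self, tip):
        self._presented[tip.gesture_name] = tip

    def capture_gesture(self, gesture_name, touch_points):
        # Only gestures associated with a presented tip are captured.
        if gesture_name not in self._presented:
            return False
        self._forward_event({"gesture": gesture_name, "points": touch_points})
        return True
```

The guard in `capture_gesture` reflects the claim's wording that the gesture input is captured "in response to presentation of" a tip and is "associated with" that tip.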
9. One or more computer-readable storage media storing instructions that, when executed by at least one processor, instruct the at least one processor to perform actions comprising:

providing a tool palette for a user interface of an application, the tool palette including gestural tips that are provided based at least partly on a period of time between a current interaction of a user with a particular element of the application user interface and a previous interaction of the user with the particular element, wherein each of the gestural tips illustrates movement of a respective gesture sequence, and wherein each of the gestural tips comprises a video or an animation of the respective gesture sequence;

capturing at least one gesture associated with the gestural tips in response to presentation of at least one of the gestural tips, wherein the at least one gesture is input at least partly to the tool palette; and

forwarding event data to the application, the event data generated based at least partly on the captured at least one gesture.

Dependent claims: 10, 11, 12, 13.
14. A system comprising:

at least one processor;

an intelligence component comprising instructions executed by the at least one processor to infer an unfamiliarity of a user with an element of a user interface for an application, the inferring based at least partly on a period of time between a current interaction of the user with the element and a previous interaction of the user with the element;

a decision component comprising instructions executed by the at least one processor to provide a tool palette that includes gestural tips for the user interface, the gestural tips based at least partly on the inferring, wherein each of the gestural tips illustrates movement of a respective gesture sequence, and wherein each of the gestural tips comprises a video or an animation of the respective gesture sequence; and

an interface component comprising instructions executed by the at least one processor to: capture at least one gesture associated with the gestural tips in response to presentation of at least one of the gestural tips, wherein the at least one gesture is input at least partly to the tool palette; generate event data based at least partly on the captured at least one gesture; and forward the generated event data to the application.

Dependent claims: 15, 16, 17, 18, 19.
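Claim 14 names three cooperating components: an intelligence component that infers unfamiliarity, a decision component that selects tips, and an interface component that captures gestures and forwards event data. The component names below come from the claim; every signature, parameter, and the event-dict shape is an illustrative assumption.

```python
class IntelligenceComponent:
    """Infers unfamiliarity from the gap between interactions."""
    def __init__(self, threshold_s):
        self._threshold_s = threshold_s
        self._last_seen = {}

    def infer_unfamiliar(self, element, now):
        previous = self._last_seen.get(element)
        self._last_seen[element] = now
        return previous is None or (now - previous) > self._threshold_s


class DecisionComponent:
    """Selects gestural tips for an element when the user seems unfamiliar."""
    def __init__(self, tips_by_element):
        self._tips_by_element = tips_by_element

    def tips_for(self, element, unfamiliar):
        return self._tips_by_element.get(element, []) if unfamiliar else []


class InterfaceComponent:
    """Captures gestures against presented tips and forwards event data."""
    def __init__(self, application):
        self._application = application  # callable receiving event data
        self._presented = set()

    def present(self, tips):
        self._presented.update(tips)

    def capture(self, gesture):
        if gesture in self._presented:
            self._application({"gesture": gesture})
            return True
        return False
```

A typical pass through the system: the intelligence component flags an element as unfamiliar, the decision component hands the matching tips to the interface component, and only gestures matching a presented tip are forwarded on to the application.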
Specification