DETERMINING GESTURES ON CONTEXT BASED MENUS
Abstract
Context based menus are employed for content management through touch or gesture actions, keyboard entries, mouse or pen actions, and similar input. Different actions and combinations of actions enable users to activate sub-menus, execute commands, or collapse context based menus. Gestures associated with the actions are determined through action analysis. The action analysis includes tap action hit target region analysis and swipe action direction, angle, and/or length analysis.
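The action analysis described in the abstract (tap hit-target analysis; swipe direction, angle, and length analysis) could be sketched as follows. This is a minimal illustration, not the patented implementation: the tap-length threshold, screen coordinate convention, and four-way direction quantization are all assumptions.

```python
import math

# Hypothetical threshold -- the patent does not specify concrete values.
TAP_MAX_LENGTH = 10.0  # pixels; shorter movements are treated as taps

def analyze_gesture(start, end):
    """Classify a pointer gesture as a tap or a swipe and, for swipes,
    report the direction, angle, and length used to select a menu action."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    length = math.hypot(dx, dy)
    if length < TAP_MAX_LENGTH:
        # Tap: only the hit-target location matters.
        return {"type": "tap", "location": start}
    # Screen coordinates: y grows downward, so angle 90 points down.
    angle = math.degrees(math.atan2(dy, dx)) % 360
    # Quantize the angle into four coarse directions.
    direction = ("right", "down", "left", "up")[int((angle + 45) % 360 // 90)]
    return {"type": "swipe", "angle": angle, "length": length,
            "direction": direction, "location": start}
```

A short movement such as `analyze_gesture((5, 5), (8, 7))` classifies as a tap, while `analyze_gesture((0, 0), (100, 0))` yields a rightward swipe of length 100.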
20 Claims
1. A method executed at least in part in a computing device for determining gestures on context based menus, the method comprising:

in response to detecting one of a tap action, a swipe action, a mouse input, a pen input, and a keyboard input, presenting a context based menu in relation to a displayed content on a user interface;

detecting a gesture associated with the context based menu; and

determining an action to be performed in association with the context based menu by interpreting the detected gesture based on at least one from a set of a direction, an angle, a length, and a location of the gesture.

Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 9, 10
11. A computing device for determining gestures on context based menus, the computing device comprising:

an input device configured to detect at least one of a swipe action and a tap action;

a memory;

a processor coupled to the memory, the processor executing an application and causing a user interface associated with the application to be displayed on a screen, wherein the processor is configured to:

in response to detecting one of a tap action, a swipe action, a mouse input, a pen input, and a keyboard input, present a context based menu in relation to a displayed content on the user interface;

detect a swipe action associated with the context based menu; and

determine an action to be performed in association with the context based menu by interpreting the detected swipe action based on at least one from a set of a direction, an angle, a length, and a location of the swipe action relative to one or more gesture zones defined on the context based menu.

Dependent Claims: 12, 13, 14, 15, 16, 17
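Claim 11 interprets a swipe relative to gesture zones defined on the menu. One way to realize such zones is to treat a radial context menu as equal angular sectors around its center and hit-test the swipe endpoint against them. This sketch assumes a radial layout with four sectors and a hypothetical command mapping; neither appears in the claim text.

```python
import math

# Hypothetical zone-to-command mapping; the claim only says zones are
# "defined on the context based menu".
ZONE_COMMANDS = {0: "copy", 1: "paste", 2: "cut", 3: "undo"}

def zone_for_point(center, point, zone_count=4):
    """Return the index of the gesture zone containing `point`, treating
    the menu as `zone_count` equal angular sectors around `center`."""
    dx = point[0] - center[0]
    dy = point[1] - center[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return int(angle // (360 / zone_count))

def command_for_swipe(center, start, end):
    """Pick the command from the zone in which the swipe ends."""
    return ZONE_COMMANDS[zone_for_point(center, end)]
```

For a menu centered at the origin, a swipe ending near the positive x-axis, e.g. `command_for_swipe((0, 0), (0, 0), (10, 1))`, lands in zone 0 and selects the zone's command.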
18. A computer-readable memory device with instructions stored thereon for determining gestures on context based menus, the instructions comprising:

in response to detecting one of a tap action, a swipe action, a mouse input, and a keyboard input, presenting a context based menu in relation to a displayed content on a user interface;

detecting a gesture associated with the context based menu; and

determining an action to be performed in association with the context based menu by interpreting the detected gesture, the detected gesture being one of a tap action on a predefined zone on the context based menu and a swipe action across one or more predefined zones on the context based menu, based on at least one from a set of a direction, an angle, a length, and a location of the swipe action.

Dependent Claims: 19, 20
Specification