Contextually changing omni-directional navigation mechanism
First Claim
1. A computer-implemented system, comprising:
a navigation node, associated with a first user interface context and a second user interface context and sensitive to at least one user gesture applied to a screen of a device at a display position of the navigation node;
a display mechanism, to:
cause, via at least one processor, display, on the screen of the device, of the navigation node associated with the first user interface context as a first icon and display of the navigation node associated with the second user interface context as a second icon different than the first icon;
define an input region, inside which the at least one user gesture actuates the navigation node, wherein the display mechanism further controls the display position of the navigation node on the screen of the device and moves the display position of the navigation node away from an original position to follow the user gesture;
a graphic icon database, containing icons to be displayed by the display mechanism; and
a context database, containing respective contextual data associated with the at least one user gesture, wherein the contextual data comprises:
a first navigation action to be triggered in response to actuation of the navigation node when displayed as the first icon in the first user interface context, the first navigation action comprising a navigation from a first content page of an application to access a second content page of the application;
a second navigation action to be triggered in response to actuation of the navigation node when displayed as the second icon in the second user interface context;
the display mechanism further configured to:
cause display, on the screen of the device, of the navigation node as the first icon according to the first user interface context concurrently with display of the first content page of the application, wherein a first user gesture corresponds to (i) a first non-navigation action defined by the application when applied to a display position of a portion of the first content page and (ii) the first navigation action when applied to the input region to actuate the navigation node to trigger access of the second content page of the application; and
cause display, on the screen of the device, of the navigation node as the second icon according to the second user interface context concurrently with display of the second content page of the application.
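The system of claim 1 can be illustrated with a minimal sketch. All names and data structures here (ICON_DB, CONTEXT_DB, NavigationNode, the circular input region) are illustrative assumptions, not taken from the patent; the sketch only models the claimed relationships between contexts, icons, the input region, and the gesture-following display position.

```python
# Hypothetical sketch of the claimed system: a navigation node whose icon
# and triggered navigation action depend on the current UI context.

# "Graphic icon database": icons keyed by context (assumed structure).
ICON_DB = {"context_a": "icon_back", "context_b": "icon_home"}

# "Context database": contextual data mapping each context to the
# navigation action actuation triggers (assumed structure).
CONTEXT_DB = {
    "context_a": {"action": "goto_page_2"},
    "context_b": {"action": "goto_page_1"},
}

class NavigationNode:
    def __init__(self, context, position, region_radius=40):
        self.context = context
        self.position = position            # display position on screen
        self.region_radius = region_radius  # defines the input region

    @property
    def icon(self):
        # Icon is chosen by the current user interface context.
        return ICON_DB[self.context]

    def in_input_region(self, x, y):
        px, py = self.position
        return (x - px) ** 2 + (y - py) ** 2 <= self.region_radius ** 2

    def handle_gesture(self, x, y):
        """Actuate the node if the gesture lands in its input region;
        the display position moves to follow the gesture."""
        if not self.in_input_region(x, y):
            return None  # gesture falls through to the application
        self.position = (x, y)  # position follows the gesture
        return CONTEXT_DB[self.context]["action"]

node = NavigationNode("context_a", position=(100, 100))
print(node.icon)                      # icon_back
print(node.handle_gesture(110, 110))  # goto_page_2 (inside input region)
print(node.position)                  # (110, 110): followed the gesture
```

A gesture outside the input region returns `None`, modeling the claim's distinction between actuating the node and ordinary interaction with page content.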
Abstract
Systems and methods for omnidirectional application navigation are provided. In example embodiments, a navigation icon associated with a first user interface context is caused to be displayed on a device. The device senses a change from the first user interface context to a second user interface context. A graphics database is accessed, and a graphic associated with the second user interface context is identified from it. Responsive to the change from the first user interface context to the second user interface context, the navigation icon is changed using the identified graphic from the graphics database.
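The abstract's flow (sense a context change, look up the new context's graphic in the graphics database, swap the icon) can be sketched as follows. The names GRAPHICS_DB, NavigationIcon, and the graphic filenames are assumptions for illustration only.

```python
# Hypothetical sketch of the abstract's flow: on a sensed change of user
# interface context, identify the new context's graphic in the graphics
# database and change the navigation icon accordingly.

GRAPHICS_DB = {"first_context": "chevron.png", "second_context": "grid.png"}

class NavigationIcon:
    def __init__(self, context):
        self.context = context
        self.graphic = GRAPHICS_DB[context]

    def on_context_change(self, new_context):
        """Respond to a sensed context change by swapping the graphic."""
        if new_context != self.context:
            self.context = new_context
            self.graphic = GRAPHICS_DB[new_context]  # identified graphic

icon = NavigationIcon("first_context")
icon.on_context_change("second_context")
print(icon.graphic)  # grid.png
```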
9 Claims
4. A method comprising:
displaying, on a screen of a device, a navigation node that is associated with a first user interface context and a second user interface context and sensitive to at least one user gesture applied to the screen of the device at a display position of the navigation node, the navigation node being displayed as a first icon when displayed concurrently with a first content page, and displayed as a second icon that is different than the first icon when displayed concurrently with a second content page that is different than the first content page, the navigation node being associated with the first user interface context when displayed as the first icon and the navigation node being associated with the second user interface context when displayed as the second icon different than the first icon, the device including a graphic icon database containing icons to be displayed by the device;
defining an input region, inside which the at least one user gesture input actuates the navigation node, the device including a context database containing respective contextual data associated with the at least one user gesture input, wherein a first user gesture input corresponds to a first non-navigation action defined by an application when applied to a display position of a portion of a first content page, and a first navigation action when applied to the input region actuates the navigation node to trigger access of a second content page of the application;
receiving the first user gesture input within the input region while the navigation node is displayed as the first icon, the first user gesture input triggering the first navigation action, the first navigation action comprising a navigation from the first content page to the second content page, wherein the display position of the navigation node on the screen of the device moves to follow the first user gesture input; and
receiving a second user gesture input within the input region while the navigation node is displayed as the second icon, the second user gesture input triggering a second navigation action that is different than the first navigation action, wherein the display position of the navigation node on the screen of the device moves to follow the second user gesture input.
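Claim 4's gesture dispatch (the same gesture is a non-navigation action on page content but a contextual navigation action inside the node's input region) can be sketched as below. The rectangular INPUT_REGION, CONTEXT_ACTIONS mapping, and action strings are illustrative assumptions.

```python
# Hypothetical sketch of claim 4's dispatch: one gesture, two meanings
# depending on whether it lands inside the node's input region.

INPUT_REGION = (200, 260, 200, 260)  # x_min, x_max, y_min, y_max (assumed)

CONTEXT_ACTIONS = {  # "context database": per-context navigation actions
    "first": "navigate:page_1->page_2",
    "second": "navigate:page_2->page_3",
}

def dispatch_gesture(x, y, context):
    x0, x1, y0, y1 = INPUT_REGION
    if x0 <= x <= x1 and y0 <= y <= y1:
        return CONTEXT_ACTIONS[context]  # navigation action in the region
    return "app:select_content"          # application-defined action

print(dispatch_gesture(220, 230, "first"))  # navigate:page_1->page_2
print(dispatch_gesture(50, 50, "first"))    # app:select_content
```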
7. A non-transitory computer-readable medium storing instructions that, when executed by one or more computer processors of a computing device, cause the computing device to perform operations comprising:
displaying, on a screen of a computing device, a navigation node that is associated with a first user interface context and a second user interface context and sensitive to at least one user gesture applied to the screen of the computing device at a display position of the navigation node, the navigation node being displayed as a first icon when displayed concurrently with a first content page, and displayed as a second icon that is different than the first icon when displayed concurrently with a second content page that is different than the first content page, the navigation node being associated with the first user interface context when displayed as the first icon and the navigation node being associated with the second user interface context when displayed as the second icon different than the first icon, the computing device including a graphic icon database containing icons to be displayed by the computing device;
defining an input region, inside which the at least one user gesture input actuates the navigation node, the computing device including a context database containing respective contextual data associated with the at least one user gesture input, wherein a first user gesture input corresponds to a first non-navigation action defined by an application when applied to a display position of a portion of a first content page, and a first navigation action when applied to the input region actuates the navigation node to trigger access of a second content page of the application;
receiving the first user gesture input within the input region while the navigation node is displayed as the first icon, the first user gesture input triggering the first navigation action, the first navigation action comprising a navigation from the first content page to the second content page, wherein the display position of the navigation node on the screen of the computing device moves to follow the first user gesture input; and
receiving a second user gesture input within the input region while the navigation node is displayed as the second icon, the second user gesture input triggering a second navigation action that is different than the first navigation action, wherein the display position of the navigation node on the screen of the computing device moves to follow the second user gesture input.