User interface functionality for facilitating interaction between users and their environments
First Claim
1. A computing device comprising:
a touch-sensitive screen;
a processing device; and
a storage resource having computer-readable instructions which, when executed by the processing device, cause the processing device to:
obtain different location contexts for different locations of a user as the user moves through a physical space;
detect a touch gesture performed on the touch-sensitive screen;
responsive to the touch gesture performed on the touch-sensitive screen, navigate among a plurality of workspaces available for presentation on the touch-sensitive screen to a first workspace, the plurality of workspaces having predetermined spatial relationships with respect to other workspaces and respective functionality by which the user may interact with the physical space;
after the user has navigated to the first workspace, detect a tap on the touch-sensitive screen and, in response to the tap, output a spoken message conveying information regarding a current location context of the user;
detect another touch gesture performed on the touch-sensitive screen; and
responsive to the another touch gesture, display a menu associated with the first workspace on the touch-sensitive screen, the menu having a plurality of menu items relating to the current location context of the user in the physical space.
Abstract
Space interaction (SI) functionality is described herein for assisting a user in interacting with a space without unduly distracting the user. The SI functionality includes an application interface module that presents information and/or exposes functionality within a plurality of workspaces. Each workspace has a determined spatial relationship with respect to other workspaces. Further, the application interface module may detect and respond to various gestures, by which the user may move among workspaces and interact with menus and other information that are presented in those workspaces. The user's interaction with these workspaces and menus may be supplemented by various sounds generated by a sound generation module, and/or various haptic cues (e.g., vibration cues) generated by a haptic cue generation module.
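The abstract's workspace model can be sketched in code: workspaces linked by fixed spatial relationships, navigated by directional gestures, each exposing a menu tied to the user's current location context. This is a minimal illustrative sketch only; all class and method names here are assumptions, not identifiers from the patent.

```python
# Illustrative sketch of the SI workspace model: workspaces with fixed
# spatial relationships, directional-gesture navigation, and per-workspace
# menus keyed to a location context. Names are hypothetical.

class Workspace:
    def __init__(self, name, menu_items):
        self.name = name
        self.menu_items = menu_items   # menu items relating to a location context
        self.neighbors = {}            # direction -> Workspace

    def link(self, direction, other):
        # Establish a fixed spatial relationship to another workspace.
        self.neighbors[direction] = other


class SpaceInterface:
    def __init__(self, start):
        self.current = start
        self.location_context = None   # updated as the user moves through the space

    def update_location(self, context):
        self.location_context = context

    def on_swipe(self, direction):
        # Navigate among workspaces based on the gesture's direction;
        # stay put if no workspace lies in that direction.
        target = self.current.neighbors.get(direction)
        if target is not None:
            self.current = target
        return self.current.name

    def on_tap(self):
        # Produce the (spoken) message about the current location context.
        return f"You are near {self.location_context}"

    def on_menu_gesture(self):
        # Surface the menu associated with the current workspace.
        return self.current.menu_items
```

For example, linking a "home" workspace to a "nearby" workspace on its right lets a rightward swipe move between them, after which a tap reports the location context and a further gesture reveals the "nearby" menu.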
79 Citations
20 Claims
1. A computing device comprising:
a touch-sensitive screen;
a processing device; and
a storage resource having computer-readable instructions which, when executed by the processing device, cause the processing device to:
obtain different location contexts for different locations of a user as the user moves through a physical space;
detect a touch gesture performed on the touch-sensitive screen;
responsive to the touch gesture performed on the touch-sensitive screen, navigate among a plurality of workspaces available for presentation on the touch-sensitive screen to a first workspace, the plurality of workspaces having predetermined spatial relationships with respect to other workspaces and respective functionality by which the user may interact with the physical space;
after the user has navigated to the first workspace, detect a tap on the touch-sensitive screen and, in response to the tap, output a spoken message conveying information regarding a current location context of the user;
detect another touch gesture performed on the touch-sensitive screen; and
responsive to the another touch gesture, display a menu associated with the first workspace on the touch-sensitive screen, the menu having a plurality of menu items relating to the current location context of the user in the physical space.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
12. A method performed by a computing device having a touch-sensitive screen, the method comprising:
obtaining different location contexts for different locations of a user as the user moves through a physical or virtual space using a mode of transportation;
detecting a touch gesture performed on the touch-sensitive screen;
responsive to the touch gesture, navigating among a plurality of workspaces to a particular workspace, the plurality of workspaces having respective predetermined spatial relationships and respective associated menus;
after the user has navigated to the particular workspace, detecting a tap on the touch-sensitive screen and, in response, outputting a spoken message conveying information regarding a current location context of the user in the physical or virtual space;
detecting another touch gesture performed on the touch-sensitive screen;
responsive to the another touch gesture, displaying an individual menu associated with the particular workspace on the touch-sensitive screen, the individual menu having a plurality of menu items relating to the current location context of the user in the physical or virtual space; and
performing a particular operation in response to a selection of a particular menu item from the individual menu.
View Dependent Claims (13, 14, 15, 16, 17, 18, 19)
20. A computer-readable storage medium storing instructions which, when executed by a processing device, cause the processing device to perform acts comprising:
obtaining different location contexts for different locations of a user as the user moves through a physical space using a mode of transportation;
detecting a gesture performed by the user;
based at least upon a direction of the gesture, selecting a particular workspace from a plurality of available workspaces having respective associated menus;
detecting a tap gesture performed by the user and, in response, outputting a spoken message conveying information regarding a current location context of the user;
detecting another gesture performed by the user;
displaying an individual menu associated with the particular workspace, the individual menu having a plurality of menu items relating to the current location context of the user; and
performing a particular operation in response to a user selection of a particular menu item from the individual menu.
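Claim 20 selects a workspace "based at least upon a direction of the gesture." One plausible reading is classifying the raw gesture displacement into a discrete direction and using it to index the available workspaces. The helper names and the four-way classification below are assumptions for the sketch, not language from the claim; screen convention is used, so a positive vertical displacement means "down."

```python
# Hypothetical helper for direction-based workspace selection (claim 20):
# classify a gesture displacement into one of four directions, then look up
# the workspace associated with that direction.

def gesture_direction(dx, dy):
    """Map a gesture displacement (dx, dy) to 'left'/'right'/'up'/'down'.

    Uses screen coordinates: positive dy points down.
    """
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"


def select_workspace(dx, dy, workspaces):
    """Pick the workspace keyed by the gesture's direction; None if no match."""
    return workspaces.get(gesture_direction(dx, dy))
```

A mostly horizontal rightward swipe would thus select whichever workspace the interface associates with "right."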
Specification