GESTURE BASED USER INTERFACE FOR AUGMENTED REALITY
Abstract
Technologies are generally described for systems and methods effective to provide a gesture keyboard that can be utilized with a virtual display. In an example, the method includes receiving sensory information associated with an object in proximity to, or in contact with, an input device including receiving at least one level of interaction differentiation detected from at least three levels of interaction differentiation, interpreting a command from the sensory information as a function of the at least one level of interaction differentiation, and outputting an action indication based on the command.
69 Citations
40 Claims
1. A method, comprising:
    receiving sensory information associated with an object in proximity to, or in contact with, an input device including receiving at least one level of interaction differentiation detected from at least two levels of interaction differentiation;
    interpreting a command from the sensory information as a function of the at least one level of interaction differentiation; and
    outputting an action indication based on the command.
    (View Dependent Claims 2-14)
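The method of claim 1 can be sketched in code. This is an illustrative assumption, not the patent's implementation: the two levels of interaction differentiation are modeled as "contact" versus "proximity" based on a hypothetical distance reading, and the command mapping (type versus preview) is invented for the example.

```python
# Hypothetical sketch of the claim 1 method. The distance threshold, the
# two-level contact/proximity scheme, and the command strings are all
# illustrative assumptions, not taken from the patent.
from dataclasses import dataclass

@dataclass
class SensoryInput:
    """Sensory information for an object near or touching the input device."""
    distance_mm: float  # assumed distance of the object from the input surface
    key: str            # key or region the object is over

def differentiate_level(reading: SensoryInput) -> str:
    """Detect one level of interaction from at least two levels."""
    return "contact" if reading.distance_mm <= 0.0 else "proximity"

def interpret_command(reading: SensoryInput) -> str:
    """Interpret a command as a function of the detected interaction level."""
    level = differentiate_level(reading)
    # Assumed mapping: contact types the key, proximity merely previews it.
    return f"type:{reading.key}" if level == "contact" else f"preview:{reading.key}"

def output_action(command: str) -> dict:
    """Output an action indication based on the command."""
    action, _, target = command.partition(":")
    return {"action": action, "target": target}
```

Chaining the three steps mirrors the claim's structure: `output_action(interpret_command(SensoryInput(0.0, "a")))` yields a "type" action for a touching finger, while a hovering finger yields a "preview" action.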
15. A system, comprising:
    an environmental capture component configured to receive at least one gesture within a space relative to a keyboard;
    an interpretation component configured to identify a command based on the at least one gesture; and
    an output component configured to render information of the at least one gesture and a result of the command, wherein the information is configured to be rendered on a virtual display.
    (View Dependent Claims 16-25)
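The three components of claim 15 can be decomposed as follows. The gesture names, the command table, and the render format are assumptions made for illustration; the patent does not specify them.

```python
# Illustrative decomposition of the claim 15 system into its three
# components; gesture names and commands are invented for the sketch.
class EnvironmentalCapture:
    """Receives at least one gesture within a space relative to a keyboard."""
    def receive(self, raw_event: dict) -> str:
        return raw_event.get("gesture", "none")

class Interpretation:
    """Identifies a command based on the gesture."""
    COMMANDS = {"swipe_left": "delete_word", "pinch": "zoom_out"}  # assumed table
    def identify(self, gesture: str) -> str:
        return self.COMMANDS.get(gesture, "noop")

class Output:
    """Renders the gesture and the command's result on a virtual display."""
    def render(self, gesture: str, command: str) -> str:
        return f"[virtual display] gesture={gesture} result={command}"

def handle(raw_event: dict) -> str:
    """Pipe one raw event through capture, interpretation, and output."""
    gesture = EnvironmentalCapture().receive(raw_event)
    command = Interpretation().identify(gesture)
    return Output().render(gesture, command)
```

Keeping capture, interpretation, and output as separate objects mirrors the claim's component boundaries, so each could be swapped independently (e.g. a camera-based capture component versus a capacitive one).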
26. A computer-readable storage medium having stored thereon computer-executable instructions that, in response to execution, cause a computing device to perform operations, comprising:
    detecting a gesture that indicates at least one command to be performed;
    interpreting the gesture as the at least one command selected from a plurality of commands; and
    initiating a result of the at least one command as a perceivable event within a virtual space.
    (View Dependent Claims 27-33)
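A minimal sketch of the claim 26 operations, assuming an invented command set and a dictionary standing in for the "perceivable event within a virtual space":

```python
# Minimal sketch of the claim 26 operations; the command set and the
# perceivable-event representation are illustrative assumptions.
PLURALITY_OF_COMMANDS = {"tap": "select", "flick_up": "scroll_up"}

def detect_gesture(sensor_frame: dict) -> str:
    """Detect a gesture that indicates at least one command to be performed."""
    return sensor_frame.get("gesture", "")

def interpret(gesture: str) -> str:
    """Select the indicated command from a plurality of commands."""
    return PLURALITY_OF_COMMANDS.get(gesture, "noop")

def initiate_in_virtual_space(command: str) -> dict:
    """Initiate the command's result as a perceivable event in a virtual space."""
    return {"event": command, "space": "virtual"}
```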
34-35. (canceled)

36. A computing device, comprising:
    a keyboard comprising:
        an array of keys, wherein at least a subset of keys of the array of keys comprise a respective displacement actuated switch configured to detect pressure applied to a respective key of at least the subset of keys; and
        at least one capacitive sensor configured to detect a finger near the keyboard;
    a translation module configured to translate a gesture near the keyboard into a command; and
    a processor configured to change a display as a function of the command.
    (View Dependent Claims 37-40)
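The claim 36 device combines pressure-sensing keys with a capacitive proximity layer. The sketch below models that combination; the actuation travel, capacitance threshold, gesture table, and display states are all assumptions, not values from the patent.

```python
# Hypothetical model of the claim 36 device. Thresholds, the gesture
# table, and the display states are illustrative assumptions.
class Key:
    """A key with a displacement-actuated switch that detects applied pressure."""
    def __init__(self, label: str):
        self.label = label
        self.pressed = False
    def apply_pressure(self, displacement_mm: float) -> None:
        self.pressed = displacement_mm > 0.5  # assumed actuation travel

class CapacitiveSensor:
    """Detects a finger near the keyboard from a capacitance reading."""
    THRESHOLD = 0.2  # assumed normalized change in capacitance
    def finger_near(self, reading: float) -> bool:
        return reading > self.THRESHOLD

class TranslationModule:
    """Translates a gesture near the keyboard into a command."""
    TABLE = {"swipe_right": "next_page"}  # assumed mapping
    def translate(self, gesture: str) -> str:
        return self.TABLE.get(gesture, "noop")

class Processor:
    """Changes the display as a function of the command."""
    def __init__(self):
        self.display = "page_1"
    def execute(self, command: str) -> str:
        if command == "next_page":
            self.display = "page_2"
        return self.display
```

In this model the displacement switches handle conventional typing while the capacitive sensor feeds the translation module, so the same keyboard surface serves both touch input and in-air gestures.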
Specification