Manipulating And Displaying An Image On A Wearable Computing System
First Claim
1. A method comprising:
a wearable computing system providing a view of a real-world environment of the wearable computing system;
imaging at least a portion of the view of the real-world environment in real-time to obtain a real-time image;
the wearable computing system receiving at least one input command that is associated with a desired manipulation of the real-time image, wherein the at least one input command comprises an input command that identifies a portion of the real-time image to be manipulated, wherein the input command that identifies the portion of the real-time image to be manipulated comprises a hand gesture detected in a region of the real-world environment, wherein the region corresponds to the portion of the real-time image to be manipulated;
based on the at least one received input command, the wearable computing system manipulating the real-time image in accordance with the desired manipulation; and
the wearable computing system displaying the manipulated real-time image in a display of the wearable computing system.
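The claimed method is a four-step loop: image the view, receive an input command identifying a portion of the image, manipulate, and display. As a minimal illustrative sketch (not the patented implementation; the function names and stub callables are assumptions), the steps can be wired together like this:

```python
def run_pipeline(capture, detect_command, manipulate, display):
    """One pass of the claimed method: image the real-world view,
    receive an input command (e.g., a detected hand gesture) that
    identifies a portion of the real-time image, manipulate the image
    accordingly, and display the result."""
    frame = capture()                # imaging the view in real time
    command = detect_command(frame)  # e.g., hand-gesture detection
    # Manipulate only when a command was received; otherwise pass through.
    result = manipulate(frame, command) if command else frame
    display(result)
    return result


# Toy usage with stand-in callables: the "manipulation" just tags the frame.
shown = []
out = run_pipeline(lambda: "frame",
                   lambda f: "zoom",
                   lambda f, c: f + "+" + c,
                   shown.append)
print(out)      # → frame+zoom
print(shown)    # → ['frame+zoom']
```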
2 Assignments
0 Petitions
Abstract
Example methods and systems for manipulating and displaying a real-time image and/or photograph on a wearable computing system are disclosed. A wearable computing system may provide a view of a real-world environment of the wearable computing system. The wearable computing system may image at least a portion of the view of the real-world environment in real-time to obtain a real-time image. The wearable computing system may receive at least one input command that is associated with a desired manipulation of the real-time image. The at least one input command may be a hand gesture. Then, based on the at least one received input command, the wearable computing system may manipulate the real-time image in accordance with the desired manipulation. After manipulating the real-time image, the wearable computing system may display the manipulated real-time image in a display of the wearable computing system.
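As a concrete illustration of the manipulation step described above, the sketch below zooms a gesture-identified region of a frame back to full frame size using nearest-neighbor sampling. This is only one possible "desired manipulation"; the function name and the list-of-lists image representation are assumptions for illustration, not anything specified by the patent.

```python
def zoom_region(image, top, left, height, width):
    """Magnify the region identified by a hand gesture so it fills the
    whole frame, using nearest-neighbor sampling.

    `image` is a list of pixel rows; (top, left, height, width) bounds
    the portion of the real-time image that the gesture identified."""
    rows, cols = len(image), len(image[0])
    zoomed = []
    for r in range(rows):
        src_r = top + (r * height) // rows  # source row for output row r
        zoomed.append([image[src_r][left + (c * width) // cols]
                       for c in range(cols)])
    return zoomed


# A 4x4 "frame"; zooming the top-left 2x2 region doubles each pixel.
frame = [[0, 1, 2, 3],
         [4, 5, 6, 7],
         [8, 9, 10, 11],
         [12, 13, 14, 15]]
print(zoom_region(frame, 0, 0, 2, 2))
# → [[0, 0, 1, 1], [0, 0, 1, 1], [4, 4, 5, 5], [4, 4, 5, 5]]
```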
76 Citations
20 Claims
1. A method comprising:
a wearable computing system providing a view of a real-world environment of the wearable computing system;
imaging at least a portion of the view of the real-world environment in real-time to obtain a real-time image;
the wearable computing system receiving at least one input command that is associated with a desired manipulation of the real-time image, wherein the at least one input command comprises an input command that identifies a portion of the real-time image to be manipulated, wherein the input command that identifies the portion of the real-time image to be manipulated comprises a hand gesture detected in a region of the real-world environment, wherein the region corresponds to the portion of the real-time image to be manipulated;
based on the at least one received input command, the wearable computing system manipulating the real-time image in accordance with the desired manipulation; and
the wearable computing system displaying the manipulated real-time image in a display of the wearable computing system.
Dependent Claims: 2-16
17. A non-transitory computer readable medium having instructions stored thereon that, in response to execution by a processor, cause the processor to perform operations, the instructions comprising:
instructions for providing a view of a real-world environment of a wearable computing system;
instructions for imaging at least a portion of the view of the real-world environment in real-time to obtain a real-time image;
instructions for receiving at least one input command that is associated with a desired manipulation of the real-time image, wherein the at least one input command comprises an input command that identifies a portion of the real-time image to be manipulated, wherein the input command that identifies the portion of the real-time image to be manipulated comprises a hand gesture detected in a region of the real-world environment, wherein the region corresponds to the portion of the real-time image to be manipulated;
instructions for, based on the at least one received input command, manipulating the real-time image in accordance with the desired manipulation; and
instructions for displaying the manipulated real-time image in a display of the wearable computing system.
18. A wearable computing system comprising:
a head-mounted display, wherein the head-mounted display is configured to provide a view of a real-world environment of the wearable computing system, wherein providing the view of the real-world environment comprises displaying computer-generated information and allowing visual perception of the real-world environment;
an imaging system, wherein the imaging system is configured to image at least a portion of the view of the real-world environment in real-time to obtain a real-time image;
a controller, wherein the controller is configured to (i) receive at least one input command that is associated with a desired manipulation of the real-time image and (ii) based on the at least one received input command, manipulate the real-time image in accordance with the desired manipulation, wherein the at least one input command comprises an input command that identifies a portion of the real-time image to be manipulated, wherein the input command that identifies the portion of the real-time image to be manipulated comprises a hand gesture detected in a region of the real-world environment, wherein the region corresponds to the portion of the real-time image to be manipulated; and
a display system, wherein the display system is configured to display the manipulated real-time image in a display of the wearable computing system.
Dependent Claims: 19-20
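The system claim turns on mapping a hand gesture detected in a region of the real-world environment to the corresponding portion of the real-time image. A minimal sketch of that coordinate mapping, assuming the image spans the wearer's full field of view (the function name and units are illustrative assumptions, not claim language):

```python
def gesture_region_to_pixels(gesture_box, fov_size, image_size):
    """Map a gesture's bounding box, expressed in field-of-view
    coordinates, to the matching pixel region of the real-time image.

    gesture_box: (x, y, width, height) in view units
    fov_size:    (width, height) of the field of view, same units
    image_size:  (width, height) of the real-time image in pixels"""
    gx, gy, gw, gh = gesture_box
    sx = image_size[0] / fov_size[0]  # horizontal view-to-pixel scale
    sy = image_size[1] / fov_size[1]  # vertical view-to-pixel scale
    return (round(gx * sx), round(gy * sy), round(gw * sx), round(gh * sy))


# A gesture box in a 100x50-unit view maps onto a 640x320-pixel frame:
print(gesture_region_to_pixels((10, 5, 20, 10), (100, 50), (640, 320)))
# → (64, 32, 128, 64)
```

The returned pixel rectangle is exactly the "portion of the real-time image to be manipulated" that a routine like a zoom or crop would then operate on.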
Specification