Canvas Manipulation Using 3D Spatial Gestures
Abstract
User interface manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation.
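The abstract describes a three-step flow: a first gesture converts the 2D representation to 3D, a second gesture manipulates the 3D representation, and a third gesture converts it back to 2D. A minimal sketch of that flow as a state machine follows; all names (`UiState`, `handle_gesture`, the gesture labels) are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch of the gesture-driven flow from the abstract:
# 2D -> (first gesture) -> 3D -> (second gesture) -> manipulated 3D
#    -> (third gesture) -> back to 2D.
from enum import Enum

class UiState(Enum):
    FLAT_2D = "2D"      # 2D representation of the user interface
    SPATIAL_3D = "3D"   # 3D representation of the user interface

def handle_gesture(state, gesture):
    """Return (new_state, action) for a detected gesture in a given state."""
    if state is UiState.FLAT_2D and gesture == "convert":
        return UiState.SPATIAL_3D, "render 3D representation"
    if state is UiState.SPATIAL_3D and gesture == "manipulate":
        return UiState.SPATIAL_3D, "manipulate 3D representation"
    if state is UiState.SPATIAL_3D and gesture == "flatten":
        return UiState.FLAT_2D, "render viewable portion in 2D"
    return state, "ignore"  # unrecognized gesture: no change
```

The sketch makes one point of the claims concrete: the same detection mechanism drives different behavior depending on which representation is currently displayed.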
20 Claims
1. A computer-readable medium that stores a set of instructions which when executed perform a method for displaying information based on gesture detection, the method comprising:

displaying a first representation of a user interface at a display device, the first representation of the user interface being a two-dimensional (2D) representation of the user interface;

detecting a first user gesture; and

displaying, in response to detecting the first user gesture at the display device, a second representation of the user interface, the second representation of the user interface being a three-dimensional (3D) representation of the user interface.

- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A method for multi-dimensional user interface navigation based on gesture detection, the method comprising:

displaying a two-dimensional (2D) representation of a user interface at a display device;

detecting a first user gesture;

displaying, in response to detecting the first user gesture, a three-dimensional (3D) representation of the user interface at the display device;

detecting a second user gesture while displaying the 3D representation of the user interface; and

manipulating, in response to detecting the second user gesture, the 3D representation of the user interface.

- View Dependent Claims (11, 12, 13, 14, 15, 16, 17)
18. A system for displaying information based on gesture detection, the system comprising:

a display device operative to display a two-dimensional (2D) representation of a user interface and a three-dimensional (3D) representation of the user interface;

a gesture detection device operative to detect hand gestures and send signals corresponding to the detected hand gestures;

a memory storage for storing a plurality of instructions associated with the detected hand gestures; and

a processing unit coupled to the display device, the gesture detection device, and the memory storage, wherein the processing unit is operative to:

cause a display of a first portion of the user interface in the 2D representation of the user interface;

receive a first signal indicative of a first detected hand gesture from the gesture detection device;

determine a first set of instructions associated with the first detected hand gesture from the plurality of instructions in the memory storage, wherein the first set of instructions provides instructions to cause a display of a second portion of the user interface in the 3D representation of the user interface;

cause a display of the user interface in accordance with the determined first set of instructions;

receive a second signal indicative of a second detected hand gesture from the gesture detection device;

determine a second set of instructions associated with the second detected hand gesture from the plurality of instructions in the memory storage, wherein the second set of instructions provides instructions to manipulate the 3D representation of the user interface;

manipulate the 3D representation of the user interface in accordance with the determined second set of instructions;

receive a third signal indicative of a third detected hand gesture from the gesture detection device;

determine a third set of instructions associated with the third detected gesture from the plurality of instructions in the memory storage, wherein the third set of instructions provides instructions to cause a display of a viewable portion of the manipulated 3D representation of the user interface when displayed in the 2D representation of the user interface; and

cause a display of the viewable portion of the manipulated 3D representation of the user interface in the 2D representation of the user interface.

- View Dependent Claims (19, 20)
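Claim 18's processing unit follows a dispatch pattern: each signal from the gesture detection device selects an associated set of instructions from the memory storage, which is then executed against the displayed interface. A minimal sketch of that lookup-and-execute pattern follows; the table keys, the `process_signal` helper, and the dictionary-based UI state are all illustrative assumptions, not elements of the claim.

```python
# Hypothetical sketch of claim 18's dispatch: a detected-gesture signal
# selects a stored instruction set, which the processing unit executes.
# The three entries mirror the first, second, and third gestures of the claim.
GESTURE_INSTRUCTIONS = {
    "first_gesture":  lambda ui: ui.update(mode="3D", portion="second"),
    "second_gesture": lambda ui: ui.update(manipulated=True),
    "third_gesture":  lambda ui: ui.update(mode="2D", portion="viewable"),
}

def process_signal(signal, ui_state):
    """Look up the instruction set for a gesture signal and apply it."""
    instructions = GESTURE_INSTRUCTIONS.get(signal)
    if instructions is not None:
        instructions(ui_state)  # execute the stored instruction set
    return ui_state
```

The design choice worth noting is the indirection: gestures are not hard-wired to behavior, but mapped through stored instruction sets, which is why the claim recites the memory storage and the determine/execute steps separately.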
Specification