User Interaction in Augmented Reality
Abstract
Techniques for user-interaction in augmented reality are described. In one example, a direct user-interaction method comprises displaying a 3D augmented reality environment having a virtual object and a real first and second object controlled by a user, tracking the position of the objects in 3D using camera images, displaying the virtual object on the first object from the user's viewpoint, and enabling interaction between the second object and the virtual object when the first and second objects are touching. In another example, an augmented reality system comprises a display device that shows an augmented reality environment having a virtual object and a real user's hand, a depth camera that captures depth images of the hand, and a processor. The processor receives the images, tracks the hand pose in six degrees-of-freedom, and enables interaction between the hand and the virtual object.
20 Claims
1. A computer-implemented method of direct user-interaction in an augmented reality system, comprising:

controlling, using a processor, a display device to display a three-dimensional augmented reality environment comprising a virtual object and a real first and second object controlled by a user;

receiving, at the processor, a sequence of images from at least one camera showing the first and second object, and using the images to track the position of the first and second object in three dimensions;

enabling interaction between the second object and the virtual object when the first and second object are in contact at the location of the virtual object from the perspective of the user.

Dependent claims: 2–11
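The contact condition recited in claim 1 can be illustrated with a short sketch. This is not the patented implementation: the `Vec3` type, the Euclidean distance test, and the 2 cm `CONTACT_THRESHOLD` are all hypothetical, standing in for whatever tracking representation and tolerance a real system would use. Interaction is enabled only when the two tracked real objects touch each other at the virtual object's apparent location:

```python
import math
from dataclasses import dataclass


@dataclass
class Vec3:
    """A tracked 3D position (hypothetical representation)."""
    x: float
    y: float
    z: float

    def dist(self, other: "Vec3") -> float:
        return math.sqrt((self.x - other.x) ** 2
                         + (self.y - other.y) ** 2
                         + (self.z - other.z) ** 2)


CONTACT_THRESHOLD = 0.02  # metres; illustrative tolerance, not from the patent


def interaction_enabled(first_obj: Vec3, second_obj: Vec3,
                        virtual_obj: Vec3) -> bool:
    """Enable interaction only when the first and second tracked objects
    are in contact with each other AND that contact occurs at the
    virtual object's location."""
    objects_in_contact = first_obj.dist(second_obj) < CONTACT_THRESHOLD
    at_virtual_object = first_obj.dist(virtual_obj) < CONTACT_THRESHOLD
    return objects_in_contact and at_virtual_object
```

A per-frame loop would re-evaluate this predicate with each new tracked position, which is why the claim's tracking step precedes the enabling step.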
12. An augmented reality system, comprising:

a display device arranged to display a three-dimensional augmented reality environment comprising a virtual object and a real hand of a user;

a depth camera arranged to capture images of the hand of the user having a plurality of image elements, each image element having a value indicating a distance between the camera and a corresponding portion of the hand;

a processor arranged to receive the depth camera images, track the movement and pose of the hand of the user in six degrees of freedom, monitor the pose of the hand to detect a predefined gesture, and, responsive to detecting the predefined gesture, trigger an associated interaction between the hand of the user and the virtual object.

Dependent claims: 13–19
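The processor behaviour in claim 12 (track the hand pose in six degrees of freedom, monitor it for a predefined gesture, and trigger an associated interaction on detection) can be sketched as a per-frame loop. The `HandPose` record, the `is_palm_up` gesture test, and the tolerance value are hypothetical illustrations, not the patented method; the point is only the shape of the monitor-and-trigger loop:

```python
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class HandPose:
    """Six degrees of freedom: position plus orientation (illustrative)."""
    x: float
    y: float
    z: float       # position in metres
    roll: float
    pitch: float
    yaw: float     # orientation in radians


def is_palm_up(pose: HandPose, tol: float = 0.3) -> bool:
    """Hypothetical 'predefined gesture': hand rolled roughly 180 degrees."""
    return abs(abs(pose.roll) - 3.14159) < tol


def run_tracking_loop(poses: Iterable[HandPose],
                      gesture: Callable[[HandPose], bool],
                      on_gesture: Callable[[HandPose], None]) -> int:
    """Monitor the tracked pose each frame; responsive to detecting the
    predefined gesture, trigger the associated interaction callback.
    Returns the number of frames on which the gesture fired."""
    triggers = 0
    for pose in poses:
        if gesture(pose):
            on_gesture(pose)
            triggers += 1
    return triggers
```

In a real system the `poses` stream would come from a hand tracker fed by the depth camera's per-element distance values; here it is simply any iterable of poses.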
20. One or more tangible device-readable media with device-executable instructions that, when executed by a computing device, direct the computing device to perform steps comprising:

generating a three-dimensional augmented reality environment comprising a virtual object and a real first hand and second hand of one or more users;

controlling a display device to display the virtual object and the first hand and second hand;

receiving a sequence of images from a depth camera showing the first hand and second hand;

analyzing the sequence of images to determine a position and pose of each of the first hand and second hand in six degrees of freedom;

using the position and pose of the second hand to render the virtual object at a location in the augmented reality environment such that the virtual object appears to be located on the surface of the second hand from the perspective of the user, and moving the virtual object in correspondence with movement of the second hand; and

triggering interaction between the first hand and the virtual object at the instance when the position and pose of the first hand and second hand indicates that a digit of the first hand is touching the second hand at the location of the virtual object.
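A minimal sketch of the two-hand trigger in claim 20, under assumed geometry: the virtual object is rendered at a fixed surface offset from the second hand (so it moves in correspondence with that hand), and interaction is triggered when a digit tip of the first hand comes within a hypothetical 2 cm threshold of that rendered location. All names, the offset, and the threshold are illustrative assumptions, not taken from the patent:

```python
import math
from typing import Tuple

Point = Tuple[float, float, float]

# Illustrative constants (not from the patent).
SURFACE_OFFSET: Point = (0.0, 0.03, 0.0)  # object sits 3 cm above the palm
TOUCH_THRESHOLD = 0.02                    # metres


def object_position(second_hand_pos: Point) -> Point:
    """Render the virtual object on the surface of the second hand: its
    position is the hand position plus a fixed surface offset, so moving
    the hand moves the object with it."""
    return tuple(p + o for p, o in zip(second_hand_pos, SURFACE_OFFSET))


def touch_triggered(first_hand_digit_tip: Point,
                    second_hand_pos: Point) -> bool:
    """Trigger interaction when a digit of the first hand touches the
    second hand at the virtual object's location."""
    obj = object_position(second_hand_pos)
    return math.dist(first_hand_digit_tip, obj) < TOUCH_THRESHOLD
```

A full implementation would derive both positions from the six-degree-of-freedom hand poses recovered from the depth-image sequence; this sketch assumes those positions are already available.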
Specification