SPATIAL INTERACTION IN AUGMENTED REALITY
First Claim
1. A method, comprising:
acquiring, by a user device, an image of a real-world scene;
displaying, by the user device, an augmented reality (AR) scene that includes the image of the real-world scene, a virtual target object, and a virtual cursor, wherein a position of the virtual cursor is provided according to a first coordinate system within the AR scene;
tracking a pose of the user device relative to a user hand according to a second coordinate system that defines a relationship between the user device and the user hand;
mapping the second coordinate system to the first coordinate system to control movement of the virtual cursor in the AR scene in response to movements of the user hand, wherein mapping the second coordinate system to the first coordinate system includes a first mapping mode and a second mapping mode, wherein the first mapping mode is configured to control movement of the virtual cursor to change a distance between the virtual cursor and the virtual target object in the AR scene, and wherein the second mapping mode is configured to control movement of the virtual cursor to manipulate the virtual target object within the AR scene; and
detecting, at the user device, a user input to control which of the first mapping mode and the second mapping mode is used to control movement of the virtual cursor in the AR scene.
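The claim above distinguishes two mapping modes selected by a user input: in the first, hand motion moves only the virtual cursor (changing its distance to the target object); in the second, hand motion manipulates the target object itself. A minimal sketch of that mode switch is below; the class name, method names, and the choice of a simple toggle input are all hypothetical illustrations, not details from the patent:

```python
class CursorMapper:
    """Hypothetical sketch of the claim's two mapping modes."""

    FIRST_MODE = "move_cursor"      # cursor moves relative to the target
    SECOND_MODE = "manipulate"      # target object follows the cursor

    def __init__(self, cursor_pos, target_pos):
        self.cursor = list(cursor_pos)
        self.target = list(target_pos)
        self.mode = self.FIRST_MODE

    def on_user_input(self):
        # A detected user input (e.g. a screen tap) toggles which
        # mapping mode controls cursor movement.
        self.mode = (self.SECOND_MODE if self.mode == self.FIRST_MODE
                     else self.FIRST_MODE)

    def on_hand_move(self, delta):
        # Hand motion always drives the cursor in scene coordinates.
        self.cursor = [c + d for c, d in zip(self.cursor, delta)]
        if self.mode == self.SECOND_MODE:
            # Second mode: the target object is dragged along with
            # the cursor, i.e. the cursor manipulates the object.
            self.target = [t + d for t, d in zip(self.target, delta)]


m = CursorMapper(cursor_pos=[0.0, 0.0, 0.0], target_pos=[0.0, 0.0, 1.0])
m.on_hand_move([0.0, 0.0, 0.5])   # first mode: only the cursor moves
m.on_user_input()                 # toggle to the second mapping mode
m.on_hand_move([0.2, 0.0, 0.0])   # second mode: cursor and target move together
```

In the first mode the cursor closes the gap to the target; after the toggle, the same hand motion drags the target, matching the claim's division of labor between the two modes.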
1 Assignment
0 Petitions
Abstract
A method for spatial interaction in Augmented Reality (AR) includes displaying an AR scene that includes an image of a real-world scene, a virtual target object, and a virtual cursor. A position of the virtual cursor is provided according to a first coordinate system within the AR scene. A user device tracks a pose of the user device relative to a user hand according to a second coordinate system. The second coordinate system is mapped to the first coordinate system to control movements of the virtual cursor. In a first mapping mode, virtual cursor movement is controlled to change a distance between the virtual cursor and the virtual target object. In a second mapping mode, virtual cursor movement is controlled to manipulate the virtual target object. User input is detected to control which of the first mapping mode or the second mapping mode is used.
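The abstract's mapping step carries positions from the device-relative (second) coordinate system into the AR-scene (first) coordinate system. One plausible reading, sketched below, is a rigid-body transform applied to the tracked hand position; the function and variable names are hypothetical and not taken from the patent:

```python
import numpy as np

def map_hand_to_scene(device_pose_in_scene, hand_pos_in_device):
    """Map a hand position from the device-relative (second) coordinate
    system into the AR-scene (first) coordinate system.

    device_pose_in_scene: 4x4 homogeneous pose of the device in the scene.
    hand_pos_in_device:   3-vector hand position relative to the device.
    """
    p_h = np.append(hand_pos_in_device, 1.0)   # homogeneous coordinates
    return (device_pose_in_scene @ p_h)[:3]

# Hypothetical pose: device translated 2 m along the scene's z axis, no rotation.
pose = np.eye(4)
pose[2, 3] = 2.0
cursor_pos = map_hand_to_scene(pose, np.array([0.1, 0.0, 0.5]))
```

Under this reading, updating `device_pose_in_scene` as the device tracks its pose relative to the hand is what makes the virtual cursor follow hand movements in the scene.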
35 Citations
30 Claims
1. A method, comprising:
acquiring, by a user device, an image of a real-world scene;
displaying, by the user device, an augmented reality (AR) scene that includes the image of the real-world scene, a virtual target object, and a virtual cursor, wherein a position of the virtual cursor is provided according to a first coordinate system within the AR scene;
tracking a pose of the user device relative to a user hand according to a second coordinate system that defines a relationship between the user device and the user hand;
mapping the second coordinate system to the first coordinate system to control movement of the virtual cursor in the AR scene in response to movements of the user hand, wherein mapping the second coordinate system to the first coordinate system includes a first mapping mode and a second mapping mode, wherein the first mapping mode is configured to control movement of the virtual cursor to change a distance between the virtual cursor and the virtual target object in the AR scene, and wherein the second mapping mode is configured to control movement of the virtual cursor to manipulate the virtual target object within the AR scene; and
detecting, at the user device, a user input to control which of the first mapping mode and the second mapping mode is used to control movement of the virtual cursor in the AR scene.
Dependent claims: 2-12.
13. An apparatus, comprising:
means for acquiring, by a user device, an image of a real-world scene;
means for displaying, on the user device, an augmented reality (AR) scene that includes the image of the real-world scene, a virtual target object, and a virtual cursor, wherein a position of the virtual cursor is provided according to a first coordinate system within the AR scene;
means for tracking a pose of the user device relative to a user hand according to a second coordinate system that defines a relationship between the user device and the user hand;
means for mapping the second coordinate system to the first coordinate system to control movement of the virtual cursor in the AR scene in response to movements of the user hand, wherein the means for mapping the second coordinate system to the first coordinate system includes a first mapping mode and a second mapping mode, wherein the first mapping mode is configured to control movement of the virtual cursor to change a distance between the virtual cursor and the virtual target object in the AR scene, and wherein the second mapping mode is configured to control movement of the virtual cursor to manipulate the virtual target object within the AR scene; and
means for detecting, at the user device, a user input to control which of the first mapping mode and the second mapping mode is used to control movement of the virtual cursor in the AR scene.
Dependent claims: 14-16.
17. A user device, comprising:
a camera configured to capture an image of a real-world scene;
a display configured to display an augmented reality (AR) scene that includes the image of the real-world scene, a virtual target object, and a virtual cursor, wherein a position of the virtual cursor is provided according to a first coordinate system within the AR scene;
memory adapted to store program code; and
a processing unit coupled to the memory to access and execute instructions included in the program code to direct the user device to:
track a pose of the user device relative to a user hand according to a second coordinate system that defines a relationship between the user device and the user hand;
map the second coordinate system to the first coordinate system to control movement of the virtual cursor in the AR scene in response to movements of the user hand, wherein mapping the second coordinate system to the first coordinate system includes a first mapping mode and a second mapping mode, wherein the first mapping mode is configured to control movement of the virtual cursor to change a distance between the virtual cursor and the virtual target object in the AR scene, and wherein the second mapping mode is configured to control movement of the virtual cursor to manipulate the virtual target object within the AR scene; and
detect a user input to control which of the first mapping mode and the second mapping mode is used to control movement of the virtual cursor in the AR scene.
Dependent claims: 18-25.
26. A non-transitory computer-readable medium including program code stored thereon, the program code comprising instructions which when executed cause a user device to:
acquire an image of a real-world scene;
display an augmented reality (AR) scene that includes the image of the real-world scene, a virtual target object, and a virtual cursor, wherein a position of the virtual cursor is provided according to a first coordinate system within the AR scene;
track a pose of the user device relative to a user hand according to a second coordinate system that defines a relationship between the user device and the user hand;
map the second coordinate system to the first coordinate system to control movement of the virtual cursor in the AR scene in response to movements of the user hand, wherein mapping the second coordinate system to the first coordinate system includes a first mapping mode and a second mapping mode, wherein the first mapping mode is configured to control movement of the virtual cursor to change a distance between the virtual cursor and the virtual target object in the AR scene, and wherein the second mapping mode is configured to control movement of the virtual cursor to manipulate the virtual target object within the AR scene; and
detect a user input to control which of the first mapping mode and the second mapping mode is used to control movement of the virtual cursor in the AR scene.
Dependent claims: 27-30.
Specification