Three-dimensional tracking of a user control device in a volume
Abstract
Tracking objects presented within a stereoscopic three-dimensional (3D) scene. The user control device (e.g., a stylus) may include one or more visually detectable points for at least one tracking sensor to track. The user control device may also include other position-determining devices, for example, an accelerometer and/or gyroscope. Precise 3D coordinates of the user control device may be determined based on location information from the tracking sensor(s) and additional information from the other position-determining devices. The stereoscopic 3D scene may be updated to reflect the determined coordinates.
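The abstract describes combining location information from optical tracking sensors with additional measurements from on-device sensors (accelerometer and/or gyroscope). One common way to blend two such position estimates is a complementary filter. A minimal sketch, assuming a simple weighted blend; the function name, weights, and units are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def fuse_position(optical_pos, imu_pos, alpha=0.9):
    """Blend a camera-tracked position with an IMU dead-reckoned position.

    alpha weights the optical estimate (drift-free but noisy);
    (1 - alpha) weights the IMU estimate (smooth but drifting).
    Illustrative only -- the patent does not specify a fusion algorithm.
    """
    optical_pos = np.asarray(optical_pos, dtype=float)
    imu_pos = np.asarray(imu_pos, dtype=float)
    return alpha * optical_pos + (1.0 - alpha) * imu_pos

# Example: optical tracker and IMU disagree slightly on a tip position (meters)
fused = fuse_position([0.100, 0.200, 0.300], [0.102, 0.198, 0.301])
```

In practice the IMU estimate would itself come from integrating accelerometer/gyroscope readings between optical frames; the blend above only shows how the two information sources could be reconciled.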
23 Claims
1. A system, comprising:
a display configured to display a stereoscopic three-dimensional (3D) scene within a virtual space;
a user control device configured to manipulate objects within the stereoscopic 3D scene, wherein the user control device includes at least one visually detectable point;
at least one tracking sensor configured to track the at least one visually detectable point in a physical space; and
a processing subsystem configured to:
provide the stereoscopic 3D scene to the display;
receive tracking information for the at least one visually detectable point from the at least one tracking sensor;
receive additional information from the user control device, wherein the additional information comprises measurement information from at least one sensor regarding a dynamic state of the user control device;
determine a position and an orientation of the user control device in the physical space based on the received tracking information and the additional information; and
provide an updated stereoscopic 3D scene to the display based on the determined position and orientation of the user control device in the physical space, wherein the updated stereoscopic 3D scene in the virtual space correlates to the physical space within 1 mm in each of an x, y, and z axis.
Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
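The final limitation of claim 1 requires the virtual space to correlate to the physical space within 1 mm on each axis. A minimal check of that per-axis condition; the function name, units (meters assumed here), and comparison are assumptions, since the claim states only the tolerance:

```python
import numpy as np

MM_TOLERANCE = 0.001  # 1 mm per axis, expressed in meters (assumed units)

def correlates_within_tolerance(physical_pos, virtual_pos, tol=MM_TOLERANCE):
    """Check the claimed per-axis correlation: the virtual position must
    match the physical position within `tol` on each of x, y, and z."""
    err = np.abs(np.asarray(virtual_pos, dtype=float)
                 - np.asarray(physical_pos, dtype=float))
    return bool(np.all(err <= tol))
```

Note the claim is per-axis, not a Euclidean-distance bound: an error of 0.9 mm on each of the three axes passes, even though its straight-line magnitude exceeds 1 mm.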
16. A method, comprising:
presenting a stereoscopic 3D scene within a virtual space by at least one display, wherein said presenting the stereoscopic 3D scene comprises displaying at least one stereoscopic image of the stereoscopic 3D scene by the at least one display, wherein a user control device is used to manipulate objects within the stereoscopic 3D scene;
determining first location information of the user control device in a physical space based on two or more captured images of at least one visually detectable point of the user control device, wherein each of the two or more captured images is captured from a distinct perspective;
determining second location information of the user control device in the physical space based on additional information from the user control device, wherein the additional information comprises measurement information from at least one sensor regarding a dynamic state of the user control device;
determining a position and an orientation of the user control device in the physical space based on the first and second location information; and
updating the stereoscopic 3D scene based on said determining the position and orientation of the user control device, wherein the updated stereoscopic 3D scene in the virtual space correlates to the physical space within 1 mm in each of an x, y, and z axis.
Dependent Claims: 17, 18, 19, 20, 21, 22
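Claim 16 locates the visually detectable point from two or more images captured from distinct perspectives. A standard way to recover a 3D position from two such views is ray triangulation: cast a ray from each camera center through the detected image point and take the midpoint of the shortest segment between the rays. A sketch under that assumption (the patent does not prescribe this particular algorithm, and real systems would first undistort and calibrate the cameras):

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Triangulate a point seen from two distinct perspectives.

    c1, c2: camera centers; d1, d2: unit ray directions toward the
    visually detectable point. Returns the midpoint of the shortest
    segment between the two rays.
    """
    c1, d1, c2, d2 = (np.asarray(v, dtype=float) for v in (c1, d1, c2, d2))
    # Solve for ray parameters t1, t2 such that (c1 + t1*d1) - (c2 + t2*d2)
    # is perpendicular to both d1 and d2 (the closest-approach condition).
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return ((c1 + t1 * d1) + (c2 + t2 * d2)) / 2.0
```

With perfectly intersecting rays the midpoint is the intersection itself; with noisy detections the midpoint minimizes the gap between the two rays, which is why at least two distinct perspectives are required.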
23. A non-transitory computer accessible memory medium storing program instructions, wherein the program instructions are executable by a processor to:
provide a stereoscopic three-dimensional (3D) scene to a display, wherein the stereoscopic 3D scene is displayed within a virtual space;
receive tracking information for at least one visually detectable point of a user control device from at least one tracking sensor;
receive additional information from the user control device, wherein the additional information comprises measurement information from at least one sensor regarding a dynamic state of the user control device;
determine a position and an orientation of the user control device in a physical space based on the tracking information for the at least one visually detectable point and the additional information; and
provide an updated stereoscopic 3D scene to the display based on the determined position and orientation of the user control device in the physical space, wherein the updated stereoscopic 3D scene in the virtual space correlates to the physical space within 1 mm in each of an x, y, and z axis.
Specification