USING DEPTH INFORMATION FOR DRAWING IN AUGMENTED REALITY SCENES
First Claim
1. A computer-implemented method comprising:
receiving a stream of depth data associated with a real scene of an augmented reality display;
receiving a stream of color data associated with the real scene;
processing the stream of depth data to construct a first mesh;
projecting the first mesh into a color space associated with the stream of color data to construct a second mesh; and
drawing one or more synthetic objects into the real scene based at least in part on boundaries of real objects in the real scene that are defined by the second mesh.
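The projection step recited above can be illustrated with a minimal sketch: vertices of a mesh built in the depth camera's coordinate frame are transformed into the color camera's frame and perspective-projected with that camera's intrinsics, yielding vertex positions in the color space. The intrinsic matrix, rotation, and translation values below are illustrative assumptions, not parameters taken from the patent.

```python
import numpy as np

# Illustrative pinhole intrinsics for the color camera and a rigid
# depth-to-color transform (all values are assumptions for the sketch).
K_COLOR = np.array([[1050.0, 0.0, 640.0],
                    [0.0, 1050.0, 360.0],
                    [0.0, 0.0, 1.0]])
R_D2C = np.eye(3)                     # depth-to-color rotation
T_D2C = np.array([0.05, 0.0, 0.0])    # depth-to-color translation (meters)

def project_mesh_to_color_space(vertices_depth):
    """Project mesh vertices from depth-camera space into the color
    camera's image space, giving the vertex positions of a 'second mesh'
    aligned with the color data."""
    # Transform 3-D vertices into the color camera's coordinate frame.
    v_color = vertices_depth @ R_D2C.T + T_D2C
    # Perspective-project with the color camera intrinsics.
    uvw = v_color @ K_COLOR.T
    uv = uvw[:, :2] / uvw[:, 2:3]     # pixel coordinates in the color image
    # Keep the depth coordinate so later occlusion tests remain possible.
    return np.hstack([uv, v_color[:, 2:3]])
```

The mesh connectivity (faces) is unchanged by this step; only vertex positions move into the color space, which is why the result can serve as a second mesh defining real-object boundaries in the color image.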
Abstract
Techniques for optimizing augmented reality scenes by using depth information to accurately display interactions between real objects and synthetic objects are described. A stream of depth data associated with a real scene of an augmented reality display and a stream of color data associated with the real scene may be received. The stream of depth data may be processed to construct a first mesh, and the first mesh may be projected into a color space associated with the stream of color data to construct a second mesh. In some examples, positions of the synthetic objects relative to real objects in the real scene may be determined, and/or queries may be conducted to determine how the synthetic objects interact with the real objects. Based at least on constructing the second mesh, determining positions, and/or conducting queries, one or more synthetic objects may be drawn into the real scene.
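One common way to realize the interaction queries the abstract describes is a per-pixel depth comparison: a synthetic pixel is drawn only where it lies in front of the real surface recovered from the mesh. The following sketch assumes the real scene's mesh has already been rasterized into a per-pixel depth buffer; function and parameter names are illustrative, not the patent's.

```python
import numpy as np

def composite_synthetic(color_frame, real_depth_buf, synth_color, synth_depth):
    """Draw a synthetic object into the real scene using a per-pixel
    z-test against the real surface depth: the synthetic pixel wins
    only where it is nearer to the viewer than the real object."""
    visible = synth_depth < real_depth_buf    # occlusion query per pixel
    out = color_frame.copy()
    out[visible] = synth_color[visible]       # synthetic object in front
    return out, visible
```

Where the real surface is nearer, the real pixels are kept, so real objects correctly occlude the synthetic object along the boundaries defined by the mesh.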
20 Claims
1. A computer-implemented method comprising:
receiving a stream of depth data associated with a real scene of an augmented reality display;
receiving a stream of color data associated with the real scene;
processing the stream of depth data to construct a first mesh;
projecting the first mesh into a color space associated with the stream of color data to construct a second mesh; and
drawing one or more synthetic objects into the real scene based at least in part on boundaries of real objects in the real scene that are defined by the second mesh.
(Dependent claims 2-9 not reproduced.)
10. A system comprising:
computer-readable media storing at least a rendering module; and
a processing unit operably coupled to the computer-readable media, the processing unit adapted to execute at least the rendering module, the rendering module comprising:
an input module for receiving at least two data streams associated with one or more real objects in a real scene, wherein a first data stream of the at least two data streams includes a depth data stream and a second data stream of the at least two data streams includes a color data stream;
a reconstruction module for constructing a mesh defining surfaces associated with the one or more real objects in the real scene, the constructing based at least in part on projecting the depth data from the first data stream into color data from the second data stream; and
a drawing module for drawing one or more synthetic objects into the real scene based at least in part on boundaries of the one or more real objects that are defined by the mesh.
(Dependent claims 11-16 not reproduced.)
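The system claim's module decomposition can be sketched as three cooperating components wired together by a rendering module. The class names mirror the claim language, but every signature and the placeholder mesh representation are assumptions made for illustration, not the patented implementation.

```python
import numpy as np

class InputModule:
    """Receives the two data streams: depth first, color second."""
    def receive(self, depth_frame, color_frame):
        return np.asarray(depth_frame), np.asarray(color_frame)

class ReconstructionModule:
    """Builds a surface representation of the real objects from depth."""
    def construct_mesh(self, depth, color):
        # Placeholder mesh: per-pixel depth values standing in for the
        # triangulated, color-projected surface a real module would build.
        h, w = depth.shape
        return {"vertices": depth.reshape(-1), "resolution": (h, w)}

class DrawingModule:
    """Composites synthetic objects against the mesh's surfaces."""
    def draw(self, color, mesh, synth_color, synth_depth):
        real_depth = mesh["vertices"].reshape(mesh["resolution"])
        visible = synth_depth < real_depth     # simple per-pixel z-test
        out = color.copy()
        out[visible] = synth_color
        return out

class RenderingModule:
    """Wires the input, reconstruction, and drawing modules together."""
    def __init__(self):
        self.input = InputModule()
        self.reconstruction = ReconstructionModule()
        self.drawing = DrawingModule()

    def render(self, depth_frame, color_frame, synth_color, synth_depth):
        depth, color = self.input.receive(depth_frame, color_frame)
        mesh = self.reconstruction.construct_mesh(depth, color)
        return self.drawing.draw(color, mesh, synth_color, synth_depth)
```

The separation keeps stream ingestion, surface reconstruction, and compositing independently replaceable, which matches how the claim assigns one responsibility to each module.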
17. One or more computer-readable media encoded with instructions that, when executed by a processor, configure a computer to perform acts comprising:
receive depth data comprising a plurality of depth pixels arranged in a point cloud representative of a real scene of an augmented reality display;
receive a stream of color data associated with the real scene;
construct a mesh based at least in part on the plurality of depth pixels;
update the mesh based at least in part on projecting the mesh into a color space associated with the stream of color data; and
draw at least one synthetic object into the real scene based at least in part on surface boundaries defined by the mesh.
(Dependent claims 18-20 not reproduced.)
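Constructing a mesh from a plurality of depth pixels, as claim 17 recites, is often done by back-projecting the depth image into a point cloud and connecting neighboring pixels into triangles. The sketch below assumes a dense depth grid and pinhole intrinsics passed as parameters; this is one standard technique, not necessarily the patent's.

```python
import numpy as np

def depth_grid_to_mesh(depth, fx, fy, cx, cy):
    """Back-project a dense depth image into a 3-D point cloud and
    connect neighboring pixels into triangles, producing a mesh
    (vertices, faces) over the real scene's surfaces."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    # Pinhole back-projection: pixel (u, v) with depth z maps to 3-D.
    z = depth
    x = (us - cx) * z / fx
    y = (vs - cy) * z / fy
    verts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    # Two triangles per 2x2 pixel quad (indices into the vertex array).
    faces = []
    for r in range(h - 1):
        for c in range(w - 1):
            i = r * w + c
            faces.append((i, i + 1, i + w))
            faces.append((i + 1, i + w + 1, i + w))
    return verts, np.array(faces)
```

A production reconstruction would also drop triangles that straddle large depth discontinuities, since those edges correspond to occlusion boundaries rather than real surfaces.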