Integration of graphical application content into the graphical scene of another application
Abstract
This application describes a system that captures 3D geometry commands from a first 3D graphics process and stores them in a shared memory. A second 3D environment process creates a 3D display environment using a display and display hardware. A third process obtains the 3D commands and supplies them to the hardware to place 3D objects in the 3D environment. The result is a fused display environment where 3D objects are displayed along with other display elements. Input events in the environment are analyzed and mapped to the 3D graphics process or the environment where they affect corresponding processing.
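The abstract's capture-and-fuse pipeline hinges on one process writing intercepted graphics buffers into shared memory and another process reading them back out. A minimal sketch of that handoff, assuming a named-shared-memory segment holding one color buffer and one depth buffer, might look like the following; the frame size and function names (`write_frame`, `read_frame`) are illustrative and not taken from the patent.

```python
# Hypothetical sketch: an intercept-side writer passes a captured color
# buffer and depth buffer to an integration-side reader through a named
# shared-memory segment. Sizes and names are illustrative assumptions.
import numpy as np
from multiprocessing import shared_memory

WIDTH, HEIGHT = 64, 48  # toy frame dimensions

def write_frame(name, color, depth):
    """Intercept side: copy a color buffer (H, W, 4 uint8) and a depth
    buffer (H, W float32) into a named shared-memory segment."""
    shm = shared_memory.SharedMemory(name=name, create=True,
                                     size=color.nbytes + depth.nbytes)
    raw = np.frombuffer(shm.buf, dtype=np.uint8)
    raw[:color.nbytes] = color.ravel()                      # color first
    raw[color.nbytes:] = depth.ravel().view(np.uint8)       # then depth bytes
    return shm  # the writer keeps the segment alive

def read_frame(name):
    """Integration side: attach to the segment and copy both buffers out."""
    shm = shared_memory.SharedMemory(name=name)
    csize = HEIGHT * WIDTH * 4
    color = np.frombuffer(shm.buf, dtype=np.uint8,
                          count=csize).reshape(HEIGHT, WIDTH, 4).copy()
    depth = np.frombuffer(shm.buf, dtype=np.float32,
                          offset=csize).reshape(HEIGHT, WIDTH).copy()
    shm.close()
    return color, depth
```

In a real system the reader would be the integration process of claim 13, which turns these buffers into draw calls on the 3D environment's command stream; synchronization (e.g. per-frame sequence numbers) is omitted here for brevity.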
19 Claims
1. A method, comprising:
intercepting, by an intercept module, buffer-based architecture three-dimensional (3D) imagery produced by a first application, wherein the 3D imagery is not provided by the first application for consumption by the intercept module;
extracting two-dimensional (2D) imagery, color imagery and depth imagery from the intercepted 3D imagery; and
combining the 2D imagery, the color imagery and the depth imagery with a 3D graphics command stream of a second application to drive display hardware in order to integrate a 3D object based on the 2D imagery, the color imagery and the depth imagery into a 3D scene, using a same 3D coordinate system of the 3D scene, based on the 3D graphics command stream of the second application, wherein the 3D scene is independent of the first application, and wherein the combining further comprises modifying the 2D imagery, the color imagery and the depth imagery in order to visually integrate the 3D object into the 3D scene.
Dependent claims: 2-12.
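The geometric core of the combining step in claim 1 is placing an object reconstructed from depth imagery into the second application's scene using that scene's own coordinate system. One way to sketch it, assuming a simple pinhole-camera model, is to unproject each depth sample to a 3D point and then apply a 4x4 placement transform; the intrinsics (`fx`, `fy`, `cx`, `cy`) and the transform are illustrative assumptions, not details from the claims.

```python
# Hypothetical sketch of the geometric step: unproject a captured depth
# image into camera-space 3D points, then transform those points into the
# host scene's coordinate system. Intrinsics and the 4x4 placement matrix
# are illustrative assumptions, not from the patent claims.
import numpy as np

def unproject(depth, fx, fy, cx, cy):
    """Turn a depth image (H, W) into camera-space 3D points (H, W, 3)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.stack([x, y, depth], axis=-1)

def place_in_scene(points, model):
    """Apply a 4x4 transform that places the reconstructed object in the
    host scene's coordinate system, via homogeneous coordinates."""
    h, w, _ = points.shape
    homo = np.concatenate([points.reshape(-1, 3),
                           np.ones((h * w, 1))], axis=1)
    return (homo @ model.T)[:, :3].reshape(h, w, 3)
```

A renderer driven by the second application's command stream could then draw these points (or a mesh built over them), textured with the captured color imagery, so the object participates in the host scene's depth testing, which matches the claim's "visually integrate" language.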
13. A system, comprising:
graphics hardware configured to produce display signals;
a display configured to produce an image using the display signals;
a shared memory; and
a computer comprising:
a three-dimensional (3D) graphics process configured to generate buffer-based architecture 3D graphics imagery;
an intercept process configured to extract two-dimensional (2D) graphics imagery, color imagery and depth imagery from the 3D graphics imagery produced by the 3D graphics process, wherein the 3D graphics imagery is not provided by the 3D graphics process for consumption by the intercept process, and to store the 2D graphics imagery, the color imagery and the depth imagery in the shared memory;
a 3D environment process configured to generate a 3D graphics command stream for a 3D fusion environment and supply the 3D graphics command stream to the graphics hardware; and
an integration process configured to integrate a 3D object based on the stored 2D graphics imagery, the color imagery and the depth imagery into the 3D fusion environment, using a same 3D coordinate system of the 3D fusion environment, based on the 3D graphics command stream of the 3D environment process, wherein the 3D fusion environment is independent of the 3D graphics process, and wherein the integration process is further configured to modify the 2D graphics imagery, the color imagery and the depth imagery in order to visually integrate the 3D object into the 3D fusion environment.
Dependent claims: 14 and 15.
16. A display, comprising:
a three-dimensional (3D) fusion display environment, displayed in a window and independent of a first process, created by a buffer-based architecture 3D graphics command stream generated by a second process;
a 3D object created from two-dimensional (2D) graphics imagery, color imagery and depth imagery generated from 3D graphics imagery produced by the first process, intercepted by an intercept module, and integrated into the 3D fusion display environment using a same 3D coordinate system of the 3D fusion display environment, wherein the 2D graphics imagery, the color imagery and the depth imagery are not provided by the first process for consumption by the intercept module, and wherein the integration further comprises modifying the 2D graphics imagery, the color imagery and the depth imagery in order to visually integrate the 3D object into the 3D fusion display environment; and
display hardware configured to display the integrated 3D fusion display environment.
Dependent claims: 17 and 18.
19. A non-transitory computer-readable medium having stored thereon computer-executable instructions that, if executed by a computing device, cause the computing device to perform a method comprising:
intercepting, by an intercept module, buffer-based architecture three-dimensional (3D) imagery produced by a first application, wherein the 3D imagery is not provided by the first application for consumption by the intercept module;
extracting two-dimensional (2D) imagery, color imagery and depth imagery from the intercepted 3D imagery; and
combining the 2D imagery, the color imagery and the depth imagery with a 3D graphics command stream of a second application to drive display hardware in order to integrate a 3D object based on the 2D imagery, the color imagery and the depth imagery into a 3D scene, using a same 3D coordinate system of the 3D scene, based on the 3D graphics command stream of the second application, wherein the 3D scene is independent of the first application, and wherein the combining further comprises modifying the 2D imagery, the color imagery and the depth imagery in order to visually integrate the 3D object into the 3D scene.
Specification