Method of generating digital images of objects in 3D scenes while eliminating object overdrawing within the multiple graphics processing pipelines (GPPLs) of a parallel graphics processing system, by generating partial color-based complementary-type images along the viewing direction using black pixel rendering and subsequent recompositing operations
Abstract
A multi-pass method of generating an image frame of a 3D scene while eliminating the overdrawing of objects within the multiple graphics processing pipelines (GPPLs) supported on a parallel graphics processing system. The GPPLs include a primary GPPL, and each GPPL includes a color frame buffer and a Z depth buffer. The GPPLs support an object-division based parallel graphics rendering process, in which the 3D scene is decomposed into objects that are assigned to particular GPPLs for processing. The multi-pass method involves, during a first pass, locally generating a Global Depth Map (GDM), which is provided to the Z depth buffer of each GPPL. This step involves the transmission of graphics commands and data for all objects in the image frame, to all GPPLs to be rendered. Then, during subsequent passes, a complementary-type partial image consisting of visible pixels only is generated within the color buffer of each GPPL using the GDM and a Z test filter supported by the Z depth buffer. After subsequent passes are performed, a complete color image is recomposited within the primary GPPL, using the complementary-type partial images stored in the color buffers of the GPPLs, without comparing or recompositing depth values in the Z depth buffers.
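The multi-pass scheme summarized in the abstract can be sketched in a few lines of pure Python. Everything here (the fragment representation, the object names, and a 1D three-pixel "frame") is a hypothetical stand-in for real pipeline state, intended only to show how the GDM pass, the Z-test filtering into complementary black/color images, and the depth-free recomposition fit together; it is not an implementation of any actual GPU pipeline.

```python
# Hypothetical simulation of the object-division, GDM-based multi-pass
# method.  Each object is a list of fragments: (x, depth, color).
objects = {
    "near_box": [(0, 1.0, "red"), (1, 1.0, "red")],
    "far_wall": [(0, 5.0, "blue"), (1, 5.0, "blue"), (2, 5.0, "blue")],
}
assignment = {"near_box": 0, "far_wall": 1}   # object -> assigned GPPL index
WIDTH, N_GPPLS, BLACK, FAR = 3, 2, "black", float("inf")

# Pass 1: every GPPL receives ALL objects and locally builds the same
# Global Depth Map (GDM), which is stored in every GPPL's Z depth buffer.
gdm = [FAR] * WIDTH
for frags in objects.values():
    for x, depth, _ in frags:
        gdm[x] = min(gdm[x], depth)
z_buffers = [list(gdm) for _ in range(N_GPPLS)]

# Subsequent pass: each GPPL renders only its ASSIGNED objects; a color
# pixel is written only where the fragment survives the Z test against
# the GDM.  Every other pixel stays black, so the partial images in the
# color buffers are complementary.
color_buffers = [[BLACK] * WIDTH for _ in range(N_GPPLS)]
for name, frags in objects.items():
    gppl = assignment[name]
    for x, depth, color in frags:
        if depth <= z_buffers[gppl][x]:        # Z test filter vs. GDM
            color_buffers[gppl][x] = color

# Final pass: the primary GPPL recomposites the complementary partial
# images by simple per-pixel combination -- no depth comparisons are
# needed, because at most one buffer is non-black at each position.
frame = []
for x in range(WIDTH):
    survivors = [buf[x] for buf in color_buffers if buf[x] != BLACK]
    frame.append(survivors[0] if survivors else BLACK)
```

Running the sketch, the occluded portion of `far_wall` (pixels 0 and 1) is rendered black on its assigned GPPL, so no pixel is shaded more than once across the pipelines.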
18 Claims
1. A method of generating image frames of a 3D scene containing objects along a viewing direction, using an object-division based parallel rendering process carried out on a parallel graphics processing system employing a plurality of graphics processing pipelines (GPPLs) configured to operate according to object-division mode of parallel rendering, while eliminating the overdrawing of objects in said 3D scene that are occluded by other objects along said viewing direction, wherein said GPPLs include a primary GPPL, wherein each said GPPL has a Z depth buffer in which each depth value has an x, y position, and a color image buffer in which each pixel value has an x, y position, wherein said 3D scene to be rendered along said viewing direction is decomposed into objects, and said objects are assigned to particular GPPLs for graphics processing, wherein during the generation of each image frame, said method comprising the steps of:
(a) transmitting graphics commands and data for all objects in the image frame, to all said GPPLs to be rendered;
(b) within each said GPPL, using said graphics commands and data transmitted in step (a) to locally generate a Global Depth Map (GDM), and then storing said GDM within the Z depth buffers of all of said GPPLs;
(c) transmitting graphics commands and data of objects in the image frame, to only assigned GPPLs;
(d) within the color buffer of each GPPL, (i) generating a partial color-based complementary-type image, using said GDM, a Z test filter operating on said Z depth buffer, and said graphics commands and data transmitted in step (c), and (ii) then storing said partial color-based complementary-type image in the color buffer of said GPPL;
wherein the pixels of objects sent to assigned GPPLs are rendered as color pixel values within the color image buffers of the assigned GPPLs, while pixels of objects sent to non-assigned GPPLs are rendered as black pixel values within the color image buffers of the non-assigned GPPLs;
wherein said partial color-based complementary-type image within the color frame buffer of each said GPPL comprises (i) black pixel values corresponding to objects in the image frame sent to a non-assigned GPPL, and (ii) color pixel values corresponding to objects in the image frame sent to an assigned GPPL and located closest to the viewer along said viewing direction;
wherein, at a given x, y position in the color image buffers of said GPPLs, at most one said GPPL holds a color pixel value which has survived the Z test filter operating on said Z buffers using said GDM stored in said Z buffers of all said GPPLs, while all other said GPPLs hold a black pixel value; and
(e) after a final pass, recompositing a complete color image frame of said 3D scene within said primary GPPL, using said partial color-based complementary-type images stored in said color image buffers of said GPPLs, by simply combining, at each x, y position, the color and black pixel values of said partial color-based complementary-type images stored in the color image buffers within said GPPLs, so as to form said complete color image frame of the 3D scene, without using said GDM or any depth value information stored in the Z buffers of said GPPLs. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
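Step (e)'s depth-free recomposition works because the partial images are complementary: at each x, y position at most one GPPL holds a non-black pixel, so a simple per-channel summation (equivalently, a max or bitwise OR when black is the zero value) over all the color buffers reconstructs the full frame. A minimal sketch, assuming hypothetical RGB-tuple buffers:

```python
BLACK_RGB = (0, 0, 0)

def recomposite(partial_images):
    """Combine complementary partial images by per-channel summation.

    Because at most one image holds a non-black pixel at each x, y
    position, summing is equivalent to selecting the one surviving
    color -- no GDM or Z-buffer lookup is needed at this stage.
    """
    height, width = len(partial_images[0]), len(partial_images[0][0])
    frame = [[BLACK_RGB] * width for _ in range(height)]
    for img in partial_images:
        for y in range(height):
            for x in range(width):
                r, g, b = frame[y][x]
                pr, pg, pb = img[y][x]
                frame[y][x] = (r + pr, g + pg, b + pb)
    return frame

# Two complementary 1x2 partial images (hypothetical data):
img_a = [[(255, 0, 0), BLACK_RGB]]   # GPPL 0: a red pixel survived at x=0
img_b = [[BLACK_RGB, (0, 0, 255)]]   # GPPL 1: a blue pixel survived at x=1
full_frame = recomposite([img_a, img_b])
```

The choice of summation here mirrors the claim's "simply combining": any associative merge that treats black as the identity element yields the same complete frame.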
10. A computing system for generating image frames of a 3D scene containing objects along a viewing direction, using an object-division based parallel rendering process carried out on a parallel graphics processing subsystem, while eliminating the overdrawing of objects in said 3D scene that are occluded by other objects along said viewing direction, said computing system comprising:
CPU memory space for storing one or more graphics-based applications and a graphics library for generating graphics commands and data (GCAD) during the run-time of the graphics-based applications;
one or more CPUs for executing said graphics-based applications; and
a parallel graphics processing subsystem employing a plurality of graphics processing pipelines (GPPLs) configured to operate according to an object-division mode of parallel rendering;
wherein said GPPLs include a primary GPPL;
wherein each said GPPL has a Z depth buffer in which each depth value has an x, y position, and a color image buffer in which each pixel value has an x, y position;
wherein said 3D scene to be rendered along said viewing direction is decomposed into objects, and said objects are assigned to particular GPPLs for graphics processing;
wherein during a pass involved in rendering an image frame, graphics commands and data for all objects in the image frame are transmitted to all said GPPLs to be rendered;
wherein graphics commands and data transmitted to each GPPL during said pass are used to locally generate a Global Depth Map (GDM) within the Z depth buffer of said GPPL;
wherein during subsequent passes involved in rendering the image frame, graphics commands and data of objects in the image frame are transmitted to only assigned GPPLs;
wherein, within the color buffer of each GPPL, a partial color-based complementary-type image is generated using said GDM, a Z test filter operating on said Z depth buffer, and said transmitted graphics commands and data, and then stored in the color buffer of said GPPL;
wherein the pixels of objects sent to assigned GPPLs are rendered as color pixel values within the color image buffers of the assigned GPPLs, while pixels of objects sent to non-assigned GPPLs are rendered as black pixel values within the color image buffers of the non-assigned GPPLs;
wherein said partial color-based complementary-type image within the color frame buffer of each said GPPL comprises black pixel values corresponding to objects in the image frame sent to a non-assigned GPPL, and color pixel values corresponding to objects in the image frame sent to an assigned GPPL and located closest to the viewer along said viewing direction;
wherein, at a given x, y position in the color image buffers of said GPPLs, at most one said GPPL holds a color pixel value (i.e., a non-zero pixel value) which has survived the Z test filter operating on said Z buffers using said GDM stored in said Z buffers of all said GPPLs, while all other said GPPLs hold a black pixel value; and
wherein, after a final pass, a complete color image frame of said 3D scene is recomposited within said primary GPPL, using said partial color-based complementary-type images stored in said color image buffers of said GPPLs, by simply combining, at each x, y position, the color and black pixel values of said partial color-based complementary-type images stored in the color image buffers within said GPPLs, so as to form said complete color image frame of the 3D scene, without using said GDM or any depth value information stored in the Z buffers of said GPPLs. - View Dependent Claims (11, 12, 13, 14, 15, 16, 17, 18)
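The Z test filter recited in the system claim can be pictured as a per-fragment predicate: a fragment contributes a color pixel only on its assigned GPPL, and only if its depth matches the closest depth recorded in the GDM; every other case yields black. A hypothetical sketch (the function name, parameters, and depth tolerance are invented for illustration):

```python
def shade_fragment(gppl_id, assigned_gppl, frag_depth, gdm_depth,
                   frag_color, black=(0, 0, 0), eps=1e-6):
    """Return the pixel value one GPPL writes for one fragment.

    Only the assigned GPPL may emit color, and only when the fragment
    is the front-most surface recorded in the Global Depth Map.  All
    other cases produce a black pixel, which keeps the partial images
    in the color buffers complementary across GPPLs.
    """
    if gppl_id == assigned_gppl and frag_depth <= gdm_depth + eps:
        return frag_color
    return black
```

Applying this predicate uniformly on every GPPL is what guarantees the invariant stated above: at any x, y position, at most one GPPL's color buffer holds a non-zero value.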
Specification