Method and apparatus for performing tangent space lighting and bump mapping in a deferred shading graphics processor
Abstract
A system and method for performing tangent space lighting in a deferred shading graphics processor (DSGP) encompasses blocks of the DSGP that preprocess data and a Phong shader that executes only after all fragments have been preprocessed. A preprocessor block receives texture maps specified in a variety of formats and converts those texture maps to a common format for use by the Phong shader. The preprocessor blocks provide the Phong shader with interpolated surface basis vectors (vs, vt, n), a vector Tb that represents in tangent/object space the texture/bump data from the texture maps, light data, material data, eye coordinates and other information used by the Phong shader to perform the lighting and bump mapping computations. The data from the preprocessor is provided for each fragment for which lighting effects need to be computed. The Phong shader computes the color of a fragment using the information provided by the preprocessor. The Phong shader performs all lighting computations in eye space, which requires it first to transform bump data from tangent space to eye space. In one embodiment the Phong hardware does this by multiplying a matrix M, whose columns comprise eye space basis vectors (bs, bt, n) derived from the surface basis vectors (vs, vt, n), by the vector Tb of bump map data. The eye space basis vectors are derived by the DSGP preprocessor so that the multiplication (M×Tb) gives the perturbed surface normal N′ in eye space, reflecting the bump effects. The perturbed surface normal N′ is subsequently used in the lighting computations.
146 Citations
35 Claims
-
1. A bump mapping method for use in a deferred graphics pipeline processor, the method comprising steps of:
-
receiving for a pixel fragment associated with a surface for which bump effects are to be computed:
a surface tangent, binormal and normal defining a tangent space relative to the surface associated with the fragment; and
a tangent space texture vector representing perturbations to the surface normal in the directions of the surface tangent and binormal caused by the bump effects at the surface position associated with the pixel fragment;
computing a set of tangent space basis vectors from the surface tangent, binormal and normal that define a transformation matrix from the tangent space to eye space in view of the orientation of the texture vector;
computing a perturbed, eye space, surface normal reflecting the bump effects by performing a matrix multiplication in which the tangent space texture vector is multiplied by the transformation matrix whose columns comprise the basis vectors, giving a result that is the perturbed, eye space, surface normal; and
performing lighting computations in eye space for the pixel fragment using the perturbed, eye space, surface normal, giving an apparent color for the pixel fragment that accounts for the bump effects without needing to interpolate and transform light and half-angle vectors (L and H) used in the lighting computations.
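The transform recited in claim 1 can be sketched in a few lines of pure Python. This is an illustrative sketch only, not the patented hardware implementation; the function name, tuple representation, and the final normalization step are assumptions made for clarity. The key idea from the claim is that the perturbed eye-space normal N′ is simply M×Tb, where M's columns are the eye-space basis vectors:

```python
import math

def perturbed_eye_normal(b_s, b_t, n, tb):
    """Compute N' = M x Tb, where M's columns are the eye-space basis
    vectors (b_s, b_t, n) and Tb is the tangent-space texture vector."""
    # Matrix-vector product written out with M's columns given explicitly:
    # N'[i] = b_s[i]*Tb[0] + b_t[i]*Tb[1] + n[i]*Tb[2]
    n_prime = tuple(b_s[i] * tb[0] + b_t[i] * tb[1] + n[i] * tb[2]
                    for i in range(3))
    # Normalize so the result can feed lighting computations directly.
    length = math.sqrt(sum(c * c for c in n_prime))
    return tuple(c / length for c in n_prime)
```

With an identity basis (tangent space aligned with eye space) an unperturbed texture vector (0, 0, 1) yields the unperturbed normal (0, 0, 1), as expected. Because N′ already lives in eye space, the lighting step can use eye-space L and H vectors directly, which is the claimed saving: no per-fragment interpolation and transformation of L and H.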
-
-
2. A variable scale bump mapping method for shading a computer graphics image, the method comprising steps of:
-
receiving for a vertex of a polygon associated with a surface to which bump effects are to be mapped, geometry vectors (Vs, Vt, N) and a texture vector (Tb) in tangent space;
separating the geometry vectors into unit basis vectors ({circumflex over (b)}s, {circumflex over (b)}t, {circumflex over (n)}) and magnitudes (mbs, mbt, mbn), such that the unit basis vectors can be used to form a transform matrix for transforming vectors from tangent space to eye space;
multiplying the magnitudes and the texture vector to form a texture-magnitude vector (mTb′);
scaling components of the texture-magnitude vector by a vector s to form a scaled texture-magnitude vector (mTb″); and
multiplying the scaled texture-magnitude vector and the transform matrix formed from the unit basis vectors to provide a perturbed unit normal (N′) in eye space for a pixel location. - View Dependent Claims (3, 4, 5)
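The variable-scale method of claim 2 can be illustrated with a short sketch. The names and the column-vector matrix convention are assumptions for illustration; the claim itself only specifies the sequence of steps: split the geometry vectors into unit basis vectors and magnitudes, fold the magnitudes and the scale vector s into the texture vector, then transform with the unit-basis matrix:

```python
import math

def _norm(v):
    return math.sqrt(sum(c * c for c in v))

def variable_scale_bump(vs, vt, n, tb, s):
    """Sketch of claim 2: separate (Vs, Vt, N) into unit basis vectors
    and magnitudes, scale the texture vector, transform to eye space."""
    mags = (_norm(vs), _norm(vt), _norm(n))                    # (m_bs, m_bt, m_bn)
    units = tuple(tuple(c / m for c in v)
                  for v, m in zip((vs, vt, n), mags))          # (b_s, b_t, n) hats
    m_tb = tuple(m * t for m, t in zip(mags, tb))              # mTb'
    m_tb2 = tuple(si * t for si, t in zip(s, m_tb))            # mTb'' (scaled by s)
    # M's columns are the unit basis vectors; N' = M x mTb''
    return tuple(sum(units[j][i] * m_tb2[j] for j in range(3))
                 for i in range(3))
```

Separating magnitudes from directions lets non-unit geometry vectors carry per-surface bump scale while the transform matrix itself stays orthonormal.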
-
-
6. A variable scale bump mapping method for shading a computer graphics image, the method comprising steps of:
-
receiving a gray scale image for which bump effects are to be computed;
taking derivatives relative to a gray scale intensity for a pixel fragment associated with the gray scale image;
programmatically selecting a bump map representation type, the choices for the bump map representation type comprising:
tangent space;
object space; and
derivative of gray scale intensity;
computing, from the derivatives, a texture map in accordance with the selected bump map representation type;
associating the selected bump map representation type with the bump map so as to indicate how to generate an eye space normal from the texture map; and
computing the eye space normal from one or more values in the texture map. - View Dependent Claims (7)
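The "derivative of gray scale intensity" representation in claim 6 corresponds to the common practice of deriving a bump/normal texture from a height image. The sketch below is one conventional way to do this (central finite differences); the function name, the list-of-rows image representation, and the scale parameter are illustrative assumptions, not details from the patent:

```python
import math

def height_to_tangent_normals(height, scale=1.0):
    """Derive tangent-space normals from a gray-scale height image by
    taking finite-difference derivatives of intensity (cf. claim 6)."""
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # Central differences, clamped at the image borders.
            dx = (height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]) * 0.5
            dy = (height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]) * 0.5
            # A rising slope tilts the normal away from the slope direction.
            n = (-scale * dx, -scale * dy, 1.0)
            length = math.sqrt(sum(c * c for c in n))
            row.append(tuple(c / length for c in n))
        normals.append(row)
    return normals
```

Tagging the resulting texture map with its representation type, as the claim recites, is what later tells the shader which transform (tangent-to-eye, object-to-eye, or derivative-based) produces the eye-space normal.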
-
-
8. A method for bump mapping for shading a computer graphics image, the method comprising steps of:
-
receiving for a pixel fragment associated with a surface for which bump effects are to be computed:
(i) a magnitude vector (m), and a bump vector (Tb) in tangent space; and
(ii) a unit transformation matrix (M) for transforming bump vectors from tangent space to eye space;
multiplying the magnitude vector and the bump vector to form a texture-magnitude vector (mTb′);
scaling components of the texture-magnitude vector by a vector s to form a scaled texture-magnitude vector (mTb″);
multiplying the scaled texture-magnitude vector and the unit transformation matrix to provide a perturbed normal (N′);
re-scaling components of the perturbed normal to form a rescaled vector (N″); and
normalizing the rescaled vector to provide a unit perturbed normal that is used to perform lighting computations to give the pixel fragment bump effects. - View Dependent Claims (9, 11)
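The full sequence of claim 8 (scale, transform, re-scale, normalize) can be sketched as follows. This is a literal, illustrative reading of the claim steps, including claim 12's component-wise reciprocal re-scaling; the names and the columns-of-M convention are assumptions, not the patented circuit:

```python
import math

def bump_lighting_normal(m, tb, M, s):
    """Sketch of claim 8: produce the unit perturbed normal from a
    magnitude vector m, tangent-space bump vector Tb, unit transform
    matrix M (given as three column vectors), and scale vector s."""
    m_tb = tuple(mi * ti for mi, ti in zip(m, tb))            # mTb'
    m_tb2 = tuple(si * ti for si, ti in zip(s, m_tb))         # mTb''
    # N' = M x mTb''
    n_prime = tuple(sum(M[j][i] * m_tb2[j] for j in range(3))
                    for i in range(3))
    # N'': undo the scaling with the reciprocal of s (per claim 12)
    n_rescaled = tuple(c / si for c, si in zip(n_prime, s))
    # Normalize to a unit perturbed normal for the lighting computations.
    length = math.sqrt(sum(c * c for c in n_rescaled))
    return tuple(c / length for c in n_rescaled)
```

The intermediate scale-then-unscale pair may look redundant in floating point, but in fixed-function hardware it lets the matrix multiply operate on values kept within a convenient numeric range.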
-
-
10. A method for bump mapping for shading a computer graphics image, the method comprising steps of:
-
receiving for a pixel fragment associated with a surface for which bump effects are to be computed:
(i) a magnitude vector (m), and a bump vector (Tb); and
(ii) a unit transformation matrix (M);
multiplying the magnitude vector and the bump vector to form a texture-magnitude vector (mTb′);
scaling components of the texture-magnitude vector by a vector s comprising scalars (ss, st, sn) to form a scaled texture-magnitude vector (mTb″), the scaling including multiplying the texture-magnitude vector by s according to the expression;
-
-
12. A method for bump mapping for shading a computer graphics image, the method comprising steps of:
-
receiving for a pixel fragment associated with a surface for which bump effects are to be computed:
(i) a magnitude vector (m), and a bump vector (Tb); and
(ii) a unit transformation matrix (M);
multiplying the magnitude vector and the bump vector to form a texture-magnitude vector (mTb′);
scaling components of the texture-magnitude vector by a vector s to form a scaled texture-magnitude vector (mTb″);
multiplying the scaled texture-magnitude vector and the unit transformation matrix to provide a perturbed normal (N′);
re-scaling components of the perturbed normal to form a rescaled vector (N″); and
normalizing the rescaled vector to provide a unit perturbed normal that is used to perform lighting computations to give the pixel fragment bump effects;
wherein the step of re-scaling components of the perturbed normal comprises the step of multiplying by a reciprocal of vector s (1/(ss, st, sn)) to re-establish a correct relationship between their values.
-
-
13. A bump mapping method for a computer graphics renderer, the renderer generating a rendered image from a plurality of polygons, the method comprising the steps:
-
receiving lighting information in eye coordinates;
receiving a texture map representing bumps on a surface, the texture map being associated with an object, the object comprising a plurality of the polygons, each of the polygons comprising a plurality of vertices, each vertex associated with texture coordinates in the texture map, each vertex associated with a surface normal;
receiving one of the polygons;
transforming the surface normals of the received polygon to eye coordinates;
selecting an interpolation location within the received polygon;
interpolating the texture coordinates of the received polygon at the interpolation location;
selecting bump values from the texture map associated with the interpolated texture coordinates;
if the texture map values are in tangent space, transforming the selected bump values from tangent space to eye space;
if the texture map values are in object space, transforming the selected bump values from object space to eye space; and
performing lighting calculations in eye coordinates. - View Dependent Claims (14, 15, 16, 17, 18)
14. The method of claim 13, further comprising:
associating a parameter with the texture map, the parameter indicating one of a plurality of types of bump map representations, the parameter used to determine the type of computations needed to generate eye coordinate bump-mapped surface normals, the types of bump map representations comprising:
tangent space normals;
object space normals; and
normal perturbation values.
-
-
15. The method of claim 13, further comprising:
-
the polygon vertices further comprising a tangent and binormal; and
the transforming step further comprising the transforming of the tangent and the binormal to eye coordinates.
-
-
16. The method of claim 13, further comprising the step:
generating approximations of the tangent and binormal by computing the derivatives of the texture coordinates for the polygon.
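Claim 16's approximation of the tangent and binormal from texture-coordinate derivatives is a standard construction; one conventional per-triangle form is sketched below. The function name, tuple conventions, and the non-degeneracy assumption are illustrative, not taken from the patent:

```python
def tangent_binormal(p0, p1, p2, uv0, uv1, uv2):
    """Approximate a triangle's tangent and binormal from derivatives
    of its texture coordinates (cf. claim 16). Points are 3-tuples,
    texture coordinates are (u, v) pairs."""
    # Position edges and texture-coordinate deltas relative to p0.
    e1 = tuple(b - a for a, b in zip(p0, p1))
    e2 = tuple(b - a for a, b in zip(p0, p2))
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    # Assumes the UV mapping is non-degenerate (determinant != 0).
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    tangent = tuple((e1[i] * dv2 - e2[i] * dv1) * r for i in range(3))
    binormal = tuple((e2[i] * du1 - e1[i] * du2) * r for i in range(3))
    return tangent, binormal
```

For a triangle whose texture coordinates simply track its x and y positions, the tangent and binormal come out as the x and y axes, which matches intuition.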
-
17. The method of claim 13, further comprising the step:
performing hidden surface removal on the plurality of polygons to eliminate the portions of the polygons that do not contribute to the rendered image, the hidden surface removal being done for an area of the image before any of the lighting calculations are done for the polygons in the area of the image, thereby eliminating some of the lighting calculations.
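The deferred-shading idea in claim 17 (hidden surface removal before any lighting) can be reduced to a toy sketch: keep only the nearest fragment per pixel, then light just those survivors. The fragment tuple layout and depth convention here are assumptions for illustration; the real pipeline operates per tile in hardware:

```python
def cull_hidden_fragments(fragments):
    """Keep, per pixel, only the fragment nearest the eye so that the
    expensive lighting computations run once per visible surface.
    Fragments are (x, y, z, data) tuples with smaller z meaning closer."""
    nearest = {}
    for x, y, z, data in fragments:
        key = (x, y)
        if key not in nearest or z < nearest[key][0]:
            nearest[key] = (z, data)
    return nearest
```

Running this pass for an image area before shading is exactly what "eliminates some of the lighting calculations": occluded fragments never reach the Phong shader at all.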
-
18. The method of claim 13, the method further comprising the step:
generating a tangent-space-to-eye-space transform matrix by transforming object space normal, tangent, and binormal vectors to eye coordinates and using the transformed vectors as rows or columns in the tangent-space-to-eye-space transform matrix.
-
19. A bump mapping device for a computer graphics pipeline, the pipeline generating a rendered image from a plurality of polygons, the device comprising:
-
logic receiving lighting information in eye coordinates;
a memory storing a texture map representing bumps on a surface, the texture map being associated with an object, the object comprising a plurality of the polygons;
logic receiving one of the polygons, the polygon comprising a plurality of vertices, each vertex associated with texture coordinates in the texture map, each vertex associated with a surface normal;
logic transforming the surface normals of the received polygon to eye coordinates;
logic selecting an interpolation location within the received polygon;
logic interpolating the texture coordinates of the received polygon at the interpolation location;
logic reading bump values from the texture map associated with the interpolated texture coordinates;
logic transforming the selected bump values from tangent space to eye space if the texture map values are in tangent space;
logic transforming the selected bump values from object space to eye space if the texture map values are in object space; and
logic performing lighting calculations in eye coordinates. - View Dependent Claims (20, 21, 22, 23, 24)
20. The device of claim 19, further comprising:
logic associating a parameter with the texture map, the parameter indicating one of a plurality of types of bump map representations, the parameter used to determine the type of computations needed to generate eye coordinate bump-mapped surface normals, the types of bump map representations comprising:
tangent space normals;
object space normals; and
normal perturbation values.
-
-
21. The device of claim 19, further comprising:
-
logic receiving a tangent and binormal for each of the vertices of the received polygon; and
logic transforming the tangent and the binormal to eye coordinates.
-
-
22. The device of claim 19, further comprising:
logic generating approximations of the tangent and binormal by computing the derivatives of the texture coordinates for the polygon.
-
23. The device of claim 19, further comprising:
logic performing hidden surface removal on the plurality of polygons to eliminate the portions of the polygons that do not contribute to the rendered image, the hidden surface removal being done for an area of the image before any of the lighting calculations are done for the polygons in the area of the image, thereby eliminating some of the lighting calculations.
-
24. The device of claim 19, further comprising:
logic generating a tangent-space-to-eye-space transform matrix by transforming object space normal, tangent, and binormal vectors to eye coordinates and using the transformed vectors as rows or columns in the tangent-space-to-eye-space transform matrix.
-
25. A computer program for use in conjunction with a computer system, the computer program comprising a computer program mechanism embedded therein, the computer program mechanism comprising:
-
a program module that directs the rendering of a graphics image from a plurality of polygons to function in a specified manner, the program module including instructions for:
receiving lighting information in eye coordinates;
receiving a texture map representing bumps on a surface, the texture map being associated with an object, the object comprising a plurality of the polygons, each of the polygons comprising a plurality of vertices, each vertex associated with texture coordinates in the texture map, each vertex associated with a surface normal;
receiving one of the polygons;
transforming the surface normals of the received polygon to eye coordinates;
selecting an interpolation location within the received polygon;
interpolating the texture coordinates of the received polygon at the interpolation location;
selecting bump values from the texture map associated with the interpolated texture coordinates;
if the texture map values are in tangent space, transforming the selected bump values from tangent space to eye space;
if the texture map values are in object space, transforming the selected bump values from object space to eye space; and
performing lighting calculations in eye coordinates. - View Dependent Claims (26, 27, 28, 29, 30, 31)
26. The computer program of claim 25, the program module further including instructions for:
associating a parameter with the texture map, the parameter indicating one of a plurality of types of bump map representations, the parameter used to determine the type of computations needed to generate eye coordinate bump-mapped surface normals, the types of bump map representations comprising:
tangent space normals, object space normals, and normal perturbation values.
-
-
28. The computer program of claim 25, the program module further including instructions for:
-
receiving a tangent and binormal associated with each of the vertices of the received polygon; and
the transforming instructions further comprising instructions for transforming the tangent and the binormal to eye coordinates.
-
-
29. The computer program of claim 25, the program module further including instructions for:
generating approximations of the tangent and binormal by computing the derivatives of the texture coordinates for the polygon.
-
30. The computer program of claim 25, the program module further including instructions for:
performing hidden surface removal on the plurality of polygons to eliminate the portions of the polygons that do not contribute to the rendered image, the hidden surface removal being done for an area of the image before any of the lighting calculations are done for the polygons in the area of the image, thereby eliminating some of the lighting calculations.
-
31. The computer program of claim 25, the program module further including instructions for:
generating a tangent-space-to-eye-space transform matrix by transforming object space normal, tangent, and binormal vectors to eye coordinates and using the transformed vectors as rows or columns in the tangent-space-to-eye-space transform matrix.
-
32. A computing system for three-dimensional (3-D) graphics rendering, the system generating a rendered image from a plurality of polygons, the system comprising:
-
a general-purpose computer; and
a 3-D graphics processor coupled to the general-purpose computer;
the 3-D graphics processor comprising:
a bump mapping device comprising:
logic receiving lighting information in eye coordinates;
a memory storing a texture map representing bumps on a surface, the texture map being associated with an object, the object comprising a plurality of the polygons;
logic receiving one of the polygons, the polygon comprising a plurality of vertices, each vertex associated with texture coordinates in the texture map, each vertex associated with a surface normal;
logic transforming the surface normals of the received polygon to eye coordinates;
logic selecting an interpolation location within the received polygon;
logic interpolating the texture coordinates of the received polygon at the interpolation location;
logic reading bump values from the texture map associated with the interpolated texture coordinates;
logic transforming the selected bump values from tangent space to eye space if the texture map values are in tangent space;
logic transforming the selected bump values from object space to eye space if the texture map values are in object space; and
logic performing lighting calculations in eye coordinates. - View Dependent Claims (33, 34)
33. The system of claim 32, the 3-D graphics processor further comprising:
a culling device comprising:
logic performing hidden surface removal on the plurality of polygons to eliminate the portions of the polygons that do not contribute to the rendered image, the hidden surface removal being done for one of a plurality of tile areas of the image before any of the lighting calculations are done for the polygons in the one tile area of the image, thereby eliminating some of the lighting calculations.
-
-
34. The system of claim 33, the 3-D graphics processor further comprising:
-
a sort unit comprising:
logic spatially sorting the plurality of polygons according to the tile areas; and
logic outputting the spatially sorted polygons according to their spatial sorting.
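The sort unit of claim 34 bins polygons into screen tiles so that hidden surface removal and lighting can run one tile at a time. A minimal sketch of that spatial sort, using bounding-box binning, is shown below; the data layout (polygons as lists of 2-D vertices) and the binning strategy are illustrative assumptions, not the patented sort hardware:

```python
def sort_into_tiles(polygons, tile_size, width, height):
    """Bin polygons into screen tiles by their 2-D bounding box so
    that culling and lighting can proceed tile by tile (cf. claim 34).
    Each polygon is a list of (x, y) vertex positions in pixels."""
    tiles_x = (width + tile_size - 1) // tile_size
    tiles_y = (height + tile_size - 1) // tile_size
    tiles = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for poly in polygons:
        xs = [v[0] for v in poly]
        ys = [v[1] for v in poly]
        # A polygon is appended to every tile its bounding box overlaps.
        for ty in range(max(0, int(min(ys)) // tile_size),
                        min(tiles_y - 1, int(max(ys)) // tile_size) + 1):
            for tx in range(max(0, int(min(xs)) // tile_size),
                            min(tiles_x - 1, int(max(xs)) // tile_size) + 1):
                tiles[(tx, ty)].append(poly)
    return tiles
```

Emitting polygons tile by tile is what makes the per-tile hidden surface removal of claim 33 possible before any lighting work begins.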
-
-
35. In a processor generating a rendered image for an object from received information, including received object lighting information specified in eye coordinates and received object texture information representing surface features associated with surface normals of the object, a method comprising:
-
transforming the surface normals to eye coordinate space;
interpolating the texture information at interpolation locations;
selecting values from the texture information associated with the interpolated texture information at the interpolation locations;
if the texture information values are specified in tangent coordinate space, transforming the selected values from tangent coordinate space to eye coordinate space;
if the texture information values are specified in object coordinate space, transforming the selected values from object coordinate space to eye coordinate space; and
performing lighting calculations for said object in eye coordinate space.
-
Specification