Method and apparatus for rendering video data
Abstract
The present invention provides a method and apparatus for rendering an input video stream as a polygon texture. The method provides process steps to receive the input video data in a Mip Map generator, wherein the Mip Map generator converts the video data to Mip Map data and stores the Mip Map data in a first memory storage device, wherein the first memory storage device is located in a V buffer. The method further includes sending a data set from a Z buffer to the V buffer and mapping the data set to RGB values at a texel address in the V buffer memory. The data set includes U, V and Z coordinates, Mip Map level and channel identification data. The V buffer includes a V buffer fetch module that receives the data set from the Z buffer and maps it to RGB data within the V buffer memory.
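The data path described in the abstract can be sketched as follows. This is an illustrative sketch only; the function names, box-filter downsampling, and normalized coordinates are assumptions, not details taken from the specification.

```python
# Illustrative sketch of the abstract's data path: a Mip Map generator
# downsamples each incoming video frame into a mip chain, and a V buffer
# fetch maps (U, V, mip level) from a data set to an RGB texel.
# All names here are hypothetical.

def build_mip_maps(frame):
    """Generate a mip chain by 2x2 box-filter averaging (level 0 = full res)."""
    levels = [frame]
    while len(frame) > 1 and len(frame[0]) > 1:
        frame = [
            [tuple((a + b + c + d) // 4 for a, b, c, d in
                   zip(frame[2 * y][2 * x], frame[2 * y][2 * x + 1],
                       frame[2 * y + 1][2 * x], frame[2 * y + 1][2 * x + 1]))
             for x in range(len(frame[0]) // 2)]
            for y in range(len(frame) // 2)
        ]
        levels.append(frame)
    return levels

def v_buffer_fetch(mip_levels, u, v, level):
    """Map normalized (u, v) plus a Mip Map level to an RGB texel."""
    tex = mip_levels[level]
    x = min(int(u * len(tex[0])), len(tex[0]) - 1)
    y = min(int(v * len(tex)), len(tex) - 1)
    return tex[y][x]
```

For example, a 4x4 video frame yields a three-level chain (4x4, 2x2, 1x1), and a fetch at level 1 reads from the 2x2 image.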
27 Claims
1. A method for combining real-time video data with video graphics data, comprising:
separating processing of the real-time video data from a rasterization process;
storing real-time video data in a memory that is physically separate from a memory used to read/write texture data;
inserting a flag in a data set during rasterization processing, said flag indicating that the data set is to be textured with video data; and
obtaining a color value from the memory storing the real-time data and combining the color value with a data set output by the rasterizer.
(Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10)
receiving the real-time video data in a Mip Map generator, wherein the Mip Map generator converts the video data to Mip Map data;
storing the Mip Map data in a first memory storage device, wherein the first memory storage device is located in a V buffer, sending a data set from a Z buffer to the V buffer; and
determining a texel memory location in the first memory storage device, wherein the texel memory location is based upon the data set.
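The flagging step of claim 1 can be sketched as below: the rasterizer tags a fragment's data set with a video-texture flag, and a later stage reads the flag to choose between the ordinary texture memory and the physically separate real-time video memory. All structures and names here are hypothetical, chosen only to illustrate the claimed steps.

```python
# Minimal sketch of claim 1's flag mechanism (hypothetical structures):
# two separate memories, a flag inserted during rasterization, and a
# combine stage that fetches color from the flagged memory.

TEXTURE_MEM = {(0, 0): (255, 0, 0)}   # ordinary read/write texture storage
VIDEO_MEM = {(0, 0): (0, 255, 0)}     # physically separate real-time video storage

def rasterize(u, v, use_video):
    """Emit a data set; the flag marks it for texturing with video data."""
    return {"u": u, "v": v, "video_flag": use_video}

def resolve_color(data_set):
    """Obtain a color value from the memory indicated by the flag."""
    mem = VIDEO_MEM if data_set["video_flag"] else TEXTURE_MEM
    return mem[(data_set["u"], data_set["v"])]
```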
6. The method according to claim 5, further comprising:
transferring RGB data from the texel memory location to the Z buffer, wherein the RGB data is received by a Z buffer comparator; and
transferring the RGB data from the Z buffer to a second memory storage device.
7. The method according to claim 5, wherein the data set includes U and V coordinates.
8. The method according to claim 5, wherein the data set includes Mip Map level data.
9. The method according to claim 5, wherein the data set includes channel identification data.
10. The method according to claim 5, wherein the data set is converted to RGB data by a V buffer memory.
11. An apparatus for rendering input video data, comprising:
a first memory storing texture data;
a second memory storing real-time video data, said second memory being physically separate from the first memory used to read/write texture data; and
said processor obtaining a color value from the memory storing the real-time data and combining the color value with a data set output by the rasterizer.
(Dependent claims: 12, 13, 14, 15, 16, 17, 18)
a V buffer that receives the Mip Map data associated with the input video stream and a data set from a Z buffer; and
a V buffer memory, wherein the V buffer memory stores Mip Map data.
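The data set the Z buffer hands to the V buffer recurs throughout the dependent claims; its recited contents (U and V coordinates, a Z coordinate, Mip Map level data, and channel identification data) can be collected as a simple record. The field names below are illustrative, not taken from the specification.

```python
# Sketch of the data set passed from the Z buffer to the V buffer fetch
# module, per the abstract and claims 14-16 (field names are assumptions).

from dataclasses import dataclass

@dataclass
class DataSet:
    u: float          # U texture coordinate
    v: float          # V texture coordinate
    z: float          # Z (depth) coordinate
    mip_level: int    # Mip Map level data
    channel_id: int   # channel identification data
```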
14. The apparatus according to claim 13, wherein the data set includes U and V coordinates.
15. The apparatus according to claim 13, wherein the data set includes Mip Map level data.
16. The apparatus according to claim 13, wherein the data set includes channel identification data.
17. The apparatus according to claim 13, wherein the V buffer includes a V buffer fetch module that receives the data set from the Z buffer.
18. The apparatus according to claim 17, wherein the V buffer fetch module receives the RGB data from the V buffer memory.
19. A method for transferring RGB data from a storage device to a Z buffer, comprising:
separating processing of real-time video data from a rasterization process;
storing real-time video data in a memory that is physically separate from a memory used to read/write texture data;
transferring a data set from a Z buffer to a memory address calculation module;
determining memory addresses for RGB data stored in the memory storing the real-time video data based upon the data set; and
transferring the RGB data from the memory storing the real-time video data to the Z buffer.
(Dependent claims: 20, 21, 22)
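Claim 19's memory address calculation module can be sketched as a function that turns a data set into a linear address in the memory storing the real-time video data. The layout below (channel planes, then level offsets, then row-major texels, with a 256-texel base width) is an assumption for illustration; the specification may define a different arrangement.

```python
# Hypothetical sketch of claim 19's address-calculation step: map
# (u, v, mip level, channel) to a linear address in the video memory.
# The memory layout and base width are assumptions, not from the patent.

def texel_address(u, v, level, channel, base_width=256, max_levels=8):
    """Linear address: channel plane, then mip-level offset, then row-major (u, v)."""
    w = base_width >> level                                   # each level halves the width
    level_offset = sum((base_width >> l) ** 2 for l in range(level))
    plane = channel * sum((base_width >> l) ** 2 for l in range(max_levels))
    return plane + level_offset + v * w + u
```

With this layout, texel (0, 0) of level 1 in channel 0 sits immediately after the 256x256 level-0 image.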
23. A method for combining real-time video data with video graphics data, comprising:
inserting a flag in a data set during rasterization processing, said flag indicating that the data set is to be textured with video data; and
obtaining a color value from a memory storing the real-time data and combining the color value with a data set output by the rasterizer.
(Dependent claims: 24, 25)
26. A method for combining real-time video data with video graphics data, comprising:
separating processing of the real-time video data from a rasterization process; and
storing real-time video data in a memory that is physically separate from a memory used to read/write texture data.
(Dependent claim: 27)
Specification