Method and apparatus for mapping texture on an object displayed at a varying view angle from an observer
Abstract
A plurality of texture data of a surface of an object to be displayed, as viewed from a plurality of assumed directions, are stored in a texture memory. When a CPU sends to a graphic processor a command specifying the shape of the surface, the direction from which the surface is viewed, and a texture in the texture memory, a geometric operation unit produces read information for the texture memory based on the input information. A triangle generator uses the read information to read the texture data corresponding to the view direction of the surface of the object for texture mapping. The texture-mapped image of the surface of the object is stored in a frame memory and displayed on a monitor. When there is no texture data corresponding to the view direction, the plurality of texture data closest thereto are read from the texture memory and interpolated to produce appropriate texture data.
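The selection-and-interpolation scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the patented implementation: the representation of view directions as unit vectors, the use of the two closest stored directions, and the per-texel linear blend are all assumptions.

```python
def blend(tex_a, tex_b, t):
    """Per-texel linear interpolation; textures modeled as flat lists of intensities."""
    return [t * a + (1.0 - t) * b for a, b in zip(tex_a, tex_b)]

def select_texture(view_dir, textures):
    """textures: dict mapping unit view-direction vectors (tuples) to texture data.
    Returns the preset texture on an exact direction match; otherwise blends
    the two stored textures whose directions are closest to view_dir."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    if view_dir in textures:
        # a texture was preset for exactly this view direction
        return textures[view_dir]
    # rank stored directions by alignment with the actual view direction
    ranked = sorted(textures, key=lambda d: dot(d, view_dir), reverse=True)
    d0, d1 = ranked[0], ranked[1]
    # weight the two closest textures by how well each aligns with view_dir
    w0, w1 = dot(d0, view_dir), dot(d1, view_dir)
    return blend(textures[d0], textures[d1], w0 / (w0 + w1))
```

When the actual view direction coincides with a predetermined one, the preset texture is returned directly; otherwise the blend weight favors the nearer of the two stored directions.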
14 Claims
1. A texture mapping method for mapping texture data on each of a plurality of polygons constituting a displayed polyhedron on a display screen, after conducting coordinate transformation for a plurality of vertexes on each of the polygons based on modeling data for geometrically defining the polyhedron, texture data for expressing a surface detail of each said polygon, and parameters including a view position and a view direction given by a virtual camera, comprising the steps of:
predetermining a plurality of view directions from said virtual camera for each said polygon;

presetting texture data for each said polygon for said plurality of predetermined view directions;

selecting the preset texture data for one of the predetermined view directions corresponding to an actual view direction for each said polygon being actually displayed; and

mapping the preset texture data on each said polygon being actually displayed.

- View Dependent Claims (2)
3. A texture mapping apparatus for mapping texture data on each of a plurality of polygons constituting a displayed polyhedron on a display screen, after conducting coordinate transformation on a plurality of vertexes for each said polygon based on modeling data for geometrically defining said polyhedron, texture data for expressing a surface detail of each said polygon, and parameters including a view position and a view direction given by a virtual camera, comprising:
a memory for storing, for each said polygon, texture data for each of a plurality of predetermined view directions from said virtual camera;

a selection unit for selecting the stored texture data for one of the predetermined view directions corresponding to an actual view direction of each said polygon being actually displayed; and

graphic image generation means for generating an image of said polyhedron by mapping said preset texture data on each said polygon being actually displayed.

- View Dependent Claims (4)
5. A texture mapping apparatus for mapping texture data on each of a plurality of polygons constituting a displayed polyhedron on a display screen, after conducting coordinate transformation on a plurality of vertexes of each said polygon based on modeling data for geometrically defining said polyhedron, texture data for expressing a surface detail of each said polygon, and parameters including a view position and a view direction given by a virtual camera, comprising:
a first memory for storing texture data for each of a plurality of predetermined view directions from said virtual camera for each said polygon;

a second memory for storing plural sets of mapping data, each set of mapping data including the texture data stored in the first memory, for the predetermined view directions, and an interpolation coefficient therebetween; and

graphic image generating means for receiving a value corresponding to an actual view direction of each said polygon being actually displayed, the graphic image generating means reading a set of mapping data corresponding to said value from the second memory and conducting an interpolating process based on said set thus read to obtain interpolated texture data for each said polygon being actually displayed;

wherein said graphic image generation means generates an image of said polyhedron by mapping the interpolated texture data thus obtained on each said polygon being actually displayed.
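The two-memory arrangement of claim 5 can be sketched as a lookup table playing the role of the second memory: each entry, keyed by a value corresponding to the actual view direction, names two textures held in the first memory and the interpolation coefficient between them. The quantized values, texture names, and 0.25 coefficient steps are illustrative assumptions, not taken from the patent.

```python
# Hypothetical second memory: value -> (texture A, texture B, coefficient k)
MAPPING_DATA = {
    0: ("front", "front", 0.00),  # exactly on a predetermined direction
    1: ("front", "side", 0.25),
    2: ("front", "side", 0.50),
    3: ("front", "side", 0.75),
    4: ("side", "side", 0.00),
}

def interpolated_texture(value, first_memory):
    """Read the set of mapping data for `value` from the second memory and
    blend the two referenced textures from the first memory."""
    name_a, name_b, k = MAPPING_DATA[value]
    tex_a, tex_b = first_memory[name_a], first_memory[name_b]
    # linear interpolation per texel with coefficient k
    return [(1.0 - k) * a + k * b for a, b in zip(tex_a, tex_b)]
```

Precomputing the texture pair and coefficient per quantized value means the graphic image generating means only performs a table read and a blend per polygon at display time.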
6. A texture mapping method for mapping texture data on each of a plurality of polygons constituting a displayed polyhedron on a display screen, after conducting coordinate transformation on a plurality of vertexes of each said polygon based on modeling data for geometrically defining said polyhedron, texture data for expressing a surface detail of each said polygon, and parameters including a view position and a view direction given by a virtual camera, comprising the steps of:
predetermining a plurality of view directions from said virtual camera for each of the polygons;

pre-storing, in a texture memory, texture data for each of the polygons for said plurality of predetermined view directions;

determining, by a central processing unit (CPU), an actual view direction for each said polygon based on information inputted to said CPU, and sending a command indicating the actual view direction thus determined from said CPU to a graphic processor;

selecting, by said graphic processor, at least one pertinent texture data from the texture data pre-stored in said texture memory on a basis of the command received from said CPU; and

mapping the texture data thus selected on each said polygon being actually displayed.

- View Dependent Claims (7, 8, 9)
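The division of labor in claim 6 — the CPU determines the view direction and issues a command, the graphic processor selects the pertinent pre-stored texture — can be sketched as a simple two-stage flow. The texture-memory keys and labels below are illustrative assumptions.

```python
# Hypothetical texture memory keyed by (polygon, predetermined view direction)
TEXTURE_MEMORY = {
    ("billboard", "front"): "tex_billboard_front",
    ("billboard", "left"): "tex_billboard_left",
    ("billboard", "right"): "tex_billboard_right",
}

def cpu_issue_command(polygon, actual_direction):
    """CPU side: encode the determined view direction as a command."""
    return {"polygon": polygon, "direction": actual_direction}

def graphic_processor_select(command):
    """Graphic processor side: select the pertinent pre-stored texture
    on the basis of the command received from the CPU."""
    return TEXTURE_MEMORY[(command["polygon"], command["direction"])]
```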
10. A texture mapping apparatus for mapping texture data on each of a plurality of polygons constituting a displayed polyhedron on a display screen, after conducting coordinate transformation on a plurality of vertexes of each of the polygons based on modeling data for geometrically defining said polyhedron, texture data for expressing a surface detail of each said polygon, and parameters including a view position and a view direction given by a virtual camera, comprising:
a texture memory for pre-storing texture data for each of a plurality of predetermined view directions from said virtual camera for each of the polygons;

a central processing unit (CPU) for determining a view direction of each polygon based on information inputted to said CPU, and sending a command indicating the view direction thus determined to a graphic processor; and

said graphic processor for selecting at least one pertinent texture data from the texture data stored in said texture memory based on the command received from said CPU, and mapping the texture data thus selected on each said polygon being actually displayed.

- View Dependent Claims (11, 12, 13)
14. A driving simulator machine for displaying a simulated polyhedron constituted by a plurality of polygons, by mapping texture data on each of the polygons, after conducting coordinate transformation on vertexes of each said polygon based on modeling data for geometrically defining said polyhedron, texture data for expressing a surface detail of each of the polygons, and parameters including a view position and a view direction given by a virtual camera, comprising:
a display device having a display screen;

an input device for receiving input information representing at least steering wheel manipulation, brake manipulation and accelerator manipulation for simulation by an operator, and generating a signal representing said input information;

a central processing unit (CPU) responsive to said signal from said input device, for determining a view direction of each polygon based on information inputted to said CPU, and sending a command indicating the view direction thus determined;

a texture memory for pre-storing texture data for each of a plurality of predetermined view directions from said virtual camera for each said polygon; and

a graphic processor connected to said CPU and responsive to said command from said CPU, for selecting at least one pertinent texture data from the texture data stored in said texture memory based on said command, and mapping the texture data thus selected on each said polygon being actually displayed.
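In the driving-simulator context of claim 14, the CPU must turn operator input (e.g. a steering angle) into one of the predetermined view directions before issuing its command. A minimal sketch of that step follows; the 45-degree discretization and the command fields are assumed examples, not specified by the claim.

```python
def view_direction_command(steering_angle_deg, polygon_id, step_deg=45):
    """Quantize the heading implied by the steering input to the nearest
    predetermined view direction and package it as a command for the
    graphic processor."""
    quantized = (round(steering_angle_deg / step_deg) * step_deg) % 360
    return {"polygon": polygon_id, "view_direction_deg": quantized}
```

Because the texture memory only holds data for the predetermined directions, snapping the continuous steering input onto that grid guarantees every command the graphic processor receives can be satisfied by a stored texture.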
Specification