Texture blending between view-dependent texture and base texture in a geographic information system
First Claim
1. A computer-implemented method of providing a three-dimensional model of a geographic area, comprising:
- identifying, by one or more computing devices, a perspective of a virtual camera for viewing a polygon mesh, the polygon mesh modeling geometry of a geographic area;
- identifying, by the one or more computing devices, a view-dependent texture associated with a reference direction, the view-dependent texture generated for viewing the three-dimensional model from a reference viewpoint associated with the reference direction;
- identifying, by the one or more computing devices, a base texture generated for viewing the three-dimensional model from a plurality of different viewpoints;
- determining, by the one or more computing devices, a viewpoint direction associated with a fragment of the polygon mesh, the viewpoint direction extending from the virtual camera towards the fragment; and
- determining, by the one or more computing devices, a texture for display at the fragment based at least in part on an amount that a texture mapped image is stretched when mapped to the fragment when viewed from the viewpoint direction;
- wherein the texture determined for display at the fragment comprises one or more elements of the base texture or the view-dependent texture;
- wherein the texture is determined for display at the fragment based at least in part on a stretching factor indicative of the amount the texture mapped image is stretched when mapped to the fragment when viewed from the viewpoint direction;
- wherein the view-dependent texture is selected for display at the fragment when the stretching factor is less than a first threshold;
- wherein the base texture is selected for display at the fragment when the stretching factor is greater than a second threshold; and
- wherein a blended texture is selected for display at the fragment based at least in part on the stretching factor when the stretching factor is between the first threshold and the second threshold, the blended texture comprising a blend between the base texture and the view-dependent texture.
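The threshold logic recited in the claim can be illustrated with a short sketch. This is not the patented implementation: the function names, the placeholder threshold values, and the linear blend ramp are assumptions for illustration only; the claim leaves the blend function unspecified.

```python
def blend_weight(stretch, t_low, t_high):
    """Weight given to the base texture as a function of the stretching factor.

    Below the first threshold (t_low) the view-dependent texture is used
    alone (weight 0.0); above the second threshold (t_high) the base
    texture is used alone (weight 1.0); in between, the two are blended.
    A linear ramp between the thresholds is assumed here for illustration.
    """
    if stretch < t_low:
        return 0.0
    if stretch > t_high:
        return 1.0
    return (stretch - t_low) / (t_high - t_low)


def shade_fragment(stretch, base_texel, vdt_texel, t_low=2.0, t_high=4.0):
    """Per-fragment color: a blend of the base texel and the view-dependent
    texel, driven by the stretching factor. The default t_low/t_high values
    are placeholders, not taken from the patent."""
    w = blend_weight(stretch, t_low, t_high)
    # Component-wise linear interpolation between the two texel colors.
    return tuple(w * b + (1.0 - w) * v for b, v in zip(base_texel, vdt_texel))
```

In a real renderer this selection would typically run per fragment in a shader; the Python form above only makes the branching and blending explicit.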
Abstract
Systems and methods for rendering a view-dependent texture in conjunction with a three-dimensional model of a geographic area are provided. A view-dependent texture can be rendered in conjunction with at least portions of the three-dimensional model. A base texture can be rendered for portions of the three-dimensional model in the same field of view that are viewed from a slightly different perspective than a reference direction associated with the view-dependent texture. For instance, a stretching factor can be determined for each portion of the three-dimensional model based on the reference direction and a viewpoint direction associated with the portion of the three-dimensional model. A base texture, a view-dependent texture, or a blended texture can be selected for rendering at the portion of the three-dimensional model based on the stretching factor.
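One plausible proxy for such a stretching factor, sketched below, grows with the angle between the viewpoint direction and the reference direction: the further the virtual camera's line of sight to a fragment diverges from the direction the view-dependent texture was captured from, the more its texels smear when reprojected. The reciprocal-cosine formula, the function name, and the vector convention are assumptions for illustration; the patent does not recite a specific formula.

```python
def stretching_factor(view_dir, ref_dir):
    """Illustrative stretching-factor proxy: 1 / cos(theta), where theta is
    the angle between the unit viewpoint direction (from the virtual camera
    toward the fragment) and the unit reference direction associated with
    the view-dependent texture. Equal to 1.0 when the two directions agree,
    and growing without bound as they approach perpendicular."""
    cos_theta = sum(v * r for v, r in zip(view_dir, ref_dir))
    cos_theta = max(cos_theta, 1e-6)  # guard against division by zero
    return 1.0 / cos_theta
```

A factor computed this way could then be compared against the two thresholds to pick the view-dependent texture, the base texture, or a blend.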
11 Claims
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A computing system for rendering a three-dimensional model of a geographic area, the system comprising:
- a display;
- one or more processors; and
- one or more computer-readable media, the computer-readable media storing instructions that, when executed by the one or more processors, cause the processors to perform operations, the operations comprising:
- identifying a perspective of a virtual camera for viewing a polygon mesh, the polygon mesh modeling geometry of a geographic area;
- identifying a view-dependent texture associated with a reference direction, the view-dependent texture generated for viewing the three-dimensional model from a reference viewpoint associated with the reference direction;
- identifying a base texture generated for viewing the three-dimensional model from a plurality of different viewpoints;
- determining a viewpoint direction associated with a fragment of the polygon mesh, the viewpoint direction extending from the virtual camera towards the fragment; and
- determining a texture for display at the fragment based at least in part on an amount that a texture mapped image is stretched when mapped to the fragment when viewed from the viewpoint direction;
- wherein the texture determined for display at the fragment comprises one or more elements of the base texture or the view-dependent texture;
- wherein the texture is determined for display at the fragment based at least in part on a stretching factor indicative of the amount the texture mapped image is stretched when mapped to the fragment when viewed from the viewpoint direction;
- wherein the view-dependent texture is selected for display at the fragment when the stretching factor is less than a first threshold;
- wherein the base texture is selected for display at the fragment when the stretching factor is greater than a second threshold; and
- wherein a blended texture is selected for display at the fragment based at least in part on the stretching factor when the stretching factor is between the first threshold and the second threshold, the blended texture comprising a blend between the base texture and the view-dependent texture.
11. A tangible non-transitory computer-readable medium comprising computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform operations of providing a three-dimensional model of a geographic area, the operations comprising:
- identifying a perspective of a virtual camera for viewing a polygon mesh, the polygon mesh modeling geometry of a geographic area;
- identifying a view-dependent texture associated with a reference direction, the view-dependent texture generated for viewing the three-dimensional model from a reference viewpoint associated with the reference direction;
- identifying a base texture generated for viewing the three-dimensional model from a plurality of different viewpoints;
- determining a viewpoint direction associated with a fragment of the polygon mesh, the viewpoint direction extending from the virtual camera towards the fragment; and
- determining a texture for display at the fragment based at least in part on an amount that a texture mapped image is stretched when mapped to the fragment when viewed from the viewpoint direction;
- wherein the texture determined for display at the fragment comprises one or more elements of the base texture or the view-dependent texture;
- wherein the texture is determined for display at the fragment based at least in part on a stretching factor indicative of the amount the texture mapped image is stretched when mapped to the fragment when viewed from the viewpoint direction;
- wherein the view-dependent texture is selected for display at the fragment when the stretching factor is less than a first threshold;
- wherein the base texture is selected for display at the fragment when the stretching factor is greater than a second threshold; and
- wherein a blended texture is selected for display at the fragment based at least in part on the stretching factor when the stretching factor is between the first threshold and the second threshold, the blended texture comprising a blend between the base texture and the view-dependent texture.