Non-edge computer image generation system
First Claim
1. A method for generating an image, said method comprising:
storing a multiplicity of parameters defining quadric and planar surfaces, said parameters being in terms of a three-axis reference coordinate system;
computing coefficients defining image plane curves from the parameters, the image plane curves being a perspective projection into an image plane of silhouette curves of the quadric surfaces, and intersections between the planar surfaces and between the planar surfaces and quadric surfaces;
transforming the stored parameters into terms of an eyepoint coordinate system having an origin at a focal point of the perspective projection and first and third axes paralleling the image plane, the coefficients being computed from the transformed parameters;
grouping the stored parameters into object lists, each object list including all of the parameters defining no more than one of the quadric surfaces and no more than a predetermined number of the planar surfaces, the parameters defining each planar surface including an inequality identification defining a half space bounded by the planar surface, such that the parameters grouped in each object list define an object lying in a conjunction of all the associated half spaces and having a boundary surface defined by the associated quadric and planar surfaces; and
displaying desired portions of the image plane curves on a display means in accordance with said coefficients.
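The object lists of claim 1 amount to a constructive solid geometry representation: each object is the set of points satisfying one quadric inequality together with a conjunction of planar half-space inequalities. A minimal sketch in Python, assuming a quadric coefficient ordering q₁x² + q₂y² + q₃z² + q₄xy + q₅yz + q₆xz + q₇x + q₈y + q₉z + q₀ ≤ 0 and a plane form B₁x + B₂y + B₃z + B₄ = 0; both orderings are editorial assumptions, chosen to be consistent with the coefficient formulas in the later dependent claims:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Plane:
    """Planar surface B1*x + B2*y + B3*z + B4 = 0. `keep_negative` plays the
    role of the claim's inequality identification, selecting which of the
    two half spaces bounded by the plane contains the object."""
    B: Tuple[float, float, float, float]   # (B1, B2, B3, B4)
    keep_negative: bool = True

    def half_space_contains(self, x: float, y: float, z: float) -> bool:
        s = self.B[0]*x + self.B[1]*y + self.B[2]*z + self.B[3]
        return s <= 0.0 if self.keep_negative else s >= 0.0

@dataclass
class ObjectList:
    """One object list: at most one quadric (q0..q9, ordering assumed as in
    the lead-in) plus a bounded number of planes. The object lies in the
    conjunction of all the associated half spaces."""
    quadric: Optional[Tuple[float, ...]] = None   # (q0, q1, ..., q9)
    planes: List[Plane] = field(default_factory=list)

    def contains(self, x: float, y: float, z: float) -> bool:
        if self.quadric is not None:
            q = self.quadric
            value = (q[1]*x*x + q[2]*y*y + q[3]*z*z + q[4]*x*y
                     + q[5]*y*z + q[6]*x*z + q[7]*x + q[8]*y + q[9]*z + q[0])
            if value > 0.0:
                return False
        return all(p.half_space_contains(x, y, z) for p in self.planes)

# Example: a unit sphere cut to its lower half by the half space z <= 0.
hemisphere = ObjectList(
    quadric=(-1, 1, 1, 1, 0, 0, 0, 0, 0, 0),
    planes=[Plane(B=(0, 0, 1, 0), keep_negative=True)],
)
```

Note that no edges are stored anywhere: the boundary of the object is implied entirely by the quadric and plane coefficients, which is the point of the title.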
Abstract
A computer image generation system is described which models objects without the necessity of linear edges. The system is adaptable for dynamic (real time) image generation from a compact model base for use in, for example, flight training systems. Scene content is enhanced by a novel texture generator. The system is designed for use with standard video display equipment.
110 Citations
43 Claims
5. The method of claim 4, wherein the step of computing coefficients defining image plane curves comprises computing limb curve coefficients for each quadric surface in accordance with six limb curve coefficient formulas:
a₁ = q₇² − 4q₀q₁
a₂ = q₉² − 4q₀q₃
a₃ = 2q₇q₉ − 4q₀q₆
a₄ = (2q₇q₈ − 4q₀q₄)f
a₅ = (2q₈q₉ − 4q₀q₅)f
a₆ = (q₈² − 4q₀q₂)f²

where a₁ through a₆ are the limb curve coefficients, f is a distance from the focal point of the perspective projection to the image plane, q₀ through q₉ are the associated quadric surface coefficients, and the perspective projection of the silhouette curve associated with the quadric surface includes a locus of all points in the image plane satisfying an equation:

a₁h² + a₂v² + a₃hv + a₄h + a₅v + a₆ = 0

where h and v are Cartesian coordinates defining a point in terms of an image plane coordinate system having an origin at an intersection of the second eyepoint coordinate axis with the image plane, and horizontal and vertical axes paralleling the first and third eyepoint coordinate axes respectively.
-
-
6. The method of claim 4, further comprising grouping the stored parameters into object lists, each object list including all of the parameters defining no more than one of the quadric surfaces and no more than a predetermined number of the planar surfaces, such that the parameters grouped in each object list define an object lying in a conjunction of all the associated half spaces and having a boundary surface defined by the associated quadric and planar surfaces.
-
7. The method of claim 6, wherein the step of computing coefficients defining image plane curves further comprises computing intersection curve coefficients for each planar surface associated by one of the object lists with one of the quadric surfaces, the intersection curve coefficients being computed in accordance with six intersection curve coefficient formulas:
e₁ = B₄²q₁ − B₁B₄q₇ + q₀B₁²
e₂ = B₄²q₃ − B₃B₄q₉ + q₀B₃²
e₃ = B₄²q₆ − B₁B₄q₉ − B₃B₄q₇ + 2B₁B₃q₀
e₄ = (B₄²q₄ − B₂B₄q₇ − B₁B₄q₈ + 2B₁B₂q₀)f
e₅ = (B₄²q₅ − B₂B₄q₉ − B₃B₄q₈ + 2B₂B₃q₀)f
e₆ = (B₄²q₂ − B₂B₄q₈ + B₂²q₀)f²

where e₁ through e₆ are the intersection curve coefficients, q₀ through q₉ and B₁ through B₄ are the associated quadric and planar surface coefficients, and the perspective projection of the intersection between the planar surface and the associated quadric surface includes a locus of all points in the image plane satisfying an equation:

e₁h² + e₂v² + e₃hv + e₄h + e₅v + e₆ = 0.
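A sketch of claim 7's formulas, with q and B indexable so that q[0]..q[9] and B[1]..B[4] follow the claim's numbering (B[0] is unused padding). The e₂ formula is written with a squared B₃ term, by symmetry with the B₁² and B₂² terms in e₁ and e₆:

```python
def intersection_curve_coefficients(q, B, f):
    """e1..e6 for the projected curve of intersection between a quadric
    surface (q[0]..q[9]) and one of its bounding planes (B[1]..B[4]),
    per the six intersection curve coefficient formulas of claim 7."""
    e1 = B[4]**2*q[1] - B[1]*B[4]*q[7] + q[0]*B[1]**2
    e2 = B[4]**2*q[3] - B[3]*B[4]*q[9] + q[0]*B[3]**2
    e3 = B[4]**2*q[6] - B[1]*B[4]*q[9] - B[3]*B[4]*q[7] + 2*B[1]*B[3]*q[0]
    e4 = (B[4]**2*q[4] - B[2]*B[4]*q[7] - B[1]*B[4]*q[8] + 2*B[1]*B[2]*q[0]) * f
    e5 = (B[4]**2*q[5] - B[2]*B[4]*q[9] - B[3]*B[4]*q[8] + 2*B[2]*B[3]*q[0]) * f
    e6 = (B[4]**2*q[2] - B[2]*B[4]*q[8] + B[2]**2*q[0]) * f**2
    return e1, e2, e3, e4, e5, e6
```

Like the limb curve, the result is a conic e₁h² + e₂v² + e₃hv + e₄h + e₅v + e₆ = 0 in image plane coordinates.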
-
-
8. The method of claim 7, wherein the step of computing coefficients defining image plane curves further comprises computing intersection line coefficients for pairs of the planar surfaces, the intersection line coefficients being computed for each pair of planar surfaces associated with the same object list in accordance with three intersection line coefficient formulas:
l₁ = B₁₋₁/B₄₋₁ − B₁₋₂/B₄₋₂
l₂ = B₃₋₁/B₄₋₁ − B₃₋₂/B₄₋₂
l₃ = B₂₋₁/B₄₋₁ − B₂₋₂/B₄₋₂

where l₁, l₂ and l₃ are the intersection line coefficients, B₁₋₁ through B₄₋₁ are the bounding plane coefficients for one of the planar surfaces from the pair, B₁₋₂ through B₄₋₂ are the bounding plane coefficients for the other planar surface from the pair, and the perspective projection of the intersection between the planar surfaces in the pair includes a locus of all points in the image plane satisfying an equation:

l₁h + l₂v + l₃ = 0.
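The three intersection line formulas of claim 8, transcribed as printed (note they presuppose a nonzero B₄ coefficient for both planes):

```python
def intersection_line_coefficients(Ba, Bb):
    """l1, l2, l3 for the projected line of intersection between a pair of
    planes from the same object list, per claim 8. Ba and Bb are indexable
    so Ba[1]..Ba[4] are the claim's B1-1..B4-1 and Bb[1]..Bb[4] are
    B1-2..B4-2 (index 0 is unused padding)."""
    l1 = Ba[1]/Ba[4] - Bb[1]/Bb[4]
    l2 = Ba[3]/Ba[4] - Bb[3]/Bb[4]
    l3 = Ba[2]/Ba[4] - Bb[2]/Bb[4]
    return l1, l2, l3
```

The projected intersection is then the line l₁h + l₂v + l₃ = 0 in the image plane.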
-
-
9. The method of claim 8, wherein the desired portions of the image plane curves for each object are limited to those portions related to a part of the object's boundary surface which faces the focal point of the perspective projection.
-
10. The method of claim 9, further comprising
calculating visibility test line coefficients for each intersection curve, the visibility test line coefficients being computed in accordance with three visibility test line coefficient formulas:

t₁ = B₁/B₄ − q₇/(2q₀)
t₂ = B₃/B₄ − q₉/(2q₀)
t₃ = (B₂/B₄ − q₈/(2q₀))f

where t₁, t₂ and t₃ are the visibility test line coefficients, B₁ through B₄ are the associated bounding plane coefficients and q₇, q₈, q₉ and q₀ are four of the associated quadric surface coefficients; and

determining the desired portions of each object's associated image plane curves by referring to the associated limb curve, intersection curve, intersection line and visibility test line coefficients and the inequality identification for the associated planar surfaces.
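A sketch of claim 10's three formulas. B₂ is paired with q₈ in t₃, matching the B₁/q₇ and B₃/q₉ pairings in t₁ and t₂; treat that pairing as an editorial assumption, since the source text prints the t₃ plane coefficient inconsistently:

```python
def visibility_test_line_coefficients(q, B, f):
    """t1, t2, t3 per claim 10, with q[0]..q[9] the quadric coefficients and
    B[1]..B[4] the bounding plane coefficients (index 0 unused). The B2/q8
    pairing in t3 is an assumed reconstruction (see lead-in)."""
    t1 = B[1]/B[4] - q[7]/(2*q[0])
    t2 = B[3]/B[4] - q[9]/(2*q[0])
    t3 = (B[2]/B[4] - q[8]/(2*q[0])) * f
    return t1, t2, t3
```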
-
-
11. The method of claim 10, wherein the image plane curves include:

limb curves, each defined by the limb curve coefficients for an associated one of the quadric surfaces;
intersection curves, each defined by the intersection curve coefficients for an associated one of the planar surfaces; and
intersection lines, each defined by the intersection line coefficients for an associated one of the pairs of planar surfaces.
-
12. The method of claim 11, wherein the step of determining desired portions of each object's image plane curves comprises:

testing minimum and maximum points on the associated limb curve against the object's associated visibility test line coefficients and intersection lines;
testing contact points between the associated limb curve and each associated intersection curve against the associated intersection lines; and
testing points of intersection between intersection curves and intersection lines against the object's associated intersection line.
-
-
-
13. A method for simulating visual images encountered during nap-of-the-earth flight, said method comprising:
-
(a) gathering elevation data from a geographic region of which the visual images are to be simulated, the elevation data being measured with respect to a ground plane;
(b) digitizing the elevation data;
(c) determining major terrain features;
(d) isolating the major terrain features;
(e) fitting a single quadric surface to the elevation data corresponding to each isolated major terrain feature;
(f) determining bounding planes for each isolated major terrain feature to maximize continuity between adjoining quadric surfaces;
(g) determining the texture function parameters sufficient to define a desired texture pattern for each quadric surface and bounding plane by Fourier analysis of the elevation data;
(h) generating further quadric surfaces, bounding planes and texture function parameters corresponding to solid surface objects typically found in the geographic region for which the visual images are to be simulated;
(i) generating yet further quadric surfaces, bounding planes and texture function parameters corresponding to dynamic objects typically found in the geographic region for which the visual images are to be simulated;
(j) generating yet further texture function parameters corresponding to a desired cloud texture pattern;
(k) constructing a data base containing quadric surface parameters indicating the size and shape of each quadric surface and its location and orientation with respect to the ground plane, bounding plane parameters indicating the orientation of each bounding plane with respect to the ground plane, and the texture function parameters;
(l) defining a line of sight corresponding to an observation platform location and orientation with respect to the ground plane;
(m) assigning a subframe processor to each of a plurality of uniform subframe areas covering a field of view corresponding to the line of sight;
(n) determining which of the major terrain features, solid surface objects, and dynamic objects are within the field of view;
(o) assigning an object processor to each quadric surface corresponding to the major terrain features, solid surface objects, and dynamic objects within the field of view;
(p) causing each assigned object processor to compute intraobject visibility information for its assigned quadric surface and associated bounding planes from the quadric surface and bounding plane parameters in the data base, and the line of sight;
(q) causing the subframe processors to generate image information from the intraobject visibility information and the texture function parameters;
(r) displaying the image information;
(s) updating the line of sight to simulate a desired increment in the location and orientation of the observation platform with respect to the ground plane;
(t) varying the texture function parameters corresponding to the dynamic objects and the cloud texture pattern to simulate a desired motion and agitation of the associated texture patterns; and
(u) repeating the steps (n) to (t) at a sufficient rate to simulate the desired visual images.
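Step (e), fitting a single quadric surface to the digitized elevation data of an isolated terrain feature, is naturally a least-squares problem. A deliberately reduced sketch that fits a height-field quadric z ≈ a·x² + b·y² + c by solving the normal equations; the patent's full model (general quadrics with bounding planes) is richer, so take this only as an illustration of the fitting step:

```python
def fit_quadric_height_field(samples):
    """Least-squares fit of z = a*x^2 + b*y^2 + c to elevation samples
    given as (x, y, z) triples. Returns [a, b, c]."""
    # Accumulate the normal equations M w = rhs for the basis [x^2, y^2, 1].
    M = [[0.0]*3 for _ in range(3)]
    rhs = [0.0]*3
    for x, y, z in samples:
        phi = (x*x, y*y, 1.0)
        for i in range(3):
            rhs[i] += phi[i]*z
            for j in range(3):
                M[i][j] += phi[i]*phi[j]
    # Solve the 3x3 system by Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            k = M[r][col]/M[col][col]
            for j in range(col, 3):
                M[r][j] -= k*M[col][j]
            rhs[r] -= k*rhs[col]
    w = [0.0]*3
    for r in (2, 1, 0):   # back substitution
        w[r] = (rhs[r] - sum(M[r][j]*w[j] for j in range(r + 1, 3)))/M[r][r]
    return w
```

Because one quadric replaces what a polygonal modeler would represent with many edges and facets, the fitted coefficients are the entire terrain-feature record in the data base of step (k).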
-
-
14. A flight training system for simulating an image plane viewable from a flight platform over a region modeled by a multiplicity of objects, said system comprising:
-
bulk memory means for storing an object list for each object, each such object list including a plurality of object parameters indicating the size, shape, orientation, location, and texture of the associated object;

viewpoint tracking means for providing an updated set of viewpoint parameters indicating the location and orientation of the flight platform with respect to the objects, said viewpoint parameters being updated once during each passage of a desired frame period;

a plurality of object processors for providing an updated set of coefficients for image plane curves representing a perspective projection of the objects into the image plane, said coefficients being computed once per frame period from the viewpoint parameters and the shape, size, orientation and location object parameters;

pixel generating means for providing an updated set of intensity levels for a multiplicity of pixels corresponding to sample points on the image plane, said intensity levels being computed once per frame period from the image plane curve coefficients, the viewpoint parameters, and the texture and location object parameters, said pixel generating means comprising a plurality of subframe processors each associated with a subframe defined by one of a plurality of uniform contiguous areas forming the image plane, and means for transferring the updated image plane curve coefficients from each object processor to at least one subframe processor whose associated area in the image plane is intersected by the projection of an object associated with that object processor, the updated image plane curve coefficients being transferred once per frame period;

fast memory means for storing those object lists corresponding to objects currently viewable in the image plane, the object lists being organized into blocks in the bulk memory means, each block including all the object lists for one of a plurality of predetermined contiguous areas making up the modeled region, the number of objects in each contiguous area being nearly equal; and

display means for arranging the pixels on a video raster.

(Dependent claims 15-35 not shown.)
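The transfer of image plane curve coefficients from object processors to subframe processors requires deciding, once per frame, which uniform subframe areas an object's projection intersects. A hypothetical helper sketching that routing test with a conservative bounding box; the function name and the bounding-box simplification are assumptions, not the patent's mechanism:

```python
def subframes_for_projection(bbox, frame_width, frame_height, grid):
    """Return (column, row) indices of the uniform contiguous subframe areas
    intersected by a projected object's bounding box (hmin, vmin, hmax, vmax),
    for an image plane of frame_width x frame_height split into grid =
    (columns, rows) subframes. Each returned index identifies one subframe
    processor that should receive the object's curve coefficients."""
    cols, rows = grid
    sub_w = frame_width / cols
    sub_h = frame_height / rows
    hmin, vmin, hmax, vmax = bbox
    c0 = max(0, int(hmin // sub_w))
    c1 = min(cols - 1, int(hmax // sub_w))
    r0 = max(0, int(vmin // sub_h))
    r1 = min(rows - 1, int(vmax // sub_h))
    return [(c, r) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]
```

Routing per subframe keeps each subframe processor's per-frame workload bounded, matching the claim's once-per-frame-period update discipline.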
-
-
36. In a system for displaying an image plane on a video raster made up of pixels arranged on scan lines, each pixel having an associated intensity level, a portion of the pixels being superimposed by a projection into the image plane of at least one simulated object, a texture generator comprising
memory means for storing parameters for a texture function, the parameters being statistically related to the simulated object;

means for determining, for each superimposed pixel, first, second and third Cartesian coordinates identifying a location in an object model space of a point on the simulated object for which a projection of the point into the image plane falls within the superimposed pixel;

computing means for modulating the intensity level of each superimposed pixel in accordance with a value of the texture function computed using the Cartesian coordinates determined for that pixel; and

comparing means for comparing the value of the texture function for each superimposed pixel with a translucence limit for the object and substituting a background intensity level for the intensity level of each superimposed pixel whose texture function value is less than the translucence limit.

(Dependent claims 37-40 not shown.)
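The texture generator's per-pixel behavior can be sketched as follows. The sinusoidal texture function and its (amplitude, fx, fy, fz) parameter tuple are purely illustrative assumptions; the claim requires only that the stored parameters be statistically related to the simulated object, and that a texture value below the translucence limit substitutes the background intensity:

```python
import math

def shade_pixel(base_intensity, background_intensity, point, params,
                translucence_limit):
    """Modulate one superimposed pixel's intensity by a texture function
    evaluated at the model-space point (x, y, z) whose projection falls
    within the pixel. Illustrative texture: a scaled sinusoid in [0, 1]."""
    x, y, z = point
    amplitude, fx, fy, fz = params
    value = 0.5 + 0.5 * amplitude * math.sin(fx * x + fy * y + fz * z)
    if value < translucence_limit:
        # Below the translucence limit the pixel "sees through" the object.
        return background_intensity
    return base_intensity * value
```

Because the texture is evaluated in object model space rather than screen space, the pattern stays attached to the surface as the eyepoint moves, which is what enriches scene content without enlarging the model base.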
-
-
41. In a system for generating a display of an image plane characterized by at least one boundary curve separating first and second regions of different intensities, the display being formed by a plurality of square pixels of uniform width arranged on a video raster, each pixel having an associated intensity level, an antialiasing filter comprising:
-
alias testing means for identifying as aliased pixels those pixels intersected by a predetermined area about the boundary curve, said predetermined area about said boundary curve including all points in the image plane within a cutoff distance of said boundary curve, the cutoff distance being substantially equal to the pixel width multiplied by a square root of two;

distance measuring means for determining a weighting distance for each aliased pixel, each weighting distance being defined as a distance measured in pixel widths from a center of the associated pixel to the boundary curve;

weight calculating means for deriving first and second intensity weights for each aliased pixel, the first intensity weight for each aliased pixel whose center lies in the first region and the second intensity weight for each aliased pixel whose center lies in the second region being derived from the associated weighting distance, the other intensity weight for each aliased pixel being derived from a negative of the associated weighting distance; and

intensity adjusting means for assigning an antialiased intensity level to each aliased pixel, the antialiased intensity level being substantially equal to the associated first intensity weight multiplied by the intensity of the first region plus the associated second intensity weight multiplied by the intensity of the second region.

(Dependent claims 42 and 43 not shown.)
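The intensity-weighting scheme of claim 41 can be sketched per aliased pixel. The linear ramp used to derive a weight from the signed weighting distance is an assumption (the claim requires only that one weight be derived from the distance and the other from its negative); the √2-pixel cutoff follows the claim:

```python
import math

CUTOFF = math.sqrt(2.0)   # cutoff distance in pixel widths, per the claim

def intensity_weight(signed_distance):
    """Hypothetical linear ramp: 0.5 on the boundary curve, rising to 1.0
    at the cutoff distance on the pixel-center side of the curve and
    falling to 0.0 at the cutoff on the far side."""
    return min(1.0, max(0.0, 0.5 + signed_distance / (2.0 * CUTOFF)))

def antialias_pixel(weighting_distance, center_in_first_region,
                    intensity_1, intensity_2):
    """Blend the two region intensities for one aliased pixel per claim 41:
    the weight for the pixel's own region is derived from the weighting
    distance, the other weight from its negative."""
    if center_in_first_region:
        w1 = intensity_weight(weighting_distance)
        w2 = intensity_weight(-weighting_distance)
    else:
        w2 = intensity_weight(weighting_distance)
        w1 = intensity_weight(-weighting_distance)
    return w1 * intensity_1 + w2 * intensity_2
```

A pixel centered exactly on the boundary curve thus receives the average of the two region intensities, and the blend fades out smoothly at the √2-pixel cutoff.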
-
Specification