DIGITALLY ENHANCED NIGHT VISION DEVICE
Abstract
A user portable viewing device (D) includes a plurality of non-coaxially aligned sensors (26 and 32) for sensing a scene (74). A processing element (P) combines electronic images into a single electronic image. A display (24) displaying the combined electronic image is adaptable for mounting in an optical axis (A1) including an eye (20) of the user (U) and an input end (28) of the first sensor (26) for direct view. In a second embodiment, a system for fusing images comprises sensors (52 and 56) for generating sets of image data. An information processor (P) receives and samples the sets of image data to generate sample data for computing a fused image array. A display (24) receives the fused image array and displays a fused colorized image generated from the fused image array.
13 Claims
1. A user portable viewing device including a plurality of non-coaxial sensors for sensing a scene to be viewed and operably attached to a processing element combining electronic images originating from at least two sensors into a single electronic image for viewing by a user, the invention comprising:
an electronic display operably connected to the processing element for displaying the combined electronic image;
one of the sensors having an input end to sense the scene to be viewed and being adaptable for mounting in an optical axis extending from an eye of the user; and
the electronic display being adapted to be positioned in the optical axis of the user and between the eye of the user and the input end of the sensor for direct view of the display of the combined image by the user. (Dependent claims: 2, 3)
4. A system for fusing images of a scene received by a plurality of sensors into a displayable colorized image, the system comprising:
two or more sensors for generating two or more corresponding sets of image data;
each set of image data comprising an array of datapoints with each datapoint being representative of at least one information factor mapped to a pre-determined area of the scene as received by the sensor;
an information processor for receiving and sampling the sets of image data to generate corresponding mapped sample data arrays for each set of image data and for computing a fused colorized image array to be displayed from the sample data;
each sample data array comprising a mapping of datapoints from an image data set mapped to a corresponding datapoint in the related sample data array using a pre-determined function;
the mapping function being computed by assigning a point along a color vector in a two or more dimensional colorspace to a selected value of the information factor; and
the fused colorized image to be displayed being computed from combining corresponding datapoints from each of the sample data arrays using a pre-selected arithmetic vector function; and
a display for receiving the fused colorized image array and displaying a fused colorized image generated from the fused colorized image array. (Dependent claims: 5, 6, 7, 8, 9, 10)
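The mapping function recited in claim 4 — assigning a point along a color vector in a two-or-more-dimensional colorspace to a selected value of the information factor — can be illustrated with a minimal sketch. The function name, the RGB colorspace, and the assumption that the information factor is an 8-bit intensity are all illustrative, not taken from the patent:

```python
import numpy as np

def map_along_color_vector(factor, color_vector, factor_max=255.0):
    """Hypothetical mapping function: each datapoint's information
    factor (here, an intensity in [0, factor_max]) selects a point
    along a color vector in RGB colorspace."""
    # Normalize the factor to [0, 1] ...
    t = np.clip(np.asarray(factor, dtype=float) / factor_max, 0.0, 1.0)
    # ... then scale the color vector, yielding one RGB point per datapoint.
    return t[..., np.newaxis] * np.asarray(color_vector, dtype=float)

# Illustrative 1x1 sample array: mid-scale intensity along a pure-green vector.
sample = map_along_color_vector(np.array([[127.5]]), [0.0, 255.0, 0.0])
```

Under this reading, a sensor's monochrome output is assigned a hue by the choice of color vector, so each sensor's contribution remains visually distinguishable in the fused image.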
11. A method for fusing images, the method comprising the steps of:
receiving two or more sets of image data generated by two or more sensors;
each set of image data comprising an array of datapoints with each datapoint being representative of at least one information factor mapped to a pre-determined area of the scene as received by the sensor;
sampling the sets of image data to produce corresponding mapped sample data arrays for each set of image data and for use in computing a fused colorized image array to be displayed from the mapped sample data;
each sample data array comprising a mapping of datapoints from an image data set mapped to a corresponding datapoint in the related sample data array using a pre-determined function;
the mapping function being computed by assigning a point along a color vector in a two or more dimensional colorspace to a selected value of the information factor;
computing a fused colorized image array from the sample data arrays by combining corresponding datapoints from each of the sample data arrays using a pre-selected arithmetic vector function; and
displaying a fused colorized image generated from the fused colorized image array. (Dependent claims: 12, 13)
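The method steps above can be sketched end to end. This is an assumption-laden illustration, not the patent's implementation: it assumes two co-registered 8-bit intensity arrays (e.g. an image intensifier and a thermal sensor), one color vector per sensor, and vector addition as the pre-selected arithmetic vector function:

```python
import numpy as np

def fuse_images(img_a, img_b, vec_a=(0.0, 1.0, 0.0), vec_b=(1.0, 0.0, 0.0)):
    """Hypothetical fusion sketch: map each sensor's sample data array
    along its own color vector, then combine corresponding datapoints
    by vector addition to form the fused colorized image array."""
    def to_color(img, vec):
        # Map each datapoint to a point along the sensor's color vector.
        t = np.clip(np.asarray(img, dtype=float) / 255.0, 0.0, 1.0)
        return t[..., None] * np.asarray(vec, dtype=float)

    # Pre-selected arithmetic vector function: elementwise vector sum.
    fused = to_color(img_a, vec_a) + to_color(img_b, vec_b)
    # Fused colorized image array, clipped to displayable range [0, 1].
    return np.clip(fused, 0.0, 1.0)

# Illustrative 2x2 frames: image-intensifier data (green) + thermal data (red).
a = np.array([[255, 0], [128, 64]])
b = np.array([[0, 255], [128, 0]])
out = fuse_images(a, b)
```

In this sketch, pixels seen only by the first sensor render green, pixels seen only by the second render red, and pixels seen by both mix toward yellow, which is one plausible way the claimed colorization could keep the two sensors' contributions distinguishable.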
Specification