SYSTEMS AND METHODS OF SCENE AND ACTION CAPTURE USING IMAGING SYSTEM INCORPORATING 3D LIDAR
Abstract
The present invention pertains to systems and methods for the capture of information regarding scenes using single or multiple three-dimensional LADAR systems. Where multiple systems are included, those systems can be placed in different positions about the imaged scene such that each LADAR system provides different viewing perspectives and/or angles. In accordance with further embodiments, the single or multiple LADAR systems can include two-dimensional focal plane arrays, in addition to three-dimensional focal plane arrays, and associated light sources for obtaining three-dimensional information about a scene, including information regarding the contours of the objects within the scene. Processing of captured image information can be performed in real time, and processed scene information can include data frames that comprise three-dimensional and two-dimensional image data.
26 Claims
1. A system, comprising:
A) a first LIDAR system, including:
a first 2D sensor, wherein the first aperture provides image information to the first 2D sensor;
a first 3D sensor, wherein the first aperture provides image information to the first 3D sensor;
a first illumination source;
B) a second LIDAR system, including:
a first 2D sensor, wherein the first aperture provides image information to the first 2D sensor;
a first 3D sensor, wherein the first aperture provides image information to the first 3D sensor;
a first illumination source;
C) a processor, wherein image information obtained by the first LIDAR system and image information obtained by the second LIDAR system is combined.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
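The combining step of claim 1 can be illustrated with a minimal sketch: assuming each LIDAR system's pose (rotation and translation) in a shared world frame is known, each system's point cloud can be transformed into that frame and the two clouds merged. The names `to_world` and `combine_scans` and the pose representation are illustrative assumptions, not language from the patent.

```python
import numpy as np

def to_world(points, R, t):
    """Transform an Nx3 array of sensor-frame points into a common world frame."""
    return points @ R.T + t

def combine_scans(points_a, pose_a, points_b, pose_b):
    """Merge two LIDAR point clouds captured from different viewpoints.

    Each pose is a (R, t) pair: a 3x3 rotation and a 3-vector translation
    of that sensor in the shared world frame (assumed known, for example
    from a GPS/INS location determination device).
    """
    world_a = to_world(points_a, *pose_a)
    world_b = to_world(points_b, *pose_b)
    # Once both clouds are in one frame, combination is concatenation.
    return np.vstack([world_a, world_b])
```

In practice the poses would come from the location determination hardware or from point-cloud registration; the concatenation itself is the simple part.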
11. A method, comprising:
obtaining a first 2D image of a first portion of a scene from a first view point;
obtaining a first 3D image of the first portion of the scene from the first view point, wherein the first 3D image includes a first point cloud;
determining the location of the first view point;
associating a location to at least some of the points in the first point cloud;
orthorectifying the first 2D image;
associating a location to at least some of the pixels in the 2D image;
fusing pixels in the first 2D image to points in the first point cloud to create a first fused image.
- View Dependent Claims (12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22)
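The fusing step of claim 11, attaching 2D pixel values to 3D points, can be sketched with a standard pinhole projection: since the 2D and 3D sensors share an aperture, a single intrinsic matrix maps camera-frame points onto the image. The function name, interface, and use of an intrinsic matrix `K` are illustrative assumptions rather than the patent's specific method.

```python
import numpy as np

def fuse_2d_3d(points, image, K):
    """Attach a pixel value from a 2D image to each visible 3D point.

    points: Nx3 array in the camera frame (z axis forward, z > 0).
    image:  HxW grayscale array from the 2D sensor.
    K:      3x3 camera intrinsic matrix (shared aperture => shared K).
    Returns an Mx4 fused array (x, y, z, pixel value) for in-frame points.
    """
    uvw = points @ K.T                                 # pinhole projection
    u = (uvw[:, 0] / uvw[:, 2]).round().astype(int)    # column index
    v = (uvw[:, 1] / uvw[:, 2]).round().astype(int)    # row index
    h, w = image.shape[:2]
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)    # keep in-frame points
    return np.column_stack([points[valid], image[v[valid], u[valid]]])
```

Orthorectification and geolocation of the pixels, the other steps of claim 11, would happen before or after this projection; this sketch covers only the pixel-to-point association.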
23. A system, comprising:
a first aperture;
a first 2D sensor, wherein the first 2D sensor is operable to output 2D image data of a first scene, and wherein the first 2D sensor is provided with a first signal by the first aperture;
a first light source;
a beam steering element, wherein the beam steering element is operable to steer a beam output by the first light source;
a first 3D sensor, wherein the first 3D sensor is a flash sensor operable to output 3D image data of the first scene, wherein the first 3D sensor is provided with a second signal by the first aperture, and wherein the second signal includes light output by the first light source and reflected from at least a portion of the first scene;
a location determination device;
a real time processor, wherein a frame of 2D image data is fused with a frame of 3D image data to create a fused data frame, wherein absolute location information output by the location determination device is associated with each frame of fused data, wherein subsequent frames of fused data are generated in real time.
- View Dependent Claims (24, 25, 26)
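The fused data frame of claim 23 pairs a 2D frame with a 3D frame and tags the pair with absolute location. A minimal sketch of that record and of frame-by-frame generation follows; the field names, the (lat, lon, alt) location tuple, and the generator interface are assumptions for illustration, not the patent's data layout.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class FusedFrame:
    """One frame of fused output: 2D image data, 3D point data,
    and the absolute location associated with this frame."""
    frame_id: int
    location: tuple          # e.g. (lat, lon, alt) from the location device
    image: np.ndarray        # frame from the 2D sensor
    points: np.ndarray       # Nx3 returns from the flash 3D sensor

def fuse_stream(images, clouds, locations):
    """Pair each 2D frame with its 3D frame and location, one frame at a time."""
    for i, (img, pts, loc) in enumerate(zip(images, clouds, locations)):
        yield FusedFrame(i, loc, img, pts)
```

Because the generator yields each `FusedFrame` as soon as its inputs arrive, downstream consumers can process frames as they are produced, in the spirit of the claim's real-time requirement.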
Specification