SENSOR FUSION FOR DEPTH ESTIMATION
Abstract
To generate a pixel-accurate depth map, data from a range-estimation sensor (e.g., a time-of-flight sensor) is combined with data from multiple cameras to produce a high-quality depth measurement for pixels in an image. To do so, a depth measurement system may use a plurality of cameras mounted on a support structure to perform a depth hypothesis technique to generate a first depth-support value. Furthermore, the apparatus may include a range-estimation sensor which generates a second depth-support value. In addition, the system may project a 3D point onto the auxiliary cameras and compare the color of the associated pixel in the auxiliary camera with the color of the pixel in the reference camera to generate a third depth-support value. The system may combine these support values for each pixel in an image to determine respective depth values. Using these values, the system may generate a depth map for the image.
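The per-pixel fusion described in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the weighted-sum combination rule, and the winner-take-all depth selection are all assumptions, since the abstract does not specify how the three support values are combined.

```python
import numpy as np

def fuse_support(stereo_support, range_support, projection_support,
                 weights=(1.0, 1.0, 1.0)):
    """Combine three per-depth support arrays into a total support score.

    Each argument has shape (num_depth_hypotheses,) and holds the support
    for one pixel at each candidate depth. A weighted sum is one simple
    fusion rule; the document does not mandate a specific combination.
    """
    w1, w2, w3 = weights
    return w1 * stereo_support + w2 * range_support + w3 * projection_support

def best_depth(depth_hypotheses, total_support):
    # Pick the candidate depth with the highest fused support.
    return depth_hypotheses[int(np.argmax(total_support))]
```

Repeating this selection for every pixel yields the depth map the abstract describes.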
27 Citations
20 Claims
1. A method for calculating a depth value for a pixel in a reference image, the method comprising:
receiving the reference image captured by a reference camera and at least one auxiliary image captured by an auxiliary camera;
receiving a first support value indicating whether the pixel in the reference image is at a particular depth, relative to the reference camera, based on comparing a region of the auxiliary image captured by the auxiliary camera with a region of the reference image captured by the reference camera;
providing a depth estimate of the pixel from a range-estimation camera;
receiving a second support value indicating whether the pixel in the reference image is at the particular depth based on comparing the depth estimate from the range-estimation camera to the particular depth;
receiving a third support value indicating whether the pixel is at the particular depth based on projecting a 3D point, corresponding to the pixel in the reference image, onto the auxiliary image; and
fusing, by operation of one or more computer processors, the first, second, and third support values to generate a total support value for the pixel at the particular depth.
View Dependent Claims (2, 3, 4, 5, 6, 7)
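The first support value in claim 1 rests on comparing a region of the auxiliary image against a region of the reference image at a hypothesized depth. A minimal sketch of one such comparison, assuming sum-of-absolute-differences as the similarity measure and an exponential mapping to a support score (neither is specified by the claim):

```python
import numpy as np

def patch_support(ref_patch, aux_patch, sigma=0.1):
    """Support that two image regions depict the same surface point.

    Uses sum of absolute differences (SAD), normalized per pixel; a lower
    difference maps to a higher support. The similarity measure and the
    sigma scale are illustrative assumptions.
    """
    sad = np.mean(np.abs(ref_patch.astype(float) - aux_patch.astype(float)))
    return float(np.exp(-sad / sigma))
```

In a plane-sweep style depth-hypothesis search, this score would be evaluated once per candidate depth, with the auxiliary patch sampled at the location implied by that depth.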
8. A computer program product for calculating a depth value for a pixel in a reference image, the computer program product comprising:
a computer-readable storage medium having computer-readable program code embodied therewith, the computer-readable program code configured to:
receive the reference image captured by a reference camera and at least one auxiliary image captured by an auxiliary camera;
receive a first support value indicating whether the pixel in the reference image is at a particular depth, relative to the reference camera, based on comparing a region of the auxiliary image with a region of the reference image;
provide a depth estimate of the pixel from a range-estimation camera;
receive a second support value indicating whether the pixel in the reference image is at the particular depth based on comparing the depth estimate from the range-estimation camera to the particular depth;
receive a third support value indicating whether the pixel is at the particular depth based on projecting a 3D point, corresponding to the pixel in the reference image, onto the auxiliary image; and
fuse the first, second, and third support values to generate a total support value for the pixel at the particular depth.
View Dependent Claims (9, 10, 11, 12, 13)
14. A system, comprising:
a common support structure comprising:
a reference camera, at least one auxiliary camera, and a range-estimation camera; and
a computing device communicatively coupled to the reference, auxiliary, and range-estimation cameras, the computing device configured to:
calculate a first support value indicating whether a pixel in a reference image captured by the reference camera is at a particular depth, relative to the reference camera, based on comparing a region of the reference image with a region of an auxiliary image captured by the auxiliary camera;
calculate a second support value indicating whether the pixel in the reference image is at the particular depth based on comparing a depth estimate from the range-estimation camera to the particular depth;
calculate a third support value indicating whether the pixel is at the particular depth based on projecting a 3D point, corresponding to the pixel in the reference image, onto the auxiliary image; and
fuse the first, second, and third support values to generate a total support value for the pixel at the particular depth.
View Dependent Claims (15, 16, 17, 18)
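The third support value in claims 1, 8, and 14 involves projecting a 3D point onto the auxiliary image and checking photometric consistency. A sketch under a standard pinhole camera model; the Gaussian color-similarity kernel and its sigma are assumptions, not drawn from the claims:

```python
import numpy as np

def project_point(K, R, t, X):
    """Project 3D point X (world coordinates) into a camera with
    intrinsics K and extrinsics [R|t] using the pinhole model;
    returns the pixel coordinates (u, v)."""
    x_cam = R @ X + t          # world -> camera frame
    x_img = K @ x_cam          # camera frame -> image plane
    return x_img[:2] / x_img[2]  # perspective divide

def color_support(ref_color, aux_color, sigma=30.0):
    # Higher support when the reprojected pixel's color matches the
    # reference pixel's color; sigma sets the tolerance (assumed value).
    diff = np.linalg.norm(np.asarray(ref_color, float) -
                          np.asarray(aux_color, float))
    return float(np.exp(-(diff / sigma) ** 2))
```

The 3D point itself would come from back-projecting the reference pixel at the hypothesized depth; occlusion handling, which a real system would need, is omitted here.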
19. A method for adjusting a depth value for a pixel in an image, the method comprising:
providing a depth map associated with the image, wherein the depth map comprises respective depth values for a plurality of pixels in the image;
grouping a subset of the plurality of pixels into a super pixel based on comparing respective thermal values associated with neighboring pixels;
estimating a depth plane for the super pixel based on the respective depth values of the subset of the plurality of pixels; and
adjusting at least one of the respective depth values by comparing the respective depth values of the subset of the plurality of pixels to the estimated depth plane.
View Dependent Claims (20)
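Claim 19's plane estimation and adjustment steps could look like the following sketch: a least-squares plane is fitted over a superpixel's pixel coordinates and depths, and depths that deviate from the plane beyond a threshold are snapped back onto it. The snapping rule and the threshold value are assumptions; the claim only requires comparing the depth values to the estimated plane.

```python
import numpy as np

def fit_depth_plane(us, vs, depths):
    """Least-squares plane d = a*u + b*v + c over a superpixel's pixels."""
    A = np.column_stack([us, vs, np.ones_like(us)])
    coeffs, *_ = np.linalg.lstsq(A, depths, rcond=None)
    return coeffs  # (a, b, c)

def adjust_depths(us, vs, depths, threshold=0.5):
    """Snap depths deviating from the fitted plane by more than
    `threshold` back onto the plane (illustrative adjustment rule)."""
    a, b, c = fit_depth_plane(us, vs, depths)
    plane = a * us + b * vs + c
    out = depths.copy()
    mask = np.abs(depths - plane) > threshold
    out[mask] = plane[mask]
    return out
```

In practice a robust fit (e.g., RANSAC) would be preferable, since a plain least-squares fit is itself skewed by the very outliers it is meant to correct.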
Specification