
Systems and methods for estimating depth from projected texture using camera arrays

  • US 10,119,808 B2
  • Filed: 11/18/2014
  • Issued: 11/06/2018
  • Est. Priority Date: 11/18/2013
  • Status: Active Grant
First Claim

1. A camera array, comprising:

    at least one two-dimensional array of cameras comprising a plurality of cameras;

    an illumination system configured to illuminate a scene with a projected texture;

    a processor;

    memory containing an image processing pipeline application and an illumination system controller application;

    wherein the at least one two-dimensional array of cameras comprises at least two two-dimensional arrays of cameras located in complementary occlusion zones on opposite sides of the illumination system;

    wherein the illumination system controller application directs the processor to control the illumination system to illuminate a scene with a projected texture;

    wherein the image processing pipeline application directs the processor to:

    utilize the illumination system controller application to control the illumination system to illuminate a scene with a projected texture, wherein a spatial pattern period of the projected texture is different along different epipolar lines;

    capture a set of images of the scene illuminated with the projected texture;

    determine depth estimates for pixel locations in an image from a reference viewpoint using at least a subset of the set of images that includes at least one image captured by a camera in each of the two-dimensional arrays of cameras, wherein generating a depth estimate for a given pixel location in the image from the reference viewpoint comprises:

    identifying pixels in the at least a subset of the set of images that correspond to the given pixel location in the image from the reference viewpoint based upon expected disparity at a plurality of depths along a plurality of epipolar lines aligned at different angles with respect to each other, wherein each epipolar line in the plurality of epipolar lines is between a camera located at the reference viewpoint and an alternative view camera from a plurality of alternative view cameras in the two-dimensional array of cameras and each epipolar line is used to determine the direction of anticipated shifts of corresponding pixels with depth in alternative view images captured by the plurality of alternative view cameras, wherein disparity along a first epipolar line is greater than disparity along a second epipolar line and the projected pattern incorporates a smaller spatial pattern period in a direction corresponding to the second epipolar line;

    comparing the similarity of the corresponding pixels identified at each of the plurality of depths; and

    selecting the depth from the plurality of depths at which the identified corresponding pixels have the highest degree of similarity as a depth estimate for the given pixel location in the image from the reference viewpoint.
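
The claim describes a plane-sweep style search: for each candidate depth, the pixels expected to correspond to a reference pixel are located along the epipolar lines to the alternative-view cameras using the disparity expected at that depth, their similarity is compared, and the depth at which they agree best is selected. The sketch below illustrates that search in a deliberately simplified form and is not the patented implementation: the function name plane_sweep_depth, the rectified pinhole geometry with identical cameras, the absolute-difference cost, the nearest-neighbor sampling, and the disparity sign convention are all assumptions introduced for illustration, and the projected-texture control and complementary occlusion-zone handling recited in the claim are omitted.

    import numpy as np

    def plane_sweep_depth(ref_img, alt_imgs, baselines, focal_px, candidate_depths):
        """Estimate per-pixel depth by comparing corresponding pixels across a
        camera array at a set of candidate depths (simplified plane sweep).

        ref_img          : (H, W) grayscale reference-view image (float array)
        alt_imgs         : list of (H, W) alternative-view images
        baselines        : list of (bx, by) camera baselines in metres; each
                           baseline fixes the epipolar direction for that camera
        focal_px         : focal length in pixels (identical, rectified cameras
                           are assumed for simplicity)
        candidate_depths : 1-D array of candidate depths in metres
        """
        H, W = ref_img.shape
        ys, xs = np.mgrid[0:H, 0:W]
        best_cost = np.full((H, W), np.inf)
        best_depth = np.zeros((H, W))

        for depth in candidate_depths:
            # Accumulate dissimilarity between the reference pixel and the
            # corresponding pixel in every alternative view at this depth.
            cost = np.zeros((H, W))
            for img, (bx, by) in zip(alt_imgs, baselines):
                # Expected disparity along this camera's epipolar line:
                # the shift grows with the baseline and shrinks with depth.
                # (The sign depends on the chosen baseline convention.)
                dx = focal_px * bx / depth
                dy = focal_px * by / depth
                sx = np.clip(np.round(xs + dx).astype(int), 0, W - 1)
                sy = np.clip(np.round(ys + dy).astype(int), 0, H - 1)
                # Absolute-difference cost as a stand-in for the similarity
                # comparison recited in the claim.
                cost += np.abs(ref_img - img[sy, sx])

            # Keep the depth at which the identified corresponding pixels
            # have the highest degree of similarity (lowest cost) so far.
            better = cost < best_cost
            best_cost[better] = cost[better]
            best_depth[better] = depth

        return best_depth

A call might pass one image per camera in the array together with its baseline, for example plane_sweep_depth(ref, [img1, img2], [(0.02, 0.0), (0.0, 0.02)], 1400.0, np.linspace(0.3, 3.0, 64)); all of these values are illustrative. In the claimed system the baselines span epipolar lines at different angles, and the projected texture uses a smaller spatial pattern period along the direction whose epipolar line exhibits the smaller disparity, so that corresponding pixels remain distinguishable across the tested depths.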
