Apparatus and methods for determining the three-dimensional shape of an object using active illumination and relative blurring in two images due to defocus
First Claim
1. A method for mapping a three-dimensional structure by depth from defocus, comprising the steps of:
(a) illuminating said structure with a preselected illumination pattern;
(b) sensing at least a first image and a second image of said illuminated structure with different imaging parameters; and
(c) determining a relative blur between at least one elemental portion of said first image and at least one elemental portion of said second image which correspond to the same portion of said three-dimensional structure to identify depth of corresponding elemental portions of said three-dimensional structure.
Abstract
A method and apparatus for mapping depth of an object (22) in a preferred arrangement uses a projected light pattern to provide a selected texture to the object (22) along the optical axis (24) of observation. An imaging system (32, 34) senses first and second images of the object (22) with the projected light pattern and compares the defocus of the projected pattern in the images to determine relative depth of elemental portions of the object (22).
652 Citations
51 Claims
1. A method for mapping a three-dimensional structure by depth from defocus, comprising the steps of:
(a) illuminating said structure with a preselected illumination pattern;
(b) sensing at least a first image and a second image of said illuminated structure with different imaging parameters; and
(c) determining a relative blur between at least one elemental portion of said first image and at least one elemental portion of said second image which correspond to the same portion of said three-dimensional structure to identify depth of corresponding elemental portions of said three-dimensional structure. - View Dependent Claims (2, 3, 4, 5)
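As an illustration (not part of the claim language), the principle step (c) relies on, that the blur of each elemental portion varies with depth so the relative blur between two images taken with different imaging parameters encodes depth, can be sketched with a thin-lens model. The focal length, aperture, and sensor positions below are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Thin-lens sketch of why relative blur encodes depth: a point at distance d
# focuses at v = f*d/(d - f); a sensor at distance s from the lens sees a
# blur circle of radius r = a*|s - v|/v for aperture radius a. All numeric
# values below are illustrative assumptions.

def blur_radius(d, f, s, a):
    """Blur-circle radius for an object at depth d (metres)."""
    v = f * d / (d - f)           # ideal image distance from the lens law
    return a * abs(s - v) / v     # defocus blur on a sensor plane at distance s

f, a = 0.05, 0.01                 # 50 mm focal length, 10 mm aperture radius
s1, s2 = 0.0525, 0.0535           # two sensor planes = two "imaging parameters"

# The blur in either single image is ambiguous about depth, but the pair
# (r1, r2) changes with depth over the working range, which is what lets
# relative blur identify depth.
for d in np.linspace(0.5, 2.0, 4):
    r1, r2 = blur_radius(d, f, s1, a), blur_radius(d, f, s2, a)
    print(f"depth {d:.2f} m: blur1 {r1 * 1e6:6.1f} um, blur2 {r2 * 1e6:6.1f} um")
```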
6. A method for mapping a three-dimensional structure by depth from defocus, comprising the steps of:
(a) illuminating said structure with a preselected illumination pattern comprising a rectangular grid projected along an optical axis;
(b) sensing at least a first image and a second image of said illuminated structure from said optical axis using a constant magnification imaging system and at imaging planes with different locations with respect to a focal plane of said imaging system; and
(c) determining a relative blur between at least one elemental portion of said first image and at least one elemental portion of said second image which correspond to the same portion of said three-dimensional structure to identify depth of corresponding elemental portions of said three-dimensional structure. - View Dependent Claims (7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25)
(i) converting said sensed images into digital signals on a pixel by pixel basis; and
(ii) convolving said digital signals on a pixel by pixel basis to determine power measurement signals that correspond to the fundamental frequency of said illumination pattern at each of said pixels for each sensed scene image.
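By way of illustration only, steps (i) and (ii) above (digitizing the sensed images and convolving to measure power at the illumination pattern's fundamental frequency) might be sketched as below. The 4-pixel grid period, the cosine kernel, and the name `tuned_power` are assumptions for the sketch, not elements of the claim.

```python
import numpy as np

# Sketch of steps (i)-(ii): convolve each digitized image with a small
# separable kernel tuned to the fundamental frequency of the projected grid
# (assumed period: 4 pixels), then square to get per-pixel power.

def tuned_power(img, period=4):
    """Per-pixel power of the pattern's fundamental frequency (sketch)."""
    n = np.arange(period)
    kernel = np.cos(2 * np.pi * n / period)      # 1-D filter tuned to the grid period
    resp = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    resp = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, resp)
    return resp ** 2                              # power measurement per pixel

# A sharply focused grid yields more power than a defocus-attenuated one.
x = np.arange(32)
grid = np.cos(2 * np.pi * x / 4)[None, :] * np.cos(2 * np.pi * x / 4)[:, None]
blurred = grid * 0.5                              # stand-in for defocus attenuation
print(tuned_power(grid).mean() > tuned_power(blurred).mean())  # True
```

Because convolution is linear, a defocus-attenuated pattern yields proportionally lower power at every pixel, which is what the later per-pixel comparison exploits.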
19. The method of claim 18, wherein said measuring step further comprises:
(iii) correcting said power measurement signals for mis-registration on a pixel by pixel basis, such that any errors introduced into said power measurement signals because of misalignment between said sensing pixels of said array and said illumination pattern are corrected.
20. The method of claim 19, wherein said correcting step comprises taking the sum of the squares of said measurement signal at four neighboring pixels.
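A minimal sketch of the correction recited in claim 20, assuming 4-connected neighbours and wrap-around boundary handling (the boundary treatment is not specified by the claim):

```python
import numpy as np

# Sketch of claim 20: at each pixel, take the sum of the squares of the
# power-measurement signal at its four neighbours (up, down, left, right),
# which suppresses errors from misalignment between the sensing grid and
# the illumination pattern. Wrap-around padding is an assumption here.

def neighbor_sum_of_squares(power):
    return (np.roll(power, 1, axis=0) ** 2 + np.roll(power, -1, axis=0) ** 2 +
            np.roll(power, 1, axis=1) ** 2 + np.roll(power, -1, axis=1) ** 2)

p = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
print(neighbor_sum_of_squares(p)[1, 1])  # four neighbours of the centre: 4*1^2 = 4.0
```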
21. The method of claim 18, wherein said measuring step further comprises:
(iii) normalizing said power measurement signals on a pixel by pixel basis.
22. The method of claim 18, wherein said measuring step further comprises:
(iii) comparing said power measurement signals for one of said sensed images, on a pixel by pixel basis, with determined power measurements for a second of said sensed images to determine said depth information at each of said pixels.
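Claims 21 and 22 together amount to a normalized per-pixel comparison of the two power images. One common depth-from-defocus form, used here purely as an assumed example, is the ratio q = (P1 - P2)/(P1 + P2):

```python
import numpy as np

# Sketch of claims 21-22: compare the per-pixel power measurements of the two
# sensed images after normalization. The specific ratio below is an assumed
# example; the claims only require a pixel-by-pixel comparison.

def depth_indicator(p1, p2, eps=1e-12):
    """Normalized power comparison in [-1, 1], monotonic in relative blur."""
    return (p1 - p2) / (p1 + p2 + eps)

p_near = np.full((4, 4), 9.0)   # strong pattern power in the near-focused image
p_far = np.full((4, 4), 1.0)    # weaker pattern power in the far-focused image
print(depth_indicator(p_near, p_far)[0, 0])  # ≈ 0.8: surface close to the near plane
```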
23. The method of claim 6, wherein said determination step comprises arranging said pixel by pixel depth information as a depth map.
24. The method of claim 23, further comprising the step of displaying said depth map as a wireframe image.
25. The method of claim 13, wherein said determination step comprises arranging said pixel by pixel depth information as a depth map, further comprising the step of constructing a texture mapped three-dimensional display from said sensed brightness image and said depth map.
26. Apparatus for measuring a three-dimensional structure of a scene by depth from defocus, comprising:
(a) active illumination means for illuminating the scene with a preselected illumination pattern;
(b) sensor means, optically coupled to said illuminating means, for sensing at least a first image and a second image of the scene with differing optical or imaging parameters;
(c) depth measurement means, coupled to said sensor means, for measuring a relative blur between at least one elemental portion of said first image and at least one elemental portion of said second image which correspond to the same portion of said three-dimensional structure to identify depth of said three-dimensional structure; and
(d) scene recovery means, coupled to said measurement means, for reconstructing said three-dimensional structure of said sensed scene from said measured relative blur of said sensed images. - View Dependent Claims (27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51)
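For orientation, elements (a) through (d) of claim 26 can be exercised end to end in a small simulation. Everything below is an illustrative assumption: the two-layer scene, the attenuation factors standing in for defocus blur, and the local-power measure.

```python
import numpy as np

# End-to-end sketch of claim 26 (simulation only): a projected grid is
# "imaged" at two focus settings (modelled as depth-dependent attenuation),
# per-pixel pattern power is measured, and a normalized comparison recovers
# which of two depth layers each pixel lies on.

size, period = 32, 4
x = np.arange(size)
pattern = (np.cos(2 * np.pi * x / period)[None, :] *
           np.cos(2 * np.pi * x / period)[:, None])        # (a) illumination pattern

depth = np.zeros((size, size)); depth[:, size // 2:] = 1.0  # assumed two-layer scene

# (b) two images with differing imaging parameters: the near-focused sensor
# sees the near layer sharply (factor 1.0) and the far layer blurred (0.3);
# the far-focused sensor sees the reverse.
img_near = pattern * np.where(depth == 0, 1.0, 0.3)
img_far = pattern * np.where(depth == 0, 0.3, 1.0)

# (c) relative blur via per-pixel pattern power (local mean of squared signal)
def local_power(img, w=4):
    k = np.ones(w) / w
    p = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img ** 2)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, p)

q = (local_power(img_near) - local_power(img_far)) / (
     local_power(img_near) + local_power(img_far) + 1e-12)

# (d) scene recovery: the sign of q classifies each pixel's depth layer
recovered = (q < 0).astype(float)
accuracy = (recovered == depth).mean()
print(f"layer classification accuracy: {accuracy:.2f}")
```

Only pixels whose local window straddles the layer boundary are ambiguous; interior pixels classify correctly, so the accuracy is high but not exactly 1.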
(i) an illumination base;
(ii) a light source coupled to said illumination base; and
(iii) a spectral filter having said preselected illumination pattern coupled to said illumination base, such that light from said light source passes through said spectral filter to form said preselected illumination pattern.
29. The apparatus of claim 28, wherein said preselected illumination pattern of said spectral filter is optimized so that a small variation in the degree of defocus sensed by said sensor means results in a large variation in the relative blur measured by said depth measurement means.
30. The apparatus of claim 29, wherein said optimized illumination pattern is a rectangular grid pattern.
31. The apparatus of claim 30, wherein said optimized illumination pattern comprises a pattern having a period being substantially equal to twice said pixel width and a phase shift being substantially equal to zero with respect to said sensing grid, in two orthogonal directions.
32. The apparatus of claim 30, wherein said optimized illumination pattern comprises a pattern having a period being substantially equal to four times said pixel width and a phase shift being substantially equal to one eighth of said pixel width with respect to said sensing grid, in two orthogonal directions.
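For illustration, the two optimized patterns of claims 31 and 32 could be sampled on the sensing grid as follows; sampling at integer pixel positions and the zero-threshold binarization are assumptions of the sketch, with pixel width taken as the unit:

```python
import numpy as np

# Sketch of the optimized rectangular grid patterns of claims 31-32, built
# from a shifted fundamental applied in two orthogonal directions.

def grid_pattern(size, period, phase):
    u = np.arange(size)                            # sensing-grid coordinates
    s = np.cos(2 * np.pi * (u - phase) / period)   # shifted fundamental
    return (s[None, :] * s[:, None] > 0).astype(float)

p2 = grid_pattern(8, period=2.0, phase=0.0)        # claim 31: period 2, zero phase
p4 = grid_pattern(8, period=4.0, phase=0.125)      # claim 32: period 4, 1/8-pixel shift
print(p2[0])  # alternating checkerboard row: [1. 0. 1. 0. 1. 0. 1. 0.]
```

The period-2, zero-phase case is the checkerboard limit; the period-4, 1/8-pixel-shift case places the pattern's fundamental frequency where the sensing grid samples it without nulls.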
33. The apparatus of claim 28, wherein said light source is a Xenon lamp.
34. The apparatus of claim 28, wherein said light source is a monochromatic laser.
35. The apparatus of claim 34, wherein said sensor means further comprises:
(i) a sensor base;
(ii) first and second depth sensors, coupled to said sensor base, for sensing depth images of said scene formed by said laser light, such that said depth measurement means measure a relative blur between said sensed laser light images; and
(iii) at least one brightness sensor, coupled to said sensor base, for sensing an image of said scene formed by ambient light.
36. The apparatus of claim 26, wherein said sensor means comprises:
(i) a sensor base;
(ii) a lens, coupled to said sensor base and optically coupled to said illuminating means, for receiving scene images;
(iii) a beamsplitter, coupled to said sensor base and optically coupled to said lens, for splitting said scene images into two split scene images; and
(iv) first and second sensors, coupled to said sensor base, wherein said first sensor is optically coupled to said beamsplitter such that a first of said split scene images is incident on said first sensor and said second sensor is optically coupled to said beamsplitter such that a second of said split scene images is incident on said second sensor.
37. The apparatus of claim 36, wherein said sensor means further comprises:
(v) an optical member having an aperture, coupled to said sensor base in a position between said lens and said beamsplitter, being optically coupled to both said lens and said beamsplitter such that images received by said lens are passed through said aperture and are directed toward said beamsplitter.
38. The apparatus of claim 36, wherein said first sensor is at a position corresponding to a near focused plane in said sensed scene, and said second sensor is at a position corresponding to a far focused plane in said sensed scene.
39. The apparatus of claim 38, wherein said spectral filter includes an illumination pattern capable of generating multiple spatial frequencies for each image sensed by said first and second sensors.
40. The apparatus of claim 26, further comprising:
(e) a support member, coupled to said active illumination means and said sensor means; and
(f) a half-mirror, coupled to said support member at an optical intersection of said active illumination means and said sensor means, such that said preselected illumination pattern is reflected by said half-mirror prior to illuminating said scene, and such that said scene images pass through said half-mirror prior to being sensed by said sensor means, whereby said illumination pattern and said scene images pass through coaxial optical paths.
41. The apparatus of claim 26, further comprising:
(e) a support member, coupled to said active illumination means and said sensor means; and
(f) a half-mirror, coupled to said support member at an optical intersection of said active illumination means and said sensor means, such that said preselected illumination pattern passes through said half-mirror prior to illuminating said scene, and such that said scene images are reflected by said half-mirror prior to being sensed by said sensor means, whereby said illumination pattern and said scene images pass through coaxial optical paths.
42. The apparatus of claim 26, further comprising:
(e) a support member, coupled to said active illumination means and said sensor means; and
(f) a polarization filter, coupled to said support member at an optical intersection of said active illumination means and said sensor means, such that said preselected illumination pattern is reflected by said polarization filter prior to illuminating said scene, and such that said scene images pass through said polarization filter prior to being sensed by said sensor means, whereby said illumination pattern incident on said scene and said sensed scene images are both polarized in controlled polarization directions.
43. The apparatus of claim 27, wherein said depth measurement means further comprises:
(i) analog to digital converting means, coupled to said sensor means, for converting sensed images into digital signals on a pixel by pixel basis; and
(ii) convolving means, coupled to said analog to digital converting means, for convolving said digital signals on a pixel by pixel basis to derive power measurement signals that correspond to the fundamental frequency of said illumination pattern at each of said pixels for each sensed scene image.
44. The apparatus of claim 43, wherein said depth measurement means further comprises:
(iii) registration correction means, coupled to said convolving means, for correcting said power measurement signals for mis-registration on a pixel by pixel basis, such that any errors introduced into said power measurement signals because of misalignment between said sensing pixels of said grid and said illumination pattern are corrected.
45. The apparatus of claim 44, wherein said registration correction means further include arithmetic means for multiplying each of said power measurement signals, on a pixel by pixel basis, by the sum of the squares of said power measurement signal's four neighboring power measurement signals.
46. The apparatus of claim 43, wherein said depth measurement means further comprises:
(iii) normalizing means, coupled to said convolving means, for normalizing said power measurement signals on a pixel by pixel basis.
47. The apparatus of claim 43, wherein said depth measurement means further comprises:
(iii) comparator means, coupled to said convolving means, for comparing said power measurement signals for one of said sensed images, on a pixel by pixel basis, with determined power measurements for a second of said sensed images, to determine said depth information at each of said pixels.
48. The apparatus of claim 47, wherein said comparator means includes a look-up table.
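The look-up table of claim 48 replaces a per-pixel model inversion with a precomputed mapping from the quantized comparison value to depth. The linear calibration below is a placeholder assumption; a real table would be filled from the defocus model or a calibration step.

```python
import numpy as np

# Sketch of claim 48: depth is precomputed as a function of the normalized
# power comparison, then each pixel's measured ratio is mapped to depth by
# table look-up. The 257-entry quantization and the linear depth(ratio)
# calibration are both placeholder assumptions.

ratios = np.linspace(-1.0, 1.0, 257)     # quantized comparator input
lut = 1.0 + 0.5 * ratios                 # placeholder calibration: depth(ratio)

def lut_depth(ratio):
    idx = np.clip(np.round((ratio + 1.0) * 128).astype(int), 0, 256)
    return lut[idx]

print(float(lut_depth(np.array(0.0))))   # 1.0 at ratio 0 (placeholder calibration)
```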
49. The apparatus of claim 27, wherein said scene recovery means comprises depth map storage means, coupled to said depth measurement means, for storing derived pixel by pixel depth information for said scene as a depth map.
50. The apparatus of claim 49, further comprising:
(e) display means, coupled to said scene recovery means, for displaying said depth map as a wireframe on a bitmapped workstation.
51. The apparatus of claim 35, wherein said scene recovery means comprises three-dimensional texturemap storage means, coupled to said depth measurement means and said brightness sensor, for storing derived pixel by pixel depth information and brightness information for said scene, further comprising:
(e) display means, coupled to said scene recovery means, for displaying said three-dimensional texturemap as a wireframe on a bitmapped workstation.
Specification