Motion and disparity estimation method, image synthesis method, and apparatus for implementing same methods
First Claim
1. A disparity estimation apparatus, comprising:
correspondence determining means, for determining correspondence between a first image and a second image;
correspondence evaluation means for evaluating a confidence of correspondence determined by the correspondence determining means to produce evaluation data;
disparity data computing means for computing disparity data for the first image and the second image based upon correspondence determined by the correspondence determining means;
occlusion judging means for judging an occlusion portion of the second image which is not included in the first image, based upon the disparity data and the evaluation data;
first edge extracting means, for extracting edges of the evaluation data;
second edge extracting means, for extracting edges of the first and second images;
first selecting means for selecting at least one edge of the evaluation data which corresponds to a non-occlusion portion of the first image;
integrating means for integrating at least one edge of the first image with the selected at least one edge of the evaluation data to produce at least one non-occlusion disparity edge;
correspondence converting means, for obtaining the vicinity in the second image corresponding to the non-occlusion disparity edge produced by the integrating means, based upon the disparity data;
second selecting means, for selecting at least one edge of the second image from the occlusion portion of the second image;
third selecting means, for selecting the selected at least one edge of the second image and the vicinity produced by the correspondence converting means, to produce at least one occlusion disparity edge;
disparity supplementing means for supplementing disparity data for the occlusion portion of the second image, based upon the at least one occlusion disparity edge and the disparity data;
disparity extrapolating means, for extrapolating disparity data for the non-occlusion portion of the first image based upon the non-occlusion disparity edge and the disparity data; and
disparity integrating means, for integrating the disparity data, the supplemented disparity data, and the extrapolated disparity data.
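The claimed pipeline can be pictured with a minimal, hypothetical sketch on one-dimensional signals: SAD matching stands in for the correspondence determining means, the margin between the two lowest matching costs stands in for the evaluation data, a coverage test stands in for the occlusion judging means, and nearest-neighbour copying stands in for the disparity supplementing means (the claim's edge-guided selection and integration steps are simplified away). All function names and parameters below are illustrative, not from the patent.

```python
import numpy as np

def estimate_disparity(first, second, max_d=3, win=1):
    """Correspondence + confidence: per-pixel SAD over a (2*win+1) window;
    the margin between the two lowest costs stands in for the evaluation data."""
    n = len(first)
    f = np.pad(first.astype(float), win, mode='edge')
    s = np.pad(second.astype(float), win, mode='edge')
    disp = np.zeros(n, dtype=int)
    conf = np.zeros(n)
    for i in range(n):
        costs = np.array([
            np.abs(f[i:i + 2*win + 1]
                   - s[min(i + d, n - 1):min(i + d, n - 1) + 2*win + 1]).sum()
            for d in range(max_d + 1)])
        disp[i] = int(np.argmin(costs))
        srt = np.sort(costs)
        conf[i] = srt[1] - srt[0]          # low margin = unreliable match
    return disp, conf

def judge_occlusion(disp, n):
    """Occlusion judging: second-image samples never referenced by any
    first-image sample are treated as occluded."""
    covered = np.zeros(n, dtype=bool)
    for i, d in enumerate(disp):
        covered[min(i + d, n - 1)] = True
    return ~covered

def supplement(disp, occluded):
    """Disparity supplementing: fill occluded positions from the nearest
    non-occluded neighbour (a crude stand-in for the edge-guided filling)."""
    out = disp.copy()
    known = np.where(~occluded)[0]
    for j in np.where(occluded)[0]:
        out[j] = disp[known[np.argmin(np.abs(known - j))]]
    return out

# a bar shifted right by one sample between the two views
first = np.array([0, 0, 5, 5, 0, 0])
second = np.array([0, 0, 0, 5, 5, 0])
disp, conf = estimate_disparity(first, second)
occ = judge_occlusion(disp, len(second))
filled = supplement(disp, occ)
```

On this toy input the interior samples recover the shift of 1, one second-image sample is flagged as occluded, and its disparity is supplemented from the nearest reliable neighbour.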
Abstract
To achieve this, the invention has a base image frame memory for storing a base image, a reference image frame memory for storing a reference image, a block correlation computing circuit for computing block correlations and a confidence measure for the estimation using a plurality of block sizes, a representative pixel correlation computing circuit, and an estimation integrating computing circuit for evaluating the reliability of the result of estimation by block matching on the basis of a luminance gradient, image noise, a minimum value of an evaluation yardstick for differences between blocks, and the block size, and for integrating the results of estimation obtained with the plurality of block sizes.
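As a rough illustration of the multiple-block-size idea in the abstract, the sketch below matches a one-dimensional signal at two window sizes and keeps, per sample, the estimate whose cost margin is larger. The patent's reliability combines luminance gradient, image noise, the minimum of the difference yardstick, and block size; that combination is simplified here to the cost margin alone, and all names are hypothetical.

```python
import numpy as np

def match_with_window(base, ref, win, max_shift=3):
    """Per-sample shift estimate by SAD over a (2*win+1)-sample block,
    plus a crude reliability: the margin between the two lowest costs."""
    n = len(base)
    b = np.pad(base.astype(float), win, mode='edge')
    r = np.pad(ref.astype(float), win, mode='edge')
    shift = np.zeros(n, dtype=int)
    margin = np.zeros(n)
    for i in range(n):
        costs = np.array([
            np.abs(b[i:i + 2*win + 1]
                   - r[min(i + s, n - 1):min(i + s, n - 1) + 2*win + 1]).sum()
            for s in range(max_shift + 1)])
        shift[i] = int(np.argmin(costs))
        srt = np.sort(costs)
        margin[i] = srt[1] - srt[0]
    return shift, margin

def integrate_sizes(base, ref, wins=(1, 3)):
    """Estimate with each block size, then keep per sample the estimate
    whose reliability (margin) is highest -- the integration step."""
    results = [match_with_window(base, ref, w) for w in wins]
    shifts = np.stack([s for s, _ in results])
    margins = np.stack([m for _, m in results])
    best = np.argmax(margins, axis=0)
    return shifts[best, np.arange(len(base))]

base = np.arange(8.0)
ref = base - 2.0            # every sample moved by a shift of 2
shifts = integrate_sizes(base, ref)
```

Small windows localise well but are ambiguous on flat regions; large windows are robust but blur boundaries, which is why the abstract integrates estimates across block sizes by reliability.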
78 Citations
46 Claims
1. A disparity estimation apparatus, comprising the elements recited in the First Claim above. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19)
20. A method of estimating disparity, comprising:
determining correspondence between a first image and a second image;
evaluating a confidence of the determined correspondence to produce evaluation data;
computing disparity data for the first image and the second image based upon the determined correspondence;
judging an occlusion portion of the second image which is not included in the first image, based upon the disparity data and the evaluation data;
extracting edges of the evaluation data;
extracting edges of the first and second images;
selecting at least one edge of the evaluation data which corresponds to a non-occlusion portion of the first image;
integrating at least one edge of the first image with the selected at least one edge of the evaluation data to produce at least one non-occlusion disparity edge;
obtaining the vicinity in the second image corresponding to the produced non-occlusion disparity edge, based upon the disparity data;
selecting at least one edge of the second image from the occlusion portion of the second image;
selecting the selected at least one edge of the second image and the obtained vicinity, to produce at least one occlusion disparity edge;
supplementing disparity data for the occlusion portion of the second image, based upon the at least one occlusion disparity edge and the disparity data;
extrapolating disparity data for the non-occlusion portion of the first image based upon the non-occlusion disparity edge and the disparity data; and
integrating the disparity data, the supplemented disparity data, and the extrapolated disparity data. - View Dependent Claims (21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46)
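The edge-extraction and integration steps of the method can be pictured with a toy sketch: gradient thresholding extracts edges from an image and from the evaluation data, and "integrating" them is reduced here to a logical AND restricted to non-occluded samples. This is one illustrative reading under stated simplifications, not the patent's exact procedure; all names are hypothetical.

```python
import numpy as np

def extract_edges(signal, thresh=1.0):
    """Toy edge extractor: mark samples where the forward difference is large."""
    grad = np.abs(np.diff(np.asarray(signal, dtype=float)))
    return np.concatenate([grad > thresh, [False]])

def non_occlusion_disparity_edges(image_edges, eval_edges, occluded):
    """Integrate image edges with evaluation-data edges outside occlusions
    (the claim's integrating step, simplified to a logical AND)."""
    return image_edges & eval_edges & ~occluded

first_image = np.array([0, 0, 5, 5, 5, 0])
eval_data = np.array([0, 0, 4, 4, 0, 0])   # hypothetical confidence map
occluded = np.zeros(6, dtype=bool)          # assume no occlusions here

img_e = extract_edges(first_image)
eval_e = extract_edges(eval_data)
d_edges = non_occlusion_disparity_edges(img_e, eval_e, occluded)
```

Only the sample where an image edge coincides with an evaluation-data edge survives, matching the claim's intent that disparity edges lie where both image content and match confidence change.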
Specification