Context constrained novel view interpolation
First Claim
1. A computer program product comprising at least one non-transient computer-readable medium storing one or more sequences of instructions, wherein execution of the one or more sequences of instructions by one or more processors causes the one or more processors to generate a new view of a scene by performing the steps comprising:
(a) accessing a plurality of images of the scene, each image in said plurality of images having a different view of said scene, one image in said plurality of images being a reference image and the remaining images in said plurality of images constituting a set of images;
(b) identifying feature points in said reference image and in each of the images in said set of images, said feature points being individual pixels characterized to be distinguishable from other pixels;
(c) identifying feature points in said reference image that match feature points in said set of images as determined by the feature points' pixel characterizations, each pair of matched feature points constituting corresponding feature points;
(d) forming a separate set of corresponding feature points between the reference image and each of the images in the set of images, the separate sets of corresponding feature points together forming a set of correspondences;
(e) detecting edge pixels in the reference image;
(f) adjusting at least one selected set of corresponding feature points by:
(i) for each selected edge pixel taken from the detected edge pixels in the reference image, defining a window around the selected edge pixel in the reference image and identifying a set of local correspondences, said local correspondences including corresponding feature points of the selected set of corresponding feature points that are within the window, computing a transformation of the selected edge pixel and its set of local correspondences, and using the transformation on the corresponding feature points of each corresponding image in said set of images as defined by the selected set of corresponding feature points to identify a matching pixel in the corresponding image that matches the selected edge pixel;
(ii) checking the validity of the selected edge pixel and its matching pixel, and responsive to the validity being acceptable, adding the selected edge pixel and its matching pixel to the set of correspondences; and
(g) using the set of correspondences to generate the new view of the scene.
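Steps (b)–(d) above amount to descriptor-based feature matching between the reference image and each image in the set. The claim does not fix a particular matcher; as a minimal illustrative sketch (the function name, descriptor arrays, and ratio-test threshold are my own assumptions, not from the patent), nearest-neighbour matching with a distinctiveness test might look like:

```python
import numpy as np

def match_features(desc_ref, desc_img, ratio=0.8):
    """Match reference descriptors to another image's descriptors by
    nearest-neighbour distance with a ratio (distinctiveness) test.

    desc_ref: (N, D) array of reference-image feature descriptors.
    desc_img: (M, D) array of descriptors from one image in the set.
    Returns a list of (ref_index, img_index) corresponding feature points.
    """
    matches = []
    for i, d in enumerate(desc_ref):
        dists = np.linalg.norm(desc_img - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Accept only distinctive matches: the best candidate must
        # clearly beat the runner-up.
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

Running this once per image in the set yields the separate sets of corresponding feature points of step (d), whose union forms the initial set of correspondences.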
Abstract
Aspects of the present invention include systems and methods for novel view interpolation. In embodiments, feature correspondences and geometrical contexts are used to find additional correspondences based on the assumption of a local linear transformation. The accuracy and the number of correspondence matches may be improved by iterative refinement. Having obtained a set of correspondences, a novel view image can be generated.
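The "local linear transformation" assumption can be illustrated concretely: for an edge pixel, fit a least-squares 2-D affine transform to the local correspondences inside its window, then apply that transform to predict the matching pixel in the other image (step (f)(i) of claim 1). A hedged sketch, with all names my own and no claim to match the patent's actual solver:

```python
import numpy as np

def local_affine(ref_pts, img_pts):
    """Least-squares 2-D affine transform mapping ref_pts -> img_pts.

    ref_pts, img_pts: (K, 2) arrays of the local correspondences that
    fall inside the window around a selected edge pixel (K >= 3).
    Returns a 2x3 matrix A such that img ~= A @ [x, y, 1].
    """
    ref_pts = np.asarray(ref_pts, dtype=float)
    img_pts = np.asarray(img_pts, dtype=float)
    ones = np.ones((len(ref_pts), 1))
    X = np.hstack([ref_pts, ones])           # (K, 3) homogeneous coords
    A, *_ = np.linalg.lstsq(X, img_pts, rcond=None)
    return A.T                                # (2, 3)

def transfer_edge_pixel(edge_xy, A):
    """Apply the local transform to predict the matching pixel."""
    x, y = edge_xy
    return A @ np.array([x, y, 1.0])
```

With at least three non-collinear local correspondences the fit is exact for a truly affine local motion; otherwise it is the least-squares best fit, which is where iterative refinement can help.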
18 Claims
1. (Independent claim; recited in full under "First Claim" above.) - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
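Step (f)(ii) of claim 1 requires a validity check before an edge pixel and its matching pixel are added to the set of correspondences. The claim does not fix the criterion; one plausible test (an assumption on my part, not the patent's stated method) is forward-backward consistency — map the edge pixel into the other image and back, and accept only if the round trip returns close to where it started:

```python
def is_valid_match(edge_xy, forward, backward, tol=1.5):
    """Forward-backward consistency check for a candidate correspondence.

    forward:  maps a reference-image pixel into the corresponding image.
    backward: maps a corresponding-image pixel back into the reference.
    The pair is accepted only if the round trip lands within `tol`
    pixels of the original edge pixel.
    """
    fx, fy = forward(edge_xy)
    # Round-trip through the other image.
    bx, by = backward((fx, fy))
    return (bx - edge_xy[0]) ** 2 + (by - edge_xy[1]) ** 2 <= tol ** 2
```

Here `forward` and `backward` could be the local affine transforms estimated in each direction for the window around the edge pixel.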
11. A system for generating a novel view of a scene, the system comprising:
one or more interfaces for receiving a reference image of the scene and a set of images of the scene, said reference image and said set of images taken at different views of the scene;
a data storage for storing the set of images; and
one or more processors and at least one computer-readable medium storing one or more sequences of instructions, wherein execution of the one or more sequences of instructions by one or more processors causes the one or more processors to generate a novel view of the scene by performing the steps comprising:
(b) identifying feature points in said reference image and in each of the images in said set of images, said feature points being individual pixels characterized to be distinguishable from other pixels;
(c) identifying feature points in said reference image that match feature points in said set of images as determined by the feature points' pixel characterizations, each pair of matched feature points constituting corresponding feature points;
(d) forming a separate set of corresponding feature points between the reference image and each of the images in the set of images;
(e) detecting edge pixels in the reference image;
(f) adjusting at least one selected set of corresponding feature points by:
(i) for each selected edge pixel taken from the detected edge pixels in the reference image, defining a window around the selected edge pixel in the reference image and identifying a set of local correspondences, said local correspondences including corresponding feature points of the selected set of corresponding feature points that are within the window, computing a transformation of the selected edge pixel and its set of local correspondences, and using the transformation on the corresponding feature points of each corresponding image in said set of images as defined by the selected set of corresponding feature points to identify a matching pixel in the corresponding image that matches the selected edge pixel;
(ii) checking the validity of the selected edge pixel and its matching pixel, and responsive to the validity being acceptable, adding the selected edge pixel and its matching pixel to the set of correspondences; and
(g) using the set of correspondences to generate the new view of the scene. - View Dependent Claims (12, 13, 14, 15, 16, 17, 18)
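Step (g) in both independent claims uses the final set of correspondences to generate the new view. In the simplest reading, matched pixel positions can be linearly interpolated to place scene points in a view between the reference camera (t=0) and the other camera (t=1). This is only a sketch of the geometric part (a full renderer would also warp and blend pixel colours, which the claims leave open):

```python
def interpolate_view_points(ref_pts, img_pts, t):
    """Linearly interpolate matched pixel positions for a novel view.

    ref_pts, img_pts: lists of (x, y) corresponding pixel positions.
    t: interpolation parameter, 0 = reference view, 1 = other view.
    Returns the interpolated (x, y) position of each correspondence.
    """
    return [((1 - t) * rx + t * ix, (1 - t) * ry + t * iy)
            for (rx, ry), (ix, iy) in zip(ref_pts, img_pts)]
```

A denser set of correspondences — which is precisely what the edge-pixel adjustment of step (f) provides — gives the interpolated view more anchor points and fewer holes to fill.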
Specification