Motion field modeling and estimation using motion transform
First Claim
1. An apparatus for generating an image motion vector field which describes a motion of individual image components of a first image frame and corresponding image components of a second image frame in a sequence of image frames, the apparatus comprising:
(a) a first frame memory for receiving said first image frame;
(b) a second frame memory for receiving a second image frame; and
(c) an optical flow calculator configured for generating an image motion vector field by iteratively comparing a predicted image with said second image frame, said predicted image being produced based upon said first image frame and image gradients generated according to a motion estimate that is produced according to a transform function using estimated transform coefficients, wherein said estimated transform coefficients are estimated based upon a previously determined image gradient.
Abstract
A motion transform is implemented for calculating the motion field between two images. An optical flow calculator is configured for generating an image motion vector field by iteratively comparing a predicted image with a second image frame, the predicted image being produced based upon a first image frame and image gradients generated according to a motion estimate that is produced according to a transform function using transform coefficients. The transform coefficients are estimated based upon a previously determined image gradient.
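The abstract's iterative loop can be sketched in miniature. The following illustrative Python is not from the patent (the function name and 1-D setup are my own assumptions); it estimates a single global shift, the simplest possible "transform coefficient" (the DC term of a motion transform), by repeatedly warping the first frame, computing image gradients, and updating the coefficient from the residual error:

```python
import numpy as np

def estimate_global_shift(frame1, frame2, iters=20, tol=1e-6):
    """Illustrative 1-D sketch (not the patented method itself): estimate one
    motion coefficient (a global shift) by iterating
    predict -> gradient -> residual -> coefficient update."""
    x = np.arange(frame1.size, dtype=float)
    d = 0.0                                       # motion estimate (DC coefficient)
    for _ in range(iters):
        predicted = np.interp(x - d, x, frame1)   # warp frame1 by current motion
        grad = np.gradient(predicted)             # image gradient of the prediction
        err = frame2 - predicted                  # residual error
        denom = grad @ grad
        if denom == 0.0 or np.mean(err ** 2) < tol:
            break                                 # converged (or featureless image)
        d -= (grad @ err) / denom                 # least-squares coefficient update
    return d
```

On a smooth test signal shifted by 1.5 samples, this recovers a shift close to 1.5 within a few iterations.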
38 Claims
1. An apparatus for generating an image motion vector field which describes a motion of individual image components of a first image frame and corresponding image components of a second image frame in a sequence of image frames, the apparatus comprising:
(a) a first frame memory for receiving said first image frame;
(b) a second frame memory for receiving a second image frame; and
(c) an optical flow calculator configured for generating an image motion vector field by iteratively comparing a predicted image with said second image frame, said predicted image being produced based upon said first image frame and image gradients generated according to a motion estimate that is produced according to a transform function using estimated transform coefficients, wherein said estimated transform coefficients are estimated based upon a previously determined image gradient. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22)
(a) calculating a residual error by taking a difference between said predicted image and said second image frame; and
(b) determining if said residual error is less than a predetermined threshold.
10. The apparatus according to claim 1, wherein said optical flow calculator is configured to impose a zig-zag sequential ordering of said estimated transform coefficients.
11. The apparatus according to claim 1, wherein said optical flow calculator is configured to gradually add coefficients during an iterative process.
12. The apparatus according to claim 11, wherein said optical flow calculator is further configured to initialize said added coefficients.
13. The apparatus according to claim 1, wherein said optical flow calculator is configured to discard a coefficient adaptively during iterative comparing if an incremental change between a current estimated coefficient value and a previous estimated coefficient value has a magnitude smaller than a threshold value, such that the current estimated coefficient value attains a final coefficient value.
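Claim 13's adaptive discard rule can be illustrated with a short sketch (the function name is my own; the patent does not prescribe this code): a coefficient whose incremental change falls below a threshold is frozen, so its current value becomes its final value and it is excluded from further updates.

```python
def freeze_converged(current, previous, frozen, threshold=1e-3):
    """Mark coefficients whose incremental change has magnitude below
    `threshold`; a frozen coefficient keeps its current value as final."""
    return [f or abs(c - p) < threshold
            for c, p, f in zip(current, previous, frozen)]
```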
14. The apparatus according to claim 1, wherein said image motion vector field comprises a plurality of motion vectors whose values approximate the movement of corresponding image components between said first image frame and said second image frame.
15. The apparatus according to claim 1, wherein said optical flow calculator is configured to exclude image gradients whose value is less than a threshold value.
16. The apparatus according to claim 1, wherein said optical flow calculator is configured to sub-sample prescribed values including individual image component values, image gradient values, and residual error values.
17. The apparatus according to claim 16, wherein said optical flow calculator is configured to increase the resolution of said sub-sampling during the iterative comparing, wherein during early iterations fewer transform coefficients are used and fewer prescribed values are sub-sampled, and during later iterations more transform coefficients are used and more prescribed values are sub-sampled.
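One way such a coarse-to-fine schedule could look is sketched below (illustrative only; the function name and the linear ramp are my own assumptions, not taken from the specification): each iteration is assigned a coefficient count and a sample count that grow toward their maxima.

```python
def coarse_to_fine_schedule(total_iters, max_coeffs, max_samples):
    """Hypothetical schedule: early iterations use fewer transform
    coefficients and a sparser sub-sampling; later iterations use more
    of both, up to the given maxima."""
    plans = []
    for i in range(1, total_iters + 1):
        frac = i / total_iters                       # fraction of the way through
        plans.append((max(1, round(frac * max_coeffs)),
                      max(1, round(frac * max_samples))))
    return plans
```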
18. The apparatus according to claim 1, wherein said optical flow calculator is configured to partition a motion field into a plurality of smaller motion fields creating a plurality of reconstructed images, and create a composite reconstructed image by combining said reconstructed images.
19. The apparatus according to claim 18, wherein said reconstructed images overlap.
20. The apparatus according to claim 1, wherein said optical flow calculator is further configured to calculate transform coefficients for at least one of the following:
(a) a discrete cosine transform;
(b) a discrete Fourier transform;
(c) a Haar transform;
(d) a KL transform; and
(e) a wavelet transform.
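As a concrete illustration of option (a), a motion field can be represented by a handful of coefficients of an orthonormal DCT-II basis. A minimal sketch follows (helper names are my own; the patent does not fix this particular normalization):

```python
import numpy as np

def dct_basis(N, K):
    """First K orthonormal DCT-II basis vectors of length N, shape (K, N)."""
    n = np.arange(N)
    k = np.arange(K)[:, None]
    B = np.cos(np.pi * k * (2 * n + 1) / (2 * N))
    B[0] *= np.sqrt(1.0 / N)      # DC row scaling for orthonormality
    B[1:] *= np.sqrt(2.0 / N)     # AC row scaling
    return B

def motion_from_coeffs(coeffs, N):
    """Reconstruct a 1-D motion field as a linear combination of DCT basis
    functions weighted by the transform coefficients."""
    return dct_basis(N, len(coeffs)).T @ np.asarray(coeffs, dtype=float)
```

Because the basis is orthonormal, a constant motion field is described by the single DC coefficient, which is the kind of compactness that motivates modeling motion in a transform domain.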
21. The apparatus according to claim 5, wherein said coefficient estimator further includes a lookup table to save basis function values at fixed sampling points.
22. The apparatus according to claim 1, wherein said optical flow calculator further includes a global motion estimator, wherein said global motion estimator generates a global estimate and said optical flow calculator models only motion differences between said global estimate and a reference motion field.
23. A method for generating an image motion vector field comprising the steps of:
(a) receiving a first image frame having individual image components;
(b) receiving a second image frame having corresponding image components;
(c) initializing an image gradient;
(d) generating said image motion vector field by iteratively:
(i) estimating transform coefficients from said individual image components and said image gradient according to a transform coefficient function;
(ii) calculating a motion field according to said estimated transform coefficients;
(iii) calculating image gradients according to said motion field;
(iv) generating a predicted image frame according to said motion field and first image frame;
(v) calculating a residual error by taking a difference between said predicted image and said second image frame;
(vi) determining if said residual error is less than a predetermined threshold, and accordingly if said predicted image has converged;
(vii) if said predicted image has converged, ending said iterations; and
(e) outputting said image motion vector field. - View Dependent Claims (24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38)
(a) gradually adding transform coefficients; and
(b) initializing said added transform coefficients.
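The method steps above can be sketched end to end. The following illustrative Python is my own 1-D rendering (the DCT parameterization and the least-squares coefficient update are assumptions; the claim does not mandate them), with comments keyed to steps (i) through (vii):

```python
import numpy as np

def dct_basis(N, K):
    """First K orthonormal DCT-II basis vectors of length N, shape (K, N)."""
    n = np.arange(N)
    k = np.arange(K)[:, None]
    B = np.cos(np.pi * k * (2 * n + 1) / (2 * N))
    B[0] *= np.sqrt(1.0 / N)
    B[1:] *= np.sqrt(2.0 / N)
    return B

def estimate_motion(frame1, frame2, K=3, iters=30, tol=1e-8):
    """Sketch of the claimed loop: iterate coefficient estimation, motion
    field and gradient computation, prediction, and a convergence test."""
    N = frame1.size
    x = np.arange(N, dtype=float)
    B = dct_basis(N, K)
    c = np.zeros(K)                                  # initialize coefficients
    for _ in range(iters):
        v = B.T @ c                                  # (ii) motion field
        predicted = np.interp(x - v, x, frame1)      # (iv) predicted frame
        grad = np.gradient(predicted)                # (iii) image gradients
        err = frame2 - predicted                     # (v) residual error
        if np.mean(err ** 2) < tol:                  # (vi)/(vii) convergence
            break
        # (i) least-squares update of the transform coefficients from the
        # image gradients and the residual
        c -= np.linalg.lstsq((grad * B).T, err, rcond=None)[0]
    return B.T @ c                                   # (e) motion vector field
```

On a smooth frame pair related by a small shift, the returned motion field reproduces the second frame when used to warp the first.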
30. The method according to claim 23, wherein said step of generating said image motion vector field further includes the step of discarding a coefficient adaptively if the incremental change between said current estimated coefficient value and said previous estimated coefficient value has a magnitude smaller than a predetermined threshold value, thereby making the current estimated coefficient value a final coefficient value.
31. The method according to claim 23, wherein said image motion vector field comprises a plurality of motion vectors whose values approximate the movement of corresponding image components between said first image frame and said second image frame.
32. The method according to claim 23, wherein said step of generating said image motion vector field further includes the step of excluding image gradients whose value is less than a threshold value, thereby eliminating those image gradients from any further processing.
33. The method according to claim 23, wherein said individual image component values, said image gradient values, and said residual error values are sub-sampled, thereby excluding all values that are not in the set of sub-sampled values.
34. The method according to claim 33, wherein the resolution of said sub-sampling may increase during the iterative process, whereby during early iterations fewer coefficients are used and fewer values are sampled, and during later iterations more coefficients are used and more values are sampled.
35. The method according to claim 23, further including the steps of:
(a) partitioning said motion field into a plurality of smaller motion fields creating a plurality of separate reconstructed images; and
(b) generating a composite reconstructed image by combining said separate reconstructed images;
thereby reducing the number of coefficients required to describe the motion field.
36. The method according to claim 35, wherein said separate reconstructed images may overlap.
37. The method according to claim 23, wherein said transform coefficient function may be one of the following:
(a) a discrete cosine transform;
(b) a discrete Fourier transform;
(c) a Haar transform;
(d) a KL transform; and
(e) a wavelet transform.
38. The method according to claim 23, wherein said step of estimating transform coefficients further includes the steps of:
(a) calculating basis function values at fixed sampling points;
(b) saving said basis function values in a lookup table; and
(c) using said saved basis function values for transform coefficient estimates.
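Steps (a) through (c) above amount to caching basis-function values. A minimal sketch (the class name and DCT-II choice are my own assumptions): the table is computed once at fixed sampling points and reused for every coefficient estimate, avoiding repeated cosine evaluations.

```python
import numpy as np

class BasisLUT:
    """Precompute orthonormal DCT-II basis values at fixed sampling points
    and reuse the saved table for coefficient estimates."""
    def __init__(self, N, K):
        n = np.arange(N)
        k = np.arange(K)[:, None]
        self.table = np.cos(np.pi * k * (2 * n + 1) / (2 * N))  # (a) calculate
        self.table[0] *= np.sqrt(1.0 / N)
        self.table[1:] *= np.sqrt(2.0 / N)                      # (b) save

    def project(self, signal):
        """(c) estimate transform coefficients from the saved values."""
        return self.table @ signal
```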
Specification