Target tracking method and device therefor
Abstract
In a target tracking method employing a block matching algorithm, pixels commonly included in the target, background, or foreground areas of consecutive frames are excluded from the calculation and only pixels in transition are taken into account. First, a first rectangular gate substantially including a target to be tracked is formed in an image of a first frame. Also, a second rectangular gate is formed in an image of a second frame temporally sequential to the first frame. Then, pixels commonly included in moving regions in the first and second rectangular gates are discriminated, and a block matching level between the first and second rectangular gates is calculated by using only the pixels of the moving regions. Afterwards, the second rectangular gate is changed within a predetermined area and the block matching level is recalculated. Subsequently, the block matching level values are compared to determine the second rectangular gate which results in a maximum block matching level value, and a positional difference between the first rectangular gate and the second rectangular gate resulting in the maximum block matching level value is determined as a movement vector of the target.
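The abstract's masked block matching can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the threshold value, and the SAD-style matching level are assumptions (the patent only speaks of "predetermined" limits and does not fix the metric).

```python
import numpy as np

def transition_mask(gate1, gate2, threshold=20):
    """Mark pixels whose intensity changes between the two gates.

    The threshold is illustrative; the patent only requires a
    'predetermined' set of limits."""
    diff = gate2.astype(np.int32) - gate1.astype(np.int32)
    return np.abs(diff) > threshold

def block_matching_level(gate1, gate2, mask):
    """Matching level computed over the masked (transition) pixels only.

    Negative sum of absolute differences, so a larger value means a
    better match; the choice of metric is an assumption."""
    if not mask.any():
        return 0.0
    d = gate1.astype(np.int32) - gate2.astype(np.int32)
    return -float(np.abs(d[mask]).sum())
```

Restricting the sum to the transition mask is the point of the method: pixels that stay in the target (or stay in the background) across both frames contribute nothing to discriminating candidate gate positions, so they are skipped.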
29 Claims
1. A method for tracking a target, comprising the steps of:
(a) forming a first tracking window at a first position in a first image, said first tracking window substantially including said target;
(b) forming a second tracking window at a first position in a second image temporally sequential to said first image;
(c) discriminating a first plurality of pixels included in a first plurality of regions of said first and second tracking windows, and utilizing block matching to calculate a first block matching value between said first and second tracking windows by analyzing only said first plurality of pixels of said first plurality of regions;
(d) modifying said second tracking window in a predetermined manner to thereby form a third tracking window at a second position in said second image, said second position in said second image being separately located from said first position in said second image;
(e) discriminating a second plurality of pixels included in a second plurality of regions of said first and third tracking windows, and utilizing block matching to calculate a second block matching value between said first and third tracking windows by analyzing only said second plurality of pixels of said second plurality of regions; and
(f) comparing said first block matching value with said second block matching value, selecting a final tracking window corresponding to a maximum block matching value, and determining a positional difference between said first tracking window and said final tracking window resulting in the maximum block matching value as a movement vector of said target, said final tracking window being selected from one among said second tracking window and said third tracking window.
3. The method of claim 1, wherein said step (c) further comprises the steps of:
(c1) identifying a third plurality of pixels included in a first target area in said first tracking window, a first background area in said first tracking window, a second target area in said second tracking window, and a second background area in said second tracking window, said first target area being separately located from said first background area, and said second target area being separately located from said second background area;
(c2) identifying said first plurality of pixels from among said third plurality of pixels, said first plurality of pixels including only transition pixels;
(c3) said transition pixels further comprising:
pixels included in said first target area and not included in said second target area; and
pixels included in said first background area and not included in said second background area; and
(c4) utilizing block matching to calculate said first block matching value between said first and second tracking windows by analyzing pixels including only said first plurality of pixels.
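When the background area is the complement of the target area within each gate, steps (c1)-(c3) reduce to the symmetric difference of the two target masks. A minimal sketch (the boolean-mask encoding and the complement assumption are mine; the claim treats the areas as given):

```python
import numpy as np

def transition_pixels(target1, target2):
    """Transition pixels per steps (c1)-(c3): pixels in the first gate's
    target area but not the second's, plus pixels in the first gate's
    background area but not the second's.

    Assumes the background is the complement of the target within each
    gate, which the claim leaves implicit."""
    bg1, bg2 = ~target1, ~target2
    return (target1 & ~target2) | (bg1 & ~bg2)
```

Under the complement assumption the two clauses collapse to an exclusive-or of the target masks: a pixel is a transition pixel exactly when its target/background membership changes between frames.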
4. The method of claim 3, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in said second target area when said frame difference index of said first pixel is within a first predetermined set of limits.
5. The method of claim 3, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in said second background area when said frame difference index of said first pixel is within a first predetermined set of limits.
6. The method of claim 3, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in said first target area when said frame difference index of said first pixel is within a first predetermined set of limits.
7. The method of claim 3, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in said first background area when said frame difference index of said first pixel is within a first predetermined set of limits.
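Claims 4 through 7 each assign a pixel to one of the four areas according to the band its frame difference index falls in. A toy classifier follows; the bands and the mapping from band to area are assumptions (the claims only require "a first predetermined set of limits" per area, and the mapping depends on whether the target is brighter or darker than the background):

```python
def frame_difference_index(i1, i2):
    """FDI per claims 4-7: the intensity of the pixel in the second
    image minus the intensity of the same pixel in the first image."""
    return int(i2) - int(i1)

def classify(fdi, limits=(20, 255)):
    """Illustrative classification for a target brighter than the
    background. A strongly positive FDI means the target arrived at
    this pixel in frame 2 (second target area); a strongly negative
    FDI means it was here in frame 1 (first target area). The band
    edges are assumptions."""
    lo, hi = limits
    if lo <= fdi <= hi:
        return "second target area"
    if -hi <= fdi <= -lo:
        return "first target area"
    return "unchanged"
```

Pixels classified as "unchanged" are exactly the ones the method excludes from the block matching calculation.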
8. The method of claim 1, wherein said step (c) further comprises the steps of:
(c1) identifying a third plurality of pixels included in a first target area in said first tracking window, a first background area in said first tracking window, a second target area in said second tracking window, and a second background area in said second tracking window;
(c2) identifying said first plurality of pixels from among said third plurality of pixels, said first plurality of pixels including only transition pixels;
(c3) said transition pixels further comprising:
pixels included in said first target area and not included in said second target area; and
pixels included in said first background area and not included in said second background area;
(c4) identifying a fourth plurality of pixels from among said third plurality of pixels, said fourth plurality of pixels corresponding to impulse noise; and
(c5) utilizing block matching to calculate said first block matching value between said first and second tracking windows by analyzing pixels including only said first plurality of pixels and excluding said fourth plurality of pixels.
9. The method of claim 8, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in said second target area when said frame difference index of said first pixel is within a first predetermined set of limits.
10. The method of claim 8, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in said second background area when said frame difference index of said first pixel is within a first predetermined set of limits.
11. The method of claim 8, wherein a first noise test pixel selected from among said third plurality of pixels corresponds to impulse noise when a frame difference index of said first noise test pixel is within a first predetermined set of limits and frame difference indices of a predetermined quantity of pixels near said first noise test pixel are within a second predetermined set of limits.
12. The method of claim 11, wherein said predetermined quantity of pixels corresponds to eight.
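The impulse-noise test of claims 11 and 12 (a pixel whose own index indicates change while all eight neighbours indicate no change) might be implemented as below. The threshold values and the exclusion of border pixels are assumptions:

```python
import numpy as np

def impulse_noise_mask(fdi, change_thresh=20, still_thresh=10):
    """Impulse-noise test of claims 11-12: a pixel is impulse noise when
    its own frame-difference index exceeds the 'change' limits while the
    eight surrounding indices all lie within the 'no change' limits.
    Thresholds are illustrative; border pixels are skipped."""
    h, w = fdi.shape
    noise = np.zeros((h, w), dtype=bool)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if abs(fdi[y, x]) <= change_thresh:
                continue
            neigh = fdi[y - 1:y + 2, x - 1:x + 2].copy()
            neigh[1, 1] = 0  # ignore the centre pixel itself
            if np.all(np.abs(neigh) <= still_thresh):
                noise[y, x] = True
    return noise
```

A genuinely moving target changes a connected patch of pixels, so an isolated changed pixel with eight still neighbours is far more likely to be sensor noise than motion; excluding it keeps the matching level from being corrupted.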
13. The method of claim 1, wherein said step (c) further comprises the steps of:
(c1) identifying a third plurality of pixels, said third plurality of pixels including pixels located in a first target area, a first background area, and a first foreground area in said first tracking window, and also including pixels located in a second target area, a second background area, and a second foreground area in said second tracking window;
(c2) identifying said first plurality of pixels from among said third plurality of pixels, said first plurality of pixels including only transition pixels;
(c3) said transition pixels further comprising:
pixels included in said first target area and not included in said second target area;
pixels included in said first background area and not included in said second background area; and
pixels included in said first foreground area and not included in said second foreground area; and
(c4) utilizing block matching to calculate said first block matching value between said first and second tracking windows by analyzing pixels including only said first plurality of pixels.
14. The method of claim 13, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in one area in said first tracking window selected from among said first target area, said first background area, and said first foreground area, when said frame difference index of said first pixel is within a first predetermined set of limits.
15. The method of claim 13, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in one area in said second tracking window selected from among said second target area, said second background area, and said second foreground area, when said frame difference index of said first pixel is within a first predetermined set of limits.
16. The method of claim 1, wherein said step (c) further comprises the steps of:
(c1) identifying a third plurality of pixels, said third plurality of pixels including pixels located in a first target area, a first background area, and a first foreground area in said first tracking window, and also including pixels located in a second target area, a second background area, and a second foreground area in said second tracking window;
(c2) identifying said first plurality of pixels from among said third plurality of pixels, said first plurality of pixels including only transition pixels;
(c3) said transition pixels further comprising:
pixels included in said first target area and not included in said second target area;
pixels included in said first background area and not included in said second background area; and
pixels included in said first foreground area and not included in said second foreground area;
(c4) identifying a fourth plurality of pixels from among said third plurality of pixels, said fourth plurality of pixels corresponding to impulse noise; and
(c5) utilizing block matching to calculate said first block matching value between said first and second tracking windows by analyzing pixels including only said first plurality of pixels and excluding said fourth plurality of pixels.
17. The method of claim 16, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in one area in said first tracking window selected from among said first target area, said first background area, and said first foreground area, when said frame difference index of said first pixel is within a first predetermined set of limits.
18. The method of claim 16, wherein said step (c1) further comprises the steps of:
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image; and
identifying said first pixel as being included in one area in said second tracking window selected from among said second target area, said second background area, and said second foreground area, when said frame difference index of said first pixel is within a first predetermined set of limits.
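In the three-area variant of claims 13 through 18, every pixel carries one of three area labels per gate, and a transition pixel is simply one whose label differs between the two gates. A sketch with an illustrative single-label-per-pixel encoding (my assumption; the claims treat the areas as given):

```python
def transition_pixels_3(labels1, labels2):
    """Transition pixels for the target/background/foreground variant:
    a pixel whose area label differs between the first and second gates.
    'T', 'B', 'F' single-character labels are an illustrative encoding."""
    return [i for i, (a, b) in enumerate(zip(labels1, labels2)) if a != b]
```

This generalises the two-area symmetric difference: whichever of the three areas a pixel leaves, it necessarily enters another, so "in the first X area and not in the second X area" for any X is equivalent to a label change.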
19. The method of claim 16, wherein a first noise test pixel selected from among said third plurality of pixels corresponds to impulse noise when a frame difference index of said first noise test pixel is within a first predetermined set of limits and frame difference indices of a predetermined quantity of pixels near said first noise test pixel are within a second predetermined set of limits.
20. The method of claim 19, wherein said predetermined quantity of pixels corresponds to eight.
25. The method of claim 1, wherein said first tracking window has a rectangular shape.
26. The method of claim 1, wherein said second tracking window has a rectangular shape.
27. The method of claim 1, wherein said first plurality of regions of said first and second tracking windows further comprise a target area in said first tracking window and a background area in said second tracking window.
21. An apparatus for tracking a target, comprising:
a frame memory unit receiving an image signal of a first frame, buffering said image signal for one frame period, and outputting an image signal of a second frame delayed by one frame period;
an image difference generator unit receiving said image signals of said first and second frames, generating a first rectangular gate substantially including a target to be tracked in said image signal of said first frame, forming a second rectangular gate in said image signal of said second frame, and calculating an image difference between said first and second rectangular gates;
a moving region generator unit generating first and second moving templates by excluding pixels corresponding to one area selected from a target area, a background area, and a foreground area, from said first and second rectangular gates, respectively, based on said image difference output by said image difference generator unit; and
a block matching unit for calculating block matching levels between said first moving template and a plurality of second moving templates, identifying one of said plurality of second moving templates which results in a maximum block matching level value, and determining the spatial displacement of said second rectangular gate which corresponds to said second moving template resulting in said maximum block matching level value from said first rectangular gate as a motion vector of said target.
a logical AND gate receiving frame difference indices of a predetermined number of pixels around a current pixel to perform a logical AND operation with respect to said indices; and
a logical OR gate receiving a frame difference index of said current pixel through one input terminal thereof and output data of said logical AND gate through another input terminal thereof to perform a logical OR operation.
24. The apparatus of claim 23, wherein said logical AND gate corresponds to an eight-input logical AND gate.
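Reading a frame-difference index bit of 1 as "pixel changed", the gate network recited above can be emulated bit by bit. The interpretation is an assumption; under it, the circuit keeps every changed pixel and additionally fills an isolated unchanged pixel whose eight neighbours all changed:

```python
def moving_region_bit(center_index, neighbour_indices):
    """Gate-level sketch of the claimed circuit: an eight-input AND over
    the neighbours' frame-difference indices, whose output is OR-ed with
    the current pixel's own index. The 1 = 'changed' reading is an
    assumption not stated in the claim."""
    assert len(neighbour_indices) == 8
    and_out = all(bool(b) for b in neighbour_indices)  # eight-input AND gate
    return bool(center_index) or and_out               # two-input OR gate
```

This is the hardware counterpart of the hole-filling step: a single still pixel completely surrounded by moving pixels is promoted into the moving template rather than punching a hole in it.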
28. A method for tracking a target, comprising the steps of:
forming a first tracking gate at a first position in a first image, said first tracking gate substantially including said target;
forming a second tracking gate at a first position in a second image temporally sequential to said first image;
discriminating a first plurality of pixels commonly included in a first plurality of regions of said first and second tracking gates;
utilizing block matching to calculate a first block matching value between said first and second tracking gates by analyzing only said first plurality of pixels of said first plurality of regions;
modifying said second tracking gate in a predetermined manner to thereby form a third tracking gate at a second position in said second image, said second position in said second image being distinguishable from said first position in said second image;
identifying a second plurality of pixels commonly included in a second plurality of regions of said first and third tracking gates, and using block matching to calculate a second block matching value between said first and third tracking gates by analyzing only said second plurality of pixels of said second plurality of regions;
comparing said first block matching value with said second block matching value, selecting a final tracking gate corresponding to a maximum block matching value, and determining a positional difference between said first tracking gate and said final tracking gate resulting in the maximum block matching value as a movement vector of said target, said final tracking gate being selected from one among said second tracking gate and said third tracking gate;
said first and second positions in said second image corresponding to first localities near said first position in said first image when said first image corresponds to an initial image; and
said first and second positions in said second image corresponding to second localities near a new position shifted from said first position in said first image by a movement vector of said target earlier estimated when said first image does not correspond to said initial image.
29. The method of claim 28, further comprising the steps of:
identifying a third plurality of pixels included in a first target area in said first tracking gate, a first background area in said first tracking gate, a second target area in said second tracking gate, and a second background area in said second tracking gate, said first target area being separately located from said first background area, and said second target area being separately located from said second background area;
calculating a frame difference index of a first pixel by subtracting an intensity function value of said first pixel in said first image from an intensity function value of said first pixel in said second image, and identifying said first pixel as being included in said second target area when said frame difference index of said first pixel is within a first predetermined set of limits;
identifying said first plurality of pixels from among said third plurality of pixels, said first plurality of pixels including only transition pixels;
said transition pixels further comprising:
pixels included in said first target area and not included in said second target area; and
pixels included in said first background area and not included in said second background area; and
utilizing block matching to calculate said first block matching value between said first and second tracking gates by analyzing pixels including only said first plurality of pixels.
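Putting the search of claims 1 and 28 together: candidate gates near the previous position are each scored with the transition-pixel-only matching level, and the displacement of the best-scoring gate becomes the movement vector. A compact sketch; the window geometry, threshold, and SAD-style score are assumptions:

```python
import numpy as np

def estimate_motion(frame1, frame2, top_left, size, search=1, threshold=20):
    """Score each candidate gate position in frame2 against the first
    gate in frame1, using only transition pixels, and return the
    displacement (dy, dx) of the best match as the movement vector."""
    y0, x0 = top_left
    h, w = size
    ref = frame1[y0:y0 + h, x0:x0 + w].astype(np.int32)
    best, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = frame2[y0 + dy:y0 + dy + h,
                          x0 + dx:x0 + dx + w].astype(np.int32)
            diff = cand - ref
            mask = np.abs(diff) > threshold  # transition pixels only
            # perfect alignment leaves no transition pixels; higher is better
            score = -int(np.abs(diff[mask]).sum()) if mask.any() else 0
            if best is None or score > best:
                best, best_vec = score, (dy, dx)
    return best_vec
```

The candidate whose gate aligns with the moved target produces an empty transition set (maximum score), and its offset from the first gate is exactly the positional difference the claims define as the movement vector.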
Specification