Fast high-accuracy multi-dimensional pattern inspection
Abstract
A method and apparatus are provided for identifying differences between a stored pattern and a matching image subset, where variations in pattern position, orientation, and size do not give rise to false differences. The invention is also a system for analyzing an object image with respect to a model pattern so as to detect flaws in the object image. The system includes extracting pattern features from the model pattern; generating a vector-valued function using the pattern features to provide a pattern field; extracting image features from the object image; evaluating each image feature, using the pattern field and an n-dimensional transformation that associates image features with pattern features, so as to determine at least one associated feature characteristic; and using at least one feature characteristic to identify at least one flaw in the object image. The invention can find at least two distinct kinds of flaws: missing features and extra features. It provides pattern inspection that is faster and more accurate than known prior-art methods by using a stored pattern that represents an ideal example of the object to be found and inspected, and that can be translated, rotated, and scaled to arbitrary precision much faster than digital image re-sampling. Because no re-sampling is performed, there are no pixel-grid quantization errors to cause false differences between the pattern and image that would otherwise limit inspection performance.
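The flow the abstract describes (a stored pattern of boundary points, a pose transformation, and a per-feature evaluation that flags extra features as flaws) can be sketched as follows. The rigid 2-D pose, the nearest-point distance test, and all function names are illustrative assumptions, not the patented pattern-field method:

```python
import math

def apply_pose(point, pose):
    """Map an image point into pattern coordinates with a rigid
    2-D pose (tx, ty, theta); the patent's poses may also scale."""
    tx, ty, theta = pose
    x, y = point
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y + tx, s * x + c * y + ty)

def nearest_pattern_distance(point, pattern):
    """Distance from a transformed image feature to the closest
    pattern boundary point."""
    return min(math.dist(point, p) for p in pattern)

def evaluate(image_points, pattern, pose, max_dist=1.0):
    """Split image features into matched (explained by the model)
    and extra (potential flaws)."""
    matched, extra = [], []
    for pt in image_points:
        d = nearest_pattern_distance(apply_pose(pt, pose), pattern)
        (matched if d <= max_dist else extra).append(pt)
    return matched, extra

pattern = [(float(i), 0.0) for i in range(5)]    # model: a straight edge
image = [(0.1, 0.05), (2.0, -0.1), (3.0, 4.0)]  # last point is spurious
matched, extra = evaluate(image, pattern, pose=(0.0, 0.0, 0.0))
# extra now holds the flaw candidate (3.0, 4.0)
```

Because the model is geometric rather than a pixel template, the pose can be applied at arbitrary precision, which is the quantization-error point the abstract makes.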
38 Claims
1. A method for inspecting an object in an image, the object having an expected shape and a true pose in the image, the method comprising:
storing a model pattern, said model pattern including a geometric description of the expected shape of said object, said geometric description including a plurality of pattern boundary points;
providing a starting pose that represents an initial estimate of the true pose of said object in said image;
detecting in said image a plurality of image boundary points;
using said starting pose and said model pattern to determine an evaluation of reliability of at least some of said plurality of image boundary points, and to determine a corresponding position along a boundary of said model pattern corresponding to at least some of said plurality of image boundary points;
computing a new pose using said starting pose, the plurality of image boundary points, evaluations and corresponding positions along the boundary of said model pattern, the new pose representing a more refined estimate of the true pose of said object in said image;
using the evaluation of reliability of at least some of said plurality of image boundary points to identify features in the image not present in the model pattern.

2. The method of claim 1, further comprising:
comparing the evaluation of reliability of at least some of said plurality of image boundary points to a threshold value; and
when the evaluation of reliability of an image boundary point is less than the threshold value, identifying the image boundary point as being associated with a feature in the image that is not present in the model pattern.
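Claim 1 computes a new pose from the reliable correspondences between image boundary points and positions on the model boundary. A minimal sketch of one refinement step, assuming a translation-only pose and optional per-point weights; the patented method refines full multi-dimensional poses, so every name and formula here is illustrative:

```python
def refine_translation(pairs, start_tx, start_ty, weights=None):
    """One weighted least-squares update of a translation-only pose.

    pairs: ((ix, iy), (px, py)) correspondences between image boundary
    points and their corresponding positions along the model boundary.
    """
    if weights is None:
        weights = [1.0] * len(pairs)
    wsum = sum(weights)
    # Average weighted residual between model positions and the
    # image points as placed by the starting pose.
    dx = sum(w * (px - (ix + start_tx))
             for w, ((ix, _iy), (px, _py)) in zip(weights, pairs)) / wsum
    dy = sum(w * (py - (iy + start_ty))
             for w, ((_ix, iy), (_px, py)) in zip(weights, pairs)) / wsum
    return start_tx + dx, start_ty + dy

pairs = [((0.0, 0.0), (1.0, 0.0)), ((2.0, 0.0), (3.0, 0.0))]
new_pose = refine_translation(pairs, 0.0, 0.0)
# the model sits one unit to the right, so the pose shifts to (1.0, 0.0)
```

Iterating this step with re-evaluated correspondences gives the progressively "more refined estimate of the true pose" that claim 1 describes.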
3. The method of claim 1, wherein each evaluation of reliability of at least some of said plurality of image boundary points is a probability value that ranges from 0 to 1.
4. The method of claim 1, further comprising:
using corresponding positions along a boundary of said model pattern corresponding to at least some of said plurality of image boundary points to identify features in the model pattern not present in the image.
5. The method of claim 4, wherein positions along the boundary of said model pattern are associated with a probability value indicating the likelihood that the model pattern feature was present in the image.
6. The method of claim 5, wherein the probability value ranges from 0 to 1.
7. The method of claim 1, further comprising:
computing an aggregate clutter value that is a measure of features found in the image that do not correspond to model pattern features.
8. The method of claim 1, further comprising:
computing an aggregate clutter value using the evaluations of reliability of respective image boundary points associated with features in the image not present in the model pattern.
9. The method of claim 1, further comprising:
computing a coverage value that is a measure of the portion of the model pattern to which corresponding image boundary points have been found.
10. The method of claim 1, further comprising:
computing an error value that is a measure of the degree of match between the model pattern and the image.
11. The method of claim 10, wherein the error value is a root-mean-square error value.
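Claims 7-11 name three aggregate match statistics: clutter (image evidence the model does not explain), coverage (model boundary that found image support), and an RMS error. Plausible formulas, offered as assumptions rather than the patent's definitions:

```python
import math

def clutter(extra_reliabilities, n_image_points):
    """Claim 8: aggregate clutter built from the reliabilities of image
    boundary points not explained by the model, normalized by image size."""
    return sum(extra_reliabilities) / n_image_points

def coverage(matched_pattern_indices, n_pattern_points):
    """Claim 9: portion of the model boundary to which corresponding
    image boundary points have been found."""
    return len(set(matched_pattern_indices)) / n_pattern_points

def rms_error(distances):
    """Claims 10-11: root-mean-square point-to-boundary distance."""
    return math.sqrt(sum(d * d for d in distances) / len(distances))

score_clutter = clutter([0.5, 0.5], 4)        # 0.25
score_coverage = coverage([0, 1, 1, 2], 5)    # 0.6
err = rms_error([3.0, 4.0])                   # sqrt(12.5)
```

Low clutter, high coverage, and low RMS error together indicate a clean match; each statistic isolates a different failure mode (extra features, missing features, geometric distortion).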
12. The method of claim 1, wherein
said evaluation includes an indication that the image boundary point is one of: “reliable” and “unreliable”.
13. The method of claim 12, wherein
said new pose is computed using only those image boundary points wherein said evaluation includes an indication that the image boundary point is “reliable”.
14. The method of claim 13, wherein
said evaluation includes a weighting factor.
15. The method of claim 14, wherein
said new pose is computed so that each of said plurality of image boundary points is influential upon computing said new pose in accordance with said weighting factor.
16. The method of claim 12, wherein
said evaluation is influenced by a distance between each image boundary point and the corresponding position along a boundary of said model pattern.
17. The method of claim 16, wherein
said indication is “unreliable” when said distance is greater than a parameter, and is “reliable” when said distance is less than said parameter.
18. The method of claim 12, wherein
each image boundary point includes a direction estimate; and
said evaluation is influenced by the direction estimate of said each image boundary point.
19. The method of claim 18, wherein
an angle between the direction estimate and a direction to the corresponding position of said image boundary point is computed; and
said indication is “unreliable” when said angle is greater than a parameter, and “reliable” when said angle is less than said parameter.
20. The method of claim 12, wherein
each image boundary point includes a magnitude estimate; and
said evaluation is influenced by the magnitude estimate of said each image boundary point.
21. The method of claim 20, wherein
said indication is “not reliable” when said magnitude is less than a parameter, and “reliable” when said magnitude is greater than said parameter.
22. The method of claim 14, wherein
said evaluation is influenced by a distance between each image boundary point and the corresponding position along a boundary of said model pattern.
23. The method of claim 22, wherein
said weighting factor is 0 when said distance is greater than a first parameter, is a positive constant when said distance is less than a second parameter, and assumes values between 0 and said positive constant when said distance is between said first parameter and said second parameter.
24. The method of claim 14, wherein
each image boundary point includes a direction estimate; and
said evaluation is influenced by the direction estimate of said each image boundary point.
25. The method of claim 24, wherein
an angle between said direction estimate and a direction to the corresponding position of said image boundary point is computed; and
said weighting factor is 0 when said angle is greater than a first parameter, is a positive constant when said angle is less than a second parameter, and assumes values between 0 and said positive constant when said angle is between said first parameter and said second parameter.
26. The method of claim 14, wherein
each image boundary point includes a magnitude estimate; and
said evaluation is influenced by the magnitude estimate of said each image boundary point.
27. The method of claim 26, wherein
said weighting factor is 0 when said magnitude is less than a first parameter, is a positive constant when said magnitude is greater than a second parameter, and assumes values between 0 and said positive constant when said magnitude is between said first parameter and said second parameter.
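Claims 23, 25, and 27 describe the same piecewise weighting profile applied to different quantities: zero beyond a first parameter, a positive constant within a second, and intermediate values between the two. A sketch of that profile as a linear ramp (the ramp shape and parameter names are assumptions; the claims only require values between the extremes):

```python
def weight(value, lo, hi, w_max=1.0):
    """Piecewise weighting factor: w_max when value <= lo (the 'second
    parameter'), 0 when value >= hi (the 'first parameter'), and a
    linear ramp in between."""
    if value >= hi:
        return 0.0
    if value <= lo:
        return w_max
    return w_max * (hi - value) / (hi - lo)
```

Applied to distance (claim 23) or angle (claim 25) the ramp runs this way; for magnitude (claim 27) the sense inverts, e.g. `weight(-magnitude, -hi_mag, -lo_mag)`, since strong gradients should weigh more.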
28. The method of claim 1, wherein
each image boundary point includes an estimate of gradient direction; and
each pattern boundary point includes an estimate of gradient direction indicative of an expected shape and polarity of said object;
the method further comprising:
choosing to consider or ignore polarity in determining the evaluation.
29. The method of claim 28, wherein
choosing to consider or ignore polarity includes making an identical choice for each evaluation.
30. The method of claim 28, wherein
said model pattern includes information indicating, for a plurality of portions of the model pattern boundary, whether image boundary points corresponding to each portion should consider or ignore polarity in determining the evaluation; and
choosing to consider or ignore polarity includes making separate choices for each evaluation based on said information.
31. The method of claim 1, wherein
said model pattern includes a plurality of codes corresponding to various regions of said model pattern;
said starting pose and said model pattern are used to associate a code with each of said plurality of image boundary points; and
said evaluation is influenced by said code.
32. The method of claim 31, wherein
said evaluation includes a weighting factor;
said code includes a weight modifier;
said weighting factor is modified according to said weight modifier; and
said new pose is computed so that each of said plurality of image boundary points is influential in accordance with said weighting factor.
33. The method of claim 31, wherein
said code includes a “don't care” value.
34. The method of claim 31, wherein
said code includes an “expect blank” value.
35. The method of claim 31, wherein
said code includes an “eval only” value.
36. The method of claim 1, wherein using said starting pose and said model pattern to determine an evaluation of reliability of at least some of said plurality of image boundary points includes:
comparing a gradient magnitude of an image boundary point to a threshold.
37. The method of claim 36, wherein comparing includes using a fuzzy comparator to compare the gradient magnitude of the image boundary point to a fuzzy threshold.
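Claim 37's fuzzy comparator can be sketched as a soft threshold: instead of a hard pass/fail on gradient magnitude (claim 36), it returns a score in [0, 1] that ramps across a band around the threshold. The linear ramp and the band-width parameter are assumptions:

```python
def fuzzy_greater(value, threshold, softness):
    """Fuzzy 'greater than': 0.0 well below the threshold, 1.0 well
    above it, and a linear ramp across the band of width 2*softness."""
    lo, hi = threshold - softness, threshold + softness
    if value <= lo:
        return 0.0
    if value >= hi:
        return 1.0
    return (value - lo) / (hi - lo)
```

A soft comparison avoids the instability of a hard threshold, where boundary points whose gradient magnitude hovers near the cutoff would flicker between “reliable” and “unreliable” from image to image.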
38. The method of claim 36, wherein
said model pattern includes a plurality of codes corresponding to various regions of said model pattern;
said starting pose and said model pattern are used to associate a code with each of said plurality of image boundary points; and
said evaluation is zero if said code includes a “don't care” value.
Specification