Nonfeedback-based machine vision methods for determining a calibration relationship between a camera and a moveable object
First Claim
1. A method of determining a calibration relationship between a reference frame of motion of an object and an image coordinate system (hereinafter “image reference frame”) of each of plural image acquisition devices that generate images of the object, the method comprising the steps of:
A. coupling plural targets to the object such that (i) the location and orientation of the targets on the object are not necessarily known, and (ii) the locations and orientations of the targets are any of fixed and known relative to one another;
B. placing the object at plural known locations/orientations relative to the motion reference frame of the object such that at least one target is in a field of view of each respective image acquisition device for plural ones of those locations/orientations, where “location/orientation” refers to at least one of location and orientation, and where “locations/orientations” refers to the plural thereof;
C. generating with each image acquisition device an image of the object at each of the plural known locations/orientations relative to the motion reference frame of the object, and determining a location/orientation of the target in each of those images relative to the image reference frame of the image acquisition device;
D. determining a calibration relationship between the reference frame of motion of the object and the image reference frame of each of the image acquisition devices as a function of (i) the known locations/orientations of the object relative to the motion reference frame of the object and (ii) the locations of the targets in the corresponding images of the object relative to the image reference frame of the image acquisition device, and wherein step (D) comprises the step of determining the calibration relationship by minimizing an error between known locations/orientations of the object and estimates of those locations/orientations based on candidate calibration relationships.
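The error minimization of step (D) can be reduced to linear least squares under simplifying assumptions. The sketch below is illustrative only, not the patented implementation: it assumes a 2-D model with a single target at an unknown offset t on the object, stage poses (mx, my, θ), and an affine map (A, b) from the motion frame to one device's image frame. Substituting u = A·t and v = A·J·t (J a 90° rotation) makes the model linear in the unknowns; all function and variable names are hypothetical.

```python
import numpy as np

def rot(theta):
    # 2-D rotation matrix for the stage orientation theta.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def calibrate(poses, img_pts):
    """Estimate an affine calibration (A, b) and the unknown target
    offset t from known stage poses and observed image points.

    Model: image = A @ (rot(theta) @ t + [mx, my]) + b.
    Since rot(theta) @ t = cos(theta)*t + sin(theta)*(J @ t), setting
    u = A @ t and v = A @ J @ t linearizes the model:
        image = A @ m + b + cos(theta)*u + sin(theta)*v.
    Each image coordinate then depends linearly on five unknowns.
    """
    rows, rhs_x, rhs_y = [], [], []
    for (mx, my, th), (ix, iy) in zip(poses, img_pts):
        rows.append([mx, my, 1.0, np.cos(th), np.sin(th)])
        rhs_x.append(ix)
        rhs_y.append(iy)
    D = np.array(rows)
    # Solve the two image coordinates as independent least-squares fits.
    px, *_ = np.linalg.lstsq(D, np.array(rhs_x), rcond=None)
    py, *_ = np.linalg.lstsq(D, np.array(rhs_y), rcond=None)
    A = np.array([px[:2], py[:2]])
    b = np.array([px[2], py[2]])
    u = np.array([px[3], py[3]])   # u = A @ t
    t = np.linalg.solve(A, u)      # recover the target offset
    return A, b, t
```

At least five poses with varying position and orientation are needed for the design matrix to have full rank; this mirrors the claim's requirement of plural known locations/orientations.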
Abstract
A method is provided for determining a calibration relationship between a reference frame of motion of an object and a reference frame of a camera that generates images of the object. The method includes the steps of coupling a target to the object and placing the object at each of plural locations and orientations that are known with respect to the motion reference frame of the object. The location of the target(s) with respect to the object need not be known. An image of the object and target is generated while the object is at each of those locations/orientations. From each of those images, the method determines the location/orientation of the target with respect to the reference frame of the camera. The method then calls for determining the calibration relationship between the reference frame of motion of the object and the camera reference frame as a function of the locations/orientations of the object with respect to the motion reference frame of the object and the locations/orientations of the target in the corresponding images with respect to the reference frame of the camera.
10 Claims
1. A method of determining a calibration relationship between a reference frame of motion of an object and an image coordinate system (hereinafter “image reference frame”) of each of plural image acquisition devices that generate images of the object, the method comprising the steps of:
A. coupling plural targets to the object such that (i) the location and orientation of the targets on the object are not necessarily known, and (ii) the locations and orientations of the targets are any of fixed and known relative to one another;
B. placing the object at plural known locations/orientations relative to the motion reference frame of the object such that at least one target is in a field of view of each respective image acquisition device for plural ones of those locations/orientations, where “location/orientation” refers to at least one of location and orientation, and where “locations/orientations” refers to the plural thereof;
C. generating with each image acquisition device an image of the object at each of the plural known locations/orientations relative to the motion reference frame of the object, and determining a location/orientation of the target in each of those images relative to the image reference frame of the image acquisition device;
D. determining a calibration relationship between the reference frame of motion of the object and the image reference frame of each of the image acquisition devices as a function of (i) the known locations/orientations of the object relative to the motion reference frame of the object and (ii) the locations of the targets in the corresponding images of the object relative to the image reference frame of the image acquisition device, and wherein step (D) comprises the step of determining the calibration relationship by minimizing an error between known locations/orientations of the object and estimates of those locations/orientations based on candidate calibration relationships. - View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A method of determining a calibration relationship between a reference frame of a motion stage and an image coordinate system (hereinafter “image reference frame”) of each of plural image acquisition devices that generate images of that stage, the method comprising the steps of:
A. placing a calibration plate on the motion stage, the calibration plate including plural targets, where each target (i) has a known location (wxi, wyi) on the calibration plate, where at least one target is in the field of view of each image acquisition device, and where the motion stage is at a first known location/orientation (mx, my, θ)j, where (j)=1, relative to the reference frame of motion of the object, and generating with each of the image acquisition devices a first image of the motion stage;
B. determining a location (ix, iy)i,j, where (j)=1, of the target in each of those first images relative to the image reference frame of the respective image acquisition device;
C. moving the motion stage to one or more other known locations/orientations (mx, my, θ)j, where (j)>1, relative to the reference frame of motion of the object such that at least one target is in the field of view of each image acquisition device, and generating with each image acquisition device additional images of the motion stage;
D. determining locations (ix, iy)i,j, where (j)>1, of the target in each of those additional images relative to the image reference frame of the respective image acquisition device; and
E. determining a calibration relationship between the reference frame of the motion stage and the image reference frames of the image acquisition devices as a function of (i) the known locations/orientations of the motion stage (mx, my, θ)j, where (j)≧1, relative to the reference frame of motion of the object, (ii) the locations (ix, iy)i,j, where (j)≧1, of the targets in the corresponding images relative to the image reference frame of the respective image acquisition device, and (iii) the known locations (wxi, wyi) of the targets on the calibration plate. - View Dependent Claims (9)
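Because the target locations (wxi, wyi) on the plate are known in this claim, step (E) admits a direct linear solution. The sketch below is a hypothetical single-device illustration, not the claimed implementation: it assumes a 2-D affine map (A, b) from the stage's motion frame to the image frame and the rigid-motion model world = R(θ)·(wx, wy) + (mx, my); all names are invented for illustration.

```python
import numpy as np

def rot(theta):
    # 2-D rotation matrix for the stage orientation theta.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def calibrate_stage(plate_pts, stage_poses, img_pts):
    """Fit an affine map (A, b) from stage coordinates to image
    coordinates, given known target locations (wx_i, wy_i) on the
    plate and known stage poses (mx, my, theta)_j.

    img_pts[j][i] is the observed image location of target i in the
    image acquired at stage pose j.
    """
    world, obs = [], []
    for (mx, my, th), pts in zip(stage_poses, img_pts):
        R = rot(th)
        for (wx, wy), (ix, iy) in zip(plate_pts, pts):
            # Known plate location mapped into the motion frame.
            world.append(R @ np.array([wx, wy]) + np.array([mx, my]))
            obs.append([ix, iy])
    # Homogeneous design matrix [x, y, 1]; one linear solve gives A and b.
    W = np.hstack([np.array(world), np.ones((len(world), 1))])
    sol, *_ = np.linalg.lstsq(W, np.array(obs), rcond=None)  # 3 x 2
    A = sol[:2].T
    b = sol[2]
    return A, b
```

With several targets per image and plural stage poses the system is overdetermined, so the least-squares fit also averages out per-image measurement noise.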
10. A method of determining a calibration relationship between a reference frame of motion of an object and an image coordinate system (hereinafter “image reference frame”) of each of plural image acquisition devices that generate images of the object, the method comprising the steps of:
A. coupling plural targets to the object such that (i) the location and orientation of the targets on the object are not necessarily known, and (ii) the locations and orientations of the targets are any of fixed and known relative to one another;
B. placing the object at plural known locations/orientations relative to the motion reference frame of the object such that at least one target is in a field of view of each respective image acquisition device for plural ones of those locations/orientations, where “location/orientation” refers to at least one of location and orientation, and where “locations/orientations” refers to the plural thereof;
C. generating with each image acquisition device an image of the object at each of the plural known locations/orientations relative to the motion reference frame of the object, and determining a location/orientation of the target in each of those images relative to the image reference frame of the image acquisition device;
D. determining a calibration relationship between the reference frame of motion of the object and the image reference frame of each of the image acquisition devices as a function of (i) the known locations/orientations of the object relative to the motion reference frame of the object and (ii) the locations of the targets in the corresponding images of the object relative to the image reference frame of the image acquisition device.
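The claim covers plural image acquisition devices, and each device's calibration relationship can be determined independently from its own pose/observation pairs. The sketch below shows that per-device orchestration under the same simplified 2-D model as before (unknown target offset t, affine map (A, b), linearizing substitution u = A·t, v = A·J·t); it is an illustrative assumption, not the patented method, and all names are hypothetical.

```python
import numpy as np

def fit_device(poses, img_pts):
    """Fit one device's calibration (A, b) and target offset t from
    stage poses (mx, my, theta) and that device's observed image
    points, via the linear model image = A@m + b + cos(th)*u + sin(th)*v."""
    D = np.array([[mx, my, 1.0, np.cos(th), np.sin(th)]
                  for (mx, my, th) in poses])
    obs = np.array(img_pts)                        # N x 2
    sol, *_ = np.linalg.lstsq(D, obs, rcond=None)  # 5 x 2
    A = sol[:2].T                                  # affine linear part
    b = sol[2]                                     # affine offset
    t = np.linalg.solve(A, sol[3])                 # sol[3] = u = A @ t
    return A, b, t

def calibrate_all(observations):
    """observations maps a device id to (poses, img_pts); returns one
    calibration relationship per image acquisition device."""
    return {dev: fit_device(poses, pts)
            for dev, (poses, pts) in observations.items()}
```

Each device only needs targets in its own field of view for plural poses, which is why the claim requires at least one target visible to each respective device rather than every target visible to every device.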
Specification