Nonfeedback-based machine vision method for determining a calibration relationship between a camera and a moveable object
1 Assignment
0 Petitions
Abstract
A method is provided for determining a calibration relationship between a reference frame of motion of an object and a reference frame of a camera that generates images of the object. The method includes the steps of coupling a target to the object and placing the object at each of plural locations and orientations that are known with respect to the motion reference frame of the object. The location of the target(s) with respect to the object need not be known. An image of the object and target is generated while the object is at each of those locations/orientations. From each of those images, the method determines the location/orientation of the target with respect to the reference frame of the camera. The method then calls for determining the calibration relationship between the reference frame of motion of the object and the camera reference frame as a function of the locations/orientations of the object with respect to the motion reference frame of the object and the locations/orientations of the target in the corresponding images with respect to the reference frame of the camera.
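The abstract's key point is that the target's location on the object need not be known in advance. The sketch below illustrates why, under simplifying assumptions of my own (the patent's actual equations appear in the claims but are not reproduced in this text): the camera is modeled as an unknown affine map (A, b) and the target sits at an unknown point t on the object. Writing the stage rotation as R(θ)t = cos(θ)·t + sin(θ)·Jt, with J a 90-degree rotation, makes all unknowns (A, b, t) appear linearly, so known stage poses plus observed image points suffice to recover everything, including t, by least squares:

```python
import numpy as np

# Synthetic ground truth (all values are illustrative assumptions).
rng = np.random.default_rng(0)
A_true = np.array([[2.0, 0.1], [-0.2, 1.5]])   # camera affine map (unknown to the solver)
b_true = np.array([3.0, -4.0])                  # camera offset (unknown to the solver)
t_true = np.array([7.5, -2.0])                  # target position on the object (unknown)

thetas = rng.uniform(-1.0, 1.0, 6)              # known stage orientations theta_j
ms = rng.uniform(-10.0, 10.0, (6, 2))           # known stage translations (mx, my)_j

def R(th):
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s], [s, c]])

# Observed image points: p_j = A (R(theta_j) t + m_j) + b
obs = np.array([A_true @ (R(th) @ t_true + m) + b_true
                for th, m in zip(thetas, ms)])

# Substituting q1 = A t and q2 = A J t gives
#   p_j = cos(theta_j) q1 + sin(theta_j) q2 + A m_j + b,
# linear in the 10 unknowns [q1x,q1y,q2x,q2y,a11,a12,a21,a22,bx,by].
rows, rhs = [], []
for (th, m), p in zip(zip(thetas, ms), obs):
    c, s = np.cos(th), np.sin(th)
    rows.append([c, 0, s, 0, m[0], m[1], 0, 0, 1, 0])   # x-coordinate equation
    rows.append([0, c, 0, s, 0, 0, m[0], m[1], 0, 1])   # y-coordinate equation
    rhs.extend(p)
x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)

q1 = x[0:2]
A_est = x[4:8].reshape(2, 2)
t_est = np.linalg.solve(A_est, q1)   # target position falls out as a by-product
```

As in the abstract, the target's position on the object is recovered as a by-product of the calibration rather than being measured beforehand.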
287 Citations
3 Claims
1. A method of determining a calibration relationship between a reference frame of a motion stage and a reference frame of each of plural image acquisition devices that generate images of that stage, the method comprising the steps of:

A. placing a calibration plate on the motion stage, the calibration plate including plural targets, where each target (i) has a known location (wxi,wyi) on the calibration plate, where at least one target is in the field of view of each image acquisition device, and where the motion stage is at a first known location/orientation (mx,my,θ)j, where (j)=1, and generating with each of the image acquisition devices a first image of the motion stage;

B. determining a location (ix,iy)ij, where (j)=1, of the target in each of those first images;

C. moving the motion stage to one or more other known locations/orientations (mx,my,θ)j, where (j)>1, such that at least one target is in the field of view of each image acquisition device, and generating with each image acquisition device additional images of the motion stage;

D. determining locations (ix,iy)ij, where (j)>1, of the target in each of those additional images; and

E. determining a calibration relationship between the reference frame of the motion stage and the reference frames of the image acquisition devices as a function of (i) the known locations/orientations of the motion stage (mx,my,θ)j, where (j)≧1, (ii) the locations (ix,iy)ij, where (j)≧1, of the targets in the corresponding images, and (iii) the known locations (wxi,wyi) of the targets on the calibration plate,

wherein step (E) comprises the step of determining the calibration relationship by minimizing an error Eij between known locations/orientations of the motion stage and estimates thereof in accord with the mathematical relationship ##EQU10## where

(mx,my,θ)j represents the known motion stage locations/orientations,
(ix,iy)ij represents locations of the targets in the images,
(wxi,wyi) represents the known locations of each target on the calibration plate,
Gi(u,v), Hi(u,v) represent lens distortion correction functions mapping coordinates (u,v) in an image to an orthonormal image coordinate system (ix,iy),
Pxi,Pyi represent a position of target (i) in motion stage coordinates when the motion stage is at (x=0,y=0,θ=0),
αi, βi represent pixel width and height for camera field of view i,
U and V represent the cosine and sine, respectively, of each image acquisition device's coordinate frame,
(Oxi,Oyi) represents a physical position corresponding to a specified location for camera field of view i.
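Because the claimed relationship ##EQU10## is not reproduced in this text, the sketch below only illustrates the general shape of step E under simplifying assumptions of my own: lens distortion (Gi, Hi) is ignored, and each camera's map from image coordinates (ix,iy) to stage coordinates, which the claim factors into αi, βi, U, V, and (Oxi,Oyi), is collapsed into a single image-to-world affine transform fit by linear least squares from a target observed at several known stage poses:

```python
import numpy as np

def R(th):
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s], [s, c]])

# Known plate geometry and known stage poses (illustrative values).
w = np.array([1.0, 2.0])                    # target location (wxi, wyi) on the plate
poses = [(0.0, np.array([0.0, 0.0])),       # (theta_j, (mx, my)_j), j = 1..5
         (0.3, np.array([5.0, -1.0])),
         (-0.4, np.array([2.0, 3.0])),
         (0.8, np.array([-3.0, 4.0])),
         (-0.9, np.array([1.0, -6.0]))]

# Assumed stage model: world position of the target at pose j.
world = np.array([R(th) @ w + m for th, m in poses])

# Simulate one camera: image points are generated as the inverse of the
# image->world affine map [A_true | O_true] that calibration should recover.
A_true = np.array([[0.02, 0.001], [-0.0005, 0.018]])  # stands in for alpha, beta, U, V
O_true = np.array([100.0, 50.0])                      # stands in for (Oxi, Oyi)
image = np.array([np.linalg.solve(A_true, p - O_true) for p in world])

# Step E (simplified): least-squares fit of the image->world affine map.
X = np.column_stack([image, np.ones(len(image))])     # rows [ix, iy, 1]
M, *_ = np.linalg.lstsq(X, world, rcond=None)         # 3x2 result; M.T = [A | O]
A_est, O_est = M[:2].T, M[2]
```

With plural cameras, this fit is simply repeated per field of view i, each camera yielding its own map into the one shared motion-stage frame.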
2. A method of determining a calibration relationship between a reference frame of a motion stage and a reference frame of each of plural image acquisition devices that generate images of that stage, the method comprising the steps of:

A. placing a calibration plate on the motion stage, the calibration plate including plural targets, where each target (i) has a known location (wxi,wyi) on the calibration plate, where at least one target is in the field of view of each image acquisition device, and where the motion stage is at a first known location/orientation (mx,my,θ)j, where (j)=1, and generating with each of the image acquisition devices a first image of the motion stage;

B. determining a location (ix,iy)ij, where (j)=1, of the target in each of those first images;

C. moving the motion stage to one or more other known locations/orientations (mx,my,θ)j, where (j)>1, such that at least one target is in the field of view of each image acquisition device, and generating with each image acquisition device additional images of the motion stage;

D. determining locations (ix,iy)ij, where (j)>1, of the target in each of those additional images; and

E. determining a calibration relationship between the reference frame of the motion stage and the reference frames of the image acquisition devices as a function of (i) the known locations/orientations of the motion stage (mx,my,θ)j, where (j)≧1, (ii) the locations (ix,iy)ij, where (j)≧1, of the targets in the corresponding images, and (iii) the known locations (wxi,wyi) of the targets on the calibration plate,

wherein step (E) comprises the step of determining the calibration relationship by minimizing an error Eij between known locations/orientations of the motion stage and estimates thereof in accord with the following mathematical relationship ##EQU11## where

(mx,my,θ)j represents the known motion stage locations/orientations,
(ix,iy)ij represents locations of the targets in the images,
(wxi,wyi) represents the known locations of each target on the calibration plate,
(xc,yc,θc) represent unknown locations/orientations of the calibration plate with respect to the motion stage,
αi, βi represent pixel width and height for camera field of view i,
U and V represent the cosine and sine, respectively, of each image acquisition device's coordinate frame,
(Oxi,Oyi) represents a physical position corresponding to a specified location for camera field of view i.
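Claim 2 differs from claim 1 in that the calibration plate's own pose (xc,yc,θc) on the motion stage is unknown. As a hedged sketch of that sub-problem (not the claim's ##EQU11## relationship, which is not reproduced here): once target positions have been measured in stage coordinates, the unknown rigid plate pose can be recovered from the known plate layout (wxi,wyi) with a standard 2-D Procrustes/Kabsch fit:

```python
import numpy as np

def rigid_pose_2d(plate_pts, stage_pts):
    """Least-squares rigid transform with stage_pts ~ R @ plate_pts + t (2-D Kabsch)."""
    cp, cs = plate_pts.mean(axis=0), stage_pts.mean(axis=0)
    H = (plate_pts - cp).T @ (stage_pts - cs)   # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    Rm = Vt.T @ np.diag([1.0, d]) @ U.T
    return Rm, cs - Rm @ cp

# Known plate layout (wxi, wyi) and a synthetic unknown pose (xc, yc, theta_c).
plate = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 3.0]])
theta_c, tc = 0.3, np.array([5.0, -2.0])        # illustrative ground truth
Rc = np.array([[np.cos(theta_c), -np.sin(theta_c)],
               [np.sin(theta_c),  np.cos(theta_c)]])
stage = (Rc @ plate.T).T + tc                   # measured target positions on the stage

R_est, t_est = rigid_pose_2d(plate, stage)
theta_est = np.arctan2(R_est[1, 0], R_est[0, 0])
```

The SVD-based fit averages over all targets, so measurement noise in individual target locations is attenuated rather than propagated into the estimated plate pose.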
Specification