
Nonfeedback-based machine vision method for determining a calibration relationship between a camera and a moveable object

  • US 5,960,125 A
  • Filed: 11/21/1996
  • Issued: 09/28/1999
  • Est. Priority Date: 11/21/1996
  • Status: Expired due to Term
First Claim

1. A method of determining a calibration relationship between a reference frame of a motion stage and a reference frame of each of plural image acquisition devices that generate images of that stage, the method comprising the steps of:

  • A. placing a calibration plate on the motion stage, the calibration plate including plural targets, where each target (i) has a known location (wxi,wyi) on the calibration plate, where at least one target is in the field of view of each image acquisition device, and where the motion stage is at a first known location/orientation (mx,my,θ)j, where (j)=1, and generating with each of the image acquisition devices a first image of the motion stage;

    B. determining a location (ix,iy)ij, where (j)=1, of the target in each of those first images;

    C. moving the motion stage to one or more other known locations/orientations (mx,my,θ)j, where (j)>1, such that at least one target is in the field of view of each image acquisition device, and generating with each image acquisition device additional images of the motion stage;

    D. determining locations (ix,iy)ij, where (j)>1, of the target in each of those additional images; and

    E. determining a calibration relationship between the reference frame of the motion stage and the reference frames of the image acquisition devices as a function of (i) the known locations/orientations of the motion stage (mx,my,θ)j, where (j)≧1, (ii) the locations (ix,iy)ij, where (j)≧1, of the targets in the corresponding images, and (iii) the known locations (wxi,wyi) of the targets on the calibration plate,

    wherein step (E) comprises the step of determining the calibration relationship by minimizing an error Eij between the known locations/orientations of the motion stage and estimates thereof in accord with the mathematical relationship ##EQU10## where:

      (mx,my,θ)j represents the known motion stage locations/orientations,
      (ix,iy)ij represents the locations of the targets in the images,
      (wxi,wyi) represents the known location of each target on the calibration plate,
      Gi(u,v), Hi(u,v) represent lens distortion correction functions mapping coordinates (u,v) in an image to an orthonormal image coordinate system (ix,iy),
      Pxi, Pyi represent the position of target (i) in motion stage coordinates when the motion stage is at (x=0, y=0, θ=0),
      αi, βi represent the pixel width and height for camera field of view i,
      U and V represent the cosine and sine, respectively, of the orientation of each image acquisition device's coordinate frame,
      (Oxi,Oyi) represents the physical position corresponding to a specified location for camera field of view i.
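
Read as an algorithm, step (E) is a nonlinear least-squares fit: given the known stage poses (mx,my,θ)j, the known plate coordinates (wxi,wyi), and the image locations (ix,iy)ij measured in steps (B) and (D), solve for each camera's parameters (pixel scales αi, βi, frame orientation, origin (Oxi,Oyi)) and the target positions (Pxi,Pyi). The sketch below is one plausible reading of that fit, not the patent's own error expression (##EQU10## is not reproduced above); it assumes one target per camera field of view, treats the distortion corrections Gi, Hi as identity, uses a hypothetical parameter packing, and hands the minimization to scipy.optimize.least_squares.

# Minimal calibration sketch under the assumptions stated above.
import numpy as np
from scipy.optimize import least_squares

def predict_image_xy(cam_params, target_xy, stage_pose):
    """Predict the image location (ix, iy) of a target seen by one camera
    when the motion stage is at pose (mx, my, theta)."""
    alpha, beta, phi, ox, oy = cam_params   # pixel width/height, frame angle, origin
    pxi, pyi = target_xy                    # target position at stage pose (0, 0, 0)
    mx, my, theta = stage_pose
    # Rigid-body motion of the stage carries the target to a physical position.
    px = mx + pxi * np.cos(theta) - pyi * np.sin(theta)
    py = my + pxi * np.sin(theta) + pyi * np.cos(theta)
    # Camera frame: rotation (U, V) = (cos phi, sin phi), origin (Ox, Oy),
    # pixel scales alpha (width) and beta (height).
    u, v = np.cos(phi), np.sin(phi)
    ix = ( u * (px - ox) + v * (py - oy)) / alpha
    iy = (-v * (px - ox) + u * (py - oy)) / beta
    return ix, iy

def residuals(x, stage_poses, observed):
    """Stack (observed - predicted) image errors over all cameras i and poses j.

    x packs, per camera: [alpha, beta, phi, Ox, Oy, Pxi, Pyi].
    observed[i][j] is the measured (ix, iy)_ij from steps (B) and (D);
    stage_poses[j] is the known (mx, my, theta)_j.
    """
    errs = []
    for i in range(len(observed)):
        cam = x[7 * i: 7 * (i + 1)]
        for j, pose in enumerate(stage_poses):
            ix, iy = predict_image_xy(cam[:5], cam[5:], pose)
            errs.extend([observed[i][j][0] - ix, observed[i][j][1] - iy])
    return np.asarray(errs)

# Hypothetical usage, with n_cam cameras:
#   x0  = np.tile([0.01, 0.01, 0.0, 0.0, 0.0, 0.0, 0.0], n_cam)
#   fit = least_squares(residuals, x0, args=(stage_poses, observed))
#   calibration = fit.x.reshape(n_cam, 7)

In the claim's formulation, the known plate coordinates (wxi,wyi) tie the per-camera target positions Pxi, Pyi together through the single placement of the calibration plate on the stage, and the corrections Gi, Hi are applied to the raw image coordinates first; the sketch leaves the target positions as independent unknowns and omits distortion for brevity.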
