Methods, systems and computer program products for photogrammetric sensor position estimation
Abstract
Methods, systems and computer program products are provided for determining an obscured contact point based on a visible portion of an acoustic sensor of a medical device contacting a patient by acquiring a first image containing an upper surface of the acoustic sensor from a first viewpoint and a second image containing the upper surface of the acoustic sensor from a second viewpoint different from the first viewpoint. The acoustic sensor is located in the first image and the second image and the centroid of the acoustic sensor is determined based on the location of the acoustic sensor in the first image and the corresponding location of the acoustic sensor in the second image. A plane of the visible portion of the acoustic sensor is also determined based on the position of the upper surface of the acoustic sensor in the first image and the corresponding position of the upper surface of the acoustic sensor in the second image. The contact point of the obscured portion of the acoustic sensor may then be determined by projecting a predetermined depth through the centroid of the acoustic sensor in a direction having a predefined relationship with the plane of the visible portion of the acoustic sensor.
Claims
1. A method of determining an obscured contact point based on a visible portion of an acoustic sensor of a medical device contacting a patient from at least a first image containing an upper surface of the acoustic sensor from a first viewpoint and a second image containing the upper surface of the acoustic sensor from a second viewpoint different from the first viewpoint, the method comprising the steps of:
locating the acoustic sensor in the first image and the second image;
determining the location of a point on the upper surface of the acoustic sensor based on the location of the acoustic sensor in the first image and the corresponding location of the acoustic sensor in the second image;
determining a plane of the upper surface of the acoustic sensor based on the acoustic sensor in the first image and the corresponding acoustic sensor in the second image; and
projecting a predetermined depth through the located point on the upper surface of the acoustic sensor in a direction having a predefined relationship with the plane of the visible portion of the acoustic sensor so as to determine the contact point of the obscured portion of the acoustic sensor.
determining the centroid of the sensor in each of the first and the second image; and
determining a location of the centroid in three dimensional space from the centroid of the upper surface of the sensor in the first image and the centroid of the upper surface of the sensor in the second image.
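The centroid limitations above reduce to standard two-view triangulation: the centroid of the visible upper surface is found in each image, and the two pixel locations are combined with the calibrated camera geometry to recover a single point in three-dimensional space. A minimal sketch in Python/NumPy of linear (DLT) triangulation, assuming 3x4 projection matrices `P1` and `P2` for the two calibrated cameras are already available; the function names are illustrative, not taken from the patent:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices (from calibration).
    x1, x2 : 2D pixel coordinates of the same point (e.g. the sensor
             centroid) in the first and second image.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous least-squares solution: right singular vector of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def image_centroid(corners):
    """Centroid of the visible (quadrangular) upper surface in one image."""
    return np.mean(np.asarray(corners, dtype=float), axis=0)
```

The same routine can be applied to each corner of the visible quadrangle to recover the upper-surface plane used in the later claims.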
4. A method according to claim 3, wherein the step of projecting comprises the steps of:
determining a direction of a cross product of lines which pass through corners of the visible portion of the acoustic sensor; and
projecting the predetermined depth from the centroid of the upper surface of the sensor in the direction of the cross product.
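Claim 4 fixes the projection direction as the cross product of lines passing through corners of the visible quadrangle; the contact point is then the upper-surface centroid displaced by the predetermined sensor depth along that direction. A hedged sketch, assuming the four corners have already been triangulated into 3D (for example with the routine above) and that the quadrangle's two diagonals serve as the lines through the corners; the sign convention for orienting the normal toward the patient is also an assumption:

```python
import numpy as np

def contact_point(corners_3d, centroid_3d, sensor_depth):
    """Project the known sensor depth from the upper-surface centroid
    along the normal of the upper-surface plane.

    corners_3d   : (4, 3) array of triangulated corner coordinates.
    centroid_3d  : triangulated centroid of the upper surface.
    sensor_depth : predetermined thickness of the sensor (same units).
    """
    c = np.asarray(corners_3d, dtype=float)
    # Two lines through corners of the quadrangle (here: the diagonals).
    d1 = c[2] - c[0]
    d2 = c[3] - c[1]
    normal = np.cross(d1, d2)
    normal /= np.linalg.norm(normal)
    # Orient the normal toward the patient (away from the cameras); this
    # sign convention is an assumption that depends on the camera setup.
    if normal[2] < 0:
        normal = -normal
    return np.asarray(centroid_3d, dtype=float) + sensor_depth * normal
```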
5. A method according to claim 4, further comprising the steps of:
establishing calipers in a direction orthogonal to directions of detected edges of the visible portion of the acoustic sensor; and
moving the calipers until the calipers touch the located acoustic sensors so as to determine the locations of the corners of the visible portion of the acoustic sensor.
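Claim 5's calipers can be read as supporting lines: for each detected edge direction, a line parallel to the edge is swept along the orthogonal direction until it just touches the sensor blob, and the intersections of the resulting supporting lines give the corners of the visible quadrangle. A sketch under that interpretation, assuming the blob's pixel coordinates and its two dominant edge directions have already been extracted by an edge detector; the names are illustrative:

```python
import numpy as np

def quadrangle_corners(pixels, edge_dirs):
    """Approximate the four corners of the visible quadrangle with calipers.

    pixels    : (N, 2) array of pixel coordinates belonging to the sensor blob.
    edge_dirs : two unit vectors giving the detected edge directions.
    """
    pts = np.asarray(pixels, dtype=float)
    lines = []  # each caliper line stored as (normal, offset), normal . x = offset
    for d in edge_dirs:
        n = np.array([-d[1], d[0]], dtype=float)  # caliper direction, orthogonal to edge
        n /= np.linalg.norm(n)
        proj = pts @ n
        lines.append((n, proj.min()))             # supporting line on one side
        lines.append((n, proj.max()))             # supporting line on the opposite side
    # Intersect each supporting line of the first edge direction with each
    # supporting line of the second; the four intersections are the corners
    # (returned in an arbitrary, not necessarily cyclic, order).
    corners = []
    for n1, c1 in lines[:2]:
        for n2, c2 in lines[2:]:
            A = np.vstack([n1, n2])
            corners.append(np.linalg.solve(A, np.array([c1, c2])))
    return np.array(corners)
```

For the substantially rectangular sensor of claims 6 and 8, the two edge directions are roughly orthogonal and the four intersections form the quadrangle whose centroid feeds the triangulation step.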
6. A method according to claim 5, wherein the visible portion of the acoustic sensor is a quadrangle.
7. A method according to claim 1, further comprising the steps of:
determining the contact points of a plurality of acoustic sensors; and
determining the relative spatial relationship between the plurality of acoustic sensors based on the determination of the contact points of the plurality of acoustic sensors.
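Once contact points have been estimated for several sensors, their relative spatial relationship can be summarized, for example, by the matrix of pairwise inter-sensor distances; this is one possible reading of claim 7, which does not prescribe a particular representation:

```python
import numpy as np

def pairwise_sensor_distances(contact_points):
    """Pairwise distances between estimated contact points.

    contact_points : (K, 3) array of contact point coordinates.
    Returns a (K, K) matrix whose (i, j) entry is the distance between
    sensor i and sensor j.
    """
    p = np.asarray(contact_points, dtype=float)
    diff = p[:, None, :] - p[None, :, :]
    return np.linalg.norm(diff, axis=-1)
```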
8. A method according to claim 1, wherein the acoustic sensor is substantially rectangular in shape such that the visible portion of the upper surface of the acoustic sensor is a quadrangle.
9. A method according to claim 1, wherein the first image is acquired by a first camera and wherein the second image is acquired by a second camera, the method further comprising the step of calibrating the first camera and the second camera so as to determine the relationship between the first viewpoint and the second viewpoint.
10. A method according to claim 9, wherein the step of calibrating comprises the steps of:
acquiring at the first camera a first reference image containing a predefined configuration of landmarks;
acquiring at the second camera a second reference image containing the predefined configuration of landmarks; and
determining the spatial relationship between an image plane of the first camera and an image plane of the second camera based on the first and second reference images.
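A common way to realize the landmark-based calibration of claim 10 is to image a known target with both cameras and solve for the relative pose of the two image planes. A sketch using OpenCV, with a checkerboard standing in for the "predefined configuration of landmarks"; the checkerboard, the helper name `stereo_extrinsics`, and the assumption that each camera's intrinsics are already known are illustrative choices, not the patent's specific procedure:

```python
import cv2
import numpy as np

def stereo_extrinsics(img1, img2, pattern_size=(7, 6), square_mm=10.0,
                      K1=None, d1=None, K2=None, d2=None):
    """Relative rotation R and translation T between the two cameras from
    one pair of reference images of a known landmark pattern.

    K1/d1 and K2/d2 are the intrinsic matrix and distortion coefficients
    of each camera, assumed already known (e.g. from single-camera
    calibration).
    """
    # Known object-space coordinates of the landmarks (checkerboard grid).
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
    objp *= square_mm

    ok1, c1 = cv2.findChessboardCorners(img1, pattern_size)
    ok2, c2 = cv2.findChessboardCorners(img2, pattern_size)
    if not (ok1 and ok2):
        raise RuntimeError("landmark pattern not found in both reference images")

    _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        [objp], [c1], [c2], K1, d1, K2, d2,
        img1.shape[:2][::-1], flags=cv2.CALIB_FIX_INTRINSIC)
    return R, T
```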
11. A method according to claim 9, wherein the step of calibrating comprises the steps of:
computing camera parameters for each of the first and second cameras;
computing projection matrices for each of the first and second cameras; and
estimating object-space coordinates for each camera using corresponding points from at least two images of the same scene;
wherein calibration of the cameras is indicated by coincidence of the object space coordinates for the corresponding points.
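Claim 11's calibration check can be realized with plain NumPy: each camera's projection matrix is estimated by direct linear transformation (DLT) from known 3D-2D correspondences, and calibration consistency is judged by whether the object-space estimates of a corresponding point from the two cameras coincide, measured here as the distance between the closest points of the two back-projected rays. This is one standard reading of the claim, not the patent's prescribed implementation:

```python
import numpy as np

def dlt_projection_matrix(X_world, x_img):
    """Estimate a 3x4 projection matrix from >= 6 known 3D-2D correspondences."""
    A = []
    for (X, Y, Z), (u, v) in zip(X_world, x_img):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def backprojected_ray(P, x):
    """Camera centre and unit direction of the ray through pixel x."""
    M, p4 = P[:, :3], P[:, 3]
    centre = -np.linalg.solve(M, p4)
    direction = np.linalg.solve(M, np.array([x[0], x[1], 1.0]))
    return centre, direction / np.linalg.norm(direction)

def coincidence_error(P1, P2, x1, x2):
    """Distance between the object-space estimates of one corresponding
    point from the two cameras (the closest points of the two rays); a
    small value indicates the cameras are calibrated consistently."""
    c1, d1 = backprojected_ray(P1, x1)
    c2, d2 = backprojected_ray(P2, x2)
    # Solve for the closest points c1 + t1*d1 and c2 + t2*d2.
    A = np.array([[d1 @ d1, -d1 @ d2], [d1 @ d2, -d2 @ d2]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return np.linalg.norm((c1 + t1 * d1) - (c2 + t2 * d2))
```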
12. A method according to claim 1, wherein the locating step comprises the steps of:
locating sensor positions in the first image;
locating sensor positions in the second image; and
determining a correspondence between the sensor positions in the first image and the sensor positions in the second image so that a sensor position in the first image is associated with the corresponding sensor position in the second image.
13. A method according to claim 12, wherein the step of determining a correspondence comprises the steps of:
determining an epipolar line for a sensor position in the first image;
determining which of the sensor positions in the second image the epipolar line intersects; and
associating a sensor position in the second image which the epipolar line intersects with the sensor position in the first image.
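The correspondence step of claims 12 and 13 follows directly from epipolar geometry: a sensor position in the first image maps to a line in the second image, and the matching sensor is the one that line passes through. A minimal sketch, assuming the fundamental matrix `F` relating the two calibrated views is available and using a simple pixel tolerance for "intersects"; claims 14 and 15 refine this with ordered association and a bounding-box test, sketched after claim 15:

```python
import numpy as np

def epipolar_line(F, x1):
    """Epipolar line in the second image for pixel x1 in the first image.

    F  : 3x3 fundamental matrix relating the two calibrated views.
    x1 : 2D sensor position in the first image.
    Returns line coefficients (a, b, c) with a*u + b*v + c = 0.
    """
    return F @ np.array([x1[0], x1[1], 1.0])

def match_sensors(F, sensors_img1, sensors_img2, tol_px=3.0):
    """Associate each sensor position in image 1 with the sensor position
    in image 2 that its epipolar line passes through (within tol_px).
    """
    matches = {}
    for i, x1 in enumerate(sensors_img1):
        a, b, c = epipolar_line(F, x1)
        norm = np.hypot(a, b)
        for j, x2 in enumerate(sensors_img2):
            # Perpendicular distance of candidate x2 from the epipolar line.
            if abs(a * x2[0] + b * x2[1] + c) / norm <= tol_px:
                matches[i] = j
                break
    return matches
```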
14. A method according to claim 13, wherein the step of associating comprises the steps of:
determining if a plurality of sensor positions in the second image intersect the epipolar line; and
associating sensor positions in the first image with sensor positions in the second image which intersect the epipolar line in an order in which the sensor positions in the first image intersect the epipolar line.
15. A method according to claim 13, wherein the step of determining which of the sensor positions in the second image the epipolar line intersects comprises the steps of:
determining if the epipolar line intersects a bounding box around a sensor position;
determining the signed distance between the epipolar line and opposite corners of the bounding box; and
determining that the epipolar line intersects the sensor position if the signed distances between the epipolar line and the opposite corners of the bounding box are of opposite signs.
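Claim 15's intersection test is a signed-distance check: the epipolar line crosses a sensor's bounding box when opposite corners of the box lie on opposite sides of the line. A sketch; checking both diagonal corner pairs, rather than the single pair the claim recites, is a small robustness extension:

```python
import numpy as np

def line_intersects_bbox(line, bbox):
    """Signed-distance test for whether an epipolar line crosses a
    sensor's bounding box.

    line : (a, b, c) epipolar line coefficients, a*u + b*v + c = 0.
    bbox : (umin, vmin, umax, vmax) axis-aligned bounding box.
    """
    a, b, c = line
    umin, vmin, umax, vmax = bbox
    norm = np.hypot(a, b)

    def signed_distance(u, v):
        return (a * u + b * v + c) / norm

    # Opposite signs for a pair of opposite corners means the line
    # separates them, i.e. it passes through the box.
    d1, d2 = signed_distance(umin, vmin), signed_distance(umax, vmax)
    d3, d4 = signed_distance(umin, vmax), signed_distance(umax, vmin)
    return d1 * d2 <= 0 or d3 * d4 <= 0
```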
16. A system for determining an obscured contact point based on a visible portion of an acoustic sensor of a medical device contacting a patient from at least a first image containing an upper surface of the acoustic sensor from a first viewpoint and a second image containing the upper surface of the acoustic sensor from a second viewpoint different from the first viewpoint, comprising:
means for locating the acoustic sensor in the first image and the second image;
means for determining the location of a point on the upper surface of the acoustic sensor based on the location of the acoustic sensor in the first image and the corresponding location of the acoustic sensor in the second image;
means for determining a plane of the upper surface of the acoustic sensor based on the acoustic sensor in the first image and the corresponding acoustic sensor in the second image; and
means for projecting a predetermined depth through the located point on the upper surface of the acoustic sensor in a direction having a predefined relationship with the plane of the visible portion of the acoustic sensor so as to determine the contact point of the obscured portion of the acoustic sensor.
means for determining the centroid of the sensor in each of the first and the second image; and
means for determining a location of the centroid in three dimensional space from the centroid of the upper surface of the sensor in the first image and the centroid of the upper surface of the sensor in the second image.
19. A system according to claim 18, wherein the means for projecting comprises:
means for determining a direction of a cross product of lines which pass through corners of the visible portion of the acoustic sensor; and
means for projecting the predetermined depth from the centroid of the upper surface of the sensor in the direction of the cross product.
20. A system according to claim 19, further comprising:
means for establishing calipers in a direction orthogonal to directions of detected edges of the visible portion of the acoustic sensor; and
means for moving the calipers until the calipers touch the located acoustic sensors so as to determine the locations of the corners of the visible portion of the acoustic sensor.
21. A system according to claim 20, wherein the visible portion of the acoustic sensor is a quadrangle.
22. A system according to claim 16, further comprising:
means for determining the contact points of a plurality of acoustic sensors; and
means for determining the relative spatial relationship between the plurality of acoustic sensors based on the determination of the contact points of the plurality of acoustic sensors.
23. A system according to claim 16, wherein the acoustic sensor is substantially rectangular in shape such that the visible portion of the upper surface of the acoustic sensor is a quadrangle.
24. A system according to claim 16, wherein the first image is acquired by a first camera and wherein the second image is acquired by a second camera, the system further comprising means for calibrating the first camera and the second camera so as to determine the relationship between the first viewpoint and the second viewpoint.
25. A system according to claim 24, wherein the means for calibrating comprises:
means for acquiring at the first camera a first reference image containing a predefined configuration of landmarks;
means for acquiring at the second camera a second reference image containing the predefined configuration of landmarks; and
means for determining the spatial relationship between an image plane of the first camera and an image plane of the second camera based on the first and second reference images.
26. A system according to claim 24, wherein the means for calibrating comprises:
means for computing camera parameters for each of the first and second cameras;
means for computing projection matrices for each of the first and second cameras; and
means for estimating object-space coordinates for each camera using corresponding points from at least two images of the same scene;
wherein calibration of the cameras is indicated by coincidence of the object space coordinates for the corresponding points.
27. A system according to claim 16, wherein the means for locating comprises:
means for locating sensor positions in the first image;
means for locating sensor positions in the second image; and
means for determining a correspondence between the sensor positions in the first image and the sensor positions in the second image so that a sensor position in the first image is associated with the corresponding sensor position in the second image.
28. A system according to claim 27, wherein the means for determining a correspondence comprises:
means for determining an epipolar line for a sensor position in the first image;
means for determining which of the sensor positions in the second image the epipolar line intersects; and
means for associating a sensor position in the second image which the epipolar line intersects with the sensor position in the first image.
29. A system according to claim 28, wherein the means for associating comprises:
means for determining if a plurality of sensor positions in the second image intersect the epipolar line; and
means for associating sensor positions in the first image with sensor positions in the second image which intersect the epipolar line in an order in which the sensor positions in the first image intersect the epipolar line.
30. A system according to claim 28, wherein the means for determining which of the sensor positions in the second image the epipolar line intersects comprises:
means for determining if the epipolar line intersects a bounding box around a sensor position;
means for determining the signed distance between the epipolar line and opposite corners of the bounding box; and
means for determining that the epipolar line intersects the sensor position if the signed distances between the epipolar line and the opposite corners of the bounding box are of opposite signs.
31. A computer program product for determining an obscured contact point based on a visible portion of an acoustic sensor of a medical device contacting a patient from at least a first image containing an upper surface of the acoustic sensor from a first viewpoint and a second image containing the upper surface of the acoustic sensor from a second viewpoint different from the first viewpoint, comprising:
a computer readable storage medium having computer readable program code means embodied in said medium, said computer readable program code means comprising:
computer readable program code means for locating the acoustic sensor in the first image and the second image;
computer readable program code means for determining the location of a point on the upper surface of the acoustic sensor based on the location of the acoustic sensor in the first image and the corresponding location of the acoustic sensor in the second image;
computer readable program code means for determining a plane of the upper surface of the acoustic sensor based on the acoustic sensor in the first image and the corresponding acoustic sensor in the second image; and
computer readable program code means for projecting a predetermined depth through the located point on the upper surface of the acoustic sensor in a direction having a predefined relationship with the plane of the visible portion of the acoustic sensor so as to determine the contact point of the obscured portion of the acoustic sensor.
computer readable program code means for determining the centroid of the sensor in each of the first and the second image; and
computer readable program code means for determining a location of the centroid in three dimensional space from the centroid of the upper surface of the sensor in the first image and the centroid of the upper surface of the sensor in the second image.
34. A computer program product according to claim 33, wherein the computer readable program code means for projecting comprises:
computer readable program code means for determining a direction of a cross product of lines which pass through corners of the visible portion of the acoustic sensor; and
computer readable program code means for projecting the predetermined depth from the centroid of the upper surface of the sensor in the direction of the cross product.
35. A computer program product according to claim 34, further comprising:
computer readable program code means for establishing calipers in a direction orthogonal to directions of detected edges of the visible portion of the acoustic sensor; and
computer readable program code means for moving the calipers until the calipers touch the located acoustic sensors so as to determine the locations of the corners of the visible portion of the acoustic sensor.
36. A computer program product according to claim 35, wherein the visible portion of the acoustic sensor is a quadrangle.
37. A computer program product according to claim 31, further comprising:
computer readable program code means for determining the contact points of a plurality of acoustic sensors; and
computer readable program code means for determining the relative spatial relationship between the plurality of acoustic sensors based on the determination of the contact points of the plurality of acoustic sensors.
38. A computer program product according to claim 31, wherein the acoustic sensor is substantially rectangular in shape such that the visible portion of the upper surface of the acoustic sensor is a quadrangle.
39. A computer program product according to claim 31, wherein the first image is acquired by a first camera and wherein the second image is acquired by a second camera, the computer program product further comprising computer readable program code means for calibrating the first camera and the second camera so as to determine the relationship between the first viewpoint and the second viewpoint.
40. A computer program product according to claim 39, wherein the computer readable program code means for calibrating comprises:
computer readable program code means for acquiring at the first camera a first reference image containing a predefined configuration of landmarks;
computer readable program code means for acquiring at the second camera a second reference image containing the predefined configuration of landmarks; and
computer readable program code means for determining the spatial relationship between an image plane of the first camera and an image plane of the second camera based on the first and second reference images.
41. A computer program product according to claim 39, wherein the computer readable program code means for calibrating comprises:
computer readable program code means for computing camera parameters for each of the first and second cameras;
computer readable program code means for computing projection matrices for each of the first and second cameras; and
computer readable program code means for estimating object-space coordinates for each camera using corresponding points from at least two images of the same scene;
wherein calibration of the cameras is indicated by coincidence of the object space coordinates for the corresponding points.
42. A computer program product according to claim 31, wherein the computer readable program code means for locating comprises:
computer readable program code means for locating sensor positions in the first image;
computer readable program code means for locating sensor positions in the second image; and
computer readable program code means for determining a correspondence between the sensor positions in the first image and the sensor positions in the second image so that a sensor position in the first image is associated with the corresponding sensor position in the second image.
43. A computer program product according to claim 42, wherein the computer readable program code means for determining a correspondence comprises:
computer readable program code means for determining an epipolar line for a sensor position in the first image;
computer readable program code means for determining which of the sensor positions in the second image the epipolar line intersects; and
computer readable program code means for associating a sensor position in the second image which the epipolar line intersects with the sensor position in the first image.
44. A computer program product according to claim 43, wherein the computer readable program code means for associating comprises:
computer readable program code means for determining if a plurality of sensor positions in the second image intersect the epipolar line; and
computer readable program code means for associating sensor positions in the first image with sensor positions in the second image which intersect the epipolar line in an order in which the sensor positions in the first image intersect the epipolar line.
45. A computer program product according to claim 43, wherein the computer readable program code means for determining which of the sensor positions in the second image the epipolar line intersects comprises:
computer readable program code means for determining if the epipolar line intersects a bounding box around a sensor position;
computer readable program code means for determining the signed distance between the epipolar line and opposite corners of the bounding box; and
computer readable program code means for determining that the epipolar line intersects the sensor position if the signed distances between the epipolar line and the opposite corners of the bounding box are of opposite signs.
Specification