Image processing apparatus
Abstract
In an apparatus and method for creating a three-dimensional model of an object, images of the object taken from different, unknown positions are processed to identify the points in the images which correspond to the same point on the actual object (that is, “matching” points), the matching points are used to determine the relative positions from which the images were taken, and the matching points and calculated positions are used to calculate points in a three-dimensional space representing points on the object. A number of different techniques are used to identify the matching points, and a number of solutions for the relative positions are calculated and tested, the solution which is consistent with the largest number of matching points being selected. In one matching technique, edges in an image are identified by first identifying corner points in the image and then identifying edges between the corner points on the basis of edge orientation values of pixels; the edges are processed in strength order to remove cross-overs; the images are sub-divided into regions by connecting points at the ends of the edges on the basis of the edge strengths; and matching points within corresponding regions in two or more images are identified.
174 Claims
1. A method of producing signals defining a transformation between three images of an object by processing first input signals defining an affine transformation between a first pair of the images, second input signals defining a perspective transformation between the first pair of the images, third input signals defining an affine transformation between a second pair of the images, one of the three images being common to the first pair and the second pair, fourth input signals defining a perspective transformation between the second pair of the images, and fifth input signals defining features matched in all three images, the method comprising:
-
(a) calculating a transformation between all three images using the first, third and fifth input signals;
(b) calculating a transformation between all three images using the first, fourth and fifth input signals;
(c) calculating a transformation between all three images using the second, third and fifth input signals;
(d) calculating a transformation between all three images using the second, fourth and fifth input signals; and
(e) testing the calculated transformations to determine an accuracy of each calculated transformation, and selecting one of the calculated transformations in dependence upon the accuracy thereof. (Dependent claims: 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 21, 22)
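The selection pattern of steps (a) to (e) above — compute a candidate three-image transformation for every pairing of an affine or perspective estimate for the first image pair with an affine or perspective estimate for the second pair, then keep the most accurate candidate — can be sketched as follows. The `compute` and `accuracy` callables are hypothetical placeholders, since the claim does not fix how a candidate transformation is computed or scored:

```python
# Minimal sketch of the claim-1 selection loop.  Each element of
# pair12_estimates / pair23_estimates stands for one of the recited
# transformations (affine or perspective) for that image pair; the
# compute() and accuracy() functions are assumptions, not from the claim.
from itertools import product

def select_transformation(pair12_estimates, pair23_estimates, matches,
                          compute, accuracy):
    """compute(t12, t23, matches) -> candidate three-image transformation;
    accuracy(candidate, matches)  -> e.g. number of consistent matches."""
    best, best_score = None, float("-inf")
    for t12, t23 in product(pair12_estimates, pair23_estimates):
        candidate = compute(t12, t23, matches)      # steps (a)-(d)
        score = accuracy(candidate, matches)        # step (e): test
        if score > best_score:                      # step (e): select
            best, best_score = candidate, score
    return best, best_score
```

With two estimates per pair this evaluates exactly the four combinations recited in steps (a) to (d).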
-
2. A method of producing signals defining a transformation between three images of an object by processing first input signals defining a transformation between a first pair of the images, second input signals defining an affine transformation between a second pair of the images, one of the three images being common to the first pair and the second pair, third input signals defining a perspective transformation between the second pair of images, and fourth input signals defining features matched in all three images, the method comprising:
-
(a) calculating a transformation between all three images using the first, second and fourth input signals;
(b) calculating a transformation between all three images using the first, third and fourth input signals; and
(c) testing the calculated transformations to determine an accuracy of each calculated transformation, and selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
19. A method of processing signals defining first and second types of transformations between a first image and a second image and between the second image and a third image, and signals defining corresponding features in the images, so as to determine a relationship between positions at which the three images were taken, the method comprising:
-
determining a relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the second type of transformation between the second and third images;
determining a relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the first type of transformation between the second and third images;
determining a relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the second type of transformation between the second and third images;
determining a relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the first type of transformation between the second and third images; and
testing the determined relationships to determine an accuracy of each relationship, and selecting one of the determined relationships in dependence upon the accuracy thereof.
-
20. A method of processing signals defining (i) first and second types of transformation between a first image and a second image, (ii) a transformation between the second image and a third image, and (iii) corresponding features in the images, so as to determine a relationship between positions at which the three images were taken, the method comprising:
-
determining a relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the transformation between the second and third images;
determining a relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the transformation between the second and third images;
testing the determined relationships to determine an accuracy of each determined relationship; and
selecting one of the determined relationships in dependence upon the accuracy thereof.
-
23. An image processing apparatus for producing signals defining a transformation between three images of an object by processing first input signals defining an affine transformation between a first pair of the images, second input signals defining a perspective transformation between the first pair of the images, third input signals defining an affine transformation between a second pair of the images, one of the images being common to the first pair and the second pair, fourth input signals defining a perspective transformation between the second pair of the images, and fifth input signals defining features matched in all three images, the image processing apparatus comprising:
-
(a) a first transformation calculator for calculating a transformation between all three images using the first, third and fifth input signals;
(b) a second transformation calculator for calculating a transformation between all three images using the first, fourth and fifth input signals;
(c) a third transformation calculator for calculating a transformation between all three images using the second, third and fifth input signals;
(d) a fourth transformation calculator for calculating a transformation between all three images using the second, fourth and fifth input signals;
(e) a transformation tester for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(f) a transformation selector for selecting one of the calculated transformations in dependence upon the accuracy thereof. (Dependent claims: 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37)
-
24. An image processing apparatus for producing signals defining a transformation between three images of an object by processing first input signals defining a transformation between a first pair of the images, second input signals defining an affine transformation between a second pair of the images, one of the images being common to the first pair and the second pair, third input signals defining a perspective transformation between the second pair of the images, and fourth input signals defining features matched in all three images of the first pair and the second pair, the image processing apparatus comprising:
-
(a) a first transformation calculator for calculating a transformation between all three images using the first, second and fourth input signals;
(b) a second transformation calculator for calculating a transformation between all three images using the first, third and fourth input signals;
(c) a transformation tester for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(d) a transformation selector for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
38. A method of processing input signals defining a plurality of pairs of features representing features matched in first and second images of an object taken from undefined camera positions using first and second matching techniques, to produce signals defining a relationship between the camera positions, the method comprising:
-
(a) using pairs of features matched by the first matching technique to calculate a relationship between the camera positions;
(b) using one of (i) pairs of features matched by the second matching technique and (ii) pairs of features matched by the first matching technique and pairs of features matched by the second matching technique to calculate a relationship between the camera positions;
(c) testing the calculated relationships to determine an accuracy of each calculated relationship; and
(d) selecting a relationship in dependence upon the accuracy thereof. (Dependent claims: 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 54, 55)
-
52. A method of processing first signals defining object features matched with a first matching technique in first and second images taken from imaging positions of undefined relationship and second signals defining object features matched in the first and second images with a second matching technique, so as to determine a positional relationship between the images, the method comprising:
-
(a) processing the first input signals to determine a positional relationship between the images;
(b) processing the second input signals to determine a positional relationship between the images; and
(c) testing the calculated relationships to determine an accuracy of each calculated relationship, and selecting one of the calculated relationships in dependence upon the accuracy thereof. (Dependent claim: 53)
-
56. An image processing apparatus for processing input signals defining a plurality of pairs of features representing features matched in first and second images of an object taken from undefined camera positions using first and second matching techniques to produce signals defining a relationship between the camera positions, comprising:
-
(a) a first calculator arranged to use pairs of features matched by the first matching technique to calculate a relationship between the camera positions;
(b) a second calculator arranged to use one of (i) pairs of features matched by the second matching technique and (ii) pairs of features matched by the first matching technique and pairs of features matched by the second matching technique to calculate a relationship between the camera positions;
(c) a relationship tester for testing the calculated relationships to determine an accuracy of each calculated relationship; and
(d) a selector for selecting one of the calculated relationships in dependence upon the accuracy thereof. (Dependent claims: 57, 58, 59, 60, 61, 62, 63, 64, 65, 66)
-
67. A method of processing input signals defining at least eight pairs of features representing features matched in first and second images of an object taken from camera positions of undefined rotation and translation relationship, to produce signals defining a rotation and translation relationship between the camera positions, the method comprising:
-
(a) calculating a fundamental matrix using at least a first seven pairs of matched features;
(b) converting the calculated fundamental matrix into a physically realizable matrix;
(c) testing the calculated physically realizable matrix using a plurality of the pairs of matched features to determine an accuracy of the calculated physically realizable matrix;
(d) repeating steps (a) to (c) using a different seven pairs of matched features in step (a); and
(e) selecting a physically realizable matrix in dependence upon the determined accuracy thereof. (Dependent claims: 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 90, 91)
the calculated physically realizable matrix is tested using each pair of matched features used in step (a) to calculate the fundamental matrix; and
if the calculated physically realizable matrix is consistent with a predetermined number of the pairs of matched features used to calculate the fundamental matrix, then the physically realizable matrix is tested against other pairs of matched features defined in the input signals.
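The claim-67 loop, steps (a) to (e), can be sketched as a sample-and-test (RANSAC-style) estimator. Two assumptions in this sketch are not from the claim text: a linear eight-point least-squares solve stands in for the recited seven-point computation, and "physically realizable" is read as enforcing the rank-2 constraint on the fundamental matrix; the consistency tolerance is likewise an assumed choice:

```python
# Hedged sketch of claim 67: repeatedly (a) fit a fundamental matrix to a
# sample of matched point pairs, (b) project it onto the rank-2 matrices,
# (c) count consistent correspondences, (d) resample, (e) keep the best.
import numpy as np

def linear_fundamental(x1, x2):
    # (a) least-squares F from the sampled correspondences (Nx2 arrays)
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1))])
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)          # null vector of A, reshaped

def enforce_rank2(F):
    # (b) "physically realizable": zero the smallest singular value
    u, s, vt = np.linalg.svd(F)
    s[2] = 0.0
    return u @ np.diag(s) @ vt

def count_consistent(F, x1, x2, tol=1e-6):
    # (c) correspondences satisfying the epipolar constraint x2' F x1 = 0
    h1 = np.column_stack([x1, np.ones(len(x1))])
    h2 = np.column_stack([x2, np.ones(len(x2))])
    return int(np.sum(np.abs(np.einsum('ij,jk,ik->i', h2, F, h1)) < tol))

def ransac_fundamental(x1, x2, iterations=50, sample=8, seed=0):
    rng = np.random.default_rng(seed)
    best_F, best_count = None, -1
    for _ in range(iterations):          # (d) repeat with a fresh sample
        idx = rng.choice(len(x1), sample, replace=False)
        F = enforce_rank2(linear_fundamental(x1[idx], x2[idx]))
        count = count_consistent(F, x1, x2)
        if count > best_count:           # (e) select the most accurate
            best_F, best_count = F, count
    return best_F, best_count
```

Stopping after a fixed iteration count (or when the accuracy has not improved for several iterations, as in claim 74) is a standard refinement of this loop.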
-
70. A method according to claim 69, wherein the predetermined number comprises all of the pairs of matched features used to calculate the fundamental matrix.
-
71. A method according to claim 67, wherein, in step (c), the calculated physically realizable matrix is tested using each pair of matched features defined in the input signals.
-
72. A method according to claim 67, wherein, in step (d), steps (a) to (c) are repeated a number of times, the number of times being determined in accordance with the number of pairs of features defined in the input signals.
-
73. A method according to claim 72, wherein the number of times steps (a) to (c) are repeated is a percentage of the maximum number of different combinations of the number of pairs of features used in step (a) to calculate the fundamental matrix it is possible to select from the pairs of features defined in the input signals.
-
74. A method according to claim 72, wherein, in step (d), the repetition of steps (a) to (c) is stopped if it is determined in step (c) that an accuracy of the calculated physically realizable matrix has not increased in a given number of previous iterations.
-
75. A method according to claim 67, wherein the input signals define at least eight pairs of matching features identified using a first matching technique and at least one pair of matching features identified using a second matching technique, and wherein the method comprises:
-
(i) performing steps (a) to (e) selecting the pairs of features to be used in step (a) from those identified with the first matching technique;
(ii) performing steps (a) to (e) selecting the pairs of features to be used in step (a) from those identified with the first matching technique and those identified with the second matching technique; and
(iii) selecting the calculated physically realizable matrix of highest accuracy from the physically realizable matrix selected in step (i) (e) and the physically realizable matrix selected in step (ii) (e).
-
76. A method according to claim 75, wherein the pairs of features matched by one of the matching techniques have been matched by a user, and the pairs of features matched by the other matching technique have been matched by an image processing apparatus.
-
77. A method according to claim 67, wherein the input signals define at least eight pairs of matching features identified using a first matching technique and at least eight pairs of matching features identified using a second matching technique, and wherein the method comprises:
-
(i) performing steps (a) to (e) selecting pairs of features to be used in step (a) from those identified with the first matching technique;
(ii) performing steps (a) to (e) selecting pairs of features to be used in step (a) from those identified with the second matching technique; and
(iii) selecting the calculated physically realizable matrix of highest accuracy from the physically realizable matrix selected in step (i) (e) and the physically realizable matrix selected in step (ii) (e).
-
78. A method according to claim 77, wherein the pairs of features matched by one of the matching techniques have been matched by a user, and the pairs of features matched by the other matching technique have been matched by an image processing apparatus.
-
79. A method according to claim 67, further comprising the step of converting the selected physically realizable matrix into a rotation matrix and translation vector.
-
80. A method according to claim 67, further comprising the step of processing image data defining the images of the object to generate the input signals.
-
81. A method according to claim 67, wherein the physically realizable matrix comprises a physical fundamental matrix.
-
82. A method according to claim 67, wherein pairs of features comprise pairs of points.
-
83. A method according to claim 67, further comprising the step of processing the input signals and the signals defining the selected physically realizable matrix to generate object data defining a model of the object in a three-dimensional space.
-
84. A method according to claim 83, further comprising the step of processing the object data to generate image data.
-
85. A method according to claim 84, further comprising the step of displaying an image of the object.
-
86. A method according to claim 84, further comprising the step of recording the image data.
-
87. A method according to claim 83, further comprising the step of transmitting a signal conveying the object data.
-
88. A method according to claim 83, further comprising the step of recording the object data.
-
90. A storage device storing instructions for causing a programmable processing apparatus to become operable to perform a method according to any one of claims 67-89 or 76.
-
91. A signal for causing a programmable processing apparatus to become operable to perform a method according to any one of claims 67-89 or 76.
-
89. A method of processing signals defining object features matched in first and second images taken from imaging positions of undefined rotation and translation relationship, so as to determine a rotation and translation relationship between the imaging positions, the method comprising:
-
(a) calculating a non-physically realizable matrix using matched features;
(b) converting the non-physically realizable matrix into a physically realizable matrix;
(c) determining an accuracy of the physically realizable matrix;
(d) repeating steps (a) to (c); and
(e) selecting the physically realizable matrix of highest determined accuracy.
-
92. An image processing apparatus for processing input signals defining at least eight pairs of features representing features matched in first and second images of an object taken from camera positions of undefined rotation and translation relationship to produce signals defining a rotation and translation relationship between the camera positions, comprising:
-
(a) a fundamental matrix calculator for calculating a fundamental matrix using at least a first seven pairs of matched features;
(b) a matrix converter for converting the calculated fundamental matrix into a physically realizable matrix; and
(c) a matrix tester for testing the calculated physically realizable matrix using a plurality of the pairs of matched features to determine an accuracy of the calculated physically realizable matrix;
the apparatus being controlled so as to cause the fundamental matrix calculator, the matrix converter and the matrix tester to repeat their operations using a different seven pairs of matched features in the fundamental matrix calculator to calculate a fundamental matrix; and
further comprising:
(d) a matrix selector for selecting a calculated physically realizable matrix in dependence upon the determined accuracy thereof. (Dependent claims: 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110)
test the calculated physically realizable matrix using each pair of matched features used by the fundamental matrix calculator to calculate the fundamental matrix; and
if the calculated physically realizable matrix is consistent with a predetermined number of the pairs of matched features used to calculate the fundamental matrix, test the physically realizable matrix against other pairs of matched features defined in the input signals.
-
95. Apparatus according to claim 94, wherein the predetermined number comprises all of the pairs of matched features used to calculate the fundamental matrix.
-
96. Apparatus according to claim 92, wherein the matrix tester is arranged to test the calculated physically realizable matrix using each pair of matched features defined in the input signals.
-
97. Apparatus according to claim 92, controlled such that the operations performed by the fundamental matrix calculator, the matrix converter and the matrix tester are repeated a number of times, the number of times being determined in accordance with the number of pairs of features defined in the input signals.
-
98. Apparatus according to claim 97, wherein the number of times the operations performed by the fundamental matrix calculator, the matrix converter and the matrix tester are repeated is a percentage of the maximum number of different combinations of the number of pairs of features used by the fundamental matrix calculator to calculate the fundamental matrix it is possible to select from the pairs of features defined in the input signals.
-
99. Apparatus according to claim 97, controlled such that the repetition of the operations by the fundamental matrix calculator, the matrix converter and the matrix tester is stopped if the matrix tester determines that an accuracy of the calculated physically realizable matrix has not increased in a given number of previous iterations.
-
100. Apparatus according to claim 92, wherein the input signals define at least eight pairs of matching features identified using a first matching technique and at least one pair of matching features identified using a second matching technique, and wherein the apparatus is controlled such that:
-
(i) the fundamental matrix calculator, the matrix converter, the matrix tester and the matrix selector are operated selecting the pairs of features to be used by the fundamental matrix calculator from those identified with the first matching technique to give a first selected physically realizable matrix;
(ii) the fundamental matrix calculator, the matrix converter, the matrix tester and the matrix selector are operated selecting the pairs of features to be used by the fundamental matrix calculator from those identified with the first matching technique and those identified with the second matching technique to give a second selected physically realizable matrix; and
(iii) the calculated physically realizable matrix of highest accuracy from the first and second selected physically realizable matrices is selected.
-
101. Apparatus according to claim 100, wherein the pairs of features matched by one of the matching techniques have been matched by a user, and the pairs of features matched by the other matching technique have been matched by an image processing apparatus.
-
102. Apparatus according to claim 92, wherein the input signals define at least eight pairs of matching features identified using a first matching technique and at least eight pairs of matching features identified using a second matching technique, and wherein the apparatus is controlled such that:
-
(i) the fundamental matrix calculator, the matrix converter, the matrix tester and the matrix selector are operated selecting pairs of features to be used by the fundamental matrix calculator from those identified with the first matching technique to give a first selected physically realizable matrix;
(ii) the fundamental matrix calculator, the matrix converter, the matrix tester and the matrix selector are operated selecting pairs of features to be used by the fundamental matrix calculator from those identified with the second matching technique to give a second selected physically realizable matrix; and
(iii) the calculated physically realizable matrix of highest accuracy from the first and second selected physically realizable matrices is selected.
-
103. Apparatus according to claim 102, wherein the pairs of features matched by one of the matching techniques have been matched by a user, and the pairs of features matched by the other matching technique have been matched by an image processing apparatus.
-
104. Apparatus according to claim 92, further comprising a converter to convert the selected physically realizable matrix into a rotation matrix and translation vector.
-
105. Apparatus according to claim 92, further comprising an image data processor to generate the input signals by processing image data defining the images of the object.
-
106. Apparatus according to claim 92, wherein the physically realizable matrix comprises a physical fundamental matrix.
-
107. Apparatus according to claim 92, wherein pairs of features comprise pairs of points.
-
108. Apparatus according to claim 92, further comprising an object data generator to generate object data defining a model of the object in a three-dimensional space by processing the input signals and the signals defining the selected physically realizable matrix.
-
109. Apparatus according to claim 108, further comprising an image data generator to generate image data by processing the object data.
-
110. Apparatus according to claim 109, further comprising a display to display an image of the object.
-
111. A method of processing first input signals defining at least two points matched in each of at least three images of an object taken from different camera positions and second input signals defining the camera positions, to produce signals defining points in a three-dimensional space representing points on the object, the method comprising the steps of:
-
(a) for each of the points matched in a first pair of the images, calculating a point having a position in the three-dimensional space using the point in one image of the pair and the point in the other image of the pair;
(b) for each of the points matched in a second pair of the images, calculating a point having a position in the three-dimensional space using the point in one image of the pair and the point in the other image of the pair;
(c) calculating a single point having a position in the three-dimensional space and associated positional error for each point matched in each of the images, each single point being calculated in dependence upon a point generated in step (a) and the point generated in step (b) from corresponding matched image points;
(d) processing the single points generated in step (c) and their associated positional errors to determine whether there are any single points which may represent a same point on the object; and
(e) processing the single points which may represent a same point on the object to give one point having a position in the three-dimensional space. (Dependent claims: 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 132, 133)
projecting a ray from the point in a first image of the pair through the position of the notional optical centre of the camera for the first image to form a first projected ray;
projecting a ray from the point in the second image of the pair through the position of the notional optical centre of the camera for the second image to form a second projected ray; and
calculating the mid-point of the line which is perpendicular to both the first and second projected rays.
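The ray-midpoint construction recited above — project a ray from each image point through its camera's notional optical centre, then take the mid-point of the line segment perpendicular to both rays — can be sketched directly. Ray directions are assumed to be given; recovering them from pixel coordinates would need the camera intrinsics, which the claim leaves unspecified:

```python
# Sketch of mid-point triangulation: the reconstructed 3-D point is the
# midpoint of the shortest segment joining two (generally skew) rays
# c1 + s*d1 and c2 + t*d2 through the optical centres c1, c2.
import numpy as np

def ray_midpoint(c1, d1, c2, d2):
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    r = c2 - c1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a * c - b * b                 # zero only for parallel rays
    s = (c * (d1 @ r) - b * (d2 @ r)) / denom
    t = (b * (d1 @ r) - a * (d2 @ r)) / denom
    p1 = c1 + s * d1                      # closest point on the first ray
    p2 = c2 + t * d2                      # closest point on the second ray
    return 0.5 * (p1 + p2)                # mid-point of the perpendicular
```

For rays that actually intersect, the two closest points coincide and the midpoint is the intersection itself.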
-
113. A method according to claim 111, wherein, in step (c), said single points in the three-dimensional space are calculated by:
-
(i) calculating a positional error for each point in the three-dimensional space calculated in at least one of steps (a) and (b);
(ii) re-positioning the points calculated in the at least one step in accordance with the positional error calculated in step (i) to give re-positioned points; and
(iii) calculating each single point in the three-dimensional space in dependence upon a re-positioned point and the point in the three-dimensional space generated in the other of steps (a) and (b) from the corresponding matched image points.
-
114. A method according to claim 113, wherein step (i) comprises:
-
calculating a difference in position between each point in the three-dimensional space generated in step (a) and the point generated in step (b) from the corresponding matched image points; and
calculating the positional error in dependence upon a plurality of the calculated differences in position.
-
115. A method according to claim 114, wherein the positional error is calculated in dependence upon all the calculated differences in position except any difference in position which exceeds a threshold.
-
116. A method according to claim 115, wherein points in the three-dimensional space have a spatial distribution, and wherein the threshold is set in dependence upon the spatial distribution within the three-dimensional space of the points calculated in steps (a) and (b).
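The positional-error computation of claims 114 and 115 — take the per-point position difference between the two reconstructions, discard any difference exceeding a threshold, and derive the error from the rest — can be sketched as follows. Taking the RMS of the surviving differences is an assumption; the claims only require the error to depend on the retained differences:

```python
# Hedged sketch of claims 114-115: positional error from the differences
# between the step-(a) and step-(b) reconstructions of the same features,
# with over-threshold differences excluded (claim 115).
import numpy as np

def positional_error(points_a, points_b, threshold):
    """points_a, points_b: Nx3 arrays, one row per matched feature."""
    diffs = np.linalg.norm(points_a - points_b, axis=1)
    kept = diffs[diffs <= threshold]      # drop outlier differences
    return float(np.sqrt(np.mean(kept ** 2)))
```

Claim 116 would then set `threshold` from the spatial spread of the reconstructed points rather than as a fixed constant.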
-
117. A method according to claim 113, wherein, in step (c), the positional error associated with said single points in the three-dimensional space is calculated in dependence upon a difference in positions within the three-dimensional space of each re-positioned point and the point generated in the other of steps (a) and (b) from the corresponding matched image points.
-
118. A method according to claim 117, wherein, in step (c), the positional error associated with said single points in the three-dimensional space is calculated as a probability distribution in three dimensions.
-
119. A method according to claim 118, wherein the probability distribution is a Gaussian distribution.
-
120. A method according to claim 118, wherein step (iii) comprises calculating each single point in the three-dimensional space by combining the probability distribution of a re-positioned point and the probability distribution, if any, of the point in the three-dimensional space generated in the other of steps (a) and (b) from the corresponding matched image points.
-
121. A method according to claim 118, wherein, in step (d), it is determined that a first of the single points represents the same point on the object as a second of the single points if the first point lies within a given distance of the second point, the given distance being dependent upon the probability distribution of the positional error of the second point.
-
122. A method according to claim 121, wherein the given distance is the Mahalanobis distance of the probability distribution of the second point.
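The gating test of claims 121 and 122 can be sketched as follows, assuming a diagonal covariance for the second point's positional error. The 3-sigma gate is an illustrative choice; the claims themselves fix the distance as the Mahalanobis distance of the distribution:

```python
import math

def mahalanobis(p, q, variances):
    """Mahalanobis distance between 3-D points p and q under a
    diagonal covariance given as per-axis variances."""
    return math.sqrt(sum((a - b) ** 2 / v
                         for a, b, v in zip(p, q, variances)))

def may_be_same_point(p, q, variances, gate=3.0):
    """Treat two single points as candidates for the same object
    point if p lies within `gate` standard deviations of q."""
    return mahalanobis(p, q, variances) <= gate
```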
-
123. A method according to claim 111, wherein, in step (e), the one point in the three-dimensional space is calculated in dependence upon the positions of all the single points which may represent the same point on the object and their associated positional errors.
-
124. A method according to claim 111, further comprising the step of processing image data defining the images of the object to generate the first input signals.
-
125. A method according to claim 111, further comprising the step of processing the first input signals to generate the second input signals.
-
126. A method according to claim 111, further comprising the step of processing the points generated in step (e) to generate image data.
-
127. A method according to claim 126, further comprising the step of displaying an image of the object.
-
128. A method according to claim 126, further comprising the step of recording the image data.
-
129. A method according to claim 111, further comprising the step of transmitting a signal defining the points generated in step (e).
-
130. A method according to claim 111, further comprising the step of making a recording of the points generated in step (e).
-
131. A method of processing first input signals defining a first plurality of points comprising a point in each of first, second and third images of an object recorded at different positions, second input signals defining a second plurality of points comprising a further point in each of the first, second and third images, and third input signals defining a relationship between the positions at which the first, second and third images were recorded, so as to define points in a three-dimensional space representing points on the object, the method comprising:
-
processing the first and third input signals to define a first point in the three-dimensional space based on the points in the first and second images and a second point in the three-dimensional space based on the points in the second and third images;
processing the second and third input signals to define a third point in the three-dimensional space based on the further points in the first and second images and a fourth point in the three-dimensional space based on the further points in the second and third images;
defining a fifth point in the three-dimensional space in dependence upon the first and second points in the three-dimensional space;
defining a sixth point in the three-dimensional space in dependence upon the third and fourth points in the three-dimensional space;
determining whether the fifth and sixth points in the three-dimensional space may represent a same point on the object, and, if so, defining a seventh point in the three-dimensional space in dependence upon the fifth and sixth points.
-
132. A storage device storing instructions for causing a programmable processing apparatus to become operable to perform a method according to any one of claims 111 to 131.
-
133. A signal for causing a programmable processing apparatus to be operable to perform a method according to any one of claims 111 to 131.
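The pairwise fusion recited in claim 131, in which two per-feature estimates are merged into fifth and sixth points that are in turn merged into a seventh only if they plausibly coincide, can be sketched as follows. Plain averaging and a fixed Euclidean gate stand in for the combination and coincidence tests, which the claim leaves unspecified:

```python
import math

def merge(p, q):
    """Combine two 3-D estimates; plain averaging stands in for the
    claim's unspecified combination rule."""
    return tuple((a + b) / 2 for a, b in zip(p, q))

def seventh_point(first, second, third, fourth, gate=0.1):
    """Fuse the two estimates for each feature, then fuse across
    features only if the fused points plausibly coincide."""
    fifth = merge(first, second)
    sixth = merge(third, fourth)
    if math.dist(fifth, sixth) <= gate:
        return merge(fifth, sixth)
    return None   # the fifth and sixth points represent distinct object points
```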
-
-
134. An image processing apparatus for processing first input signals defining at least two points matched in each of at least three images of an object taken from different camera positions and second input signals defining the camera positions, to produce signals defining points in a three-dimensional space representing points on the object, comprising:
-
(a) a first calculator for calculating, for each of the points matched in a first pair of the images, a point having a position in the three-dimensional space using the point in one image of the pair and the point in the other image of the pair;
(b) a second calculator for calculating, for each of the points matched in a second pair of the images, a point having a position in the three-dimensional space using the point in one image of the pair and the point in the other image of the pair;
(c) a third calculator for calculating a single point having a position in the three-dimensional space and associated positional error for each point matched in each of the images, each single point being calculated in dependence upon a point generated by the first calculator and the point generated by the second calculator from corresponding matched image points;
(d) a point position processor for processing the single points generated by the third calculator and their associated positional errors to determine whether there are any single points which may represent a same point on the object; and
(e) a fourth calculator for processing the single points which may represent a same point on the object to calculate one point having a position in the three-dimensional space.
-
-
135. Apparatus according to claim 134, wherein the first calculator and the second calculator are each arranged to calculate a point in the three-dimensional space by:
-
projecting a ray from the point in a first image of the pair through the position of the notional optical centre of the camera for the first image to form a first projected ray;
projecting a ray from the point in the second image of the pair through the position of the notional optical centre of the camera for the second image to form a second projected ray; and
calculating the mid-point of the line which is perpendicular to both the first and second projected rays.
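The ray construction of claim 135 reduces to classical mid-point triangulation: back-project a ray from each image point through its camera's notional optical centre, and take the mid-point of the common perpendicular between the two rays. A self-contained sketch, with camera centres and ray directions taken as given:

```python
def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate_midpoint(c1, d1, c2, d2):
    """Mid-point of the common perpendicular between two rays.

    Each ray starts at an optical centre c and has direction d
    (from the centre through the back-projected image point).
    """
    w0 = [a - b for a, b in zip(c1, c2)]
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w0), _dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # rays (nearly) parallel
        return None
    t = (b * e - c * d) / denom       # parameter of closest point on ray 1
    s = (a * e - b * d) / denom       # parameter of closest point on ray 2
    p1 = [ci + t * di for ci, di in zip(c1, d1)]
    p2 = [ci + s * di for ci, di in zip(c2, d2)]
    return [(u + v) / 2 for u, v in zip(p1, p2)]
```

With noisy matches the two rays are skew rather than intersecting, which is why the mid-point of the perpendicular, rather than an intersection, defines the 3-D point.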
-
-
136. Apparatus according to claim 134, wherein the third calculator is arranged to calculate said single points in the three-dimensional space by:
-
(i) calculating a positional error for each point in the three-dimensional space calculated by at least one of the first calculator and the second calculator;
(ii) re-positioning the points calculated by the at least one calculator in accordance with the positional error calculated in step (i) to give re-positioned points; and
(iii) calculating each single point in the three-dimensional space in dependence upon a re-positioned point and the point in the three-dimensional space generated by the other of the first calculator and the second calculator from the corresponding matched image points.
-
-
137. Apparatus according to claim 136, wherein the third calculator is arranged to perform step (i) by:
-
calculating a difference in position between each point in the three-dimensional space generated by the first calculator and the point generated by the second calculator from the corresponding matched image points; and
calculating the positional error in dependence upon a plurality of the calculated differences in position.
-
-
138. Apparatus according to claim 137, wherein the third calculator is arranged to calculate the positional error in dependence upon all the calculated differences in position except any difference in position which exceeds a threshold.
-
139. Apparatus according to claim 138, wherein points in the three-dimensional space have a spatial distribution, and wherein the apparatus is arranged to set the threshold in dependence upon the spatial distribution within the three-dimensional space of the points calculated by the first calculator and the second calculator.
-
140. Apparatus according to claim 136, wherein the third calculator is arranged to calculate the positional error associated with said single points in the three-dimensional space in dependence upon a difference in positions within the three-dimensional space of each re-positioned point and the point generated by the other of the first calculator and the second calculator from the corresponding matched image points.
-
141. Apparatus according to claim 140, wherein the third calculator is arranged to calculate the positional error associated with said single points in the three-dimensional space as a probability distribution in three dimensions.
-
142. Apparatus according to claim 141, wherein the probability distribution is a Gaussian distribution.
-
143. Apparatus according to claim 141, wherein the third calculator is arranged to perform step (iii) by calculating each single point in the three-dimensional space by combining the probability distribution of a re-positioned point and the probability distribution, if any, of the point in the three-dimensional space generated by the other of the first calculator and the second calculator from the corresponding matched image points.
-
144. Apparatus according to claim 141, wherein the point position processor is arranged to determine that a first of the single points represents the same point on the object as a second of the single points if the first point lies within a given distance of the second point, the given distance being dependent upon the probability distribution of the positional error of the second point.
-
145. Apparatus according to claim 144, wherein the given distance is the Mahalanobis distance of the probability distribution of the second point.
-
146. Apparatus according to claim 134, wherein the fourth calculator is arranged to calculate the one point in the three-dimensional space in dependence upon the positions of all the single points which may represent the same point on the object and their associated positional errors.
-
147. Apparatus according to claim 134, further comprising an image data processor to process image data defining images of the object to generate the first input signals.
-
148. Apparatus according to claim 134, further comprising a processor to process the first input signals to generate the second input signals.
-
149. Apparatus according to claim 134, further comprising an image data generator to generate image data by processing the signals defining the points calculated by the fourth calculator.
-
150. Apparatus according to claim 149, further comprising a display to display an image of the object.
-
151. A method of processing first input signals defining transformations between at least three images of an object arranged in pairs with each pair of images containing an image which is part of another pair, the first input signals defining (i) a respective transformation of a first type between the images in each of the pairs and (ii) a respective transformation of a second type between the images in each of the pairs, and second input signals defining features matched in images, to produce signals defining a transformation between all of the images, the method comprising:
-
(a) calculating for each respective combination of the transformations between the images defined in the first input signals a transformation between all images using matching features defined in the second input signals;
(b) testing the calculated transformations to determine an accuracy of each calculated transformation; and
(c) selecting one of the calculated transformations in dependence upon the accuracy thereof.
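The calculate-test-select pattern of claim 151 can be illustrated for planar (3x3 homogeneous) transformations scored by mean transfer error over the matched features; this scoring rule is an assumed example of the claimed accuracy test, not the patent's prescribed one:

```python
def transfer_error(H, src, dst):
    """Residual of the 3x3 homogeneous transform H mapping the 2-D
    point src onto dst."""
    x, y = src
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return ((u - dst[0]) ** 2 + (v - dst[1]) ** 2) ** 0.5

def select_best(candidates, matches):
    """Test every candidate transformation on the matched features
    and select the most accurate (lowest mean residual)."""
    def score(H):
        return sum(transfer_error(H, s, d) for s, d in matches) / len(matches)
    return min(candidates, key=score)
```

Because every combination of pairwise transformations is scored on the same matched features, the selection is a direct comparison of accuracies, as steps (b) and (c) require.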
-
-
153. A method of processing first input signals defining a transformation between at least three images of an object arranged in pairs with each pair of images containing an image which is part of another pair, second input signals defining a transformation of a first type between one of said at least three images and a further image, third input signals defining a transformation of a second type between said one of said at least three images and said further image, and fourth input signals defining features matched in images, to produce signals defining a transformation between all the images, the method comprising:
-
(a) calculating a transformation between all the images using the first, second and fourth input signals;
(b) calculating a transformation between all the images using the first, third and fourth input signals;
(c) testing the calculated transformations to determine an accuracy of each calculated transformation; and
(d) selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
155. A method of processing first input signals defining a transformation between at least two images of an object, second input signals defining a plurality of transformations between at least three images of the object comprising a first of said at least two images and at least two further images of the object, said first of said at least two images and said further images being arranged in pairs with each pair of images containing an image which is part of another pair, and the second input signals defining (i) a respective transformation of a first type between the images in each of the pairs and (ii) a respective transformation of a second type between the images in each of the pairs, and third input signals defining features matched in images, to produce signals defining a transformation between all of the images, the method comprising:
-
(a) calculating for each respective combination of transformations between the images defined in the first and second input signals a transformation between all images using matching features;
(b) testing the calculated transformations to determine an accuracy of each calculated transformation; and
(c) selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
157. An image processing apparatus for processing first input signals defining transformations between at least three images of an object arranged in pairs with each pair of images containing an image which is part of another pair, the first input signals defining (i) a respective transformation of a first type between the images in each of the pairs and (ii) a respective transformation of a second type between the images in each of the pairs, and second input signals defining features matched in images, to produce signals defining a transformation between all of the images, the apparatus comprising:
-
(a) a transformation calculator for calculating for each respective combination of the transformations between the images defined in the first input signals a transformation between all images using matching features defined in the second input signals;
(b) a transformation tester for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(c) a transformation selector for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
158. An image processing apparatus for processing first input signals defining a transformation between at least three images of an object arranged in pairs with each pair of images containing an image which is part of another pair, second input signals defining a transformation of a first type between one of said at least three images and a further image, third input signals defining a transformation of a second type between said one of said at least three images and said further image, and fourth input signals defining features matched in images, to produce signals defining a transformation between all the images, the apparatus comprising:
-
(a) a first transformation calculator for calculating a transformation between all the images using the first, second and fourth input signals;
(b) a second transformation calculator for calculating a transformation between all the images using the first, third and fourth input signals;
(c) a transformation tester for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(d) a transformation selector for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
159. An image processing apparatus for processing first input signals defining a transformation between at least two images of an object, second input signals defining a plurality of transformations between at least three images of the object comprising a first of said at least two images and at least two further images of the object, the said first of said at least two images and said further images being arranged in pairs with each pair of images containing an image which is part of another pair, and the second input signals defining (i) a respective transformation of a first type between the images in each of the pairs and (ii) a respective transformation of a second type between the images in each of the pairs, and third input signals defining features matched in images, to produce signals defining a transformation between all of the images, the apparatus comprising:
-
(a) a transformation calculator for calculating for each respective combination of transformations between the images defined in the first and second input signals a transformation between all images using matching features;
(b) a transformation tester for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(c) a transformation selector for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
161. An image processing apparatus for producing signals defining a transformation between three images of an object by processing first input signals defining an affine transformation between a first pair of the images, second input signals defining a perspective transformation between the first pair of the images, third input signals defining an affine transformation between a second pair of the images, one of the three images being common to the first pair and the second pair, fourth input signals defining a perspective transformation between the second pair of the images, and fifth input signals defining features matched in all three images, the image processing apparatus comprising:
-
(a) means for calculating a transformation between all three images using the first, third and fifth input signals;
(b) means for calculating a transformation between all three images using the first, fourth and fifth input signals;
(c) means for calculating a transformation between all three images using the second, third and fifth input signals;
(d) means for calculating a transformation between all three images using the second, fourth and fifth input signals;
(e) means for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(f) means for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
162. An image processing apparatus for producing signals defining a transformation between three images of an object by processing first input signals defining a transformation between a first pair of the images, second input signals defining an affine transformation between a second pair of the images, one of the images being common to the first pair and the second pair, third input signals defining a perspective transformation between the second pair of the images, and fourth input signals defining features matched in all three images of the first pair and the second pair, the image processing apparatus comprising:
-
(a) means for calculating a transformation between all three images using the first, second and fourth input signals;
(b) means for calculating a transformation between all three images using the first, third and fourth input signals;
(c) means for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(d) means for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
163. An image processing apparatus for processing first input signals defining transformations between at least three images of an object arranged in pairs with each pair of images containing an image which is part of another pair, the first input signals defining (i) a respective transformation of a first type between the images in each of the pairs and (ii) a respective transformation of a second type between the images in each of the pairs, and second input signals defining features matched in images, to produce signals defining a transformation between all of the images, the apparatus comprising:
-
(a) means for calculating for each respective combination of the transformations between the images defined in the first input signals a transformation between all images using matching features defined in the second input signals;
(b) means for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(c) means for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
164. An image processing apparatus for processing first input signals defining a transformation between at least three images of an object arranged in pairs with each pair of images containing an image which is part of another pair, second input signals defining a transformation of a first type between one of said at least three images and a further image, third input signals defining a transformation of a second type between said one of said at least three images and said further image, and fourth input signals defining features matched in images, to produce signals defining a transformation between all the images, the apparatus comprising:
-
(a) means for calculating a transformation between all the images using the first, second and fourth input signals;
(b) means for calculating a transformation between all the images using the first, third and fourth input signals;
(c) means for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(d) means for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
165. An image processing apparatus for processing first input signals defining a transformation between at least two images of an object, second input signals defining a plurality of transformations between at least three images of the object comprising a first of said at least two images and at least two further images of the object, the said first of said at least two images and said further images being arranged in pairs with each pair of images containing an image which is part of another pair, and the second input signals defining (i) a respective transformation of a first type between the images in each of the pairs and (ii) a respective transformation of a second type between the images in each of the pairs, and third input signals defining features matched in images, to produce signals defining a transformation between all of the images, the apparatus comprising:
-
(a) means for calculating for each respective combination of transformations between the images defined in the first and second input signals a transformation between all images using matching features;
(b) means for testing the calculated transformations to determine an accuracy of each calculated transformation; and
(c) means for selecting one of the calculated transformations in dependence upon the accuracy thereof.
-
-
166. An image processing apparatus for processing input signals defining a plurality of pairs of features representing features matched in first and second images of an object taken from undefined camera positions using first and second matching techniques to produce signals defining a relationship between the camera positions, comprising:
-
(a) means arranged to use pairs of features matched by the first matching technique to calculate a relationship between the camera positions;
(b) means arranged to use one of (i) pairs of features matched by the second matching technique and (ii) pairs of features matched by the first matching technique and pairs of features matched by the second matching technique to calculate a relationship between the camera positions;
(c) means for testing the calculated relationships to determine an accuracy of each calculated relationship; and
(d) means for selecting one of the calculated relationships in dependence upon the accuracy thereof.
-
-
167. An image processing apparatus for processing input signals defining at least eight pairs of features representing features matched in first and second images of an object taken from camera positions of undefined rotation and translation relationship to produce signals defining a rotation and translation relationship between the camera positions, comprising:
-
(a) means for calculating a fundamental matrix using at least a first seven pairs of matched features;
(b) means for converting the calculated fundamental matrix into a physically realizable matrix; and
(c) means for testing the calculated physically realizable matrix using a plurality of the pairs of matched features to determine an accuracy of each calculated physically realizable matrix;
the apparatus being controlled so as to cause means (a) to (c) to repeat their operations using a different seven pairs of matched features in means (a); and
further comprising:
(d) means for selecting a calculated physically realizable matrix in dependence upon the accuracy thereof.
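The accuracy test of claim 167 can be sketched by scoring each candidate physically realizable fundamental matrix by the number of matched pairs it explains. The algebraic epipolar residual used here is an assumed simplification; a geometric score such as the Sampson distance is common in practice:

```python
def epipolar_residual(F, p1, p2):
    """Algebraic epipolar residual |p2^T F p1| with the image points
    taken as homogeneous coordinates (x, y, 1)."""
    x1 = (p1[0], p1[1], 1.0)
    x2 = (p2[0], p2[1], 1.0)
    Fx1 = [sum(F[i][j] * x1[j] for j in range(3)) for i in range(3)]
    return abs(sum(x2[i] * Fx1[i] for i in range(3)))

def count_inliers(F, matches, tol=1e-3):
    """Accuracy of a candidate F: the number of matched pairs whose
    epipolar residual falls within tolerance."""
    return sum(1 for p1, p2 in matches
               if epipolar_residual(F, p1, p2) <= tol)
```

Repeating the calculation over different seven-pair subsets and keeping the matrix with the highest inlier count gives the robustness to mismatched features that the repeated operation of means (a) to (c) provides.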
-
-
168. An image processing apparatus for processing first input signals defining at least two points matched in each of at least three images of an object taken from different camera positions and second input signals defining the camera positions, to produce signals defining points in a three-dimensional space representing points on the object, comprising:
-
(a) means for calculating, for each of the points matched in a first pair of the images, a point having a position in the three-dimensional space using the point in one image of the pair and the point in the other image of the pair;
(b) means for calculating, for each of the points matched in a second pair of the images, a point having a position in the three-dimensional space using the point in one image of the pair and the point in the other image of the pair;
(c) means for calculating a single point having a position in the three-dimensional space and associated positional error for each point matched in each of the images, each single point being calculated in dependence upon a point generated by means (a) and the point generated by means (b) from corresponding matched image points;
(d) means for processing the single points generated by means (c) and their associated positional errors to determine whether there are any single points which may represent a same point on the object; and
(e) means for processing the single points which may represent a same point on the object to calculate one point having a position in the three-dimensional space.
-
-
169. Apparatus for processing signals defining first and second types of transformations between a first image and a second image and between a second image and a third image, and signals defining corresponding features in the images, so as to determine a relationship between positions at which the three images were taken, the apparatus comprising:
-
a first relationship calculator for determining a first relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the second type of transformation between the second and third images;
a second relationship calculator for determining a second relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the first type of transformation between the second and third images;
a third relationship calculator for determining a third relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the second type of transformation between the second and third images;
a fourth relationship calculator for determining a fourth relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the first type of transformation between the second and third images;
a relationship tester for testing the determined relationships to determine an accuracy of each determined relationship; and
a relationship selector for selecting a relationship in dependence upon the accuracy thereof.
-
-
170. Apparatus for processing signals defining first and second types of transformations between a first image and a second image and between a second image and a third image, and signals defining corresponding features in the images, so as to determine a relationship between positions at which the three images were taken, the apparatus comprising:
-
means for determining a first relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the second type of transformation between the second and third images;
means for determining a second relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the first type of transformation between the second and third images;
means for determining a third relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the second type of transformation between the second and third images;
means for determining a fourth relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the first type of transformation between the second and third images;
means for testing the determined relationships to determine an accuracy of each determined relationship; and
means for selecting a relationship in dependence upon the accuracy thereof.
-
-
171. Apparatus for processing signals defining (i) first and second types of transformation between a first image and a second image, (ii) a transformation between the second image and a third image, and (iii) corresponding features in the images, so as to determine a relationship between positions at which the three images were taken, the apparatus comprising:
-
a first relationship calculator for determining a first relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the transformation between the second and third images;
a second relationship calculator for determining a second relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the transformation between the second and third images;
a relationship tester for testing the determined relationships to determine an accuracy of each determined relationship; and
a relationship selector for selecting a relationship in dependence upon the accuracy thereof.
-
-
172. Apparatus for processing signals defining (i) first and second types of transformation between a first image and a second image, (ii) a transformation between the second image and a third image, and (iii) corresponding features in the images, so as to determine a relationship between positions at which the three images were taken, the apparatus comprising:
-
means for determining a first relationship between the positions at which the images were taken using the corresponding features in dependence upon the first type of transformation between the first and second images and the transformation between the second and third images;
means for determining a second relationship between the positions at which the images were taken using the corresponding features in dependence upon the second type of transformation between the first and second images and the transformation between the second and third images;
means for testing the determined relationships to determine an accuracy of each determined relationship; and
means for selecting a relationship in dependence upon the accuracy thereof.
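The relationship calculator, tester and selector recited in claims 171 and 172 can be illustrated in miniature. The sketch below is a hypothetical, simplified reading only: the candidate relationships are represented as 3×3 homographies (nested lists), the accuracy test counts matched feature pairs mapped within a pixel tolerance, and the names `apply_h`, `count_inliers` and `select_relationship` are illustrative, not taken from the specification.

```python
# Hypothetical sketch of claims 171-172: several candidate relationships
# are computed, each is scored against the matched features, and the
# most accurate candidate is selected.

def apply_h(H, pt):
    """Apply a 3x3 homography (nested lists) to a 2D point."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def count_inliers(H, matches, tol=2.0):
    """Accuracy test (the claimed 'relationship tester', simplified):
    count matched pairs the candidate maps to within tol pixels."""
    n = 0
    for src, dst in matches:
        px, py = apply_h(H, src)
        if (px - dst[0]) ** 2 + (py - dst[1]) ** 2 <= tol ** 2:
            n += 1
    return n

def select_relationship(candidates, matches):
    """Relationship selector: keep the candidate supported by the most matches."""
    return max(candidates, key=lambda H: count_inliers(H, matches))
```

In the full apparatus each candidate would be derived from a different combination of affine and perspective transformations, as in method claim 1; the selection-by-accuracy step is the same.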
-
-
173. Apparatus for processing first signals defining object features matched with a first matching technique in first and second images taken from imaging positions of undefined relationship and second signals defining object features matched in the first and second images with a second matching technique, so as to determine a positional relationship between the images, the apparatus comprising:
-
a first relationship calculator for processing the first signals to determine a first positional relationship between the images;
a second relationship calculator for processing the second signals to determine a second positional relationship between the images;
a relationship tester for testing the determined relationships to determine an accuracy of each determined relationship; and
a relationship selector for selecting a relationship in dependence upon the accuracy thereof.
-
-
174. Apparatus for processing first signals defining object features matched with a first matching technique in first and second images taken from imaging positions of undefined relationship and second signals defining object features matched in the first and second images with a second matching technique, so as to determine a positional relationship between the images, the apparatus comprising:
-
means for processing the first signals to determine a first positional relationship between the images;
means for processing the second signals to determine a second positional relationship between the images;
means for testing the determined relationships to determine an accuracy of each determined relationship; and
means for selecting a relationship in dependence upon the accuracy thereof.
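Claims 173 and 174 apply the same select-by-accuracy principle to the outputs of two different feature-matching techniques. The following is a deliberately toy sketch of that structure, assuming for illustration that each positional relationship is a simple 2D offset rather than a full epipolar-geometry estimate; the names `estimate_translation`, `residual` and `select_best` are hypothetical.

```python
# Toy sketch of claims 173-174: one positional relationship is computed
# per matching technique, each is tested against the pooled matches,
# and the most accurate relationship is selected.

def estimate_translation(matches):
    """Toy positional relationship: the mean 2D offset between matched
    features (a stand-in for a full camera-geometry estimate)."""
    n = len(matches)
    dx = sum(d[0] - s[0] for s, d in matches) / n
    dy = sum(d[1] - s[1] for s, d in matches) / n
    return (dx, dy)

def residual(t, matches):
    """Accuracy test: total misfit of a candidate offset over all matches."""
    dx, dy = t
    return sum(abs(d[0] - s[0] - dx) + abs(d[1] - s[1] - dy)
               for s, d in matches)

def select_best(match_sets):
    """One candidate per matching technique, each scored against the
    pooled matches; the relationship with the smallest misfit wins."""
    pooled = [m for ms in match_sets for m in ms]
    candidates = [estimate_translation(ms) for ms in match_sets]
    return min(candidates, key=lambda t: residual(t, pooled))
```

A matcher whose correspondences are mutually consistent thus yields a relationship that also explains most of the other matcher's output, and is selected; an unreliable matcher's candidate scores a large residual and is discarded.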
-
Specification