Automatic feature detection and side scan sonar overlap navigation via sonar image matching
Abstract
A method of locating objects in sonar images so that matching algorithms can be used to determine if the same objects appear in two images. Information from the matching algorithms can then be used to update the position of the vehicle and objects recorded in the vehicle's navigation system. Images are improved by normalizing brightness and contrast and then convolving the image with at least one filter. The image is normalized by dividing the image into vertical strips, creating a histogram to represent the grayness values for columns of pixels in each strip, and applying an algorithm to each strip to change its grayness value.
19 Claims
-
1. A method for locating objects in sonar images and other images represented by a matrix of pixels, each having a shade of gray within a known grayness range, taken by a vehicle of a known vehicle height scanning across a scan width of known size, comprised of the steps of:
a) normalizing brightness and contrast in the sonar image, comprising the steps of:
i) dividing the image into vertical strips, each strip having a width w, wherein the width w of each strip is expressed for each strip in terms of ##EQU5## wherein i is an integer from 0 to n, and ##EQU6##
ii) creating a histogram for each strip, the histogram being a representation of grayness values for all columns of pixels within the strip;
iii) determining a sample mean μ_k and a standard deviation σ_k for each histogram created for each strip;
iv) computing a new mean M_k for each strip by solving the equation M_k = σ_k / 0.5227;
v) computing a difference Δμ_k between the new mean M_k and the sample mean μ_k for each strip so that Δμ_k = M_k − μ_k for k = 1 to n;
vi) choosing a Y_k for each strip such that Y_k = (Y_{k−1} + Y_{k+1})/2 for k = 1 to n, and Y_0 = Y_1; Y_{n+1} = Y_n; m_0 = m_1; m_{n+1} = m_n; Δμ_0 = Δμ_1; and Δμ_{n+1} = Δμ_n;
vii) determining a grayness value for each pixel in each strip, adding Δμ_k to each grayness value and then dividing that sum by m_k;
b) convolving the image with at least one low pass filter; and
c) displaying the filtered image. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
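The per-strip normalization of steps ii) through vii) can be sketched in code. This is a minimal illustration, not the patented implementation: the strip-width formulas ##EQU5##/##EQU6## are not reproduced (fixed-width strips are used instead), the divisors m_k are left as a caller-supplied parameter because the claim does not define them, and the neighbor-averaging of step vi) is applied to Δμ_k as an assumption (the claim states it for Y_k).

```python
import numpy as np

def normalize_strips(image, n_strips, m=None):
    """Sketch of the per-strip normalization of claim 1, steps ii)-vii).

    image: 2-D array of grayness values.
    n_strips: number of vertical strips (fixed width here; the patent's
              ##EQU5##/##EQU6## width formula is not reproduced).
    m: optional per-strip divisors m_k; undefined in the claim, so they
       default to 1.0 (a labeled assumption).
    """
    h, w = image.shape
    edges = np.linspace(0, w, n_strips + 1).astype(int)
    if m is None:
        m = np.ones(n_strips)

    # Steps iii)-v): per-strip sample mean and standard deviation, the
    # new mean M_k = sigma_k / 0.5227, and the difference dmu_k = M_k - mu_k.
    mu = np.empty(n_strips)
    sigma = np.empty(n_strips)
    for k in range(n_strips):
        strip = image[:, edges[k]:edges[k + 1]]
        mu[k] = strip.mean()
        sigma[k] = strip.std()
    M = sigma / 0.5227
    dmu = M - mu

    # Step vi) (interpreted): average each value with its neighbors,
    # duplicating the end values (dmu_0 = dmu_1, dmu_{n+1} = dmu_n).
    padded = np.concatenate(([dmu[0]], dmu, [dmu[-1]]))
    dmu_s = (padded[:-2] + padded[2:]) / 2.0

    # Step vii): (grayness + dmu_k) / m_k for every pixel of strip k.
    out = image.astype(float)
    for k in range(n_strips):
        out[:, edges[k]:edges[k + 1]] = (out[:, edges[k]:edges[k + 1]] + dmu_s[k]) / m[k]
    return out
```

On a strip with no variation the sketch drives the output toward zero (σ_k = 0 gives M_k = 0, so Δμ_k = −μ_k), which shows the direction of the correction rather than a calibrated result.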
-
9. The method of claim 1 also comprising the step of comparing the image after filtering with a second image to determine if the image matches the second image.
-
10. The method of claim 9 wherein the comparison is done visually.
-
11. The method of claim 9 wherein the comparison is done by computer using a matching algorithm.
-
12. A method for matching two images, each comprised of points which each have a value corresponding to a grayness level, comprising the steps of:
a) creating a first matrix {P} containing values p_i corresponding to a grayness level of all points in a first image;
b) creating a second matrix {Q} containing values q_j corresponding to a grayness level of all points in a second image;
c) creating a matrix {P_i, Q_j} of pairs of points from the first image and the second image;
d) assigning each pair of points in matrix {P_i, Q_j} a score determined by combining the grayness levels of the points, thereby creating a set of scores {S_k};
e) selecting and ranking those pairs of points having a score greater than a predetermined value to create a set of selected pairs {P_h, Q_k};
f) choosing a selected pair of points {p_1, q_1} as a centroid;
g) finding another pair of points {p_2, q_2} such that a distance between p_1 and p_2 is a minimum of all distances between points p_i in the set of selected pairs {P_h, Q_k};
h) determining a distance d between points q_1 and q_2;
i) calculating a value x from the equation x = d²/r², wherein r is a preselected factor;
j) determining a Δs from the equation ##EQU7##
k) adding Δs to the score corresponding to the selected pair of points (p_i, q_i);
l) choosing additional selected pairs of points;
m) repeating steps h through k for each additional selected pair of points, thereby creating a set of new scores {S_m} for the chosen pairs of points;
n) selecting those scores from set {S_m} which are greater than a predetermined threshold and declaring a match among the pairs of points which correspond to the selected scores;
o) creating a new set of {P_i, Q_m} pairs of points containing those pairs of points selected in step n;
p) selecting from set {P_i, Q_m} those pairs of points having a common translation to form a new set;
q) using the common translation to match other points in the first image with points in the second image; and
r) displaying the first image and the second image in a manner to allow a reader to see how the images match. - View Dependent Claims (13, 14, 15, 16, 17)
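The score-relaxation step of claim 12 (steps h through k) can be sketched for a single neighboring pair. The Δs formula itself is the patent's ##EQU7##, which is not reproduced in this text, so it is left as a caller-supplied function; only the geometry (distance d and x = d²/r²) comes from the claim.

```python
import math

def relax_score(score, q1, q2, r, delta_s):
    """Sketch of claim 12, steps h)-k), for one neighboring pair.

    score: the current score of the selected pair of points.
    q1, q2: (x, y) points in the second image.
    r: the preselected factor of step i).
    delta_s: callable mapping x = d^2/r^2 to a score increment -- the
             actual formula is the patent's ##EQU7##, not reproduced
             here, so it is a parameter of this sketch.
    """
    # Step h): distance d between points q1 and q2.
    d = math.dist(q1, q2)
    # Step i): x = d^2 / r^2.
    x = d * d / (r * r)
    # Steps j)-k): add the increment delta_s(x) to the pair's score.
    return score + delta_s(x)
```

For example, with q1 = (0, 0), q2 = (3, 4), and r = 5, the sketch computes d = 5 and x = 1 before applying whatever increment function is supplied; a decreasing delta_s rewards geometrically consistent pairs, which is the intuition behind the relaxation.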
-
-
18. A method of side-scan navigation to determine position and course of a vehicle having a navigation system, comprising the steps of:
a) making at least two passes during which sonar is operated to create a first sonar image and a second sonar image;
b) creating a first matrix {P} containing values p_i corresponding to a grayness level of all points in a first image;
c) creating a second matrix {Q} containing values q_j corresponding to a grayness level of all points in a second image;
d) creating a matrix {P_i, Q_j} of pairs of points from the first image and the second image;
e) selecting and ranking those pairs of points having a score greater than a predetermined value to create a set of selected pairs {P_h, Q_k};
f) choosing a selected pair of points {p_1, q_1} as a centroid;
g) finding another pair of points {p_2, q_2} such that a distance between p_1 and p_2 is a minimum of all distances between points p_i in the set of selected pairs {P_h, Q_k};
h) determining a distance d between points q_1 and q_2;
i) calculating a value x from the equation x = d²/r², wherein r is a preselected factor;
j) determining a Δs from the equation ##EQU9##
k) adding Δs to the score corresponding to the selected pair of points (p_i, q_i);
l) choosing additional selected pairs of points;
m) repeating steps h through k for each additional selected pair of points, thereby creating a set of new scores {S_m} for the chosen pairs of points;
n) selecting those scores from set {S_m} which are greater than a predetermined threshold and declaring a match among the pairs of points which correspond to the selected scores;
o) creating a new set of {P_n, Q_m} pairs of points containing those pairs of points selected in step n;
p) selecting from set {P_n, Q_m} those pairs of points having a common translation to form a new set;
q) using the common translation to match other points in the first image with points in the second image; and
r) updating positions of the vehicle and objects in the navigation system of the vehicle from the translation. - View Dependent Claims (19)
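Steps p) and q) of claim 18 select the pairs sharing a common translation. The claim does not say how that translation is found; one simple sketch, under the assumption that nearly equal shifts should be grouped by quantization and the most populous group taken, is:

```python
from collections import Counter

def common_translation(pairs, tol=1.0):
    """Sketch of claim 18, steps p)-q): find the translation shared by
    the most matched pairs.

    pairs: list of ((px, py), (qx, qy)) matched point pairs.
    tol: a hypothetical quantization step (not from the patent) used to
         group nearly equal shifts into one bin.
    """
    # Quantize each pair's displacement q - p so nearly equal translations
    # fall in the same bin, then take the most common bin as the shift.
    bins = Counter(
        (round((qx - px) / tol), round((qy - py) / tol))
        for (px, py), (qx, qy) in pairs
    )
    (bx, by), _ = bins.most_common(1)[0]
    return bx * tol, by * tol
```

The winning translation can then be applied to every point of the first image to predict its mate in the second image (step q) and, per step r), to correct the vehicle and object positions held in the navigation system.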
-
Specification