Invariant texture matching method for image retrieval
Abstract
A method and apparatus matches texture patterns in a way that is invariant to the intensities, scales, and orientations of the texture patterns. The texture matching apparatus includes a texture feature extractor, a feature transformer, and an image ranker. The texture feature extractor identifies regions in a plurality of stored images that contain homogeneous texture patterns that may vary in intensities, scales, and orientations, and extracts N-component texture feature vectors in these regions. The feature transformer maps the N-component feature vectors into a D-dimensional texture space that is invariant to texture intensity, scale, and orientation. The image ranker ranks the images according to the similarity between the texture patterns that they contain and the query texture measured in the D-dimensional invariant texture space, and the similarity between the N-component feature vectors of the texture patterns. The results of texture matching can be used by a content-based image retrieval system to display a list of images according to their similarity to the query texture.
24 Claims
1. An apparatus for matching a query texture with a plurality of images, each containing a plurality of textured regions, said apparatus comprising:
feature extraction means for identifying a plurality of regions in the plurality of images, each region containing a texture pattern, and extracting an N-component texture feature vector for each region, where N is an even integer greater than or equal to 2;
normalization means for normalizing the extracted N-component texture feature vectors;
arranging means for arranging each N-component feature vector into a 2-dimensional representation with spatial frequency increasing along one dimension and orientation changing in another dimension;
extraction means for extracting D texture characteristics that are invariant to texture scale and orientation from each 2-dimensional representation and forming a D-dimensional feature vector, where D, the number of said texture characteristics and the dimension of said feature vector, is an integer greater than 1;
computing means for computing the similarity between the query texture and the plurality of images according to the similarity between a D-dimensional feature vector of the query texture and the D-dimensional feature vectors of the plurality of textured regions in the plurality of images; and
image ranking means for ranking the plurality of images according to their similarity to the query texture. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
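The arranging and extraction means above can be sketched as follows. The claim does not fix a filter bank or a particular set of invariant characteristics; this sketch assumes an N = S·K Gabor-style feature vector (S scales, K orientations, scale-major ordering), so the 2-dimensional representation is an S×K grid, and uses the magnitudes of the grid's 2-D DFT as one illustrative choice of characteristics invariant to circular shifts of the grid: rotating a texture circularly shifts the orientation axis, and rescaling it (approximately) shifts the scale axis, leaving the magnitude spectrum unchanged.

```python
import numpy as np

def arrange_2d(features, n_scales, n_orientations):
    """Arrange an N-component feature vector into a 2-D grid with
    spatial frequency (scale) along rows and orientation along
    columns. Assumes N = n_scales * n_orientations and scale-major
    component ordering (an assumption; the claim does not fix it)."""
    f = np.asarray(features, dtype=float)
    if f.size != n_scales * n_orientations:
        raise ValueError("N must equal n_scales * n_orientations")
    return f.reshape(n_scales, n_orientations)

def invariant_characteristics(grid, d):
    """Form a D-dimensional vector of characteristics invariant to
    circular shifts of the grid -- one possible choice, not the
    patent's specific method: the D largest 2-D DFT magnitudes,
    which are unchanged when rows or columns are shifted circularly."""
    mags = np.abs(np.fft.fft2(grid)).ravel()
    return np.sort(mags)[::-1][:d]
```

For example, `np.roll(grid, 2, axis=1)` models the same texture rotated by two orientation steps, and both versions yield the same D-dimensional vector.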
means for storing a plurality of 2-dimensional masks;
means for computing a similarity value between each 2-dimensional mask and a local patch of the 2-dimensional representation of a feature vector; and
means for finding the largest similarity value over the entire 2-dimensional representation.
7. The apparatus of claim 6, wherein one mask identifies patch-shaped patterns in the 2-dimensional representation.
8. The apparatus of claim 6, wherein one mask identifies column-shaped patterns in the 2-dimensional representation.
9. The apparatus of claim 6, wherein one mask identifies row-shaped patterns in the 2-dimensional representation.
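Claims 6 through 9 describe matched filtering of the 2-dimensional representation against a set of small masks. A minimal sketch, assuming 3×3 binary masks and normalized cross-correlation as the similarity value (the claims fix neither the mask size nor the similarity function): a patch-shaped mask responds to an isolated peak, a column-shaped mask to energy concentrated at one orientation across scales, and a row-shaped mask to energy spread across orientations at one scale.

```python
import numpy as np

# Hypothetical 3x3 binary masks; the shapes follow claims 7-9, but
# the exact values are illustrative assumptions.
PATCH_MASK  = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], float)  # isolated peak
COLUMN_MASK = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], float)  # one orientation, all scales
ROW_MASK    = np.array([[0, 0, 0], [1, 1, 1], [0, 0, 0]], float)  # one scale, all orientations

def mask_response(rep, mask):
    """Slide `mask` over every local patch of the 2-D representation
    `rep`, compute a normalized correlation per patch, and return
    the largest value over the entire representation (claim 6)."""
    rep = np.asarray(rep, float)
    mh, mw = mask.shape
    m = mask - mask.mean()
    best = -np.inf
    for i in range(rep.shape[0] - mh + 1):
        for j in range(rep.shape[1] - mw + 1):
            patch = rep[i:i + mh, j:j + mw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * np.linalg.norm(m)
            score = float(p.ravel() @ m.ravel() / denom) if denom > 0 else 0.0
            best = max(best, score)
    return best
```

A representation whose energy lies in a single orientation column gives a response near 1 for `COLUMN_MASK` and a much weaker response for `ROW_MASK`.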
10. The apparatus of claim 1, wherein said image ranking means comprises:
means for arranging the texture patterns in a ranked list in order of similarity computed in the D-dimensional texture space;
means for computing the similarity between the N-component vector of a texture pattern and the N-component vector of P neighboring texture patterns down the ranked list where P is an integer greater than or equal to 1;
means for re-ranking neighboring texture patterns with larger similarity between the N-component vectors higher on the ranked list and patterns with smaller similarity lower on the ranked list;
means for mapping the ranked list of texture patterns into a ranked list of images by determining the images that contain the respective texture patterns of the ranked list of texture patterns; and
means for removing redundant images with smaller similarities from the ranked list of images.
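Claims 10 through 12 describe a two-stage ranking: an initial ordering by Euclidean distance in the invariant D-dimensional space, locally refined by normalized-cosine similarity between N-component vectors of neighboring patterns, then mapped to a list of images with redundant entries removed. A minimal sketch, assuming each texture pattern is a dict carrying its image id, its D-vector, and its N-vector (this data layout is an assumption), with a single forward pass as one simple reading of the re-ranking step:

```python
import numpy as np

def cosine(a, b):
    """Normalized cosine similarity between two N-component vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_images(query_d, patterns, p_window=3):
    """patterns: list of dicts with keys 'image', 'd', and 'n'.
    Returns image ids ranked by similarity to the query texture."""
    query_d = np.asarray(query_d, float)
    # Stage 1: rank patterns by Euclidean distance to the query in the
    # scale/orientation-invariant D-dimensional space (claim 11).
    ranked = sorted(
        patterns,
        key=lambda t: float(np.linalg.norm(np.asarray(t['d'], float) - query_d)))
    # Stage 2: for each pattern, re-order the next P entries by the
    # normalized cosine between N-vectors (claim 12), moving more
    # similar neighbours up the list.
    for i in range(len(ranked) - 1):
        tail = ranked[i + 1:i + 1 + p_window]
        tail.sort(key=lambda t: -cosine(t['n'], ranked[i]['n']))
        ranked[i + 1:i + 1 + p_window] = tail
    # Map patterns to images, keeping each image at its best rank and
    # dropping redundant, lower-similarity occurrences.
    seen, images = set(), []
    for t in ranked:
        if t['image'] not in seen:
            seen.add(t['image'])
            images.append(t['image'])
    return images
```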
11. The apparatus of claim 10, wherein said means for computing similarity in the D-dimensional texture space calculates Euclidean distances between D-dimensional vectors.
12. The apparatus of claim 10, wherein said means for computing similarity between N-component feature vectors calculates the normalized cosine between N-component feature vectors.
13. A method for matching a query texture with a plurality of images, each containing a plurality of textured regions, said method comprising:
identifying from the plurality of images a plurality of regions, each containing a texture pattern;
extracting an N-component texture feature vector for each region, where N is an even integer greater than or equal to 2;
normalizing the extracted N-component texture feature vectors;
arranging each N-component texture feature vector into a 2-dimensional representation with spatial frequency increasing along one dimension and orientation changing in another dimension;
extracting D texture characteristics that are invariant to texture scale and orientation from each 2-dimensional representation and forming a D-dimensional feature vector, where D, the number of said texture characteristics and the dimension of said feature vector, is an integer greater than 1;
computing the similarity between the query texture and the plurality of images according to the similarity between a D-dimensional feature vector of the query texture and the D-dimensional feature vectors of the plurality of textured regions in the plurality of images; and
ranking the plurality of images according to their similarity to the query texture. - View Dependent Claims (14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24)
retrieving a plurality of 2-dimensional masks;
computing the similarity value between a 2-dimensional mask and a local patch of a 2-dimensional representation; and
finding the largest similarity value over the entire 2-dimensional representation.
19. The method of claim 18, wherein one 2-dimensional mask identifies patch-shaped patterns in the 2-dimensional representation.
20. The method of claim 18, wherein one 2-dimensional mask identifies column-shaped patterns in the 2-dimensional representation.
21. The method of claim 18, wherein one 2-dimensional mask identifies row-shaped patterns in the 2-dimensional representation.
22. The method of claim 13, wherein said ranking step is performed by:
arranging the texture patterns in a ranked list in order of similarity computed in the D-dimensional texture space;
computing the similarity between the N-component vector of a texture pattern and the N-component vectors of P neighboring texture patterns down the ranked list, where P is an integer greater than or equal to 1;
moving neighboring texture patterns with larger similarity between the N-component vectors higher on the list and texture patterns with smaller similarity lower on the list;
mapping the ranked list of texture patterns into a ranked list of images by determining the images that contain the ranked texture patterns; and
removing redundant images with smaller similarities.
23. The method of claim 22, wherein similarities in the D-dimensional texture space are computed by calculating Euclidean distances between D-dimensional vectors.
24. The method of claim 22, wherein similarities between N-component feature vectors are computed by calculating the normalized cosine between N-component vectors.
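Claims 23 and 24 (like claims 11 and 12) name the two similarity measures: Euclidean distance for the scale- and orientation-invariant D-dimensional vectors, and the normalized cosine, i.e. the cosine of the angle between the vectors, for the N-component vectors. As a minimal sketch:

```python
import numpy as np

def euclidean_distance(a, b):
    """Dissimilarity in the D-dimensional invariant space (claims 11
    and 23): smaller distance means more similar textures."""
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

def normalized_cosine(a, b):
    """Similarity between N-component feature vectors (claims 12 and
    24): the cosine of the angle between the vectors, in [-1, 1],
    which is unaffected by overall vector magnitude."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

Because the cosine ignores magnitude, two N-component vectors that differ only by a constant intensity factor are maximally similar under it.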
Specification