Pattern recognition with hierarchical networks
Abstract
Within the framework of hierarchical neural feed-forward architectures for real-world 3D invariant object recognition, a technique is proposed that shares components such as weight sharing (2) and pooling stages (3, 5) with earlier approaches, but focuses on new methods for determining optimal feature-detecting units in the intermediate stages (4) of the hierarchical network. A new approach for training the hierarchical network is proposed which uses statistical means for (incrementally) learning new feature-detection stages and significantly reduces the training effort for complex pattern recognition tasks compared to the prior art. The incremental learning is based on detecting increasingly statistically independent features in higher stages of the processing hierarchy. Since this learning is unsupervised, no teacher signal is necessary and the recognition architecture can be pre-structured for a certain recognition scenario. Only the final classification step must be trained with supervised learning, which significantly reduces the effort of adapting the network to a recognition task.
Due to the improved learning efficiency, not only two-dimensional objects but also three-dimensional objects with variations in three-dimensional rotation, size and lighting conditions can be recognized. A further advantage is that this learning method is viable for arbitrary nonlinearities between the stages of the hierarchical convolutional network, e.g. non-differentiable Winner-Take-All nonlinearities. In contrast, the technology according to the above-mentioned prior art can perform backpropagation learning only for differentiable nonlinearities, which places certain restrictions on the network design.
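The staged pipeline described in the abstract (convolution with fixed, weight-shared detectors, a per-map nonlinearity, then spatial pooling) can be sketched as follows. This is a minimal illustrative NumPy sketch, not the patent's implementation; the detector values, image size and tanh nonlinearity are assumptions chosen for readability.

```python
import numpy as np

def convolve2d_valid(image, kernel):
    # Naive 'valid' 2-D convolution: slide the local window over the image.
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def feature_maps(image, detectors, nonlinearity=np.tanh):
    # Stage 1: convolve each fixed detector (weight sharing) with the input,
    # then apply the nonlinearity to each feature map separately.
    return [nonlinearity(convolve2d_valid(image, k)) for k in detectors]

def pool(fmap, size=2):
    # Pooling stage: spatial max pooling for local shift invariance.
    H, W = fmap.shape
    H2, W2 = H // size, W // size
    return fmap[:H2 * size, :W2 * size].reshape(H2, size, W2, size).max(axis=(1, 3))

# Toy 6x6 input and two fixed 3x3 edge detectors (illustrative values only).
img = np.arange(36, dtype=float).reshape(6, 6) / 36.0
detectors = [
    np.array([[1, 0, -1]] * 3, dtype=float),                     # vertical edges
    np.array([[1, 1, 1], [0, 0, 0], [-1, -1, -1]], dtype=float), # horizontal edges
]
maps = feature_maps(img, detectors)
pooled = [pool(m) for m in maps]
print([p.shape for p in pooled])  # each 4x4 feature map pooled to 2x2
```

A final supervised classifier would then operate on the pooled responses; only that last stage needs labeled training data.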
64 Citations
32 Claims
1. A method for recognizing a pattern having a set of features, the method comprising the following steps:
- convolving a plurality of fixed feature detectors with a local window scanned over a representation of the pattern to generate a plurality of feature maps;
- applying a nonlinearity function to each feature map separately;
- sensing local combinations of at least one simple feature of the feature maps; and
- recognizing the pattern by classifying it on the basis of the sensed local combinations, wherein essentially statistically independent features are pre-set for the local combination of features.
(Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 12)
10. A method for recognizing a pattern having a set of features, the method comprising the following steps:
- convolving a plurality of fixed feature detectors with a local window scanned over a representation of the pattern to generate a plurality of feature maps;
- applying a nonlinearity function to each feature map separately;
- sensing local combinations of the simple features of the feature maps; and
- recognizing the pattern by classifying it on the basis of the sensed local combinations;
wherein a winner-takes-all strategy is applied to the result of the convolution to generate the feature maps.
(Dependent claims: 13)
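The winner-takes-all strategy of claim 10 can be sketched as follows: at every spatial position, only the detector with the strongest convolution response keeps its value, and all competing detectors are suppressed. This is a hypothetical NumPy sketch with made-up response values; the patent does not prescribe this exact formulation.

```python
import numpy as np

def wta_maps(maps):
    # Apply winner-takes-all across detectors at every spatial position.
    # This nonlinearity is non-differentiable (argmax), which is why
    # backpropagation-based training cannot be used with it.
    stack = np.stack(maps)                    # shape (num_detectors, H, W)
    winners = np.argmax(stack, axis=0)        # (H, W): index of winning detector
    mask = np.arange(stack.shape[0])[:, None, None] == winners
    return stack * mask                       # losers are zeroed out

# Two toy 2x2 convolution responses (illustrative values only).
maps = [
    np.array([[0.2, 0.9], [0.4, 0.1]]),
    np.array([[0.5, 0.3], [0.4, 0.8]]),
]
out = wta_maps(maps)
print(out[0])  # detector 0 wins only where its response is strongest
```

Note that on exact ties `np.argmax` favors the lower detector index; a real system might break ties differently.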
11. A system for training a hierarchical network, the hierarchical network comprising:
- means for convolving a plurality of fixed feature detectors with a local window scanned over a representation of the pattern to generate a plurality of feature maps;
- means for applying a nonlinearity function to each feature map separately;
- intermediate means for sensing local combinations of simple features of the feature maps; and
- means for recognizing the pattern by classifying it on the basis of the sensed local combinations;
wherein said intermediate means are incrementally trained to enhance the statistical independence of the local combinations of features.
(Dependent claims: 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32)
14. A pattern recognition apparatus with a hierarchical network, the hierarchical network comprising:
- means for inputting a representation of a pattern;
- means for convolving a plurality of fixed feature detectors with a local window scanned over the representation of the pattern to generate a plurality of feature maps;
- means for applying a nonlinearity function to each feature map separately;
- intermediate means for sensing local combinations of simple features of the feature maps; and
- means for recognizing the pattern by classifying it on the basis of the sensed local combinations;
wherein said intermediate means utilize pre-set essentially statistically independent features.
25. A pattern recognition apparatus with a hierarchical network, the hierarchical network comprising:
- means for inputting a representation of a pattern;
- means for convolving a plurality of fixed feature detectors with a local window scanned over the representation of the pattern to generate a plurality of feature maps;
- means for applying a nonlinearity function to each feature map separately;
- intermediate means for sensing local combinations of simple features of the feature maps; and
- means for recognizing the pattern by classifying it on the basis of the sensed local combinations;
wherein the convolving means use a winner-takes-all strategy to generate the feature maps.
Specification