Trainable neural network having short-term memory for altering input layer topology during training
Abstract
A neural net in which new nodes and connections are created in both input and intermediate layers during training, which is by punishment, reward and teaching. This can use a small increase in memory requirement to preclude the necessity for long training times in problems applicable to speech and natural language processing, video recognition and simple logic functions.
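The training-by-reward-and-punishment idea in the abstract can be sketched as a toy update rule for a single weighted connection. Everything below (the function name, the 0..1 weight range, the 0.5 firing threshold) is an illustrative assumption, not the patent's implementation:

```python
# Toy reward/punishment update for a single weighted connection.
# The names, the 0..1 weight range, and the 0.5 threshold are assumptions
# made for illustration; the patent does not specify this rule.

def train_step(weight, node_output, target, rate=0.2):
    """Nudge the weight toward producing the target net output."""
    net_output = 1 if weight * node_output > 0.5 else 0
    error = target - net_output          # 0 when correct (reward: no change)
    # punishment: move the weight in the direction that corrects the error
    return min(1.0, max(0.0, weight + rate * error * node_output))

w = 0.4
for _ in range(5):
    w = train_step(w, node_output=1, target=1)
# the weight has been pushed above the firing threshold
```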
29 Citations
17 Claims
1. A trainable artificial neural network comprising:
- input layer means for inputting sets of data items and generating node outputs, each node output corresponding to an occurrence anywhere in a set of a respective predetermined data item;
- output layer means for responding to the generation of said node outputs by producing at least one net output, each net output depending upon a plurality of such node outputs weighted by corresponding weight values;
- means for storing given data items in said set, each given data item stored not corresponding to any said respective predetermined data item; and
- means for causing said input layer means to thereafter generate node outputs corresponding to occurrences in a said set of respective said given data items.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)
15. A method of training an artificial neural network having input layer means for inputting sets of data items and generating node outputs, each node output corresponding to an occurrence anywhere in a set of a respective predetermined data item, and output layer means for responding to the generation of said node outputs by producing at least one net output, each net output depending upon a plurality of such node outputs weighted by corresponding weight values, the method comprising the steps of:
- inputting training sets of data to the input layer means;
- storing given data items in a training set, each stored given data item not corresponding to any said respective predetermined data item; and
- detecting said net output and, in the event that a predetermined criterion of success is not met, causing said input layer means to thereafter generate node outputs corresponding to occurrences in said sets of data items of at least one said given data item.
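The training loop of claim 15 can be sketched as an input layer that fires only for its predetermined data items, stores unrecognized items, and, when a success criterion is not met, creates new input nodes for them. The class and method names below are illustrative assumptions, not the patent's terminology:

```python
# Hypothetical sketch of claim 15's training step: unrecognized items in a
# training set are stored, and if the net's output fails a success
# criterion, new input nodes are created so those items fire thereafter.

class GrowingInputLayer:
    def __init__(self, vocabulary):
        self.nodes = set(vocabulary)   # one node per predetermined data item
        self.stored = set()            # short-term memory of given data items

    def activate(self, data_set):
        """Return node outputs for known items; store the unknown ones."""
        self.stored |= set(data_set) - self.nodes
        return {item for item in data_set if item in self.nodes}

    def grow(self):
        """Create nodes so stored items generate node outputs thereafter."""
        self.nodes |= self.stored
        self.stored.clear()

layer = GrowingInputLayer({"cat", "dog"})
first = layer.activate({"cat", "fish"})    # only "cat" is recognized
if len(first) < 2:                         # toy success criterion not met
    layer.grow()                           # "fish" now has an input node
second = layer.activate({"cat", "fish"})   # both items fire
```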
16. A method of training a neural network having network inputs and network outputs defined by a plurality of weight values, said method comprising the steps of:
- inputting training data and adjusting said weight values as a function of network outputs, an amount by which a given weight value is adjusted depending upon said given weight value such that the amount is less for both high weight values and low weight values than for weight values intermediate said high and low weight values, whereby after frequent adjustments in a first direction, adjustment in an opposite second direction has little effect and, after frequent adjustments in said opposite second direction, adjustment in said first direction has little effect.
- View Dependent Claims (17)
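The weight-dependent adjustment of claim 16 can be sketched with a w·(1−w) plasticity factor over a 0..1 weight range, which makes the step size small near both extremes and largest for intermediate values. This particular factor, the range, and the names are assumptions for illustration only:

```python
# Illustrative sketch (not the patent's rule): the magnitude of each weight
# adjustment depends on the current weight, being small near both extremes
# (0.0 and 1.0 here) and largest for intermediate values, so a weight driven
# to one extreme resists adjustment in the opposite direction.

def adjust_weight(w, direction, rate=0.1):
    """Move w up (direction=+1) or down (-1); step is largest near w = 0.5."""
    plasticity = w * (1.0 - w)      # ~0 at the extremes, 0.25 at w = 0.5
    return min(1.0, max(0.0, w + direction * rate * plasticity))

w = 0.5
for _ in range(50):                 # frequent adjustments in one direction
    w = adjust_weight(w, +1)
# w is now near 1.0, so adjustments in the opposite direction barely move it
w_after = adjust_weight(adjust_weight(w, -1), -1)
```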