Neural network auto-associative memory with two rules for varying the weights
Abstract
A neural network associative memory which has a single layer of primitives and which utilizes a variant of the generalized delta rule for calculating the connection weights between the primitives. The delta rule is characterized by its use of predetermined values for the primitives and an error index which compares, during iterations, the predetermined primitive values with actual primitive values until the delta factor reaches a predetermined minimum value.
11 Claims
1. A neural network associative memory, comprising,
a single layer of processing elements having source and destination ones of said elements which are respectively referenced as (i) and (j) elements,
each of said elements having available a summation means for summing weighted inputs to said elements and a transfer function means for computing transformation of said summed weighted inputs,
an input means for and associated with individual ones of said elements for receiving patterns to be learned and patterns to be identified,
an output means for and associated with individual ones of said elements for outputting patterns generated by said individual ones of said processing elements,
a first set of unidirectional connections comprising a first set of variable value weights extending respectively from a plurality of said output means of said elements to a plurality of said summation means of other individual ones of said elements,
a second set of unidirectional connections forming a second set of variable value weights extending respectively from said output means of a plurality of said elements to said summation means of the same ones of said elements, and
means for varying the values of said first set of weights pursuant to the rule Δwij = ηpiδj and the values of said second set of weights pursuant to the rule Δwjj = ηpjδj, wherein (wij) are variable connection weights between functionally adjacent ones (i) and (j) of said elements, (η) is a constant that determines the learning rate, (wjj) are variable feedback connection weights for each of said elements, (pi) are predetermined values of said patterns to be learned and identified associated with said (i) elements, and (δj) are error signals respectively of said (j) elements. - View Dependent Claims (2, 3)
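The two update rules recited in claim 1 can be sketched in code. This is an illustrative reading of the claim, not part of the claim text; the function name, the plain-list weight representation, and the value of the learning rate η (here `eta`) are assumptions made for the example.

```python
# Illustrative sketch of claim 1's two weight-update rules.
# Symbols follow the claim: w[i][j] ~ wij (cross connections),
# w_self[j] ~ wjj (self/feedback connections), p ~ pattern values,
# delta ~ error signals, eta ~ learning-rate constant.

def update_weights(w, w_self, p, delta, eta=0.1):
    """Apply one step of the claimed rules:
       dwij = eta * p[i] * delta[j]   (first set of weights)
       dwjj = eta * p[j] * delta[j]   (second set of weights)
    """
    n = len(p)
    for i in range(n):
        for j in range(n):
            w[i][j] += eta * p[i] * delta[j]   # first set: cross connections
    for j in range(n):
        w_self[j] += eta * p[j] * delta[j]     # second set: self connections
    return w, w_self
```

Note that the only structural difference between the two rules is which pattern element multiplies the error signal: the source element's value (pi) for the cross weights, the destination element's own value (pj) for the feedback weights.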
4. A neural network associative memory, comprising:
a plurality of processing elements each having summation means for summing weighted inputs to said elements and sigmoid transfer function means for computing the transformation of said summed weighted inputs,
input means for individual ones of said elements for receiving patterns to be learned and patterns to be identified,
output means for individual ones of said elements for outputting patterns generated by said processing elements,
connection means for forming variable value weights connecting said output means of some of said elements and said summation means of others of said elements,
each of said processing elements having envelope means for providing an envelope for said sigmoid transfer function means thereof and random value output means for providing random output values within the boundary of said envelope for corresponding output values of said summation means, and
learning algorithm means activated iteratively and means for varying the values of said weights pursuant thereto at each iteration. - View Dependent Claims (5, 6)
7. A method for storing patterns in a neural network associative memory, which memory comprises:
a single layer of processing elements having source and destination ones of said elements which are respectively referenced as (i) and (j) elements,
each of said elements having available a summation means for summing weighted inputs to said elements and a transfer function means for computing transformation of said summed weighted inputs,
an input means for and associated with individual ones of said elements for receiving patterns to be learned and patterns to be identified,
an output means for and associated with individual ones of said elements for outputting patterns generated by said individual ones of said processing elements,
a first set of unidirectional connections comprising a first set of variable value weights (wij) extending respectively from said output means of each of said elements (i) to said summation means of other ones of said elements (j), and
a second set of unidirectional connections (wjj) forming variable value self weights extending respectively from said output means of said elements (j) to said summation means of the same ones of said elements (j),
said method comprising the steps of:
(a) applying a pattern to be learned to said input means,
(b) iteratively calculating changes of said weights for said first and second sets of connections in accordance with the rules Δwij = ηpiδj and Δwjj = ηpjδj, wherein (η) is a constant that determines the learning rate, (pi) and (pj) are predetermined values of said patterns being learned and identified, and (δj) are error terms, and
(c) continuing step (b) until said weights are stabilized, and then storing said patterns. - View Dependent Claims (8)
9. A processing element assembly for use in a neural network having a plurality of such assemblies and wherein each two processing elements of functionally adjacent source and destination ones of said assemblies may be considered a pair and are referenced, respectively, as processing elements (i) and (j) of such pair,
said assembly comprising,
a processing element (j) having available a summation section for summing the values of weighted inputs and a transfer section for computing a transfer function for said summed weighted inputs,
fan-in connection means for said summation section comprising externally connectable lines connected to said summation section,
output means for said transfer section having fan-out connection means with multiple output lines for connection from said transfer section,
a plurality of adjustable weight means associated respectively with said fan-in connection lines,
a weight adjusting learning algorithm means for adjusting said weight means having associated memory means for storage of patterns,
pattern input means for inputting a pattern to said memory means,
circuit means so constructed and assembled for providing an initializing mode such that (1) a pattern element (pj) placed on said pattern input means is directed to said memory means and to said multiple lines of said fan-out connection means by such circuit means, and (2) pattern elements (pi) from source ones of said processing elements (i) on said fan-in externally connectable lines are directed to said memory means and to said summation section via said weight means by said circuit means, and wherein,
said circuit means is also so constructed and assembled for providing a learning mode wherein (1) an output (oj) of said transfer section output means is directed by said circuit means to said memory means and to said multiple lines of said fan-out connection means and (2) outputs (oi) from source ones of said processing elements (i) on said fan-in externally connectable lines are directed by said circuit means to said memory means and to said summation section via said weight means thereof,
said learning rule comprising the form Δwij = ηpiδj, wherein (wij) values are a representational weighting value of said adjustable weight means between functionally adjacent ones of said source and destination processing elements, (δj) are calculated error signals equal to (pj − oj)o′j, and (η) is a constant that determines the learning rate. - View Dependent Claims (10, 11)
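Claim 9 gives a concrete form for the error signal, δj = (pj − oj)o′j, where o′j is the derivative of the transfer function's output. The sketch below assumes a logistic sigmoid as the transfer function, which is consistent with claim 4's "sigmoid transfer function means" but is not fixed by claim 9 itself; the function names are illustrative.

```python
# Illustrative computation of claim 9's error signal:
#   delta_j = (p_j - o_j) * o'_j
# assuming a logistic sigmoid, whose derivative is o_j * (1 - o_j).
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def error_signal(p_j, net_j):
    o_j = sigmoid(net_j)              # output of the transfer section
    o_prime = o_j * (1.0 - o_j)       # derivative of the logistic sigmoid
    return (p_j - o_j) * o_prime      # delta_j = (pj - oj) * o'j
```

The derivative factor scales the raw error (pj − oj) by how sensitive the output is to its net input, so errors near the sigmoid's saturated extremes contribute smaller weight changes than errors near its midpoint.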
Specification