Tree-like perceptron and a method for parallel distributed training of such perceptrons
Abstract
Constraints placed on the structure of a conventional multi-layer network enable the learning rules to be simplified and reduce the probability of reaching only local minima. These constraints include neurons that are either inhibitory or excitatory. Also, for each neuron in the hidden layer, there is at most one synapse connecting it to a corresponding neuron in the output layer. The result is a tree-like structure that facilitates implementation of large-scale electronic networks and allows parts of the network to be trained in parallel. Additionally, each neuron in the hidden layer receives a reinforcement signal from its corresponding neuron in the output layer that is independent of the magnitude of the synapses posterior to the hidden-layer neuron. There may be multiple hidden layers, each having a plurality of neurons, with each neuron in an anterior layer connecting to only one neuron in any posterior layer. In training, the weights of synapses connected anterior to a neuron are adjusted with the polarity of the error signal when the polarity determined for the neuron's path is excitatory, and with the opposite polarity when that path is inhibitory.
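Read as arithmetic, the sign-based adjustment described in the abstract can be sketched as follows. This is a minimal illustrative reading, with the learning rate, input value, and variable names assumed rather than taken from the patent:

```python
# Minimal sketch (an illustrative reading of the abstract, not the patent's
# own implementation): the reinforcement reaching a hidden neuron carries
# only the sign of the output error, and the anterior weight moves with
# that sign on an excitatory path and against it on an inhibitory path.
import numpy as np

error_sign = np.sign(0.8 - 0.3)   # desired minus actual output -> +1.0
lr, x_in = 0.1, 1.0               # assumed learning rate and presynaptic input

for path_polarity in (+1, -1):    # +1 excitatory path, -1 inhibitory path
    delta_w = lr * path_polarity * error_sign * x_in
    print(path_polarity, delta_w)
```

Note that the update uses only the error's polarity, never the magnitude of any posterior synapse, which is the independence property the abstract emphasizes.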
45 Citations
9 Claims
1. An electrical neural network, comprising:
an input layer comprising at least one neuron circuit having at least one input and at least one output;
a hidden layer comprising a plurality of neuron circuits, each neuron circuit being one of inhibitory and excitatory and having at least one input and an output;
an output layer comprising at least two neuron circuits having at least one input and an output;
a plurality of first synapses, each first synapse connecting the output of a neuron circuit in the input layer to the input of at least one neuron circuit in the hidden layer and having a connection weight;
a plurality of second synapses, each second synapse connecting the output of a neuron circuit in the hidden layer to an input of at most one neuron circuit in the output layer and having a connection weight with a magnitude and a polarity;
each neuron in the hidden layer being connected to only one corresponding neuron circuit in the output layer; and
wherein each neuron circuit in the hidden layer receives a reinforcement signal from the corresponding neuron circuit in the output layer to update the connection weight of a synapse connected between the output of a neuron circuit in the input layer and the input of the neuron circuit in the hidden layer, wherein the reinforcement signal is independent of the magnitude of the connection weight of any synapse connected between the output of the neuron circuit in the hidden layer and the input of any neuron circuit connected posterior to the neuron circuit in the hidden layer.
- View Dependent Claims (2)
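The topology of claim 1 can be captured in a few lines: the first synapses may be dense, but the second synapses form a partition, each hidden neuron feeding exactly one output neuron. This is a hypothetical sketch; the layer sizes, the `parent` assignment, and the tanh activation are assumptions for illustration, not part of the claim:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 6, 2

# First synapses: any input neuron may reach any hidden neuron (dense).
W1 = rng.normal(size=(n_hidden, n_in))

# Tree constraint: hidden neuron i connects only to output neuron parent[i].
parent = np.array([0, 0, 0, 1, 1, 1])

# Second synapses: one weight per hidden neuron, with a fixed polarity
# (+1 excitatory, -1 inhibitory) and a separate magnitude.
polarity = np.array([+1, -1, +1, +1, -1, +1])
magnitude = np.abs(rng.normal(size=n_hidden))

def forward(x):
    h = np.tanh(W1 @ x)              # hidden activations (activation assumed)
    y = np.zeros(n_out)
    for i in range(n_hidden):        # each hidden output reaches only its parent
        y[parent[i]] += polarity[i] * magnitude[i] * h[i]
    return h, y
```

Because each hidden neuron has a single posterior path, the sign of that path is simply the sign of its one second synapse, which is what allows the claimed reinforcement signal to ignore posterior weight magnitudes.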
3. An electrical neural network, comprising:
an input layer comprising at least one neuron circuit having at least one input and at least one output;
a first hidden layer comprising a plurality of neuron circuits, each neuron circuit being one of inhibitory and excitatory and having a plurality of inputs and an output;
a second hidden layer comprising a plurality of neuron circuits, each neuron circuit being one of inhibitory and excitatory and having a plurality of inputs and an output;
an output layer comprising at least two neuron circuits having a plurality of inputs and an output;
a plurality of first synapses, each first synapse connecting outputs of neuron circuits in the input layer to inputs of neuron circuits in the first hidden layer and having a connection weight;
a plurality of second synapses, each second synapse connecting the output of a neuron circuit in the first hidden layer to an input of at most one neuron circuit in the second hidden layer and having a connection weight; and
a plurality of third synapses, each third synapse connecting the output of a neuron circuit in the second hidden layer to an input of at most one neuron circuit in the output layer and having a connection weight.
- View Dependent Claims (4, 5, 6)
7. A method for training an electrical neural network which has an input layer having at least one neuron, a hidden layer having a plurality of neurons, each neuron being one of inhibitory and excitatory, an output layer having at least one neuron, a plurality of first synapses, each first synapse connecting a neuron in the input layer to a neuron in the hidden layer and having a connection weight, a plurality of second synapses, each second synapse connecting a neuron in the hidden layer to a neuron in the output layer and having a connection weight with a magnitude and a polarity, each path from any neuron in the hidden layer to any neuron in the output layer having a polarity, the method comprising the steps of:
(a) preparing a set of training data pairs, each training data pair comprising an input and a corresponding desired output;
(b) applying an input of a selected training data pair to the input layer of the neural network;
(c) obtaining an actual output from each neuron of the output layer of the neural network;
(d) comparing the actual output to the desired output of the selected training data pair to obtain an error signal having a polarity for each neuron of the output layer, wherein the error signal is otherwise independent of the magnitude of the connection weight of any of the second synapses; and
(e) for each neuron in the hidden layer, adjusting the connection weight of the synapse connected anterior to the neuron according to the error signal, and i) with a polarity opposite the polarity of the error signal when the polarity determined for the path for the neuron is inhibitory and ii) with the polarity of the error signal when the polarity determined for the path for the neuron is excitatory.
- View Dependent Claims (8, 9)
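Steps (a) through (e) can be sketched as a training loop. Everything concrete here (network sizes, the single training pair, the learning rate, the tanh activation, and the fixed second-synapse magnitudes) is an assumption for illustration, not part of the claimed method:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 3, 4, 2

W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))  # first synapses (trained)
parent = np.array([0, 0, 1, 1])             # each hidden neuron's unique output neuron
path_polarity = np.array([+1, -1, +1, -1])  # sign of each second synapse
magnitude = np.ones(n_hidden)               # second-synapse magnitudes (held fixed here)
lr = 0.05                                   # assumed learning rate

# (a) a set of training data pairs (one pair, for brevity)
pairs = [(np.array([1.0, 0.0, 1.0]), np.array([1.0, -1.0]))]

for epoch in range(200):
    for x, target in pairs:
        h = np.tanh(W1 @ x)                 # (b) apply the input
        y = np.zeros(n_out)
        np.add.at(y, parent, path_polarity * magnitude * h)  # (c) actual output
        # (d) error polarity only -- independent of second-synapse magnitudes
        err_sign = np.sign(target - y)
        for i in range(n_hidden):
            # (e) move the anterior weight with the error's polarity on an
            # excitatory path, against it on an inhibitory path
            W1[i] += lr * path_polarity[i] * err_sign[parent[i]] * x
```

Because step (e) needs only the sign of the path and the sign of the output error, the update for each output neuron's subtree depends on no other subtree, which is what lets the parts of the tree be trained in parallel.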
Specification