Sleep refreshed memory for neural network
Abstract
A method and apparatus are disclosed for implementing a neural network having a sleep mode during which capacitively stored synaptic connectivity weights are refreshed. Each neuron outputs an analog activity level, represented in a preferred embodiment by the frequency of digital pulses. Feed-forward synaptic connection circuits couple the activity level outputs of first level neurons to inputs of second level neurons, and feed-back synaptic connection circuits couple outputs of second level neurons to inputs of first level neurons, the coupling being weighted according to connectivity weights stored on respective storage capacitors in each synaptic connection circuit. The network learns according to a learning algorithm under which the connections in both directions between a particular first level neuron and a particular second level neuron are strengthened to the extent of concurrence of high activity levels in both the first and second level neurons, and weakened to the extent of concurrence of a high activity level in the second level neuron and a low activity level in the first level neuron. The network is put to sleep by disconnecting all environmental inputs and providing a non-specific low activity level signal to each of the first level neurons. This causes the network to randomly traverse its state space with low intensity resonant firings, each state being visited with a probability responsive to the initial connectivity weights of the connections which abut the second level neuron representing such state. Refresh is accomplished since the learning algorithm remains active during sleep. Thus, the sleep refresh mechanism enhances the contrast in the connectivity terrain and strengthens connections that would otherwise wash out due to lack of visitation while the system is awake. A deep sleep mechanism is also provided for preventing runaway strengthening of favored states, and also to encourage Weber Law compliance.
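The wake/sleep dynamics described in the abstract can be sketched in simulation. The following is a minimal rate-model illustration, not the patented circuit: all names, the learning rate, the leak rate, and the winner-take-all state selection are assumptions layered onto the abstract's qualitative rule (strengthen on concurrent high activity, weaken when the second level neuron fires without the first, and keep the learning rule active during sleep).

```python
import numpy as np

rng = np.random.default_rng(0)

M, K = 4, 3                                   # first and second level counts (illustrative)
W = rng.uniform(0.2, 0.8, size=(K, M))        # capacitively stored connectivity weights w_ij

def learn(W, a_first, a_second, eta=0.05):
    # Strengthen w_ij on concurrent high activity of first level neuron i
    # and second level neuron j; weaken it when neuron j is active while
    # neuron i is not (the rule stated in the abstract).
    strengthen = np.outer(a_second, a_first)
    weaken = np.outer(a_second, 1.0 - a_first)
    return np.clip(W + eta * (strengthen - weaken), 0.0, 1.0)

def sleep_step(W, eta=0.02, noise=0.05):
    # Sleep: environmental inputs disconnected; a non-specific low drive
    # plus noise lets one state (second level neuron) fire, with more
    # strongly connected states more likely to win.
    scores = W.sum(axis=1) + noise * rng.standard_normal(K)
    a_second = np.zeros(K)
    a_second[np.argmax(scores)] = 1.0
    # Feedback firing reinstates the winner's pattern on the first level,
    # so the still-active learning rule refreshes (sharpens) its weights.
    a_first = np.clip(W.T @ a_second, 0.0, 1.0)
    return learn(W, a_first, a_second, eta=eta)

for _ in range(100):
    W = W * 0.995                             # slow capacitive leak
    W = sleep_step(W)
```

Under this toy dynamic, states the sleep traversal visits have their weight contrast restored against the leak, while the clipping and leak together keep all weights bounded.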
62 Claims
1. A neural network for use with a learning algorithm, a sleep refresh algorithm, and a plurality of environmental input signals, each of said environmental input signals having an activity level, comprising:

a plurality of N neuronal outputs, each carrying a signal having an activity level, said plurality including M first level neuronal outputs and N-M second level neuronal outputs;

feed-forward response means for generating on a j'th one of said second level neuronal outputs, M < j ≦ N, a j'th signal having an activity level responsive to the temporal and spatial sum of the activity levels of said signals on all i'th ones of said first level neuronal outputs, i = 1, . . . , M, each as weighted by a respective connectivity weight wij;

means for modifying at least one of said connectivity weights wij in accordance with said learning algorithm; and

sleep activated means for, during a sleep period, isolating said neuronal outputs from said environmental inputs and adjusting at least some of said connectivity weights wij according to said sleep refresh algorithm.
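The feed-forward response element of claim 1 can be illustrated numerically. The sketch below is one assumed discrete-time reading of the claim, with all names, the time constant, and the step size chosen for illustration: the spatial sum is the weighted sum `W @ a_first`, and the temporal sum is a leaky integration toward it.

```python
import numpy as np

def feed_forward_step(a_second, a_first, W, dt=0.1, tau=1.0):
    """One time step: each second level activity leakily integrates the
    spatial sum of first level activities weighted by w_ij."""
    return a_second + (dt / tau) * (W @ a_first - a_second)

M, K = 3, 2                               # M first level, K = N-M second level outputs
W = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.2, 0.8]])           # connectivity weights w_ij (illustrative)
a_first = np.array([1.0, 0.5, 0.0])       # first level activity levels
a_second = np.zeros(K)

for _ in range(100):
    a_second = feed_forward_step(a_second, a_first, W)
# a_second approaches the weighted spatial sum W @ a_first
```

After enough steps the temporal integration settles, so each second level activity approximates its weighted spatial sum of first level activities.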
2. A neuron circuit suitable for use in a neural network, comprising:

a summing junction, a constant voltage node and a summing capacitor coupled between said summing junction and said constant voltage node;

an inverting integrator having an input and a voltage output;

an oscillator capacitor; and

an oscillator circuit having an output coupled to said input of said inverting integrator, and further having means for transferring charge from said summing capacitor to said oscillator capacitor when the voltage across said oscillator capacitor is less than said voltage output of said inverting integrator, and for discharging said oscillator capacitor when the voltage across said oscillator capacitor exceeds said voltage output of said inverting integrator.
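A rough behavioral sketch of such a relaxation oscillator neuron follows. All parameter values and names are assumptions, and the threshold adaptation is a deliberate simplification of the inverting-integrator feedback: the pulse rate rises with input current and is damped by an integral of the neuron's own output, the relation also recited in claim 50.

```python
def simulate_neuron(i_in, steps=5000, dt=1e-3, c_osc=0.1,
                    v_ref=0.5, tau_adapt=0.2):
    """Relaxation oscillator sketch: the oscillator capacitor charges
    from the input current and fires (discharges) on crossing a
    threshold; each output pulse is integrated into the threshold, so
    the firing rate is negatively responsive to its own history."""
    v_osc, v_thr, pulses = 0.0, v_ref, 0
    for _ in range(steps):
        v_osc += i_in * dt / c_osc                    # charge oscillator capacitor
        if v_osc >= v_thr:                            # threshold crossed: one pulse
            v_osc = 0.0
            pulses += 1
            v_thr += 0.05                             # integrated output raises threshold
        v_thr += (v_ref - v_thr) * dt / tau_adapt     # threshold relaxes back
    return pulses / (steps * dt)                      # mean pulse frequency

# A larger input current yields a higher pulse rate:
f_low, f_high = simulate_neuron(0.2), simulate_neuron(0.5)
```

The analog activity level is thus carried as a pulse frequency, matching the abstract's preferred representation.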
3. A neural network for use with input stimuli, comprising:

a plurality of neuronal circuits, each of said neuronal circuits having an output responsive according to a respective connectivity value to the output of at least one other one of said neuronal circuits;

learning means for altering said connectivity values in response to said input stimuli according to a learning algorithm; and

refresh means for altering said connectivity values independently of said input stimuli according to a refresh algorithm.

- View Dependent Claims (4, 5, 6, 7, 8, 9, 10, 11)
12. A neural network for use with a learning algorithm, a sleep refresh algorithm, and a plurality of environmental input signals, each of said environmental input signals having an activity level, comprising:

a plurality of N neuronal outputs, each carrying a signal having an activity level, said plurality including M first level neuronal outputs and N-M second level neuronal outputs;

feed-forward response means for generating on a j'th one of said second level neuronal outputs, M < j ≦ N, a j'th signal having an activity level responsive to a temporal and spatial sum of the activity levels of said signals on all i'th ones of said first level neuronal outputs, i = 1, . . . , M, each as weighted by a respective connectivity weight wij;

means for modifying at least one of said connectivity weights wij in accordance with said learning algorithm; and

sleep activated means for, during a sleep period, isolating said neuronal outputs from said environmental inputs and adjusting at least some of said connectivity weights wij according to said sleep refresh algorithm.

- View Dependent Claims (13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37)
38. A neural network for use with a plurality of environmental input signals each having an activity level, comprising:

a plurality of first level neuronal circuits each having at least one input, each of said inputs of each of said first level neuronal circuits carrying an input signal having an activity level, each of said first level neuronal circuits further having an output and means for providing on said output of said first level neuronal circuit an output signal having an activity level at least partially responsive to a time integration of the sum of the activity levels of said input signals of said first level neuronal circuit;

a group of second level neuronal circuits each having at least one input, each of said inputs of each of said second level neuronal circuits carrying an input signal having an activity level, each of said second level neuronal circuits further having an output and means for providing on said output of said second level neuronal circuit an output signal having an activity level at least partially responsive to a time integration of the sum of the activity levels of said input signals of said second level neuronal circuit and at least partially counter-responsive to a time integration of the sum of the activity levels of the output signals of all other ones of said second level neuronal circuits in said group;

a plurality of feed-forward connection circuits each having an input carrying an input signal having an activity level, storage means for storing a respective connectivity strength, an output and means for providing on said output of said feed-forward connection circuit an output signal having an activity level responsive to the activity level of said input signal of said feed-forward connection circuit as weighted by said respective connectivity strength, said input of each of said feed-forward connection circuits being coupled to receive the output signal of a respective one of said first level neuronal circuits and said output of each of said feed-forward connection circuits being coupled to provide said output signal of said feed-forward connection circuit to one of said inputs of a respective one of said second level neuronal circuits in said group, the output of each of said first level neuronal circuits being coupled to an input of all of said second level neuronal circuits in said group via respective ones of said feed-forward connection circuits;

a plurality of feed-back connection circuits each having an input carrying an input signal having an activity level, storage means for storing a respective connectivity strength, an output and means for providing on said output of said feed-back connection circuit an output signal having an activity level responsive to the activity level of said input signal of said feed-back connection circuit as weighted by said respective connectivity strength of said feed-back connection circuit, said input of each of said feed-back connection circuits being coupled to receive the output signal of a respective one of said second level neuronal circuits and said output of each of said feed-back connection circuits being coupled to provide said output signal of said feed-back connection circuit to one of said inputs of a respective one of said first level neuronal circuits, the output of each of said second level neuronal circuits in said group being coupled to an input of all of said first level neuronal circuits via respective ones of said feed-back connection circuits, said means for providing, in all of said first and second level neuronal circuits and said feed-forward and feed-back connection circuits, being such that the net feedback of activity levels around any loop is non-negative; and

learning means for adjusting said connectivity strengths stored in said storage means in said feed-forward and feed-back connection circuits in accordance with a learning algorithm.

- View Dependent Claims (39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49)
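The claim 38 topology can be sketched as a small rate model. The dynamics below are an assumption, not the claimed circuit: first-order leaky integration, symmetric non-negative feed-forward/feed-back weights, a clipping nonlinearity, and all names and gains are illustrative. First level units integrate environmental input plus weighted feedback; second level units integrate weighted feed-forward drive and are counter-responsive to the outputs of the other second level units.

```python
import numpy as np

def step(a1, a2, x, Wff, Wfb, dt=0.1, inhibit=0.5):
    # First level: environmental input plus feedback from second level.
    a1 = a1 + dt * (-a1 + x + Wfb.T @ a2)
    # Second level: feed-forward drive, counter-responsive to the sum of
    # the *other* second level outputs (lateral inhibition).
    lateral = inhibit * (a2.sum() - a2)
    a2 = a2 + dt * (-a2 + Wff @ a1 - lateral)
    return np.clip(a1, 0.0, 1.0), np.clip(a2, 0.0, 1.0)

rng = np.random.default_rng(1)
M, K = 4, 3                                   # first and second level counts
Wff = 0.5 * rng.uniform(0.0, 1.0, (K, M))     # feed-forward connectivity strengths
Wfb = Wff.copy()                              # symmetric feedback: non-negative loops
x = np.array([1.0, 0.8, 0.1, 0.0])            # environmental input pattern

a1, a2 = np.zeros(M), np.zeros(K)
for _ in range(200):
    a1, a2 = step(a1, a2, x, Wff, Wfb)
```

Making the feedback weights a copy of the feed-forward weights keeps the feed-forward/feed-back loop gain non-negative, echoing the claim's loop condition; the lateral term implements the mutual counter-responsiveness within the second level group.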
50. A neuron circuit suitable for use in a neural network, comprising:

a plurality of current input lines;

an output line; and

means for providing on said output line an output signal having a frequency positively responsive to a time integration of the sum of currents of said current input lines and negatively responsive to a time integration of said output signal.

- View Dependent Claims (51)
52. A synaptic connection circuit suitable for use in a neural network, comprising:

a pre-synaptic signal input and a current output;

a storage capacitor; and

means for providing on said output of said synaptic connection circuit a current signal the time average of which is positively responsive to the voltage across said storage capacitor times the frequency of the signal on said pre-synaptic signal input.

- View Dependent Claims (53, 54, 55, 56, 57, 58)
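The claimed time-average relation reduces to a one-line formula. The function name and the gain constant below are assumptions for illustration: if each pre-synaptic pulse delivers a packet of charge proportional to the stored capacitor voltage, the average output current is proportional to that voltage times the pulse frequency.

```python
def average_synaptic_current(v_cap, f_pre, gain=1e-12):
    """Time-averaged synaptic output current (sketch): each pulse at
    frequency f_pre transfers charge gain * v_cap, so the average
    current is gain * v_cap * f_pre."""
    return gain * v_cap * f_pre

# Doubling either the stored weight voltage or the pulse rate doubles
# the average current, so the stored voltage acts as the connectivity weight.
i_base = average_synaptic_current(1.0, 100.0)
i_double = average_synaptic_current(2.0, 100.0)
```

This multiplicative form is what lets the capacitor voltage serve as the connectivity weight for a pulse-frequency activity code.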
59. A synaptic connection circuit suitable for use in a neural network, comprising:

a pre-synaptic signal input and a current output;

a storage capacitor;

a transconductance converter having an input and an output, said output of said transconductance converter being said current output of said synaptic connection circuit; and

switch means for coupling the voltage on said storage capacitor to said input of said transconductance converter on each pulse of said pre-synaptic signal input.

- View Dependent Claims (60, 61, 62)
Specification