Unsupervised, supervised and reinforced learning via spiking computation
First Claim
1. A method for feature extraction, comprising:
- at a neural network comprising a plurality of neural modules interconnected via a plurality of weighted synaptic connections;
receiving a first input;
extracting one or more input features from the first input as the first input propagates through the neural network via at least one of the weighted synaptic connections;
training the neural network to associate the one or more input features with the first input by applying a learning rule to at least one of the weighted synaptic connections to strengthen conductance of the at least one weighted synaptic connection; and
after the training, in response to receiving a second input comprising the first input with noise, reproducing the first input without noise via the trained neural network;
wherein the training comprises unlearning network-created information including false positives by copying firing events generated by a neural module positioned along a learning, top-down pathway of the neural network, and providing the copied firing events as input to another neural module positioned along an unlearning, bottom-up pathway of the neural network; and
wherein each neural module comprises an electronic circuit including multiple neurons.
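The claim's training step pairs a strengthening rule on a learning, top-down pathway with an unlearning rule on a bottom-up pathway that is fed copied firing events. A minimal sketch of that mechanic is below; it is a hypothetical illustration, not the patented implementation — the Hebbian-style co-firing update, the learning rate `lr`, and the `[0, 1]` conductance clipping are all assumptions, and each edge is modeled as one weight per corresponding neuron pair as described in the abstract.

```python
import numpy as np

def learn(weights, pre_spikes, post_spikes, lr=0.05):
    """Strengthen conductance of connections whose pre- and post-synaptic
    neurons fire together (learning, top-down pathway)."""
    return np.minimum(weights + lr * pre_spikes * post_spikes, 1.0)

def unlearn(weights, copied_spikes, post_spikes, lr=0.05):
    """Weaken connections driven by firing events copied from the top-down
    pathway (unlearning, bottom-up pathway), discouraging network-created
    false positives."""
    return np.maximum(weights - lr * copied_spikes * post_spikes, 0.0)

# One weight per corresponding neuron pair.
w = np.full(4, 0.5)
pre = np.array([1, 0, 1, 0])   # firing events on the top-down pathway
post = np.array([1, 1, 0, 0])  # firing events in the downstream module

w_learned = learn(w, pre, post)        # only the co-firing pair (index 0) strengthens
copied = pre.copy()                    # firing events copied to the bottom-up pathway
w_unlearned = unlearn(w, copied, post) # the same pair is weakened on the unlearning pass
```

Only the connection whose neurons fire on both sides moves; the copy of the top-down firing events is what lets the bottom-up pathway target exactly the activity the network itself generated.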
Abstract
The present invention relates to unsupervised, supervised and reinforced learning via spiking computation. The neural network comprises a plurality of neural modules. Each neural module comprises multiple digital neurons such that each neuron in a neural module has a corresponding neuron in another neural module. An interconnection network comprising a plurality of edges interconnects the plurality of neural modules. Each edge interconnects a first neural module to a second neural module, and each edge comprises a weighted synaptic connection between every neuron in the first neural module and a corresponding neuron in the second neural module.
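The abstract's structure — modules of index-matched neurons, and an edge carrying one weighted synaptic connection per corresponding neuron pair — can be sketched as follows. This is a rough sketch under stated assumptions: the class names, the uniform weight initialization, and the binary spike representation are illustrative choices, not drawn from the patent.

```python
import numpy as np

class NeuralModule:
    """A neural module: a group of neurons identified by index, so that
    neuron i here corresponds to neuron i in another module."""
    def __init__(self, n_neurons):
        self.spikes = np.zeros(n_neurons, dtype=int)

class Edge:
    """An edge between two modules: one weighted synaptic connection per
    corresponding neuron pair, so the weights form a vector, not a matrix."""
    def __init__(self, src, dst, rng=None):
        assert len(src.spikes) == len(dst.spikes)
        rng = rng or np.random.default_rng(0)
        self.src, self.dst = src, dst
        self.weights = rng.uniform(0.0, 1.0, len(src.spikes))

    def propagate(self):
        # Weighted input delivered to each corresponding neuron in dst.
        return self.weights * self.src.spikes

a, b = NeuralModule(4), NeuralModule(4)
edge = Edge(a, b)
a.spikes = np.array([1, 0, 1, 1])
drive = edge.propagate()  # zero wherever the source neuron did not fire
```

The one-to-one pairing is why an element-wise product suffices here, in contrast to the dense weight matrix of a fully connected layer.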
34 Citations
12 Claims
1. A method for feature extraction, comprising:
- at a neural network comprising a plurality of neural modules interconnected via a plurality of weighted synaptic connections;
receiving a first input;
extracting one or more input features from the first input as the first input propagates through the neural network via at least one of the weighted synaptic connections;
training the neural network to associate the one or more input features with the first input by applying a learning rule to at least one of the weighted synaptic connections to strengthen conductance of the at least one weighted synaptic connection; and
after the training, in response to receiving a second input comprising the first input with noise, reproducing the first input without noise via the trained neural network;
wherein the training comprises unlearning network-created information including false positives by copying firing events generated by a neural module positioned along a learning, top-down pathway of the neural network, and providing the copied firing events as input to another neural module positioned along an unlearning, bottom-up pathway of the neural network; and
wherein each neural module comprises an electronic circuit including multiple neurons.
- View Dependent Claims (2, 3, 4)
5. A system comprising a computer processor, a computer-readable hardware storage medium, and program code embodied with the computer-readable hardware storage medium for execution by the computer processor to implement a method comprising:
- at a neural network comprising a plurality of neural modules interconnected via a plurality of weighted synaptic connections;
receiving a first input;
extracting one or more input features from the first input as the first input propagates through the neural network via at least one of the weighted synaptic connections;
training the neural network to associate the one or more input features with the first input by applying a learning rule to at least one of the weighted synaptic connections to strengthen conductance of the at least one weighted synaptic connection; and
after the training, in response to receiving a second input comprising the first input with noise, reproducing the first input without noise via the trained neural network;
wherein the training comprises unlearning network-created information including false positives by copying firing events generated by a neural module positioned along a learning, top-down pathway of the neural network, and providing the copied firing events as input to another neural module positioned along an unlearning, bottom-up pathway of the neural network; and
wherein each neural module comprises an electronic circuit including multiple neurons.
- View Dependent Claims (6, 7, 8)
9. A computer program product comprising a computer-readable hardware storage device having program code embodied therewith, the program code being executable by a computer to implement a method comprising:
- at a neural network comprising a plurality of neural modules interconnected via a plurality of weighted synaptic connections;
receiving a first input;
extracting one or more input features from the first input as the first input propagates through the neural network via at least one of the weighted synaptic connections;
training the neural network to associate the one or more input features with the first input by applying a learning rule to at least one of the weighted synaptic connections to strengthen conductance of the at least one weighted synaptic connection; and
after the training, in response to receiving a second input comprising the first input with noise, reproducing the first input without noise via the trained neural network;
wherein the training comprises unlearning network-created information including false positives by copying firing events generated by a neural module positioned along a learning, top-down pathway of the neural network, and providing the copied firing events as input to another neural module positioned along an unlearning, bottom-up pathway of the neural network; and
wherein each neural module comprises an electronic circuit including multiple neurons.
- View Dependent Claims (10, 11, 12)
Specification