Adaptive neuron model--an architecture for the rapid learning of nonlinear topological transformations
Abstract
A method and an apparatus for the rapid learning of nonlinear mappings and topological transformations using a dynamically reconfigurable artificial neural network are presented. This fully-recurrent Adaptive Neuron Model (ANM) network has been applied to the highly degenerate inverse kinematics problem in robotics, and its performance evaluation is benchmarked. Once trained, the resulting neuromorphic architecture was implemented in custom analog neural network hardware, and the parameters capturing the functional transformation were downloaded onto the system. This neuroprocessor, capable of 10⁹ ops/sec, was interfaced directly to a three-degree-of-freedom Heathkit robotic manipulator. Calculation of the hardware feed-forward pass for this mapping was benchmarked at ≈10 μsec.
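The fast hardware operation described above is the feed-forward pass: weighted sums followed by a neuron transfer function whose shape depends on a trainable, per-neuron temperature. The sketch below reduces the patent's fully-recurrent network to a single feed-forward layer purely for illustration; the sigmoid form, the 3-in/3-out sizing, and all names here are assumptions, not the patent's exact formulation.

```python
import numpy as np

def activation(net, T):
    """Sigmoid transfer function scaled by a per-neuron temperature T.

    Small T sharpens the response toward a step; large T flattens it.
    Treating T as a trainable parameter alongside the synapse weights
    is the core idea of the adaptive neuron model (exact form assumed).
    """
    return 1.0 / (1.0 + np.exp(-net / T))

def forward_pass(x, W, T):
    """One feed-forward evaluation: weighted sums, then squashing.

    This is the cheap operation the analog hardware performs once the
    trained weights and temperatures are downloaded onto it.
    """
    return activation(W @ x, T)

# Illustrative 3-in/3-out mapping, e.g. Cartesian target -> joint angles
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3))   # synapse weights (as if downloaded after training)
T = np.full(3, 1.0)           # per-neuron temperatures
y = forward_pass(np.array([0.5, -0.2, 0.1]), W, T)
print(y)
```

Because the pass is a single matrix-vector product and a pointwise nonlinearity, it maps naturally onto parallel analog circuitry, which is what makes the ≈10 μsec hardware benchmark plausible.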
18 Claims
1. A method of training an analog neural network comprising plural neurons and synapses wherein said neurons are connected together by respective synapses, said neurons comprising respective activity states and adjustable neuron temperatures, said synapses comprising adjustable synapse weights, said method comprising the steps of:

defining, relative to an error between activity states of a set of output neurons and a predetermined training vector, predetermined time-dependent behaviors of:

(a) said activity states, (b) said neuron temperatures in accordance with a gradient descent of said error in temperature space and (c) said synapse weights in accordance with a gradient descent of said error in weight space, said behaviors governed by (a) an activity state relaxation time, (b) a neuron temperature relaxation time and (c) a synapse weight relaxation time, respectively;

continuously updating said neuron activity states, said neuron temperatures and said synapse weights of said analog neural network at respective rates corresponding to said relaxation times until said error is reduced below a predetermined threshold. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
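The claimed training method amounts to coupled relaxation processes: the error gradient drives the temperatures and the weights continuously, each at a rate set by its own relaxation time, until the error falls below a threshold. The following is a minimal numerical sketch of that scheme, assuming a single sigmoid neuron in place of the recurrent network and omitting the activity-state relaxation; the constants, the squared-error form, and the Euler integration are all illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, target = 1.0, 0.8          # input and training value
w, T = 0.0, 1.0               # synapse weight and neuron temperature
tau_w, tau_T = 0.5, 5.0       # relaxation times (weights relax faster here)
dt = 0.1                      # Euler integration step
threshold = 1e-4              # claim's "predetermined threshold" on the error

for _ in range(100_000):
    y = sigmoid(w * x / T)                # output activity state
    err = 0.5 * (y - target) ** 2         # error vs. the training value
    if err < threshold:
        break                             # training stops below threshold
    g = (y - target) * y * (1.0 - y)      # dE/d(net input) chain factor
    dE_dw = g * x / T                     # gradient in weight space
    dE_dT = -g * w * x / T ** 2           # gradient in temperature space
    w += -dt / tau_w * dE_dw              # relax w at rate 1/tau_w
    T += -dt / tau_T * dE_dT              # relax T at rate 1/tau_T

print(f"final error {err:.2e}, w={w:.3f}, T={T:.3f}")
```

The separate relaxation times let the three quantities evolve on different timescales, which is how the continuous analog hardware can update activities, temperatures, and weights simultaneously rather than in discrete batch steps.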
10. Apparatus for training an analog neural network comprising plural neurons and synapses wherein said neurons are connected together by respective synapses, said neurons comprising respective activity states and adjustable neuron temperatures, said synapses comprising adjustable synapse weights, said apparatus comprising:

means for defining and storing, relative to an error between activity states of a set of output neurons and a predetermined training vector, predetermined time-dependent behaviors of:

(a) said activity states, (b) said neuron temperatures in accordance with a gradient descent of said error in temperature space and (c) said synapse weights in accordance with a gradient descent of said error in weight space, said behaviors governed by (a) an activity state relaxation time, (b) a neuron temperature relaxation time and (c) a synapse weight relaxation time, respectively;

means for continuously updating said neuron activity states, said neuron temperatures and said synapse weights of said analog neural network at respective rates corresponding to said relaxation times until said error is reduced below a predetermined threshold. - View Dependent Claims (11, 12, 13, 14, 15, 16, 17, 18)
Specification