SYSTEMS AND METHODS FOR TRAINING NEURAL NETWORKS BASED ON CONCURRENT USE OF CURRENT AND RECORDED DATA
First Claim
1. A computer program product embodied in a computer-readable medium, the computer program product comprising an algorithm adapted to effectuate a method comprising:
providing a neural network comprising a plurality of estimated weights for estimating a linearly parameterized uncertainty;
receiving past data in the neural network;
recording one or more of the past data for future use;
receiving current data in the neural network; and
updating the estimated weights of the neural network, with a computer processor, based on concurrent processing of the current data and the recorded past data, wherein convergence of the estimated weights to ideal weights is guaranteed when the recorded past data contains as many linearly independent elements as a dimension of a basis for the uncertainty.
Abstract
Various embodiments of the invention are neural network adaptive control systems and methods configured to concurrently consider both recorded and current data, so that persistent excitation is not required. A neural network adaptive control system of the present invention can specifically select and record data that has as many linearly independent elements as the dimension of the basis of the uncertainty. Using this recorded data along with current data, the neural network adaptive control system can guarantee global exponential parameter convergence in adaptive parameter estimation problems. Other embodiments of the neural network adaptive control system are also disclosed.
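The abstract's concurrent-learning idea can be illustrated with a minimal parameter-estimation sketch. Everything below is an illustrative assumption, not the patent's embodiment: the basis choice, the gain, and all names are hypothetical. Gradient training on the current data point is combined with gradient training on recorded past data whose regressors span the basis of the uncertainty, so the estimated weights converge even though the current input is not persistently exciting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed linearly parameterized uncertainty: y = W.T @ phi(x),
# with an illustrative basis phi(x) = [1, x, x^2].
W_true = np.array([0.5, -1.2, 0.8])

def phi(x):
    return np.array([1.0, x, x * x])

# Recorded past data, selected so the stacked regressors contain as
# many linearly independent elements as the basis dimension (rank 3).
Phi_rec = np.array([phi(x) for x in (-1.0, 0.0, 1.0)])
Y_rec = Phi_rec @ W_true
assert np.linalg.matrix_rank(Phi_rec) == len(W_true)

W_hat = np.zeros(3)
gamma = 0.05  # illustrative learning rate
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0)       # current data: random, not persistently exciting
    p = phi(x)
    e_now = p @ W_hat - p @ W_true   # estimation error on the current point
    e_rec = Phi_rec @ W_hat - Y_rec  # estimation errors on the recorded data
    # Concurrent update: current-data term plus recorded-data terms.
    W_hat -= gamma * (e_now * p + Phi_rec.T @ e_rec)

print(np.allclose(W_hat, W_true, atol=1e-3))  # True
```

Because the recorded regressor stack has full rank, the recorded-data term alone makes the parameter-error dynamics contractive, which is the mechanism behind the claimed exponential convergence.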
20 Claims
1. A computer program product embodied in a computer-readable medium, the computer program product comprising an algorithm adapted to effectuate a method comprising:

providing a neural network comprising a plurality of estimated weights for estimating a linearly parameterized uncertainty;
receiving past data in the neural network;
recording one or more of the past data for future use;
receiving current data in the neural network; and
updating the estimated weights of the neural network, with a computer processor, based on concurrent processing of the current data and the recorded past data, wherein convergence of the estimated weights to ideal weights is guaranteed when the recorded past data contains as many linearly independent elements as a dimension of a basis for the uncertainty.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11.
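The wherein clause of claim 1 states a sufficient condition on the recorded data that can be checked numerically. The helper below is a hypothetical illustration (the function name and the use of a numerical rank test are assumptions): it tests whether the recorded regressors contain as many linearly independent elements as the dimension of the basis for the uncertainty.

```python
import numpy as np

def convergence_condition_met(recorded_regressors):
    """Check the claimed sufficient condition: the recorded past data
    must contain as many linearly independent elements as the
    dimension of the basis for the uncertainty (illustrative helper)."""
    Phi = np.atleast_2d(np.asarray(recorded_regressors, dtype=float))
    basis_dim = Phi.shape[1]
    return np.linalg.matrix_rank(Phi) == basis_dim

# Two recorded regressors cannot span a 3-dimensional basis:
print(convergence_condition_met([[1., 0., 0.], [0., 1., 0.]]))                  # False
# Three linearly independent regressors do:
print(convergence_condition_met([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]]))    # True
```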
12. A computer program product embodied in a computer-readable medium, the computer program product comprising an algorithm adapted to effectuate a system comprising:

a neural network comprising a plurality of weights applicable to received data for estimating a parameterized uncertainty;
a storage device for storing recorded data of the neural network, the recorded data being selected to contain as many linearly independent elements as a dimension of a basis for the uncertainty; and
a processor in communication with the storage device, for receiving current data and for updating the weights of the neural network by concurrently processing the recorded data along with the current data by orthogonally projecting a weight training algorithm of the recorded data onto the null space of a weight training vector of the current data.

Dependent claims: 13, 14, 15.
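The projection recited in claim 12 can be sketched as follows. This is a hedged reading, not the patent's algorithm: the names, the gain, and the gradient-style training directions are assumptions. The recorded-data training direction is orthogonally projected onto the null space of the current weight-training vector before being applied, so adaptation on the recorded data cannot interfere with training on the current data point.

```python
import numpy as np

def concurrent_update(W_hat, phi_now, e_now, Phi_rec, e_rec, gamma=0.1):
    """One concurrent step (illustrative): project the recorded-data
    training direction onto the null space of the current training
    vector phi_now, then apply both directions together."""
    phi_now = np.asarray(phi_now, dtype=float)
    step_now = e_now * phi_now    # training direction from the current data
    step_rec = Phi_rec.T @ e_rec  # training direction from the recorded data
    # Orthogonal projector onto the null space of phi_now.
    P = np.eye(len(phi_now)) - np.outer(phi_now, phi_now) / (phi_now @ phi_now)
    return W_hat - gamma * (step_now + P @ step_rec)

# The projected recorded-data direction is orthogonal to the current
# regressor, so it leaves training on the current data point unchanged:
phi = np.array([1.0, 0.5, -0.5])
P = np.eye(3) - np.outer(phi, phi) / (phi @ phi)
print(abs((P @ np.array([0.3, -0.7, 0.2])) @ phi) < 1e-12)  # True
```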
16. A computer-implemented method comprising:

receiving data in a neural network, the neural network comprising a plurality of weights for estimating a linearly parameterized uncertainty;
recording one or more of the received data, wherein the recorded data contains as many linearly independent elements as a dimension of a basis for the uncertainty;
receiving current data in the neural network; and
updating the weights of the neural network, with a computer processor, based on an orthogonal projection of a weight training algorithm devised from the recorded data onto the null space of a weight training vector of the current data, wherein convergence of the weights of the neural network is guaranteed.

Dependent claims: 17, 18, 19, 20.
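The recording step in claim 16 implies a selection rule for which data to keep. One hypothetical implementation, consistent with the abstract's statement that the system "can specifically select and record data," records a point only when it raises the rank of the history stack; the function name and the rule itself are assumptions for illustration.

```python
import numpy as np

def maybe_record(history, phi_new):
    """Record phi_new only if it adds a linearly independent element
    to the history stack (assumed selection rule; the claim only
    requires the stack to span the basis of the uncertainty)."""
    phi_new = np.asarray(phi_new, dtype=float)
    if not history:
        return [phi_new]
    old_rank = np.linalg.matrix_rank(np.array(history))
    if np.linalg.matrix_rank(np.array(history + [phi_new])) > old_rank:
        return history + [phi_new]
    return history

hist = []
for p in ([1., 0., 0.], [2., 0., 0.], [0., 1., 0.], [1., 1., 0.], [0., 0., 3.]):
    hist = maybe_record(hist, p)
print(len(hist))  # 3: the recorded points span the 3-dimensional basis
```

Linearly dependent points ([2, 0, 0] and [1, 1, 0] above) are skipped, so the stored stack stays as small as the basis dimension while still satisfying the rank condition.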
Specification