Discriminative pretraining of deep neural networks

  • US 9,235,799 B2
  • Filed: 11/26/2011
  • Issued: 01/12/2016
  • Est. Priority Date: 11/26/2011
  • Status: Active Grant
First Claim
1. A system for training a context-dependent deep neural network (CD-DNN), comprising:

  • a computing device;

    a computer program comprising program modules executable by the computing device, comprising:

      a hidden layer generator program module wherein the computing device is directed by the hidden layer generator program module to,

        initially generate a single hidden layer neural network comprising an input layer into which training data is input, an output layer from which an output is generated, and a first hidden layer which is interconnected with the input and output layers with randomly initialized weights,

        whenever a pretrained version of the single hidden layer neural network is produced, discard the current output layer and add a new hidden layer which is interconnected with the first hidden layer and a new output layer with randomly initialized weights to produce a multiple hidden layer deep neural network, and

        whenever a pretrained version of a last produced multiple hidden layer deep neural network is produced and is designated as lacking a prescribed number of hidden layers, discard the current output layer and add a new hidden layer which is interconnected with the last previously added hidden layer and a new output layer with randomly initialized weights to produce a new multiple hidden layer deep neural network,

      a pretraining program module wherein the computing device is directed by the pretraining program module to,

        access a set of training data entries, each data entry of which has a corresponding label assigned thereto,

        access the single hidden layer neural network once it is generated,

        input each data entry of said set one by one into the input layer of the single hidden layer neural network until all the data entries have been input at least once to produce the pretrained version of the single hidden layer neural network, such that after the inputting of each data entry, said weights associated with the first hidden layer are set via an error backpropagation procedure to produce an output from the output layer that matches the label associated with the training data entry;

        access each multiple hidden layer deep neural network at the time it is produced,

        for each multiple hidden layer deep neural network accessed, input each data entry of said set of training data entries one by one into the input layer until all the data entries have been input at least once to produce a pretrained version of the accessed multiple hidden layer deep neural network, such that after the inputting of each data entry, said weights associated with the last added hidden layer and each previously trained hidden layer are set via the error backpropagation (BP) procedure to produce an output from the output layer that matches the label associated with the training data entry, and

      a DNN module wherein the computing device is directed by the DNN module to,

        each time a pretrained version of a multiple hidden layer DNN is produced, determine whether it includes said prescribed number of hidden layers, and

        whenever it is determined the last produced pretrained multiple hidden layer deep neural network does not include the prescribed number of hidden layers, designate it as lacking the prescribed number of hidden layers, and

        whenever it is determined the last produced pretrained multiple hidden layer deep neural network does include the prescribed number of hidden layers, designate it to be a pretrained DNN.
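The claimed loop can be paraphrased as: train a one-hidden-layer network discriminatively with backpropagation, then repeatedly discard the output layer, stack a new randomly initialized hidden layer plus a new output layer, and retrain, until the prescribed depth is reached. The following is a minimal illustrative sketch in NumPy, not the patented implementation; the tanh/softmax choices, layer sizes, learning rate, and helper names (`train_epochs`, `discriminative_pretrain`) are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train_epochs(weights, X, Y, lr=0.05, epochs=3):
    """One-by-one backpropagation over every (data entry, label) pair,
    as in the claim: weights are updated after each entry is input."""
    for _ in range(epochs):
        for x, y in zip(X, Y):
            x, y = x[None, :], y[None, :]
            # forward pass: tanh hidden layers, softmax output
            acts = [x]
            for W in weights[:-1]:
                acts.append(np.tanh(acts[-1] @ W))
            out = softmax(acts[-1] @ weights[-1])
            # backward pass (softmax + cross-entropy gradient)
            delta = out - y
            grads = []
            for i in range(len(weights) - 1, -1, -1):
                grads.insert(0, acts[i].T @ delta)
                if i > 0:
                    delta = (delta @ weights[i].T) * (1 - acts[i] ** 2)
            for W, g in zip(weights, grads):
                W -= lr * g
    return weights

def discriminative_pretrain(X, Y, hidden_size, num_hidden):
    d_in, d_out = X.shape[1], Y.shape[1]
    # initially generate a single hidden layer network with random weights
    weights = [rng.normal(0, 0.1, (d_in, hidden_size)),
               rng.normal(0, 0.1, (hidden_size, d_out))]
    weights = train_epochs(weights, X, Y)
    while len(weights) - 1 < num_hidden:
        # discard the current output layer; add a new hidden layer and a
        # new randomly initialized output layer, then pretrain again
        weights = weights[:-1] + [
            rng.normal(0, 0.1, (hidden_size, hidden_size)),
            rng.normal(0, 0.1, (hidden_size, d_out)),
        ]
        weights = train_epochs(weights, X, Y)
    return weights  # pretrained DNN with the prescribed number of hidden layers
```

For example, `discriminative_pretrain(X, Y, hidden_size=8, num_hidden=3)` returns four weight matrices (three hidden layers plus the output layer); note that each growth step here retrains all layers jointly, matching the claim's requirement that the last added hidden layer and each previously trained hidden layer are updated via BP.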

  • 2 Assignments