Deep neural network processing on hardware accelerators with stacked memory

  • US 10,540,588 B2
  • Filed: 06/29/2015
  • Issued: 01/21/2020
  • Est. Priority Date: 06/29/2015
  • Status: Active Grant
First Claim

1. A method for processing a deep neural network, the method comprising:

  • configuring an acceleration component to perform forward propagation and backpropagation stages of the deep neural network, the acceleration component comprising an acceleration component die and a memory stack both disposed in a single integrated circuit package, the acceleration component comprising multiple discrete neural engine processing units, each neural engine processing unit comprising both input buffer memory and logic circuitry that implements at least one of:

    dot-products, derivatives or non-linear functions, the configuring comprising:

    storing at least one of:

    weights, input activations or errors in the memory stack;

    assigning, to individual ones of the neural engine processing units, a portion of the weights; and

    streaming portions of the weights, input activations or errors from the memory stack to the input buffer memory of respective ones of the neural engine processing units, the input buffer memory of each of the respective ones of the neural engine processing units individually comprising at least one of:

    a weights input memory into which the portions of the weights are stored;

    an activations input memory into which the portions of the input activations are stored;

    or an error input memory into which the portions of the errors are stored;

    wherein at least one of the weights input memory, the activations input memory or the error input memory of one neural engine processing unit is communicationally coupled to a corresponding one of the weights input memory, the activations input memory or the error input memory of a preceding neural engine processing unit and to a corresponding one of the weights input memory, the activations input memory or the error input memory of a subsequent neural engine processing unit.
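The arrangement recited above — weight portions assigned to discrete neural engine units, streamed from the memory stack into per-unit input buffers that are chained from one unit to the next — can be illustrated with a minimal software sketch. This is not the patented implementation; the class and function names below are hypothetical, and the "memory stack" and "input memories" are modeled simply as arrays:

```python
import numpy as np

class NeuralEngineUnit:
    """Hypothetical model of one neural engine processing unit:
    a weights input memory plus dot-product logic circuitry."""

    def __init__(self, weight_portion):
        self.weights_input_memory = weight_portion   # assigned portion of the weights
        self.activations_input_memory = None         # filled by streaming

    def receive(self, activations):
        # Activations arrive in this unit's input buffer, coupled to
        # the preceding unit's buffer (or the host for the first unit).
        self.activations_input_memory = activations

    def compute(self):
        # Dot-product logic: partial matrix-vector product over this
        # unit's slice of the weights.
        return self.weights_input_memory @ self.activations_input_memory


def forward(memory_stack_weights, input_activations, num_units=4):
    # Assign, to individual units, a portion of the weights
    # (here: a row-slice of the full weight matrix in the "memory stack").
    portions = np.array_split(memory_stack_weights, num_units, axis=0)
    units = [NeuralEngineUnit(p) for p in portions]

    outputs = []
    acts = input_activations
    for unit in units:
        unit.receive(acts)                     # stream into this unit's buffer
        outputs.append(unit.compute())
        acts = unit.activations_input_memory   # chained buffer passes activations on
    return np.concatenate(outputs)


# Usage: the concatenated partial products equal the full layer output.
W = np.arange(12, dtype=float).reshape(4, 3)
x = np.array([1.0, 2.0, 3.0])
y = forward(W, x, num_units=2)
assert np.allclose(y, W @ x)
```

Splitting the weights row-wise means each unit needs only its own weight slice plus the shared activations, which is why chaining the activations input memories between adjacent units suffices for forward propagation.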
