
Layered architecture for hybrid controller

  • US 9,529,724 B2
  • Filed: 07/06/2012
  • Issued: 12/27/2016
  • Est. Priority Date: 07/06/2012
  • Status: Active Grant
First Claim

1. A controller comprising at least one hardware processor for a hybrid memory comprising a main memory and a cache for the main memory, the controller comprising a hierarchy of abstraction layers, each abstraction layer configured to provide at least one component of a cache management structure, each pair of abstraction layers comprising processors communicating through an application programming interface (API), the controller configured to receive incoming memory access requests from a host processor and to manage outgoing memory access requests routed to the cache using the plurality of abstraction layers, wherein at least one of the abstraction layers is configured to:

  • receive the incoming memory access requests from the host processor, the incoming memory access requests including a range of host logical block addresses (LBAs);

    route the incoming memory access requests to a set of incoming queues by implementing a priority scheme, the set of incoming queues comprising an incoming execute queue, the priority scheme comprising:

    routing invalidate requests in the invalidate ready queue to the execute queue as a highest priority;

    routing read requests in the read ready queue to the execute queue as a second highest priority; and

    routing promotion requests in the promotion ready queue as a third highest priority;

    map the range of host LBAs into clusters of cache LBAs;

    transform each incoming memory access request into one or more outgoing memory access requests, each outgoing memory access request including a range or cluster of cache LBAs;

    route the outgoing memory access requests from the set of incoming queues into a set of outgoing queues, the outgoing queues comprising:

    a set of outgoing execute queues, wherein each entry in the incoming execute queue is associated with a plurality of entries in an outgoing execute queue, and

    an outgoing free queue containing a number of outgoing nodes, wherein an outgoing node is removed from the outgoing free queue when an outgoing memory access request is queued in one of the outgoing execute queues and the outgoing node is returned to the outgoing free queue; and

    access the cache using the outgoing memory access requests.
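
Three short sketches below illustrate, in order, the priority scheme, the host-to-cache cluster mapping, and the outgoing free queue recited in the claim. None of the code comes from the patent; each is a reading of the claim language under stated assumptions.

The strict three-level priority (invalidate requests first, then reads, then promotions) can be pictured as a scheduler that drains three ready queues into the incoming execute queue. In this C sketch, the fixed-depth ring buffers, the struct request fields, and the schedule_next helper are all illustrative assumptions.

/*
 * Illustrative sketch only (not from the patent): three "ready" queues
 * drained into an incoming execute queue in strict priority order,
 * invalidate > read > promotion. Queue depth and request fields are
 * assumptions made for the example.
 */
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>

#define QUEUE_DEPTH 16

struct request {
    unsigned long host_lba;   /* start of the host LBA range */
    unsigned long length;     /* number of LBAs in the range */
};

struct fifo {
    struct request slots[QUEUE_DEPTH];
    size_t head, tail, count;
};

static bool fifo_push(struct fifo *q, struct request r)
{
    if (q->count == QUEUE_DEPTH)
        return false;
    q->slots[q->tail] = r;
    q->tail = (q->tail + 1) % QUEUE_DEPTH;
    q->count++;
    return true;
}

static bool fifo_pop(struct fifo *q, struct request *out)
{
    if (q->count == 0)
        return false;
    *out = q->slots[q->head];
    q->head = (q->head + 1) % QUEUE_DEPTH;
    q->count--;
    return true;
}

/* Move the highest-priority pending request into the execute queue. */
static bool schedule_next(struct fifo *invalidate_ready,
                          struct fifo *read_ready,
                          struct fifo *promotion_ready,
                          struct fifo *execute)
{
    struct request r;

    if (fifo_pop(invalidate_ready, &r) ||   /* highest priority */
        fifo_pop(read_ready, &r) ||         /* second highest */
        fifo_pop(promotion_ready, &r))      /* third highest */
        return fifo_push(execute, &r);

    return false;   /* all ready queues empty */
}

int main(void)
{
    struct fifo inv = {0}, rd = {0}, promo = {0}, exec_q = {0};

    fifo_push(&promo, (struct request){ .host_lba = 4096, .length = 8 });
    fifo_push(&rd,    (struct request){ .host_lba = 1024, .length = 8 });
    fifo_push(&inv,   (struct request){ .host_lba = 0,    .length = 8 });

    /* Drains in priority order: invalidate, read, then promotion. */
    struct request r;
    while (schedule_next(&inv, &rd, &promo, &exec_q) && fifo_pop(&exec_q, &r))
        printf("executing request at host LBA %lu\n", r.host_lba);

    return 0;
}

Because the pop calls short-circuit, a lower-priority ready queue is only consulted when every higher-priority queue is empty, which is one straightforward way to realize the claimed ordering.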
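
The mapping and transformation steps (a host LBA range mapped into clusters of cache LBAs, then one or more outgoing requests emitted) can be sketched the same way. The 64-LBA cluster size and the cache_cluster_base lookup below are hypothetical; the claim does not say how the cluster mapping is built or maintained.

/*
 * Illustrative sketch only: split one incoming host-LBA range on cluster
 * boundaries and emit one outgoing request per cache cluster touched.
 * The 64-LBA cluster size and the identity-style cache_cluster_base()
 * lookup are hypothetical placeholders.
 */
#include <stdio.h>

#define CLUSTER_LBAS 64UL   /* assumed number of LBAs per cache cluster */

/* Hypothetical lookup: host cluster index -> first cache LBA of the
 * cluster it maps to. A real controller would consult a cache-management
 * structure maintained by a lower abstraction layer. */
static unsigned long cache_cluster_base(unsigned long host_cluster)
{
    return host_cluster * CLUSTER_LBAS;
}

/* Transform one incoming request into one or more outgoing requests. */
static void transform_request(unsigned long host_lba, unsigned long length)
{
    unsigned long end = host_lba + length;

    while (host_lba < end) {
        unsigned long cluster = host_lba / CLUSTER_LBAS;
        unsigned long offset  = host_lba % CLUSTER_LBAS;
        unsigned long chunk   = CLUSTER_LBAS - offset;

        if (chunk > end - host_lba)
            chunk = end - host_lba;

        printf("outgoing: cache LBA %lu, length %lu\n",
               cache_cluster_base(cluster) + offset, chunk);

        host_lba += chunk;
    }
}

int main(void)
{
    /* A host range straddling two clusters yields two outgoing requests. */
    transform_request(60, 10);
    return 0;
}

A host range that straddles a cluster boundary therefore yields more than one outgoing request, matching the claim's "one or more outgoing memory access requests" per incoming request.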
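
Finally, the outgoing free queue reads as a fixed pool of request nodes: a node leaves the free queue when an outgoing request is queued on an outgoing execute queue and is returned to it afterwards. The sketch below assumes a singly linked free list over a static array; the pool size, the structure layout, and the point at which the node is returned are illustrative, since the claim does not spell them out.

/*
 * Illustrative sketch only: a fixed pool of outgoing nodes kept on a
 * free list. A node is removed from the free queue when an outgoing
 * request is queued on an outgoing execute queue and is returned once
 * the request has been serviced. Pool size and layout are assumptions.
 */
#include <stddef.h>
#include <stdio.h>

#define OUTGOING_NODES 8

struct outgoing_node {
    unsigned long cache_lba;      /* start of the cache LBA range/cluster */
    unsigned long length;
    struct outgoing_node *next;   /* free-list / execute-queue link */
};

static struct outgoing_node pool[OUTGOING_NODES];
static struct outgoing_node *free_queue;      /* outgoing free queue */
static struct outgoing_node *execute_queue;   /* one outgoing execute queue */

static void init_free_queue(void)
{
    for (size_t i = 0; i < OUTGOING_NODES; i++) {
        pool[i].next = free_queue;
        free_queue = &pool[i];
    }
}

/* Take a node from the free queue and place it on the execute queue. */
static struct outgoing_node *queue_outgoing(unsigned long cache_lba,
                                            unsigned long length)
{
    struct outgoing_node *n = free_queue;

    if (!n)
        return NULL;              /* pool exhausted: caller must back off */
    free_queue = n->next;

    n->cache_lba = cache_lba;
    n->length = length;
    n->next = execute_queue;      /* simple LIFO for the sketch */
    execute_queue = n;
    return n;
}

/* Return a serviced node to the free queue for reuse. */
static void complete_outgoing(struct outgoing_node *n)
{
    n->next = free_queue;
    free_queue = n;
}

int main(void)
{
    init_free_queue();

    struct outgoing_node *n = queue_outgoing(128, 64);
    if (n) {
        printf("queued outgoing request at cache LBA %lu\n", n->cache_lba);
        execute_queue = n->next;  /* dequeue for servicing */
        complete_outgoing(n);     /* node goes back to the free queue */
    }
    return 0;
}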
