Information handling system with immediate scheduling of load operations in a dual-bank cache with dual dispatch into write/read data flow
First Claim
1. A method comprising:
sending, by a processor element, a plurality of requests for memory operations to a cache memory, the cache memory including first and second cache banks, the memory operations including load operations and store operations, each load and store operation exhibiting a respective size requirement;
arbitrating, by an arbitration mechanism, among the plurality of requests for memory operations to select a particular load operation and a particular store operation for access to the cache memory;
arbitrating, by the arbitration mechanism, in a first arbitration stage among the load operation requests in the plurality of requests for memory operations, to provide the particular load operation for access to the cache memory;
arbitrating, by the arbitration mechanism, in the first arbitration stage among the store requests, to provide the particular store operation for access to the cache memory;
arbitrating, by the arbitration mechanism, in the first arbitration stage among read claim state machine requests, cast out state machine requests and snoop requests to determine a cache arbiter arbitration result;
arbitrating, by the arbitration mechanism, in a second arbitration stage that includes first and second arbiters that operate in parallel to provide particular first and second store instructions to a third arbitration stage;
commencing, by the arbitration mechanism, the particular load operation on the first cache bank during a first cache cycle;
commencing, by the arbitration mechanism, the particular store operation on the second cache bank during the first cache cycle such that both the particular load operation and the particular store operation commence during the same first cache cycle; and
performing, by the first and second cache banks, the particular load operation and the particular store operation simultaneously.
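The claimed method can be sketched as a minimal per-cycle model. This is an illustrative assumption, not the patent's implementation: the class and method names are ours, and the oldest-first selection policy stands in for whatever priority scheme the first arbitration stage actually uses. Each cycle, one load, one store, and one state-machine request (read claim, cast out, or snoop) win arbitration, and the winning load and store commence on different banks in the same cycle.

```python
# Hypothetical model of the claimed arbitration method (names and the
# oldest-first policy are assumptions, not taken from the patent).
from collections import deque

class DualBankCacheModel:
    def __init__(self):
        self.loads = deque()         # pending load requests
        self.stores = deque()        # pending store requests
        self.machine_reqs = deque()  # read-claim / cast-out / snoop requests

    def cycle(self):
        """One cache cycle: first-stage arbitration, then dual dispatch."""
        # First arbitration stage: pick one winner from each request class.
        load = self.loads.popleft() if self.loads else None
        store = self.stores.popleft() if self.stores else None
        arb_result = self.machine_reqs.popleft() if self.machine_reqs else None

        # Dual dispatch: the load commences on the first bank and the store
        # on the second bank during the same cache cycle.
        started = []
        if load is not None:
            started.append(("bank0", "load", load))
        if store is not None:
            started.append(("bank1", "store", store))
        return started, arb_result

model = DualBankCacheModel()
model.loads.append("LD @0x100")
model.stores.append("ST @0x200")
started, _ = model.cycle()
# Both operations commence in the same cycle, one per bank.
```

The second and third arbitration stages of the claim (parallel store arbiters feeding a final stage) are omitted here for brevity; the point of the sketch is only the same-cycle dual dispatch.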
Abstract
An information handling system (IHS) includes a processor with a cache memory system. The processor includes a processor core with an L1 cache memory that couples to an L2 cache memory. The processor includes an arbitration mechanism that controls load and store requests to the L2 cache memory. The arbitration mechanism includes control logic that enables a load request to interrupt a store request that the L2 cache memory is currently servicing. The L2 cache memory includes dual data banks so that one bank may perform a load operation while the other bank performs a store operation. The cache system provides dual dispatch points into the data flow to the dual cache banks of the L2 cache memory.
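The abstract's interrupt behavior can be sketched as follows. This is a hedged illustration under our own assumptions (the function name and beat-based timing model are not from the patent): a store occupying a bank for several beats is preempted when a load request arrives, the load is serviced immediately, and the store then resumes.

```python
# Hedged sketch: control logic that lets a newly arrived load interrupt a
# multi-beat store already being serviced by a cache bank. The beat-level
# timing model is an assumption made for illustration.
def run_bank(store_beats, load_arrives_at):
    """Return the order in which operations occupy the bank when a load
    may preempt an in-flight store."""
    timeline = []
    pending_load = False
    for beat in range(store_beats):
        if beat == load_arrives_at:
            pending_load = True          # load request arrives mid-store
        if pending_load:
            timeline.append("load")      # load interrupts; store resumes after
            pending_load = False
        timeline.append(f"store-beat-{beat}")
    return timeline

# A 4-beat store with a load arriving at beat 2: the load is serviced
# ahead of the remaining store beats instead of waiting for the store
# to drain.
order = run_bank(4, 2)
```

Without the interrupt logic, the load would stall until all four store beats completed; here it slots in after beat 1.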
18 Claims
1. A method comprising: the steps recited in full as the First Claim above. Dependent claims: 2, 3, 4, 5, 6.
7. A cache memory system comprising:
a processor element; and
a cache memory, coupled to the processor element, that receives a plurality of requests for memory operations, the cache memory including first and second cache banks, the memory operations including load operations and store operations, each load and store operation exhibiting a respective size requirement,
the cache memory including an arbitration mechanism that arbitrates among the plurality of requests for memory operations to select a particular load operation and a particular store operation for access to the cache memory,
wherein the arbitration mechanism comprises a first arbitration stage that arbitrates among the load operation requests in the plurality of requests for memory operations, to provide the particular load operation for access to the cache memory, the first arbitration stage arbitrating among store requests to provide a selected store operation for access to the cache memory, the first arbitration stage arbitrating among read claim state machine requests, cast out state machine requests and snoop requests to determine a cache arbiter arbitration result,
wherein the arbitration mechanism further comprises a second arbitration stage, coupled to the first arbitration stage, that includes first and second arbiters that operate in parallel to provide particular first and second store instructions to a third arbitration stage, and
wherein the arbitration mechanism commences the particular load operation on the first cache bank during a first cache cycle and commences the particular store operation on the second cache bank during the first cache cycle such that both the particular load operation and the particular store operation commence during the same first cache cycle, the first and second cache banks performing the particular load operation and the particular store operation simultaneously.

Dependent claims: 8, 9, 10, 11, 12.
13. An information handling system (IHS) comprising:
a processor element;
a cache memory, coupled to the processor element, that receives a plurality of requests for memory operations, the cache memory including first and second cache banks, the operations including load operations and store operations, each load and store operation exhibiting a respective size requirement,
the cache memory including an arbitration mechanism that arbitrates among the plurality of requests for memory operations to select a particular load operation and a particular store operation for access to the cache memory,
wherein the arbitration mechanism comprises a first arbitration stage that arbitrates among the load operation requests in the plurality of requests for memory operations, to provide the particular load operation for access to the cache memory, the first arbitration stage arbitrating among store requests to provide a selected store operation for access to the cache memory, the first arbitration stage arbitrating among read claim state machine requests, cast out state machine requests and snoop requests to determine a cache arbiter arbitration result,
wherein the arbitration mechanism further comprises a second arbitration stage, coupled to the first arbitration stage, that includes first and second arbiters that operate in parallel to provide particular first and second store instructions to a third arbitration stage; and
wherein the arbitration mechanism commences the particular load operation on the first cache bank during a first cache cycle and commences the particular store operation on the second cache bank during the first cache cycle such that both the particular load operation and the particular store operation commence during the same first cache cycle, the first and second cache banks performing the particular load operation and the particular store operation substantially simultaneously; and
a system memory coupled to the cache memory.

Dependent claims: 14, 15, 16, 17, 18.
Specification