Distributed caching mechanism for pending memory operations within a memory controller
Abstract
One embodiment of the present invention provides a memory controller that contains a distributed cache that stores cache lines for pending memory operations. This memory controller includes an input that receives memory operations directed to an address in memory. It also includes a central scheduling unit and multiple agents that operate under control of the central scheduling unit. Upon receiving a current address, a given agent compares the current address with the address of the cache line stored within that agent. All of the agents compare the current address with the addresses of their respective cache lines in parallel. If the addresses match, the agent reports the result to the rest of the agents in the memory controller and accesses data within the matching cache line to accomplish the memory operation.
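The architecture described in the abstract can be sketched in software. The following is an illustrative simulation, not code from the patent: each agent holds one cache line, every agent compares an incoming address against its line (the hardware compares in parallel; a list comprehension stands in for that here), matches are reported, and the matching agent services the operation. All class and method names are mine.

```python
class Agent:
    """One agent in the distributed cache; holds a single cache line."""

    def __init__(self, agent_id):
        self.agent_id = agent_id
        self.line_addr = None   # address of the cache line held, if any
        self.line_data = None   # data within that cache line

    def compare(self, current_addr):
        # Comparison mechanism: match the incoming address against
        # the address of the cache line stored within this agent.
        return self.line_addr is not None and self.line_addr == current_addr


class MemoryController:
    """Central scheduling unit controlling a plurality of agents."""

    def __init__(self, num_agents):
        self.agents = [Agent(i) for i in range(num_agents)]

    def handle(self, current_addr, write_data=None):
        # In hardware all agents compare the current address in
        # parallel; this sequential pass simulates that compare.
        results = [a.compare(current_addr) for a in self.agents]
        # Reporting mechanism: every agent's result is made visible.
        for agent, hit in zip(self.agents, results):
            if hit:
                # Access mechanism: use the line held by the matching
                # agent to accomplish the memory operation.
                if write_data is not None:
                    agent.line_data = write_data
                return agent.line_data
        # Miss: assign the line to a free agent. The allocation policy
        # here is an assumption for illustration, not claimed text.
        victim = next(a for a in self.agents if a.line_addr is None)
        victim.line_addr = current_addr
        victim.line_data = write_data
        return victim.line_data
```

A write to a new address allocates a line in some agent; a later operation on the same address hits that agent's line without going back to memory, which is the point of caching pending operations in the controller.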
21 Claims
1. A memory controller containing a distributed cache that stores cache lines for pending memory operations, comprising:
an input mechanism that is configured to receive a memory operation that is directed to a current address in memory;
a central scheduling unit;
a plurality of agents under control of the central scheduling unit, wherein a given agent in the plurality of agents is configured to receive the current address;
a comparison mechanism within the given agent that is configured to compare the current address with an address of a cache line stored within the given agent;
a reporting mechanism within the given agent that is configured to report to the plurality of agents a result provided by the comparison mechanism; and
an access mechanism that is configured to access data within the cache line stored within the given agent in order to accomplish the memory operation when the comparison mechanism indicates a match;
wherein the plurality of agents compare the current address with their respective cache line addresses in parallel; and
wherein the given agent holds the cache line while memory operations are pending for the cache line.
Dependent claims: 2, 3, 4, 5, 6, 7.
8. A method that facilitates distributed caching within a memory controller, the method comprising:
receiving at a given agent in a plurality of agents a memory operation that is directed to a current address in memory;
comparing the current address with an address of a cache line stored within the given agent;
reporting to the plurality of agents a result of comparing the address; and
if the comparison indicates a match, accessing data within the cache line stored within the given agent in order to accomplish the memory operation;
wherein the plurality of agents compare the current address with the address of their respective cache lines in parallel; and
wherein the given agent holds the cache line while memory operations are pending for the cache line.
Dependent claims: 9, 10, 11, 12, 13, 14.
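The method steps of claim 8 can also be sketched with genuinely concurrent comparisons. This is a hedged illustration under my own naming, using a thread pool to model the agents comparing the broadcast address in parallel, reporting their results, and the matching agent's data being accessed:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_lookup(agents, current_addr):
    """agents: list of (line_addr, line_data) pairs, one per agent."""
    with ThreadPoolExecutor(max_workers=len(agents)) as pool:
        # Steps 1-2: each agent receives the current address and
        # compares it with the address of its own cache line; the
        # pool runs the comparisons concurrently.
        results = list(pool.map(lambda a: a[0] == current_addr, agents))
    # Step 3: the results are reported back for all agents to see.
    for (line_addr, line_data), hit in zip(agents, results):
        if hit:
            # Step 4: on a match, access data within the cache line
            # stored in the matching agent.
            return line_data
    return None  # miss: the central scheduler would allocate a line
```

The thread pool is only a stand-in; in the claimed hardware the per-agent comparators operate in the same cycle, so lookup latency does not grow with the number of agents.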
15. An integrated circuit that includes a distributed cache, comprising:
an input mechanism that is configured to receive a memory operation that is directed to a current address in memory;
a central scheduling unit;
a plurality of agents under control of the central scheduling unit, wherein a given agent in the plurality of agents is configured to receive the current address;
a comparison mechanism within the given agent that is configured to compare the current address with an address of a cache line stored within the given agent;
a reporting mechanism within the given agent that is configured to report to the plurality of agents a result provided by the comparison mechanism; and
an access mechanism that is configured to access the cache line stored within the given agent in order to accomplish the memory operation when the comparison mechanism indicates a match;
wherein the plurality of agents compare the current address with addresses of their respective cache lines in parallel; and
wherein the given agent holds data within the cache line while memory operations are pending for the cache line.
Dependent claims: 16, 17, 18, 19, 20, 21.
Specification