Method and apparatus for pushing data into a processor cache
Abstract
An arrangement is provided for using a centralized pushing mechanism to actively push data into a processor cache in a computing system with at least one processor. Each processor may comprise one or more processing units, each of which may be associated with a cache. The centralized pushing mechanism may predict data requests of each processing unit in the computing system based on each processing unit's memory access pattern. Data predicted to be requested by a processing unit may be moved from a memory to the centralized pushing mechanism, which then sends the data to the requesting processing unit. A cache coherency protocol in the computing system may help maintain coherency among all caches in the system when the data is placed into a cache of the requesting processing unit.
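The abstract does not specify how the "memory access pattern" is analyzed. One common approach such prediction logic could take is stride detection; the sketch below is a minimal illustration under assumed names, a fixed cache-line size, and a simple confirmed-stride heuristic, not the patented implementation:

```python
# Hypothetical sketch of the abstract's "request prediction logic":
# a stride detector that watches one processing unit's memory accesses
# and predicts the cache lines it will request next. All names, sizes,
# and the heuristic itself are illustrative assumptions.

CACHE_LINE = 64  # assumed cache-line size in bytes

class RequestPredictor:
    def __init__(self, depth=2):
        self.last_addr = None
        self.last_stride = None
        self.depth = depth  # how many lines ahead to predict

    def observe(self, addr):
        """Record one access; return predicted future cache-line addresses."""
        predictions = []
        if self.last_addr is not None:
            stride = addr - self.last_addr
            if stride == self.last_stride and stride != 0:
                # Stride confirmed twice: predict the next `depth` accesses,
                # aligned down to cache-line boundaries.
                for i in range(1, self.depth + 1):
                    line = ((addr + i * stride) // CACHE_LINE) * CACHE_LINE
                    predictions.append(line)
            self.last_stride = stride
        self.last_addr = addr
        return predictions

p = RequestPredictor()
p.observe(0x1000)
p.observe(0x1040)
print(p.observe(0x1080))  # stride confirmed: predicts the next two lines
```

A real mechanism would likely track many streams per processing unit and use confidence counters; this toy version keeps only the last address and stride.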
33 Claims
1. An apparatus for pushing data from a memory into a cache of a processing unit in a computing system, comprising:
request prediction logic to analyze memory access patterns by the processing unit and to predict data requests of the processing unit based on the memory access patterns; and
push logic to issue a push request per cache line of data predicted to be requested by the processing unit, and to send the cache line associated with the push request to the processing unit if the processing unit accepts the push request, the processing unit placing the cache line in the cache. (Dependent claims: 2, 3, 4, 5)
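The handshake in claim 1 (issue one push request per predicted cache line, transfer the line only if the processing unit accepts, and let the unit place it in its cache) can be illustrated with a toy model. The class names, the capacity-based acceptance policy, and `accept_push` are assumptions for illustration; the claim does not specify an acceptance criterion:

```python
# Illustrative sketch (assumed names and policy) of claim 1's push
# handshake: a per-line push request, conditional transfer on acceptance,
# and placement of the line into the target unit's cache.

class ProcessingUnit:
    def __init__(self, capacity=4):
        self.cache = {}          # line address -> data
        self.capacity = capacity

    def accept_push(self, line_addr):
        # Assumed policy: refuse pushes when the cache is full.
        return len(self.cache) < self.capacity

    def place(self, line_addr, data):
        self.cache[line_addr] = data

class PushLogic:
    def __init__(self, memory):
        self.memory = memory     # dict: line address -> data

    def push(self, unit, predicted_lines):
        delivered = []
        for line in predicted_lines:
            if unit.accept_push(line):               # per-line push request
                unit.place(line, self.memory[line])  # send; unit caches it
                delivered.append(line)
        return delivered

mem = {0x100: b"A", 0x140: b"B"}
cpu = ProcessingUnit()
print(PushLogic(mem).push(cpu, [0x100, 0x140]))  # both lines accepted
```

Because each line is a separate request, the processing unit can decline individual pushes (for example under cache pressure) without rejecting the whole prediction batch.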
6. A computing system, comprising:
at least one processor, each processor including at least one processing unit associated with a cache;
at least one memory to store data accessible by each processing unit in the system; and
a centralized pushing mechanism to facilitate data traffic to and from the at least one memory, to predict data requests of each processing unit in the system, and to actively push data into a cache of a targeted processing unit in the at least one processor based on the predicted data requests of the targeted processing unit. (Dependent claims: 7, 8, 9, 10, 11, 12, 13, 14)
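The system of claim 6 places one pushing mechanism between the memory and several processing units, so it must track each unit's accesses separately and direct each push to the targeted unit's cache. A minimal sketch, with assumed names and a trivial last-stride predictor standing in for the unspecified prediction policy:

```python
# Hypothetical sketch of claim 6's system arrangement: one centralized
# mechanism keeps a per-unit access history and pushes predicted data
# into the targeted unit's cache only. Names and the single-stride
# prediction are illustrative assumptions.

CACHE_LINE = 64

class CentralizedPushMechanism:
    def __init__(self, memory):
        self.memory = memory   # shared memory: line address -> data
        self.history = {}      # unit id -> list of recent addresses

    def record_access(self, unit_id, addr):
        self.history.setdefault(unit_id, []).append(addr)

    def push_predicted(self, unit_id, caches):
        """Predict the targeted unit's next line; push it into that cache."""
        h = self.history.get(unit_id, [])
        if len(h) >= 2 and (h[-1] - h[-2]) != 0:
            stride = h[-1] - h[-2]
            line = ((h[-1] + stride) // CACHE_LINE) * CACHE_LINE
            if line in self.memory:
                caches[unit_id][line] = self.memory[line]
                return line
        return None

mech = CentralizedPushMechanism({0x1080: b"p"})
caches = {"cpu0": {}, "cpu1": {}}
mech.record_access("cpu0", 0x1000)
mech.record_access("cpu0", 0x1040)
print(mech.push_predicted("cpu0", caches))  # line pushed to cpu0 only
```

The point of the centralized placement (per the abstract) is that one component sees all memory traffic, so per-unit histories can be built without logic inside each processor.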
15. A method for using a centralized pushing mechanism to push data into a processor cache, comprising:
analyzing a memory access pattern by a processor;
predicting data requests of the processor based on the processor's memory access pattern;
issuing a push request for data predicted to be requested by the processor; and
pushing the data into a cache of the processor. (Dependent claims: 16, 17, 18, 19, 20, 21)
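The four steps of the method in claim 15 (analyze, predict, issue, push) can be wired together end to end. In this sketch every function name is assumed, a uniform-stride check stands in for "analyzing a memory access pattern", and line presence in memory stands in for an accepted push request:

```python
# Minimal end-to-end sketch of the four method steps of claim 15,
# under assumed names and a deliberately trivial prediction policy.

CACHE_LINE = 64

def analyze(accesses):
    """Step 1: derive a stride from the processor's recent accesses."""
    strides = {b - a for a, b in zip(accesses, accesses[1:])}
    return strides.pop() if len(strides) == 1 else None

def predict(accesses, stride, n=2):
    """Step 2: predict the next n cache-line addresses."""
    if stride is None:
        return []
    last = accesses[-1]
    return [((last + i * stride) // CACHE_LINE) * CACHE_LINE
            for i in range(1, n + 1)]

def issue_and_push(lines, memory, cache):
    """Steps 3-4: issue a push request per line, place accepted data."""
    for line in lines:
        if line in memory:   # stand-in for "push request accepted"
            cache[line] = memory[line]

accesses = [0x1000, 0x1040, 0x1080]
memory = {0x10C0: b"c", 0x1100: b"d"}
cache = {}
issue_and_push(predict(accesses, analyze(accesses)), memory, cache)
print(sorted(cache))  # both predicted lines are now cached
```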
22. A method for using a centralized pushing mechanism to push data into a cache of a processing unit, comprising:
analyzing memory access patterns by each processing unit in a plurality of processors, each processor including at least one processing unit;
predicting data requests of each processing unit based on each processing unit's memory access pattern;
issuing at least one push request for data predicted to be requested by each processing unit; and
pushing data predicted to be requested by a processing unit into a cache of the processing unit. (Dependent claims: 23, 24, 25, 26, 27, 28)
29. An article comprising a machine readable medium that stores data representing a centralized pushing mechanism comprising:
request prediction logic to analyze memory access patterns by at least one processing unit in a computing system and to predict data requests of the at least one processing unit based on the memory access patterns;
a prefetch data buffer to temporarily store data predicted to be requested by the at least one processing unit, the data retrieved from a memory; and
push logic to issue a push request per cache line of data predicted to be requested by the at least one processing unit, and to send the cache line associated with the push request to a targeted processing unit if the targeted processing unit accepts the push request, the targeted processing unit placing the cache line in the cache. (Dependent claims: 30, 31)
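Claim 29 adds a "prefetch data buffer" between the memory and the push logic: lines retrieved from memory wait there until they are pushed. One plausible realization is a small FIFO with eviction; the names, capacity, and eviction policy below are assumptions, not the claimed design:

```python
# Sketch (assumed names and policy) of claim 29's prefetch data buffer:
# a bounded FIFO that temporarily holds cache lines fetched from memory
# until the push logic drains them toward the target processing unit.

from collections import OrderedDict

class PrefetchBuffer:
    def __init__(self, capacity=8):
        self.entries = OrderedDict()  # line address -> data, in fill order
        self.capacity = capacity

    def fill(self, line_addr, data):
        """Store a line retrieved from memory; evict the oldest if full."""
        if line_addr not in self.entries and len(self.entries) >= self.capacity:
            self.entries.popitem(last=False)  # drop oldest entry
        self.entries[line_addr] = data

    def drain(self, line_addr):
        """Hand a buffered line to the push logic (removing it), or None."""
        return self.entries.pop(line_addr, None)

buf = PrefetchBuffer(capacity=2)
buf.fill(0x200, b"X")
buf.fill(0x240, b"Y")
buf.fill(0x280, b"Z")    # capacity reached: 0x200 (oldest) is evicted
print(buf.drain(0x280))  # b'Z'
print(buf.drain(0x200))  # None: already evicted
```

Buffering decouples memory latency from the push handshake: the mechanism can fetch predicted lines early, then push each one whenever the target unit accepts.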
32. An article comprising a machine readable medium having stored thereon data which, when accessed by a processor in conjunction with simulation routines, provides functionality of a centralized pushing mechanism including:
request prediction logic to analyze memory access patterns by at least one processing unit in a computing system and to predict data requests of the at least one processing unit based on the memory access patterns;
a prefetch data buffer to temporarily store data predicted to be requested by the at least one processing unit, the data retrieved from a memory; and
push logic to issue a push request per cache line of data predicted to be requested by the at least one processing unit, and to send the cache line associated with the push request to a targeted processing unit if the targeted processing unit accepts the push request, the targeted processing unit placing the cache line in the cache. (Dependent claim: 33)
Specification