Using cache hit information to manage prefetches
Abstract
Cache hit information is used to manage (e.g., cap) the prefetch distance for a cache. In an embodiment in which there is a first cache and a second cache, where the second cache (e.g., a level two cache) has greater latency than the first cache (e.g., a level one cache), a prefetcher prefetches cache lines to the second cache and is configured to receive feedback from that cache. The feedback indicates whether an access request issued in response to a cache miss in the first cache results in a cache hit in the second cache. The prefetch distance for the second cache is determined according to the feedback.
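The feedback loop in the abstract can be sketched in a few lines. This is a minimal illustrative model, not the patented implementation; the class and method names are my own. The idea: a run of L2 hits on demand requests means the prefetcher is already far enough ahead, so the distance is capped rather than grown further.

```python
class Prefetcher:
    """Toy model of feedback-managed prefetch distance (names are illustrative)."""

    def __init__(self, max_distance=32):
        self.distance = 1           # current prefetch distance, in cache lines
        self.max_distance = max_distance
        self.capped = False         # set once feedback says L2 already hits

    def on_l2_feedback(self, l2_hit):
        """Feedback from the L2 cache for a demand request caused by an L1 miss."""
        if l2_hit:
            # Demand misses are being covered by L2: freeze the distance.
            self.capped = True
        else:
            # L2 missed too: the prefetcher is not far enough ahead.
            self.capped = False

    def on_pattern_growth(self):
        """Called when the detected access pattern grows in length."""
        if not self.capped and self.distance < self.max_distance:
            self.distance += 1
```

In this sketch the distance only grows while feedback reports L2 misses; a single L2 hit holds it steady until feedback changes again.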
18 Claims
1. A system comprising:
- a plurality of caches comprising a first cache and a second cache, the second cache having greater latency than the first cache; and
- a prefetcher configured to monitor addresses included in access requests to the second cache, detect a pattern to the access requests, and determine a prefetch distance based on the pattern, the prefetcher further configured to prefetch cache lines to the second cache that are selected according to the prefetch distance and receive feedback from the second cache, the feedback indicating whether an access request issued in response to a cache miss in the first cache results in a cache hit in the second cache, the prefetch distance also determined according to the feedback.
View Dependent Claims (2, 3, 4, 5, 6)
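Claim 1's monitoring and pattern detection can be illustrated with a simple stride detector: it watches the addresses of requests reaching the L2 cache, confirms a constant stride, and emits prefetch addresses selected according to the current prefetch distance. Everything here beyond the claim text (line size, confirmation threshold, names) is an assumption for illustration.

```python
LINE_SIZE = 64  # bytes per cache line (assumed)

class StrideDetector:
    """Watches demand-request addresses for a constant-stride pattern."""

    def __init__(self):
        self.last_addr = None
        self.stride = None
        self.confirmations = 0

    def observe(self, addr):
        """Feed one demand-request address; returns the stride once confirmed."""
        if self.last_addr is not None:
            delta = addr - self.last_addr
            if delta == self.stride:
                self.confirmations += 1
            else:
                self.stride = delta
                self.confirmations = 0
        self.last_addr = addr
        # Require the same stride twice in a row before trusting the pattern.
        return self.stride if self.confirmations >= 2 else None

def prefetch_addresses(addr, stride, distance):
    """Cache-line addresses to prefetch, selected by the prefetch distance."""
    return [(addr + k * stride) // LINE_SIZE * LINE_SIZE
            for k in range(1, distance + 1)]
```

For a stream 0, 64, 128, 192 the detector confirms a 64-byte stride, and `prefetch_addresses(192, 64, 3)` selects the next three line addresses ahead of the demand stream.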
7. A computer system comprising:
- a processing unit;
- a plurality of caches comprising a first cache and a second cache coupled to the processing unit, the second cache having greater latency than the first cache; and
- memory coupled to the processing unit and having stored therein instructions that, if executed by the computer system, cause the computer system to execute a prefetcher that performs operations comprising: monitoring addresses included in access requests to the second cache; detecting a pattern to the access requests; determining a prefetch distance based on the pattern and also according to feedback from the second cache, the feedback indicating whether an access request issued in response to a cache miss in the first cache results in a cache hit in the second cache; and prefetching cache lines to the second cache, wherein the cache lines are selected according to the prefetch distance.
View Dependent Claims (8, 9, 10, 11, 12)
13. A method implemented by a computer system comprising a processor, a memory, and a plurality of caches coupled to the processor and the memory and comprising a first cache and a second cache, the second cache having greater latency than the first cache, the method comprising:
- monitoring addresses included in access requests to the second cache;
- detecting a pattern to the access requests;
- determining a first value for the prefetch distance based on the pattern and also according to feedback from the second cache, the feedback indicating whether an access request issued in response to a cache miss in the first cache results in a cache hit in the second cache;
- in response to a cache miss in the first cache, sending an access request to the second cache; and
- if the access request results in a cache hit in the second cache, then capping the prefetch distance even if the pattern continues to increase in length.
View Dependent Claims (14, 15, 16, 17, 18)
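The capping behavior recited in claim 13 can be walked through end to end with a toy simulation. Caches are modeled as plain sets of line addresses, and every detail beyond the claim text (line size, growth rule, distance limit) is an assumption: for a steady sequential pattern, the first L2 hit caps the distance even though the pattern keeps growing in length.

```python
LINE = 64  # bytes per cache line (assumed)

def run(demand_lines, max_distance=16):
    """Simulate the claimed method over a stream of demand line addresses."""
    l1, l2 = set(), set()
    distance = 1
    trace = []
    for i, line in enumerate(demand_lines):
        if line in l1:
            continue                      # L1 hit: no feedback generated
        # L1 miss: send the access request to L2 and collect feedback.
        l2_hit = line in l2
        capped = l2_hit                   # cap the distance on an L2 hit
        if not capped and distance < max_distance:
            distance += 1                 # pattern keeps growing; so does distance
        # Prefetch the next `distance` lines of the pattern into L2.
        for k in range(1, distance + 1):
            l2.add(line + k * LINE)
        l1.add(line)
        trace.append((i, l2_hit, distance))
    return trace
```

Running this on a sequential stream shows the distance grow once on the initial L2 miss and then hold steady: after the first prefetch, every subsequent demand request hits in L2, so the feedback caps further growth.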
Specification