USING CACHE HIT INFORMATION TO MANAGE PREFETCHES
Abstract
Cache hit information is used to manage (e.g., cap) the prefetch distance for a cache. In an embodiment in which there is a first cache and a second cache, where the second cache (e.g., a level two cache) has greater latency than the first cache (e.g., a level one cache), a prefetcher prefetches cache lines to the second cache and is configured to receive feedback from that cache. The feedback indicates whether an access request issued in response to a cache miss in the first cache results in a cache hit in the second cache. The prefetch distance for the second cache is determined according to the feedback.
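The feedback loop described in the abstract can be illustrated with a minimal sketch. This is not the patent's implementation; the class and method names below are illustrative assumptions, modeling only the claimed behavior: feedback that an L1 miss hit in L2 caps the prefetch distance.

```python
class Prefetcher:
    """Illustrative model of feedback-managed prefetch distance.

    The distance grows as the observed miss pattern lengthens, but an
    L1-miss/L2-hit feedback event caps it: a hit in L2 means prefetching
    into L2 is already keeping up, so growing the distance further would
    only waste bandwidth and cache capacity.
    """

    def __init__(self, max_distance=32):
        self.distance = 1          # how far ahead of demand to prefetch
        self.max_distance = max_distance
        self.capped = False

    def on_l2_feedback(self, l1_missed, l2_hit):
        # Feedback from the second cache: the access request issued on an
        # L1 miss resulted in an L2 hit, so cap the prefetch distance.
        if l1_missed and l2_hit:
            self.capped = True

    def on_pattern_growth(self):
        # A lengthening miss pattern normally increases the distance,
        # unless feedback has capped it.
        if not self.capped and self.distance < self.max_distance:
            self.distance += 1
        return self.distance
```

Under these assumptions, `on_pattern_growth()` keeps returning the same distance once `on_l2_feedback(True, True)` has been seen, even as the pattern continues to grow.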
20 Claims

1. A system comprising:

a plurality of caches comprising a first cache and a second cache, the second cache having greater latency than the first cache; and a prefetcher configured to prefetch cache lines to the second cache and further configured to receive feedback from the second cache, the feedback indicating whether an access request issued in response to a cache miss in the first cache results in a cache hit in the second cache, wherein the cache lines prefetched to the second cache are selected according to a prefetch distance, the prefetch distance determined according to the feedback. (Dependent claims: 2–7.)
8. A computer system comprising:

a processing unit; a plurality of caches comprising a first cache and a second cache coupled to the processing unit, the second cache having greater latency than the first cache; and memory coupled to the processing unit and having stored therein instructions that, if executed by the computer system, cause the computer system to execute a prefetcher that performs operations comprising: prefetching cache lines to the second cache, wherein the cache lines are selected according to a prefetch distance; receiving feedback from the second cache indicating whether an access request issued in response to a cache miss in the first cache results in a cache hit in the second cache; and determining the prefetch distance according to the feedback. (Dependent claims: 9–14.)
15. A method implemented by a computer system comprising a processor, a memory, and a plurality of caches coupled to the processor and the memory and comprising a first cache and a second cache, the second cache having greater latency than the first cache, the method comprising:

prefetching cache lines into the second cache, wherein the cache lines are selected for prefetching using a prefetch distance that increases to a first value according to a pattern of cache misses in the second cache; in response to a cache miss in the first cache, sending an access request to the second cache; and if the access request results in a cache hit in the second cache, then capping the prefetch distance even if the pattern continues to increase in length. (Dependent claims: 16–20.)
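The method of claim 15 can be sketched as a trace-driven simulation. The function below is an illustrative assumption, not the patented implementation: each event is a hypothetical `(l1_hit, l2_hit)` pair for a demand access, the distance grows while L2 misses continue, and the first L2 hit on an L1 miss caps it even though the miss pattern keeps lengthening.

```python
def run_prefetch_distance(events, max_distance=32):
    """Trace-driven sketch of the claimed method.

    `events` is a sequence of (l1_hit, l2_hit) pairs for successive
    demand accesses. Returns the prefetch distance after each access.
    """
    distance, capped, history = 1, False, []
    for l1_hit, l2_hit in events:
        if not l1_hit:                 # L1 miss: send access request to L2
            if l2_hit:                 # L2 hit: cap the prefetch distance
                capped = True
            elif not capped and distance < max_distance:
                distance += 1          # miss pattern lengthens: grow distance
        history.append(distance)
    return history

# Distance grows on L2 misses, then holds after the first L2 hit,
# even though L1 misses (the pattern) continue:
# run_prefetch_distance([(False, False), (False, False),
#                        (False, True), (False, False)])
# → [2, 3, 3, 3]
```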
Specification