Method and system for predicting addresses and prefetching data into a cache memory
First Claim
1. A prefetching apparatus useful in association with a computer system having at least one processor, a memory device, and a cache memory coupled to said at least one processor, said apparatus comprising:
a stream-detector configured to compare a requested memory address associated with data requested by said at least one processor to a predicted memory address derived from a previously requested memory address according to each of at least one memory address pattern; and
a prefetcher configured to prefetch new data from the memory device and to store said data in the cache memory, said new data prefetched from a next memory address computed from said requested memory address and a corresponding one of the at least one memory address pattern upon a condition in which the requested memory address is accurately represented by the predicted memory address of the corresponding memory address pattern, said new data prefetched in anticipation of a request for said new data from said at least one processor;
wherein upon the condition that there is a cache hit for the requested data in the cache memory, N new data is prefetched, where N is at least 1, and upon the condition that there is a cache miss for the requested data in the cache memory, M new data is prefetched, where M is greater than N.
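The hit/miss-dependent prefetch degree in claim 1 can be illustrated with a minimal sketch. The function name and the concrete default values of N and M are assumptions for illustration only; the claim itself requires merely that N ≥ 1 and M > N (fetching more aggressively on a miss, when the stream has run ahead of the cache).

```python
def prefetch_count(cache_hit: bool, n: int = 1, m: int = 4) -> int:
    """Number of new data blocks to prefetch after a request.

    Per the claim: prefetch N blocks on a cache hit (N >= 1) and
    M blocks on a cache miss (M > N).
    """
    assert n >= 1 and m > n
    return n if cache_hit else m
```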
Abstract
A method for increasing data-processing speed in computer systems containing at least one microprocessor (1), a memory device (3), and a cache (2,4) connected to the processor, in which the cache (2,4) fetches data from the addresses in the memory device (3) requested by the processor (1) and also fetches data from one or several addresses in the memory device (3) not requested by the processor (1). The computer system includes a circuit called the stream-detection circuit (5), connected to interact with the cache (2,4) such that the stream-detection circuit (5) detects the addresses which the processor (1) requests in the cache (2,4) and registers whether the requested addresses already existed in the cache (2,4). The stream-detection circuit (5) is arranged to detect one or several sequential series of addresses requested by the processor (1) in the cache (2,4). Upon detection of such a series, the stream-detection circuit (5) commands the cache (2,4) to fetch data from the memory device (3) corresponding to the next address in the series and to insert that address in the cache (2,4).
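As an illustration of the stream-detection idea in the abstract, the following sketch tracks a single series of addresses and, when a request matches the predicted next address, reports the following address so the cache can prefetch it. The class name, the single-stream limit, and the fixed-stride prediction are assumptions for illustration; the patent's stream-detection circuit (5) is hardware and may track several series concurrently.

```python
class StreamDetector:
    """Sketch of a stream detector: predict the next address in a
    fixed-stride series and confirm the prediction against requests."""

    def __init__(self, stride: int = 1):
        self.stride = stride     # assumed pattern: fixed-stride series
        self.predicted = None    # next address we expect the processor to request

    def observe(self, addr: int):
        """Return an address to prefetch, or None if no series is confirmed."""
        if self.predicted is not None and addr == self.predicted:
            # The request matched the prediction: the series continues,
            # so report the next address in the series for prefetching.
            self.predicted = addr + self.stride
            return addr + self.stride
        # Start (or restart) tracking a potential series from this address.
        self.predicted = addr + self.stride
        return None
```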
2 Claims
2. A method for increasing data-processing speed in a computer system having at least one processor, a memory device, and a cache memory coupled to said at least one processor, said method comprising the steps of:
predicting a predicted memory address from a previously requested memory address according to each of at least one memory address pattern;

comparing a requested memory address associated with data requested by the at least one processor to the predicted memory address of the at least one memory address pattern; and

upon a condition in which the requested memory address is accurately represented by the predicted memory address of a corresponding one of the at least one memory address pattern:

computing a next memory address from said requested memory address and the corresponding memory address pattern; and

prefetching new data from the memory device and storing said new data in the cache memory, said new data prefetched from the computed next memory address in anticipation of a request for said new data from said at least one processor;

wherein upon the condition that there is a cache hit for the requested data in the cache memory, N new data is prefetched, where N is at least 1, and upon the condition that there is a cache miss for the requested data in the cache memory, M new data is prefetched, where M is greater than N.
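The steps of the method claim above can be sketched as one request-handling iteration, assuming a fixed-stride address pattern and a set-like cache. All identifiers and the default values of the stride, N, and M are illustrative assumptions, not the patent's implementation.

```python
def prefetch_step(addr, predicted, cache, stride=1, n=1, m=4):
    """One iteration of the claimed method.

    `predicted` is the address predicted from the previous request;
    returns the prediction for the next call. On an accurate prediction,
    prefetch N blocks after a cache hit or M (> N) after a cache miss.
    """
    hit = addr in cache
    if not hit:
        cache.add(addr)                    # service the demand miss
    if predicted == addr:                  # prediction was accurate
        count = n if hit else m            # N on a hit, M (> N) on a miss
        for i in range(1, count + 1):
            cache.add(addr + i * stride)   # prefetch ahead of demand
    return addr + stride                   # predicted next address
```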
Specification