Method of varying the amount of data prefetched to a cache memory in dependence on the history of data requests
Abstract
A method of dynamically prefetching data for a cache memory is controlled by the past history of data requests. If the previous fetch and the current fetch request are not sequential, no data is prefetched. If the previous fetch and the current fetch request are sequential and less than all of the current fetch request is already in the cache, two blocks of data sequentially beyond the current fetch request are prefetched. If the previous two fetches and the current fetch request are sequential and less than all of the current fetch request is already in the cache, four blocks of data sequentially beyond the current fetch request are prefetched. If the previous three fetches and the current fetch request are sequential and less than all of the current fetch request is already in the cache, eight blocks of data sequentially beyond the current fetch request are prefetched. The prefetch amount is limited to eight blocks; each additional sequential request, less than all of which is already in the cache, causes eight blocks to be prefetched.
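The policy described in the abstract can be sketched as a small state machine: a counter of successive sequential fetches selects the prefetch amount (0, 2, 4, or 8 blocks, capped at 8), and any non-sequential request resets the run. This is a minimal illustrative Python sketch, not the patented implementation; the class and method names are hypothetical.

```python
class SequentialPrefetcher:
    """Illustrative sketch of the abstract's prefetch policy (hypothetical
    names): the prefetch amount escalates 2, 4, 8 blocks with each
    successive sequential fetch, capped at 8; a non-sequential fetch
    resets the run and prefetches nothing."""

    SCHEDULE = {0: 0, 1: 2, 2: 4}  # run length -> blocks to prefetch
    CAP = 8                        # limit once the run reaches three

    def __init__(self):
        self.run_length = 0        # successive sequential fetches so far
        self.next_block = None     # block address expected if sequential

    def on_fetch(self, block):
        """Return the number of blocks to prefetch beyond `block`."""
        if block == self.next_block:
            self.run_length += 1
        else:
            self.run_length = 0    # non-sequential: reset the history
        self.next_block = block + 1
        return self.SCHEDULE.get(self.run_length, self.CAP)
```

For example, four sequential fetches yield prefetch amounts 0, 2, 4, 8, and any further sequential fetch stays at 8 until the pattern breaks.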
14 Claims
1. A method of prefetching data for a cache memory from a mass-storage device in response to a fetch request generated by a requesting circuit, comprising the steps of:

checking whether a previous fetch and a current fetch request are for data with sequential addresses, and prefetching data with sequential addresses beyond the current fetch request if the previous fetch and the current fetch request are for data with sequential addresses; and

keeping track of the number of successive fetches with sequential addresses, and progressively increasing the amount of said data with sequential addresses prefetched as the number of successive fetches with sequential addresses progressively increases.

(Dependent Claims: 2, 3, 4)
5. A method of prefetching blocks of data from a first memory to a second memory in response to a fetch request generated by a requesting circuit, comprising the steps of:

checking whether a previous fetch and a current fetch request are for data with sequential addresses, and keeping track of the number of successive fetches with sequential addresses;

prefetching the next two sequential blocks beyond an address of the current fetch request if the previous fetch and the current fetch request are for data with sequential addresses;

prefetching the next four sequential blocks beyond the current fetch request if the previous two fetches and the current fetch request are for data with sequential addresses; and

prefetching the next eight sequential blocks beyond the current fetch request if the previous three fetches and the current fetch request are for data with sequential addresses.
6. A method of prefetching data for storage in a cache memory from a mass-storage device in response to a fetch request generated by a requesting circuit, comprising the steps of:

checking whether a previous fetch and a current fetch request are for data with sequential addresses, and maintaining a record representative of the number of successive fetches with sequential addresses;

prefetching the next two sequential blocks beyond an address of the current fetch request if the previous fetch and the current fetch request are for data with sequential addresses and the cache memory lacks a portion of the currently requested data;

prefetching the next four sequential blocks beyond the current fetch request if the previous two fetches and the current fetch request are for data with sequential addresses and the cache memory lacks a portion of the currently requested data; and

prefetching the next eight sequential blocks beyond the current fetch request if the previous three fetches and the current fetch request are for data with sequential addresses and the cache memory lacks a portion of the currently requested data.

(Dependent Claims: 7, 8)
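Claim 6 adds a gating condition to the escalating schedule: the prefetch is issued only when the cache lacks a portion of the currently requested data. A minimal sketch of that condition, assuming block addresses are modeled as sets (the function name and set-based cache model are illustrative, not from the patent):

```python
def blocks_to_prefetch(run_length, requested, cache):
    """Illustrative sketch of claim 6's decision (hypothetical names).

    run_length: successive fetches with sequential addresses, counting
                the current request (0 means the pattern just broke);
    requested:  set of block addresses in the current fetch request;
    cache:      set of block addresses currently held in the cache.
    """
    # Prefetch only if the cache lacks a portion of the requested data.
    if not requested <= cache:
        if run_length >= 3:
            return 8
        if run_length == 2:
            return 4
        if run_length == 1:
            return 2
    return 0  # non-sequential, or request fully satisfied from cache
```

A request already wholly in the cache triggers no prefetch regardless of the sequential run, which keeps the policy from flooding the cache with blocks it may already hold.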
9. A method of transferring data from a first memory to a second memory in response to successive requests which are issued by a requesting circuit and which each specify a segment of data having sequential addresses, comprising the steps of:

detecting each said fetch request;

fetching from said first memory to said second memory, in response to detection of each of a first and a second of said requests, a respective quantity of data which includes a fetch portion at sequential addresses and a prefetch portion at sequential addresses, said prefetch portion following and being contiguous to said fetch portion; and

dynamically varying the size of said prefetch portion, wherein the respective prefetch portions, which are fetched in response to each of said first and second requests, are of different size, said dynamically varying step including the steps of checking whether the quantity of data to be fetched in response to a current execution of said fetching step is contiguous to the quantity of data fetched in response to the immediately preceding execution of said fetching step, keeping track of the number of successive executions of said fetching step, including the current execution thereof, in which said quantities of data fetched are all contiguous, and selecting the size of said prefetch portion in dependence on the number of successive executions of said fetching step, including the current execution thereof, in which said quantities of data fetched are all contiguous.

(Dependent Claims: 10, 11, 12, 13, 14)
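Claim 9 frames each transfer as a fetch portion (the requested segment) plus a contiguous prefetch portion whose size is selected from the count of successive contiguous transfers. A sketch of how such a transfer might be composed, assuming a doubling schedule capped at eight blocks (the function name and the specific 2/4/8 schedule are taken from the other claims, not recited in claim 9 itself):

```python
def build_transfer(start, length, contiguous_run):
    """Illustrative sketch of claim 9's transfer (hypothetical names).

    start:          first block address of the requested segment;
    length:         number of blocks in the fetch portion;
    contiguous_run: successive contiguous transfers, counting this one.

    Returns (fetch, prefetch) as ranges of block addresses; the prefetch
    portion follows and is contiguous to the fetch portion.
    """
    # Size the prefetch portion from the contiguity history: 2, 4, 8
    # blocks, capped at 8; no prefetch when the run is broken.
    prefetch_len = min(2 ** contiguous_run, 8) if contiguous_run else 0
    fetch = range(start, start + length)
    prefetch = range(start + length, start + length + prefetch_len)
    return fetch, prefetch
```

The key property of the claim, that the prefetch portion is contiguous to the fetch portion, falls out of the construction: the prefetch range begins exactly where the fetch range ends.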
Specification