Prescheduling sequential data prefetches in a preexisting LRU cache
First Claim
1. A method for scheduling prefetches into a Least Recently Used (LRU) cache of a data storage system, the method comprising:
remotely modeling dynamic operation of the LRU cache in a model, the model including a model of data elements currently stored within the LRU cache;
assigning a priority value to modeled data elements according to their history;
assigning a priority value to a requested data element based at least partially on whether a preceding data element is present in the LRU cache;
making a cache management decision based upon the model, comprising:
intercepting a request for a data element from a stream of Input/Output (I/O) data requests sent from a host and addressed to the LRU cache; and
determining whether to schedule a prefetch of a data element logically successive to the requested data element in accordance with contents of the LRU cache as indicated by the remote model; and
executing prefetches into the LRU cache in response to select cache management decisions.
2 Assignments
0 Petitions
Abstract
A shared system memory, such as a cache, buffers Input/Output (I/O) requests between one or more host computers and one or more data storage servers or devices. The cache may be configured to operate natively as a least-recently-used (LRU)-only cache and may be optimized for random data accesses. Data buffered by the cache may be part of a sequential data stream for which prefetching is desirable. A remote prefetch module is provided between the cache and the host to conduct prefetching without internally modifying the cache. The remote prefetch module maintains a model of the cache. Using the model, the prefetch module anticipates whether data is likely to be part of a sequential stream of data passed between a host and a data storage device. If so, the prefetch module schedules a prefetch of the data. The prefetch may be achieved by sending an I/O request to the data server or device. The remote cache model minimizes the impact on random-access hits by reducing the likelihood of prefetching data that is never used, and improves the efficiency with which likely prefetch candidates are identified.
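The abstract's central data structure is a shadow model of the LRU cache that the remote prefetch module keeps current by observing I/O traffic rather than by modifying the cache itself. The Python sketch below is a minimal illustration of such a model; the class name, integer element identifiers, and fixed capacity are assumptions made for illustration and are not taken from the patent.

from collections import OrderedDict

class CacheModel:
    """Illustrative mirror of an LRU-only cache's contents.

    The remote prefetch module never modifies the real cache; it only
    observes the I/O stream and keeps this shadow copy of which data
    elements the cache is presumed to hold, in LRU order, along with a
    priority value per element.
    """

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._elements = OrderedDict()  # element id -> priority value

    def contains(self, element: int) -> bool:
        # Read-only query of the modeled cache contents.
        return element in self._elements

    def touch(self, element: int, priority: float) -> None:
        """Record that `element` was requested or prefetched: it becomes
        the youngest member of the modeled cache, displacing the modeled
        LRU victim if the cache is full."""
        if element in self._elements:
            self._elements.move_to_end(element)
            self._elements[element] = priority
        else:
            if len(self._elements) >= self.capacity:
                self._elements.popitem(last=False)  # evict modeled oldest entry
            self._elements[element] = priority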
68 Citations
24 Claims
1. A method for scheduling prefetches into a Least Recently Used (LRU) cache of a data storage system, the method comprising:
remotely modeling dynamic operation of the LRU cache in a model, the model including a model of data elements currently stored within the LRU cache;
assigning a priority value to modeled data elements according to their history;
assigning a priority value to a requested data element based at least partially on whether a preceding data element is present in the LRU cache;
making a cache management decision based upon the model, comprising:
intercepting a request for a data element from a stream of Input/Output (I/O) data requests sent from a host and addressed to the LRU cache; and
determining whether to schedule a prefetch of a data element logically successive to the requested data element in accordance with contents of the LRU cache as indicated by the remote model; and
executing prefetches into the LRU cache in response to select cache management decisions.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
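Claim 1 recites the core decision loop: intercept a request, judge from the modeled cache contents whether the request looks sequential (its predecessor is cached), and, if so, consider prefetching the logically successive element. The sketch below illustrates that loop, reusing the CacheModel sketch given after the abstract; the additive priority scheme, the fixed threshold of 1.5, and the prefetch_io callback are illustrative assumptions, not the claimed method.

def handle_request(model, requested: int, prefetch_io, base_priority: float = 1.0,
                   sequential_bonus: float = 1.0, threshold: float = 1.5) -> None:
    """Intercept one I/O request and decide whether to schedule a prefetch.

    `model` is a CacheModel as sketched above; `prefetch_io` is whatever
    callable issues a read I/O toward the storage server so the result
    lands in the cache. The numeric priority scheme is a placeholder: the
    claim only requires that the decision depend at least partially on
    whether the preceding data element is already present in the cache.
    """
    # A request whose predecessor is cached looks like part of a
    # sequential stream, so it earns a higher priority value.
    priority = base_priority
    if model.contains(requested - 1):
        priority += sequential_bonus

    # The requested data element becomes the youngest member of the modeled cache.
    model.touch(requested, priority)

    # Schedule a prefetch of the logically successive element if the
    # priority clears the threshold and that element is not already cached.
    successor = requested + 1
    if priority >= threshold and not model.contains(successor):
        prefetch_io(successor)
        model.touch(successor, priority)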
13. A method for scheduling prefetches in a data storage system having a host and a cache, the method comprising the steps of:
providing a cache for caching Input/Output (I/O) data;
providing a prefetch module remote to the cache;
remotely modeling the cache within the prefetch module and determining whether to schedule a prefetch of data into the cache according to the results of the step of remotely modeling the cache, the step of remotely modeling the cache further comprising:
examining the history of a data element in the cache;
assigning a priority value to the data element according to its history;
comparing that priority value to a predetermined threshold value;
determining a size of memory used in the cache;
periodically fetching an I/O rate of the cache from the cache;
periodically fetching a hit rate of the cache from the cache; and
determining a single reference residency time for a data element within the cache;
intercepting a stream of I/O information from the host to the cache to locate a requested data element;
determining if the requested data element in the stream of I/O information is already present within the cache;
making the requested data element a youngest member of the cache;
determining if the data element preceding the requested data element is present in the cache;
assigning a priority value to the requested data element;
periodically reevaluating the performance of the cache versus an internal model of the cache if the number of I/O requests received by the cache is greater than a predetermined number;
updating the dynamic threshold used in the internal model of the cache if the dynamic threshold value does not adequately model the performance of the cache;
comparing the priority value of the requested data element with the dynamic threshold value; and
prefetching the requested data element if the priority value of the requested data element is greater than the dynamic threshold value by passing an I/O request of the data element to the cache.
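Claim 13 adds a dynamic threshold: the prefetch module periodically fetches the cache's I/O rate and hit rate, compares the real cache's behavior with its internal model, and updates the threshold when the model no longer tracks reality. The sketch below shows one plausible adjustment rule as an assumption; the claim states only that the threshold is updated when it does not adequately model the performance of the cache, not how the update is computed.

class DynamicThreshold:
    """Illustrative periodic re-evaluation of the prefetch threshold.

    The raise-on-shortfall / lower-on-surplus rule below is an assumed
    policy for the sake of the sketch, not the patented method.
    """

    def __init__(self, initial: float = 1.5, step: float = 0.1,
                 reevaluate_every: int = 1000):
        self.value = initial
        self.step = step
        self.reevaluate_every = reevaluate_every
        self.requests_seen = 0

    def record_request(self, observed_hit_rate: float, modeled_hit_rate: float,
                       tolerance: float = 0.05) -> None:
        """Call once per intercepted I/O request; re-evaluate only after a
        predetermined number of requests has been seen."""
        self.requests_seen += 1
        if self.requests_seen < self.reevaluate_every:
            return
        self.requests_seen = 0
        # If the real cache performs worse than the model predicts,
        # prefetching is likely polluting it: demand a higher priority.
        if observed_hit_rate < modeled_hit_rate - tolerance:
            self.value += self.step
        elif observed_hit_rate > modeled_hit_rate + tolerance:
            self.value = max(0.0, self.value - self.step)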
14. A data prefetch scheduling system comprising:
an LRU cache configured to communicate with a host; and
a remote prefetch module configured to communicate with the host and the LRU cache, intercept a stream of Input/Output (I/O) information from the host to the LRU cache to locate a requested data element, and further configured to determine whether to schedule a prefetch of data into the LRU cache, wherein the determination to prefetch data is at least partially determined based on whether a data element preceding a requested data element is present in the LRU cache;
a modeling module operating within the remote prefetch module configured to model the LRU cache, including providing a model of data elements currently stored within the LRU cache, wherein each data element is assigned a priority value according to its history; and
a prefetch request module configured to request a data I/O from the LRU cache when the remote prefetch module determines that a prefetch is to be conducted.
View Dependent Claims (15, 16, 17, 18, 19, 20)
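Claim 14 restates the method as a system of cooperating modules: the LRU cache, the remote prefetch module that intercepts host I/O, the modeling module, and the prefetch request module that issues the actual prefetch I/O. The driver below wires together the CacheModel and handle_request sketches from earlier to show the intended data flow; the 64-element capacity, the short host trace, and the prefetch_io stub are all made-up placeholders.

def run_trace():
    model = CacheModel(capacity=64)
    prefetched = []

    def prefetch_io(element: int) -> None:
        # Stand-in for the prefetch request module issuing a read I/O
        # so the element is loaded into the LRU cache.
        prefetched.append(element)

    # Mixed sequential and random reads arriving from the host.
    host_trace = [10, 11, 12, 40, 13, 90]
    for element in host_trace:
        handle_request(model, element, prefetch_io)

    return prefetched  # successors of the sequential run 10, 11, 12, 13

if __name__ == "__main__":
    print(run_trace())  # [12, 13, 14] under the assumed priority scheme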
21. A remote prefetch module for determining whether to schedule a prefetch of data into a Least Recently Used (LRU) cache of a computer system, the prefetch module comprising:
a modeling module configured to model dynamic operation of the LRU cache, wherein the modeling module is further configured to provide a model of data elements currently stored within the LRU cache, wherein each data element is assigned a priority value according to its history and requested data elements are assigned a priority value based at least partially on whether a preceding data element is present in the LRU cache; and
a calculation module configured to make a cache management decision based upon the model, wherein the cache management decision is at least partially determined based on intercepting a stream of I/O information from a host to the LRU cache to locate a requested data element and is further based on whether a data element preceding the requested data element is present in the LRU cache.
22. A computer station on a computer network, wherein the computer station is configured to communicate with a Least Recently Used (LRU) cache coupled to a storage device of the computer network, the computer station comprising:
a processor; and
a memory configured to store data structures comprising:
a modeling module configured to model dynamic operation of the LRU cache, wherein the modeling module is further configured to provide a model of data elements currently stored within the LRU cache, wherein each data element is assigned a priority value according to its history and requested data elements are assigned a priority value based at least partially on whether a preceding data element is present in the LRU cache; and
a calculation module configured to make a cache management decision based upon the model, wherein the cache management decision is at least partially determined based on intercepting a stream of I/O information from a host to the LRU cache to locate a requested data element and is further based on whether a data element preceding a requested data element is present in the LRU cache.
23. A computer readable medium comprising executable data structures configured to carry out a method for scheduling prefetches into a Least Recently Used (LRU) cache of a data storage system, the method comprising:
remotely modeling dynamic operation of the LRU cache in a model, the model including a model of data elements currently stored within the LRU cache;
assigning a priority value to modeled data elements according to their history;
assigning a priority value to a requested data element based at least partially on whether a preceding data element is present in the LRU cache;
making a cache management decision based upon the model;
executing prefetches into the LRU cache in response to select cache management decisions; and
assigning the requested data element a status corresponding to a youngest member of the LRU cache.
24. A data prefetch scheduling system comprising:
a means for remotely modeling dynamic operation of an LRU cache in a model, the remotely modeling including providing a model of data elements currently stored within the LRU cache;
a means for assigning a priority value to modeled data elements according to their history;
a means for assigning a priority value to a requested data element based at least partially on whether a preceding data element is present in the LRU cache;
a means for making a cache management decision based upon the model, comprising:
a means for intercepting a request for a data element from a stream of Input/Output (I/O) data requests sent from a host and addressed to the LRU cache;
a means for determining whether to schedule a prefetch of a data element logically successive to the requested data element in accordance with contents of the LRU cache as indicated by the remote model; and
a means for executing prefetches into the LRU cache in response to select cache management decisions.
Specification