Method to throttle rate of data caching for improved I/O performance
First Claim
1. A method of storing data onto a cache device when the amount of hot data exceeds the cache device's storage capacity, comprising:
- monitoring references to data stored in one or more physical cache windows in a cache device by utilizing a least recently used queue block;
- prioritizing said data stored in said one or more physical cache windows to a least recently used queue in said least recently used queue block;
- promoting said one or more physical cache windows to a higher priority least recently used queue in said least recently used queue block based on the number of said references to said data in said one or more physical cache windows during a certain time period;
- demoting said one or more physical cache windows to a lower priority least recently used queue in said least recently used queue block when said data is not accessed during a certain time period;
- monitoring the number of demoted physical cache windows in said cache device using one or more counters in said least recently used queue block;
- searching a hash table in communication with said least recently used queue;
- identifying that said cache device is thrashing when the total number of said demoted physical cache windows is equal to zero and a total number of free physical cache windows is equal to zero; and
- increasing the number of said references required to store said data in said one or more demoted physical cache windows, wherein said increased number of said references is greater than three references.
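The flow of claim 1 can be sketched as a multi-queue LRU with per-period demotion counters and an admission threshold that is raised when thrashing is detected. This is a minimal illustrative sketch, not the patented implementation: all names (`LRUQueueBlock`, `end_period`, `check_thrashing`), the number of queues, and the `PROMOTE_AFTER` threshold are assumptions.

```python
from collections import OrderedDict

class CacheWindow:
    """A physical cache window tracking per-period references to its data."""
    def __init__(self, key):
        self.key = key
        self.references = 0

class LRUQueueBlock:
    """Sketch of the claimed LRU queue block: several priority queues,
    per-queue demotion counters, and an admission threshold that is
    raised when thrashing is detected (names and thresholds assumed)."""

    PROMOTE_AFTER = 2  # references in a period needed to promote (assumed value)

    def __init__(self, num_queues=3, total_windows=4):
        # Index 0 is the lowest-priority LRU queue.
        self.queues = [OrderedDict() for _ in range(num_queues)]
        self.demotion_counters = [0] * num_queues
        self.free_windows = total_windows
        self.hash_table = {}      # data key -> (queue level, CacheWindow)
        self.pending = {}         # references to not-yet-cached data
        self.required_refs = 1    # references needed before caching new data

    def reference(self, key):
        """Record a reference; promote hot windows, and admit new data
        only once it has been referenced required_refs times."""
        if key in self.hash_table:
            level, win = self.hash_table[key]
            win.references += 1
            if win.references >= self.PROMOTE_AFTER and level + 1 < len(self.queues):
                del self.queues[level][key]
                self.queues[level + 1][key] = win
                self.hash_table[key] = (level + 1, win)
            return
        self.pending[key] = self.pending.get(key, 0) + 1
        if self.pending[key] >= self.required_refs and self.free_windows > 0:
            win = CacheWindow(key)
            win.references = self.pending.pop(key)
            self.queues[0][key] = win
            self.hash_table[key] = (0, win)
            self.free_windows -= 1

    def end_period(self):
        """Close a time period: demote windows whose data was not accessed,
        counting each demotion, then reset per-period reference counts."""
        self.demotion_counters = [0] * len(self.queues)
        for level in range(1, len(self.queues)):
            for key, win in list(self.queues[level].items()):
                if win.references == 0:
                    del self.queues[level][key]
                    self.queues[level - 1][key] = win
                    self.hash_table[key] = (level - 1, win)
                    self.demotion_counters[level - 1] += 1
        for q in self.queues:
            for win in q.values():
                win.references = 0

    def check_thrashing(self):
        """Claimed test: thrashing when no windows were demoted and none
        are free; respond by requiring more than three references before
        new data is cached."""
        if sum(self.demotion_counters) == 0 and self.free_windows == 0:
            self.required_refs = 4  # "greater than three references"
            return True
        return False
```

With every window occupied and every cached item still hot, no window demotes and none frees, so `check_thrashing` fires and new data must accumulate four references before it displaces anything.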
Abstract
A cache device for the caching of data, and specifically for the identification of stale data or a thrashing event within the cache device, is described. A cache device for the prioritization of cached data during a thrashing event, as well as of stale cached data, is also described. Methods associated with the use of the caching device for the caching of data and for the identification of data in a thrashing event or of stale cached data are also described.
7 Claims
1. A method of storing data onto a cache device when the amount of hot data exceeds the cache device's storage capacity (reproduced in full under First Claim above). - View Dependent Claims (2, 3, 4, 5)
6. A cache device for the caching of data comprising:
- at least one virtual cache window;
- at least one physical cache window;
- a least recently used queue block in communication with said at least one physical cache window, wherein said least recently used queue block comprises at least two least recently used queues, at least two counters and one global counter; and
- a hash table in communication with said least recently used queue block,
wherein said at least one physical cache window is prioritized to said at least two least recently used queues in said least recently used queue block based on the number of references to data stored in said at least one physical cache window, wherein said hash table maintains a list of the location of said data stored in said at least one physical cache window as well as the location of said at least one physical cache window, wherein said least recently used queue block monitors the number of references to said data stored in said at least one physical cache window, and wherein said at least two counters and one global counter monitor demotion of said at least one physical cache window in communication with said least recently used queue block. - View Dependent Claims (7)
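The structure recited in claim 6 can be sketched as a composition of the named parts: virtual and physical cache windows, an LRU queue block with two queues, two per-queue counters and one global counter, and a hash table. All class and field names below are illustrative assumptions, not taken from the patent.

```python
from collections import OrderedDict
from dataclasses import dataclass

@dataclass
class PhysicalCacheWindow:
    """A region of the cache device that holds cached data (name assumed)."""
    window_id: int
    references: int = 0

@dataclass
class VirtualCacheWindow:
    """Tracks candidate data before a physical window is committed to it."""
    start_lba: int
    references: int = 0

class LRUQueueBlock:
    """At least two LRU queues, at least two per-queue counters,
    and one global counter monitoring window demotions."""
    def __init__(self):
        self.low_queue = OrderedDict()   # lower-priority LRU queue
        self.high_queue = OrderedDict()  # higher-priority LRU queue
        self.low_demotions = 0           # per-queue demotion counters
        self.high_demotions = 0
        self.global_demotions = 0        # global demotion counter

class CacheDevice:
    """Composition matching claim 6: windows, an LRU queue block, and a
    hash table giving the location of cached data and of the physical
    window holding it."""
    def __init__(self, num_windows):
        self.virtual_windows = [VirtualCacheWindow(start_lba=0)]
        self.physical_windows = [PhysicalCacheWindow(i) for i in range(num_windows)]
        self.lru_block = LRUQueueBlock()
        self.hash_table = {}  # data key -> (queue, PhysicalCacheWindow)
```

The hash table serves double duty in the claim, locating both the data and the window that stores it, which is why the sketch maps a key to a (queue, window) pair.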
Specification