Method for operating a cache memory using an LRU table and access flags
First Claim
1. A method for operating a cache memory system having a mass storage device and a cache memory, said method comprising the steps of:
storing in said cache memory selected data from said mass storage device;
maintaining a Least Recently Used (LRU) table indicating the relative recency of use of each data stored in said cache memory;
moving a selected entry to the top of said LRU table for each new block of data stored in said cache memory;
setting a flag for each entry in said LRU table when data stored in said cache memory associated with said LRU entry has been accessed;
when cache space is determined to be needed corresponding to an area of mass storage which is not currently assigned in said cache memory:
examining the entry at the LRU position in said LRU table;
determining if, for said LRU position, said flag has been set;
if said flag has been set, unsetting said flag and placing said entry at the top of said LRU table;
if said flag has not been set, decaching said data in said cache memory associated with said LRU entry, reusing said LRU entry for storing said current data not stored in said cache memory, and placing said LRU entry at the MRU position of said LRU table; and
examining the LRU entry resulting from moving the flagged entry to the MRU position.
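The eviction procedure recited in the claim — an access flag that grants a recently used entry one reprieve at the LRU position before it is decached — can be sketched as follows. The class and names below are illustrative, not drawn from the patent text; the scheme is comparable to the classic second-chance (CLOCK) replacement policy:

```python
from collections import OrderedDict

class SecondChanceCache:
    """Hypothetical sketch of the claimed method: an LRU table whose
    entries carry an access flag giving them a second chance at the
    LRU position before eviction."""

    def __init__(self, capacity):
        self.capacity = capacity
        # Ordered from the LRU position (front) to the MRU position
        # (back); each value is a (data, accessed_flag) pair.
        self.table = OrderedDict()

    def access(self, block, data):
        if block in self.table:
            # Set the flag when cached data is accessed.
            cached, _ = self.table[block]
            self.table[block] = (cached, True)
            return cached
        # Cache space is needed for an area of mass storage not
        # currently assigned in the cache.
        if len(self.table) >= self.capacity:
            self._evict()
        # The new block enters at the MRU position with its flag unset.
        self.table[block] = (data, False)
        return data

    def _evict(self):
        while True:
            # Examine the entry at the LRU position.
            lru_block, (lru_data, flag) = next(iter(self.table.items()))
            del self.table[lru_block]
            if flag:
                # Flag set: unset it, place the entry at the MRU
                # position, and examine the resulting LRU entry.
                self.table[lru_block] = (lru_data, False)
            else:
                # Flag not set: decache the data and reuse the entry.
                return
```

The loop always terminates: each pass over a flagged entry unsets its flag, so at worst one full sweep of the table precedes an eviction.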
Abstract
A computer data storage device made up of both solid state storage and rotating magnetic disk which maintains a fast response time approaching that of a solid state device for many workloads and improves on the response time of a normal magnetic disk for practically all workloads. The high performance is accomplished by a special hardware configuration coupled with unique procedures and algorithms pertaining to the methodology of placing and maintaining data in the most appropriate media based on actual and projected activity. The system management features a completely searchless method for determining the location of data within and between the two devices. Sufficient solid state memory capacity is incorporated to permit retention of useful, active data, as well as to permit prefetching of data into the solid state component when the probabilities favor such action. Movement of updated data from the solid state storage to the magnetic disk and of prefetched data from the magnetic disk to the solid state component is done on a timely, but unobtrusive, basis as background tasks of the described device. The direct, private channel between the solid state storage and the magnetic disk prevents the conversations between these two media from conflicting with the transmission of data between the host computer and the described device. A set of microprocessors manage and oversee the data transmission and storage. Data integrity is maintained through a power interruption via a battery assisted, automatic and intelligent shutdown procedure.
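The "completely searchless method for determining the location of data" described in the abstract can be read as a direct lookup keyed by mass-storage address, so that deciding whether a block resides in the solid state component never requires scanning the cache. The sketch below is a hypothetical illustration of that idea; the table layout and names are assumptions, not taken from the specification:

```python
# Sentinel meaning "this mass-storage block is not in solid state".
NOT_CACHED = -1

class LocationTable:
    """Hypothetical sketch of a searchless location scheme: one table
    entry per mass-storage block, holding either the solid-state slot
    number or NOT_CACHED."""

    def __init__(self, disk_blocks):
        self.slot_of = [NOT_CACHED] * disk_blocks

    def assign(self, block, slot):
        # Record that a disk block now occupies a solid-state slot.
        self.slot_of[block] = slot

    def release(self, block):
        # Record that the block has been decached.
        self.slot_of[block] = NOT_CACHED

    def locate(self, block):
        # Direct index by block address: constant time, no search.
        return self.slot_of[block]
```

The design choice is the usual space-for-time trade: the table consumes one entry per disk block, but every lookup is a single array index rather than a search over cache contents.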
7 Claims
Specification