Adapting resource use to improve performance in a caching memory system
1 Assignment
0 Petitions
Abstract
A memory system, and a method for controlling prestaging activities based upon the availability of resources within the memory system. Prestage requests are stored in a shared memory accessible to a resource controller and one or more memory controllers. When the resource controller determines that there is sufficient unused cache memory and sufficient unused memory-device back-end bandwidth available to prestage at least one data track, a message is broadcast to all of the memory controllers. Those memory controllers with sufficient unused throughput accept the prestage requests and copy the associated data tracks from the memory devices to the cache memory. Counters are maintained in the shared memory to track the number of prestage requests in the process of being serviced, and the number of prestaged data tracks already buffered in cache memory and waiting to be accessed by an external host.
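The flow described in the abstract can be sketched in code. This is an illustrative simplification, not the patented implementation: all class names, method names, and units (work counted in whole data tracks) are assumptions made for the example.

```python
TRACK = 1  # work is counted in whole data tracks for simplicity

class SharedMemory:
    """Shared state visible to the resource controller and all memory controllers."""
    def __init__(self, cache_capacity, device_bandwidth):
        self.requests = []            # unaccepted prestage requests (track ids)
        self.in_progress = 0          # tracks currently being copied into cache
        self.waiting_access = 0       # tracks buffered, awaiting host access
        self.cache_capacity = cache_capacity
        self.device_bandwidth = device_bandwidth

    def available_capacity(self):
        # unused cache capacity, net of tracks in flight or already buffered
        return self.cache_capacity - self.in_progress - self.waiting_access

    def available_bandwidth(self):
        return self.device_bandwidth - self.in_progress

class MemoryController:
    def __init__(self, shared, throughput):
        self.shared = shared
        self.throughput = throughput  # spare throughput, in tracks

    def on_message(self):
        # accept requests only while this controller has spare throughput
        while self.throughput >= TRACK and self.shared.requests:
            track = self.shared.requests.pop(0)  # accept one prestage request
            self.shared.in_progress += 1
            # (the copy of `track` from the memory device to cache goes here)
            self.shared.in_progress -= 1
            self.shared.waiting_access += 1
            self.throughput -= TRACK

class ResourceController:
    def __init__(self, shared, controllers):
        self.shared = shared
        self.controllers = controllers

    def prestage(self, track_ids):
        self.shared.requests.extend(track_ids)
        # broadcast only when resources suffice for at least one track
        if (self.shared.requests
                and self.shared.available_capacity() >= TRACK
                and self.shared.available_bandwidth() >= TRACK):
            for mc in self.controllers:
                mc.on_message()
```

In this sketch the resource checks are made once per broadcast; a fuller model would recheck capacity and bandwidth as each request is accepted.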
83 Citations
22 Claims
1. A memory system for use by a host, the memory system comprising:
at least one memory device having a plurality of data tracks and an allocated bandwidth of which an unused portion defines an available bandwidth;
a cache memory in communication with the host and having an allocated capacity of which an unused portion defines an available capacity;
at least one memory controller in communication with the at least one memory device and the cache memory, the at least one memory controller having a total throughput of which an unused portion defines an available throughput;
a resource controller in communication with the at least one memory controller and the host, the resource controller being operative to generate a plurality of prestage requests, each prestage request of the plurality of prestage requests identifying a respective data track of the plurality of data tracks in the at least one memory device, wherein the resource controller is operative to broadcast a message to the at least one memory controller when there is at least one unaccepted prestage request of the plurality of prestage requests, and when at least one resource selected from the group of resources consisting of the available capacity of the cache memory and the available bandwidth of the at least one memory device is sufficient to copy one data track of the plurality of data tracks to the cache memory; and
wherein the at least one memory controller receives the message from the resource controller, and each memory controller of the at least one memory controller having the available throughput sufficient to copy the one data track of the plurality of data tracks reads an accepted prestage request of the at least one unaccepted prestage request, and copies the respective data track of the plurality of data tracks to the cache memory.

2. The memory system of claim 1 further comprising:
an in-progress counter indicating how many respective data tracks of the plurality of data tracks are in the process of being copied into the cache memory; and
a waiting-access counter indicating how many respective data tracks of the plurality of data tracks are buffered by the cache memory and are waiting to be accessed by the host;
wherein each memory controller of the at least one memory controller increments the in-progress counter for each accepted prestage request of the at least one unaccepted prestage request, and decrements the in-progress counter and increments the waiting-access counter for each respective data track of the plurality of data tracks it has copied to the cache memory; and
wherein the resource controller decrements the waiting-access counter for each respective data track of the plurality of data tracks buffered by the cache memory that is accessed by the host.
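The counter discipline of claim 2 can be sketched as follows. This is a hedged illustration; the class and method names are assumptions, and only the increment/decrement ordering is taken from the claim.

```python
class Counters:
    """Shared counters from claim 2 (illustrative names)."""
    def __init__(self):
        self.in_progress = 0      # tracks in the process of being copied
        self.waiting_access = 0   # tracks buffered, awaiting the host

    # memory-controller side
    def accept_request(self):
        self.in_progress += 1     # on reading an accepted prestage request

    def copy_complete(self):
        self.in_progress -= 1     # the copy has finished ...
        self.waiting_access += 1  # ... and the track now waits for the host

    # resource-controller side
    def host_accessed(self):
        self.waiting_access -= 1  # the host consumed a prestaged track
```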
3. The memory system of claim 2 further comprising a shared memory in communication with the resource controller and the at least one memory controller, for holding the plurality of prestage requests, the in-progress counter, and the waiting-access counter.
4. The memory system of claim 2 wherein the resource controller calculates the available capacity of the cache memory based upon a first percentage of the allocated capacity of the cache memory, the in-progress counter, and the waiting-access counter.
5. The memory system of claim 4 wherein the first percentage is approximately 50%.
6. The memory system of claim 4 wherein the cache memory further buffers at least one data track of the plurality of data tracks associated with at least one non-prestage request, and the first percentage is calculated based upon the allocated capacity of the cache memory currently consumed by the at least one data track associated with the at least one non-prestage request and a burst rate.
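One way to read the capacity rule of claims 4 and 5, expressed in tracks, is sketched below. The function name, the choice of units, and the simple subtraction are assumptions made for illustration; the claim only recites that the calculation is based upon these three quantities.

```python
def available_cache_capacity(allocated_tracks, in_progress, waiting_access,
                             first_percentage=0.5):
    """Cache capacity still usable for prestaging, in tracks.

    The 0.5 default reflects claim 5's "approximately 50%"; claim 6 would
    instead compute first_percentage from non-prestage cache usage and a
    burst rate.
    """
    return first_percentage * allocated_tracks - in_progress - waiting_access
```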
7. The memory system of claim 2 wherein the resource controller calculates the available bandwidth of the at least one memory device based upon a second percentage of the allocated bandwidth of the at least one memory device, and the in-progress counter.
8. The memory system of claim 7 wherein the second percentage is approximately 50%.
9. The memory system of claim 7 wherein the allocated bandwidth further carries at least one data track of the plurality of data tracks associated with at least one non-prestage request, and the second percentage is calculated based upon the in-progress counter, the allocated bandwidth of the at least one memory device currently consumed by the at least one data track associated with the at least one non-prestage request, and a burst rate.
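The bandwidth rule of claims 7 and 8 can be sketched the same way, treating the allocated bandwidth in units of one-track transfers per interval and each in-progress copy as consuming one such unit. These units and the subtraction are illustrative assumptions, not recited in the claims.

```python
def available_device_bandwidth(allocated_bandwidth, in_progress,
                               second_percentage=0.5):
    """Back-end bandwidth still usable for prestaging, in track-transfers.

    The 0.5 default reflects claim 8's "approximately 50%"; claim 9 would
    instead compute second_percentage from non-prestage traffic and a
    burst rate.
    """
    return second_percentage * allocated_bandwidth - in_progress
```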
10. The memory system of claim 1 wherein the at least one memory device is at least one disk drive.
11. The memory system of claim 1 wherein the at least one memory device is at least one tape drive.
12. The memory system of claim 1 wherein the at least one memory device is at least one solid state memory drive.
13. The memory system of claim 1 wherein the at least one memory device is at least one virtual memory drive.
14. A method of prestaging a plurality of data tracks in a memory system having at least one memory device, a cache memory connected to a host, at least one memory controller connected between the at least one memory device and the cache memory, and a resource controller connected between the host and the at least one memory controller, the method comprising:
generating a plurality of prestage requests by the resource controller, each prestage request of the plurality of prestage requests identifying a respective data track of a plurality of data tracks in the at least one memory device;
calculating an available bandwidth of the at least one memory device by the resource controller in response to generating the plurality of prestage requests;
calculating an available capacity of the cache memory by the resource controller in response to generating the plurality of prestage requests;
broadcasting a message from the resource controller to the at least one memory controller in response to calculating the available capacity of the cache memory and the available bandwidth of the at least one memory device, and having at least one unaccepted prestage request of the plurality of prestage requests, and having at least one resource selected from the group of resources consisting of the available bandwidth of the at least one memory device and the available capacity of the cache memory sufficient to copy one data track of the plurality of data tracks to the cache memory;
determining an available throughput for each memory controller of the at least one memory controller by each memory controller in response to receiving the message broadcast from the resource controller;
reading an accepted prestage request of the at least one unaccepted prestage request by each memory controller having the available throughput sufficient to copy the one data track of the plurality of data tracks into the cache memory; and
copying the respective data track of the plurality of data tracks into the cache memory by each memory controller having the available throughput sufficient to copy the one data track of the plurality of data tracks.

15. The method of claim 14 further comprising:
incrementing an in-progress counter by the at least one memory controller in response to reading each accepted prestage request;
incrementing a waiting-access counter and decrementing the in-progress counter by the at least one memory controller in response to each respective data track that is copied to the cache memory; and
decrementing the waiting-access counter by the resource controller in response to each respective data track buffered by the cache memory that is accessed by the host.
16. The method of claim 15 wherein the plurality of prestage requests, the in-progress counter, and the waiting-access counter are held in a shared memory.
17. The method of claim 15 wherein calculating the available capacity of the cache memory is based upon a first percentage of an allocated capacity of the cache memory, the in-progress counter, and the waiting-access counter.
18. The method of claim 17 wherein the first percentage is approximately 50%.
19. The method of claim 17 further comprising:
buffering in the cache memory at least one data track of the plurality of data tracks associated with at least one non-prestage request; and
calculating the first percentage based upon the allocated capacity of the cache memory currently consumed by the at least one data track associated with the at least one non-prestage request and a burst rate.
20. The method of claim 15 wherein calculating the available bandwidth of the at least one memory device is based upon a second percentage of an allocated bandwidth of the at least one memory device, and the in-progress counter.
21. The method of claim 20 wherein the second percentage is approximately 50%.
22. The method of claim 20 further comprising:
carrying within the allocated bandwidth of the at least one memory device at least one data track associated with at least one non-prestage request; and
calculating the second percentage based upon the in-progress counter, the allocated bandwidth of the at least one memory device currently consumed by the at least one data track associated with the at least one non-prestage request, and a burst rate.
Specification