Data processing apparatus and method for cache line replacement responsive to the operational state of memory
First Claim
1. Data processing apparatus comprising:
(i) a cache memory having a plurality of cache storage lines;
(ii) a plurality of main memory units operable to store data words to be cached within said cache memory; and
(iii) a cache victim select circuit for selecting a victim cache storage line into which one or more data words are to be transferred from one of said main memory units following a cache miss;
wherein (iv) said cache victim select circuit is responsive to an operational state of at least one of said main memory units when selecting said victim cache storage line.
Abstract
A data processing system 2 is described including a cache memory 8 and a plurality of DRAM banks 16, 18, 20, 22. A victim select circuit 32 within a cache controller 10 selects victim cache storage lines 28 upon a cache miss such that unlocked cache storage lines are selected in preference to locked cache storage lines, non-dirty cache storage lines are selected in preference to dirty cache storage lines, and cache storage lines requiring a write back to a non-busy DRAM bank are selected in preference to cache storage lines requiring a write back to a busy DRAM bank. A DRAM controller 24 is provided that continuously performs a background processing operation whereby dirty cache storage lines 28 within the cache memory 8 are written back to their respective DRAM banks 16, 18, 20, 22 when these are not busy performing other operations and when the cache storage line has a least recently used value below a certain threshold. A bus arbitration circuit 12 is provided that re-arbitrates bus master priorities in dependence upon determined latencies for respective memory access requests. As an example, if a high priority memory access request results in a cache miss while a lower priority memory access request results in a cache hit, then the lower priority memory access request will be re-arbitrated to be performed ahead of the normally higher priority memory access request and may finish before that higher priority memory access request starts to return data words to a data bus 14.
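The background write-back operation described in the abstract can be illustrated with a short sketch (Python, purely illustrative; the `CacheLine` attributes, the LRU threshold value, and the bank-busy flags are hypothetical names chosen for the example, not taken from the patent):

```python
# Illustrative sketch (not the patented hardware): a background task that
# writes dirty cache lines back to their DRAM bank when that bank is idle
# and the line's least-recently-used value falls below a threshold.

LRU_THRESHOLD = 2  # hypothetical: lines "colder" than this are candidates


class CacheLine:
    def __init__(self, tag, bank, dirty=False, lru=0, locked=False):
        self.tag = tag        # address tag of the cached data
        self.bank = bank      # index of the DRAM bank backing this line
        self.dirty = dirty    # line modified since it was fetched
        self.lru = lru        # lower value = less recently used
        self.locked = locked  # locked for preferential use by a requester


def background_writeback(lines, bank_busy, write_back):
    """Write back dirty, cold lines whose DRAM bank is not busy.

    bank_busy:  list of booleans, one per DRAM bank
    write_back: callback performing the transfer; the line is marked
                clean afterwards so it is cheap to evict later.
    """
    for line in lines:
        if line.dirty and line.lru < LRU_THRESHOLD and not bank_busy[line.bank]:
            write_back(line)
            line.dirty = False
```

Writing lines back in the background in this way means that when a cache miss later forces an eviction, the victim is more likely to already be clean, avoiding a write-back on the critical path.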
19 Claims
1. Data processing apparatus comprising:
(i) a cache memory having a plurality of cache storage lines;
(ii) a plurality of main memory units operable to store data words to be cached within said cache memory; and
(iii) a cache victim select circuit for selecting a victim cache storage line into which one or more data words are to be transferred from one of said main memory units following a cache miss;
wherein (iv) said cache victim select circuit is responsive to an operational state of at least one of said main memory units when selecting said victim cache storage line. (Dependent claims: 2-18)
(i) least recently used line that is not locked and is not dirty;
(ii) least recently used line that is not locked, is dirty and can be written back to a main memory unit that is not busy;
(iii) least recently used line that is not locked, is dirty and has to be written back to a main memory unit that is busy;
(iv) least recently used line that is locked and is not dirty;
(v) least recently used line that is locked, is dirty and can be written back to a main memory unit that is not busy;
(vi) least recently used line that is locked, is dirty and has to be written back to a main memory unit that is busy.
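The six-level preference order above amounts to ranking candidate lines by category and breaking ties by recency of use. A minimal sketch (Python, illustrative only; the `Line` attributes and the `bank_busy` flags are hypothetical names for the example, not from the patent):

```python
# Illustrative sketch of the six-level victim preference order:
# unlocked before locked, clean before dirty, dirty-to-idle-bank
# before dirty-to-busy-bank; ties broken by least recently used.

class Line:
    def __init__(self, tag, locked, dirty, bank, lru):
        self.tag = tag        # address tag
        self.locked = locked  # locked for preferential use
        self.dirty = dirty    # needs a write back before reuse
        self.bank = bank      # backing main memory unit (DRAM bank)
        self.lru = lru        # lower value = less recently used


def select_victim(lines, bank_busy):
    """Pick the victim line by the six-level preference order."""
    def category(line):
        if not line.dirty:
            return 0 if not line.locked else 3   # (i) / (iv)
        busy = bank_busy[line.bank]
        if not line.locked:
            return 1 if not busy else 2          # (ii) / (iii)
        return 4 if not busy else 5              # (v) / (vi)

    # Lowest (category, lru) wins: best category, then least recently used.
    return min(lines, key=lambda l: (category(l), l.lru))
```

Note how a dirty line backed by an idle bank outranks one backed by a busy bank: its write back can proceed immediately rather than stalling behind the busy bank's current operation.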
12. Data processing apparatus as claimed in claim 5, wherein one or more cache storage lines may be locked for preferential use by one of said data word requesting units and said cache victim select circuit selects as said victim cache storage line that cache storage line having properties placing it highest in a list of N properties, where 1≦N≦6, said list of N properties being formed of the N highest properties in the list:
(i) randomly selected from those cache storage lines that are not locked and are not dirty;
(ii) randomly selected from those cache storage lines that are not locked, are dirty and can be written back to a main memory unit that is not busy;
(iii) randomly selected from those cache storage lines that are not locked, are dirty and have to be written back to a main memory unit that is busy;
(iv) randomly selected from those cache storage lines that are locked and are not dirty;
(v) randomly selected from those cache storage lines that are locked, are dirty and can be written back to a main memory unit that is not busy;
(vi) randomly selected from those cache storage lines that are locked, are dirty and have to be written back to a main memory unit that is busy.
13. Data processing apparatus as claimed in claim 5, wherein one or more cache storage lines may be locked for preferential use by one of said data word requesting units and said cache victim select circuit selects as said victim cache storage line that cache storage line having properties placing it highest in a list of N properties, where 1≦N≦6, said list of N properties being formed of the N highest properties in the list:
(i) selected in sequence from those cache storage lines that are not locked and are not dirty;
(ii) selected in sequence from those cache storage lines that are not locked, are dirty and can be written back to a main memory unit that is not busy;
(iii) selected in sequence from those cache storage lines that are not locked, are dirty and have to be written back to a main memory unit that is busy;
(iv) selected in sequence from those cache storage lines that are locked and are not dirty;
(v) selected in sequence from those cache storage lines that are locked, are dirty and can be written back to a main memory unit that is not busy;
(vi) selected in sequence from those cache storage lines that are locked, are dirty and have to be written back to a main memory unit that is busy.
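Claims 12 and 13 differ from the least-recently-used variant only in how a line is picked within the best non-empty category. A sketch of the claim 13 style (Python, illustrative; the rotating pointer is one possible reading of "selected in sequence", which the claim does not fix, and all names are hypothetical):

```python
# Illustrative sketch: group candidate lines into the six preference
# categories, consider only the N highest categories, then pick from
# the best non-empty one in sequence via a rotating pointer.

class Line:
    def __init__(self, tag, locked, dirty, bank):
        self.tag, self.locked, self.dirty, self.bank = tag, locked, dirty, bank


def pick_victim(lines, bank_busy, n_categories, pointer):
    """Pick a victim from the highest non-empty category.

    n_categories: N in the claim, with 1 <= N <= 6
    pointer:      one-element list holding the rotating index
    """
    def category(line):
        if not line.dirty:
            return 0 if not line.locked else 3   # (i) / (iv)
        busy = bank_busy[line.bank]
        if not line.locked:
            return 1 if not busy else 2          # (ii) / (iii)
        return 4 if not busy else 5              # (v) / (vi)

    for cat in range(n_categories):
        group = [l for l in lines if category(l) == cat]
        if group:
            idx = pointer[0] % len(group)
            pointer[0] += 1
            return group[idx]
    return None  # no line falls in the N allowed categories
```

The claim 12 variant would simply replace the rotating pointer with a random choice within the group; restricting N below 6 means some categories (for example locked, dirty lines backed by a busy bank) are never evicted.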
14. Data processing apparatus as claimed in claim 1, wherein said plurality of main memory units are a plurality of banks of dynamic random access memory.
15. Data processing apparatus as claimed in claim 14, wherein a refresh of data values held within a bank of dynamic random access memory can take place concurrently with an access to a cached data value within said cache memory corresponding to a data value within said bank of dynamic random access memory.
16. Data processing apparatus as claimed in claim 1, wherein said plurality of main memory units are a plurality of flash memory units.
17. Data processing apparatus as claimed in claim 1, wherein said plurality of main memory units are provided together with said cache memory on an integrated circuit.
18. Data processing apparatus as claimed in claim 6, wherein said plurality of main memory units are provided together with said cache memory on an integrated circuit and said plurality of data word requesting units are also provided on said integrated circuit.
19. A data processing method comprising the steps of:
(i) storing data words within a plurality of cache storage lines of a cache memory;
(ii) storing in a plurality of main memory units said data words to be cached within said cache memory; and
(iii) selecting a victim cache storage line into which one or more data words are to be transferred from one of said main memory units following a cache miss;
wherein (iv) said selection is responsive to an operational state of at least one of said main memory units when selecting said victim cache storage line.
Specification