Cache usage for concurrent multiple streams
Abstract
In a system supporting concurrent multiple streams that pass through a cache between memory and the requesting devices, various techniques improve the efficient use of the cache. Some embodiments use adaptive pre-fetching of memory data using a dynamic table to determine the maximum number of pre-fetched cache lines permissible per stream. Other embodiments dynamically allocate the cache to the active streams. Still other embodiments use a programmable timer to deallocate inactive streams.
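The adaptive pre-fetching the abstract describes, a dynamic table mapping observed stream activity to a per-stream cap on pre-fetched cache lines, can be sketched in software. This is a minimal illustration, not the patented hardware: the function name, the request-rate metric, and the table contents are all assumptions made for the example.

```python
# Illustrative sketch of a dynamic table that maps stream activity
# (here modeled as requests per monitoring interval) to the maximum
# number of cache lines that may be pre-fetched for that stream.
# The thresholds and caps are invented for the example.

def max_prefetch_lines(requests_per_interval: int,
                       table=((0, 0), (1, 2), (4, 4), (16, 8))) -> int:
    """Return the pre-fetch cap for a stream given its request rate.

    `table` is an ascending (activity threshold, line cap) mapping;
    the highest threshold the stream meets determines its cap.
    """
    cap = 0
    for threshold, limit in table:
        if requests_per_interval >= threshold:
            cap = limit
    return cap
```

A busier stream earns a larger pre-fetch budget, so cache space tracks demand instead of being split evenly among streams.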
346 Citations · 35 Claims

Claims
1. An apparatus, comprising:

a stream monitor to determine stream activity;
a scheduler coupled with the stream monitor to determine a maximum number of cache lines per stream to pre-fetch based upon the stream activity;
a pre-fetch engine coupled with the scheduler to generate pre-fetch requests; and
cache coupled with the scheduler to store pre-fetched cache lines in response to the pre-fetch requests.
(Dependent claims: 2, 3, 4, 5, 6)
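Claim 1's pipeline (stream monitor, scheduler, pre-fetch engine, cache) can be modeled as a pre-fetch engine that issues requests only while the lines it holds for a stream stay under the scheduler's cap. A software stand-in under that assumption; the class and method names are illustrative, not from the patent:

```python
# Minimal model of the claim-1 coupling: the scheduler supplies a
# per-stream cap, and the pre-fetch engine refuses to exceed it.

class PrefetchEngine:
    def __init__(self, cap: int):
        self.cap = cap          # max pre-fetched lines per stream (from the scheduler)
        self.prefetched = {}    # stream id -> line addresses currently held

    def maybe_prefetch(self, stream, line_addr) -> bool:
        """Issue a pre-fetch for `stream` unless its cap is reached."""
        held = self.prefetched.setdefault(stream, [])
        if len(held) < self.cap:
            held.append(line_addr)   # cache stores the pre-fetched line
            return True
        return False                 # cap reached; request suppressed
```

For example, with a cap of 2 the third pre-fetch attempt for the same stream is suppressed until earlier lines are consumed or evicted.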
7. An apparatus, comprising:

a stream monitor to determine stream activity;
cache logic circuitry coupled with the stream monitor to generate pre-fetch requests and to limit a number of pre-fetched cache lines per stream based upon the stream activity; and
cache coupled with the cache logic circuitry to store pre-fetched cache lines in response to the pre-fetch requests.
(Dependent claims: 8, 9, 10, 11)
12. An apparatus, comprising:

a stream monitor to determine stream activity;
cache to store pre-fetched cache lines for active streams in a cache structure of the cache; and
cache logic circuitry coupled with the stream monitor and with the cache to determine the cache structure and to allocate the cache structure to the active streams based upon the stream activity.
(Dependent claims: 13, 14, 15, 16, 17)
18. A system, comprising:

a memory;
a plurality of devices to request data from the memory;
input-output circuitry coupled between the memory and the plurality of devices and comprising a stream monitor to determine stream activity;
cache to store pre-fetched cache lines for active streams; and
cache logic circuitry coupled with the stream monitor and with the cache to allocate portions of the cache to the active streams based upon the stream activity.
(Dependent claims: 19, 20, 21, 22)
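One way to read claim 18's "allocate portions of the cache to the active streams based upon the stream activity" is a proportional split of cache ways by activity. The patent does not mandate proportional sharing; this is one plausible policy, sketched with invented names:

```python
# Illustrative allocation policy: divide `total_ways` cache ways among
# active streams in proportion to their measured activity.

def allocate_cache_ways(total_ways: int, activity: dict) -> dict:
    """Return {stream: ways} for streams with nonzero activity."""
    active = {s: a for s, a in activity.items() if a > 0}
    if not active:
        return {}
    total = sum(active.values())
    return {s: total_ways * a // total for s, a in active.items()}
```

Inactive streams receive nothing, so their share of the cache is freed for the streams that are actually moving data.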
23. An apparatus, comprising:

a timer to determine a stream is inactive;
cache logic circuitry coupled with the timer to de-allocate a cache structure for the stream based upon the determination that the stream is inactive; and
cache coupled with said cache logic circuitry to store data in the cache structure.
(Dependent claims: 24, 25, 26, 27)
28. An apparatus, comprising:

a pre-fetch engine to pre-fetch data from a memory for multiple concurrent streams;
a timer coupled to the pre-fetch engine to determine a particular stream of the multiple concurrent streams is inactive;
cache logic circuitry coupled with the timer to de-allocate a cache structure for the particular stream based upon the determination that the particular stream is inactive; and
cache coupled with said cache logic circuitry to store the data in the cache structure.
(Dependent claims: 29, 30, 31, 32)
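The timer-driven deallocation in claims 23 and 28 amounts to reclaiming a stream's cache structure once the stream has been idle past a programmable timeout. A software stand-in for that behavior, with all names and the timestamp scheme assumed for the example:

```python
# Illustrative model of the programmable inactivity timer: streams idle
# for at least `timeout` time units lose their cache structures.

def deallocate_inactive(last_access: dict, now: float,
                        timeout: float, allocations: dict) -> dict:
    """Drop cache structures for streams idle >= `timeout`; return allocations."""
    for stream, t in list(last_access.items()):
        if now - t >= timeout:
            allocations.pop(stream, None)  # de-allocate the cache structure
            del last_access[stream]        # stream is no longer tracked
    return allocations
```

Running this check periodically keeps cache structures from lingering after a device stops issuing requests for its stream.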
33. An apparatus, comprising:

a stream monitor to determine stream activity;
a scheduler coupled with the stream monitor to determine a maximum number of cache lines per stream to pre-fetch based upon the stream activity;
cache coupled with the scheduler to store pre-fetched cache lines;
cache logic circuitry coupled with the stream monitor to generate pre-fetch requests, to limit a number of pre-fetched cache lines per stream based upon the maximum number, and to allocate the cache based upon the stream activity; and
a timer coupled with the cache logic circuitry to de-allocate a portion of the cache for a particular stream responsive to a determination that the particular stream is inactive.
(Dependent claims: 34, 35)
Specification