Cache with Multiple Access Pipelines
Abstract
Parallel pipelines are used to access a shared memory. The shared memory is accessed via a first pipeline by a processor to access cached data from the shared memory. The shared memory is accessed via a second pipeline by a memory access unit. A first set of tags is maintained for use by the first pipeline to control access to the shared memory, while a second set of tags is maintained for use by the second pipeline to access the shared memory. Arbitration for access to the shared memory for a transaction request in the first pipeline and for a transaction request in the second pipeline is performed after each pipeline has checked its respective set of tags.
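The tag check each pipeline performs before arbitration can be sketched as a simple lookup in that pipeline's private tag array. This is a minimal Python model; the address split widths and the direct-mapped organization are illustrative assumptions, not taken from the patent:

```python
# Assumed 32-bit address split: tag | set index | line offset.
SET_BITS = 6       # 64 sets (assumption)
OFFSET_BITS = 6    # 64-byte lines (assumption)

def split_address(addr):
    """Split an address into (tag, set index, line offset)."""
    offset = addr & ((1 << OFFSET_BITS) - 1)
    index = (addr >> OFFSET_BITS) & ((1 << SET_BITS) - 1)
    tag = addr >> (OFFSET_BITS + SET_BITS)
    return tag, index, offset

def tag_check(tag_array, addr):
    """Return True on a hit in this pipeline's private tag array."""
    tag, index, _ = split_address(addr)
    entry = tag_array[index]          # each pipeline consults its own copy
    return entry is not None and entry == tag
```

Because each pipeline owns its own tag array, both checks can proceed in the same cycle without contending for a shared tag port; only the data access that follows is arbitrated.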
15 Claims
1. A method of operating a cache having shared memory and tags, the method comprising:
accessing the cache via a first pipeline for use by a processor to access cached data from the shared memory;
accessing the cache via a second pipeline for use by a memory access unit to access the shared memory;
maintaining a first set of tags for use by the first pipeline to control access to the shared memory;
maintaining a second set of tags for use by the second pipeline to access the shared memory; and
arbitrating for access to the shared memory for a transaction request in the first pipeline and for a transaction request in the second pipeline after each pipeline has checked its respective set of tags, wherein a winner of the arbitration is granted access to the shared memory.
Dependent claims: 2, 3, 4, 5, 6.
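The method of claim 1 can be modeled behaviorally: both pipelines complete their own tag checks, and only then does an arbiter grant the shared memory to one winner. The function names and the fixed processor-first priority below are assumptions for illustration, not terms from the patent:

```python
def grant_shared_memory(cpu_addr, dma_addr, cpu_tags, dma_tags):
    """Arbitrate after both tag checks; return which pipeline wins.

    cpu_tags / dma_tags model the first and second sets of tags as
    simple address sets (an illustrative simplification).
    """
    cpu_ready = cpu_addr is not None and cpu_addr in cpu_tags  # first pipeline tag check
    dma_ready = dma_addr is not None and dma_addr in dma_tags  # second pipeline tag check
    if cpu_ready:
        return "cpu"   # assumed fixed priority: processor wins when both contend
    if dma_ready:
        return "dma"
    return None        # neither transaction is granted access this cycle
```

The key ordering property of the claim is preserved: both tag checks run before any arbitration decision, so the arbiter only ever sees requests that have already resolved their tag state.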
7. A method of operating a shared resource, the method comprising:
accessing the shared resource via a first pipeline for use by a processor to access the shared resource;
accessing the shared resource via a second pipeline for use by a memory access unit to access the shared resource;
maintaining a set of tags for use by the first pipeline to control access to the shared resource; and
arbitrating for access to the shared resource for a transaction request in the first pipeline and for a transaction request in the second pipeline after the first pipeline has checked its set of tags, wherein a winner of the arbitration is granted access to the shared resource.
Dependent claims: 8.
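Claim 7 differs from claim 1 in that only the first pipeline maintains tags; the second pipeline's request reaches the arbiter without any tag check. A sketch under that reading, with an assumed tie-break favoring the memory access unit (the patent does not specify the priority rule):

```python
def grant_shared_resource(cpu_addr, dma_addr, cpu_tags):
    """Arbitrate after the first pipeline's tag check only."""
    cpu_ready = cpu_addr is not None and cpu_addr in cpu_tags  # tag check, first pipeline only
    dma_ready = dma_addr is not None                           # second pipeline: no tags to check
    if cpu_ready and dma_ready:
        return "dma"   # assumed tie-break: memory access unit wins contention
    if cpu_ready:
        return "cpu"
    if dma_ready:
        return "dma"
    return None
```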
9. A digital system comprising:
a shared resource;
a first access pipeline coupled to the shared resource, the first pipeline being configured to receive a first transaction request from a first requester for access to the shared resource, wherein the first pipeline includes a stall stage that is configured to check for a stall condition;
a second access pipeline coupled to the shared resource, the second pipeline being configured to receive a second transaction request from a second requester for access to the shared resource; and
arbitration logic coupled to the first pipeline and to the second pipeline and configured to control access to the shared resource, wherein the arbitration logic is coupled to the first pipeline after the stall stage.
Dependent claims: 10, 11, 12, 13, 14, 15.
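Claim 9 places the arbitration logic after the first pipeline's stall stage, so a stalled transaction never contends for the shared resource. A behavioral sketch; the class shape and the first-pipeline-wins priority are assumptions for illustration:

```python
class Pipeline:
    """Toy access pipeline; only the first pipeline has a stall stage."""
    def __init__(self, has_stall_stage=False):
        self.has_stall_stage = has_stall_stage
        self.request = None    # transaction request presented this cycle
        self.stalled = False   # stall condition checked by the stall stage

    def offer(self):
        """Return the request that reaches the arbiter this cycle."""
        if self.has_stall_stage and self.stalled:
            return None        # stall stage holds the transaction back
        return self.request

def arbitrate(first, second):
    """Grant the shared resource; the arbiter sits after the stall stage."""
    if first.offer() is not None:
        return "first"         # assumed priority: first pipeline wins
    if second.offer() is not None:
        return "second"
    return None
```

Checking the stall condition before arbitration means the second pipeline is not blocked behind a stalled first-pipeline transaction, which is the structural point of coupling the arbiter after the stall stage.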
Specification