Address pipelined stack caching method
Abstract
The present invention uses a stack management unit including a stack cache to accelerate data retrieval from a stack and data storage into the stack. In one embodiment, the stack management unit includes a stack cache, a dribble manager unit, and a stack control unit. The dribble manager unit maintains a cached stack portion, typically a top portion of the stack, in the stack cache. The stack cache includes a stack cache memory circuit, one or more read ports, and one or more write ports. The stack management unit also includes an address pipeline, used by the spill control unit and the fill control unit to transfer multiple data words, which improves the throughput of spill and fill operations. When new data words are written to the top memory location of the stack, the optop pointer is incremented. When data words are read off the stack, the optop pointer is decremented. During normal operations, the dribble manager unit detects spill conditions and fill conditions. If a spill condition occurs, the dribble manager unit spills a plurality of data words from the stack cache to the stack. If a fill condition occurs, the dribble manager unit fills a plurality of data words from the stack to the stack cache.
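The push/pop and dribble behavior described in the abstract can be modeled in a few lines of software. The sketch below is illustrative only: the class name, the cache size, the thresholds, and the burst length are all assumptions (the patent does not fix them), and the stack cache is approximated as a circular buffer indexed modulo its size.

```python
# Software model of a dribble-managed stack cache (names and sizes assumed).
CACHE_SIZE = 8        # stack cache memory locations
HIGH_THRESHOLD = 6    # spill condition: more cached words than this
LOW_THRESHOLD = 2     # fill condition: fewer cached words than this
BURST = 2             # data words moved per spill/fill burst

class StackCacheModel:
    def __init__(self):
        self.cache = [0] * CACHE_SIZE   # first memory unit (fast stack cache)
        self.backing = []               # second memory unit (slow backing stack)
        self.optop = 0                  # incremented on push, decremented on pop
        self.bottom = 0                 # tracks the oldest cached word

    def _used(self):
        return self.optop - self.bottom

    def push(self, word):
        self.cache[self.optop % CACHE_SIZE] = word
        self.optop += 1                            # new word written at the top
        if self._used() > HIGH_THRESHOLD:          # spill condition detected
            for _ in range(BURST):                 # spill a plurality of words
                self.backing.append(self.cache[self.bottom % CACHE_SIZE])
                self.bottom += 1

    def pop(self):
        self.optop -= 1                            # word read off the stack
        word = self.cache[self.optop % CACHE_SIZE]
        if self._used() < LOW_THRESHOLD and self.backing:  # fill condition
            for _ in range(min(BURST, len(self.backing))): # fill a plurality of words
                self.bottom -= 1
                self.cache[self.bottom % CACHE_SIZE] = self.backing.pop()
        return word
```

Pushing ten words into this model overflows the high threshold twice, so the four oldest words dribble out to the backing stack; popping everything returns the words in LIFO order and dribbles them back in.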
63 Citations
27 Claims
1. An address pipelining method for use in caching a portion of a stack in a stack cache having a plurality of memory locations, an optop pointer pointed at a top memory location of said stack cache, and a bottom pointer pointed at a bottom memory location of said stack cache, said method comprising:
writing a new data word for said stack at said top memory location of said stack cache, wherein said stack cache is in a first memory unit;
incrementing said optop pointer;
copying said bottom pointer as a first address to a first address register coupled to said stack and to said stack cache, wherein said stack is in a second memory unit different from said first memory unit; and
spilling a plurality of data words from said stack cache in said first memory unit to said stack in said second memory unit if a spill condition exists.

2. The method of claim 1 further comprising detecting said spill condition.

3. The method of claim 2, wherein said detecting said spill condition comprises:
calculating a number of used data words;
comparing said number of used data words with a stack cache high threshold; and
generating a spill signal indicative of whether a spill condition exists.
4. The method of claim 3, wherein said detecting a spill condition further comprises registering said spill signal.
5. The method of claim 2 wherein said detecting said spill condition comprises:
comparing said optop pointer to a high water mark.
6. The method of claim 5 further comprising:
incrementing said high water mark if said spill condition exists.
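The two claimed spill-detection variants (a used-word count against a high threshold in claims 2-3, and the optop pointer against a high water mark in claims 5-6) reduce to simple predicates. The sketch below assumes linear optop/bottom arithmetic; the function and parameter names are not from the patent:

```python
def spill_signal(optop, bottom, high_threshold):
    # Claims 2-3: calculate the number of used data words and compare
    # it with the stack cache high threshold (linear pointers assumed).
    used_words = optop - bottom
    return used_words > high_threshold          # spill signal

def spill_condition_watermark(optop, high_water_mark):
    # Claims 5-6: compare the optop pointer to a high water mark; the
    # mark itself is incremented elsewhere when a spill occurs.
    return optop > high_water_mark
```

The watermark variant trades the subtraction for a single comparison, at the cost of keeping the mark in step with the bottom pointer.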
7. The method of claim 1, further comprising copying said first address to a second register.
8. The method of claim 1, wherein said spilling a plurality of data words from said stack cache to said stack if a spill condition exists further comprises:
driving said first address to said stack and said stack cache;
decrementing said first address to generate a second address; and
storing said second address in said first address register.
9. The method of claim 8, wherein said spilling a plurality of data words from said stack cache to said stack if a spill condition exists further comprises:
transferring a data word of said plurality of data words at said first address of said stack cache to said stack; and
equating said bottom pointer to said second address.
10. The method of claim 9, wherein said spilling a plurality of data words from said stack cache to said stack if a spill condition exists further comprises driving said second address to said stack and said stack cache.
11. The method of claim 1, wherein said spilling a plurality of data words from said stack cache to said stack if a spill condition exists further comprises:
driving an address in said first address register to said stack and said stack cache;
decrementing said address in said first address register to produce a decremented address; and
storing said decremented address in said first address register.
12. The method of claim 11, wherein said spilling a plurality of data words from said stack cache to said stack if a spill condition exists further comprises:
transferring a data word of said plurality of data words at said address in said stack cache to said stack; and
equating said bottom pointer to said address.
13. The method of claim 12, wherein
said driving an address in said first address register to said stack and said stack cache;
said decrementing said address in said first address register to produce a decremented address;
said storing said decremented address in said first address register;
said transferring a data word at said address in said stack cache to said stack; and
said equating said bottom pointer to said address are repeated until said spill condition does not exist.
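Read literally, claims 11-13 describe a loop in which a single address register feeds both memories while the next (decremented) address is computed behind it. A minimal sketch, taking the claimed address arithmetic at face value and treating the stack cache as a circular buffer (all names and the dict-as-memory model are assumptions):

```python
def spill_burst(cache, stack_mem, addr_reg, n_words):
    """Spill n_words from the stack cache to the backing stack,
    following the step sequence of claims 11-13 literally."""
    cache_size = len(cache)
    bottom = addr_reg
    for _ in range(n_words):
        addr = addr_reg                             # drive address to stack and stack cache
        addr_reg = addr - 1                         # decrement to produce the next address
        stack_mem[addr] = cache[addr % cache_size]  # transfer the data word
        bottom = addr_reg                           # equate bottom pointer to the new address
    return addr_reg, bottom
```

Spilling three words from a cache seeded with its own indices, starting at address 5, writes backing-stack entries at addresses 5, 4, and 3 and leaves both the register and the bottom pointer at 2.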
14. The method of claim 1 wherein said second memory unit includes a data cache.
15. An address pipelining method for use in caching a portion of a stack in a stack cache having a plurality of memory locations, an optop pointer pointed at a top memory location of said stack cache, and a bottom pointer pointed at a bottom memory location of said stack cache, said method comprising:
reading a top data word from said stack in a first memory unit;
decrementing said optop pointer;
copying said bottom pointer as a first address to a first address register coupled to said stack and to said stack cache, wherein said stack cache is in a second memory unit different from said first memory unit; and
filling a plurality of data words from said stack in said first memory unit to said stack cache in said second memory unit if a fill condition exists.

16. The method of claim 15 further comprising determining if said fill condition exists.

17. The method of claim 16, wherein said determining if said fill condition exists comprises:
calculating a number of used data words;
comparing said number of used data words with a stack cache low threshold; and
generating a fill signal indicative of whether a fill condition exists.
18. The method of claim 17, wherein said detecting a fill condition further comprises registering said fill signal.
19. The method of claim 16 wherein said determining if said fill condition exists comprises:
comparing said optop pointer to a low water mark.
20. The method of claim 19 further comprising:
decrementing said low water mark if said fill condition exists.
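The fill-detection variants mirror the spill side: a used-word count against a low threshold (claims 16-17), or the optop pointer against a low water mark (claims 19-20). As with the spill sketch, linear pointer arithmetic and all names are assumptions:

```python
def fill_signal(optop, bottom, low_threshold):
    # Claims 16-17: calculate the number of used data words and compare
    # it with the stack cache low threshold (linear pointers assumed).
    used_words = optop - bottom
    return used_words < low_threshold           # fill signal

def fill_condition_watermark(optop, low_water_mark):
    # Claims 19-20: compare the optop pointer to a low water mark; the
    # mark itself is decremented elsewhere when a fill occurs.
    return optop < low_water_mark
```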
21. The method of claim 15, further comprising copying said first address to a second register.
22. The method of claim 21, wherein said filling a plurality of data words from said stack to said stack cache if a fill condition exists further comprises:
decrementing said first address to generate a second address;
storing said second address in said first address register;
driving said first address to said stack; and
driving said second address to said stack cache.
23. The method of claim 22, wherein said filling a plurality of data words from said stack to said stack cache if a fill condition exists further comprises:
transferring a data word at said first address in said stack to said stack cache at said second address; and
equating said bottom pointer to said second address.
24. The method of claim 15, wherein said filling a plurality of data words from said stack to said stack cache if a fill condition exists further comprises:
copying an address in said first address register to a second address register;
decrementing said address in said first address register to produce a decremented address;
storing said decremented address in said first address register;
driving said address in said first address register to said stack;
driving an address in said second address register to said stack cache;
decrementing said address in said first address register to produce a decremented address; and
storing said decremented address in said first address register.
25. The method of claim 24, wherein said filling a plurality of data words from said stack to said stack cache if a fill condition exists further comprises:
transferring a data word in said stack at said address in said first address register to said stack cache; and
equating said bottom pointer to said address in said second address register.
26. The method of claim 25, wherein
said copying an address in said first address register to a second address register;
said decrementing said address in said first address register to produce a decremented address;
said storing said decremented address in said first address register;
said driving said address in said first address register to said stack;
said driving an address in said second address register to said stack cache;
said decrementing said address in said first address register to produce a decremented address;
said storing said decremented address in said first address register;
said transferring a data word in said stack at said address in said first address register to said stack cache; and
said equating said bottom pointer to said address in said second address register are repeated until said fill condition does not exist.
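The single-register fill sequence of claims 22-23 pipelines two addresses per transfer: the register's current value drives the stack read while its decrement drives the stack cache write, and the decrement becomes the next stack address. A sketch taking that sequence at face value (names, the modulo cache indexing, and the dict-as-memory model are assumptions):

```python
def fill_burst(cache, stack_mem, addr_reg, n_words):
    """Fill n_words from the backing stack into the stack cache,
    following the step sequence of claims 22-23 literally."""
    cache_size = len(cache)
    bottom = addr_reg
    for _ in range(n_words):
        first = addr_reg                   # first address, driven to the stack
        second = first - 1                 # decremented second address, driven to the stack cache
        addr_reg = second                  # store the second address back in the register
        cache[second % cache_size] = stack_mem[first]  # transfer the data word
        bottom = second                    # equate the bottom pointer to the second address
    return addr_reg, bottom
```

Filling three words starting from stack address 10 reads addresses 10, 9, and 8, writes cache locations 1, 0, and 7, and leaves both the register and the bottom pointer at 7.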
27. The method of claim 15 wherein said first memory unit includes a data cache.
Specification