Online background predictors and prefetchers for locality management
Abstract
Online prediction techniques based on data compression principles are employed to make predictions in restricted memory environments. Predictors have data structures in the form of trees that are paged and maintained in a cache on a least recently used replacement basis. A fast-sequence-of-events strategy handles rapid requests by incrementing the event counts at the current node of the predictor.
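The parse-tree predictor described in the abstract and claims is built in the style of an LZ78/Ziv-Lempel data compression model: each node keeps per-page transition counts, the tree grows by one leaf per novel phrase, and the current node resets to the root on reaching a leaf. A minimal Python sketch under those assumptions (class and function names are hypothetical, not from the patent):

```python
# Hypothetical sketch of the LZ78-style parse-tree predictor: each node keeps
# counts of the page requests seen at that context, and the current node
# resets to the root when a novel request extends the tree by one leaf.

class Node:
    def __init__(self):
        self.children = {}   # page id -> child Node (a labeled transition)
        self.counts = {}     # page id -> times this transition was traversed

def update(root, current, page):
    """Process one page request; return the new current node."""
    current.counts[page] = current.counts.get(page, 0) + 1
    if page in current.children:
        return current.children[page]   # descend along an existing transition
    current.children[page] = Node()     # grow the tree by one leaf
    return root                         # reset to the root after a novel phrase
```

Feeding the request sequence `a, b, a, b` through `update` yields root counts of 2 for `a` and 1 for `b`, with a depth-two context recorded under the `a` child.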
10 Claims
1. A method of maintaining a predictor or a prefetcher in cache comprising,
constructing the predictor, using a sequence of recent user page requests and a data compression algorithm, in the form of a tree having a root, transitions, nodes and leaves,
placing said root of the tree in a page and maintaining the page in cache,
placing every node of the tree in at most one cache page,
maintaining some of said node pages in cache in accordance with a replacement heuristic, and maintaining the remaining node pages in a secondary memory,
fetching a node page not in cache from said secondary memory when it is required, and
replacing a node page in cache with said fetched node page in accordance with said replacement heuristic,
wherein said replacing a node page in cache is in accordance with a least recently used heuristic.
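Claim 1's node-page management can be sketched as an LRU cache over tree pages, with evicted pages parked in secondary memory and fetched back on demand. A minimal Python sketch, assuming a plain dict stands in for secondary memory and the pinned root page is handled outside the cache (all names hypothetical):

```python
from collections import OrderedDict

# Hypothetical sketch of the claimed replacement heuristic: node pages live in
# an LRU cache of fixed capacity; the least recently used page is written to
# secondary memory on eviction and fetched back when required.

class NodePageCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()   # page id -> node page, in LRU order
        self.secondary = {}          # evicted node pages ("secondary memory")

    def _evict_if_full(self):
        if len(self.cache) > self.capacity:
            victim, data = self.cache.popitem(last=False)  # least recently used
            self.secondary[victim] = data

    def put(self, page_id, page):
        self.cache[page_id] = page
        self.cache.move_to_end(page_id)   # mark most recently used
        self._evict_if_full()

    def get(self, page_id):
        if page_id in self.cache:
            self.cache.move_to_end(page_id)
            return self.cache[page_id]
        page = self.secondary.pop(page_id)   # fetch from secondary memory
        self.cache[page_id] = page
        self._evict_if_full()                # replace an LRU page with it
        return page
```

With capacity 2, inserting pages `a`, `b`, `c` evicts `a` to secondary memory; a subsequent `get('a')` fetches it back and evicts `b` in turn.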
9. A method of prefetching pages into cache by predicting the next page request from a sequence of past page requests using a data compression algorithm, comprising,
constructing an online probabilistic model in the form of a parse tree using a data compression algorithm on the sequence of past page requests, said tree having a root, nodes and leaves connected via transitions,
constructing a prefetcher comprising a data structure in the form of a tree with a current node, internal nodes and transitions for past page requests between the nodes, wherein transition counts, corresponding to the probabilities established by the parse tree, out of each node to each succeeding node are kept at each node,
setting the current node of the prefetcher to be the root of the parse tree,
before each user page request, prefetching at most k pages, choosing the pages in decreasing order of the estimated probabilities as determined by the transition counts out of the current node,
upon receiving a user page request, updating the transition count for the page requested, resetting the current node to be the node led to by the transition corresponding to the page requested, and preparing to prefetch again, and
upon reaching a leaf of the parse tree, an end node of the data structure for which there are no transitions to a further node, immediately resetting the current node of the data structure tree to be the root of the parse tree, using the root for predicting for prefetching and updating the parse tree at both the leaf and root of the parse tree upon receiving the next page request,
said step of constructing a prefetcher comprising a data structure includes labeling the transitions between nodes with a user page identifier and a count corresponding to the number of times the transition is traversed, and
upon the occurrence of fast page requests by a user, maintaining the current node at its present location and incrementing the count at the current node.
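The per-request prefetch step of claim 9 ranks the candidate pages by their transition counts out of the current node and prefetches at most k of them in decreasing order of estimated probability. A minimal Python sketch of that selection step (function name hypothetical):

```python
# Hypothetical sketch of the claimed prefetch choice: given the transition
# counts out of the current node, return at most k pages in decreasing order
# of estimated probability (counts are proportional to probabilities, so
# ranking by raw count suffices).

def pages_to_prefetch(counts, k):
    """counts: page id -> transition count at the current node."""
    ranked = sorted(counts, key=counts.get, reverse=True)
    return ranked[:k]
```

For example, with counts `{'a': 5, 'b': 1, 'c': 3}` and `k = 2`, the pages chosen are `a` then `c`.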
10. A method of prefetching pages into cache by predicting the next page request from a sequence of past page requests using a data compression algorithm, comprising,
constructing an online probabilistic model in the form of a parse tree using a data compression algorithm on the sequence of past page requests, said tree having a root, nodes and leaves connected via transitions,
constructing a prefetcher comprising a data structure in the form of a tree with a current node, internal nodes and transitions for past page requests between the nodes, wherein transition counts, corresponding to the probabilities established by the parse tree, out of each node to each succeeding node are kept at each node,
setting the current node of the prefetcher to be the root of the parse tree,
before each user page request, prefetching at most k pages, choosing the pages in decreasing order of the estimated probabilities as determined by the transition counts out of the current node,
upon receiving a user page request, updating the transition count for the page requested, resetting the current node to be the node led to by the transition corresponding to the page requested, and preparing to prefetch again, and
upon reaching a leaf of the parse tree, an end node of the data structure for which there are no transitions to a further node, immediately resetting the current node of the data structure tree to be the root of the parse tree, using the root for predicting for prefetching and updating the parse tree at both the leaf and root of the parse tree upon receiving the next page request,
said step of constructing a prefetcher comprising a data structure includes labeling the transitions between nodes with a user page identifier and a count corresponding to the number of times the transition is traversed, and
comprising the step of paging each data structure node.
Specification