System and method for server-based predictive caching of back-end system data

  • US 7,716,332 B1
  • Filed: 06/20/2001
  • Issued: 05/11/2010
  • Est. Priority Date: 06/20/2001
  • Status: Expired due to Fees
First Claim

1. A system for facilitating communication between a user and a network of information items, comprising:

  • a remote data storage device for storing the information items, wherein the information items are stored in the form of pages, and wherein the pages contain a plurality of links to other information items;

    a multi-layer architecture comprising:

    a client device having a user interface program thereon, for allowing a user to interface with the network and request the information items; and

    a server device, in communication with the client device and in communication with the remote storage device, for handling information requests from multiple clients and for storing information retrieved from the data storage devices locally in a server cache memory, the server device including at least the following:

    a data collection module for collecting and storing, at the server device, successive actions of a single particular authenticated user on a user specific basis that distinguishes between specific users, the data collection module further configured to track sequences of navigational events of both the single particular authenticated user and at least one unauthenticated user; and

    a probability module in communication with the data collection module for calculating a first probability for the desirability of each of the links based on the action of the single particular user and for comparing each of the probabilities to a predetermined threshold value associated with business rules which factor a level of risk of retrieving data that may not be used and an associated hardware cost of cache memory to identify predicted links and for retrieving the predicted information items associated with the links from the remote data storage devices and enabling the storage of the predicted information items on both the client device layer and the server device layer of the multi-layer architecture in advance of the single particular user's request for the selected information items, the probability module including a dedicated rules engine for storing the business rules, the probability module further configured to:

    update the probabilities assigned to the links with each successive user activity;

    abort retrieving the predicted information items if the user requests an information item other than the predicted information items;

    continue retrieving the predicted information items from the remote data storage devices and storing the predicted information items in the server cache memory if the user requests the predicted information item; and

    download from the server device to the client device, the user requested information item to the client from the server cache memory;

    wherein the first probability is calculated based solely on the actions of the single particular user during a past navigation and not as a member of a larger set of users, wherein the probability calculation module is further configured to calculate a second probability, the second probability being based on selection data of at least one link from a plurality of users, the probability being used to determine a likelihood that another user will select the at least one link, such that in response to the probability meeting a predetermined threshold, data related to the link will be retrieved prior to the another user selecting the link.
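
The claim's core loop (collect a single authenticated user's successive link selections, compute a "first probability" for each link from that history alone, compare it to a business-rule threshold, prefetch predicted pages into the server cache, and abort or continue depending on what the user actually requests) can be illustrated with a short sketch. The Python below is only an assumption-laden illustration, not the patented implementation; all names (NavigationHistory, PrefetchEngine, fetch_page) are hypothetical.

    from collections import defaultdict

    class NavigationHistory:
        """Per-user record of successive link selections (roughly, the claimed data collection module)."""
        def __init__(self):
            # transition counts: current page -> {followed link: count}
            self.transitions = defaultdict(lambda: defaultdict(int))

        def record(self, current_page, followed_link):
            self.transitions[current_page][followed_link] += 1

        def link_probability(self, current_page, link):
            # "first probability": based solely on this user's own past navigation
            counts = self.transitions[current_page]
            total = sum(counts.values())
            return counts[link] / total if total else 0.0

    class PrefetchEngine:
        """Compares each link's probability to a business-rule threshold and prefetches
        predicted pages into a server-layer cache (loosely, the claimed probability module)."""
        def __init__(self, threshold, fetch_page):
            self.threshold = threshold    # set by business rules: risk of unused data vs. cache cost
            self.fetch_page = fetch_page  # callable that retrieves a page from the back-end store
            self.server_cache = {}        # server-layer cache of prefetched pages
            self.predicted = set()        # links currently predicted/prefetched

        def predict_and_prefetch(self, history, current_page, links):
            for link in links:
                if (history.link_probability(current_page, link) >= self.threshold
                        and link not in self.server_cache):
                    self.predicted.add(link)
                    self.server_cache[link] = self.fetch_page(link)

        def on_user_request(self, history, current_page, requested_link):
            history.record(current_page, requested_link)   # update probabilities with each action
            if requested_link not in self.predicted:
                self.predicted.clear()                     # abort outstanding predictions
            self.predicted.discard(requested_link)
            page = self.server_cache.pop(requested_link, None)
            return page if page is not None else self.fetch_page(requested_link)

    # example: one prior visit from "home" to "accounts" makes that link predicted
    history = NavigationHistory()
    engine = PrefetchEngine(threshold=0.5, fetch_page=lambda link: f"<page {link}>")
    history.record("home", "accounts")
    engine.predict_and_prefetch(history, "home", ["accounts", "help"])
    print(engine.on_user_request(history, "home", "accounts"))  # served from the server cache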
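
The final wherein clause also recites a cross-user "second probability": link-selection data pooled from a plurality of users, used to decide whether to retrieve a link's data before another user selects it. A minimal hypothetical sketch of that aggregate calculation, with invented names (AggregateLinkStats, should_prefetch_for_other_user) and the same threshold idea as above:

    from collections import Counter

    class AggregateLinkStats:
        """Pools link-selection data across many users (the basis of the 'second probability')."""
        def __init__(self):
            self.impressions = Counter()   # times a link was presented on a served page
            self.selections = Counter()    # times any user actually followed it

        def record_view(self, link):
            self.impressions[link] += 1

        def record_selection(self, link):
            self.selections[link] += 1

        def second_probability(self, link):
            shown = self.impressions[link]
            return self.selections[link] / shown if shown else 0.0

    def should_prefetch_for_other_user(stats, link, threshold):
        # retrieve the linked data ahead of another user's selection once the
        # pooled selection rate meets the business-rule threshold
        return stats.second_probability(link) >= threshold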
