Engine near cache for reducing latency in a telecommunications environment

  • US 8,112,525 B2
  • Filed: 05/15/2007
  • Issued: 02/07/2012
  • Est. Priority Date: 05/16/2006
  • Status: Active Grant
First Claim

1. A computer implemented method for providing a near engine cache in a network environment, comprising:

  • maintaining an engine tier distributed over a cluster network, said engine tier including one or more engine nodes that process one or more messages;

  • maintaining a state tier distributed over the cluster network, said state tier including one or more replicas that store state data associated with the messages;

  • continuously receiving the messages by the engine nodes and processing the messages by reading the state data from the replicas to the engine nodes and modifying the state data in the state tier;

  • storing a local copy of a portion of the state data onto a near cache residing on the one or more engine nodes after processing the messages in the engine tier;

  • receiving a message by an engine node after storing the local copy in the near cache;

  • determining, by said engine node upon receiving the message, whether the local copy of the portion of the state data associated with the message in the near cache on said engine node is current with respect to the state tier; and

  • wherein if the local copy in the near cache is determined to be current, then the engine node locks the state data in the state tier and uses the local copy of the state data from the near cache to process the message, and wherein if the local copy in the near cache is determined to be stale, the engine node locks the state data in the state tier and retrieves the state data from the state tier, and wherein if the state data is retrieved from the state tier, then said state data is deserialized prior to using the state data to process the message at the engine node;

  • wherein retrieving the state data from the state tier further comprises:

      serializing the state data and transporting it to the engine tier; and

      deserializing the state data in the engine tier; and

  • wherein using the local copy from the near cache is performed without serializing and deserializing the local copy of the portion of the state data; and

  • adjusting the size of the near cache in order to achieve a balance between latency introduced by garbage collection and latency reduced by elimination of serializing and deserializing the state data.
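The claim's core protocol (check near-cache currency, lock the state tier in either path, deserialize only on a stale hit) can be sketched as follows. This is a minimal single-process illustration, not the patented implementation: `StateTier`, `EngineNode`, the version counters, and the use of `pickle` as the serialization format are all hypothetical stand-ins for the clustered state tier, engine nodes, and wire format described in the claim.

```python
import pickle
import threading

class StateTier:
    """Hypothetical in-process stand-in for the replicated state tier.
    State is held in serialized form, as it would be across the cluster."""
    def __init__(self):
        self._lock = threading.Lock()
        self._blobs = {}     # call_id -> serialized state bytes
        self._versions = {}  # call_id -> monotonically increasing version

    def lock(self):
        # Stand-in for locking the state data in the state tier.
        self._lock.acquire()

    def unlock(self):
        self._lock.release()

    def version(self, call_id):
        return self._versions.get(call_id, 0)

    def write(self, call_id, state):
        # Serializing the state data for storage in the replicas.
        self._blobs[call_id] = pickle.dumps(state)
        self._versions[call_id] = self._versions.get(call_id, 0) + 1

    def read(self, call_id):
        # Returns the serialized form; the engine must deserialize it.
        return self._blobs[call_id]

class EngineNode:
    """Engine node holding a near cache of live (deserialized) state objects."""
    def __init__(self, state_tier):
        self.tier = state_tier
        self.near_cache = {}  # call_id -> (version seen, live object)

    def process(self, call_id, update):
        self.tier.lock()  # per the claim, the tier is locked in either path
        try:
            cached = self.near_cache.get(call_id)
            if cached is not None and cached[0] == self.tier.version(call_id):
                # Near cache is current: use the live object directly,
                # skipping deserialization entirely.
                state = cached[1]
            else:
                # Near cache is stale or absent: retrieve the serialized
                # state from the state tier and deserialize it.
                state = pickle.loads(self.tier.read(call_id))
            state.update(update)             # process the message
            self.tier.write(call_id, state)  # modify state in the state tier
            self.near_cache[call_id] = (self.tier.version(call_id), state)
            return state
        finally:
            self.tier.unlock()

tier = StateTier()
tier.write("call-1", {"dialog": "early"})
engine = EngineNode(tier)
engine.process("call-1", {"dialog": "confirmed"})  # stale path: deserializes
engine.process("call-1", {"cseq": 2})              # current path: no deserialize
```

The final clause's sizing trade-off shows up here as the number of live objects retained in `near_cache`: a larger cache avoids more `pickle.loads` calls but keeps more long-lived objects on the heap, which is what lengthens garbage-collection pauses in a managed runtime.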

  • 2 Assignments