
Synchronizing clocks in an asynchronous distributed system

  • US 8,073,976 B2
  • Filed: 03/25/2009
  • Issued: 12/06/2011
  • Est. Priority Date: 03/27/2008
  • Status: Active Grant
First Claim

1. At an observing computer system in an asynchronous distributed system including a plurality of computer systems, the observing computer system including a processor and system memory, the asynchronous distributed system having a clock quantum constraint, Q, and a drift rate constraint, D, the clock quantum constraint indicating the maximum difference between clock quantizations among the computer systems of the asynchronous distributed system, the drift rate constraint indicating the maximum clock drift within a specified period of time for each computer system of the asynchronous distributed system, a method for determining the variance between what the observing computer system purports the time at an observed computer system to be and the actual time at the observed computer system, the method comprising:

  • an act of participating in one or more message exchanges with the observed computer system, the message exchanges including:

    an act of recording the send time, X(t1), of the clock, X(t), at the observing computer system when a message is sent;

    an act of sending a message to the observed computer system, the message including the recorded send time X(t1);

    an act of subsequently receiving a correlatable message responsive to the message from the observed computer system, the correlatable message containing a time Y(t2) from the clock, Y(t), of the observed computer system;

    an act of recording the received time, X(t3), of the clock, X(t), at the observing computer system when the correlatable message is received; and

    an act of recording the time, Y(t2), from the observed computer system;

    an act of calculating a lower bound for the time at the observed computer system relative to the time of the observing computer system based on the equation:

    Y(t) >= X(t) − (X(t3) − Y(t2) + 2Q) − 2D(X(t) − (X(t3) + X(t1))/2 + 2Q);

    an act of calculating an upper bound for the time at the observed computer system relative to the time of the observing computer system based on the equation:

    Y(t) <= X(t) + (Y(t2) − X(t1) + 2Q) + 2D(X(t) − (X(t3) + X(t1))/2 + 2Q);

    an act of calculating the difference between the upper bound and the lower bound; and

    an act of the processor calculating the maximum variance between what the observing computer system purports the time at the observed computer system to be and the actual time at the observed computer system by dividing the calculated difference by an averaging factor.
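The bound and variance calculations recited above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function and parameter names (observed_clock_bounds, averaging_factor, etc.) are assumptions, and the default averaging factor of 2 is an assumed choice since the claim leaves the factor unspecified.

```python
def observed_clock_bounds(X_t, X_t1, Y_t2, X_t3, Q, D):
    """Bounds on the observed clock Y(t), per the claim's two inequalities.

    X_t  -- current reading of the observing clock, X(t)
    X_t1 -- recorded send time of the message
    Y_t2 -- time reported back by the observed computer system
    X_t3 -- recorded receive time of the correlatable message
    Q    -- clock quantum constraint (max quantization difference)
    D    -- drift rate constraint (max drift per unit time)
    """
    # Shared drift term: 2D(X(t) - (X(t3) + X(t1))/2 + 2Q)
    drift = 2 * D * (X_t - (X_t3 + X_t1) / 2 + 2 * Q)
    lower = X_t - (X_t3 - Y_t2 + 2 * Q) - drift
    upper = X_t + (Y_t2 - X_t1 + 2 * Q) + drift
    return lower, upper

def max_variance(lower, upper, averaging_factor=2):
    # Difference between the bounds divided by an averaging factor,
    # as in the final calculating act of the claim.
    return (upper - lower) / averaging_factor
```

For example, with X(t) = 100, X(t1) = 90, Y(t2) = 95, X(t3) = 96, Q = 0.5, and D = 0.01, the drift term is 0.02 × (100 − 93 + 1) = 0.16, giving bounds of 97.84 and 106.16 and a maximum variance of 4.16 with an averaging factor of 2.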
