
Parallel management of load servers, cache servers, and feed servers

  • US 8,296,375 B1
  • Filed: 02/03/2009
  • Issued: 10/23/2012
  • Est. Priority Date: 02/03/2009
  • Status: Active Grant
First Claim

1. A system of content delivery, comprising:

  • a plurality of content cache servers, each cache server storing a plurality of current content and sending requested content, where the current content is substantially the same at each of the cache servers;

  • a plurality of data loaders, each data loader in communication with each of the content cache servers, each data loader to retrieve an updated content from a content source and to write the updated content to each of the content cache servers, the data loaders to schedule retrieving the updated content from the content source based on a file entry comprising a refresh period for the updated content, the data loaders to arbitrate periodically to prevent duplication of file entries; and

  • a plurality of content feed servers, each feed server in communication with each of the content cache servers and each of the data loaders, each feed server to receive a request for content from an electronic device, to request the content from one of the content cache servers, and when the content cache server replies that the requested content is not available, to send the request for the content to one of the data loaders selected based on a round-robin selection process and to send a message to the electronic device to resend the request for content at a later time, wherein the message comprises the later time at which the electronic device is to resend the request for content.
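The claimed architecture can be illustrated with a minimal sketch (all class and function names here are hypothetical, not from the patent): feed servers check a cache server and, on a miss, hand the request to a data loader chosen round-robin while telling the client when to resend the request; loaders write fetched content to every cache server and periodically arbitrate so that each file entry is owned by exactly one loader.

```python
import itertools

class CacheServer:
    """One of a plurality of cache servers holding substantially the same content."""
    def __init__(self):
        self.store = {}

    def get(self, key):
        # Returns None when the requested content is not available.
        return self.store.get(key)

    def put(self, key, value):
        self.store[key] = value

class DataLoader:
    """Retrieves updated content from the source and writes it to every cache server."""
    def __init__(self, caches, source):
        self.caches = caches
        self.source = source
        # File entries: content key -> refresh period (seconds), used to
        # schedule when the content should be re-fetched from the source.
        self.entries = {}

    def load(self, key, refresh_period=60):
        self.entries[key] = refresh_period
        value = self.source[key]
        for cache in self.caches:          # write-through to each cache server
            cache.put(key, value)

def arbitrate(loaders):
    """Periodic arbitration: leave each file entry owned by exactly one loader."""
    seen = set()
    for loader in loaders:
        for key in list(loader.entries):
            if key in seen:
                del loader.entries[key]    # duplicate entry, drop it here
            else:
                seen.add(key)

class FeedServer:
    """Receives device requests; on a cache miss, delegates to a data loader
    (round-robin) and tells the device when to resend its request."""
    def __init__(self, caches, loaders, retry_after=5):
        # The claim does not specify how a cache server is chosen; cycling
        # through them is an assumption made for this sketch.
        self.caches = itertools.cycle(caches)
        self.loaders = itertools.cycle(loaders)   # round-robin loader selection
        self.retry_after = retry_after

    def handle(self, key):
        value = next(self.caches).get(key)
        if value is not None:
            return ("content", value)
        next(self.loaders).load(key)              # cache miss: schedule a load
        return ("retry", self.retry_after)        # message carries the later time
```

A first request for uncached content yields a `("retry", …)` message; once a loader has written the content through to the caches, a resent request is served directly.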

  • 6 Assignments