
Low-latency high-throughput scalable data caching

  • US 10,432,706 B2
  • Filed: 08/24/2018
  • Issued: 10/01/2019
  • Est. Priority Date: 12/22/2017
  • Status: Active Grant
First Claim

1. A system comprising:

  • a first data source comprising a processor and a first memory, the first data source connected to a first data cache over a network, wherein the first data source is located in a same geographical region as the first data cache, and wherein the first data source stores a plurality of data entries selected based on a first geolocation of the first data source;

  • a master data source connected to the first data source over the network;

  • a second memory storing the first data cache; and

  • a load balancer service and a data cache service executing on one or more processors communicatively coupled with the memory to:

    receive, by the load balancer service, a first request from a client device based on the client device being located in a second geolocation in close proximity to the first geolocation of the first data source;

    request, by the load balancer service, a first data entry associated with the first request from the data cache service, wherein the first data entry is available from the master data source;

    determine, by the data cache service, that the first data entry is unavailable in both the first data cache and the first data source; and

    responsive to determining that the first data entry is unavailable, reject, by the load balancer service, the first request, wherein the first data source retrieves the first data entry from the master data source after the first request is rejected.
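The flow recited in the claim — check the cache, then the regional data source, reject the request on a miss, and have the regional source back-fill the entry from the master afterward — can be sketched minimally in Python. All class, field, and method names below are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch of the claimed miss-then-backfill flow.
# Names (System, DataSource, handle_request) are illustrative only.
from dataclasses import dataclass, field


@dataclass
class DataSource:
    entries: dict = field(default_factory=dict)


@dataclass
class System:
    cache: dict = field(default_factory=dict)                  # first data cache
    regional: DataSource = field(default_factory=DataSource)   # first data source
    master: DataSource = field(default_factory=DataSource)     # master data source

    def handle_request(self, key):
        # Data cache service: look in the first data cache, then the
        # first (regional) data source.
        if key in self.cache:
            return ("hit", self.cache[key])
        if key in self.regional.entries:
            return ("hit", self.regional.entries[key])
        # Unavailable in both: the load balancer rejects the request,
        # and the regional source then retrieves the entry from the
        # master so a repeat request can be served locally.
        value = self.master.entries.get(key)
        if value is not None:
            self.regional.entries[key] = value
        return ("rejected", None)


system = System()
system.master.entries["k1"] = "v1"
first = system.handle_request("k1")    # miss everywhere regional -> rejected
second = system.handle_request("k1")   # back-filled entry now served locally
```

Note the design choice the claim makes explicit: rather than blocking the client while fetching from the (potentially distant) master, the first request is rejected outright and the retrieval happens afterward, trading one failed request for low latency on subsequent ones.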

  • 4 Assignments