
Low-latency high-throughput scalable data caching

  • US 10,063,632 B1
  • Filed: 12/22/2017
  • Issued: 08/28/2018
  • Est. Priority Date: 12/22/2017
  • Status: Active Grant
First Claim

1. A system comprising:

  • a data source comprising a processor communicatively coupled with a memory connected over a network to a plurality of load balancer servers including a first load balancer server and a second load balancer server, wherein each load balancer server of the plurality of load balancer servers has a respective data cache, including a first data cache of the first load balancer server and a second data cache of the second load balancer server; and

    a load balancer service and a data cache service executing on one or more processors on the first load balancer server to:

    receive, by the load balancer service, a first request from a first client device over the network;

    request, by the load balancer service, a first data entry associated with the first request from the data cache service;

    retrieve, by the data cache service, the first data entry from the first data cache, wherein the first data cache stores a first plurality of data entries that is a first subset of a second plurality of data entries stored in the data source;

    modify, by the load balancer service, the first request with the first data entry;

    send, by the load balancer service, the modified first request to a plurality of receivers; and

    reject, by the load balancer service, a second request from a second client device based on the data cache service failing to locate a second data entry associated with the second request in the first data cache and also failing to locate the second data entry in the data source.
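The request flow recited in claim 1 — cache lookup, data-source fallback, request modification, fan-out, and rejection on a double miss — can be illustrated with a minimal sketch. All class and field names here are hypothetical, chosen for readability; this is not the patented implementation, only the logic the claim describes.

```python
class DataCacheService:
    """Per-load-balancer cache holding a subset of the data source's entries."""

    def __init__(self, cache, data_source):
        self.cache = cache              # e.g. the "first data cache" (subset)
        self.data_source = data_source  # authoritative store of all entries

    def lookup(self, key):
        # Try the local cache first, then fall back to the data source.
        if key in self.cache:
            return self.cache[key]
        return self.data_source.get(key)  # None if absent from both


class LoadBalancerService:
    """Sketch of the load balancer service's handling of one client request."""

    def __init__(self, cache_service, receivers):
        self.cache_service = cache_service
        self.receivers = receivers  # the "plurality of receivers"

    def handle(self, request):
        # Request the data entry associated with the client request.
        entry = self.cache_service.lookup(request["key"])
        if entry is None:
            # Entry found in neither the cache nor the data source: reject.
            return {"status": "rejected"}
        # Modify the request with the entry, then send to every receiver.
        modified = {**request, "entry": entry}
        for receiver in self.receivers:
            receiver.append(modified)
        return {"status": "sent", "request": modified}
```

A cache hit and a cache miss that resolves from the data source both result in a modified request being fanned out; only a miss in both stores is rejected, matching the final limitation of the claim.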

  • 4 Assignments