Low-latency high-throughput scalable data caching
First Claim
1. A system comprising:
- a data source comprising a processor communicatively coupled with a memory connected over a network to a plurality of load balancer servers including a first load balancer server and a second load balancer server, wherein each load balancer server of the plurality of load balancer servers has a respective data cache, including a first data cache of the first load balancer server and a second data cache of the second load balancer server; and
- a load balancer service and a data cache service executing on one or more processors on the first load balancer server to:
  - receive, by the load balancer service, a first request from a first client device over the network;
  - request, by the load balancer service, a first data entry associated with the first request from the data cache service;
  - retrieve, by the data cache service, the first data entry from the first data cache, wherein the first data cache stores a first plurality of data entries that is a first subset of a second plurality of data entries stored in the data source;
  - modify, by the load balancer service, the first request with the first data entry;
  - send, by the load balancer service, the modified first request to a plurality of receivers; and
  - reject, by the load balancer service, a second request from a second client device based on the data cache service failing to locate a second data entry associated with the second request in the first data cache and also failing to locate the second data entry in the data source.
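The request flow recited in claim 1 can be sketched in a few lines. This is a minimal illustration only, not the patented implementation; all names (`DataCacheService`, `LoadBalancerService`, `receivers`, the request dictionaries) are hypothetical:

```python
# Sketch of the claimed flow: look up an entry in the server-local cache,
# fall back to the data source, modify the request on a hit, fan the
# modified request out to all receivers, and reject on a double miss.

class DataCacheService:
    """Serves entries from a local cache that holds a subset of the data source."""

    def __init__(self, local_cache, data_source):
        self.local_cache = local_cache   # first plurality (subset)
        self.data_source = data_source   # second plurality (authoritative)

    def lookup(self, key):
        if key in self.local_cache:
            return self.local_cache[key]
        return self.data_source.get(key)  # None if missing in both


class LoadBalancerService:
    def __init__(self, cache_service, receivers):
        self.cache_service = cache_service
        self.receivers = receivers

    def handle(self, request):
        entry = self.cache_service.lookup(request["key"])
        if entry is None:
            return {"status": "rejected"}  # miss in cache AND data source
        request["entry"] = entry           # modify the request with the entry
        for receiver in self.receivers:    # send to the plurality of receivers
            receiver.append(request)
        return {"status": "accepted"}
```

A request whose key exists only in the data source is still accepted (the cache service falls back to the source); only a key absent from both stores is rejected.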
Abstract
Low-latency high-throughput scalable data caching is disclosed. For example, a data source is connected over a network to a load balancer server with a data cache. A load balancer service and a data cache service execute on processors on the load balancer server to receive, by the load balancer service, a request from a client device over the network. The load balancer service requests a data entry associated with the request from the data cache service. The data cache service retrieves the data entry from the data cache, which stores a first plurality of data entries that is a subset of a second plurality of data entries stored in the data source. The load balancer service modifies the request with the data entry. The load balancer service sends the modified request to a plurality of receivers.
29 Claims
1. A system comprising:
- a data source comprising a processor communicatively coupled with a memory connected over a network to a plurality of load balancer servers including a first load balancer server and a second load balancer server, wherein each load balancer server of the plurality of load balancer servers has a respective data cache, including a first data cache of the first load balancer server and a second data cache of the second load balancer server; and
- a load balancer service and a data cache service executing on one or more processors on the first load balancer server to:
  - receive, by the load balancer service, a first request from a first client device over the network;
  - request, by the load balancer service, a first data entry associated with the first request from the data cache service;
  - retrieve, by the data cache service, the first data entry from the first data cache, wherein the first data cache stores a first plurality of data entries that is a first subset of a second plurality of data entries stored in the data source;
  - modify, by the load balancer service, the first request with the first data entry;
  - send, by the load balancer service, the modified first request to a plurality of receivers; and
  - reject, by the load balancer service, a second request from a second client device based on the data cache service failing to locate a second data entry associated with the second request in the first data cache and also failing to locate the second data entry in the data source.

View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22)
23. A method comprising:
- receiving, by a load balancer service, a first request from a first client device over a network;
- requesting a first data entry associated with the first request from a first data cache, wherein the first data cache is hosted on a same server as the load balancer service;
- retrieving the first data entry from the first data cache, wherein the first data cache stores a first plurality of data entries that is a first subset of a second plurality of data entries stored in a data source;
- modifying the first request with the first data entry;
- sending the modified first request to a plurality of receivers; and
- rejecting, by the load balancer service, a second request from a second client device based on failing to locate a second data entry associated with the second request in the first data cache and also failing to locate the second data entry in the data source.

View Dependent Claims (24, 25, 26, 27, 28)
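The steps of claim 23 can also be expressed as a single function. This is a rough functional sketch under the same assumptions as above (hypothetical names; the local cache holds a subset of the data source):

```python
def process_request(request, cache, data_source, receivers):
    """Sketch of claim 23: look up the entry in the server-local cache,
    fall back to the data source, modify and fan out on a hit,
    reject when the key is missing in both stores."""
    key = request["key"]
    entry = cache.get(key)
    if entry is None:
        entry = data_source.get(key)   # fallback to the authoritative store
    if entry is None:
        return False                   # reject: not in cache, not in source
    modified = dict(request, entry=entry)  # modify the request with the entry
    for receiver in receivers:             # send to the plurality of receivers
        receiver.append(modified)
    return True
```

The return value stands in for the accept/reject decision; a real service would presumably answer the second client device with an error response instead of a boolean.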
29. A computer-readable non-transitory storage medium storing executable instructions, which when executed by a computer system, cause the computer system to:
- receive, by a load balancer service, a first request from a client device over a network;
- request a first data entry associated with the first request from a first data cache, wherein the first data cache is hosted on a same server as the load balancer service;
- retrieve the first data entry from the first data cache, wherein the first data cache stores a first plurality of data entries that is a first subset of a second plurality of data entries stored in a data source;
- modify the first request with the first data entry;
- send the modified first request to a plurality of receivers; and
- reject, by the load balancer service, a second request from a second client device based on failing to locate a second data entry associated with the second request in the first data cache and also failing to locate the second data entry in the data source.
Specification