Choreographed caching
Abstract
A routing device capable of performing application layer data caching is described. Application data caching at a routing device can alleviate the bottleneck that an application data host may experience during periods of high demand for application data. Requests for the application data can also be fulfilled faster by eliminating the network delays of communicating with the application data host. The techniques described can also be used to analyze the underlying application data in the network traffic transiting through a routing device.
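The cache-at-the-router idea in the abstract can be illustrated with a minimal Python sketch. The class name `RoutingDeviceCache` and the `fetch_from_host` callable are illustrative assumptions, not names from the patent; the point is only that a cache hit at the routing device avoids the round trip to the application data host.

```python
class RoutingDeviceCache:
    """Hypothetical sketch: serve application data from a local cache at a
    routing device, falling back to the application data host on a miss."""

    def __init__(self, fetch_from_host):
        self._fetch = fetch_from_host   # callable: resource identifier -> application data
        self._cache = {}                # network resource identifier -> cached data
        self.hits = 0
        self.misses = 0

    def get(self, resource_id):
        if resource_id in self._cache:
            self.hits += 1              # fulfilled locally: no delay contacting the host
            return self._cache[resource_id]
        self.misses += 1
        data = self._fetch(resource_id) # bottleneck path: round trip to the host
        self._cache[resource_id] = data
        return data
```

Repeated requests for the same network resource identifier are then served from the routing device itself, which is the bottleneck-relief effect the abstract describes.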
15 Claims
1. A computer implemented method comprising:
- monitoring, by a computing device, requests received on a plurality of routing devices for application data corresponding to a network resource identifier;
- aggregating the requests received over a caching time window to derive an aggregated set of past application data including the requests received on the plurality of routing devices;
- determining, using the set of past application data, one or more data request patterns associated with the plurality of routing devices and the application data;
- determining, using the data request patterns, a forecast of demand for future application data; and
- in response to determining the forecast of demand for the future application data, transmitting a communication including a preemptive notification to one or more of the plurality of routing devices to cache future application data based on the forecast of demand, wherein the communication is transmitted to the one or more of the plurality of routing devices before demand for the application data is expected.

Dependent claims: 2, 3, 4, 5.
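The claimed steps (monitor, aggregate over a caching time window, derive request patterns, forecast demand, preemptively notify) can be sketched in Python under naive assumptions. The function name, the rate-based persistence forecast, and the `notify` callback are all illustrative, not the patent's actual method.

```python
from collections import Counter

def forecast_and_notify(request_log, window_seconds, threshold, notify):
    """Hypothetical sketch of the claimed method.

    request_log: iterable of (device_id, resource_id, timestamp) tuples
    observed across a plurality of routing devices within one caching
    time window. notify(device_id, resource_id) models the preemptive
    notification sent before demand for the resource is expected.
    """
    # Aggregate the requests received on the plurality of routing devices.
    per_resource = Counter(resource_id for _, resource_id, _ in request_log)
    devices = {device_id for device_id, _, _ in request_log}

    notified = []
    for resource_id, count in per_resource.items():
        # Data request pattern: observed request rate over the window.
        rate = count / window_seconds
        # Naive forecast: assume the next window sees a similar rate.
        forecast = rate * window_seconds
        if forecast >= threshold:
            # Preemptive notification to cache before demand materializes.
            for device_id in sorted(devices):
                notify(device_id, resource_id)
                notified.append((device_id, resource_id))
    return notified
```

A real implementation would use a stronger forecaster (e.g., seasonality-aware time-series models) and target only the routing devices whose patterns predict demand, but the control flow above mirrors the claim's sequence of limitations.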
6. A system, comprising:
- one or more data processors; and
- a non-transitory computer readable storage medium containing instructions which when executed on the one or more data processors, cause the one or more data processors to perform operations including:
- monitoring, by a computing device, requests received on a plurality of routing devices for application data corresponding to a network resource identifier;
- aggregating the requests received over a caching time window to derive an aggregated set of past application data including the requests received on the plurality of routing devices;
- determining, using the set of past application data, one or more data request patterns associated with the plurality of routing devices and the application data;
- determining, using the set of data request patterns, a forecast of demand for future application data; and
- in response to determining the forecast of demand for the future application data, transmitting a communication including a preemptive notification to one or more of the plurality of routing devices to cache future application data based on the forecast of demand, wherein the communication is transmitted to the one or more of the plurality of routing devices before demand for the application data is expected.

Dependent claims: 7, 8, 9, 10.
11. A computer-program product tangibly embodied in a non-transitory machine-readable storage medium, including instructions configured to cause a data processing apparatus to perform operations comprising:
- monitoring, by a computing device, requests received on a plurality of routing devices for application data corresponding to a network resource identifier;
- aggregating the requests received over a caching time window to derive an aggregated set of past application data including the requests received on the plurality of routing devices;
- determining, using the set of past application data, one or more data request patterns associated with the plurality of routing devices and the application data;
- determining, using the data request patterns, a forecast of demand for future application data; and
- in response to determining the forecast of demand for the future application data, transmitting a communication including a preemptive notification to one or more of the plurality of routing devices to cache future application data based on the forecast of demand, wherein the communication is transmitted to the one or more of the plurality of routing devices before demand for the application data is expected.

Dependent claims: 12, 13, 14, 15.