System and method for network caching
Abstract
A system and method for caching network resources in an intermediary server topologically located between a client and a server in a network. The intermediary server preferably caches at both a back-end location and a front-end location. The intermediary server includes a cache and methods for loading content into the cache according to rules specified by a site owner. Optionally, content can be proactively loaded into the cache to include content not yet requested. In another option, requests can be held at the cache when a prior request for similar content is pending.
276 Citations
29 Claims
13. A cache system comprising:

- a communication network;
- a plurality of network-connected intermediary servers each having an interface for receiving client requests for network resources, each intermediary server having a cache associated therewith;
- communication channels linking each intermediary server with a set of neighboring intermediary servers for exchanging cache contents amongst the intermediary servers.
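As a rough sketch of the topology claim 13 describes, the following models intermediary servers that each hold a local cache and push entries to linked neighbors. The class and method names (`CacheNode`, `exchange_with_neighbors`) are illustrative assumptions, not terms from the patent.

```python
# Illustrative sketch of claim 13: intermediary servers with local caches
# linked by communication channels, exchanging cache contents with neighbors.

class CacheNode:
    def __init__(self, name):
        self.name = name
        self.cache = {}          # resource URL -> content
        self.neighbors = []      # linked intermediary servers

    def link(self, other):
        # Symmetric communication channel between two intermediaries.
        self.neighbors.append(other)
        other.neighbors.append(self)

    def handle_request(self, url, origin_fetch):
        # Serve from the local cache when possible, else fetch and store.
        if url not in self.cache:
            self.cache[url] = origin_fetch(url)
        return self.cache[url]

    def exchange_with_neighbors(self):
        # Share cache contents so neighbors can serve without an origin trip.
        for neighbor in self.neighbors:
            for url, content in self.cache.items():
                neighbor.cache.setdefault(url, content)

a, b = CacheNode("a"), CacheNode("b")
a.link(b)
a.handle_request("/logo.png", lambda url: b"PNG")
a.exchange_with_neighbors()
```

After the exchange, node `b` can answer a request for `/logo.png` from its own cache even though only `a` fetched it from the origin.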
14. A method for caching network data comprising:

- communicating request-response traffic between two or more network-connected computing appliances;
- implementing a cache coupled to the request-response traffic; and
- selectively placing data from the request-response traffic into the cache at least partially based upon attributes of the client and/or server associated with the request-response traffic.

(Dependent claims: 15, 16, 17)
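One way to read claim 14's selective placement is as a policy function over client and server attributes. The specific attribute names (`tier`, `cacheable`) and the example policy below are invented for illustration; the claim does not specify them.

```python
# Hedged sketch of claim 14: whether a response enters the cache depends on
# attributes of the client and/or server associated with the traffic.

cache = {}

def should_cache(client_attrs, server_attrs):
    # Example policy (an assumption): cache only when the server marks the
    # resource cacheable and the client is not in a bypass tier.
    return server_attrs.get("cacheable", False) and client_attrs.get("tier") != "premium"

def place(url, response, client_attrs, server_attrs):
    # Selectively place the response into the cache per the policy.
    if should_cache(client_attrs, server_attrs):
        cache[url] = response
    return response

place("/a", "body-a", {"tier": "basic"}, {"cacheable": True})
place("/b", "body-b", {"tier": "premium"}, {"cacheable": True})
```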
18. A cache system comprising:

- a front-end server implementing a first cache and configured to receive client requests and generate responses to the client requests;
- a back-end server implementing a second cache and configured to receive requests from the front-end server and generate responses to the front-end server;
- an origin server having content stored thereon;
- a communication channel linking the front-end server and the back-end server; and
- a cache management mechanism in communication with the front-end server and the back-end server to selectively fill the first and second caches.

(Dependent claims: 19, 20, 21)
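The two-tier arrangement of claim 18 can be sketched as a front-end cache consulted first, a back-end cache behind it, and an origin store behind both, with a miss filling both tiers. The class name and promotion behavior are assumptions for illustration only.

```python
# Sketch of claim 18: front-end (first) cache, back-end (second) cache, and
# an origin server, with cache management filling both tiers on a miss.

class TwoTierCache:
    def __init__(self, origin):
        self.front = {}   # first cache, closest to clients
        self.back = {}    # second cache, between front end and origin
        self.origin = origin

    def get(self, url):
        if url in self.front:
            return self.front[url], "front"
        if url in self.back:
            # Promote to the front tier for subsequent client requests.
            self.front[url] = self.back[url]
            return self.front[url], "back"
        content = self.origin[url]
        # Cache management: selectively fill both tiers on an origin fetch.
        self.back[url] = content
        self.front[url] = content
        return content, "origin"

tiers = TwoTierCache(origin={"/page": "<html>"})
first = tiers.get("/page")    # miss: fetched from origin, both tiers filled
second = tiers.get("/page")   # hit: served from the front-end cache
```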
22. A system for caching network resources comprising:

- a plurality of intermediary servers configured to receive client requests and retrieve request-specified network resources;
- a cache implemented within each of the intermediary servers and configured to store selected network resources;
- a resolver mechanism for supplying a network address of the intermediary server to the client applications, wherein the resolver mechanism dynamically selects a particular intermediary server from amongst the plurality of intermediary servers based at least in part on the content of each intermediary server's cache.

(Dependent claims: 23)
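Claim 22's resolver can be sketched as a function that, given a requested resource, returns the address of an intermediary whose cache already holds it. The fallback rule (route to the server with the smallest cache as a stand-in for load) is an assumption; the claim only requires that selection depend at least in part on cache contents.

```python
# Sketch of claim 22: a resolver that dynamically selects an intermediary
# server based at least in part on what each server's cache contains.

def resolve(servers, url):
    # servers: mapping of network address -> set of cached resource URLs.
    holders = [addr for addr, cached in servers.items() if url in cached]
    if holders:
        return holders[0]
    # Fallback (assumed tie-break): the server with the fewest cached entries.
    return min(servers, key=lambda addr: len(servers[addr]))

servers = {
    "10.0.0.1": {"/a", "/b"},
    "10.0.0.2": {"/c"},
}
```

A client asking for `/c` would be directed to `10.0.0.2`, since only that server's cache holds the resource.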
24. A cache system comprising:

- a first front-end server implementing a first cache and configured to receive client requests and generate responses to the client requests;
- a second front-end server implementing a second cache and configured to receive client requests and generate responses to the client requests;
- an origin server having content stored thereon;
- a communication channel linking the first front-end server and the second front-end server; and
- a cache management mechanism in communication with the first and second front-end servers to selectively fill the second cache in response to a client request received by the first front-end server.

(Dependent claims: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 25)
25. The cache system of claim 24 wherein the cache management mechanism selectively updates the second cache based upon anticipation that subsequent client requests will be directed to the second front-end server.
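Claims 24 and 25 describe peer front-end servers where a request handled by one can trigger a fill of the other's cache, anticipating that later requests will land on the peer. A minimal sketch, with all names (`FrontEnd`, `CacheManager`, `anticipate_peer`) invented for illustration:

```python
# Sketch of claims 24-25: a cache manager fills a second front-end's cache in
# response to a client request received by the first front-end server.

class FrontEnd:
    def __init__(self):
        self.cache = {}

class CacheManager:
    def __init__(self, first, second, origin):
        self.first, self.second, self.origin = first, second, origin

    def handle(self, url, anticipate_peer=True):
        # Serve from the first front-end's cache, fetching from origin on a miss.
        content = self.first.cache.get(url) or self.origin[url]
        self.first.cache[url] = content
        if anticipate_peer:
            # Pre-fill the peer, anticipating subsequent requests will go there.
            self.second.cache[url] = content
        return content

f1, f2 = FrontEnd(), FrontEnd()
mgr = CacheManager(f1, f2, origin={"/video": "bytes"})
mgr.handle("/video")
```

After the single request to the first front end, the second front end can serve `/video` without contacting the origin.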
27. A method of speculatively caching Internet content comprising:

- receiving a current request for specified content;
- obtaining the specified content in response to the current request; and
- speculatively caching data in addition to the specified content.

(Dependent claims: 28, 29)
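One common reading of claim 27's speculative caching is link-based prefetch: on a request for one page, also cache pages it links to before any client asks for them. The page map and helper name below are assumptions for illustration; the claim does not limit what the additional data is.

```python
# Sketch of claim 27: obtain the specified content, then speculatively cache
# data in addition to it (here, the pages the requested page links to).

PAGES = {
    "/index": ("index body", ["/news", "/about"]),
    "/news": ("news body", []),
    "/about": ("about body", []),
}

cache = {}

def fetch_with_speculation(url):
    body, links = PAGES[url]
    cache[url] = body
    # Speculatively cache linked pages not yet requested by any client.
    for link in links:
        if link not in cache:
            cache[link] = PAGES[link][0]
    return body

fetch_with_speculation("/index")
```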
Specification