Network endpoint system with accelerated data path
Abstract
Systems and methods are provided for network connected computing systems that employ functional multi-processing to optimize bandwidth utilization and accelerate system performance. In one embodiment, the network connected computing system may include a switch-based computing system. The system may further include an asymmetric multi-processor system configured in a staged pipeline manner. The network connected computing system may be utilized in one embodiment as a network endpoint system that provides content delivery.
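The abstract's "asymmetric multi-processor system configured in a staged pipeline manner" can be illustrated with a rough, non-authoritative sketch. All names below (Stage, the per-stage functions, the sample request) are illustrative assumptions, not taken from the patent; queues stand in for the distributed interconnection between processor engines.

```python
# Hypothetical sketch: three functionally specialized stages connected
# by queues operate as a staged pipeline (names are illustrative only).
from queue import Queue

class Stage:
    """One processor engine in the staged pipeline."""
    def __init__(self, name, work):
        self.name = name
        self.work = work          # per-stage processing function
        self.inbox = Queue()      # this stage's end of the interconnection

    def step(self):
        """Pull one item from the interconnection and process it."""
        item = self.inbox.get()
        return self.work(item)

# Each stage is assigned a different type of task (functional
# multi-processing), mirroring the abstract's asymmetric split.
network = Stage("network", lambda req: {"parsed": req})
application = Stage("application", lambda msg: {**msg, "endpoint": True})
storage = Stage("storage", lambda msg: {**msg, "content": "<data>"})

# Push one request through the pipeline stages in order:
# network -> application -> storage.
network.inbox.put("GET /index.html")
application.inbox.put(network.step())
storage.inbox.put(application.step())
result = storage.step()
print(result["content"])
```

Because each stage only touches its own queue, the stages could run concurrently, which is the bandwidth-utilization point the abstract makes.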
142 Citations
62 Claims
1. A network content delivery system, comprising:
at least one storage processor;
at least one application processor performing endpoint functionality processing;
a system interface connection configured to be coupled to a network;
at least one network processor, the network processor coupled to the system interface connection to receive data from the network; and
an interconnection between the system processor and the network processor so that the network processor may analyze data provided from the network and process the data at least in part and then forward the data to the interconnection so that other processing may be performed on the data within the system, wherein the interconnection enables a data content delivery path between the storage processor and the network processor for providing data from a storage system to the network, the data content delivery path by-passing the application processor. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50)
14. A method of operating a content delivery system, the method comprising:
providing a network processor within the content delivery system, the network processor being configured to be coupled to an interface which couples the content delivery system to a network;
processing data passing through the interface with the network processor; and
forwarding data from the network processor to a distributed interconnection;
coupling a storage processor to the distributed interconnection, the storage processor configured to also be coupled to a storage system;
coupling an application processor to the distributed interconnection, the application processor performing at least some processing upon incoming network data; and
providing outgoing content through an outgoing content data path from the storage processor to the network processor, the outgoing content data path by-passing the application processor.
28. A network connectable content delivery system, comprising:
a first processor engine;
a second processor engine, the second processor engine being assigned types of tasks different from the types of tasks assigned to the first processor engine;
a storage processor engine, the storage processor engine being assigned types of tasks that are different from the types of tasks assigned to the first and second processor engines, the storage processor engine being configured to be coupled to a content storage system;
a distributed interconnection coupled to the first, second and storage processor engines, the tasks of the first, second and storage processor engines being assigned such that the system operates in a staged pipeline manner through the distributed interconnection;
a content request path, the content request path including the first, second and storage processor engines; and
a content delivery path, the content delivery path by-passing the second processor engine.
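Claim 28 distinguishes a content request path (through all three engines) from a content delivery path (bypassing the second engine). The following is a minimal sketch of that split under assumed roles; the function names, the engine responsibilities, and the in-memory store are all hypothetical illustrations, not the patent's implementation.

```python
# Hypothetical sketch of claim 28's two paths (illustrative names only).

def first_engine(request):
    """e.g. network-side processing: parse the inbound request."""
    return {"path": request.split()[1]}

def second_engine(parsed):
    """e.g. application-level processing of the endpoint request."""
    return {**parsed, "authorized": True}

def storage_engine(request_meta, store):
    """Fetch the requested content from the content storage system."""
    return store[request_meta["path"]]

def deliver(request, store):
    # Content request path: first -> second -> storage engine.
    meta = second_engine(first_engine(request))
    content = storage_engine(meta, store)
    # Content delivery path: the storage engine hands content back
    # toward the network directly, bypassing the second engine.
    return content  # no further application-level processing

store = {"/index.html": b"<html>hello</html>"}
print(deliver("GET /index.html", store))
```

The design point is that bulk content never crosses the application engine on the way out, so that engine's cycles are spent only on request handling.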
51. A method of providing content from a content delivery system to a network, comprising:
providing a first processor engine;
providing a second processor engine;
providing a storage processor engine, the storage processor engine being configured to be coupled to a content storage system;
providing a distributed interconnection coupled to the first, second and storage processor engines, the tasks of the first, second and storage processor engines being assigned such that the system operates in a staged pipeline manner through the distributed interconnection;
routing content requests through a request path that includes the first, second and storage processor engines; and
routing content to be delivered to the network through a delivery path that by-passes the second processor engine. - View Dependent Claims (52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62)
Specification