Content delivery system providing accelerated content delivery
Abstract
Systems and methods are provided for network-connected content delivery systems that employ functional multi-processing to optimize bandwidth utilization and accelerate system performance. In one embodiment, the content delivery system may include a switch-based computing system. The system may further include an asymmetric multi-processor system configured in a staged pipeline manner.
50 Claims
1. A method of providing a network endpoint content delivery system through the use of a network connectable computing system, comprising:
providing a plurality of separate processor engines, the processor engines being assigned separate tasks in an asymmetrical multi-processor configuration;
providing a network interface connection to at least one of the processor engines to couple the content delivery system to a network;
providing a storage interface connection to the storage processor engine to couple the storage processor engine to a content storage system; and
accelerating content delivery through the network endpoint system by allowing the plurality of separate processor engines to operate in a staged pipeline manner thereby enabling parallel processing of separate system tasks. (Dependent claims: 2-11)
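The staged pipeline recited in claim 1 can be sketched in ordinary code, purely as an illustration (the engine names, queue wiring, and in-memory content store below are hypothetical, not from the patent): each "processor engine" is dedicated to one task type and connected to the next by a queue, so while one request is being served from storage the next is already being parsed by the network stage, giving parallel processing of separate system tasks.

```python
import queue
import threading

# Hypothetical sketch: two "engines", each assigned a distinct task,
# linked by queues into a staged pipeline. A None sentinel shuts each
# stage down in order.

def network_engine(inbound, to_storage):
    # Network stage: parse each raw request and pass it downstream.
    for raw in iter(inbound.get, None):
        to_storage.put({"path": raw.strip()})
    to_storage.put(None)  # propagate shutdown

def storage_engine(to_storage, to_delivery, content_store):
    # Storage stage: look up requested content in the stand-in store.
    for req in iter(to_storage.get, None):
        to_delivery.put(content_store.get(req["path"], b"404"))
    to_delivery.put(None)

def run_pipeline(requests, content_store):
    inbound, to_storage, to_delivery = (
        queue.Queue(), queue.Queue(), queue.Queue())
    stages = [
        threading.Thread(target=network_engine,
                         args=(inbound, to_storage)),
        threading.Thread(target=storage_engine,
                         args=(to_storage, to_delivery, content_store)),
    ]
    for t in stages:
        t.start()
    for r in requests:
        inbound.put(r)
    inbound.put(None)
    results = list(iter(to_delivery.get, None))
    for t in stages:
        t.join()
    return results

store = {"/index.html": b"<html>hello</html>"}
print(run_pipeline(["/index.html ", "/missing"], store))
# [b'<html>hello</html>', b'404']
```

The division of labor is asymmetric in the claim's sense: the stages run different code on different tasks, rather than identical workers sharing one task pool.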
12. A network endpoint content delivery system, comprising:
a plurality of separate processor engines, the processor engines being assigned separate tasks in an asymmetrical multi-processor configuration;
a network interface connection to at least one of the processor engines to couple the content delivery system to a network; and
a storage interface connection to the storage processor engine to couple the storage processor engine to a content storage system;
wherein the plurality of separate processor engines operate in a staged pipeline manner thereby enabling parallel processing of separate system tasks. (Dependent claims: 13-19)
20. A network connectable content delivery system, comprising:
a first processor engine;
a second processor engine, the second processor engine being assigned types of tasks different from the types of tasks assigned to the first processor engine;
a third processor engine, the third processor engine being assigned types of tasks that are different from the types of tasks assigned to the first and second processor engines; and
a distributed interconnection coupled to the first, second and third processor engines, wherein the tasks of the first, second and third processor engines are assigned such that the system operates in an asymmetrical multi-processor staged pipeline manner through the distributed interconnection to provide accelerated content delivery. (Dependent claims: 21-25)
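Claim 20's "distributed interconnection" can likewise be sketched as a small message fabric (again a hypothetical illustration; the class and engine names are not from the patent): each differently tasked engine attaches its own inbox, and engines exchange addressed messages through the fabric rather than through shared memory or a common bus.

```python
import queue

# Hypothetical sketch of a distributed interconnection: a switch-like
# router holding one inbox per attached processor engine.

class Interconnect:
    def __init__(self):
        self.inboxes = {}

    def attach(self, engine_name):
        # Each processor engine gets its own inbox on the fabric.
        self.inboxes[engine_name] = queue.Queue()
        return self.inboxes[engine_name]

    def send(self, dest, message):
        # Route a message to the named engine's inbox.
        self.inboxes[dest].put(message)

fabric = Interconnect()
net_in = fabric.attach("network")
app_in = fabric.attach("application")
sto_in = fabric.attach("storage")

# A request flows network -> application -> storage, one hop per stage.
fabric.send("application", {"from": "network", "want": "/a.bin"})
msg = app_in.get()
fabric.send("storage", {"from": "application", "want": msg["want"]})
print(sto_in.get())  # {'from': 'application', 'want': '/a.bin'}
```

Because every engine owns a private inbox, each stage of the pipeline can work on a different request at the same time, which is the parallelism the claim attributes to the interconnection.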
26. A method of operating a network connectable content delivery system, comprising:
providing a network interface processor engine comprising at least one network processor, the network interface processor engine configured to be coupled to a network;
providing a second processor engine;
providing a storage processor engine, the storage processor engine configured to be coupled to a storage system;
coupling the network interface processor engine, the second processor engine and the storage processor engine to each other through a distributed interconnection;
assigning the network interface, second, and storage processor engines separate types of tasks; and
accelerating content delivery from the storage processor engine to the network. (Dependent claims: 27-50)
Specification