Hierarchical caching and analytics
3 Assignments
0 Petitions
Abstract
A system includes at least one end-node, at least one edge node, and an edge cloud video headend. The at least one end node generally implements a first stage of a multi-stage hierarchical analytics and caching technique. The at least one edge node generally implements a second stage of the multi-stage hierarchical analytics and caching technique. The edge cloud video headend generally implements a third stage of the multi-stage hierarchical analytics and caching technique.
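The abstract describes a cache hierarchy in which a request is served by the nearest stage that holds the content, falling back toward the headend on a miss. This is an editorial illustration, not part of the patent: a minimal sketch of such a three-stage lookup, assuming LRU eviction per stage and write-back into the faster stages on a hit (all names and capacities are assumptions).

```python
# Hypothetical sketch (not from the patent): a request falls through
# end-node -> edge node -> edge cloud video headend, and a full miss
# goes to the content origin. Each stage is a small LRU cache.
from collections import OrderedDict

class CacheStage:
    """One stage of the multi-stage hierarchical cache (LRU eviction)."""
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key in self.store:
            self.store.move_to_end(key)      # mark as recently used
            return self.store[key]
        return None

    def put(self, key, value):
        self.store[key] = value
        self.store.move_to_end(key)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used

class HierarchicalCache:
    """End-node -> edge node -> headend lookup with fill-in on a hit."""
    def __init__(self):
        self.stages = [CacheStage("end-node", 4),
                       CacheStage("edge-node", 16),
                       CacheStage("headend", 64)]

    def fetch(self, key, origin):
        for i, stage in enumerate(self.stages):
            value = stage.get(key)
            if value is not None:
                # Populate the faster stages that missed on the way down.
                for earlier in self.stages[:i]:
                    earlier.put(key, value)
                return value, stage.name
        value = origin(key)                  # full miss: fetch from origin
        for stage in self.stages:
            stage.put(key, value)
        return value, "origin"

cache = HierarchicalCache()
v, where = cache.fetch("seg-001", lambda k: f"video:{k}")
print(where)   # origin (first request misses every stage)
v, where = cache.fetch("seg-001", lambda k: f"video:{k}")
print(where)   # end-node (now cached at the first stage)
```

The fill-in step on a hit is one conventional policy choice; the patent does not specify how stages are populated.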
6 Citations
18 Claims
1. A system comprising:

- at least one end-node implementing a first stage of a multi-stage hierarchical analytics and caching technique, wherein said end-node is configured to communicate directly with one or more other end-nodes via one or more wireless links;
- at least one edge node implementing a second stage of the multi-stage hierarchical analytics and caching technique, wherein said edge node is configured to communicate directly with one or more end nodes and one or more other edge nodes via wireless links; and
- an edge cloud video headend implementing a third stage of the multi-stage hierarchical analytics and caching technique and configured to communicate with one or more of said end-nodes and one or more of said edge nodes, wherein said edge cloud video headend is configured to communicate with said one or more edge nodes using cable interconnections, optical interconnections, and wireless interconnections,
- the at least one end-node, the at least one edge node, and the edge cloud video headend each comprise an analytics and caching module/server that includes a flexible computing engine configured to implement low latency remote memory and storage access acceleration and predictive traffic shaping, and
- a low latency switching fabric performs predictive routing of packets using a destination ID assigned to each node and maps the destination ID to individual wavelengths of the optical interconnections.

Dependent claims: 2, 3, 4, 5, 6, 7, 8.
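Claim 1's last limitation routes packets by mapping each node's destination ID onto an individual wavelength of the optical interconnect. As an editorial illustration only (the patent does not disclose an algorithm, and all names and the wavelength grid here are assumptions), a minimal sketch of such an ID-to-wavelength binding:

```python
# Hypothetical sketch (not the patent's design): the fabric assigns each
# node a destination ID and binds that ID to one wavelength of the optical
# interconnect, so routing a packet reduces to selecting a wavelength.

# Example C-band wavelengths in nm (an assumed grid, for illustration).
WAVELENGTHS_NM = [1530.33, 1531.12, 1531.90, 1532.68]

class SwitchFabric:
    def __init__(self):
        self.next_id = 0
        self.node_ids = {}            # node name -> destination ID
        self.dest_to_wavelength = {}  # destination ID -> wavelength (nm)

    def register_node(self, node_name):
        """Assign a destination ID and bind it to a wavelength."""
        dest_id = self.next_id
        self.next_id += 1
        self.node_ids[node_name] = dest_id
        self.dest_to_wavelength[dest_id] = (
            WAVELENGTHS_NM[dest_id % len(WAVELENGTHS_NM)])
        return dest_id

    def route(self, packet):
        """Select the outgoing wavelength from the packet's destination ID."""
        return self.dest_to_wavelength[packet["dest_id"]]

fabric = SwitchFabric()
end_id = fabric.register_node("end-node-0")
edge_id = fabric.register_node("edge-node-0")
print(fabric.route({"dest_id": edge_id, "payload": b"frame"}))  # 1531.12
```

A round-robin binding is the simplest possible policy; the claim's "predictive routing" would presumably drive a more elaborate assignment.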
9. An apparatus comprising:

- a low latency switch fabric enabled to communicate via one or more wireless links and one or more optical links;
- a first flexible computing engine directly attached to the low latency switch fabric and configured to implement predictive analytics and caching;
- a second flexible computing engine directly attached to the low latency switch fabric and configured to implement low latency remote memory and storage access acceleration and predictive fabric traffic shaping, wherein the low latency switch fabric performs predictive routing of packets using a destination ID assigned to each node and maps the destination ID to individual wavelengths of the optical links;
- a synchronization engine directly attached to the low latency switch fabric and configured to synchronize time and node metadata; and
- a caching and storage module directly attached to the low latency switch fabric by one of said wireless links and one of said optical links.

Dependent claims: 10, 11, 12, 13, 14, 15, 16.
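Claim 9 recites "predictive fabric traffic shaping" without specifying a mechanism. One conventional building block for pacing packets onto a fabric is a token bucket; the sketch below is an editorial assumption, not the patent's method, with all rates and sizes illustrative:

```python
# Hypothetical sketch: a token-bucket shaper paces traffic onto the
# switch fabric. This is one standard shaping primitive, offered only
# as an assumption about what traffic shaping could look like here.
class TokenBucket:
    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s   # sustained rate (bytes/second)
        self.capacity = burst_bytes    # maximum burst size (bytes)
        self.tokens = burst_bytes      # bucket starts full
        self.last = 0.0                # time of the last check (seconds)

    def allow(self, packet_len, now):
        """True if a packet of packet_len bytes may be sent at time now."""
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_len:
            self.tokens -= packet_len
            return True
        return False

shaper = TokenBucket(rate_bytes_per_s=1000, burst_bytes=1500)
print(shaper.allow(1500, now=0.0))  # True: burst allowance covers it
print(shaper.allow(1500, now=0.1))  # False: only 100 bytes of tokens refilled
print(shaper.allow(1500, now=1.5))  # True: bucket has refilled
```

A "predictive" shaper would adjust the rate from forecast demand rather than hold it constant; that forecasting step is not modeled here.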
17. A method of real-time decision making comprising:

- implementing a first stage of a multi-stage hierarchical analytics and caching technique in at least one end-node, wherein said end-node is configured to communicate directly with one or more other end-nodes via one or more wireless links;
- implementing a second stage of the multi-stage hierarchical analytics and caching technique in an edge node comprising at least one of a camera node and a sensor node, wherein said edge node is configured to communicate with one or more other edge nodes via one or more wireless links and with one or more end-nodes via one or more wireless links; and
- implementing a third stage of the multi-stage hierarchical analytics and caching technique in an edge cloud video headend, wherein said edge cloud video headend is configured to communicate with one or more of said end-nodes and one or more of said edge nodes, and said edge cloud video headend is configured to communicate with said one or more edge nodes using cable interconnections, optical interconnections, and wireless interconnections,
- the at least one end-node, the at least one edge node, and the edge cloud video headend each comprise an analytics and caching module/server that includes a flexible computing engine configured to implement low latency remote memory and storage access acceleration and predictive traffic shaping, and
- a low latency switching fabric performs predictive routing of packets using a destination ID assigned to each node and maps the destination ID to individual wavelengths of the optical interconnections.

Dependent claims: 18.
Specification