Data pipeline architecture for analytics processing stack
First Claim
1. A system comprising:
  a data stream processing pipeline comprising sequential multiple input processing stages, including:
    an ingestion stage comprising multiple ingestion processors configured to receive input streams from multiple different input sources;
    an integration stage comprising converter interface circuitry configured to:
      perform a classification on the input streams; and
      responsive to the classification, assign the input streams to one or more dynamic converters of the converter interface circuitry configured to output data from multiple different stream types into a stream data in a predefined interchange format, the one or more dynamic converters comprising:
        a structured data dynamic converter configured to handle input streams in a predetermined predictor data format;
        a natural language dynamic converter configured to handle natural language input streams; and
        a media dynamic converter configured to handle media input streams by:
          selecting among speech-to-text processing for audio streams and computer vision processing for video streams; and
          causing execution of selected processing for the media input streams, when present, to generate the output data; and
    a storage stage comprising a memory hierarchy, the memory hierarchy configured to store the stream data in the predefined interchange format; and
  an analytics processing stack coupled to the data stream processing pipeline, the analytics processing stack configured to:
    access the stream data via a hardware memory resource provided by storage layer circuitry of the analytics processing stack;
    process the stream data at processing engine layer circuitry of the analytics processing stack to determine whether to provide the stream data to analytics model logic of analysis layer circuitry of the analytics processing stack, rule logic of the analysis layer circuitry of the analytics processing stack, or both;
    when the stream data is passed to the analytics model logic:
      determine a model change to a model parameter for a predictive data model for the stream data; and
      pass a model change indicator of the model change to the storage layer circuitry for storage within the memory hierarchy;
    when the stream data is passed to the rule logic:
      determine a rule change for a rule governing response to content of the stream data; and
      pass a rule change indicator of the rule change to the storage layer circuitry for storage within the memory hierarchy;
    at insight processing layer circuitry above the analysis layer circuitry within the analytics processing stack:
      access the model change indicator, the rule change indicator, the stream data, or any combination thereof via the hardware memory resource provided by the storage layer circuitry;
      determine an insight adjustment responsive to the model change, the rule change, the stream data, or any combination thereof;
      generate an insight adjustment indicator responsive to the insight adjustment; and
      pass the insight adjustment indicator to the storage layer circuitry for storage within the memory hierarchy.
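The integration stage above classifies each input stream and dispatches it to one of three dynamic converters, which all emit a common interchange format. A minimal sketch of that routing, assuming the streams self-report their type; all names (Stream, classify, convert, the "stt"/"cv" stubs, and the dict standing in for the interchange format) are illustrative, not from the patent:

```python
from dataclasses import dataclass

# Hypothetical stream-type labels (not from the patent).
STRUCTURED, NATURAL_LANGUAGE, AUDIO, VIDEO = "structured", "text", "audio", "video"

@dataclass
class Stream:
    kind: str
    payload: str

def classify(stream: Stream) -> str:
    """Classification step: here the stream simply self-reports its type."""
    return stream.kind

def convert(stream: Stream) -> dict:
    """Assign the stream to a dynamic converter and emit a record in a
    single interchange format (a plain dict stands in for the patent's
    'predefined interchange format')."""
    kind = classify(stream)
    if kind == STRUCTURED:
        body = stream.payload            # structured data dynamic converter
    elif kind == NATURAL_LANGUAGE:
        body = stream.payload.lower()    # natural language converter (stub)
    elif kind == AUDIO:
        body = f"stt({stream.payload})"  # media converter -> speech-to-text
    elif kind == VIDEO:
        body = f"cv({stream.payload})"   # media converter -> computer vision
    else:
        raise ValueError(f"unknown stream type: {kind}")
    return {"type": kind, "data": body}
```

The media converter's branch on audio versus video mirrors the claim's "selecting among speech-to-text processing for audio streams and computer vision processing for video streams"; real converters would invoke actual recognition pipelines rather than string stubs.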
Abstract
A data pipeline architecture is integrated with an analytics processing stack. The data pipeline architecture may receive incoming data streams from multiple diverse endpoint systems. The data pipeline architecture may include converter interface circuitry with multiple dynamic converters configured to convert the diverse incoming data streams into one or more interchange formats for processing by the analytics processing stack. The analytics processing stack may include multiple layers, with insight processing layer circuitry above analysis layer circuitry. The analysis layer circuitry may control analytics models and rule application. The insight processing layer circuitry may monitor output from the analysis layer circuitry and generate insight adjustments responsive to rule changes and analytics model parameter changes produced at the analysis layer circuitry.
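The layering the abstract describes (storage layer shared across the stack, a processing engine that routes data to model logic and/or rule logic, and an insight layer that reads the resulting change indicators) can be sketched as follows. All class, key, and field names here are illustrative assumptions, not terminology from the patent:

```python
class MemoryHierarchy:
    """Storage layer: a shared store for stream data and change indicators."""
    def __init__(self):
        self.records = {}

    def put(self, key, value):
        self.records[key] = value

    def get(self, key):
        return self.records.get(key)

def processing_engine(record):
    """Decide whether the record goes to model logic, rule logic, or both
    (here keyed on which hypothetical fields the record carries)."""
    targets = []
    if "value" in record:
        targets.append("model")
    if "event" in record:
        targets.append("rule")
    return targets

def analysis_layer(record, store):
    """Produce model-change / rule-change indicators and store them."""
    for target in processing_engine(record):
        if target == "model":
            store.put("model_change", {"param": "threshold", "delta": record["value"]})
        if target == "rule":
            store.put("rule_change", {"rule": "alert", "on": record["event"]})

def insight_layer(store):
    """Insight layer: read the stored indicators via the shared storage
    layer and emit an insight-adjustment indicator back into storage."""
    adjustment = {
        "model_change": store.get("model_change"),
        "rule_change": store.get("rule_change"),
    }
    store.put("insight_adjustment", adjustment)
    return adjustment
```

Note that the layers never call each other directly: the analysis layer writes indicators into the memory hierarchy and the insight layer reads them back, matching the claims' pattern of passing every indicator "to the storage layer circuitry for storage within the memory hierarchy".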
Citations
20 Claims
1. A system comprising:
(Claim 1 is reproduced in full in the First Claim section above.)
Dependent claims: 2, 3, 4, 5, 6, 7, 8.
9. A method comprising:
  in a data stream processing pipeline comprising sequential multiple input processing stages:
    receiving, at an ingestion stage of the data stream processing pipeline, input streams from multiple different input sources, the ingestion stage comprising multiple ingestion processors;
    at an integration stage comprising converter interface circuitry:
      performing a classification on the input streams; and
      responsive to the classification, assigning the input streams to one or more dynamic converters configured to output data from multiple different stream types into a stream data in a predefined interchange format, the one or more dynamic converters comprising:
        a structured data dynamic converter configured to handle input streams in a predetermined predictor data format;
        a natural language dynamic converter configured to handle natural language input streams; and
        a media dynamic converter configured to handle media input streams by:
          selecting among speech-to-text processing for audio streams and computer vision processing for video streams; and
          causing execution of selected processing for the media input streams, when present, to generate the output data; and
    at a storage stage comprising a memory hierarchy, storing the stream data in the predefined interchange format; and
  in an analytics processing stack coupled to the data stream processing pipeline:
    accessing the stream data via a hardware memory resource provided by storage layer circuitry of the analytics processing stack;
    processing the stream data at processing engine layer circuitry of the analytics processing stack to determine whether to provide the stream data to analytics model logic of analysis layer circuitry of the analytics processing stack, rule logic of the analysis layer circuitry of the analytics processing stack, or both;
    when the stream data is passed to the analytics model logic:
      determining a model change to a model parameter for a predictive data model for the stream data; and
      passing a model change indicator of the model change to the storage layer circuitry for storage within the memory hierarchy;
    when the stream data is passed to the rule logic:
      determining a rule change for a rule governing response to content of the stream data; and
      passing a rule change indicator of the rule change to the storage layer circuitry for storage within the memory hierarchy;
    at insight processing layer circuitry above the analysis layer circuitry within the analytics processing stack:
      accessing the model change indicator, the rule change indicator, the stream data, or any combination thereof via the hardware memory resource provided by the storage layer circuitry;
      determining an insight adjustment responsive to the model change, the rule change, the stream data, or any combination thereof;
      generating an insight adjustment indicator responsive to the insight adjustment; and
      passing the insight adjustment indicator to the storage layer circuitry for storage within the memory hierarchy.
Dependent claims: 10, 11, 12, 13.
14. A product comprising:
  a machine readable medium other than a transitory signal; and
  instructions stored on the machine readable medium, the instructions configured to, when executed, cause a processor to:
    in a data stream processing pipeline comprising sequential multiple input processing stages:
      receive, at an ingestion stage of the data stream processing pipeline, input streams from multiple different input sources, the ingestion stage comprising multiple ingestion processors;
      at an integration stage comprising converter interface circuitry:
        perform a classification on the input streams; and
        responsive to the classification, assign the input streams to one or more dynamic converters configured to output data from multiple different stream types into a stream data in a predefined interchange format, the one or more dynamic converters comprising:
          a structured data dynamic converter configured to handle input streams in a predetermined predictor data format;
          a natural language dynamic converter configured to handle natural language input streams; and
          a media dynamic converter configured to handle media input streams by:
            selecting among speech-to-text processing for audio streams and computer vision processing for video streams; and
            causing execution of selected processing for the media input streams, when present, to generate the output data; and
      at a storage stage comprising a memory hierarchy, store the stream data in the predefined interchange format; and
    in an analytics processing stack coupled to the data stream processing pipeline:
      access the stream data via a hardware memory resource provided by storage layer circuitry of the analytics processing stack;
      process the stream data at processing engine layer circuitry of the analytics processing stack to determine whether to provide the stream data to analytics model logic of analysis layer circuitry of the analytics processing stack, rule logic of the analysis layer circuitry of the analytics processing stack, or both;
      when the stream data is passed to the analytics model logic:
        determine a model change to a model parameter for a predictive data model for the stream data; and
        pass a model change indicator of the model change to the storage layer circuitry for storage within the memory hierarchy;
      when the stream data is passed to the rule logic:
        determine a rule change for a rule governing response to content of the stream data; and
        pass a rule change indicator of the rule change to the storage layer circuitry for storage within the memory hierarchy;
      at insight processing layer circuitry above the analysis layer circuitry within the analytics processing stack:
        access the model change indicator, the rule change indicator, the stream data, or any combination thereof via the hardware memory resource provided by the storage layer circuitry;
        determine an insight adjustment responsive to the model change, the rule change, the stream data, or any combination thereof;
        generate an insight adjustment indicator responsive to the insight adjustment; and
        pass the insight adjustment indicator to the storage layer circuitry for storage within the memory hierarchy.
Dependent claims: 15, 16, 17, 18, 19, 20.
Specification