System and method for autonomously generating heterogeneous data source interoperability bridges based on semantic modeling derived from self-adapting ontology
First Claim
1. A system connected to multiple heterogeneous data sources each having a data structure, said system monitoring at least one of said data structures, analyzing changes to said at least one of said data structures and providing for simultaneous re-coding of adapters between at least two of said multiple heterogeneous data sources.
4 Assignments
0 Petitions
Abstract
A system, including software components, that efficiently and dynamically analyzes changes to data sources, including application programs, within an integration environment and simultaneously re-codes dynamic adapters between the data sources is disclosed. The system also monitors at least two of said data sources to detect similarities within the data structures of said data sources and generates new dynamic adapters to integrate said at least two of said data sources. The system also provides real time error validation of dynamic adapters as well as performance optimization of newly created dynamic adapters that have been generated under changing environmental conditions.
157 Claims
- 1. A system connected to multiple heterogeneous data sources each having a data structure, said system monitoring at least one of said data structures, analyzing changes to said at least one of said data structures and providing for simultaneous re-coding of adapters between at least two of said multiple heterogeneous data sources.
- 3. A system connected to multiple heterogeneous data sources each having a data structure, said system monitoring at least two of said data sources to detect similarities within the data structures of said data sources and generating new dynamic adapters to integrate said at least two of said data sources.
4. The process in a system within an integration environment for analyzing changes to multiple heterogeneous data sources each having a data structure and providing for simultaneous re-coding of dynamic adapters between said multiple heterogeneous data sources, including the steps of intelligently analyzing the conceptual relationships and alternative data mapping strategies between a plurality of said data structures by utilizing intelligent computer programs to analyze and adapt to structural, contextual and semantic differences between said multiple heterogeneous data sources.
46. The process of operating on two data sources within a system including other components than said two data sources, said other components including at least a Common Ontology library, including the steps of:
monitoring each of said data sources by an Assessment Micro Agent including a Schema Manager, said Assessment Micro Agent creating an inventory of the data structures and functionalities of said data sources and making said inventory available to predetermined ones of said other components of said system, said Assessment Micro Agent detecting a change in either of said data sources and notifying at least some of said other components of the change. - View Dependent Claims (47, 48, 49, 50, 51, 52, 53, 54, 55)
56. An Assessment Micro Agent comprising a plurality of components including:
a Schema Manager connected to at least one data source for analyzing said at least one data source and extracting a meta-data model in the form of a schema, storing said schema and providing an interface to certain of said plurality of components for retrieving the schema;
a Change Specification Manager for performing an analysis of what is different between two different versions of a data source by comparing the schemas associated with each version and presenting the change specification file to a user in a structured manner with specific information indicating changes in the schemas;
a Task Scheduler for allowing a user to schedule tasks; and
a Notification Manager for providing an interface in which users can define notifications at several levels of granularity. - View Dependent Claims (57, 58, 59, 60)
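The Assessment Micro Agent of claim 56 can be illustrated with a minimal sketch in Python. Everything below (the class names, the dict-of-records data model, the subscriber callbacks) is a hypothetical rendering for illustration, not the patented implementation: a Schema Manager extracts and stores a schema and exposes it to other components, and the agent detects a structural change and notifies its subscribers.

```python
# Illustrative sketch of claim 56's Assessment Micro Agent; all names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SchemaManager:
    """Extracts a meta-data model (schema) from a data source and stores it."""
    schemas: dict = field(default_factory=dict)

    def extract(self, source_name, records):
        # A schema here is simply the sorted set of field names observed.
        schema = sorted({k for row in records for k in row})
        self.schemas[source_name] = schema
        return schema

    def get(self, source_name):
        """Interface for other components to retrieve a stored schema."""
        return self.schemas.get(source_name)

@dataclass
class AssessmentMicroAgent:
    schema_manager: SchemaManager = field(default_factory=SchemaManager)
    subscribers: list = field(default_factory=list)   # components to notify

    def detect_change(self, source_name, current_records):
        """Compare the stored schema against a fresh extraction; notify on change."""
        old = self.schema_manager.get(source_name)
        new = sorted({k for row in current_records for k in row})
        if old is not None and new != old:
            for notify in self.subscribers:
                notify(source_name, old, new)
            self.schema_manager.schemas[source_name] = new
            return True
        return False

agent = AssessmentMicroAgent()
agent.schema_manager.extract("crm", [{"cust_id": 1, "name": "Ada"}])
events = []
agent.subscribers.append(lambda src, old, new: events.append((src, old, new)))
changed = agent.detect_change("crm", [{"cust_id": 1, "full_name": "Ada"}])
print(changed, events[0][2])   # True ['cust_id', 'full_name']
```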
61. The process of operating an Application Ontology Factory including the steps of:
converting the schema obtained from the Schema Manager component of the Assessment Micro Agent into a language compatible to the Common Ontology;
mapping schema element identifiers to WordNet to extract at least one of the senses of said elements;
using said senses to extract all possible Common Ontology concept hierarchies to which the element might be a top-most specialization;
assigning each concept hierarchy a confidence factor;
merging said concept hierarchies to produce a micro-theory including each of said senses. - View Dependent Claims (62, 63)
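The steps of claim 61 amount to word-sense lookup plus hierarchy merging. The sketch below substitutes tiny in-memory tables for WordNet and the Common Ontology; the `SENSES` and `HIERARCHIES` dictionaries, the depth-based confidence rule, and all names are illustrative assumptions.

```python
# Sketch of claim 61's sense-to-hierarchy mapping (hypothetical stand-in data).

# Stand-in for WordNet: identifier -> candidate senses.
SENSES = {
    "customer": ["consumer", "client"],
}
# Stand-in for Common Ontology concept hierarchies, keyed by top-most sense.
HIERARCHIES = {
    "consumer": ["entity", "person", "consumer"],
    "client":   ["entity", "person", "social_role", "client"],
}

def build_micro_theory(identifier):
    """Map an element identifier to its senses, collect each candidate concept
    hierarchy with a confidence factor, and merge them into a micro-theory."""
    senses = SENSES.get(identifier, [])
    candidates = []
    for sense in senses:
        hierarchy = HIERARCHIES.get(sense)
        if hierarchy:
            # Toy confidence: shallower hierarchies (fewer hops from the root)
            # are treated as more direct specializations.
            confidence = round(1.0 / len(hierarchy), 2)
            candidates.append({"sense": sense, "hierarchy": hierarchy,
                               "confidence": confidence})
    # The merged micro-theory keeps every sense's hierarchy.
    return {"element": identifier, "theories": candidates}

theory = build_micro_theory("customer")
print([t["sense"] for t in theory["theories"]])   # ['consumer', 'client']
```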
64. In an artificial intelligence system connected to multiple heterogeneous data sources for generating new dynamic adapters to integrate changes in at least two of said data sources, the process of describing a schema using the syntax of the Common Ontology language.
- 65. In a system for automatically re-coding interfaces between heterogeneous data sources, the process of monitoring changes in a monitored data source, analyzing the exact nature of the change, evaluating alternative data mapping possibilities, and adjusting the existing dynamic adapter integration code structures to address the changes.
69. In a system for automatically generating dynamic adapters between heterogeneous data sources, the process of monitoring changes in a monitored data source using pattern matching, said process including the steps of:
generating a data source to ontology mapping for each data source being mapped by evaluating the mathematical probabilities of lexical and semantic relationships between schema entities and ontology concepts;
determining lexical closeness between the data source ontology and Common Ontology concepts using synonym relationships;
determining mathematical closeness of semantic relationships in the form of hypernyms; and
determining confidence factors based on the mathematical probability of said data source ontology and said Common Ontology being lexically and semantically close. - View Dependent Claims (70, 71, 72, 73, 74, 75)
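The lexical-closeness, semantic-closeness, and confidence steps of claim 69 can be sketched as a weighted score. The synonym table, hypernym chain, weights, and decay rule below are illustrative assumptions, not the claimed probability model.

```python
# Sketch of claim 69's closeness scoring (toy lexicon; all values illustrative).

SYNONYMS = {"client": {"customer", "patron"}}
HYPERNYMS = {"customer": ["consumer", "person", "entity"]}

def lexical_closeness(a, b):
    """1.0 on exact match or a known synonym relationship, else 0.0."""
    if a == b or b in SYNONYMS.get(a, set()) or a in SYNONYMS.get(b, set()):
        return 1.0
    return 0.0

def semantic_closeness(a, b):
    """Decay with the number of hypernym hops from a up to b."""
    chain = HYPERNYMS.get(a, [])
    return 1.0 / (chain.index(b) + 2) if b in chain else 0.0

def confidence(schema_entity, concept, w_lex=0.6, w_sem=0.4):
    """Combined probability-style confidence that entity and concept match."""
    return round(w_lex * lexical_closeness(schema_entity, concept)
                 + w_sem * semantic_closeness(schema_entity, concept), 2)

print(confidence("client", "customer"))   # 0.6  (synonym evidence only)
print(confidence("customer", "person"))   # 0.13 (one hypernym hop away)
```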
- 76. In a system for automatically generating dynamic adapters between heterogeneous data sources, a Planner receiving the change specification file created by the Change Specification Manager and developing and logically testing an ordered dynamic adapter development plan.
- 77. In a system for automatically generating dynamic adapters between heterogeneous data sources, a Planner receiving a similarity map file created by an App2App Similarity Mapper and developing and logically testing an ordered dynamic adapter development plan.
81. An App2App Ontology Mapper for producing data mapping between schema elements, said mappings having confidence factors, said App2App Ontology Mapper including a software process for detecting that said mapping is accomplished by a lexical, semantic, expected data value, composition or decomposition process and, responsive to any such detecting, increasing said confidence factor.
82. An App2App Ontology Mapper for producing data mapping between schema elements, said mappings having confidence factors, said App2App Ontology Mapper including a software process for detecting that said mapping is refuted by a lexical, semantic, expected data value, composition or decomposition process and, responsive to any such detecting, lowering said confidence factor.
83. An App2App Ontology Mapper for producing data mappings between schema elements, said mappings having confidence factors, said App2App Ontology Mapper including a software process for assigning a lower confidence factor to mappings accomplished by lexical similarity than to mappings accomplished by lexical similarity plus semantic mapping.
84. An App2App Ontology Mapper for producing data mappings between schema elements, said mappings having confidence factors, said App2App Ontology Mapper including a software process for assigning a lower confidence factor to mappings accomplished by semantic mapping than to mappings accomplished by semantic mapping and expected data value mapping.
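Claims 81 through 84 together impose an ordering on confidence factors: each corroborating evidence type raises the factor, a refutation lowers it, and combined evidence outranks single evidence. A toy scoring rule consistent with that ordering follows; the weights are invented for illustration.

```python
# Sketch of the confidence ordering in claims 81-84 (illustrative weights).

EVIDENCE_WEIGHT = {
    "lexical": 0.3, "semantic": 0.3,
    "expected_data_value": 0.2, "composition": 0.1, "decomposition": 0.1,
}

def score(corroborating, refuting=()):
    """Sum supporting evidence weights, subtract refuting ones, clamp to [0, 1]."""
    s = sum(EVIDENCE_WEIGHT[e] for e in corroborating)
    s -= sum(EVIDENCE_WEIGHT[e] for e in refuting)
    return max(0.0, min(1.0, round(s, 2)))

lex_only = score(["lexical"])
lex_sem  = score(["lexical", "semantic"])
sem_edv  = score(["semantic", "expected_data_value"])
refuted  = score(["lexical", "semantic"], refuting=["expected_data_value"])
print(lex_only < lex_sem, score(["semantic"]) < sem_edv, refuted < lex_sem)
# True True True
```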
85. In a system for generating dynamic adapters between changed data sources, a process for generating dynamic adapters including the steps of:
after an integration plan between two data sources has been generated, an Assessment Micro Agent determining that the data structure of one of said data sources has changed and, in response to said detecting, informing a Planner software component to generate a new plan if the previously generated plan has been affected by said change;
creating a Change Specification File that describes said changes that occurred;
discovering which schema elements of said dynamic adapter have changed;
mapping the affected schema elements into the existing data source ontology;
performing lexical and semantic mapping on the affected schema elements to find new associations with said data source ontology;
in response to finding said new associations, validating said new associations; and
attempting to find new mappings for the affected elements. - View Dependent Claims (86, 87, 88, 89, 90)
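The Change Specification File of claims 46 and 85 is, at minimum, a structured diff of two schema versions. A minimal sketch follows, assuming a schema is just a list of element names; the file format itself is not disclosed in the claims.

```python
# Sketch of a Change Specification File as a schema diff (format is assumed).

def change_specification(old_schema, new_schema):
    """Describe what differs between two versions of a data source's schema."""
    old, new = set(old_schema), set(new_schema)
    return {"added": sorted(new - old),
            "removed": sorted(old - new),
            "unchanged": sorted(old & new)}

spec = change_specification(["cust_id", "name"], ["cust_id", "full_name"])
print(spec)
# {'added': ['full_name'], 'removed': ['name'], 'unchanged': ['cust_id']}
```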
91. In a system for generating revised dynamic adapters between changed data sources, a process for revising said adapters including the steps of:
a Planner presenting an integration plan approved by a user as input to a CodeGen Agent;
said CodeGen Agent executing the development of new adapters by reparsing said integration plan into a user-selected programming language. - View Dependent Claims (92)
93. In a system for generating new dynamic adapters between data sources, a process for generating said adapters including the steps of:
a Planner presenting as input to a CodeGen Agent an integration plan approved by a user, said integration plan including an indication of a user-selected programming language;
said CodeGen Agent executing the development of new adapters by producing programming instructions to accomplish the integration plan in the user-selected programming language.
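The Planner-to-CodeGen handoff of claims 91 through 93 can be sketched as a toy generator: an approved integration plan (here a hypothetical dict of field mappings plus a target-language tag) is re-parsed into adapter source code. The plan format and the single supported target language are illustrative assumptions.

```python
# Toy sketch of the CodeGen Agent step (hypothetical plan format).

def codegen(plan):
    """Emit adapter source text for each field mapping in the plan."""
    if plan["language"] != "python":          # only one target in this sketch
        raise ValueError("unsupported target language")
    lines = ["def adapt(record):", "    return {"]
    for src, dst in plan["mappings"]:
        lines.append(f"        {dst!r}: record[{src!r}],")
    lines.append("    }")
    return "\n".join(lines)

plan = {"language": "python",
        "mappings": [("cust_id", "customer_id"), ("name", "full_name")]}
source = codegen(plan)
namespace = {}
exec(source, namespace)                       # "develop" the new adapter
print(namespace["adapt"]({"cust_id": 7, "name": "Ada"}))
# {'customer_id': 7, 'full_name': 'Ada'}
```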
- 94. For use in a system for generating new dynamic adapters between data sources, an Error Management Micro Agent coupled to a Planner and accepting the output from said Planner to determine and categorize program errors and remediation plans.
99. A system for automatically re-coding interfaces between heterogeneous data sources comprising:
means for monitoring modifications made to a data source existing within an integration environment, wherein the environment contains multiple heterogeneous data sources;
means for analyzing said modifications;
means for formulating a set of potential ontological mappings between heterogeneous data sources;
means for providing interoperability code structures between heterogeneous data sources. - View Dependent Claims (100)
101. A system for automatically re-coding interfaces between heterogeneous data sources comprising:
means for monitoring and analyzing modifications made to a data source existing within an integration environment, wherein the environment contains multiple heterogeneous data sources;
means for formulating a set of potential ontological mappings between heterogeneous data sources and providing interoperability code structures between data sources.
102. In a system for automatically generating dynamic adapters between heterogeneous data sources, the process of generating a new adapter, said process including the steps of:
generating a data source to ontology mapping for each data source being mapped by evaluating the mathematical probabilities of lexical and semantic relationships between schema entities and ontology concepts;
determining lexical closeness between the data source ontology and Common Ontology concepts using synonym relationships;
determining mathematical closeness of semantic relationships in the form of hypernyms;
determining confidence factors based on the mathematical probability of said data source ontology and said Common Ontology being lexically and semantically close. - View Dependent Claims (103, 104, 105, 106, 107, 108)
112. In a system for generating dynamic adapters between two data sources, a process for developing dynamic adapters including the steps of:
before an integration plan between said two data sources has been generated, an App2App Similarity Mapper determining the similarities between said two data sources and informing a Planner software component to generate a new plan, said App2App Similarity Mapper performing at least the steps of:
creating an App2App similarity map that describes said similarities;
mapping the schema elements affected by said similarities to an existing data source ontology;
performing lexical and semantic mapping on the affected schema elements to find new associations with said data source ontology;
in response to finding said new associations, validating said new associations; and
attempting to find new mappings for the affected elements. - View Dependent Claims (113, 114, 115, 116, 117)
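The App2App similarity map of claim 112 can be sketched as a pairwise comparison of two schemas. The token-overlap heuristic and the 1.0/0.5 scores below are illustrative, not the claimed mapper.

```python
# Sketch of an App2App similarity map (scoring heuristic is assumed).

def similarity_map(schema_a, schema_b):
    """Return {(elem_a, elem_b): score} for element pairs with any overlap."""
    result = {}
    for a in schema_a:
        for b in schema_b:
            if a == b:
                result[(a, b)] = 1.0
            elif set(a.split("_")) & set(b.split("_")):
                result[(a, b)] = 0.5          # shared token, weaker evidence
    return result

m = similarity_map(["cust_id", "order_date"], ["cust_name", "order_date"])
print(m)
# {('cust_id', 'cust_name'): 0.5, ('order_date', 'order_date'): 1.0}
```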
- 118. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform in a system within an integration environment for analyzing changes to multiple heterogeneous data sources each having a data structure and providing for simultaneous re-coding of dynamic adapters between said multiple heterogeneous data sources, the process comprising the step of intelligently analyzing the conceptual relationships and alternative data mapping strategies between a plurality of said data structures by utilizing intelligent computer programs to analyze and adapt to structural, contextual and semantic differences between said multiple heterogeneous data sources.
123. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform a process of operating on two data sources within a system including other components than said two data sources, said other components including at least a Common Ontology library, the process comprising the steps of:
monitoring each of said data sources by an Assessment Micro Agent including a Schema Manager;
said Assessment Micro Agent creating an inventory of the data structures and functionalities of said data sources and making said inventory available to predetermined ones of said other components of said system;
said Assessment Micro Agent detecting a change in either of said data sources and notifying at least some of said other components of the change. - View Dependent Claims (124, 125, 126, 127, 128, 129, 130, 131)
132. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform a process of operating an Application Ontology Factory, the process comprising the steps of:
converting the schema obtained from the Schema Manager component of the Assessment Micro Agent into a language compatible to the Common Ontology;
mapping schema element identifiers to WordNet to extract at least one of the senses of said elements;
using said senses to extract all possible Common Ontology concept hierarchies to which the element might be a top-most specialization;
assigning each concept hierarchy a confidence factor;
merging said concept hierarchies to produce a micro-theory including each of said senses. - View Dependent Claims (133, 134)
135. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform a process, in an artificial intelligence system connected to multiple heterogeneous data sources for generating new dynamic adapters to integrate changes in at least two of said data sources, the process of describing a schema using the syntax of the Common Ontology language.
- 136. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform a process, in a system for automatically re-coding interfaces between heterogeneous data sources, the process comprising the steps of monitoring changes in a monitored data source, analyzing the exact nature of the change, evaluating alternative data mapping possibilities, and adjusting the existing dynamic adapter integration code structures to address the changes.
138. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform, in a system for automatically generating dynamic adapters between heterogeneous data sources, the process of monitoring changes in a monitored data source using pattern matching, the process comprising the steps of:
generating a data source to ontology mapping for each data source being mapped by evaluating the mathematical probabilities of lexical and semantic relationships between schema entities and ontology concepts;
determining lexical closeness between the data source ontology and Common Ontology concepts using synonym relationships;
determining mathematical closeness of semantic relationships in the form of hypernyms; and
determining confidence factors based on the mathematical probability of said data source ontology and said Common Ontology being lexically and semantically close. - View Dependent Claims (139)
140. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform a process in a system for automatically generating dynamic adapters between heterogeneous data sources, the process comprising the step of a Planner receiving the change specification file created by the Change Specification Manager and developing and logically testing an ordered dynamic adapter development plan.
141. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform a process, in a system for automatically generating dynamic adapters between heterogeneous data sources, the process comprising the step of a Planner receiving a similarity map file created by an App2App Similarity Mapper and developing and logically testing an ordered dynamic adapter development plan.
142. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform a process in a system for generating dynamic adapters between changed data sources, said process for generating dynamic adapters including the steps of:
after an integration plan between two data sources has been generated, an Assessment Micro Agent determining that the data structure of one of said data sources has changed and, in response to said detecting, informing a Planner software component to generate a new plan if the previously generated plan has been affected by said change;
creating a Change Specification File that describes said changes that occurred;
discovering which schema elements of said dynamic adapter have changed;
mapping the affected schema elements into the existing data source ontology;
performing lexical and semantic mapping on the affected schema elements to find new associations with said data source ontology;
in response to finding said new associations, validating said new associations; and
attempting to find new mappings for the affected elements. - View Dependent Claims (143, 144, 145)
146. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform, in a system for generating revised dynamic adapters between changed data sources, a process for revising said adapters, the process comprising the steps of:
a Planner presenting an integration plan approved by a user as input to a CodeGen Agent;
said CodeGen Agent executing the development of new adapters by reparsing said integration plan into a user-selected programming language. - View Dependent Claims (147)
148. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform, in a system for generating new dynamic adapters between data sources, a process for generating said adapters, the process comprising the steps of:
a Planner presenting as input to a CodeGen Agent an integration plan approved by a user, said integration plan including an indication of a user-selected programming language;
said CodeGen Agent executing the development of new adapters by producing programming instructions to accomplish the integration plan in the user-selected programming language.
149. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform, in a system for automatically generating dynamic adapters between heterogeneous data sources, the process of generating a new adapter, the process comprising the steps of:
generating a data source to ontology mapping for each data source being mapped by evaluating the mathematical probabilities of lexical and semantic relationships between schema entities and ontology concepts;
determining lexical closeness between the data source ontology and Common Ontology concepts using synonym relationships;
determining mathematical closeness of semantic relationships in the form of hypernyms;
determining confidence factors based on the mathematical probability of said data source ontology and said Common Ontology being lexically and semantically close. - View Dependent Claims (150)
151. One or more processor readable storage devices having processor readable code embodied on said processor readable storage devices, said processor readable code for programming one or more processors to perform, in a system for generating dynamic adapters between two data sources, a process for developing dynamic adapters, the process comprising the steps of:
before an integration plan between said two data sources has been generated, an App2App Similarity Mapper determining the similarities between said two data sources and informing a Planner software component to generate a new plan, said App2App Similarity Mapper performing at least the steps of:
creating an App2App similarity map that describes said similarities;
mapping the schema elements affected by said similarities to an existing data source ontology;
performing lexical and semantic mapping on the affected schema elements to find new associations with said data source ontology;
in response to finding said new associations, validating said new associations; and
attempting to find new mappings for the affected elements. - View Dependent Claims (152, 153)
154. A process of managing revisions in a data source including the steps of:
connecting an Assessment Micro Agent to a data source;
using the Schema Manager, extracting information about the data source;
using the Schema Manager, building a schema of the data source from at least some of said extracted information; and
presenting the schema to a user. - View Dependent Claims (155, 156, 157)
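The extract, build, and present steps of claim 154 can be sketched against sampled records, with an inferred field-to-type schema standing in for the extracted meta-data model; the representation is an assumption.

```python
# Sketch of claim 154's extract/build/present steps (hypothetical representation).

def build_schema(records):
    """Infer each field's type name from sampled records."""
    schema = {}
    for row in records:
        for key, value in row.items():
            schema.setdefault(key, type(value).__name__)
    return schema

def present(schema):
    """Render the schema for presentation to a user."""
    return "\n".join(f"{name}: {typ}" for name, typ in sorted(schema.items()))

schema = build_schema([{"cust_id": 7, "name": "Ada"}, {"cust_id": 8, "active": True}])
print(present(schema))
# active: bool
# cust_id: int
# name: str
```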
Specification