Systems and/or methods for location transparent routing and execution of processes
Abstract
The example embodiments disclosed herein relate to networks and, more particularly, to systems and/or methods that enable processes to be routed and/or executed in a distributed, location transparent manner. A process engine for use across instances of a service-oriented integration server is provided to a network having a messaging layer. The process engine executes a process in accordance with a process model defined at design time. Each instance of the process engine comprises runtime artifacts such as deployable units of executable logic; a publishing service for publishing a process transition document (PTD) that includes routing data as a part of the message itself; a subscription service for subscribing to the PTDs to be routed to the corresponding instance of the process engine; and a filtering service for filtering other PTDs published to the messaging layer. The messaging layer includes a first queue to process PTDs that trigger new process instances, and a second queue to accommodate transitions between steps of the process model. After a step in the process model is executed, the publishing service publishes a new PTD to cause a next step in the process model to be executed.
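The publish/subscribe routing idea in the abstract can be sketched in a few lines. This is an illustrative model only, not the patented implementation; all names (`ProcessTransitionDocument`, `MessagingLayer`, `matching_ptds`) are invented for the sketch:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessTransitionDocument:
    """A PTD carries its routing data inside the message itself."""
    process_id: str
    target_logical_server: str  # routing data embedded in the message
    step: str
    payload: dict = field(default_factory=dict)

class MessagingLayer:
    """Holds the two queues described in the abstract: one for messages
    that trigger new process instances, one for step transitions."""
    def __init__(self):
        self.new_instance_queue = []
        self.transition_queue = []

    def publish(self, ptd, new_instance=False):
        queue = self.new_instance_queue if new_instance else self.transition_queue
        queue.append(ptd)

def matching_ptds(queue, logical_server):
    """A trigger's subscribe-and-filter step: keep only the PTDs
    addressed to this engine instance's logical server."""
    return [p for p in queue if p.target_logical_server == logical_server]
```

Because the routing data travels inside each PTD, a subscriber needs no knowledge of where the publisher runs; it simply filters the shared queues for messages addressed to it.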
40 Claims
1. A method of configuring a network, the method comprising:
providing a service-oriented integration server including a process engine;
connecting to the network at least one physical server, each said physical server including an instance of the integration server, each said instance of the integration server including an instance of the process engine;
providing a messaging layer for use with the network;
at design time:
modeling a process so as to generate a process model, the process model defining a plurality of activities associated with the process and at least one step associated with each said activity, and
assigning each step from the plurality of activities to a plurality of logical servers that are defined independently of physical servers, such that, at deployment time, any executable artifacts needed by an instance of the process engine are generatable;
generating runtime artifacts from the design-time process model, said runtime artifact generation comprising:
generating, for each one of the plurality of logical servers, at least one package from the design-time process model, each said package including at least one trigger and a process fragment file, each said trigger being responsible for subscribing to process transition document messages to be routed to the instance of the process engine installed on the corresponding instance of the integration server and for filtering process transition document messages published to the messaging layer, the process fragment file defining only those process steps that are associated with the corresponding logical server, and
creating two queues on the messaging layer, a first queue to process messages that trigger new process instances and a second queue to accommodate transitions between process steps of the process model; and
deploying each said package to a physical server, the package being deployed as a runtime asset for the corresponding instance of the process engine,
wherein the process transition document messages are published by the process engine, each said process transition document message including routing data as a part of the message itself for routing the message, and
wherein the process engine is configured to provide both location transparent:
1) execution of the process model such that the process model and each one of the steps within the process model are executed in a transparent manner with respect to the deployed location of each one of the packages that contain at least one of the plurality of steps; and
2) routing, via process transition document subscription and filtering.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11.
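The design-time artifact generation that claim 1 recites can be illustrated as follows. This is a minimal sketch under the assumption that a process model is a list of (step, logical server) assignments; the function name and the package dictionary shape are invented for illustration:

```python
from collections import defaultdict

def generate_packages(process_model):
    """Sketch of per-logical-server package generation.

    process_model: list of (step_name, logical_server) pairs, i.e. the
    design-time assignment of steps to logical servers.
    Returns one package per logical server, each holding a trigger
    (subscribing to that server's PTDs) and a process fragment listing
    only the steps assigned to that server."""
    fragments = defaultdict(list)
    for step, logical_server in process_model:
        fragments[logical_server].append(step)
    packages = {}
    for logical_server, steps in fragments.items():
        packages[logical_server] = {
            # trigger subscribes to PTDs addressed to this logical server
            "trigger": {"subscribe_to": logical_server},
            # fragment defines only this server's steps
            "process_fragment": steps,
        }
    return packages
```

At deployment time, each such package would be installed on whichever physical server hosts the corresponding logical server, which is what allows the physical placement to vary without touching the model.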
12. A method of operating a network including at least one physical server and a messaging layer, each said physical server including an instance of a service-oriented integration server provided to the network, each said instance of the integration server including an instance of a process engine, the method comprising:
at design time, generating a process model corresponding to a process to be executed, the process model including a plurality of activities associated with the process and at least one step associated with each said activity;
assigning each one of the steps from the plurality of activities to a plurality of logical servers at design time;
generating, for each one of the plurality of logical servers with associated steps, runtime artifacts from the process model, said runtime artifacts comprising at least one deployable unit of executable logic for execution on a corresponding instance of the process engine;
creating a first queue to process transition documents that trigger new process instances, and a second queue to accommodate transitions between steps of the process model;
deploying each unit of executable logic to a physical server as a runtime asset for the corresponding instance of the process engine;
initiating an instance of the process according to the process model;
publishing a process transition document, the process transition document being populated by the process engine with routing data as a part of the message itself for routing the message;
for each instance of the process engine, subscribing to the process transition documents to be routed to the corresponding instance of the process engine installed on the corresponding instance of the integration server and filtering other process transition documents published to the messaging layer; and
after a step in the process model is executed, publishing a new process transition document to cause a next step in the process model to be executed,
wherein the process is routed and executed in a location transparent manner such that the process model and each one of the steps within the process model are executed in a transparent manner with respect to the deployed physical location of each one of the units of executable logic that contains at least one of the plurality of steps.
Dependent claims: 13, 14, 15, 16, 17, 18, 19, 20.
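The runtime loop of claim 12, where each executed step causes a new PTD to be published for the next step, can be traced with a toy simulation. The function and data shapes here are assumptions made for the sketch, not the claimed implementation:

```python
def run_process(fragments, transitions, start):
    """Simulate claim 12's step-by-step hand-off.

    fragments: logical_server -> set of steps that server's fragment owns.
    transitions: step -> next step (None marks the end of the process).
    Returns the execution trace as (owning_server, step) pairs, showing
    that control hops between engine instances purely by following the
    PTDs, never by knowing physical locations."""
    owner = {s: srv for srv, steps in fragments.items() for s in steps}
    trace, step = [], start
    while step is not None:
        trace.append((owner[step], step))  # the owning engine executes it
        step = transitions[step]           # "publish" the PTD for the next step
    return trace
```

A transition from a step on one logical server to a step on another looks identical, in this model, to a transition within the same server, which is the location-transparency point of the claim.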
21. A computer-mediated network, comprising:
a service-oriented integration server including a process engine configured to provide location transparent distributed execution of a process conforming to a user-defined process model, the user-defined process model defining a plurality of activities associated with the process and at least one step associated with each said activity;
a messaging layer for the network; and
at least one physical server, each said physical server including an instance of the integration server, wherein each said instance of the integration server includes:
an instance of the process engine, and
runtime artifacts generated at design time based on the design-time process model, including:
at least one package, each said package corresponding to a logical grouping of one or more steps of the plurality of activities,
two queues provided for the messaging layer, a first queue to process messages that trigger new process instances and a second queue to accommodate transitions between process steps of the process model,
at least one trigger, each said trigger being responsible for subscribing to process transition document messages to be routed to the instance of the process engine installed on the corresponding instance of the integration server and for filtering process transition document messages published to the messaging layer, and
a process fragment file, the process fragment file defining only those steps of the user-defined process model that are associated with a corresponding logical grouping;
wherein the process transition document messages are published by the process engine, each said process transition document message including routing data as a part of the message itself for routing the message, and
wherein the process engine is configured to provide location transparent execution of the process model such that the process model and each one of the steps within the process model are executed in a transparent manner with respect to a physical network topology that includes the at least one physical server.
Dependent claims: 22, 23, 24, 25, 26, 27, 28, 29, 30.
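Claim 21's separation of logical groupings from the physical network topology can be shown with a small deployment sketch. The mapping function and its inputs are hypothetical names chosen for illustration:

```python
def deploy(packages, placement):
    """Bind logical servers to physical hosts at deployment time.

    packages: logical_server -> package (the runtime asset for it).
    placement: logical_server -> physical_server.
    Returns physical_server -> list of packages it hosts. The process
    model itself is untouched by any change to `placement`, which is the
    location-transparency property the claim describes."""
    hosts = {}
    for logical_server, package in packages.items():
        hosts.setdefault(placement[logical_server], []).append(package)
    return hosts
```

Collapsing every logical server onto one box, or spreading them across many, changes only the `placement` argument; the packages and the model they encode stay identical.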
31. A non-transitory computer readable storage medium storing a process engine for use across instances of a service-oriented integration server provided to a network having a messaging layer, the process engine being configured to execute a process in accordance with a process model defined at design time so as to include a plurality of activities associated with the process and at least one step associated with each said activity, each one of the steps of the plurality of activities assigned among a plurality of logical servers, each instance of the process engine comprising:
runtime artifacts generated from the process model that correspond to a deployable unit of executable logic for execution on the instance of the process engine;
a publishing service configured to publish a process transition document, the process transition document being populated by the process engine with routing data as a part of the message itself for routing the message;
a subscription service configured to subscribe to the process transition documents to be routed to the corresponding instance of the process engine installed on the corresponding instance of the integration server; and
a filtering service configured to filter other process transition documents published to the messaging layer;
wherein the messaging layer includes two queues generated from the design-time process model, including a first queue to process transition documents that trigger new process instances, and a second queue to accommodate transitions between steps of the process model,
wherein, after a step in the process model is executed, the publishing service is configured to publish a new process transition document to cause a next step in the process model to be executed independent of a physical location of where the next step is to be executed, and
wherein the process is routed and executed in a location transparent manner such that the process model and each one of the plurality of steps within the process model are executed in a transparent manner with respect to the process engine.
Dependent claims: 32, 33, 34, 35, 36, 37, 38, 39, 40.
Specification