Systems and/or methods for location transparent routing and execution of processes
First Claim
1. A method of configuring a network, the method comprising:
providing a service-oriented integration server including a process engine;
connecting to the network at least one physical server, each said physical server including an instance of the integration server, each said instance of the integration server including an instance of the process engine;
providing a messaging layer for use with the network;
at design time:
modeling a process so as to generate a process model, the process model defining at least one activity associated with the process and at least one step associated with each said activity, and assigning each said activity to a logical server such that, at deployment time, any executable artifacts needed by an instance of the process engine are generatable;
generating runtime artifacts from the design time process model, said runtime artifact generation comprising:
generating at least one package from the design time process model, each said package corresponding to one said logical server and including at least one trigger and a process fragment file, each said trigger being responsible for subscribing to process transition document messages to be routed to the instance of the process engine installed on the corresponding instance of the integration server and for filtering process transition document messages published to the messaging layer, the process fragment file defining only those process steps that are associated with the corresponding logical server, and creating two queues on the messaging layer, a first queue to process messages that trigger new process instances and a second queue to accommodate transitions between process steps of the process model; and
deploying each said package to a physical server, the package being deployed as a runtime asset for the corresponding instance of the process engine, wherein the process transition document messages are published by the process engine, each said process transition document message including routing data as a part of the message itself for routing the message, and wherein the process engine is configured to provide location transparent routing and process execution via process transition document subscription and filtering.
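The design-time artifact generation recited above (one package per logical server, each with a process fragment and a trigger filter) can be illustrated with a minimal sketch. All names here (`generate_packages`, the model dictionary shape, the filter string syntax) are illustrative assumptions, not the patent's actual implementation.

```python
from collections import defaultdict

# Hypothetical sketch of runtime-artifact generation: split a design-time
# process model into one package per logical server. Each package holds a
# process fragment (only that server's steps) and a trigger filter over
# process transition document (PTD) messages.
def generate_packages(process_model):
    fragments = defaultdict(list)
    for step in process_model["steps"]:
        fragments[step["logical_server"]].append(step["name"])
    packages = {}
    for server, steps in fragments.items():
        packages[server] = {
            "fragment": steps,                          # process fragment file
            "trigger_filter": f"target == '{server}'",  # PTD subscription filter
        }
    return packages
    # The two queues named in the claim (new-instance, step-transition)
    # would also be created at this stage; omitted here for brevity.

model = {"steps": [
    {"name": "receive",  "logical_server": "edge"},
    {"name": "validate", "logical_server": "core"},
    {"name": "archive",  "logical_server": "core"},
]}
pkgs = generate_packages(model)
print(pkgs["core"]["fragment"])  # ['validate', 'archive']
```

Because each fragment contains only its own server's steps, redeploying an activity to a different logical server only regenerates packages; the process model itself is unchanged.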
Abstract
The example embodiments disclosed herein relate to networks and, more particularly, to systems and/or methods that enable processes to be routed and/or executed in a distributed, location transparent manner. A process engine for use across instances of a service-oriented integration server is provided to a network having a messaging layer. The process engine executes a process in accordance with a process model defined at design time. Each instance of the process engine comprises runtime artifacts such as deployable units of executable logic; a publishing service for publishing a process transition document (PTD) that includes routing data as a part of the message itself; a subscription service for subscribing to the PTDs to be routed to the corresponding instance of the process engine; and a filtering service for filtering other PTDs published to the messaging layer. The messaging layer includes a first queue to process PTDs that trigger new process instances, and a second queue to accommodate transitions between steps of the process model. After a step in the process model is executed, the publishing service publishes a new PTD to cause a next step in the process model to be executed.
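The abstract's central mechanism is a message that carries its own routing data, so subscribers can filter on the message content alone and the publisher never needs a subscriber's address. A minimal sketch of that idea follows; the class and field names (`ProcessTransitionDocument`, `target_logical_server`) are assumptions for illustration, not the patented data format.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a process transition document (PTD) that embeds
# its routing data in the message itself, so no central router is needed.
@dataclass
class ProcessTransitionDocument:
    process_instance_id: str    # identifies the running process instance
    next_step: str              # step the receiving engine should execute
    target_logical_server: str  # routing data carried inside the message
    payload: dict = field(default_factory=dict)

# A subscriber filters on the embedded routing data alone, which is what
# makes the routing location transparent.
def matches(ptd: ProcessTransitionDocument, logical_server: str) -> bool:
    return ptd.target_logical_server == logical_server

ptd = ProcessTransitionDocument("inst-1", "approve-order", "server-A")
print(matches(ptd, "server-A"))  # True
print(matches(ptd, "server-B"))  # False
```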
32 Claims
1. A method of configuring a network, the method comprising:
providing a service-oriented integration server including a process engine;
connecting to the network at least one physical server, each said physical server including an instance of the integration server, each said instance of the integration server including an instance of the process engine;
providing a messaging layer for use with the network;
at design time:
modeling a process so as to generate a process model, the process model defining at least one activity associated with the process and at least one step associated with each said activity, and assigning each said activity to a logical server such that, at deployment time, any executable artifacts needed by an instance of the process engine are generatable;
generating runtime artifacts from the design time process model, said runtime artifact generation comprising:
generating at least one package from the design time process model, each said package corresponding to one said logical server and including at least one trigger and a process fragment file, each said trigger being responsible for subscribing to process transition document messages to be routed to the instance of the process engine installed on the corresponding instance of the integration server and for filtering process transition document messages published to the messaging layer, the process fragment file defining only those process steps that are associated with the corresponding logical server, and creating two queues on the messaging layer, a first queue to process messages that trigger new process instances and a second queue to accommodate transitions between process steps of the process model; and
deploying each said package to a physical server, the package being deployed as a runtime asset for the corresponding instance of the process engine, wherein the process transition document messages are published by the process engine, each said process transition document message including routing data as a part of the message itself for routing the message, and wherein the process engine is configured to provide location transparent routing and process execution via process transition document subscription and filtering.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9
10. A method of operating a network including at least one physical server and a messaging layer, each said physical server including an instance of a service-oriented integration server provided to the network, each said instance of the integration server including an instance of a process engine, the method comprising:
at design time, generating a process model corresponding to a process to be executed, the process model including at least one activity associated with the process and at least one step associated with each said activity;
generating runtime artifacts from the process model, said runtime artifacts comprising at least one deployable unit of executable logic for execution on a corresponding instance of the process engine, a first queue to process transition documents that trigger new process instances, and a second queue to accommodate transitions between steps of the process model;
deploying each unit of executable logic to a physical server as a runtime asset for the corresponding instance of the process engine;
initiating an instance of the process according to the process model;
publishing a process transition document, the process transition document being populated by the process engine with routing data as a part of the message itself for routing the message;
for each instance of the process engine, subscribing to the process transition documents to be routed to the corresponding instance of the process engine installed on the corresponding instance of the integration server and filtering other process transition documents published to the messaging layer; and
after a step in the process model is executed, publishing a new process transition document to cause a next step in the process model to be executed, wherein the process is routed and executed in a location transparent manner.
Dependent claims: 11, 12, 13, 14, 15, 16
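Claim 10's runtime behavior, where each engine consumes only the transition documents addressed to it, executes the step it owns, and publishes a new document for the next step, can be sketched as a toy loop. The flow table, queue, and function names below are illustrative assumptions, not the claimed implementation.

```python
from collections import deque

# Hypothetical runtime sketch: engine instances pull PTDs from a shared
# transition queue, filter on their own logical server, execute the step
# they own, and publish a new PTD for the next step. Routing data stays
# inside the messages, so engines never address each other directly.
FLOW = {"receive":  ("validate", "core"),  # step -> (next step, its server)
        "validate": ("archive", "core"),
        "archive":  (None, None)}

queue = deque([{"step": "receive", "target": "edge"}])  # initial PTD
executed = []

def run_engine(logical_server):
    # One pass: consume PTDs addressed to this engine, re-queue the rest.
    for _ in range(len(queue)):
        ptd = queue.popleft()
        if ptd["target"] != logical_server:
            queue.append(ptd)          # filtered out: not ours
            continue
        executed.append((logical_server, ptd["step"]))
        nxt, srv = FLOW[ptd["step"]]
        if nxt:                        # publish a PTD for the next step
            queue.append({"step": nxt, "target": srv})

run_engine("edge"); run_engine("core"); run_engine("core")
print(executed)  # [('edge', 'receive'), ('core', 'validate'), ('core', 'archive')]
```

The chaining, where each executed step publishes the document that drives the next, is what lets the process cross server boundaries without any engine knowing where the others run.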
17. A computer-mediated network, comprising:
a service-oriented integration server including a process engine configured to provide location transparent distributed execution of a process conforming to a user-defined process model, the process model defining at least one activity associated with the process and at least one step associated with each said activity;
a messaging layer for the network; and
at least one physical server, each said physical server including an instance of the integration server, wherein each said instance of the integration server includes:
an instance of the process engine,
runtime artifacts generated at design time based on the design time process model, including:
at least one package, each said package corresponding to one said logical grouping of one or more activities, and
two queues provided for the messaging layer, a first queue to process messages that trigger new process instances and a second queue to accommodate transitions between process steps of the process model,
at least one trigger, each said trigger being responsible for subscribing to process transition document messages to be routed to the instance of the process engine installed on the corresponding instance of the integration server and for filtering process transition document messages published to the messaging layer, and
a process fragment file, the process fragment file defining only those process steps that are associated with the corresponding logical server;
wherein the process transition document messages are published by the process engine, each said process transition document message including routing data as a part of the message itself for routing the message.
Dependent claims: 18, 19, 20, 21, 22, 23, 24
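Claim 17 separates logical grouping (done at design time) from physical placement (done at deployment). A minimal sketch of that binding follows; the host names and helper function are hypothetical, introduced only to show that relocating a package changes a mapping, not the model or the messages.

```python
# Hypothetical sketch of deployment-time binding: activities are grouped by
# logical server at design time; only at deployment are logical servers
# mapped onto physical servers. Moving a package to a different physical
# server changes this map, not the process model or the PTD messages.
logical_to_physical = {"edge": "host-10.0.0.5", "core": "host-10.0.0.9"}

packages = {"edge": ["receive"], "core": ["validate", "archive"]}

def deploy(packages, mapping):
    # Returns which physical host each package (and its steps) lands on.
    return {mapping[server]: steps for server, steps in packages.items()}

deployment = deploy(packages, logical_to_physical)
print(deployment["host-10.0.0.9"])  # ['validate', 'archive']
```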
25. A process engine for use across instances of a service-oriented integration server provided to a network having a messaging layer, the process engine being configured to execute a process in accordance with a process model defined at design time so as to include at least one activity associated with the process and at least one step associated with each said activity, each instance of the process engine comprising:
runtime artifacts generated from the process model that correspond to a deployable unit of executable logic for execution on the instance of the process engine;
a publishing service configured to publish a process transition document, the process transition document being populated by the process engine with routing data as a part of the message itself for routing the message;
a subscription service configured to subscribe to the process transition documents to be routed to the corresponding instance of the process engine installed on the corresponding instance of the integration server;
a filtering service configured to filter other process transition documents published to the messaging layer;
wherein the messaging layer includes two queues generated from the design time process model, including a first queue to process transition documents that trigger new process instances, and a second queue to accommodate transitions between steps of the process model,
wherein, after a step in the process model is executed, the publishing service is configured to publish a new process transition document to cause a next step in the process model to be executed, and
wherein the process is routed and executed in a location transparent manner.
Dependent claims: 26, 27, 28, 29, 30, 31, 32
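The two-queue split recited in claims 1, 10, 17, and 25, one queue for messages that start new process instances and one for transitions within running instances, can be sketched as follows. The queue names, the `start` flag, and the drain function are assumptions for illustration only.

```python
import itertools

# Hypothetical sketch of the two-queue split: the first queue carries only
# messages that start new process instances; the second carries only step
# transitions within already-running instances.
new_instance_queue = []
transition_queue = []
_ids = itertools.count(1)
instances = {}  # instance id -> list of steps executed so far

def publish(ptd):
    # Route by message kind: instance-starting PTDs vs. step transitions.
    (new_instance_queue if ptd.get("start") else transition_queue).append(ptd)

def drain():
    while new_instance_queue:
        ptd = new_instance_queue.pop(0)
        iid = f"inst-{next(_ids)}"
        instances[iid] = [ptd["step"]]               # new process instance
        publish({"step": "next", "instance": iid})   # first transition
    while transition_queue:
        ptd = transition_queue.pop(0)
        instances[ptd["instance"]].append(ptd["step"])

publish({"start": True, "step": "begin"})
drain()
print(instances)  # {'inst-1': ['begin', 'next']}
```

Separating the two message kinds lets instance creation be throttled or load-balanced independently of in-flight transitions, which is one plausible motivation for the two-queue design.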