In addition to the hard benefits of improving employee productivity and reducing cycle time, BPMS offers the strategic benefit of agility, meaning enhanced responsiveness to the continual shifts in both the competitive landscape and the regulatory compliance environment. BPMS fosters agility by allowing new cross-functional process solutions to be developed and deployed quickly, and by enabling the rules that drive them to change with minimal development effort. The efficiency/cycle time value proposition dates all the way back to the workflow software of the 1980s, which succeeded in transforming slow, paper-driven manual processes into much faster automated processes by routing data and documents electronically over the network. But workflow technology vendors rarely talked about agility. Today, with BPMS, agility sometimes seems to be the main thing vendors talk about. What changed?
The major change has been the way process management software integrates with external business systems. In the workflow era, the major focus was on automating human activity. Application integration was de-emphasized, because, in fact, it didn’t work very well. If you went to a workflow conference or trade show, any vendor could show you how to build a workflow in 30 minutes using no programming, just graphical drag-and-drop design. So once you bought the technology, why did it take six to twelve months to design and deploy real-world workflows?
The problem was application integration, which did take programming – lots of it – almost all of it point-to-point code linking key business systems with individual client activities in the workflow. In the early 1990s, when most enterprise applications were mainframe- or minicomputer-based, workflow did actually provide some integration without programming. A common technique leveraged screen-scraping technology that could copy and paste data between the workflow client window and a host terminal emulation window on the desktop. But over the next decade, as packaged client-server applications for ERP, CRM, accounting, HR, and supply chain proliferated, application integration heaped an ever-larger programming burden on workflow solution developers. Workflow software was great at accelerating human-centered processes, but when application integration was required, you couldn’t call it agile.
The answer to this agility problem came not from workflow but from message broker middleware, which came to be known as enterprise application integration (EAI). EAI provided two things: application adapters, software components that could translate the APIs and data objects of diverse enterprise applications into a common message format, and an enterprise message bus that could transform and route those messages between applications. With this technology, a change to customer information in the ERP system could be instantly propagated to the CRM system. And the integration could be done much more quickly than with custom coding, hence increased agility.
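To make that pattern concrete, here is a minimal Java sketch of the EAI idea. The MessageBus, CanonicalMessage, and adapter classes are invented stand-ins for illustration, not any vendor's actual API: the ERP adapter translates a native customer record into a canonical message and publishes it to the bus, and the CRM adapter subscribes to that message type and translates it back.

```java
// Hypothetical EAI-style integration sketch. MessageBus, CanonicalMessage, and the two
// adapters are invented stand-ins, not any vendor's actual API.
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

class CanonicalMessage {
    final String type;                                    // e.g. "CustomerChanged"
    final Map<String, String> fields = new HashMap<>();
    CanonicalMessage(String type) { this.type = type; }
}

class MessageBus {
    private final Map<String, Consumer<CanonicalMessage>> subscribers = new HashMap<>();
    void subscribe(String type, Consumer<CanonicalMessage> handler) { subscribers.put(type, handler); }
    void publish(CanonicalMessage msg) {                  // route by message type
        Consumer<CanonicalMessage> handler = subscribers.get(msg.type);
        if (handler != null) handler.accept(msg);
    }
}

class ErpCustomerAdapter {                                // source-side adapter
    private final MessageBus bus;
    ErpCustomerAdapter(MessageBus bus) { this.bus = bus; }

    void onCustomerChanged(String erpId, String name, String address) {
        CanonicalMessage msg = new CanonicalMessage("CustomerChanged");
        msg.fields.put("customerId", erpId);              // translate native ERP fields
        msg.fields.put("name", name);                     // into the common message format
        msg.fields.put("address", address);
        bus.publish(msg);
    }
}

class CrmCustomerAdapter {                                // target-side adapter
    CrmCustomerAdapter(MessageBus bus) {
        bus.subscribe("CustomerChanged", msg -> updateCrm(
            msg.fields.get("customerId"), msg.fields.get("name"), msg.fields.get("address")));
    }
    private void updateCrm(String id, String name, String address) {
        System.out.println("CRM updated: " + id);         // stand-in for the real CRM API call
    }
}
```

The point of the hub-and-spoke design is that wiring in another target system means writing one more adapter and subscription, not another piece of point-to-point code, which is where the agility claim comes from.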
While EAI vendors talked about “processes,” they really meant these simple data exchanges, lasting milliseconds, not end-to-end processes lasting days or weeks. But how hard could it be to add a true process engine and facilities for simple human workflow, such as management review and approval?
Around two or three years ago, BPMS arose as the fusion of workflow and EAI and quickly replaced the old marketing terminology on both sides. BPMS could promise both efficiency and agility, the ability to both automate human-centric process activities and integrate enterprise applications without custom point-to-point code. Inevitably it put workflow vendors and integration vendors on a collision course.
Today, at the brochure level, BPMS offerings from both traditions generally promise the same set of capabilities, and their component block diagrams even look the same. But the common feature set and value proposition actually mask important architectural differences between them. While rarely mentioned by industry analysts, these differences underlie the debate over BPM standards, and they will definitely impact the course of BPMS market evolution. For that reason, users need to understand them and take them into consideration in the BPMS buying decision.
The essential difference comes down to what a process activity is, and what the process engine conceptually does. On one side is workflow architecture, which is generally based on the reference model of the Workflow Management Coalition (WfMC) and its associated process definition language XPDL. While the workflow diagram represents the process as a flow of activities, in this architecture the process engine actually routes the set of process data elements – called the process instance or “work item” – to a sequence of queues. Each queue is accessed by a separate client program that implements the assigned process activity. The client executable retrieves the process instance from the process engine, executes the process activity, and returns a modified instance back to the process engine.
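As a thumbnail illustration, the Java sketch below shows what one interactive activity looks like from the client side under this architecture. The WorkflowEngine and WorkItem interfaces are invented for illustration, not the WfMC interfaces or any vendor's product API; the point is the shape of the interaction: fetch the work item from a queue, perform the activity, return the item to the engine.

```java
// Hypothetical interfaces standing in for a workflow engine's client API;
// not WfMC-defined and not any vendor's actual product interface.
interface WorkItem {
    String get(String field);
    void set(String field, String value);
}

interface WorkflowEngine {
    WorkItem fetchNext(String queueName);   // retrieve the next process instance on a queue
    void complete(WorkItem item);           // return the (modified) instance to the engine
}

// One interactive activity: a client program that services a shared queue.
class ClaimReviewClient {
    private final WorkflowEngine engine;
    ClaimReviewClient(WorkflowEngine engine) { this.engine = engine; }

    void serviceQueue() {
        WorkItem item = engine.fetchNext("claims-review");     // 1. pull the work item from the queue
        item.set("reviewDecision", presentToReviewer(item));   // 2. perform the assigned activity
        engine.complete(item);                                  // 3. hand it back; the engine routes it onward
    }

    private String presentToReviewer(WorkItem item) {
        // In a real product this is a process-aware web form; stubbed for the sketch.
        return "approved";
    }
}
```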
This architecture is well suited to human interaction. Queues can be easily shared by workgroups, and each activity is inherently long-running. For human interaction the client program is typically a process-aware web application that can readily display or key-enter work item data using electronic forms. If the activity does not require human interaction but only integration with an external business system, workflow architecture implements it as an “automated step.” An automated step is a client program that retrieves the work item, transforms selected data elements into the format required by the external application, executes the integration action via an adapter or other EAI middleware, transforms the returned data back to the work item data format, and finally returns the work item to the process engine. To the process engine, however, automated steps and interactive steps look basically the same.
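Continuing with the same hypothetical interfaces from the sketch above, an automated step is just another queue client; the only difference is that the “activity” wraps an adapter call rather than a user interaction. The ErpAdapter and its record type are likewise invented for illustration.

```java
// An automated step: to the engine it is just another queue client,
// but internally it wraps an EAI adapter call instead of a user interaction.
class UpdateCustomerStep {
    private final WorkflowEngine engine;
    private final ErpAdapter erp;                 // hypothetical adapter for the ERP system
    UpdateCustomerStep(WorkflowEngine engine, ErpAdapter erp) {
        this.engine = engine;
        this.erp = erp;
    }

    void serviceQueue() {
        WorkItem item = engine.fetchNext("erp-update");

        // Transform work item fields into the record format the ERP API expects
        ErpCustomerRecord request = new ErpCustomerRecord(item.get("customerName"), item.get("customerAddress"));

        // Execute the integration action through the adapter
        ErpCustomerRecord response = erp.updateCustomer(request);

        // Map the returned data back into work item fields and hand the item back
        item.set("erpCustomerId", response.customerId);
        engine.complete(item);
    }
}

// Minimal supporting types for the sketch
class ErpCustomerRecord {
    final String name, address;
    String customerId;
    ErpCustomerRecord(String name, String address) { this.name = name; this.address = address; }
}

interface ErpAdapter {
    ErpCustomerRecord updateCustomer(ErpCustomerRecord record);
}
```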
On the other side is service orchestration, which is based on service-oriented architecture (SOA) and its associated process definition language from OASIS, called BPEL. In BPEL, each process activity represents an invoked service, i.e., an action performed by an external system upon receipt of a request message, usually returning a response message. In other words, the process engine does not route work items to queues; it exchanges request and response messages with service endpoints, typically URLs.
This architecture emphasizes integration more than human interaction. Data transformation and API calls on external systems are specified in the BPEL model and performed by the process engine directly. Sequences of process activities can be coordinated as transactions and rolled back or committed as an atomic unit. On the other hand, human users are not easily addressed as service endpoints. To deal with human interaction, BPMS vendors following the service orchestration model typically provide a task manager service that the process engine can invoke to create interactive tasks, and which notifies the process engine when a task is complete. Queues, roles, and other workflow constructs can be defined and implemented by this service, safely removed from the BPEL process engine.
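A comparable Java sketch of the orchestration side follows. The interfaces here (CreditService, TaskManagerService, and the process class itself) are hypothetical, not BPEL constructs or any vendor's API; they only illustrate the division of labor just described: the engine invokes integration services directly, and delegates human steps to a task manager service that calls back when the task is complete.

```java
import java.util.function.Consumer;

// Hypothetical service-orchestration step: the engine sends request messages to
// service endpoints and delegates human work to a task manager service.
class OrderProcess {
    private final CreditService creditService;     // stand-in for a service endpoint (e.g. a URL)
    private final TaskManagerService taskManager;  // stand-in for the human-task service

    OrderProcess(CreditService creditService, TaskManagerService taskManager) {
        this.creditService = creditService;
        this.taskManager = taskManager;
    }

    void run(String customerId, double amount) {
        // Integration activity: data transformation and the service call happen in the engine
        boolean approved = creditService.checkCredit(customerId, amount);

        if (approved) {
            finish("auto-approved");
        } else {
            // Human activity: create a task and register a callback; queues, roles, and
            // escalation live inside the task manager, opaque to the orchestration engine
            taskManager.createTask("manual-credit-review", customerId, this::finish);
        }
    }

    private void finish(String decision) {
        System.out.println("Order completed: " + decision);  // continue the process from here
    }
}

interface CreditService {
    boolean checkCredit(String customerId, double amount);
}

interface TaskManagerService {
    void createTask(String taskType, String subject, Consumer<String> onComplete);
}
```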
In other words, workflow architecture defines human interaction details in the process model and executes them on the process engine, but leaves integration outside the model and opaque to the engine. Service-oriented architecture defines integration details in the process model and executes them on the process engine, but leaves human interaction outside and opaque to the engine. Of course, modern BPMS offerings based on either architecture provide both human interaction and application integration, and try to bring them together in their design environment.
While SOA and BPEL have grabbed all the media and analyst attention, far more BPMS products today are based on the workflow architecture. One reason is that the center of gravity in BPMS demand remains streamlining human work, not agile business integration, and the XPDL-based products simply have richer functionality there, and have made their integration capabilities “agile enough.” While workflow-centric offerings like FileNet, Savvion, and Fuego keep the focus on the human element, they accommodate service orchestration by allowing BPEL subprocesses to be embedded in an end-to-end workflow.
Another reason is that service orchestration is really an infrastructure game that only the big boys can play. So it’s no surprise that the BPMS offerings on that side of the ledger come from names like IBM, Microsoft, Oracle, and SAP. As BPMS moves from a point solution to true enterprise infrastructure, the service orchestration side has an immense advantage. Undoubtedly that’s why the analysts all assume BPEL will “win.” But it’s not really clear that BPMS is what’s driving BPEL evolution in the standards committees. SOA is not just about BPMS. It’s also a next-generation agile programming paradigm, often called composite applications, and that – not BPM – appears to be closer to the heart of the BPEL technical committee.
Agile integration is important, but it’s not the only goal of BPM. From day one, BPM has sought to make process design directly accessible to business analysts, but today’s BPEL process models are still gobbledygook to all but Java geeks and SOAP jockeys. For BPEL to win the hearts and minds of the BPM community, that has to change. In other areas as well, from human interaction to content management to business rules and performance management, the workflow-centric offerings are currently ahead.
But these are still early days for BPMS. The head-to-head architectural battle has yet to begin. In 2006, expect it to come to the fore.