Web Services' Impact on Business Process Management

Web services have been advertised as the one-size-fits-all solution for all sorts of integration problems. But like any other innovation in the software industry, Web services require a new generation of tools and infrastructure to help companies overcome the adoption hurdle and take full advantage of the business benefits this technology promises.

In this article I take a closer look at Web services' impact on business process management (BPM) and identify the requirements for a new generation of BPM products that fully leverage Web services and address the resulting architectural changes.

BPM: Approaches and Trends
Over the past 10 years, different generations of business process management systems have emerged in the marketplace to solve different problems. These systems include:

  • Workflows within packaged applications
  • Document management workflows
  • EAI-based BPM
  • B2B-based BPM

There are a number of trends in the marketplace today that alter the landscape for application integration and business process management including:

  • Convergence of application development and application integration
  • Convergence of B2B and back-end integration
  • Emergence of standards-based back-end integration
  • Maturity of J2EE application servers
  • Web services innovation

The worlds of application integration and application development were separated because traditionally, different vendors provided these two solutions. The pure integration vendors, in either the EAI or B2B markets, focused their efforts on integrating existing systems and applications, investing little effort in supporting new application development. They relied on application platform vendors such as BEA, Microsoft, and IBM to address application development. According to Gartner (December, 2001):

"Through 2006, at least 75 percent of Web services deployed by Global 2000 enterprises will have been implemented through integration of new developments and pre-existing applications (0.7 probability)."

Developers will be looking for a single platform to develop new applications and integrate existing ones. Web services adoption will help accelerate the convergence of these two worlds by providing a standards-based method to wrap existing applications and business objects for maximum reusability and integration. As a result, several new requirements for BPM products have emerged:

  • Ability to call back-end and external Web Services
  • Ability to expose a business process as a Web service
  • Seamless integration and reusability of existing business objects (e.g., EJBs)
  • Support for the development and execution of in-line application code
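To make the last two requirements concrete, here is a minimal Java sketch of reusing an existing business object behind a service interface and executing in-line application code between service calls. The class, interface, and method names are purely illustrative assumptions, not a real BPM or J2EE API.

```java
// Illustrative sketch (not a real BPM API): a process step mixes a call
// to a Web service stub with in-line application code.
public class InlineLogicSketch {
    // Stand-in for a generated client stub of an external Web service
    interface CreditCheckService {
        boolean isCreditworthy(String customerId);
    }

    // In-line application code embedded in the process definition
    public static String decide(boolean creditOk) {
        return creditOk ? "ACCEPT" : "REJECT";
    }

    public static void main(String[] args) {
        // A trivial local stub standing in for the remote service
        CreditCheckService credit = id -> id.startsWith("CUST");
        System.out.println(decide(credit.isCreditworthy("CUST-42")));
    }
}
```

In a real deployment the stub would be generated from the service's WSDL, but the shape of the interaction is the same: call the service, then run in-line logic on the result.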

Organizations that invested in the J2EE platform will seek a smooth ramp between the underlying J2EE APIs, the application logic stack, and the business process layer. Companies will also look at the deployment aspects of the platform and evaluate the BPM/integration solution in terms of scalability, reliability, recoverability, and manageability. In the context of an integrated platform, the challenge will be to provide the proper level of abstraction and the tools to the right users (see Figure 1).

The enterprise developer who implements business objects and system-level code will be working at the J2EE API layer. The developer will need mechanisms to expose these objects to the layer above, potentially as Web services. At the business application layer, the application developer will need an easy way to assemble these Web services - typically accomplished by writing some procedural application logic or by using a BPM state machine. That developer will have to be sheltered from the complexity of the underlying J2EE layer and from the technical details of assembling Web services. At the integration level, the business analyst defines coarse-grained business processes that use the services provided by the underlying layers as well as back-end resources within the enterprise, and services provided by partners over the Web.

Convergence of Back-End Integration and B2B
The problems of integrating internal applications on the one hand and business partners on the other grew out of two different environments. The first is characterized by a controlled environment with full access to back-end resources, based on messaging systems, short-running transactions, fine-grained access to data, and binary-level transformations. The second is characterized by support for XML standards, Internet protocols as the communication channel, long-running processes and transactions, public processes exposed to partners, and trading partner management. A number of lessons can be learned from the implementation of the first generation of B2B integration solutions.

First, systems and applications developed by different groups on different platforms cannot be tightly coupled; otherwise, changes in the implementation of any of the systems involved will propagate throughout the architecture, making it unmanageable. This breakdown is one of the reasons why the EAI paradigm is insufficient for integration across partners. With Web services, you can integrate applications based on a public contract that describes the XML messages for applications to exchange, while leaving the underlying implementation details to each application. As long as applications honor their contract they can change at will without breaking the integration.
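A public contract of this kind can be sketched as a minimal, hypothetical WSDL-style fragment. Only the message shapes are shared between partners; every name below is invented for illustration, and a real contract would also define types, bindings, and endpoints.

```xml
<!-- Hypothetical contract: partners agree only on the message shapes.
     Each side may implement them on any platform and change its
     internals freely without breaking the integration. -->
<definitions name="QuoteService"
             xmlns="http://schemas.xmlsoap.org/wsdl/">
  <message name="QuoteRequest">
    <part name="body" element="QuoteRequestDocument"/>
  </message>
  <message name="QuoteResponse">
    <part name="body" element="QuoteResponseDocument"/>
  </message>
  <portType name="QuotePortType">
    <operation name="requestQuote">
      <input message="QuoteRequest"/>
      <output message="QuoteResponse"/>
    </operation>
  </portType>
</definitions>
```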

Second, the communication paradigm must be coarse-grained because of the high cost of communicating among loosely coupled systems using WAN or the Internet. Applications ought to maximize return on the cost of opening and using the communication channel by passing around larger pieces of information in the form of an XML business document. By integrating at a business level, Web services will allow greater flexibility when the underlying implementation changes.

Third, the communication ought to be asynchronous because you can't rely on other systems, especially legacy applications, to be 100% reliable. Moreover, the application shouldn't be dependent on the response time of another system whose response time may be inconsistent. These architectural changes are profound and will require a new generation of BPM to be designed around them - rather than simply being evolved from previous models.
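The asynchronous pattern can be sketched with a simple in-memory queue standing in for the messaging system. The class and method names are illustrative assumptions; a real deployment would use a durable message broker rather than an in-process queue.

```java
// Sketch of asynchronous, loosely coupled communication: the sender
// enqueues a document and returns immediately, never blocking on a
// possibly slow or unavailable remote system.
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class AsyncMessagingSketch {
    // Fire-and-forget send: returns at once whether or not the
    // receiver is currently up.
    public static boolean send(BlockingQueue<String> channel, String doc) {
        return channel.offer(doc);
    }

    // The receiver drains the channel on its own schedule, insulated
    // from the sender's timing.
    public static String receive(BlockingQueue<String> channel) {
        try {
            return channel.poll(1, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return null;
        }
    }

    public static void main(String[] args) {
        BlockingQueue<String> channel = new ArrayBlockingQueue<>(10);
        send(channel, "<QuoteRequest>...</QuoteRequest>");
        System.out.println(receive(channel));
    }
}
```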

Emergence of Standards-Based Integration
Integration is by far one of the most expensive entries in any CIO's ledger, often due to the proprietary nature of back-end systems and applications. XML was the first step toward standardizing this space, as it unleashed data from proprietary binary formats into a standards-based representation. Unfortunately, XML as a data representation alone is not enough. Access to application functionality requires a way to describe the methods available, a mechanism to discover these methods, and a mechanism to access the resulting data. Web services hold this promise.

When XML emerged and started to gain credibility in the marketplace, BPM and integration companies rushed to add XML support to their products. Most BPM solutions provided a way to consume and produce XML. Many of these products were architected long before XML was introduced; therefore, the architecture wasn't designed to handle its extensible nature. This approach falls short as the number of different XML standards increases. Moreover, the scope of XML standards is quickly moving beyond the mere description and exchange of data.

XML is now pervasive in the architecture and is used to describe services (WSDL), Web service registry (UDDI), business processes (BPML, Xlang), and sequences of public business events and processes (ebXML). For these reasons, BPM engines must provide mechanisms to cope with XML extensibility at their core, so it's possible to support multiple XML standards without requiring changes in the product. At the periphery, BPMs must support XML transformations, definition of public business processes, and interaction with multiple Web services in both synchronous and asynchronous fashions. The new generation BPM requires mechanisms to integrate with enterprise-class Web services, such as the transformation of XML messages, introspection of WSDL definitions, processing and dispatching of SOAP calls, management of correlation IDs, and the state associated with multiple conversations with multiple Web services. The combination of Web services and BPM provides developers with the business-level programming paradigm that allows organizations to build what analysts describe as "composite business applications."

To better understand the architectural and technical implications of the execution of Web services within a business process engine, let's consider an example in which an insurance company aims to expose an underwriting business process to its subsidiaries using Web services. We'll use a fictitious, simplified version of the actual business process (see Figure 2).

1. The quote request comes in through a Web service interface.

2. The request is queued; when ready, BPM starts the proper business process.

3. BPM starts two parallel tasks: it sends an asynchronous request to a back-end application to check if the customer has any history of interaction with the insurance company and it sends a request to a Web service offered by the Department of Motor Vehicles to check the history of the vehicle to be insured (see Figure 3).

4. When the responses come back asynchronously BPM decides if the request can be accepted by executing in-line Java logic.

5. If the request for quote is accepted, BPM invokes an EJB that computes the policy price based on the input parameters.

6. BPM sends the quote back to the requesting client.
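Steps 3 through 5 above can be sketched in Java, using futures to model the two parallel checks that are then joined by in-line decision logic. All service calls are local stubs and every name is an illustrative assumption, not the article's actual BPM engine.

```java
// Sketch of the underwriting example: two parallel checks (back-end
// customer history and the DMV Web service) joined by in-line logic.
import java.util.concurrent.CompletableFuture;

public class UnderwritingSketch {
    // Stub for the asynchronous back-end customer-history check (step 3)
    static boolean checkCustomerHistory(String customerId) { return true; }

    // Stub for the DMV vehicle-history Web service (step 3)
    static boolean checkVehicleHistory(String vin) { return true; }

    // In-line decision logic (step 4)
    public static String decide(boolean historyOk, boolean vehicleOk) {
        return (historyOk && vehicleOk) ? "QUOTE" : "DECLINE";
    }

    public static void main(String[] args) {
        // Launch both checks in parallel
        CompletableFuture<Boolean> history =
            CompletableFuture.supplyAsync(() -> checkCustomerHistory("CUST-42"));
        CompletableFuture<Boolean> vehicle =
            CompletableFuture.supplyAsync(() -> checkVehicleHistory("VIN-123"));

        // Join the parallel branches, then decide whether to quote (step 5)
        System.out.println(decide(history.join(), vehicle.join()));
    }
}
```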

There are two main requirements for the BPM design environment in this example. The first is to provide the user with a business-level abstraction of the services offered by the Web services involved while hiding the low-level implementation details. The second is to keep the business process definition independent from the actual Web services with which it interacts. Over time, the same business process will have to interface with alternative Web service implementations (e.g., the credit check could be performed via a cheaper or more efficient Web service offered by a different credit agency). It must be possible to swap Web services without changing the business process definition, as long as the semantics of the service methods and schemas are equivalent. At design time, the BPM tool must be able to load the WSDL definition of a Web service and allow the user to select which Web service methods to call. The tool must be able to generate the WSDL definition for those services that are exposed to external entities. It should also allow the user to introspect the schema of the document payload and provide a mechanism to transform it into the schemas that other Web services require.

There will be many cases when Web services won't be transactional, which requires the BPM to provide the user with mechanisms to model compensating transactions when something goes wrong in between multiple invocations of non-transactional Web services. In this example, an adapter is used to get at the data and functions of the back-end application. The adapter interfaces are wrapped with a Web service so that the BPM has a consistent metaphor and interfaces to access the various entities involved in the business process (see Figure 4).
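A compensating-transaction mechanism of this kind can be sketched as a stack of undo actions, one registered per non-transactional service invocation; on failure, the actions run in reverse order. The API below is a hypothetical illustration, not a real BPM interface.

```java
// Sketch of compensating transactions: each non-transactional Web
// service call registers how to undo its effect; compensate() replays
// the undo actions in reverse (last-in, first-out) order.
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

public class CompensationSketch {
    private final Deque<Runnable> compensations = new ArrayDeque<>();
    public final List<String> log = new ArrayList<>();

    // Invoke a service and remember its compensation action
    public void invoke(String service, Runnable compensation) {
        log.add("call:" + service);
        compensations.push(compensation);
    }

    // Run registered compensations in reverse order of invocation
    public void compensate() {
        while (!compensations.isEmpty()) {
            compensations.pop().run();
        }
    }

    public static void main(String[] args) {
        CompensationSketch process = new CompensationSketch();
        process.invoke("reserveVehicle", () -> process.log.add("undo:reserveVehicle"));
        process.invoke("bindPolicy", () -> process.log.add("undo:bindPolicy"));
        process.compensate(); // simulate a downstream failure
        System.out.println(process.log);
    }
}
```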

Beyond the design time, there are several high-level implications for the BPM at run time. The BPM must provide a mechanism to accept SOAP calls, marshal the SOAP headers and content into the internal data representation, potentially apply a data transformation, and start the proper business process(es). In this example, the interactions with the various Web services are asynchronous. Therefore, each conversation started by BPM must generate a correlation ID and use it to send the proper response to the proper requester later. If an external Web service initiates the request, then BPM needs to store the external correlation ID to correlate the response properly later.

For most business scenarios, the conversations between BPM and other entities via Web services will probably involve multiple exchanges of messages that have state associated with them. For each conversation instance, the BPM run-time will have to transparently and recoverably manage the associated state. For each invocation of a Web service within the boundaries of a defined long-running transaction, the BPM engine has to store the information it needs to compensate for the effect of the Web service invocation. When a message has to be transformed from one format to another as it's passed across different Web services, the BPM engine needs to provide a highly efficient transformation engine.
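Correlation-ID and conversation-state management can be sketched as follows. All names are illustrative assumptions; a production engine would persist this state recoverably rather than holding it in memory.

```java
// Sketch of correlation-ID management: each outbound request gets a
// fresh ID keyed to its conversation state, so the matching
// asynchronous response can be routed back to the right requester.
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

public class CorrelationSketch {
    private final Map<String, String> conversations = new HashMap<>();

    // Start a conversation: record state under a fresh correlation ID
    public String startConversation(String state) {
        String correlationId = UUID.randomUUID().toString();
        conversations.put(correlationId, state);
        return correlationId;
    }

    // Route an asynchronous response back to its conversation state
    public String onResponse(String correlationId) {
        return conversations.remove(correlationId);
    }

    public static void main(String[] args) {
        CorrelationSketch engine = new CorrelationSketch();
        String id = engine.startConversation("quote-request-42");
        System.out.println(engine.onResponse(id));
    }
}
```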

Web services offer the promise of a single solution for integration across multiple enterprises. However, there are still many areas where there is opportunity for growth and enhancement. This article has identified and discussed the requirements that would ensure a fully successful new generation of BPM systems using Web services.

More Stories By Vittorio Viarengo

Vittorio Viarengo is senior director of product management with BEA Systems, a leading application infrastructure company with more than 13,000 customers around the world. Viarengo is responsible for the direction of BEA WebLogic Integration, business process management, and the Web services development framework. [email protected]
