
Microservices Expo: Article

Web Services' Impact on Business Process Management


Web services have been advertised as the one-size-fits-all solution for all sorts of integration problems. But like any other innovation in the software industry, Web services require a new generation of tools and infrastructure to help companies overcome the adoption hurdle and take full advantage of the business benefits this technology promises.

In this article I take a closer look at Web services' impact on business process management (BPM) and identify the requirements for a new generation of BPM products that fully leverage Web services and address the resulting architectural changes.

BPM: Approaches and Trends
Over the past 10 years, different generations of business process management systems have emerged in the marketplace to solve different problems. These systems include:

  • Workflows within packaged applications
  • Document management workflows
  • EAI-based BPM
  • B2B-based BPM

There are a number of trends in the marketplace today that alter the landscape for application integration and business process management, including:

  • Convergence of application development and application integration
  • Convergence of B2B and back-end integration
  • Emergence of standards-based back-end integration
  • Maturity of J2EE application servers
  • Web services innovation

The worlds of application integration and application development have traditionally been separate because different vendors provided the two solutions. The pure integration vendors, in either the EAI or B2B markets, focused their efforts on integrating existing systems and applications, investing little effort in supporting new application development. They relied on application platform vendors such as BEA, Microsoft, and IBM to address application development. According to Gartner (December 2001):

"Through 2006, at least 75 percent of Web services deployed by Global 2000 enterprises will have been implemented through integration of new developments and pre-existing applications (0.7 probability)."

Developers will be looking for a single platform to develop new applications and integrate existing ones. Web services adoption will help accelerate the convergence of these two worlds by providing a standards-based method to wrap existing applications and business objects for maximum reusability and integration. As a result, several new requirements for BPM products have emerged:

  • Ability to call back-end and external Web services
  • Ability to expose a business process as a Web service
  • Seamless integration and reusability of existing business objects (e.g., EJBs)
  • Support for the development and execution of in-line application code

Organizations that invested in the J2EE platform will seek a smooth ramp between the underlying J2EE APIs, the application logic stack, and the business process layer. Companies will also look at the deployment aspects of the platform and evaluate the BPM/integration solution in terms of scalability, reliability, recoverability, and manageability. In the context of an integrated platform, the challenge will be to provide the proper level of abstraction and the tools to the right users (see Figure 1).

The enterprise developer who implements business objects and system-level code will be working at the J2EE API layer. The developer will need mechanisms to expose these objects to the layer above, potentially as Web services. At the business application layer, the application developer will need an easy way to assemble these Web services - typically accomplished by writing some procedural application logic or by using a BPM state machine. That developer will have to be sheltered from the complexity of the underlying J2EE layer and from the technical details of assembling Web services. At the integration level, the business analyst defines coarse-grained business processes that use the services provided by the underlying layers as well as back-end resources within the enterprise, and services provided by partners over the Web.

Convergence of Back-End Integration and B2B
The problem of integrating internal applications on the one hand and business partners on the other has grown out of two different environments. The first is characterized by a controlled environment with full access to back-end resources, based on messaging systems, short-running transactions, fine-grained access to data, and binary-level transformations. The second is characterized by support for XML standards, Internet protocols as the communication channel, long-running processes and transactions, public processes exposed to partners, and trading partner management. A number of lessons can be learned from the implementation of the first generation of B2B integration solutions.

First, systems and applications developed by different groups on different platforms cannot be tightly coupled; otherwise, changes in the implementation of any of the systems involved will propagate throughout the architecture, making it unmanageable. This breakdown is one of the reasons why the EAI paradigm is insufficient for integration across partners. With Web services, you can integrate applications based on a public contract that describes the XML messages for applications to exchange, while leaving the underlying implementation details to each application. As long as applications honor their contract they can change at will without breaking the integration.
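To make the contract idea concrete, here is a minimal Python sketch (all names are illustrative, not an API from the article): two interchangeable implementations honor the same public contract, so the calling process never depends on either one's internals.

```python
from abc import ABC, abstractmethod

# Hypothetical public contract: any credit-check service must accept and
# return dictionaries with these fields; the implementation behind the
# contract is free to change without breaking the integration.
class CreditCheckContract(ABC):
    @abstractmethod
    def check(self, request: dict) -> dict:
        """request: {'customer_id': str} -> {'customer_id': str, 'approved': bool}"""

class InHouseCreditCheck(CreditCheckContract):
    def check(self, request: dict) -> dict:
        # Internal implementation details can change at will.
        return {"customer_id": request["customer_id"], "approved": True}

class PartnerCreditCheck(CreditCheckContract):
    def check(self, request: dict) -> dict:
        # A partner's service honoring the same public contract.
        return {"customer_id": request["customer_id"],
                "approved": len(request["customer_id"]) > 3}

def run_process(service: CreditCheckContract, customer_id: str) -> bool:
    # The business process depends only on the contract, never on a
    # particular implementation.
    return service.check({"customer_id": customer_id})["approved"]
```

Either implementation can be swapped in without the process noticing, which is exactly the loose coupling the public contract buys.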

Second, the communication paradigm must be coarse-grained because of the high cost of communicating among loosely coupled systems using WAN or the Internet. Applications ought to maximize return on the cost of opening and using the communication channel by passing around larger pieces of information in the form of an XML business document. By integrating at a business level, Web services will allow greater flexibility when the underlying implementation changes.

Third, the communication ought to be asynchronous because you can't rely on other systems, especially legacy applications, to be 100% reliable. Moreover, the application shouldn't be dependent on the response time of another system whose response time may be inconsistent. These architectural changes are profound and will require a new generation of BPM to be designed around them - rather than simply being evolved from previous models.
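The asynchronous style described above can be sketched in Python as a toy illustration (the in-process queue stands in for a real messaging system, and the sleeping function stands in for an unpredictable legacy back end):

```python
import queue
import threading
import time

# Caller enqueues requests and moves on; a worker drains the queue at the
# back end's own pace, so the caller never blocks on its response time.
requests = queue.Queue()
responses = {}

def slow_backend(msg: dict) -> str:
    time.sleep(0.01)  # stand-in for an unpredictable legacy system
    return f"processed:{msg['id']}"

def worker():
    while True:
        msg = requests.get()
        if msg is None:  # shutdown sentinel
            break
        responses[msg["id"]] = slow_backend(msg)

t = threading.Thread(target=worker)
t.start()
for i in range(3):
    requests.put({"id": i})  # the caller is never blocked on the back end
requests.put(None)
t.join()
```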

Emergence of Standards-Based Integration
Integration is one of the most expensive entries by far in any CIO's ledger, often due to the proprietary nature of back-end systems and applications. XML was the first step toward standardizing this space as it unleashed data from proprietary binary formats into a standards-based data representation. Unfortunately, XML as a data representation alone is not enough. Access to application functionality requires a way to describe the methods available, a mechanism to discover these methods, and a mechanism to access the resulting data. Web services hold this promise.

When XML emerged and started to gain credibility in the marketplace, BPM and integration companies rushed to add XML support to their products. Most BPM solutions provided a way to consume and produce XML. Many of these products were architected long before XML was introduced; therefore, the architecture wasn't designed to handle its extensible nature. This approach falls short as the number of different XML standards increases. Moreover, the scope of XML standards is quickly moving beyond the mere description and exchange of data.

XML is now pervasive in the architecture and is used to describe services (WSDL), Web service registries (UDDI), business processes (BPML, XLANG), and sequences of public business events and processes (ebXML). For these reasons, BPM engines must provide mechanisms to cope with XML extensibility at their core, so it's possible to support multiple XML standards without requiring changes in the product. At the periphery, BPMs must support XML transformations, the definition of public business processes, and interaction with multiple Web services in both synchronous and asynchronous fashions. The new generation of BPM requires mechanisms to integrate with enterprise-class Web services, such as the transformation of XML messages, introspection of WSDL definitions, processing and dispatching of SOAP calls, management of correlation IDs, and the state associated with multiple conversations with multiple Web services. The combination of Web services and BPM provides developers with a business-level programming paradigm that allows organizations to build what analysts describe as "composite business applications."
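The correlation-ID bookkeeping mentioned above might look like the following Python sketch (a hypothetical in-memory version; a real BPM engine would persist this state recoverably):

```python
import uuid

# Each outbound conversation gets a correlation ID; when asynchronous
# responses arrive later, the ID routes them back to the right conversation.
class ConversationManager:
    def __init__(self):
        self._pending = {}  # correlation_id -> conversation state

    def start(self, process_name: str) -> str:
        cid = str(uuid.uuid4())
        self._pending[cid] = {"process": process_name, "responses": []}
        return cid

    def on_response(self, cid: str, payload: str) -> dict:
        # Look up the conversation by its correlation ID and record the reply.
        state = self._pending[cid]
        state["responses"].append(payload)
        return state

mgr = ConversationManager()
cid = mgr.start("underwriting")
state = mgr.on_response(cid, "dmv:clean")
```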

To better understand the architectural and technical implications of the execution of Web services within a business process engine, let's consider an example in which an insurance company aims to expose an underwriting business process to its subsidiaries using Web services. We'll use a fictitious, simplified version of the actual business process (see Figure 2).

1. The quote request comes in through a Web service interface.

2. The request is queued; when ready, BPM starts the proper business process.

3. BPM starts two parallel tasks: it sends an asynchronous request to a back-end application to check whether the customer has any history of interaction with the insurance company, and it sends a request to a Web service offered by the Department of Motor Vehicles to check the history of the vehicle to be insured (see Figure 3).

4. When the responses come back asynchronously, BPM decides whether the request can be accepted by executing in-line Java logic.

5. If the request for quote is accepted, BPM invokes an EJB that computes the policy price based on the input parameters.

6. BPM sends the quote back to the requesting client.
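Steps 3 through 6 of the process above can be sketched as follows (a Python toy with every service call stubbed out; all names are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def check_customer_history(customer_id: str) -> bool:
    return True  # stand-in for the asynchronous back-end application

def check_vehicle_history(vin: str) -> bool:
    return True  # stand-in for the DMV Web service

def compute_policy_price(base: float, clean_record: bool) -> float:
    return base if clean_record else base * 1.5  # stand-in for the pricing EJB

def underwriting_process(customer_id: str, vin: str, base: float):
    # Step 3: launch the two checks in parallel.
    with ThreadPoolExecutor() as pool:
        history = pool.submit(check_customer_history, customer_id)
        vehicle = pool.submit(check_vehicle_history, vin)
        # Step 4: in-line logic decides whether to accept the request.
        accepted = history.result() and vehicle.result()
    if not accepted:
        return None  # quote request rejected
    # Steps 5-6: compute the price and return the quote to the client.
    return compute_policy_price(base, vehicle.result())

quote = underwriting_process("C-42", "VIN123", 500.0)
```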

There are two main requirements for the BPM design environment in this example. The first is to provide the user with a business-level abstraction of the services offered by the Web services involved while hiding the low-level implementation details. The second is to keep the business process definition independent of the actual Web services with which it interacts. Over time, the same business process will have to interface with alternative Web service implementations (e.g., the credit check could be performed via a cheaper or more efficient Web service offered by a different credit agency). It must therefore be possible to swap Web services without changing the business process definition, as long as the semantics of the services' methods and schemas are equivalent. At design time, the BPM tool must be able to load the WSDL definition of a Web service and allow the user to select which Web service methods to call. The tool must be able to generate the WSDL definition for those services that are exposed to external entities. It should also allow the user to introspect the schema of the document payload and provide a mechanism to transform it into the schemas that other Web services require.
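The design-time WSDL introspection described above amounts to parsing the service description and listing its callable operations. A minimal Python sketch over a hand-written WSDL fragment (the port type and operation names are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A tiny, hand-written WSDL fragment, kept to the bare minimum needed
# to illustrate operation discovery.
WSDL = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/">
  <portType name="CreditCheckPort">
    <operation name="checkCredit"/>
    <operation name="getReport"/>
  </portType>
</definitions>"""

NS = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}

def list_operations(wsdl_text: str) -> list:
    # Introspect the WSDL and return the operation names a design tool
    # would present to the user for selection.
    root = ET.fromstring(wsdl_text)
    return [op.get("name") for op in root.findall(".//wsdl:operation", NS)]

ops = list_operations(WSDL)
```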

There will be many cases in which Web services won't be transactional, which requires the BPM to provide the user with mechanisms to model compensating transactions when something goes wrong between multiple invocations of non-transactional Web services. In this example, an adapter is used to get at the data and functions of the back-end application. The adapter interfaces are wrapped with a Web service so that the BPM has a consistent metaphor and interface for accessing the various entities involved in the business process (see Figure 4).
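The compensating-transaction idea can be sketched as follows (a simplified, saga-style illustration, not the article's actual mechanism): each completed step registers an undo action, and on failure the completed steps are compensated in reverse order.

```python
def run_with_compensation(steps):
    """steps: list of (action, compensation) callables.
    Runs actions in order; on failure, runs the compensations of the
    already-completed steps in reverse order."""
    done = []
    log = []
    try:
        for action, compensation in steps:
            log.append(action())
            done.append(compensation)
    except Exception:
        for compensation in reversed(done):
            log.append(compensation())
    return log

# Hypothetical three-step process whose last step fails.
def reserve():      return "reserved"
def unreserve():    return "unreserved"
def charge():       return "charged"
def refund():       return "refunded"
def ship():         raise RuntimeError("shipping service unavailable")
def cancel_ship():  return "noop"

log = run_with_compensation([(reserve, unreserve),
                             (charge, refund),
                             (ship, cancel_ship)])
```

Because `ship` fails, the engine undoes the two completed steps in reverse: refund first, then unreserve.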

Beyond the design time, there are several high-level implications for the BPM at run time. The BPM must provide a mechanism to accept SOAP calls, marshal the SOAP headers and content into the internal data representation, potentially apply a data transformation, and start the proper business process(es). In this example, the interactions with the various Web services are asynchronous. Therefore, each conversation started by BPM must generate a correlation ID and use it to send the proper response to the proper requester later. If an external Web service initiates the request, then BPM needs to store the external correlation ID to correlate the response properly later.
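The run-time unmarshaling step, accepting a SOAP call, pulling the correlation ID from the header, and turning the body into an internal representation, can be sketched as follows (the envelope contents are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A simplified SOAP envelope: header carries the external correlation ID,
# body carries the quote request document.
ENVELOPE = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Header>
    <correlationId>abc-123</correlationId>
  </soap:Header>
  <soap:Body>
    <quoteRequest><vin>VIN123</vin></quoteRequest>
  </soap:Body>
</soap:Envelope>"""

SOAP_NS = {"soap": "http://schemas.xmlsoap.org/soap/envelope/"}

def unmarshal(envelope: str) -> dict:
    # Marshal the SOAP header and content into an internal representation
    # the engine can use to start the proper business process.
    root = ET.fromstring(envelope)
    header = root.find("soap:Header", SOAP_NS)
    body = root.find("soap:Body", SOAP_NS)
    request = body[0]  # first child element of the body
    return {
        "correlation_id": header.find("correlationId").text,
        "operation": request.tag,
        "payload": {child.tag: child.text for child in request},
    }

msg = unmarshal(ENVELOPE)
```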

For most business scenarios, the conversations between BPM and other entities via Web services will probably involve multiple exchanges of messages that have state associated with them. For each conversation instance, the BPM run-time will have to transparently and recoverably manage the associated state. For each invocation of a Web service within the boundaries of a defined long-running transaction, the BPM engine has to store the information it needs to compensate for the effect of the Web service invocation. When a message has to be transformed from one format to another as it's passed across different Web services, the BPM engine needs to provide a highly efficient transformation engine.
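At its simplest, the message transformation mentioned above is a field-by-field mapping between two document schemas; a toy Python sketch with hypothetical field names (a real engine would apply XSLT or an equivalent transformation language to XML documents):

```python
# Hypothetical mapping from the process's internal schema to the field
# names a target Web service expects.
FIELD_MAP = {"vin": "vehicleId", "cust": "customerNumber"}

def transform(doc: dict, field_map: dict) -> dict:
    # Rename mapped fields; pass unmapped fields through unchanged.
    return {field_map.get(k, k): v for k, v in doc.items()}

out = transform({"vin": "VIN123", "cust": "C-42"}, FIELD_MAP)
```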

Web services offer the promise of a single solution for integration across multiple enterprises. However, there are still many areas where there is opportunity for growth and enhancement. This article has identified and discussed the requirements that would ensure a fully successful new generation of BPM systems using Web services.

More Stories By Vittorio Viarengo

Vittorio Viarengo is senior director of product management with BEA Systems, a leading application infrastructure company with more than 13,000 customers around the world. Viarengo is responsible for the direction of BEA WebLogic Integration, business process management, and the Web services development framework. [email protected]

