Date of Defense

2015-07-08

First Committee Member

M. Brian Blake

Second Committee Member

Geoff Sutcliffe

Third Committee Member

Stefan Wuchty

Fourth Committee Member

B. Kirkpatrick

Fifth Committee Member

Aubrey Rembert

Abstract

The increase in demand for online services has resulted in approximately 1 trillion connected objects and devices on the Internet, which generate billions of gigabytes of new data each day. As more devices become web-enabled, the number and variety of services available on the Internet are likely to increase. The growing number of services adds to the complexity of connectedness and interoperability of computing systems. Moreover, interactions between organizations and individuals are also increasing. As a result, data produced by internal and external business processes are growing at an exponential rate, creating a data deluge. Companies need to do more than connect people and integrate processes to become smarter service providers. Business applications must now be able to react to inherently dynamic and uncertain business situations. Companies need to act in an agile manner, optimizing and adapting business processes to their customers and thereby improving the responsiveness of the whole company. Only companies that can quickly and efficiently adapt to changing business needs can stay competitive in the global market. Initial approaches to sustaining competitiveness focus on increasing collaboration and interoperability within an organization and with its suppliers and consumers. These early approaches required a robust and interoperable computing architecture. A popular and deeply collaborative approach to realizing such connectivity and interoperability has been the Web service framework. In this context, services are autonomous, platform-independent computational elements that are described, published, discovered, orchestrated, and programmed using standard protocols. Loosely coupled services have traditionally been orchestrated by a central process expressed in the Business Process Execution Language (BPEL).
BPEL is the de facto standard for service-oriented composition and orchestration. It is used to build and execute workflows of collaborating applications distributed within and across organizational boundaries. These workflows scale to Web-scale Web service workflows that realize the business logic of an organization, its partners, and its customers across geographic boundaries. However, the business logic realized by BPEL is usually formulated in the early stages of the software design lifecycle, which renders workflows non-responsive to just-in-time changes in time-dependent and often uncertain business trends. Manually changing service activities can be a delicate, time-consuming task that may require a designer to have extensive knowledge of the underlying business logic and representations. Also, the deluge of data generated by workflows creates daunting challenges in bringing data together in a form that can be efficiently analyzed to make data-driven decisions promptly. The result is an information gap: more data does not necessarily increase the ability to process related pieces of data to arrive at needed insights and decisions in uncertain environments. There is therefore a strong requirement for methods that allow Web service workflows to adapt transparently to changes in a dynamic and uncertain environment to meet the demands of changing business conditions. A by-product of workflows is data silos. These silos contain valuable and detailed records of operations, manufacturing, supply-chain management, customer behavior, marketing-campaign performance, and workflow procedures. Converting data silos into a knowledge base that can support data-driven inference is also necessary to create added value, deliver value more efficiently, and prevent commoditization of goods and services. The major contribution of this dissertation is a methodology whereby the BPEL process is represented as an agile model that mimics an expert system.
This agile model enables greater responsiveness to the evolution of data and efficiently narrows the information gap. The agile model is implemented as a probabilistic network that, in effect, realizes the knowledge base and inference engine underlying any traditional expert system. The knowledge base formulates knowledge about the workflow problem domain in a structured way, and the inference engine supports reasoning about events and decisions in the workflow's domain. Structuring and reasoning are accomplished in two steps. First, sets of (conditional) dependence and independence statements among workflow data transitions, together with causal relations, are encoded as a directed acyclic graph. Second, the strengths of the dependency relationships are specified for the inference engine using probability theory and graph theory. A core aspect of this approach is that historical, domain-specific functional data (extracted from an operational Web service workflow) are leveraged to capture an initial snapshot of probabilistic beliefs about the phenomenon underlying the workflow. Ongoing run-time data then update those beliefs to capture the run-time trends of the workflow and provide just-in-time decision support. The abstract representation of the BPEL process as an expert system addresses both non-responsiveness and the information gap by providing data-derived inferential visibility into workflow semantics. The particular knowledge base used is a Bayesian network, and the inference engine is the set of graphical and probabilistic methods that apply to the probabilistic network. The approach treats data as evidence and provides a framework to represent and update beliefs about workflow behavior. Moreover, data silos can now be used to provide integrated, actionable information and insights. Consider the Web service workflow of an online retailer with a global supply chain.
Queries about trends that are not explicitly defined in the workflow, such as “Which product or supplier is best?” or “Should a product or supplier be changed?”, can now be answered in a manner responsive to time-dependent trends in user behavior. This dissertation contributes a methodology for constructing this unique model and a framework for transforming BPEL into a more agile representation. Together, these contributions represent a paradigm that is more responsive to dynamic business trends and uncertainty through a holistic inferential view of business processes.
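The two-step construction sketched in the abstract (encode dependencies as a directed acyclic graph, attach probability strengths, then update beliefs from run-time evidence) can be illustrated with a toy Bayesian network for the retailer example. The variables, structure, and conditional probabilities below are entirely hypothetical, chosen only to show the mechanics of belief updating; they are not taken from the dissertation:

```python
from itertools import product

# Hypothetical three-node network for an online-retailer workflow:
# Supplier(good?) -> OnTime(delivery?) -> Satisfied(customer?).
# Each table entry gives P(var=True | parent values); values are illustrative.
parents = {"Supplier": [], "OnTime": ["Supplier"], "Satisfied": ["OnTime"]}
cpt = {
    "Supplier":  {(): 0.7},                       # prior: P(Supplier=good)
    "OnTime":    {(True,): 0.9, (False,): 0.4},   # P(OnTime | Supplier)
    "Satisfied": {(True,): 0.95, (False,): 0.3},  # P(Satisfied | OnTime)
}

def joint(assign):
    """Probability of a full assignment under the DAG's chain-rule factorization."""
    p = 1.0
    for var, pa in parents.items():
        key = tuple(assign[x] for x in pa)
        p_true = cpt[var][key]
        p *= p_true if assign[var] else 1.0 - p_true
    return p

def query(var, evidence):
    """P(var=True | evidence) by brute-force enumeration over all assignments."""
    names = list(parents)
    num = den = 0.0
    for values in product([True, False], repeat=len(names)):
        assign = dict(zip(names, values))
        if any(assign[e] != v for e, v in evidence.items()):
            continue  # inconsistent with observed run-time evidence
        p = joint(assign)
        den += p
        if assign[var]:
            num += p
    return num / den

# Observing a satisfied customer raises the belief that the supplier is good:
print(round(query("Supplier", {"Satisfied": True}), 3))  # prints 0.787
```

Enumeration is exponential in the number of variables; a real workflow model would use a Bayesian-network library with efficient exact or approximate inference, but the update from prior (0.7) to posterior (≈0.787) given evidence is the same mechanism the abstract describes.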

Keywords

Web Services; BPEL; Workflows; Bayesian Networks

Recommended Citation

Clarke, Damian A., "A Methodology for a Data-Driven Model and Approach for Content-Specific Assessment of Service Workflows" (2015). Open Access Dissertations. 1472.
https://scholarlyrepository.miami.edu/oa_dissertations/1472