Search results

1 – 10 of over 61,000
Article

Julian Krumeich, Benjamin Weis, Dirk Werth and Peter Loos


Abstract

Purpose

The business operations of today's enterprises are heavily influenced by numerous internal and external business events. With Event-Driven Architecture and particularly Complex Event Processing (CEP), the technology required for identifying complex correlations in these large amounts of event data right after they occur has already emerged. The resulting gain in operational transparency builds the foundation for (near) real-time reactions. This has motivated extensive research activities, especially in the field of Business Process Management (BPM), which essentially coined the term Event-Driven BPM (EDBPM). Now, several years after the advent of this new concept, the purpose of this paper is to shed light on the question: where are we now on our way towards a sophisticated adoption of CEP technology within BPM?

Design/methodology/approach

The research methodology of this paper is a structured literature analysis. It follows the procedure proposed by vom Brocke et al. (2009). This proven five-step process, entitled “Reconstructing the giant”, allowed for a rigorous study. As a result, various research clusters were derived, whose state of the art exposed existing research gaps within EDBPM.

Findings

First of all, the paper provides a concise conceptual basis for the different application possibilities of EDBPM. Afterwards, it synthesizes current research into six clusters and highlights the most significant work within them. Finally, a research agenda is proposed to tackle existing research gaps and pave the way towards fully realizing the potential of the paradigm.

Originality/value

So far, a comparable study of the current state-of-the-art within EDBPM is non-existent. The findings of this paper, e.g. the proposed research agenda, help scholars to focus their research efforts on specific aspects that need to be considered in order to advance the adoption of the CEP technology within BPM.

Article

Christian Janiesch, Martin Matzner and Oliver Müller


Abstract

Purpose

The purpose of this paper is to show how to employ complex event processing (CEP) for the observation and management of business processes. It proposes a conceptual architecture of BPM event producers, processors, and consumers and describes technical implications of its application with standard software in a perfect order scenario.

Design/methodology/approach

The authors discuss business process analytics as the technological background. The capabilities of CEP in a BPM context are outlined and an architecture design is proposed. A sophisticated proof-of-concept demonstrates its applicability.

Findings

The results overcome the separation and data latency issues of process controlling, monitoring, and simulation. Distinct analyses of past, present, and future blur into a holistic real-time approach. The authors highlight the necessity for configurable event producers in BPM engines, process event support in CEP engines, a common process event format, connectors to visualizers and notifiers, and return channels to the BPM engine.
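
To illustrate the producer, processor and consumer chain described above, the following minimal Python sketch correlates process events per instance and emits an alert when shipping misses a deadline. It is not the authors' implementation; the event fields, the 24-hour deadline and the print-based consumer are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime, timedelta

# A process event as an event producer (e.g. a BPM engine) might emit it.
# The field names and the common format are illustrative assumptions.
@dataclass
class ProcessEvent:
    instance_id: str
    activity: str
    timestamp: datetime

class PerfectOrderMonitor:
    """Toy event processor: correlates events per process instance and raises
    a complex event when shipping does not follow order creation in time."""

    def __init__(self, deadline=timedelta(hours=24)):
        self.deadline = deadline
        self.open_orders = {}

    def on_event(self, event, consumer):
        if event.activity == "OrderCreated":
            self.open_orders[event.instance_id] = event.timestamp
        elif event.activity == "OrderShipped":
            created = self.open_orders.pop(event.instance_id, None)
            if created is not None and event.timestamp - created > self.deadline:
                # The event consumer (e.g. a dashboard or a return channel
                # into the BPM engine) is modelled as a plain callback here.
                consumer(f"Late shipment in process instance {event.instance_id}")

def print_consumer(alert):
    print("ALERT:", alert)

monitor = PerfectOrderMonitor()
events = [
    ProcessEvent("42", "OrderCreated", datetime(2023, 1, 1, 8, 0)),
    ProcessEvent("42", "OrderShipped", datetime(2023, 1, 2, 10, 0)),
]
for e in events:
    monitor.on_event(e, print_consumer)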

Research limitations/implications

Further research will thoroughly evaluate the approach in a variety of business settings. New concepts and standards for the architecture's building blocks will be needed to improve maintainability and operability.

Practical implications

Managers learn how CEP can yield insights into business processes' operations. The paper illustrates a path to overcome inflexibility, latency, and missing feedback mechanisms of current process modeling and control solutions. Software vendors might be interested in the conceptualization and the described needs for further development.

Originality/value

So far, there is no commercial CEP-based BPM solution which facilitates a round trip from insight to action as outlined. As major software vendors have begun developing BPM/BPA solutions, this paper will stimulate a debate between research and practice on suitable design and technology.

Article

Christian Janiesch and Jörn Kuhlenkamp


Abstract

Purpose

Changes in workflow relevant data of business processes at run-time can hinder their completion or impact their profitability as they have been instantiated under different circumstances. The purpose of this paper is to propose a context engine to enhance a business process management (BPM) system’s context-awareness. The generic architecture provides the flexibility to configure processes during initialization as well as to adapt running instances at decision gates or during execution due to significant context change.

Design/methodology/approach

The paper discusses context-awareness as the conceptual background. The technological capabilities of business rules and complex event processing (CEP) are outlined in an architecture design. A reference process is proposed and discussed in an exemplary application.

Findings

The results provide an improvement over the current situation of static variable instantiation of business processes with local information. The proposed architecture extends the well-known combination of business rules and BPM systems with a context engine based on CEP.
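
The combination described in the findings, a context engine that feeds significant context changes back into a BPM system, can be pictured with a small Python sketch. This is only a sketch under assumed details: the context variable, the significance threshold and the adapt_instance callback are not taken from the paper.

# Minimal sketch of a CEP-based context engine feeding a BPM system.
class ContextEngine:
    def __init__(self, significance, on_significant_change):
        self.significance = significance
        self.on_significant_change = on_significant_change
        self.last_value = None

    def push(self, variable, value):
        # CEP-style processing: compare the incoming reading with the last
        # known value and emit a higher-level "context changed" event only
        # when the difference is significant.
        if self.last_value is not None and abs(value - self.last_value) >= self.significance:
            self.on_significant_change(variable, self.last_value, value)
        self.last_value = value

def adapt_instance(variable, old, new):
    # Stand-in for the BPM system's reaction, e.g. re-evaluating business
    # rules at the next decision gate of a running instance.
    print(f"Context change in {variable}: {old} -> {new}; re-evaluating running instance")

engine = ContextEngine(significance=5.0, on_significant_change=adapt_instance)
for reading in [21.0, 22.5, 29.0]:
    engine.push("outdoor_temperature", reading)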

Research limitations/implications

The resulting architecture for a BPM system using a context engine is generic in nature and, hence, needs to be contextualized for situated implementations. Implementation success is dependent on the availability of context information and process compensation options.

Practical implications

Practitioners receive advice on a reference architecture and technology choices for implementing systems which can provide and monitor context information for business processes as well as intervene in and adapt their execution.

Originality/value

Currently, there is no multi-purpose non-proprietary context engine based on CEP or any other technology available for BPM, which facilitates the adaptation of processes at run-time due to changes in context variables. This paper will stimulate a debate between research and practice on suitable design and technology.

Details

Business Process Management Journal, vol. 25 no. 6
Type: Research Article
ISSN: 1463-7154


Article

Xiaohui Zhao, Chengfei Liu and Tao Lin


Abstract

Purpose

The emergence of radio frequency identification (RFID) technology promises enormous opportunities to shift business process automation up to the wire level. The purpose of this paper is to explore the methodology of incorporating business logics into RFID edge systems, and thereby facilitate the business process automation in the RFID‐applied environment.

Design/methodology/approach

Following the object-oriented modelling perspective, the concepts of classes and instances are deployed to characterise the runtime context of RFID business scenarios; event patterns are used to aggregate RFID tag read events into business-meaningful events; and business rules are established to automate business transactions according to the elicited events.
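
A compact Python sketch may help to picture the aggregation chain described above: raw tag reads are grouped into an application-level event, and a business rule reacts to it. The reader name, the pallet size and the goods-receipt rule are assumptions made purely for illustration, not elements of the paper's middleware.

from collections import defaultdict

def aggregate_tag_reads(tag_reads):
    """Group raw (tag_id, reader_id) reads into one application-level
    event per reader, mimicking an event-pattern rule."""
    by_reader = defaultdict(set)
    for tag_id, reader_id in tag_reads:
        by_reader[reader_id].add(tag_id)
    return [{"type": "PalletArrived", "reader": reader, "items": sorted(tags)}
            for reader, tags in by_reader.items()]

def business_rule(event):
    # Business rule: when a complete pallet (assumed to hold 3 items)
    # arrives at the dock door, trigger the goods-receipt transaction.
    if event["type"] == "PalletArrived" and len(event["items"]) >= 3:
        print(f"Goods receipt posted for pallet at {event['reader']}: {event['items']}")

raw_reads = [("EPC-001", "dock_door_1"), ("EPC-002", "dock_door_1"),
             ("EPC-003", "dock_door_1")]
for evt in aggregate_tag_reads(raw_reads):
    business_rule(evt)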

Findings

The paper has emphasised the synergy between business process automation and automatic data acquisition, and has identified the inter‐relations between RFID tag read events, application‐level events, business rules, and business operations. The reported research has demonstrated a feasible scheme of incorporating business process control and automation into RFID‐enabled applications.

Originality/value

The paper analyses the characteristics of RFID data and event handling in relation to business rule modelling and process automation. The features of event‐relied awareness, context containment and overlapping, etc. are all captured and described by the proposed object‐oriented business model. The given data‐driven RFID middleware architecture can serve as one reference architecture for system design and development. Hence, the paper plays an important role in connecting automatic data acquisition and existing business processes, and thereby bridges the physical world and the digital world.

Details

Business Process Management Journal, vol. 16 no. 6
Type: Research Article
ISSN: 1463-7154


Article

Bojan Božić and Werner Winiwarter


Abstract

Purpose

The purpose of this paper is to present a showcase of semantic time series processing which demonstrates how this technology can improve time series processing and community building by the use of a dedicated language.

Design/methodology/approach

The authors have developed a new semantic time series processing language and prepared showcases to demonstrate its functionality. The assumption is an environmental setting with data measurements from different sensors to be distributed to different groups of interest. The data are represented as time series for water and air quality, while the user groups are, among others, the environmental agency, companies from the industrial sector and legal authorities.

Findings

A language for time series processing and several tools to enrich the time series with meta-data and for community building have been implemented in Python and Java. Also, a GUI for demonstration purposes has been developed in PyQt4. In addition, an ontology for validation has been designed and a knowledge base for data storage and inference has been set up. Some important features are: dynamic integration of ontologies, time series annotation, and semantic filtering.
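
Two of the listed features, time series annotation and semantic filtering, can be pictured with a short Python sketch. TSSL itself is a dedicated language; the tag names, series and threshold-free filter used here are illustrative assumptions, not part of the paper.

from dataclasses import dataclass, field

@dataclass
class AnnotatedSeries:
    # A time series enriched with meta-data in the form of concept tags.
    name: str
    values: list[float]
    annotations: set[str] = field(default_factory=set)

series = [
    AnnotatedSeries("river_ph", [7.1, 7.3, 6.9], {"WaterQuality", "EnvironmentalAgency"}),
    AnnotatedSeries("city_no2", [38.0, 41.5, 40.2], {"AirQuality", "LegalAuthority"}),
]

def semantic_filter(all_series, required_tag):
    """Return only the series annotated with the given concept,
    standing in for an ontology-backed filter."""
    return [s for s in all_series if required_tag in s.annotations]

for s in semantic_filter(series, "WaterQuality"):
    print(s.name, "->", s.values)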

Research limitations/implications

This paper focuses on the showcases of the time series semantic language (TSSL), but also covers technical aspects and user interface issues. The authors are planning to develop TSSL further and evaluate it within further research projects and validation scenarios.

Practical implications

The research has a high practical impact on time series processing and provides new data sources for semantic web applications. It can also be used in social web platforms (especially for researchers) to provide a time series centric tagging and processing framework.

Originality/value

The paper presents an extended version of the paper presented at iiWAS2012.

Details

International Journal of Web Information Systems, vol. 9 no. 2
Type: Research Article
ISSN: 1744-0084


Article

Sylva Girtelschmid, Matthias Steinbauer, Vikash Kumar, Anna Fensel and Gabriele Kotsis

The purpose of this article is to propose and evaluate a novel system architecture for Smart City applications which uses ontology reasoning and a distributed stream…

Abstract

Purpose

The purpose of this article is to propose and evaluate a novel system architecture for Smart City applications which uses ontology reasoning and a distributed stream processing framework on the cloud. In the Smart City domain, methodologies of semantic modeling and automated inference are often applied. However, semantic models often face performance problems when applied at large scale.

Design/methodology/approach

The problem domain is addressed by using methods from Big Data processing in combination with semantic models. The architecture is designed in such a way that traditional semantic models and rule engines can still be used for the Smart City model. However, sensor data arising in such Smart Cities are pre-processed by a Big Data streaming platform to lower the workload to be processed by the rule engine.
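
The pre-processing idea can be pictured with a brief Python sketch in which a streaming stage suppresses readings that would not affect reasoning, so the slower rule engine is invoked far less often. The dead-band threshold and the evaluate_rules stub are assumptions for illustration only, not the paper's implementation.

def streaming_filter(readings, dead_band=0.5):
    """Forward a reading only when it differs noticeably from the last
    forwarded value for the same sensor."""
    last_forwarded = {}
    for sensor_id, value in readings:
        previous = last_forwarded.get(sensor_id)
        if previous is None or abs(value - previous) >= dead_band:
            last_forwarded[sensor_id] = value
            yield sensor_id, value

def evaluate_rules(sensor_id, value):
    # Stand-in for the semantic rule engine; in the described architecture
    # this step carries the ontology reasoning cost.
    print(f"rule engine invoked: {sensor_id} = {value}")

stream = [("room1/temp", 21.0), ("room1/temp", 21.1),
          ("room1/temp", 21.9), ("room1/temp", 21.95)]
for sensor_id, value in streaming_filter(stream):
    evaluate_rules(sensor_id, value)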

Findings

By creating a real-world implementation of the proposed architecture and running simulations of Smart Cities of different sizes on top of this implementation, the authors found that the combination of Big Data streaming platforms with semantic reasoning is a valid approach to the problem.

Research limitations/implications

In this article, real-world sensor data from only two buildings were extrapolated for the simulations. Obviously, real-world scenarios will have a more complex set of sensor input values, which needs to be addressed in future work.

Originality/value

The simulations show that merely using a streaming platform as a buffer for sensor input values already increases the sensor data throughput and that by applying intelligent filtering in the streaming platform, the actual number of rule executions can be limited to a minimum.

Details

International Journal of Pervasive Computing and Communications, vol. 10 no. 2
Type: Research Article
ISSN: 1742-7371


Article

Martin Kuehnhausen and Victor S. Frost


Abstract

Purpose

Security and accountability within the transportation industry are vital because cargo theft could amount to as much as $60 billion per year. Since goods are often handled by many different parties, it must be possible to tightly monitor the location of cargo and handovers. Trade tracking is difficult to manage across different formats and legacy applications. Web services and open standards overcome these problems with uniform interfaces and common data formats. This allows consistent reporting, monitoring and analysis at each step. The purpose of this paper is to examine the Transportation Security SensorNet (TSSN), the goal being to promote the use of open standards and specifications in combination with web services to provide cargo monitoring capabilities.

Design/methodology/approach

This paper describes a system architecture for the TSSN targeted for cargo monitoring. The paper discusses cargo security and reviews related literature and approaches. The paper then describes the proposed solution of developing a service‐oriented architecture (SOA) for cargo monitoring and its individual components.
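
As a rough illustration of the kind of uniform, self-describing messages such a service-oriented architecture relies on, the short Python sketch below builds and serializes a cargo handover event. It is not the TSSN implementation and makes no claim to OGC conformance; every field name and the placeholder endpoint are assumptions made for the example.

import json

# Illustrative cargo handover event; field names are assumptions.
handover_event = {
    "eventType": "CargoHandover",
    "containerId": "CONT-0042",
    "location": {"lat": 38.97, "lon": -95.24},
    "from": "Carrier A",
    "to": "Carrier B",
    "timestamp": "2023-01-02T10:15:00Z",
}

# A uniform, self-describing text format is what lets heterogeneous parties
# report and analyse handovers consistently.
payload = json.dumps(handover_event, indent=2)
print(payload)

# Posting to a monitoring service would follow the usual pattern, e.g. with
# urllib.request; the URL below is a placeholder, so the call is left commented out.
# import urllib.request
# req = urllib.request.Request("https://example.org/tssn/events",
#                              data=payload.encode(),
#                              headers={"Content-Type": "application/json"})
# urllib.request.urlopen(req)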

Findings

Web services in a mobile sensor network environment have been seen as slow and producing significant overhead. The authors demonstrate that with proper architecture and design the performance requirements of the targeted scenario can be satisfied with web services; the TSSN then allows sensor networks to be utilized in a standardized and open way through web services.

Originality/value

The integration of SOA, Open Geospatial Consortium (OGC) specifications and sensor networks is complex and difficult. As described in related works, most systems and research focus either on the combination of SOA and OGC specifications or on OGC standards and sensor networks. The TSSN shows that all three can be combined and that this combination provides the transportation and other industries with cargo security and monitoring capabilities that have not existed before.

Article

Harri Jalonen and Antti Lönnqvist


Abstract

Purpose

The purpose of this paper is to present a conceptual analysis of the theoretical and managerial bases and objectives of predictive business. Predictive business refers to operational decision-making and the development of business processes on the basis of business event analysis. It supports the early recognition of business opportunities and threats, better customer intimacy and agile reaction to changes in the business environment. An underlying rationale for predictive business is the attainment of competitive advantage through better management of information and knowledge.

Design/methodology/approach

The approach to this article is conceptual and theoretical. The literature‐based discussion and analysis combines the perspectives of business performance management, business intelligence, and knowledge management to provide a new model of thinking and operation.

Findings

For a company, predictive business is simultaneously a practical challenge and an epistemic one. It is a practical challenge because predictive business presupposes a change in the company's modes of operation. It is also an epistemic challenge, since it concerns the company's ability to find an appropriate balance between knowledge exploitation and knowledge exploration.

Research limitations/implications

Further research should be carried out on the functionality of practical applications as well as the attitudinal and technical preparedness of companies to adopt a new mode of operation. As a subject of investigation, the world of business events offers interesting methodological possibilities, since the basis of the work is the gathering and analysis of large quantities of information on operational activities.

Originality/value

There has been little research concerning business events in the knowledge management context. This article presents a theoretically founded basis for predictive business, combining the concept of analysing business events with previous research in the field of knowledge management.

Details

Management Decision, vol. 47 no. 10
Type: Research Article
ISSN: 0025-1747


Article

Rui Pedro Figueiredo Marques, Henrique M. Dinis Santos and Carlos Santos


Abstract

Purpose

The paper aims to present a solution which makes it possible to control and audit organizational transactions in real time, helping to determine the degree of reliability with which they are carried out and thereby mitigating organizational risk. This auditing is performed at a very low level, on organizational transactions executed and supported exclusively in digital format, in contrast to most transaction monitoring, which occurs at a high level. Moreover, the paper describes the conceptual architecture of the solution, its components and functionalities, as well as the development and technical issues which should be taken into consideration in the deployment and evaluation of the solution.
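
As a rough illustration of low-level, real-time transaction auditing of this kind, the short Python sketch below checks each digital transaction against simple control rules the moment it is executed and returns a reliability assessment. The transaction fields, the control rules and the reliability labels are assumptions made for the example, not the paper's solution.

def audit(transaction):
    """Check one digital transaction against simple control rules and
    return a reliability assessment as soon as it is executed."""
    violations = []
    if transaction["amount"] > 10_000 and not transaction.get("approved_by"):
        violations.append("high-value transaction lacks approval")
    if transaction["requester"] == transaction.get("approved_by"):
        violations.append("requester approved own transaction")
    return {"id": transaction["id"],
            "reliability": "low" if violations else "high",
            "violations": violations}

print(audit({"id": "T-1", "amount": 12_500, "requester": "alice", "approved_by": "alice"}))
print(audit({"id": "T-2", "amount": 900, "requester": "bob"}))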

Design/methodology/approach

The work follows the design science methodology. It presents the problem and motivation of the investigation, the solution design and how it is being deployed. Furthermore, it presents the expected results based on the proposed architecture and on the results which are currently being achieved with the prototype implementation.

Findings

The prototype is being put into practice; thus, the gathering of results and their evaluation are not yet complete. However, preliminary results are highly satisfactory and very close to those expected and enumerated.

Originality/value

The research contributes to a new vision of organizational auditing focused on assurance services in transactions executed and supported in a digital format in compliance with the formalisms of a business ontological model of organizational transactions.

Details

The Learning Organization, vol. 20 no. 6
Type: Research Article
ISSN: 0969-6474


Article

Fabio Sartori and Riccardo Melen


Abstract

Purpose

A wearable expert system (WES) is an expert system designed and implemented to obtain input from and give output to wearable devices. Among its distinguishing features are the direct cooperation between domain experts and users, and the interaction with a knowledge maintenance system devoted to dynamically updating the knowledge base to take account of the evolving scenario. The paper aims to discuss these issues.

Design/methodology/approach

The WES development method is based on the Knowledge Acquisition Framework based on Knowledge Artifact (KAFKA). KAFKA employs multiple knowledge artifacts, each devoted to the acquisition and management of a specific kind of knowledge. The framework is introduced from both the conceptual and computational points of view. An example is given which demonstrates the interaction, within this framework, of taxonomies, Bayesian networks and rule-based systems. An experimental assessment of the framework's usability is also given.
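
The dynamic behaviour this framework aims at, a rule base that is extended at run time as new input events arrive from wearable devices, can be pictured with a small Python sketch. The heart-rate rule, the threshold values and the fall_detected event are illustrative assumptions, not part of KAFKA.

# Minimal rule-based component fed by wearable sensor observations.
rules = [
    ("high heart rate", lambda obs: obs.get("heart_rate", 0) > 150),
]

def infer(observation):
    """Return the names of all rules whose condition holds for the observation."""
    return [name for name, condition in rules if condition(observation)]

print(infer({"heart_rate": 162}))  # -> ['high heart rate']

# Knowledge maintenance: when a new kind of input event starts arriving from
# the wearable device, a new rule is added without redeploying the system.
rules.append(("possible fall", lambda obs: obs.get("fall_detected", False)))
print(infer({"heart_rate": 80, "fall_detected": True}))  # -> ['possible fall']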

Findings

The most interesting characteristic of WESs is their capability to evolve over time, due both to the measurement of new values for input variables and to the detection of new input events, which can be used to modify, extend and maintain knowledge bases and to represent domains characterized by variability over time.

Originality/value

WES is a new and challenging concept, dealing with the possibility for a user to develop his/her own decision support systems and update them as new events arise from the environment. The system fully supports domain experts and users with no particular skills in knowledge engineering methodologies in creating, maintaining and exploiting their expert systems, anywhere and whenever necessary.
