Search results

1 – 10 of over 14000
Article
Publication date: 24 June 2022

Zhao-ge Liu, Xiang-yang Li and Li-min Qiao

Abstract

Purpose

Process mining tools can help discover and improve the business processes of urban community services from historical service event records. However, for community service domains with small datasets, the effects of process mining are generally limited due to process incompleteness and data noise. In this paper, a cross-domain knowledge transfer method is proposed to support service process discovery with small datasets by making use of the rich knowledge available in similar domains with large datasets.

Design/methodology/approach

First, ontology modeling is used to reduce the effects of cross-domain semantic ambiguity on knowledge transfer. Second, association rules (of the activities in the service processes) are extracted with a Bayesian network. Third, applicable association rules are retrieved using an applicability assignment function. Finally, the retrieved association rules from the domains with large datasets are mapped to the domain with a small dataset using a linear programming method, and a heuristic miner is adopted to generate the process model.
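
For readers unfamiliar with the final mining step, the short sketch below runs a heuristic miner on an event log using the open-source pm4py library. This is a generic illustration, not the authors' transfer-learning pipeline; the file name and the threshold value are assumptions.

```python
# Generic heuristic-miner sketch with pm4py (not the authors' cross-domain
# transfer method). "community_service.xes" is a hypothetical event log file.
import pm4py

log = pm4py.read_xes("community_service.xes")

# The heuristic miner tolerates some noise; the threshold here is an assumed value.
net, initial_marking, final_marking = pm4py.discover_petri_net_heuristics(
    log, dependency_threshold=0.6
)

pm4py.view_petri_net(net, initial_marking, final_marking)
```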

Findings

The proposed method is verified on empirical data from 10 service domains in Beidaihe, China. Results show that process discovery performance improved in all 10 domains, with the overall robustness score, precision, recall and F1 score increasing by 13%, 13%, 17% and 15%, respectively. For the domains with only small datasets, the cross-domain knowledge transfer method outperforms popular state-of-the-art methods.

Originality/value

The limitations imposed by small sample sizes are greatly reduced. This scheme can be followed to establish business process management systems for community services that achieve reasonable performance despite limited sample sizes.

Details

Business Process Management Journal, vol. 28 no. 4
Type: Research Article
ISSN: 1463-7154

Article
Publication date: 1 March 1995

Fiorenzo Franceschini

Abstract

Discusses the problems of how to structure a computerized process‐oriented test facility (TF) with particular regard to the aerospace, automotive, electronic, chemical and railway industries and the military. Covers the areas of test facility functional architecture, operation philosophy, test plan preparation, data management support, system access and security, system engineering support, the conducting of the test and system maintenance support. Concludes that the aim of the structured approach has been to focus, within an organized framework, the attention of a test designer on the operating and supporting functions of a test facility. The general methodology proposed can be utilised as a reference in many application fields, from defence to commercial systems.

Details

Sensor Review, vol. 15 no. 1
Type: Research Article
ISSN: 0260-2288

Article
Publication date: 29 April 2020

Rachel K. Fischer, Aubrey Iglesias, Alice L. Daugherty and Zhehan Jiang

Abstract

Purpose

The article presents a methodology that can be used to analyze data from the transaction log of EBSCO Discovery Service searches recorded in Google Analytics. It explains the steps to follow for exporting the data, analyzing the data, and recreating searches. The article provides suggestions to improve the quality of research on the topic. It also includes advice to vendors on improving the quality of transaction log software.
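
As a rough illustration of this kind of transaction-log analysis (not the authors' exact procedure), the sketch below parses a hypothetical Google Analytics page-report export with pandas and recovers search terms from EBSCO Discovery Service result URLs. The file name, the "Page" column and the "q" query parameter are assumptions about the export format.

```python
# Minimal sketch: recovering discovery-layer search terms from an assumed
# Google Analytics "Pages" export. Column and parameter names are hypothetical.
from urllib.parse import urlparse, parse_qs

import pandas as pd

report = pd.read_csv("ga_pages_export.csv")  # hypothetical GA export file

def extract_query(page_path: str) -> str:
    """Pull the search terms out of a results-page URL, if present."""
    params = parse_qs(urlparse(page_path).query)
    return params.get("q", [""])[0]  # 'q' is an assumed query parameter name

report["search_terms"] = report["Page"].astype(str).map(extract_query)
top_searches = (
    report.loc[report["search_terms"] != "", "search_terms"]
    .value_counts()
    .head(20)
)
print(top_searches)
```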

Design/methodology/approach

Case study

Findings

Although Google Analytics can be used to study transaction logs accurately, vendors still need to improve the functionality so librarians can gain the most benefit from it.

Research limitations/implications

The research is applicable to the usage of Google Analytics with EBSCO Discovery Service.

Practical implications

The steps presented in the article can be followed as a step-by-step guide to repeating the study at other institutions.

Social implications

The methodology in this article can be used to assess how library instruction can be improved.

Originality/value

This article provides a detailed description of a transaction log analysis process that other articles have not previously described. This includes a description of a methodology for accurately calculating statistics from Google Analytics data and provides steps for recreating accurate searches from data recorded in Google Analytics.

Details

Library Hi Tech, vol. 39 no. 1
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 25 September 2019

Nabil Moukafih, Ghizlane Orhanou and Said Elhajji

Abstract

Purpose

This paper aims to propose a mobile agent-based security information and event management architecture (MA-SIEM) that uses mobile agents for near real-time event collection and normalization on the source device. By externalizing the normalization process and executing it through several distributed mobile agents on interconnected computers and devices, the architecture leaves the SIEM server dedicated mainly to correlation and analysis.
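
As a minimal sketch of the normalization step that MA-SIEM delegates to agents on the source device: the field names and regular expression below are illustrative assumptions, not the paper's actual event schema.

```python
# Illustrative normalization on the source device: a "mobile agent" turns a raw
# log line into a common event schema before forwarding it to the SIEM server.
import json
import re
from datetime import datetime, timezone

RAW_PATTERN = re.compile(
    r"(?P<ts>\w{3} +\d+ [\d:]+) (?P<host>\S+) (?P<app>[\w\-]+): (?P<msg>.*)"
)

def normalize(raw_line: str) -> dict:
    """Map a syslog-style line to a normalized event ready for correlation."""
    m = RAW_PATTERN.match(raw_line)
    if not m:
        return {"raw": raw_line, "parsed": False}
    return {
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "source_host": m.group("host"),
        "application": m.group("app"),
        "message": m.group("msg"),
        "parsed": True,
    }

event = normalize("Jan 12 10:42:01 webserver sshd: Failed password for root")
print(json.dumps(event))  # the agent would forward this JSON to the SIEM server
```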

Design/methodology/approach

The architecture has been proposed in three stages. In the first step, the authors described the different aspects of the proposed approach. Then they implemented the proposed architecture and presented a new vision for the insertion of normalized data into the SIEM database. Finally, the authors performed a numerical comparison between the approach used in the proposed architecture and that of existing SIEM systems.

Findings

The results of the experiments showed that MA-SIEM systems are more efficient than existing SIEM systems because they leave the SIEM resources primarily dedicated to advanced correlation analysis. In addition, this paper takes into account realistic scenarios and use-cases and proposes a fully automated process for transferring normalized events in near real time to the SIEM server for further analysis using mobile agents.

Originality/value

The work provides new insights into the normalization of security-related events using light mobile agents.

Details

Information & Computer Security, vol. 28 no. 1
Type: Research Article
ISSN: 2056-4961

Article
Publication date: 25 February 2020

Leandro Guarino Vasconcelos, Laercio Augusto Baldochi and Rafael Duarte Coelho Santos

Abstract

Purpose

This paper aims to present Real-time Usage Mining (RUM), an approach that exploits the rich information provided by client logs to support the construction of adaptive Web applications. The main goal of RUM is to provide useful information about the behavior of users who are currently browsing a Web application. By consuming this information, the application is able to adapt its user interface in real time to enhance the user experience. RUM provides two types of services: support for the detection of struggling users, and user profiling based on the detection of behavior patterns.

Design/methodology/approach

RUM leverages previous work on usability evaluation to provide a service that evaluates the usability of tasks performed by users while they browse applications. This evaluation is based on a metric that allows the detection of struggling users, making it possible to identify these users as soon as a few logs from their interaction have been processed. RUM also exploits log mining techniques to detect usage patterns, which are then associated with user profiles previously defined by the application specialist. After associating usage patterns with user profiles, RUM is able to classify users as they browse applications, allowing the application developer to tailor the user interface according to the users’ needs and preferences.
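
The sketch below illustrates the general idea of flagging struggling users from client-side events. The event names and the threshold are invented for the example and are not RUM's actual usability metric.

```python
# Toy struggle detection from a client-side event stream (not RUM's metric):
# count "effort" signals per task and flag users who exceed a threshold.
from collections import Counter

EFFORT_EVENTS = {"repeated_click", "field_cleared", "validation_error", "back_navigation"}
STRUGGLE_THRESHOLD = 5  # assumed cut-off

def struggle_score(events: list[dict]) -> int:
    """Count effort signals in one user's event stream for a task."""
    counts = Counter(e["type"] for e in events)
    return sum(counts[name] for name in EFFORT_EVENTS)

session = [
    {"type": "click"},
    {"type": "validation_error"},
    {"type": "field_cleared"},
    {"type": "validation_error"},
    {"type": "repeated_click"},
    {"type": "back_navigation"},
    {"type": "submit"},
]

if struggle_score(session) >= STRUGGLE_THRESHOLD:
    print("User likely struggling: offer contextual help or adapt the UI.")
```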

Findings

The proposed approach was exploited to improve user experience in real-world Web applications. Experiments showed that RUM was effective to provide support for struggling users to complete tasks. Moreover, it was also effective to detect usage patterns and associate them with user profiles.

Originality/value

Although the literature reports studies that explore client logs to support both the detection of struggling users and user profiling based on usage patterns, no existing solution detects users from specific profiles or struggling users in real time while they are browsing Web applications. RUM also provides a toolkit that allows the approach to be easily deployed in any Web application.

Details

International Journal of Web Information Systems, vol. 16 no. 2
Type: Research Article
ISSN: 1744-0084

Book part
Publication date: 1 November 2007

Irina Farquhar and Alan Sorkin

Abstract

This study proposes targeted modernization of the Department of Defense's (DoD's) Joint Forces Ammunition Logistics information system by implementing an optimized, innovative information technology open-architecture design and integrating Radio Frequency Identification Device (RFID) data technologies and real-time optimization and control mechanisms as the critical technology components of the solution. The innovative information technology, which pursues focused logistics, would be deployed in 36 months at an estimated cost of $568 million in constant dollars. We estimate that the Systems, Applications, Products (SAP)-based enterprise integration solution that the Army currently pursues will cost another $1.5 billion through the year 2014; however, it is unlikely to deliver the intended technical capabilities.

Details

The Value of Innovation: Impact on Health, Life Quality, Safety, and Regulatory Research
Type: Book
ISBN: 978-1-84950-551-2

Article
Publication date: 21 September 2012

Ahmet Soylu, Felix Mödritscher, Fridolin Wild, Patrick De Causmaecker and Piet Desmet

Abstract

Purpose

Mashups have been studied extensively in the literature; nevertheless, the large body of work in this area focuses on service/data level integration and leaves UI level integration, hence UI mashups, almost unexplored. The latter generates digital environments in which participating sources exist as individual entities; member applications and data sources share the same graphical space, particularly in the form of widgets. However, true integration can only be realized by enabling widgets to respond to events happening in one another. The authors call such an integration “widget orchestration” and the resulting application “mashup by orchestration”. This article aims to explore and address challenges regarding the realization of widget‐based UI mashups and UI level integration, prominently in terms of widget orchestration, and to assess their suitability for building web‐based personal environments.

Design/methodology/approach

The authors provide a holistic view of mashups and a theoretical grounding for widget‐based personal environments. The authors identify the following challenges: widget interoperability, end‐user data mobility as a basis for manual widget orchestration, user behavior mining – for extracting behavioral patterns – as a basis for automated widget orchestration, and infrastructure. The authors introduce functional widget interfaces for application interoperability, exploit semantic web technologies for data interoperability, and realize end‐user data mobility on top of this interoperability framework. They employ semantically enhanced workflow/process mining techniques, along with Petri nets as a formal ground, for user behavior mining. The authors outline a reference platform and architecture that is compliant with these strategies, and extend the W3C widget specification accordingly – prominently with a communication channel – to foster standardization. They evaluate their solution approaches regarding interoperability and infrastructure through a qualitative comparison with the existing literature, and provide a computational evaluation of the behavior mining approach. The authors realize a prototype of a widget‐based personal learning environment for foreign language learning to demonstrate the feasibility of their solution strategies. The prototype is also used as a basis for the end‐user assessment of widget‐based personal environments and widget orchestration.
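
As a toy illustration of behavior mining over widget interaction logs (not the authors' semantically enhanced, Petri-net-based technique), the sketch below extracts frequent consecutive widget-event pairs that could later be associated with user profiles. The widget names and support threshold are invented.

```python
# Toy behavior mining: count consecutive widget-event pairs across sessions
# and keep the frequent ones as candidate behavioral patterns.
from collections import Counter

def frequent_pairs(sessions: list[list[str]], min_support: int = 2) -> dict:
    """Return consecutive event pairs that occur at least min_support times."""
    counts = Counter()
    for events in sessions:
        counts.update(zip(events, events[1:]))
    return {pair: n for pair, n in counts.items() if n >= min_support}

sessions = [
    ["dictionary.lookup", "flashcards.add", "player.play"],
    ["dictionary.lookup", "flashcards.add", "quiz.start"],
    ["player.play", "dictionary.lookup", "flashcards.add"],
]

print(frequent_pairs(sessions))
# e.g. {('dictionary.lookup', 'flashcards.add'): 3}
```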

Findings

The evaluation results suggest that the interoperability framework, platform, and architecture have certain advantages over existing approaches, and the proposed behavior mining techniques are adequate for the extraction of behavioral patterns. User assessments show that widget‐based UI mashups with orchestration (i.e. mashups by orchestration) are promising for the creation of personal environments as well as for an enhanced user experience.

Originality/value

This article provides an extensive exploration of mashups by orchestration and their role in the creation of personal environments. Key challenges are described, along with novel solution strategies to meet them.

Article
Publication date: 24 May 2011

Bokyoung Kang, Jae‐Yoon Jung, Nam Wook Cho and Suk‐Ho Kang

Abstract

Purpose

The purpose of this paper is to help industrial managers monitor and analyze critical performance indicators in real time during the execution of business processes by proposing a visualization technique using an extended formal concept analysis (FCA). The proposed approach monitors the current progress of ongoing processes and periodically predicts their probable routes and performances.

Design/methodology/approach

FCA is utilized to analyze relations among patterns of events in historical process logs, and this method of data analysis visualizes the relations in a concept lattice. To apply FCA to real‐time business process monitoring, the authors extended the conventional concept lattice into a reachability lattice, which enables managers to recognize reachable patterns of events in specific instances of business processes.
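
The sketch below computes a plain concept lattice from a small, invented context of process instances and observed event patterns; it illustrates only the basic FCA machinery and omits the paper's reachability extension.

```python
# Minimal formal concept analysis: objects are process instances, attributes
# are observed event patterns. The context below is invented for illustration.
from itertools import combinations

context = {
    "case1": {"order_received", "credit_check", "shipped"},
    "case2": {"order_received", "credit_check"},
    "case3": {"order_received", "shipped"},
}
attributes = set().union(*context.values())

def extent(attrs: set) -> frozenset:
    """Objects having all attributes in attrs."""
    return frozenset(o for o, a in context.items() if attrs <= a)

def intent(objs: frozenset) -> frozenset:
    """Attributes shared by all objects in objs."""
    if not objs:
        return frozenset(attributes)
    return frozenset.intersection(*(frozenset(context[o]) for o in objs))

# Enumerate formal concepts (extent, intent) by closing every attribute subset.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        objs = extent(set(attrs))
        concepts.add((objs, intent(objs)))

for objs, attrs in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(objs), "share", sorted(attrs))
```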

Findings

By using a reachability lattice, expected values of a target key performance indicator are predicted and traced along with probable outcomes. Analysis is conducted periodically as the monitoring time elapses over the course of business processes.

Practical implications

The proposed approach focuses on the visualization of probable event occurrences on the basis of historical data. Such visualization can be utilized by industrial managers to evaluate the status of any given instance during business processes and to easily predict possible subsequent states for purposes of effective and efficient decision making. The proposed method was developed in a prototype system for proof of concept and has been illustrated using a simplified real‐world example of a business process in a telecommunications company.

Originality/value

The main contribution of this paper lies in the development of a real‐time monitoring approach of ongoing processes. The authors have provided a new data structure, namely a reachability lattice, which visualizes real‐time progress of ongoing business processes. As a result, current and probable next states can be predicted graphically using periodically conducted analysis during the processes.

Details

Industrial Management & Data Systems, vol. 111 no. 5
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 16 October 2018

Anna Kalenkova, Andrea Burattin, Massimiliano de Leoni, Wil van der Aalst and Alessandro Sperduti

Abstract

Purpose

The purpose of this paper is to demonstrate that process mining techniques can help to discover process models from event logs, using conventional high-level process modeling languages, such as Business Process Model and Notation (BPMN), leveraging their representational bias.

Design/methodology/approach

The integrated discovery approach presented in this work aims to mine control, data and resource perspectives within one process diagram and, if possible, to construct a hierarchy of subprocesses that improves model readability. The proposed approach is defined as a sequence of steps performed to discover a model that contains the various perspectives and presents a holistic view of a process. The approach was implemented within an open-source process mining framework called ProM and proved its applicability for the analysis of real-life event logs.
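
The approach itself is implemented as ProM plug-ins; as a rough, simplified analogue, the sketch below uses the open-source pm4py library to discover a flat, single-perspective BPMN model from an event log. The file names are hypothetical, and the paper's multi-perspective and hierarchical extensions are not reproduced here.

```python
# Simplified analogue only: discover and export a BPMN model with pm4py's
# inductive miner. "orders.xes" and the output path are hypothetical.
import pm4py

log = pm4py.read_xes("orders.xes")
bpmn_model = pm4py.discover_bpmn_inductive(log)

pm4py.write_bpmn(bpmn_model, "orders_discovered.bpmn")
pm4py.view_bpmn(bpmn_model)
```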

Findings

This paper shows that the proposed integrated approach can be applied to real-life event logs of information systems from different domains. The multi-perspective process diagrams obtained within the approach are of good quality and better than models discovered using a technique that does not consider hierarchy. Moreover, due to the decomposition methods applied, the proposed approach can deal with large event logs, which cannot be handled by methods that do not use decomposition.

Originality/value

The paper consolidates various process mining techniques that were never integrated before and presents a novel approach for the discovery of multi-perspective hierarchical BPMN models. This approach bridges the gap between well-known process mining techniques and a wide range of BPMN-compliant tools.

Details

Business Process Management Journal, vol. 25 no. 5
Type: Research Article
ISSN: 1463-7154

Article
Publication date: 21 June 2022

Onur Dogan

Abstract

Purpose

As in many business processes, waiting times are essential in health care processes, especially in obstetrics and the gynecology outpatient department (GOD), because pregnant women may be affected by long waiting times. Since manually created process models present subjective and nonrealistic flows, this study aims to meet the need for an objective and realistic method.

Design/methodology/approach

In this study, the authors investigate time-related bottlenecks in both departments for different doctors by means of process mining. Process mining is a pragmatic form of analysis that obtains meaningful insights from event logs, applying data mining techniques to business process management with a more comprehensive perspective. In this study, process mining makes it possible to automatically create patient flows and to compare them across departments and doctors.
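
As a simplified illustration of a waiting-time comparison of this kind (not the PALIA-based analysis used in the study), the sketch below computes average waits per department, doctor and activity from a flat event log; the file and column names are assumptions.

```python
# Rough waiting-time/bottleneck sketch over an assumed event-log CSV with
# case_id, department, doctor, activity and start_time columns.
import pandas as pd

log = pd.read_csv("outpatient_events.csv", parse_dates=["start_time"])

log = log.sort_values(["case_id", "start_time"])
log["wait_minutes"] = (
    log.groupby("case_id")["start_time"].diff().dt.total_seconds() / 60
)

avg_wait = (
    log.dropna(subset=["wait_minutes"])
    .groupby(["department", "doctor", "activity"])["wait_minutes"]
    .mean()
    .sort_values(ascending=False)
)
print(avg_wait.head(10))  # longest average waits point to bottlenecks
```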

Findings

The study concludes that average waiting times in the GOD are higher than in the obstetrics outpatient department. However, waiting times in the two departments can change inversely for different doctors.

Research limitations/implications

The event log was created with the help of expert opinions because activities in the processes had only a starting timestamp. The ending time of each activity was computed from the average duration of the corresponding activity under a normal distribution.
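
A small sketch of this imputation step, under the stated normal-distribution assumption; the activity names and duration estimates below are invented, not the study's expert values.

```python
# Impute activity end times: start timestamp plus a duration drawn from a
# normal distribution around an (expert-estimated) average. Values are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Hypothetical expert estimates: (mean, std) of duration per activity, in minutes.
DURATIONS = {"registration": (5, 1), "examination": (15, 4), "ultrasound": (20, 5)}

def impute_end(row: pd.Series) -> pd.Timestamp:
    mean, std = DURATIONS.get(row["activity"], (10, 3))
    minutes = max(rng.normal(mean, std), 1.0)  # keep durations positive
    return row["start_time"] + pd.Timedelta(minutes=minutes)

events = pd.DataFrame({
    "activity": ["registration", "examination"],
    "start_time": pd.to_datetime(["2022-01-10 09:00", "2022-01-10 09:20"]),
})
events["end_time"] = events.apply(impute_end, axis=1)
print(events)
```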

Originality/value

This study focuses on administrative (nonclinical) health processes in obstetrics and the GOD. It uses a parallel activity log inference algorithm (PALIA) to produce process trees by handling duplicate activities. Infrequent information in health processes can carry critical information about the patient. In contrast to many discovery algorithms, PALIA considers infrequent activities in the event log to extract meaningful information.

1 – 10 of over 14000