Search results
1 – 10 of 18
Tulsi Pawan Fowdur and Ashven Sanghan
Abstract
Purpose
The purpose of this paper is to develop a blockchain-based data capture and transmission system that will collect real-time power consumption data from a household electrical appliance and transfer it securely to a local server for energy analytics such as forecasting.
Design/methodology/approach
The data capture system is composed of two current transformer (CT) sensors connected to two different electrical appliances. The CT sensors send the power readings to two Arduino microcontrollers, which in turn connect to a Raspberry Pi that aggregates the data. Blockchain is then enabled on the Raspberry Pi through a Java API so that the data are transmitted securely to a server. The server provides real-time visualization of the data as well as prediction using the multi-layer perceptron (MLP) and long short-term memory (LSTM) algorithms.
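The abstract describes the blockchain layer only at a high level (it is enabled through a Java API). Purely as an illustrative sketch, with the function names and block structure assumed rather than taken from the paper, a hash-chained sequence of power readings might look like:

```python
import hashlib
import json

def make_block(readings, prev_hash):
    """Bundle a list of power readings (watts) with the previous block's hash."""
    payload = json.dumps({"readings": readings, "prev": prev_hash}, sort_keys=True)
    return {"readings": readings, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def build_chain(stream, block_size):
    """Split a reading stream into blocks of `block_size` and chain them."""
    chain = []
    prev = "0" * 64  # genesis placeholder
    for i in range(0, len(stream), block_size):
        block = make_block(stream[i:i + block_size], prev)
        chain.append(block)
        prev = block["hash"]
    return chain

def is_valid(chain):
    """Tampering with any block breaks every subsequent hash link."""
    prev = "0" * 64
    for block in chain:
        payload = json.dumps({"readings": block["readings"], "prev": block["prev"]},
                             sort_keys=True)
        if block["prev"] != prev or \
           hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True
```

A smaller `block_size` yields more blocks and hence more hash links per reading, which is one intuition behind the paper's finding that smaller blocks improve security.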
Findings
The results of the blockchain analysis demonstrate that when the data readings are transmitted in smaller blocks, security is much greater than with blocks of larger size. To assess the accuracy of the prediction algorithms, data were collected over a 20-min interval to train the models, and the algorithms were evaluated using the sliding window approach. The mean absolute percentage error (MAPE) was used to assess accuracy, and MAPEs of 1.62% and 1.99% were obtained for the LSTM and MLP algorithms, respectively.
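MAPE and sliding-window evaluation are standard techniques; a minimal sketch (window length and forecaster are assumptions, not the paper's configuration):

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - p) / abs(a)
                       for a, p in zip(actual, predicted)) / len(actual)

def sliding_window_eval(series, window, forecast):
    """Slide a fixed-length window over the series, forecast one step
    ahead at each position, and score the forecasts with MAPE."""
    actual, predicted = [], []
    for i in range(len(series) - window):
        history = series[i:i + window]
        predicted.append(forecast(history))
        actual.append(series[i + window])
    return mape(actual, predicted)
```

For example, `sliding_window_eval(readings, window=5, forecast=lambda h: h[-1])` scores a naive last-value forecaster; the paper's LSTM and MLP models would take the place of `forecast`.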
Originality/value
A detailed performance analysis of the blockchain-based transmission model using time complexity, throughput and latency as well as energy forecasting has been performed.
Petra Pekkanen and Timo Pirttilä
Abstract
Purpose
The aim of this study is to empirically explore and analyze the concrete tasks of output measurement and the inherent challenges related to these tasks in a traditional and autonomous professional public work setting – the judicial system.
Design/methodology/approach
The analysis of the tasks is based on a categorization of general performance measurement motives (control-motivate-learn) and main stakeholder levels (society-organization-professionals). The analysis is exploratory and conducted as an empirical content analysis on materials and reports produced in two performance improvement projects conducted in European justice organizations.
Findings
The identified main tasks in the different categories are related to managing resources, controlling performance deviations, and encouraging improvement and development of performance. Based on the results, the key improvement areas for output measurement in professional public organizations relate to improving objectivity and fairness in budgeting and work allocation practices, improving the versatility and informativeness of output measures to highlight motivational and learning purposes, strengthening professional self-management in setting output targets and producing outputs, and improving organizational learning from the output measurement.
Practical implications
The paper presents empirically founded practical examples of challenges and improvement opportunities related to the tasks of output measurement in professional public organizations.
Originality/value
This paper fulfils an identified need to study how general performance management motives are realized as concrete tasks of output measurement in justice organizations.
C. Bharanidharan, S. Malathi and Hariprasath Manoharan
Abstract
Purpose
The potential of vehicle ad hoc networks (VANETs) to improve driver and passenger safety and security has made them a hot topic in the field of intelligent transportation systems (ITSs). VANETs have different characteristics and system architectures from mobile ad hoc networks (MANETs), with a primary focus on vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication. However, protecting VANETs from malicious attacks is crucial, because such attacks can undermine network security and safety.
Design/methodology/approach
The black hole attack is a well-known danger to VANETs. It occurs when a hostile node introduces phony routing tables into the network, potentially damaging it and interfering with communication. A secure ad hoc on-demand distance vector (AODV) routing protocol has been developed in response to this issue. By adding cryptographic features for source and target node verification to the route request (RREQ) and route reply (RREP) packets, this protocol improves upon the original AODV routing system.
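The abstract does not specify the cryptographic scheme used. As a hedged stand-in only, a shared-key HMAC over the RREQ/RREP header fields illustrates the general idea of verifying source and target nodes (the key handling and field names below are assumptions, not the authors' design):

```python
import hashlib
import hmac

SHARED_KEY = b"demo-network-key"  # placeholder; a real VANET would use per-node keys or a PKI

def sign_packet(packet_type, source, dest, seq_no, key=SHARED_KEY):
    """Attach an authentication tag to an RREQ/RREP control packet."""
    msg = f"{packet_type}|{source}|{dest}|{seq_no}".encode()
    tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return {"type": packet_type, "src": source, "dst": dest, "seq": seq_no, "tag": tag}

def verify_packet(packet, key=SHARED_KEY):
    """A black hole node forging or altering an RREP without the key fails this check."""
    msg = f"{packet['type']}|{packet['src']}|{packet['dst']}|{packet['seq']}".encode()
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, packet["tag"])
```

A receiving node drops any control packet whose tag does not verify, so a malicious node cannot advertise a fresh route it does not hold.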
Findings
Through the use of cryptographic encryption and decryption techniques, the suggested method fortifies the VANET connection. In addition, other network metrics are taken into account to assess the effectiveness of the secure AODV routing protocol under black hole attacks, including packet loss, end-to-end latency, packet delivery ratio (PDR) and routing request overhead. Results from simulations using the NS-2.33 simulator show how well the proposed method enhances system performance and lessens the effects of black hole attacks on VANETs.
Originality/value
All things considered, the secure AODV routing protocol provides a strong method for improving security and dependability in VANET systems, protecting against malicious attacks and guaranteeing smooth communication between vehicles and infrastructure.
Armando Calabrese, Antonio D'Uffizi, Nathan Levialdi Ghiron, Luca Berloco, Elaheh Pourabbas and Nathan Proudlove
Abstract
Purpose
The primary objective of this paper is to show a systematic and methodological approach for the digitalization of critical clinical pathways (CPs) within the healthcare domain.
Design/methodology/approach
The methodology entails the integration of service design (SD) and action research (AR) methodologies, characterized by iterative phases that systematically alternate between action and reflective processes, fostering cycles of change and learning. Within this framework, stakeholders are engaged through semi-structured interviews, while the existing and envisioned processes are delineated and represented using BPMN 2.0. These methodological steps emphasize the development of an autonomous, patient-centric web application alongside the implementation of an adaptable and patient-oriented scheduling system. Also, business process simulation is employed to measure key performance indicators of processes and test potential improvements. This method is implemented in the context of the CP addressing transient loss of consciousness (TLOC), within a publicly funded hospital setting.
Findings
The methodology integrating SD and AR enables the detection of pivotal bottlenecks within diagnostic CPs and proposes optimal corrective measures to ensure uninterrupted patient care, all the while advancing the digitalization of diagnostic CP management. This study contributes to theoretical discussions by emphasizing the criticality of process optimization, the transformative potential of digitalization in healthcare and the paramount importance of user-centric design principles, and offers valuable insights into healthcare management implications.
Originality/value
The study’s relevance lies in its ability to enhance healthcare practices without necessitating disruptive and resource-intensive process overhauls. This pragmatic approach aligns with the imperative for healthcare organizations to improve their operations efficiently and cost-effectively.
Elena Stefana, Paola Cocca, Federico Fantori, Filippo Marciano and Alessandro Marini
Abstract
Purpose
This paper aims to overcome the inability of both comparing loss costs and accounting for production resource losses of Overall Equipment Effectiveness (OEE)-related approaches.
Design/methodology/approach
The authors conducted a literature review about the studies focusing on approaches combining OEE with monetary units and/or resource issues. The authors developed an approach based on Overall Equipment Cost Loss (OECL), introducing a component for the production resource consumption of a machine. A real case study about a smart multicenter three-spindle machine is used to test the applicability of the approach.
Findings
The paper proposes Resource Overall Equipment Cost Loss (ROECL), i.e. a new KPI expressed in monetary units that represents the total cost of losses (including production resource ones) caused by inefficiencies and deviations of the machine or equipment from its optimal operating status over a specific time period. ROECL makes it possible to quantify the variation in product cost that occurs when a machine or equipment changes its health status, and to determine the actual product cost for a given production order. In the analysed case study, the most critical production orders showed an actual production cost about 60% higher than the minimal cost possible under the most efficient operating conditions.
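The abstract does not give the ROECL formula itself. The following is only an assumed illustration of a KPI in this spirit, summing time, quality and production-resource losses in monetary units (all parameter names and cost categories are assumptions, not the authors' formulation):

```python
def roecl(availability_loss_h, performance_loss_h, quality_loss_units,
          machine_cost_rate, unit_cost, resource_waste):
    """Illustrative ROECL-style total cost of losses, in monetary units.
    Time losses are costed at the machine hourly rate, scrapped units at
    product cost, and wasted production resources (e.g. energy, coolant)
    at their own unit prices.
    resource_waste: list of (quantity, price_per_unit) pairs."""
    time_cost = (availability_loss_h + performance_loss_h) * machine_cost_rate
    quality_cost = quality_loss_units * unit_cost
    resource_cost = sum(qty * price for qty, price in resource_waste)
    return time_cost + quality_cost + resource_cost
```

Evaluating such a total per production order, under the machine's actual versus optimal health status, is what lets the cost gap (about 60% in the case study) be expressed directly in money rather than in an efficiency percentage.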
Originality/value
The proposed approach may support both production and cost accounting managers during the identification of areas requiring attention and representing opportunities for improvement in terms of availability, performance, quality, and resource losses.
Patrik Jonsson, Johan Öhlin, Hafez Shurrab, Johan Bystedt, Azam Sheikh Muhammad and Vilhelm Verendel
Abstract
Purpose
This study aims to explore and empirically test variables influencing material delivery schedule inaccuracies.
Design/methodology/approach
A mixed-method case approach is applied. Explanatory variables are identified from the literature and explored in a qualitative analysis at an automotive original equipment manufacturer. Using logistic regression and random forest classification models, quantitative data (historical schedule transactions and internal data) enables the testing of the predictive difference of variables under various planning horizons and inaccuracy levels.
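The study tests logistic regression (alongside random forest classification) on historical schedule data. A minimal from-scratch logistic-regression sketch, with feature names assumed purely for illustration (the study's actual variables and data are not reproduced here):

```python
import math

def train_logistic(X, y, lr=0.1, epochs=500):
    """Plain stochastic-gradient-descent logistic regression.
    X: feature vectors (e.g. planning-horizon days, product-complexity
    score, item order-life-cycle age -- names are assumptions).
    y: 0/1 labels (schedule accurate / inaccurate)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid
            err = p - yi                    # gradient of log-loss w.r.t. z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Classify a schedule line as inaccurate (1) or accurate (0)."""
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0
```

Fitting the same data under different planning horizons and inaccuracy thresholds, as the study does, then amounts to retraining on relabelled targets and comparing which coefficients stay significant.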
Findings
The effects on delivery schedule inaccuracies are contingent on a decoupling point, and a variable may have a combined amplifying (complexity generating) and stabilizing (complexity absorbing) moderating effect. Product complexity variables are significant regardless of the time horizon, and the item’s order life cycle is a significant variable with predictive differences that vary. Decoupling management is identified as a mechanism for generating complexity absorption capabilities contributing to delivery schedule accuracy.
Practical implications
The findings provide guidelines for exploring and finding patterns in specific variables to reduce material delivery schedule inaccuracies, and provide input into predictive forecasting models.
Originality/value
The findings contribute to explaining material delivery schedule variations, identifying potential root causes and moderators, empirically testing and validating effects and conceptualizing features that cause and moderate inaccuracies in relation to decoupling management and complexity theory literature.
Namal Bandaranayake, Senevi Kiridena and Asela K. Kulatunga
Abstract
Purpose
Achieving a swift and even flow of cargo through the border, the ultimate objective of cross-border logistics (CBL), requires the close coordination and collaboration of a multitude of stakeholders, as well as optimally configured systems. To achieve and sustain competitiveness in a dynamic international trade environment, CBL processes must undergo periodic analysis, improvement and optimization. This study aims to develop a modelling framework to capture CBL processes for analysis and improvement.
Design/methodology/approach
Relying on the extant literature, a meta-model is developed incorporating the significant perspectives required to model CBL processes. Popular process modelling notations are evaluated against the meta-model, and their ease of comprehension is also assessed. The notation selected through this evaluation is augmented with addendums for a comprehensive depiction of CBL processes.
Findings
The capacity of role activity diagrams (RADs) to depict all perspectives, including interactions, in a single diagram makes them particularly suitable for modelling CBL processes. RADs have been complemented with physical flow diagrams and methods to capture the temporal dimension, enabling a comprehensive view of CBL processes and laying the foundation for insightful analysis.
Research limitations/implications
The meta-model developed in this paper paves the way to develop an analysis framework which requires further research.
Originality/value
The lack of well-accepted modelling notations for studying CBL processes prompts researchers to search and adapt different formalisms. This study has filled this gap by proposing a comprehensive modelling framework able to capture CBL processes at different granularities in rich detail. Not only does the developed meta-model aid in selecting the notation, it is also useful in analysing the constituent elements of CBL processes.
Ibtissem Alguirat, Fatma Lehyani and Alaeddine Zouari
Abstract
Purpose
Lean management tools are becoming increasingly applied in different types of organizations around the world. These tools have shown their significant contribution to improving business performance. In this vein, the purpose of this paper is to examine the influence of lean management on both occupational safety and operational excellence in Tunisian companies.
Design/methodology/approach
A survey was conducted among Tunisian companies, resulting in the collection of 62 responses that were analyzed using SPSS software. In addition, a conceptual model linking the practices of the three basic concepts was designed to highlight the hypotheses of the research. Subsequently, factor analysis and structural equation modelling were conducted to assess the validity of the assumptions.
Findings
The results obtained have shown that lean management has a significant impact on occupational safety. Similarly, occupational safety has a significant impact on operational excellence. However, lean management does not have a significant impact on operational excellence.
Originality/value
This work highlighted the involvement of managers of small and medium-sized enterprises from emerging economies in the studied concepts’ practices. Likewise, it testified to the impacts of lean management on occupational safety and operational excellence in the Tunisian context.
Renan Ribeiro Do Prado, Pedro Antonio Boareto, Joceir Chaves and Eduardo Alves Portela Santos
Abstract
Purpose
The aim of this paper is to explore the possibility of using the Define-Measure-Analyze-Improve-Control (DMAIC) cycle, process mining (PM) and multi-criteria decision methods in an integrated way, so that these three elements combined result in a methodology called the Agile DMAIC cycle, which brings greater agility and reliability to the execution of the Six Sigma process.
Design/methodology/approach
The approach taken by the authors in this study was to analyze the studies arising from this union of concepts, to focus on using PM tools where appropriate to accelerate the DMAIC cycle by improving its first two steps, and to test the analytic hierarchy process (AHP) as a decision-making method, bringing greater reliability to the definition of indicators.
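The AHP weighting step can be sketched with the standard row-geometric-mean approximation of the priority vector (the pairwise comparison values used below are assumptions for illustration, not the study's judgements):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison
    matrix using the row geometric mean, a standard approximation of
    the principal eigenvector."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]
```

For example, for two candidate indicators where the first is judged three times as important, `ahp_weights([[1, 3], [1/3, 1]])` yields weights of 0.75 and 0.25; ranking indicators by such weights is how AHP adds reliability to indicator definition in the Define and Measure steps.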
Findings
The results indicated a gain from the indicators and process maps generated by PM, and the AHP provided greater accuracy in determining the importance of the indicators.
Practical implications
Through the results and findings of this study, more organizations can understand the potential of integrating Six Sigma and PM. The methodology was developed only for the first two steps of the DMAIC cycle, but it is replicable for any Six Sigma project where data acquisition through mining is possible.
Originality/value
The authors develop a fully applicable and understandable methodology which can be replicated in other settings and expanded in future research.
Zahra Borghei, Martina Linnenluecke and Binh Bui
Abstract
Purpose
This paper aims to explore current trends in how companies disclose climate-related risks and opportunities in their financial statements. As part of the authors’ analysis, they examine: whether forward-looking assumptions and judgements are typically considered in reporting climate-related risks/opportunities; whether there are differences in the reporting practices of firms in carbon-intensive industries versus non-carbon-intensive industries; and whether negative media reports have an influence on the levels of disclosure a firm makes.
Design/methodology/approach
The authors chose content analysis as their methodology and examined the financial statements published by firms listed on the UK’s FTSE 100 between 2016 and 2020. This analysis is framed by Suchman’s three dimensions of legitimacy, namely pragmatic, cognitive and moral.
Findings
Climate-related disclosures in the notes and financial accounts of these firms did increase over the period. Yet, overall, the level of disclosure was inadequate and the quality was inconsistent. From this, the authors conclude that pragmatic legitimacy is not a particularly strong driving factor in compelling organisations to disclose climate-related information. The firms in carbon-intensive industries do provide greater levels of disclosure, including both qualitative and quantitative (monetary) content, which is consistent with cognitive legitimacy. However, from a moral legitimacy perspective, this study finds that firms did not adapt responsively to negative media coverage as a way of reflecting their accountability to broader public norms and values. Overall, this analysis suggests that regulatory enforcement and a systematic reporting framework with adequate guidance will be critical to developing transparent climate-related reporting in future.
Originality/value
This paper contributes to existing studies on climate-related disclosures, which have mainly examined the “front-half” of annual reports. Conversely, this study aims to shed light on these practices in the “back-half” of these reports, exploring the underlying reasons for reporting climate-related risks and opportunities in financial accounts. The authors’ insights into the current disclosure practices make a theoretical contribution to the literature. Practitioners can also draw on these insights to improve how they report on climate-related risks and opportunities in their financial statements.