Search results

1 – 10 of over 25000
Article
Publication date: 28 November 2022

Anuraj Mohan, Karthika P.V., Parvathi Sankar, K. Maya Manohar and Amala Peter

Abstract

Purpose

Money laundering is the process of concealing unlawfully obtained funds by presenting them as coming from a legitimate source. Criminals use crypto money laundering to hide the illicit origin of funds using a variety of methods. The most simplified form of bitcoin money laundering relies on the fact that transactions made in cryptocurrencies are pseudonymous, but open data gives more power to investigators and enables the crowdsourcing of forensic analysis. Various rules, policies and technologies, collectively known as anti-money laundering (AML) tools, exist to curb these illegal activities. When properly implemented, AML restrictions reduce the negative effects of illegal economic activity while also promoting financial market integrity and stability, but they impose high costs on institutions. The purpose of this work is to motivate the opportunity to reconcile the cause of safety with that of financial inclusion, bearing in mind the limitations of the available data. The authors use the Elliptic dataset; to the best of the authors' knowledge, this is the largest labelled transaction dataset publicly available in any cryptocurrency.

Design/methodology/approach

AML in bitcoin can be modelled as a node classification task in dynamic networks. In this work, a graph convolutional decision forest is introduced, which combines the strengths of an evolving graph convolutional network and a deep neural decision forest (DNDF). This model is used to classify the unknown transactions in the Elliptic dataset. Additionally, applying knowledge distillation (KD) to the proposed approach yields the best results among all the techniques evaluated.
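The knowledge distillation step mentioned above can be illustrated with a minimal sketch. This is not the authors' implementation; it only shows the standard temperature-scaled soft-target loss that KD frameworks typically use (function names and the temperature value are illustrative assumptions):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher temperature gives softer distributions.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the softened teacher and student distributions,
    # scaled by T^2 as in the usual distillation formulation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl
```

In a full KD setup, this soft-target term would be mixed with the ordinary cross-entropy on the true labels when training the student model.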

Findings

This work demonstrates the importance of combining dynamic graph learning with ensemble feature learning. The results show the superiority of the proposed model in classifying the illicit transactions in the Elliptic dataset. Experiments also show that the results can be further improved when the system is fine-tuned using a KD framework.

Originality/value

Existing works used either ensemble learning or dynamic graph learning to tackle the problem of AML in bitcoin. The proposed model provides a novel view to combine the power of random forest with dynamic graph learning methods. Furthermore, the work also demonstrates the advantage of KD in improving the performance of the whole system.

Details

Data Technologies and Applications, vol. 57 no. 3
Type: Research Article
ISSN: 2514-9288

Article
Publication date: 15 September 2023

Suzan Alaswad and Sinan Salman

Abstract

Purpose

While steady-state analysis is useful, it does not consider the inherent transient characteristics of repairable systems' behavior, especially in systems that have relatively short life spans or whose transient behavior is of special concern, such as the military systems used as the motivating example in this paper. Therefore, a maintenance policy that considers both transient and steady-state availability and aims to achieve the best trade-off between high steady-state availability and rapid stabilization is essential.

Design/methodology/approach

This paper studies the transient behavior of system availability under the Kijima Type II virtual age model. While such systems achieve steady-state availability, and it has been proved that deploying preventive maintenance (PM) can significantly improve steady-state availability, this improvement often comes at the price of longer and more strongly fluctuating transient behavior, which affects overall system performance. The authors present a methodology that identifies the optimal PM policy achieving the best trade-off between high steady-state availability and rapid stabilization, based on cost-availability analysis.
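The Kijima Type II virtual age recursion that underlies this model can be sketched as follows. This is a generic illustration of the recursion, not the authors' simulation code, and the function and parameter names are assumptions:

```python
def kijima_type2_virtual_age(operating_times, restoration_factor):
    # Kijima Type II: each repair rescales the whole accumulated age,
    #   V_n = q * (V_{n-1} + X_n),
    # where X_n is the n-th operating time and q in [0, 1] is the
    # restoration factor (q = 0: good as new; q = 1: minimal repair).
    v = 0.0
    ages = []
    for x in operating_times:
        v = restoration_factor * (v + x)
        ages.append(v)
    return ages
```

A simulation-based availability study would draw the operating times from a lifetime distribution (e.g. Weibull) whose hazard depends on the current virtual age.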

Findings

When the proposed simulation-based optimization and cost analysis methodology is applied to the motivating example, it produces an optimal PM policy that achieves an availability–variability balance between transient and steady-state system behaviors. The optimal PM policy produces a notably lower availability coefficient of variation (by 11.5%), while at the same time suffering a negligible limiting availability loss of only 0.3%. The new optimal PM policy also provides cost savings of about 5% in total maintenance cost. The performed sensitivity analysis shows that the system's optimal maintenance cost is sensitive to the repair time, the shape parameter of the Weibull distribution and the downtime cost, but is robust with respect to changes in the remaining parameters.

Originality/value

Most of the current maintenance models emphasize the steady-state behavior of availability and neglect its transient behavior. For some systems, using steady-state availability as the sole metric for performance is not adequate, especially in systems that have relatively short life spans or when their transient behavior affects the overall performance. However, little work has been done on the transient analysis of such systems. In this paper, the authors aim to fill this gap by emphasizing such systems and applications where transient behavior is of critical importance to efficiently optimize system performance. The authors use military systems as a motivating example.

Details

International Journal of Quality & Reliability Management, vol. 41 no. 2
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 15 August 2023

Walaa AlKhader, Raja Jayaraman, Khaled Salah, Andrei Sleptchenko, Jiju Antony and Mohammed Omar

Abstract

Purpose

Quality 4.0 (Q4.0) leverages new emerging technologies to achieve operational excellence and enhance performance. Implementing Q4.0 in digital manufacturing can bring about reliable, flexible and decentralized manufacturing. Emerging technologies such as Non-Fungible Tokens (NFTs), Blockchain and Interplanetary File Storage (IPFS) can all be utilized to realize Q4.0 in digital manufacturing. NFTs, for instance, can provide traceability and property ownership management and protection. Blockchain provides secure and verifiable transactions in a manner that is trusted, immutable and tamper-proof. This research paper aims to explore the concept of Q4.0 within digital manufacturing systems and provide a novel solution based on Blockchain and NFTs for implementing Q4.0 in digital manufacturing.

Design/methodology/approach

This study reviews the relevant literature and presents a detailed system architecture, along with a sequence diagram that demonstrates the interactions between the various participants. To implement a prototype of the authors' system, the authors next develop multiple Ethereum smart contracts and test the algorithms designed. Then, the efficacy of the proposed system is validated through an evaluation of its cost-effectiveness and security parameters. Finally, this research provides other potential applications and scenarios across diverse industries.

Findings

The proposed solution's smart contracts governing the transactions among the participants were implemented successfully. Furthermore, the authors' analysis indicates that the authors' solution is cost-effective and resilient against commonly known security attacks.

Research limitations/implications

This study represents a pioneering endeavor in the exploration of the potential applications of NFTs and blockchain in the attainment of a comprehensive quality framework (Q4.0) in digital manufacturing. Presently, the body of research on quality control or assurance in digital manufacturing is limited in scope, primarily focusing on the products and production processes themselves. However, this study examines the other vital elements, including management, leadership and intra- and inter-organizational relationships, which are essential for manufacturers to achieve superior performance and optimal manufacturing outcomes.

Practical implications

To facilitate the achievement of Q4.0 and empower manufacturers to attain outstanding quality and gain significant competitive advantages, the authors propose the integration of Blockchain and NFTs into the digital manufacturing framework, with all related processes aligned with an organization's strategic and leadership objectives.

Originality/value

This study represents a pioneering endeavor in the exploration of the potential applications of NFTs and blockchain in the attainment of a comprehensive quality framework (Quality 4.0) in digital manufacturing. Presently, the body of research on quality control or assurance in digital manufacturing is limited in scope, primarily focusing on the products and production processes themselves. However, this study examines the other vital elements, including management, leadership and intra- and inter-organizational relationships, which are essential for manufacturers to achieve superior performance and optimal manufacturing outcomes.

Details

Journal of Manufacturing Technology Management, vol. 34 no. 7
Type: Research Article
ISSN: 1741-038X

Article
Publication date: 9 February 2024

Jeffrey W. Alstete and Heidi Flavian

Abstract

Purpose

This study aims to investigate basic/core principles and practical tools behind successful manuscript writing for education journals. Drawing on the insights of journal editors and related literature, this paper seeks to clarify the craft of preparing quality manuscripts to meet the expectations of academic journals.

Design/methodology/approach

This paper uses an interpretivist framework by incorporating a qualitative analysis of the literature with the authors’ experiences to identify key principles and issues in academic publishing. These narratives provide an empirical basis for understanding the mechanics and essence of effective manuscript crafting. The study integrates theoretical knowledge with actionable strategies, focusing on identifying the objectives and processes of writing, determining common challenges and directing readers toward comprehensive resources for guidance in article writing.

Findings

This study reveals that manuscript rejections often transcend technical shortcomings. Issues central to nonacceptance include misalignment with a journal’s thematic focus, absence of a coherent and persuasive argument, methodological weaknesses and insufficient evidence underpinning the assertions. Successful publication depends not just on data presentation and adherence to submission norms but also on developing a narrative that enriches the prevailing scholarly discourse. The findings advocate for manuscripts that strike an appropriate balance between lucidity and analytical rigor, avoid superfluous technical language and express a mix of assertiveness and scholarly modesty.

Originality/value

Although there is literature on academic writing, few recent articles probe the intricacies of crafting education manuscripts and point readers to resources.

Details

Quality Assurance in Education, vol. 32 no. 2
Type: Research Article
ISSN: 0968-4883

Open Access
Article
Publication date: 21 July 2023

Erika Alves dos Santos, Silvio Peroni and Marcos Luiz Mucheroni

Abstract

Purpose

In this study, the authors want to identify current possible causes of citing and referencing errors in scholarly literature and to assess whether anything has changed since the snapshot provided by Sweetland in his 1989 paper.

Design/methodology/approach

The authors analysed reference elements, i.e. bibliographic references, mentions, quotations and respective in-text reference pointers, from 729 articles published in 147 journals across the 27 subject areas.

Findings

The outcomes of the analysis show that bibliographic errors have been perpetuated for decades and that their possible causes have increased, despite the encouraged use of technological aids such as reference managers.

Originality/value

As far as the authors know, this study is the best recent analysis available of errors in referencing and citing practices in the literature since Sweetland (1989).

Details

Journal of Documentation, vol. 79 no. 7
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 12 December 2022

Afshin Yaghoubi and Seyed Taghi Akhavan Niaki

Abstract

Purpose

One of the common approaches to improving system reliability is standby redundancy. Although many works on the applications of standby redundancy are available in the literature, the system components are usually assumed to be independent of each other. In reality, however, the system components can depend on one another, so the failure of one component affects the failure rate of the remaining active components. In this paper, a standby two-unit system is considered, assuming a dependency between the switch and its associated active component.

Design/methodology/approach

This paper assumes that the failures of the switch and its associated active component follow the Marshall–Olkin bivariate exponential distribution. The reliability analysis of the system is then carried out using the continuous-time Markov chain method.
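The Marshall–Olkin construction referenced above can be sketched via three independent exponential shock times: one hitting only the switch, one hitting only the active unit, and a common shock hitting both, which is what induces the dependence. This is a generic illustration, not the paper's model code, and the names are hypothetical:

```python
import random

def marshall_olkin_from_shocks(z_switch, z_unit, z_common):
    # Each component fails at the earlier of its own shock and the
    # common shock, so the two lifetimes are positively dependent.
    return min(z_switch, z_common), min(z_unit, z_common)

def sample_marshall_olkin(lam1, lam2, lam12, rng=random):
    # Draw the three independent exponential shock times and combine them.
    return marshall_olkin_from_shocks(
        rng.expovariate(lam1), rng.expovariate(lam2), rng.expovariate(lam12))
```

With this construction, the marginal lifetimes are exponential with rates lam1 + lam12 and lam2 + lam12, respectively.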

Findings

The application of the derived equations to determine the system's steady-state availability and reliability, together with a sensitivity analysis of the mean time to failure, is demonstrated using a numerical illustration.

Originality/value

All previous models assumed independence between the switch and the associated active unit in the standby redundancy approach. In this paper, the switch and its associated component are assumed to be dependent on each other.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 6
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 21 December 2023

Majid Rahi, Ali Ebrahimnejad and Homayun Motameni

Abstract

Purpose

Given the current human need for agricultural produce such as rice, which requires water for growth, the optimal consumption of this valuable liquid is important. Unfortunately, the traditional use of water by humans for agricultural purposes contradicts the concept of optimal consumption. Therefore, designing and implementing a mechanized irrigation system is of the highest importance. Such a system includes hardware equipment such as liquid altimeter sensors, valves and pumps, whose failures are an integral part of operation and cause faults in the system. Naturally, these faults occur at random time intervals, and an exponential probability distribution is used to simulate these intervals. Thus, such high-cost systems should be evaluated during the design phase, before implementation.

Design/methodology/approach

The proposed approach included two main steps: offline and online. The offline phase included the simulation of the studied system (i.e. the irrigation system of paddy fields) and the acquisition of a data set for training machine learning algorithms such as decision trees to detect, locate (classification) and evaluate faults. In the online phase, C5.0 decision trees trained in the offline phase were used on a stream of data generated by the system.
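The offline-train/online-classify split described above can be illustrated with a toy stand-in for the trained trees: a single decision stump fitted over one sensor feature, then applied to a stream of readings. The real system uses full C5.0 decision trees on multi-feature simulation data; the functions below are illustrative assumptions only:

```python
def train_stump(readings, fault_labels):
    # Offline phase: fit a single-feature decision stump by choosing the
    # threshold that minimises misclassifications on the training data
    # (a toy stand-in for training a C5.0 tree on the simulated data set).
    best = None
    for t in sorted(set(readings)):
        errors = sum(int((x >= t) != y) for x, y in zip(readings, fault_labels))
        if best is None or errors < best[1]:
            best = (t, errors)
    return best[0]

def classify_stream(threshold, stream):
    # Online phase: apply the trained rule to a stream of sensor readings,
    # flagging each reading as faulty (1) or normal (0).
    return [int(x >= threshold) for x in stream]
```

In the paper's framework, the offline phase would additionally locate the faulty component and evaluate fault tolerance, with a separate trained model per task.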

Findings

The proposed approach is a comprehensive online, component-oriented method that combines supervised machine learning techniques to investigate system faults. Each technique is treated as a component, determined by the dimensions and complexity of the case study (fault detection, classification and fault-tolerance evaluation). These components are arranged in a process framework, and the appropriate technique for each component is selected by comparison with other machine learning methods, so that, depending on the conditions under study, the most efficient method is chosen for each component. Before the system implementation phase, reliability is checked by evaluating the predicted faults during the design phase; the approach thus avoids building a high-risk system. Compared to existing methods, the proposed approach is more comprehensive and offers greater flexibility.

Research limitations/implications

As the dimensions of the problem expand, the model verification space grows exponentially when automata are used.

Originality/value

Unlike the existing methods that only examine one or two aspects of fault analysis such as fault detection, classification and fault-tolerance evaluation, this paper proposes a comprehensive process-oriented approach that investigates all three aspects of fault analysis concurrently.

Details

International Journal of Intelligent Computing and Cybernetics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1756-378X

Open Access
Article
Publication date: 27 December 2021

Hristo Trifonov and Donal Heffernan

Abstract

Purpose

The purpose of this paper is to describe how emerging open standards are replacing traditional industrial networks. Current industrial Ethernet networks are not interoperable, thus limiting the potential capabilities for the Industrial Internet of Things (IIoT). There is no forthcoming new-generation fieldbus standard to integrate into the IIoT and Industry 4.0 revolution. The open platform communications unified architecture (OPC UA) with time-sensitive networking (TSN) is a potential vendor-independent successor technology for the factory network. OPC UA is a data exchange standard for industrial communication, and TSN is an Institute of Electrical and Electronics Engineers standard for Ethernet that supports real-time behaviour. Merging these open standard solutions can facilitate cross-vendor interoperability for Industry 4.0 and IIoT products.

Design/methodology/approach

A brief review of the history of the fieldbus standards is presented, which highlights the shortcomings for current industrial systems in meeting converged traffic solutions. An experimental system for the OPC UA TSN is described to demonstrate an approach to developing a three-layer factory network system with an emphasis on the field layer.

Findings

From the multitude of existing industrial network schemes, there is a convergence pathway in solutions based on TSN Ethernet and OPC UA. At the field level, basic timing measurements in this paper show that the OPC UA TSN can meet the basic critical timing requirements for a fieldbus network.

Originality/value

This paper uniquely focuses on the fieldbus standards elements of industrial network evolution and traces developments from the early history to the current integration in the IIoT context.

Details

International Journal of Pervasive Computing and Communications, vol. 19 no. 3
Type: Research Article
ISSN: 1742-7371

Open Access
Article
Publication date: 20 February 2024

Elvis Achuo, Bruno Emmanuel Ongo Nkoa, Nembo Leslie Ndam and Njimanted G. Forgha

Abstract

Purpose

Despite the longstanding male dominance in the socio-politico-economic spheres, recent decades have witnessed remarkable improvements in gender inclusion. Although the issue of gender inclusion has been widely documented, answers to the question of whether institutional arrangements and information technology shape gender inclusion remain contentious. This study, therefore, empirically examines the effects of institutional quality and ICT penetration on gender inclusion on a global scale.

Design/methodology/approach

To control for the endogeneity of the modeled variables and the cross-sectional dependence inherent in large panel datasets, the study employs the Driscoll-Kraay Fixed Effects (DKFE) and system Generalised Method of Moments (GMM) estimators for a panel of 142 countries from 1996 to 2020.

Findings

The empirical findings from the DKFE and system GMM estimators reveal that strong institutions significantly enhance gender inclusion. Moreover, by disaggregating institutional quality into various governance indicators, the authors show that besides corruption control, which has a positive but insignificant effect on women’s empowerment, the other governance indicators significantly enhance gender inclusion. Furthermore, there is evidence that various ICT measures promote gender inclusion.

Practical implications

The study results suggest that policymakers in developing countries should implement stringent measures to curb corruption. Policymakers in low-income countries should also facilitate women’s access to ICTs, for instance by creating and equipping ICT training centers and making them accessible to all categories of women. Furthermore, developed countries with high-tech knowledge could help developing countries by organizing free training workshops and sensitization campaigns on the use of ICTs for women’s empowerment in various fields of life.

Originality/value

The present study fills a significant research gap by comprehensively exploring the nexuses between governance, ICT penetration, and the socio-politico-economic dimensions of gender inclusion from a global perspective. Besides the paucity of studies in this regard, the few existing studies have focused on region- or country-specific cases in developed or developing economies. Moreover, this study is timely, given the importance placed on gender inclusion (SDG5), quality of institutions (SDG16), and ICT penetration (SDG9) in the 2015–2030 global development agenda.

Details

Journal of Economics and Development, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1859-0020

Article
Publication date: 20 February 2023

Zakaria Sakyoud, Abdessadek Aaroud and Khalid Akodadi

Abstract

Purpose

The main goal of this research work is the optimization of the purchasing business process in the Moroccan public sector in terms of transparency and budgetary optimization. The authors have worked on the public university as an implementation field.

Design/methodology/approach

The design of the research work followed the design science research (DSR) methodology for information systems. DSR is a research paradigm wherein a designer answers questions relevant to human problems through the creation of innovative artifacts, thereby contributing new knowledge to the body of scientific evidence. The authors have adopted a techno-functional approach. The technical part consists of the development of an intelligent recommendation system that supports the choice of optimal information technology (IT) equipment for decision-makers. This intelligent recommendation system relies on a set of functional and business concepts, namely the Moroccan normative laws and Control Objectives for Information and Related Technology's (COBIT) guidelines in information system governance.

Findings

The modeling of business processes in public universities is established using business process model and notation (BPMN) in accordance with official regulations. The set of BPMN models constitute a powerful repository not only for business process execution but also for further optimization. Governance generally aims to reduce budgetary wastes, and the authors' recommendation system demonstrates a technical and methodological approach enabling this feature. Implementation of artificial intelligence techniques can bring great value in terms of transparency and fluidity in purchasing business process execution.

Research limitations/implications

Business limitations: First, the proposed system was modeled to handle one type of product, namely computer-related equipment; the authors intend to extend the model to other product types in future work. Second, the system proposes an optimal purchasing order and assumes that decision-makers will rely on it to choose between offers. As a perspective, the authors plan to work on complete automation of the workflow, to also include vendor selection and offer validation. Technical limitations: Natural language processing (NLP) is a widely used sentiment analysis (SA) technique that enabled the authors to validate the proposed system. Even when working on samples of datasets, the authors noticed NLP's dependency on huge computing power. The authors intend to experiment with learning- and knowledge-based SA and to assess their computing power consumption and analysis accuracy compared to NLP. Another technical limitation is related to web scraping; in fact, users' reviews are crucial for the system. To guarantee timely and reliable reviews, the system has to search websites automatically, which confronts the authors with the limitations of web scraping, such as constantly changing website structures and scraping restrictions.

Practical implications

The modeling of business processes in public universities is established using BPMN in accordance with official regulations. The set of BPMN models constitute a powerful repository not only for business process execution but also for further optimization. Governance generally aims to reduce budgetary wastes, and the authors' recommendation system demonstrates a technical and methodological approach enabling this feature.

Originality/value

The adopted techno-functional approach enabled the authors to bring information system governance from a highly abstract level to a practical implementation where the theoretical best practices and guidelines are transformed to a tangible application.

Details

Kybernetes, vol. 53 no. 5
Type: Research Article
ISSN: 0368-492X
