Search results

1 – 10 of 15
Article
Publication date: 5 September 2024

Chinmaya Prasad Padhy, Suryakumar Simhambhatla and Debraj Bhattacharjee


Abstract

Purpose

This study aims to improve the mechanical properties of an object produced by fused deposition modelling with high-grade polymer.

Design/methodology/approach

The study uses an ensemble-based surrogate-assisted evolutionary algorithm (SAEA) to optimize process parameters, such as layer height, print speed, print direction and nozzle temperature, for enhancing the mechanical properties of the temperature-sensitive high-grade polymer poly-ether-ether-ketone (PEEK) in fused deposition modelling (FDM) 3D printing, while treating print time as an additional important parameter. The surrogate models are integrated with an evolutionary algorithm to efficiently explore the parameter space. The optimized parameters from the SAEA approach are compared with those obtained using the Gray Relational Analysis (GRA) Taguchi method, which serves as the benchmark. The study also highlights the significant role of print direction in optimizing the mechanical properties of FDM 3D printed PEEK.
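A minimal sketch of the kind of surrogate-assisted evolutionary loop described above, not the authors' implementation: an ensemble of surrogate models trained on experimental responses scores candidate parameter sets inside a simple evolutionary search. The parameter bounds, the toy training data and the weighted scalarization of the three objectives are assumptions for illustration.

```python
# Illustrative surrogate-assisted evolutionary search (not the authors' code).
# Parameter bounds, objective weights and the toy training data are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
BOUNDS = np.array([[0.1, 0.3],      # layer height (mm)
                   [20.0, 60.0],    # print speed (mm/s)
                   [0.0, 90.0],     # print direction (deg)
                   [380.0, 420.0]]) # nozzle temperature (C)

# Toy "experimental" data: replace with measured ultimate stress / elongation / print time.
X = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(30, 4))
y = rng.normal(size=(30, 3))  # columns: stress, elongation, print time

# Ensemble of surrogates, one pair of models per response column.
surrogates = [[RandomForestRegressor(random_state=0).fit(X, y[:, j]),
               GaussianProcessRegressor().fit(X, y[:, j])] for j in range(3)]

def predict(candidates):
    """Average the ensemble members for each response column."""
    preds = [np.mean([m.predict(candidates) for m in col], axis=0) for col in surrogates]
    return np.column_stack(preds)

def fitness(candidates):
    """Scalarized objective: maximize stress and elongation, minimize print time."""
    p = predict(candidates)
    return p[:, 0] + p[:, 1] - p[:, 2]

# Simple (mu + lambda) evolutionary loop guided only by the surrogates.
pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(40, 4))
for _ in range(50):
    children = pop + rng.normal(scale=0.05 * (BOUNDS[:, 1] - BOUNDS[:, 0]), size=pop.shape)
    children = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])
    merged = np.vstack([pop, children])
    pop = merged[np.argsort(-fitness(merged))[:40]]

print("Best predicted parameter set:", pop[0])
```

In the actual study the best surrogate-predicted candidates would be printed and tested, and the new measurements fed back into the surrogate training set.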

Findings

With the ensemble learning-based SAEA, ultimate stress and percentage elongation can be maximized with minimum print time. The SAEA-based solution has 28.86% higher ultimate stress, 66.95% lower percentage elongation and 7.14% lower print time than the benchmark result (GRA Taguchi method). The experimental investigation also indicates that print direction plays a major role in determining the optimum mechanical properties of FDM 3D printed high-grade thermoplastic PEEK.

Research limitations/implications

This study is valid only within the parameter ranges defined for the experimentation.

Practical implications

This study considers only a few important process parameters, selected from the literature and the available scope of the study; many other parameters, e.g. wall thickness, road width, print orientation, fill pattern, roller speed and retraction, could be included to make the investigation more comprehensive and the results more accurate for practical implementation.

Originality/value

This study deploys a novel meta-model-based optimization approach for enhancing the mechanical properties of high-grade thermoplastic polymers, an approach rarely reported in the published literature in this research domain.

Article
Publication date: 2 August 2024

Faris Elghaish, Sandra Matarneh, M. Reza Hosseini, Algan Tezel, Abdul-Majeed Mahamadu and Firouzeh Taghikhah


Abstract

Purpose

Predictive digital twin technology, which amalgamates digital twins (DT), the Internet of Things (IoT) and artificial intelligence (AI) for data collection, simulation and predictive purposes, has demonstrated its effectiveness across a wide array of industries. Nonetheless, there is a conspicuous lack of comprehensive research in the built environment domain. This study endeavours to fill this void by exploring and analysing the capabilities of individual technologies to better understand and develop successful integration use cases.

Design/methodology/approach

This study uses a mixed literature review approach, which involves bibliometric techniques as well as thematic and critical assessments of 137 relevant academic papers. Three separate lists were created using the Scopus database, covering AI, IoT and DT, since AI and IoT are crucial in creating predictive DT. Clear criteria were applied to create the three lists, including limiting the results to Q1 journals and English publications from 2019 to 2023, in order to include the most recent and highest-quality publications. The collected data for the three technologies were analysed using the bibliometric package in R Studio.
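As a toy illustration of the screening criteria described above (English articles from 2019 to 2023), the snippet below filters a Scopus CSV export before the records are passed to the bibliometric analysis. The file name and column names are assumptions, the Q1-journal step would need an external journal-ranking list, and this is not the authors' script (their analysis was done in R).

```python
# Illustrative screening of a Scopus CSV export (file and column names are assumptions).
# The Q1-journal filter would additionally require an external journal-ranking list.
import pandas as pd

records = pd.read_csv("scopus_export_dt.csv")          # hypothetical export file
mask = (
    records["Year"].between(2019, 2023)
    & records["Language of Original Document"].eq("English")
    & records["Document Type"].eq("Article")
)
screened = records[mask]
screened.to_csv("dt_screened.csv", index=False)
print(f"{len(screened)} of {len(records)} records kept for bibliometric analysis")
```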

Findings

Findings reveal asymmetric attention to the various components of the predictive digital twin system. There is a relatively greater body of research on IoT and DT, representing 43% and 47%, respectively. In contrast, direct research on the use of AI for net-zero solutions constitutes only 10%. The findings also underscore the necessity of integrating these three technologies to develop predictive digital twin solutions for carbon emission prediction.

Practical implications

The results indicate that there is a clear need for more case studies investigating the use of large-scale IoT networks to collect carbon data from buildings and construction sites. Furthermore, the development of advanced and precise AI models is imperative for predicting the production of renewable energy sources and the demand for housing.

Originality/value

This paper makes a significant contribution to the field by providing a strong theoretical foundation. It also serves as a catalyst for future research within this domain. For practitioners and policymakers, this paper offers a reliable point of reference.

Details

Smart and Sustainable Built Environment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2046-6099


Article
Publication date: 20 March 2024

Ziming Zhou, Fengnian Zhao and David Hung


Abstract

Purpose

Higher energy conversion efficiency of an internal combustion engine can be achieved with optimal control of the unsteady in-cylinder flow fields inside a direct-injection (DI) engine. However, predicting the nonlinear and transient in-cylinder flow motion remains a daunting task because it is highly complex and changes in both space and time. Recently, machine learning methods have demonstrated great promise in inferring relatively simple temporal flow field development. This paper aims to present a physics-guided machine learning approach to achieve accurate and generalizable prediction of complex swirl-induced flow field motions.

Design/methodology/approach

To achieve high-fidelity time-series prediction of unsteady engine flow fields, this work features an automated machine learning framework with the following objectives: (1) the spatiotemporal physical constraint of the flow field structure is transferred to the machine learning structure; (2) the ML inputs and targets are designed to ensure high model convergence with limited sets of experiments; and (3) the prediction results are optimized by an ensemble learning mechanism within the automated machine learning framework.
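The sketch below illustrates objectives (1) and (3) in a generic way, assuming a POD-style reduced representation as the spatial structure and a small regressor ensemble; it is not the authors' automated machine learning framework, and the snapshot data are synthetic placeholders.

```python
# Minimal sketch: reduced-order time-series prediction of flow-field snapshots
# with an ensemble of regressors. The POD-style encoding stands in for the
# spatial flow-field structure; it is not the authors' automated ML framework.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor

rng = np.random.default_rng(1)
snapshots = rng.normal(size=(120, 64 * 64))    # toy data: 120 time steps, 64x64 field

# Spatial structure: project snapshots onto the leading POD modes.
mean = snapshots.mean(axis=0)
U, S, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
modes = Vt[:8]                                  # keep 8 modes
coeffs = (snapshots - mean) @ modes.T           # (120, 8) temporal coefficients

# Inputs: coefficients at the current step; targets: coefficients one step ahead.
X_train, y_train = coeffs[:-1], coeffs[1:]

# Small ensemble per mode, averaged at prediction time.
ensembles = [[GradientBoostingRegressor(random_state=0).fit(X_train, y_train[:, j]),
              RandomForestRegressor(random_state=0).fit(X_train, y_train[:, j])]
             for j in range(coeffs.shape[1])]

def step(current):
    """Advance the coefficient vector one time step with the averaged ensemble."""
    return np.array([np.mean([m.predict(current[None])[0] for m in col])
                     for col in ensembles])

# Roll the model forward and reconstruct the predicted field.
c = coeffs[-1]
for _ in range(5):
    c = step(c)
predicted_field = (mean + c @ modes).reshape(64, 64)
print("Predicted field shape:", predicted_field.shape)
```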

Findings

The proposed data-driven framework is proven effective over different time periods and different extents of unsteadiness of the flow dynamics, and the predicted flow fields are highly similar to the target fields under various complex flow patterns. Among the described framework designs, the utilization of the spatial flow field structure is the featured improvement to the time-series flow field prediction process.

Originality/value

The proposed flow field prediction framework could be generalized to different crank angle periods, cycles and swirl ratio conditions, which could greatly promote real-time flow control and reduce experiments on in-cylinder flow field measurement and diagnostics.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 34 no. 8
Type: Research Article
ISSN: 0961-5539


Article
Publication date: 17 September 2024

Rachid Mharzi, Abderrahmane Ben Kacem and Abdelmajid Elouadi


Abstract

Purpose

The purpose of this study is to analyze the operations and performance dynamics of a supply chain (SC) subject to disruptions. The preparedness of Moroccan responders in handling emergencies could be enhanced significantly by devising digital twin-based decision support systems (DSSs).

Design/methodology/approach

The authors create a discrete-event simulation model to proactively investigate the risks and resilience of a Moroccan basic-items SC (BISC). In this study, the authors analyze the effects of catastrophe-related disruptions (CRDs) on the Moroccan BISC using a simulation-based quantitative decision-support method.
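A minimal discrete-event sketch of a single supplier-warehouse-demand path with a disruption window, in the spirit of the simulation described; the SimPy structure, parameter values and disruption timing are assumptions, not the authors' BISC model.

```python
# Illustrative discrete-event sketch (SimPy) of one supplier-to-warehouse path
# with a disruption window; parameters and structure are assumptions, not the
# authors' Moroccan BISC model.
import simpy

DEMAND_PER_DAY = 20
REORDER_POINT, ORDER_QTY, LEAD_TIME = 100, 200, 5
DISRUPTION = (30, 50)        # days during which the supplier-to-warehouse path is down

def replenish(env, inventory):
    """Order up when inventory falls below the reorder point, unless the path is disrupted."""
    while True:
        if inventory.level <= REORDER_POINT and not (DISRUPTION[0] <= env.now < DISRUPTION[1]):
            yield env.timeout(LEAD_TIME)          # transport lead time
            yield inventory.put(ORDER_QTY)
        else:
            yield env.timeout(1)

def demand(env, inventory, stats):
    """Serve daily demand; unmet demand is backlogged."""
    while True:
        yield env.timeout(1)
        if inventory.level >= DEMAND_PER_DAY:
            yield inventory.get(DEMAND_PER_DAY)
            stats["served"] += DEMAND_PER_DAY
        else:
            stats["backlog"] += DEMAND_PER_DAY

env = simpy.Environment()
inventory = simpy.Container(env, init=300, capacity=1000)
stats = {"served": 0, "backlog": 0}
env.process(replenish(env, inventory))
env.process(demand(env, inventory, stats))
env.run(until=120)
print("service level:", stats["served"] / (stats["served"] + stats["backlog"]))
```

Comparing runs with and without the disruption window gives the kind of service-level and backlog deltas reported in the findings.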

Findings

In the disruption-free simulation experiment, the outcome was a satisfactory 100% coverage. When CRDs were introduced, inventory levels dropped, service levels decreased, lead times rose and the numbers of backlogged products and late orders increased. The highest impact was observed for the shutdown of paths linking suppliers to warehouses, whereas an increase in demand had a comparatively minor effect. The risk analysis approach helps identify critical products whose time-to-recover is longer and which require more commitment to enhance their resilience.

Practical implications

The model serves to derive quantitative resilience assessments from simulation, streamline the selection of recovery strategies and enable well-informed reactive decision-making to minimize the impact of disruptions.

Originality/value

The research offers organizational solutions to catastrophe-related emergencies in Morocco. It contributes by visualizing, examining and unveiling the effects of disruptions on a BISC and by offering actionable recommendations for remedial measures.

Details

Journal of Modelling in Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1746-5664


Open Access
Article
Publication date: 28 February 2024

Eyad Buhulaiga and Arnesh Telukdarie



Abstract

Purpose

Multinational businesses deliver value via multiple sites with similar operational capacities. The age of the Fourth Industrial Revolution (4IR) offers significant opportunities for the deployment of digital tools for business optimization. Therefore, this study aims to examine Industry 4.0 implementation for multinationals.

Design/methodology/approach

The key objective of this research is multi-site systems integration using a reproducible, modular and standardized “Cyber Physical System (CPS) as-a-Service”.

Findings

A best-practice reference architecture is adopted to guide the design and delivery of a pioneering CPS multi-site deployment. The deployed CPS is a cloud-based platform that enables all manufacturing areas within a multinational energy and petrochemical company. A methodology is developed to quantify the system's environmental and sustainability benefits, focusing on reduced carbon dioxide (CO2) emissions and energy consumption. The results demonstrate the benefits of standardization, replication and digital enablement for multinational businesses.
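As a toy illustration of the kind of benefit quantification mentioned above, the snippet below converts an assumed per-site energy saving into avoided CO2 using an assumed grid emission factor; none of the numbers are values reported by the study.

```python
# Toy benefit quantification: assumed energy saving x assumed emission factor.
# Both inputs are placeholders, not values reported in the paper.
energy_saved_mwh_per_site = 120.0      # assumed annual saving per site
emission_factor_tco2_per_mwh = 0.45    # assumed grid emission factor
sites = 6                              # assumed number of replicated sites

avoided_tco2 = energy_saved_mwh_per_site * emission_factor_tco2_per_mwh * sites
print(f"Avoided emissions: {avoided_tco2:.1f} tCO2/year across {sites} sites")
```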

Originality/value

The research illustrates the ability to design a single system, reproducible for multiple sites. This research also illustrates the beneficial impact of system reuse due to reduced environmental impact from lower CO2 emissions and energy consumption. The paper assists organizations in deploying complex systems while addressing multinational systems implementation constraints and standardization.

Details

Digital Transformation and Society, vol. 3 no. 3
Type: Research Article
ISSN: 2755-0761


Article
Publication date: 22 August 2024

Iman Bashtani and Javad Abolfazli Esfahani


Abstract

Purpose

This study aims to introduce a novel machine learning feature vector (MLFV) method that uses machine learning to overcome time-consuming computational fluid dynamics (CFD) simulations by rapidly predicting turbulent flow characteristics with acceptable accuracy.

Design/methodology/approach

In this method, CFD snapshots are encoded in a tensor as the input training data. The MLFV then learns the relationships in the data with a rod filter, named the feature vector, by defining functions on it. To demonstrate the accuracy of the MLFV, the method is used to predict the velocity, temperature and turbulent kinetic energy fields of turbulent flow passing over an innovative nature-inspired Dolphin turbulator based on only ten CFD data sets.

Findings

The results indicate good agreement between MLFV predictions and CFD solutions, with contours and scatter plots showing R2 ≃ 1. The error percentage contours and histograms also reveal the high precision of the predictions, with MAPE = 7.90E-02, 1.45E-02 and 7.32E-02 and NRMSE = 1.30E-04, 1.61E-03 and 4.54E-05 for the predicted velocity, temperature and turbulent kinetic energy fields at Re = 20,000, respectively.
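The reported MAPE and NRMSE follow standard definitions; a minimal sketch of computing them from predicted and CFD-solved field arrays is given below. The NRMSE normalization (range of the reference data) is an assumption, since several conventions exist, and the arrays here are synthetic stand-ins.

```python
# Standard error metrics computed from flattened field arrays (synthetic data).
# The NRMSE normalization by the reference range is an assumption; conventions vary.
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, expressed as a fraction."""
    return np.mean(np.abs((y_true - y_pred) / y_true))

def nrmse(y_true, y_pred):
    """Root-mean-square error normalized by the range of the reference data."""
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (y_true.max() - y_true.min())

y_cfd = np.linspace(1.0, 2.0, 100)          # stand-in for a CFD-solved field
y_ml = y_cfd + np.random.default_rng(0).normal(scale=0.01, size=100)
print(mape(y_cfd, y_ml), nrmse(y_cfd, y_ml))
```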

Practical implications

The method can have state-of-the-art applications in a wide range of CFD simulations, with the ability to train on small data sets, which is practical given the number of required tests.

Originality/value

The paper introduces a novel and fast method, MLFV, to address the time-consuming nature of traditional CFD and to predict the physics of turbulent heat and fluid flow in real time, with the advantage of training on small data sets with acceptable accuracy.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0961-5539


Article
Publication date: 15 July 2024

Xiaolong Lyu, Dan Huang, Liwei Wu and Ding Chen


Abstract

Purpose

Parameter estimation in complex engineering structures typically necessitates repeated calculations using simulation models, leading to significant computational costs. This paper aims to introduce an adaptive multi-output Gaussian process (MOGP) surrogate model for parameter estimation in time-consuming models.

Design/methodology/approach

The MOGP surrogate model is established to replace the computationally expensive finite element method (FEM) analysis during the estimation process. We propose a novel adaptive sampling method for MOGP inspired by the traditional expected improvement (EI) method, aiming to reduce the number of required sample points for building the surrogate model. Two mathematical examples and an application in the back analysis of a concrete arch dam are tested to demonstrate the effectiveness of the proposed method.
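A simplified sketch of the adaptive loop described above, not the authors' modified EI method: a multi-output Gaussian process emulates an expensive model, and an EI-style criterion on the predicted data misfit selects the next sample. The stand-in "expensive model", the synthetic measurements and the pooled-uncertainty shortcut are assumptions.

```python
# Simplified adaptive surrogate building for parameter estimation: a multi-output
# GP emulates an expensive model, and an EI-style criterion on the data misfit
# picks the next sample point. Toy functions throughout; not the authors' method.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def expensive_model(theta):
    """Stand-in for the FEM analysis: two outputs per parameter vector."""
    return np.column_stack([np.sin(theta[:, 0]) + theta[:, 1],
                            np.cos(theta[:, 1]) * theta[:, 0]])

observed = expensive_model(np.array([[1.2, 0.4]]))[0]    # synthetic "measurements"

def misfit(outputs):
    return np.sum((outputs - observed) ** 2, axis=1)

X = rng.uniform(0, 2, size=(8, 2))       # initial design
Y = expensive_model(X)

for _ in range(15):                       # adaptive sampling loop
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, Y)
    cand = rng.uniform(0, 2, size=(500, 2))
    mu, std = gp.predict(cand, return_std=True)
    std = std.reshape(len(cand), -1)      # handle single- or multi-target std shapes
    f = misfit(mu)                        # predicted misfit at candidate points
    best = misfit(Y).min()                # best observed misfit so far
    s = std.mean(axis=1) + 1e-9           # crude pooled uncertainty (assumption)
    z = (best - f) / s
    ei = (best - f) * norm.cdf(z) + s * norm.pdf(z)   # expected improvement (minimization)
    x_new = cand[np.argmax(ei)][None]
    X = np.vstack([X, x_new])
    Y = np.vstack([Y, expensive_model(x_new)])

print("Estimated parameters:", X[np.argmin(misfit(Y))])
```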

Findings

The numerical results show that the proposed method requires a relatively small number of sample points to achieve accurate estimates. The proposed adaptive sampling method combined with the MOGP surrogate model shows an obvious advantage in parameter estimation problems involving expensive-to-evaluate models, particularly those with high-dimensional output.

Originality/value

A novel adaptive sampling method for establishing the MOGP surrogate model is proposed to accelerate the procedure of solving large-scale parameter estimation problems. This modified adaptive sampling method, based on the traditional EI method, is better suited for multi-output problems, making it highly valuable for numerous practical engineering applications.

Details

Engineering Computations, vol. 41 no. 6
Type: Research Article
ISSN: 0264-4401


Open Access
Article
Publication date: 4 July 2024

Bart Lameijer, Elizabeth S.L. de Vries, Jiju Antony, Jose Arturo Garza-Reyes and Michael Sony


Abstract

Purpose

Many organizations are currently transitioning towards digitalized process design, execution, control, assurance and improvement, and the purpose of this research is to empirically demonstrate how data-based operational excellence techniques are useful in digitalized environments, by means of optimizing a robotic process automation (RPA) deployment.

Design/methodology/approach

An interpretive mixed-method case study approach is applied, comprising secondary Lean Six Sigma (LSS) project data together with participant-as-observer archival observations. A case report is presented that comprises, per DMAIC phase, (1) the objectives, (2) the main deliverables, (3) the results and (4) the key actions that led to the presented results.

Findings

Key findings comprise (1) the importance of understanding how to acquire and prepare large system-generated data and (2) the need for better validation mechanisms for large system-generated databases. Also emphasized are (3) the importance of the LSS project lead's contextual understanding of the process and (4) the need for developments in the foundational LSS curriculum in order to be effective in digitalized environments.

Originality/value

This study provides a rich prescriptive demonstration of LSS methodology implementation for RPA deployment improvement and is one of the few empirical demonstrations of LSS-based problem-solving methodology in Industry 4.0 contexts.

Details

Business Process Management Journal, vol. 30 no. 8
Type: Research Article
ISSN: 1463-7154


Article
Publication date: 10 May 2023

Upama Dey, Aparna Duggirala and Souren Mitra


Abstract

Purpose

Aluminium alloys can be used as lightweight and high-strength materials in combination with laser beam welding, an efficient joining method, in the manufacturing of automotive parts. The purposes of this paper are to conduct laser welding experiments on Al2024 in the lap joint configuration, model the laser welding process parameters of Al2024 alloys and use the proposed models to optimize the process parameters.

Design/methodology/approach

Laser welding of Al2024 alloy has been conducted in the lap joint configuration. Then, the influences of explanatory variables (laser peak power, scanning speed and frequency) on outcome variables (weld width [WW], throat length [TL] and breaking load [BL]) have been investigated with Poisson regression analysis of the data set derived from experimentation. Thereafter, a multi-objective genetic algorithm (MOGA), implemented in MATLAB, has been used to find the optimum solutions. The effects of various input process parameters on the responses have also been analysed using response surface plots.
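An illustrative Python analogue of the modelling-then-optimization chain described above (the study itself used Poisson regression with a MATLAB MOGA): Poisson GLMs are fitted per response, then a non-dominated set is extracted over candidate parameter settings. The toy data, parameter ranges and the random-candidate search in place of a genetic algorithm are assumptions.

```python
# Illustrative Poisson-regression + Pareto-extraction sketch (toy data throughout).
# The experimental design and response values are placeholders, not the paper's data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
# Columns: laser peak power (W), scanning speed (mm/s), frequency (Hz) -- assumed ranges.
X = rng.uniform([1500, 10, 50], [2500, 40, 200], size=(27, 3))
X_design = sm.add_constant(X)
# Toy responses standing in for weld width (um), throat length (um), breaking load (N).
Y = np.column_stack([rng.poisson(700, 27), rng.poisson(750, 27), rng.poisson(1200, 27)])

# One Poisson GLM per response.
models = [sm.GLM(Y[:, j], X_design, family=sm.families.Poisson()).fit() for j in range(3)]

# Objectives: minimize predicted WW and TL, maximize predicted BL.
cand = rng.uniform([1500, 10, 50], [2500, 40, 200], size=(2000, 3))
pred = np.column_stack([m.predict(sm.add_constant(cand)) for m in models])
objs = np.column_stack([pred[:, 0], pred[:, 1], -pred[:, 2]])   # all minimized

def pareto_mask(F):
    """Keep points not dominated by any other point (all objectives minimized)."""
    keep = np.ones(len(F), dtype=bool)
    for i, f in enumerate(F):
        if keep[i]:
            dominated = np.all(F >= f, axis=1) & np.any(F > f, axis=1)
            keep[dominated] = False
    return keep

front = cand[pareto_mask(objs)]
print(f"{len(front)} non-dominated parameter sets found")
```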

Findings

The proposed statistical models, derived with Poisson regression analysis, are shown to fit well using the analysis-of-deviance approach. Pareto fronts are used to present the optimization results: the maximized load-bearing capacity is computed to be 1,263 N, with corresponding compromise values of 714 µm for WW and 760 µm for TL.

Originality/value

This work, which conducts laser welding of Al2024 alloy lap joints using the Taguchi method and optimizes the input process parameters with the proposed statistical models, offers a new perspective that can be useful to the manufacturing industry.

Details

World Journal of Engineering, vol. 21 no. 4
Type: Research Article
ISSN: 1708-5284


Article
Publication date: 21 August 2024

Sarah Ayad and Fatimah Alsayoud


Abstract

Purpose

The term knowledge refers to the part of the world investigated by a specific discipline, including a specific taxonomy, vocabulary, concepts, theories, research methods and standards of justification. Our approach improves the quality of business process models (BPMs) by exploiting the domain knowledge provided by large language models (LLMs). Among these models, ChatGPT stands out as a notable example of an LLM capable of providing in-depth domain knowledge. A lack of coverage limits existing approaches, as it hinders their ability to fully capture and represent the domain's knowledge. To overcome this limitation, we aim to exploit the knowledge of GPT-3.5. Our approach does not ask GPT-3.5 to create a visual representation; instead, it asks it to suggest missing concepts, thus helping the modeler improve his/her model. GPT-3.5 may need to refine its suggestions based on feedback from the modeler.

Design/methodology/approach

We initiate our semantic quality enhancement process of a BPM by first extracting crucial elements, including pools, lanes, activities and artifacts, along with their corresponding relationships, such as lanes being associated with pools, activities belonging to each lane and artifacts associated with each activity. These data are systematically gathered and structured into ArrayLists, a form of organized collection that allows for efficient data manipulation and retrieval. Once we have this structured data, our methodology involves creating a series of prompts based on each data element. We adopt three approaches to prompting: zero-shot, few-shot and chain-of-thought (CoT) prompts. Each type of prompting is specifically designed to interact with the OpenAI language model in a unique way, aiming to elicit a diverse array of suggestions. As we apply these prompting techniques, the OpenAI model processes each prompt and returns a list of suggestions tailored to that specific element of the BPM. Our approach operates independently of any specific notation and offers semi-automation, allowing modelers to select from a range of suggested options.
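A minimal sketch of the prompting step described above, assuming the current OpenAI Python client: zero-shot, few-shot and CoT prompts are built per extracted lane and sent to the model to collect suggestions. The element data, prompt wording and model choice are illustrative, not the authors' instructions (those are in their appendices and repository).

```python
# Minimal prompting sketch: build zero-shot, few-shot and CoT prompts per extracted
# BPM element and collect suggestions. Element data and prompt wording are illustrative.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

lanes = {"Admissions": ["Register patient", "Verify insurance"],
         "Ward": ["Assign bed"]}            # toy extract of a "Hospitalization" BPM

def build_prompts(lane, activities):
    zero_shot = (f"In a hospitalization process, the lane '{lane}' contains the activities "
                 f"{activities}. Suggest missing activities for this lane as a short list.")
    few_shot = ("Example - lane 'Pharmacy' with ['Receive prescription'] is missing "
                "['Prepare medication', 'Dispense medication'].\n" + zero_shot)
    cot = zero_shot + " Reason step by step about the typical workflow before answering."
    return {"zero_shot": zero_shot, "few_shot": few_shot, "cot": cot}

suggestions = {}
for lane, activities in lanes.items():
    for style, prompt in build_prompts(lane, activities).items():
        reply = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
        )
        suggestions[(lane, style)] = reply.choices[0].message.content

for key, text in suggestions.items():
    print(key, "->", text[:80])
```

The modeler would then review the returned suggestions and accept, reject or request refinements, as described in the approach.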

Findings

This study demonstrates the significant potential of prompt engineering techniques in enhancing the semantic quality of BPMs when integrated with LLMs like ChatGPT. Our analysis of model activity richness and model artifact richness across different prompt techniques and input configurations reveals that carefully tailored prompts can lead to more complete BPMs. This research is a step forward for further exploration into the optimization of LLMs in BPM development.

Research limitations/implications

The limitation is the domain ontology that we are relying on to evaluate the semantic completeness of the new BPM. In our future work, the modeler will have the option to ask for synonyms, hyponyms, hypernyms or keywords. This feature will facilitate the replacement of existing concepts to improve not only the completeness of the BPM but also the clarity and specificity of concepts in BPMs.

Practical implications

To demonstrate our methodology, we take the “Hospitalization” process as an illustrative example. In the scope of our research, we have presented a select set of instructions pertinent to the “chain of thought” and “few-shot prompting.” Due to constraints in presentation and the extensive nature of the instructions, we have not included every detail within the body of this paper. However, they can be found in the previous GitHub link. Two appendices are given at the end. Appendix 1 describes the different prompt instructions. Appendix 2 presents the application of the instructions in our example.

Originality/value

In our research, we rely on the domain application knowledge provided by ChatGPT-3 to enhance the semantic quality of BPMs. Typically, the semantic quality of BPMs may suffer due to the modeler's lack of domain knowledge. To address this issue, our approach employs three prompt engineering methods designed to extract accurate domain knowledge. By utilizing these methods, we can identify and propose missing concepts, such as activities and artifacts. This not only ensures a more comprehensive representation of the business process but also contributes to the overall improvement of the model's semantic quality, leading to more effective and accurate business process management.

Details

Business Process Management Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1463-7154


1 – 10 of 15