Search results

1 – 10 of 822
Article
Publication date: 27 December 2022

Bright Awuku, Eric Asa, Edmund Baffoe-Twum and Adikie Essegbey

Abstract

Purpose

Challenges associated with ensuring the accuracy and reliability of cost estimation of highway construction bid items are of significant interest to state highway transportation agencies. Even with the existing research undertaken on the subject, the problem of inaccurate estimation of highway bid items still exists. This paper aims to assess the accuracy of the cost estimation methods employed in the selected studies to provide insights into how well they perform empirically. Additionally, this research seeks to identify, synthesize and assess the impact of the factors affecting highway unit prices because they affect the total cost of highway construction costs.

Design/methodology/approach

This paper systematically searched, selected and reviewed 105 papers from Scopus, Google Scholar, American Society of Civil Engineers (ASCE), Transportation Research Board (TRB) and Science Direct (SD) on conceptual cost estimation of highway bid items. This study used content and nonparametric statistical analyses to determine research trends, identify, categorize the factors influencing highway unit prices and assess the combined performance of conceptual cost prediction models.

Findings

Findings from the trend analysis showed that between 1983 and 2019, North America, Asia, Europe and the Middle East contributed the most to improving highway cost estimation research. Aggregating the quantitative results and weighting the findings using each study's sample size revealed that the average error between the actual and the estimated project costs of Monte-Carlo simulation models (5.49%) was lower than that of the Bayesian model (5.95%), support vector machines (6.03%), case-based reasoning (11.69%), artificial neural networks (12.62%) and regression models (13.96%). This paper identified 41 factors, which were grouped into three categories based on the common classification used in the selected papers, namely: (1) factors relating to project characteristics; (2) organizational factors and (3) estimate factors. The mean ranking analysis showed that most of the selected papers used project-specific factors more than the other factors when estimating highway construction bid items.
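The sample-size weighting described above can be sketched in a few lines; the error percentages and sample sizes below are invented for illustration and are not the paper's underlying data.

```python
def weighted_mean_error(studies):
    """Aggregate reported errors across studies, weighting each
    study's error by its sample size (as in a meta-analysis)."""
    total_n = sum(n for _, n in studies)
    return sum(err * n for err, n in studies) / total_n

# Hypothetical studies reporting (error %, sample size) for one model family
monte_carlo_studies = [(4.8, 120), (6.1, 80), (5.7, 50)]
print(round(weighted_mean_error(monte_carlo_studies), 2))
```

Larger studies pull the aggregate toward their reported error, which is the intent of weighting by sample size rather than taking a plain mean.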

Originality/value

This paper contributes to the body of knowledge by analyzing and comparing the performance of highway cost estimation models, identifying and categorizing a comprehensive list of cost drivers to stimulate future studies in improving highway construction cost estimates.

Details

Engineering, Construction and Architectural Management, vol. 31 no. 3
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 27 July 2023

Mas Irfan P. Hidayat, Azzah D. Pramata and Prima P. Airlangga

Abstract

Purpose

This study presents finite element (FE) and generalized regression neural network (GRNN) approaches for modeling multiple crack growth problems and predicting crack-growth directions under the influence of multiple crack parameters.

Design/methodology/approach

To determine the crack-growth direction in aluminum specimens, multiple crack parameters representing some degree of crack propagation complexity, including crack length, inclination angle, offset and distance, were examined. FE method models were developed for multiple crack growth simulations. To capture the complex relationships among multiple crack-growth variables, GRNN models were developed as nonlinear regression models. Six input variables and one output variable comprising 65 training and 20 test datasets were established.
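A GRNN is essentially a Gaussian-kernel regression over the training set, which makes it easy to sketch; the crack parameters, target values and smoothing constant below are hypothetical, not the study's data.

```python
import math

def grnn_predict(x, X_train, y_train, sigma=0.5):
    """Generalized regression neural network prediction: a
    Gaussian-kernel weighted average of the training targets."""
    weights = [
        math.exp(-sum((a - b) ** 2 for a, b in zip(x, xi)) / (2 * sigma ** 2))
        for xi in X_train
    ]
    return sum(w * y for w, y in zip(weights, y_train)) / sum(weights)

# Hypothetical samples: (crack length, inclination angle) -> growth direction (deg)
X = [(1.0, 0.0), (2.0, 30.0), (3.0, 45.0)]
y = [5.0, 20.0, 40.0]
print(round(grnn_predict((2.0, 30.0), X, y, sigma=0.5), 1))
```

The smoothing parameter sigma controls how far the kernel "sees": small values reproduce training points, larger values average across neighbors.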

Findings

The FE model could conveniently simulate the crack-growth directions. However, several multiple crack parameters could affect the simulation accuracy. The GRNN offers a reliable method for modeling the growth of multiple cracks. Using 76% of the total dataset, the NN model attained an R2 value of 0.985.

Research limitations/implications

The models are presented for static multiple crack growth problems; material anisotropy is not considered.

Practical implications

In practical crack-growth analyses, the NN approach provides significant benefits and savings.

Originality/value

The proposed GRNN model is simple to develop and accurate. Its performance was superior to that of other NN models. This model is also suitable for modeling multiple crack growths with arbitrary geometries. The proposed GRNN model demonstrates its prediction capability with a simpler learning process, thus producing efficient multiple crack growth predictions and assessments.

Details

Multidiscipline Modeling in Materials and Structures, vol. 19 no. 5
Type: Research Article
ISSN: 1573-6105

Open Access
Article
Publication date: 23 January 2023

Hussein Y.H. Alnajjar and Osman Üçüncü

Abstract

Purpose

Artificial intelligence (AI) models are demonstrating day by day that they can find long-term solutions to improve wastewater treatment efficiency. Artificial neural networks (ANNs) are one of the most important of these models, and they are increasingly being used to forecast water resource variables. The goal of this study was to create an ANN model to estimate the removal efficiency of biological oxygen demand (BOD), total nitrogen (TN), total phosphorus (TP) and total suspended solids (TSS) at the effluent of various primary and secondary treatment methods in a wastewater treatment plant (WWTP).

Design/methodology/approach

The MATLAB App Designer model was used to generate the data set. Various combinations of wastewater quality data, such as temperature (T), TN, TP and hydraulic retention time (HRT), are used as inputs into the ANN to assess the degree of effect of each of these variables on BOD, TN, TP and TSS removal efficiency. Two of the models reflect two different types of primary treatment, while the other nine models represent different types of subsequent treatment. The ANN model's findings are compared to those of the MATLAB App Designer model. For evaluating model performance, mean square error (MSE) and coefficient of determination statistics (R2) are utilized as comparative metrics.

Findings

For both training and testing, the R values for the ANN models were greater than 0.99. Based on the comparisons, it was discovered that the ANN model can be used to estimate the removal efficiency of BOD, TN, TP and TSS in a WWTP and that it produces very similar and satisfying results to the App Designer model. An R-value (correlation coefficient) of 0.9909 and an MSE of 5.962 indicate that the model is accurate. Because of the many benefits of the ANN models used in this study, they hold considerable potential as a general modeling tool for a range of other complicated process systems that are difficult to solve using conventional modeling techniques.
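The two comparative metrics quoted above, MSE and the correlation coefficient R, can be computed directly; the removal-efficiency values below are illustrative only, not the study's measurements.

```python
import math

def mse(actual, predicted):
    """Mean square error between observed and predicted values."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def pearson_r(actual, predicted):
    """Pearson correlation coefficient R between two series."""
    n = len(actual)
    ma, mp = sum(actual) / n, sum(predicted) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, predicted))
    return cov / math.sqrt(
        sum((a - ma) ** 2 for a in actual) * sum((p - mp) ** 2 for p in predicted)
    )

# Hypothetical removal efficiencies (%): observed vs ANN-predicted
actual = [85.0, 90.0, 78.0, 92.0]
predicted = [84.0, 91.0, 79.0, 90.0]
print(mse(actual, predicted), round(pearson_r(actual, predicted), 4))
```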

Originality/value

The objective of this study was to develop an ANN model that could be used to estimate the removal efficiency of pollutants such as BOD, TN, TP and TSS at the effluent of various primary and secondary treatment methods in a WWTP. In the future, the ANN could be used to design a new WWTP and forecast the removal efficiency of pollutants.

Details

Arab Gulf Journal of Scientific Research, vol. 41 no. 4
Type: Research Article
ISSN: 1985-9899

Article
Publication date: 9 October 2023

Manish Bansal

Abstract

Purpose

This paper undertakes an extensive and systematic review of the literature on earnings management (EM) over the past three decades (1992–2022). Furthermore, the study identifies emerging research themes and proposes future avenues for further investigation in the realm of EM.

Design/methodology/approach

For this study, a comprehensive collection of 2,775 articles on EM published between 1992 and 2022 was extracted from the Scopus database. The author employed various tools, including Microsoft Excel, R studio, Gephi and visualization of similarities viewer, to conduct bibliometric, content, thematic and cluster analyses. Additionally, the study examined the literature across three distinct periods: prior to the enactment of the Sarbanes-Oxley Act (1992–2001), subsequent to the implementation of the Sarbanes-Oxley Act (2002–2012), and after the adoption of International Financial Reporting Standards (2013–2022) to draw more inferences and insights on EM research.

Findings

The study identifies three major themes, namely the operationalization of EM constructs, the trade-off between EM tools (accrual EM, real EM and classification shifting) and the role of corporate governance in mitigating EM in emerging markets. Existing literature in these areas presents mixed and inconclusive findings, suggesting the need for further theoretical development. Further, the findings reveal a shift in research focus over time: initially, understanding manipulation techniques; then, evaluating regulatory measures; and more recently, investigating the impact of global accounting standards. Several emerging research themes (technology advancements, cross-cultural and cross-national studies, sustainability, behavioral aspects and non-financial indicators of EM) have been identified. The study's subsequent analysis reveals an evolving EM landscape, with researchers from disciplines like data science, computer science and engineering applying their analytical expertise to detect EM anomalies. Furthermore, this study offers significant insights into sophisticated EM detection techniques such as neural networks, machine learning techniques and hidden Markov models, among others, as well as relevant theories including dynamic capabilities theory, learning curve theory, psychological contract theory and normative institutional theory. These techniques and theories demonstrate the need for further advancement in the field of EM. Lastly, the findings shed light on prominent EM journals, authors and countries.

Originality/value

This study conducts quantitative bibliometric and thematic analyses of the existing literature on EM while identifying areas that require further development to advance EM research.

Details

Journal of Accounting Literature, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0737-4607

Article
Publication date: 27 June 2023

Nirodha Fernando, Kasun Dilshan T.A. and Hexin (Johnson) Zhang

Abstract

Purpose

The Government’s investment in infrastructure projects is considerably high, especially in bridge construction projects. Government authorities must establish an initial forecasted budget to have transparency in transactions. Early cost estimating is challenging for Quantity Surveyors due to incomplete project details at the initial stage and the unavailability of standard cost estimating techniques for bridge projects. To mitigate the difficulties of traditional preliminary cost estimating methods, there is a need to develop a new initial cost estimating model that is accurate, user-friendly and straightforward. The research was carried out in Sri Lanka, and this paper aims to develop an artificial neural network (ANN) model for the early cost estimation of concrete bridge systems.

Design/methodology/approach

Construction cost data from 30 concrete bridge projects built in Sri Lanka within the past ten years were used to train and test an ANN cost model. The backpropagation technique was used to identify the number of hidden layers, iterations and the momentum for optimum neural network architectures.

Findings

An ANN cost model was developed that furnished the best result, achieving around 90% validation accuracy. It provides the public sector with an accurate, heuristic, flexible and efficient cost estimation technique.
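One common convention for reporting a cost model's validation accuracy is 100% minus the mean absolute percentage error; the convention and the bridge costs below are assumptions for illustration, not the paper's data or stated definition.

```python
def validation_accuracy(actual_costs, predicted_costs):
    """Estimation accuracy expressed as 100% minus the mean
    absolute percentage error (one common convention)."""
    ape = [abs(a - p) / a * 100 for a, p in zip(actual_costs, predicted_costs)]
    return 100 - sum(ape) / len(ape)

# Hypothetical bridge project costs (e.g. LKR millions): actual vs predicted
actual = [120.0, 250.0, 90.0]
predicted = [110.0, 260.0, 95.0]
print(round(validation_accuracy(actual, predicted), 1))
```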

Originality/value

The research contributes to the current body of knowledge by providing the most accurate early-stage cost estimate for the concrete bridge systems in Sri Lanka. In addition, the research findings would be helpful for stakeholders and policymakers to propose policy recommendations that positively influence the prediction of the most accurate cost estimate for concrete bridge construction projects in Sri Lanka and other developing countries.

Details

Journal of Financial Management of Property and Construction, vol. 29 no. 1
Type: Research Article
ISSN: 1366-4387

Article
Publication date: 13 February 2024

Aleena Swetapadma, Tishya Manna and Maryam Samami

Abstract

Purpose

A novel method has been proposed to reduce the false alarm rate for arrhythmia patients regarding life-threatening conditions in the intensive care unit. For this purpose, the arterial blood pressure, photoplethysmogram (PLETH), electrocardiogram (ECG) and respiratory (RESP) signals are considered as input signals.

Design/methodology/approach

Three machine learning approaches, namely a feed-forward artificial neural network (ANN), an ensemble learning method and a k-nearest neighbors search, are used to detect false alarms. The proposed method has been implemented using Arduino and MATLAB/Simulink for real-time monitoring data of ICU arrhythmia patients.
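Of the three approaches, the k-nearest neighbors search is the simplest to sketch: an alarm is classified by majority vote among the closest training samples. The two-feature examples and labels below are hypothetical, not the study's signals.

```python
from collections import Counter

def knn_classify(x, X_train, labels, k=3):
    """k-nearest neighbors classification: majority vote over the
    k Euclidean-nearest training samples."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, xi)), lab)
        for xi, lab in zip(X_train, labels)
    )
    top = [lab for _, lab in dists[:k]]
    return Counter(top).most_common(1)[0][0]

# Hypothetical features (heart rate, SpO2) labeled as true/false alarms
X = [(150, 85), (40, 99), (160, 80), (45, 98), (155, 82)]
labels = ["true", "false", "true", "false", "true"]
print(knn_classify((152, 84), X, labels, k=3))
```

In practice the inputs would be features derived from the ABP, PLETH, ECG and RESP signals rather than two raw vitals.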

Findings

The proposed method detects false alarms with an accuracy of 99.4 per cent during asystole, 100 per cent during ventricular flutter, 98.5 per cent during ventricular tachycardia, 99.6 per cent during bradycardia and 100 per cent during tachycardia. The proposed framework is adaptive to many scenarios, easy to implement, computationally friendly, and highly accurate and robust with no overfitting issues.

Originality/value

As ECG signals consist of the PQRST waveform, any deviation from the normal pattern may signify an alarming condition. These deviations can be utilized as input to classifiers for the detection of false alarms; hence, there is no need for other feature extraction techniques. The feed-forward ANN with the Levenberg–Marquardt algorithm has shown a higher rate of convergence than other neural network algorithms, which helps provide better accuracy with no overfitting.

Details

Data Technologies and Applications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2514-9288

Article
Publication date: 1 January 2024

Shrutika Sharma, Vishal Gupta, Deepa Mudgal and Vishal Srivastava

Abstract

Purpose

Three-dimensional (3D) printing is highly dependent on printing process parameters for achieving high mechanical strength. It is a time-consuming and expensive operation to experiment with different printing settings. The current study aims to propose a regression-based machine learning model to predict the mechanical behavior of ulna bone plates.

Design/methodology/approach

The bone plates were formed using the fused deposition modeling (FDM) technique, with printing attributes being varied. Machine learning models such as linear regression, AdaBoost regression, gradient boosting regression (GBR), random forest, decision trees and k-nearest neighbors were trained to predict tensile strength and flexural strength. Model performance was assessed using root mean square error (RMSE), coefficient of determination (R2) and mean absolute error (MAE).
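The three performance metrics named above can be computed directly from predictions; the strength values below are invented for illustration, not the study's measurements.

```python
import math

def regression_metrics(actual, predicted):
    """Return (RMSE, R2, MAE), the three metrics used to compare models."""
    n = len(actual)
    mean_a = sum(actual) / n
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    rmse = math.sqrt(ss_res / n)
    r2 = 1 - ss_res / ss_tot
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / n
    return rmse, r2, mae

# Hypothetical tensile strengths (MPa): measured vs model-predicted
actual = [30.0, 35.0, 40.0, 45.0]
predicted = [31.0, 34.0, 41.0, 44.0]
rmse, r2, mae = regression_metrics(actual, predicted)
print(rmse, round(r2, 3), mae)
```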

Findings

Traditional experimentation with various settings is both time-consuming and expensive, emphasizing the need for alternative approaches. Among the models tested, the GBR model demonstrated the best performance in predicting both tensile and flexural strength, achieving the lowest RMSE, highest R2 and lowest MAE: 1.4778 ± 0.4336 MPa, 0.9213 ± 0.0589 and 1.2555 ± 0.3799 MPa, respectively, for tensile strength, and 3.0337 ± 0.3725 MPa, 0.9269 ± 0.0293 and 2.3815 ± 0.2915 MPa, respectively, for flexural strength. The findings open up opportunities for doctors and surgeons to use GBR as a reliable tool for fabricating patient-specific bone plates without the need for extensive trial experiments.

Research limitations/implications

The current study is limited to the usage of a few models. Other machine learning-based models can be used for prediction-based study.

Originality/value

This study uses machine learning to predict the mechanical properties of FDM-based distal ulna bone plate, replacing traditional design of experiments methods with machine learning to streamline the production of orthopedic implants. It helps medical professionals, such as physicians and surgeons, make informed decisions when fabricating customized bone plates for their patients while reducing the need for time-consuming experimentation, thereby addressing a common limitation of 3D printing medical implants.

Details

Rapid Prototyping Journal, vol. 30 no. 3
Type: Research Article
ISSN: 1355-2546

Article
Publication date: 26 December 2023

Farshad Peiman, Mohammad Khalilzadeh, Nasser Shahsavari-Pour and Mehdi Ravanshadnia

Abstract

Purpose

Earned value management (EVM)–based models for estimating project actual duration (AD) and cost at completion using various methods are continuously developed to improve the accuracy and actualization of predicted values. This study primarily aimed to examine natural gradient boosting (NGBoost-2020) with the classification and regression trees (CART) base model (base learner). To the best of the authors' knowledge, this concept has never been applied to EVM AD forecasting problem. Consequently, the authors compared this method to the single K-nearest neighbor (KNN) method, the ensemble method of extreme gradient boosting (XGBoost-2016) with the CART base model and the optimal equation of EVM, the earned schedule (ES) equation with the performance factor equal to 1 (ES1). The paper also sought to determine the extent to which the World Bank's two legal factors affect countries and how the two legal causes of delay (related to institutional flaws) influence AD prediction models.

Design/methodology/approach

In this paper, data from 30 construction projects of various building types in Iran, Pakistan, India, Turkey, Malaysia and Nigeria (due to the high number of delayed projects and the detrimental effects of these delays in these countries) were used to develop three models. The target variable of the models was a dimensionless output, the ratio of estimated duration to completion (ETC(t)) to planned duration (PD). Furthermore, 426 tracking periods were used to build the three models, with 353 samples and 23 projects in the training set, 73 patterns (17% of the total) and six projects (21% of the total) in the testing set. Furthermore, 17 dimensionless input variables were used, including ten variables based on the main variables and performance indices of EVM and several other variables detailed in the study. The three models were subsequently created using Python and several GitHub-hosted codes.
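The ES1 benchmark referenced above follows the standard earned schedule method, which forecasts total duration as AT + (PD - ES) when the performance factor is 1. A sketch with an invented planned-value curve (not the paper's project data):

```python
def earned_schedule(ev, pv_curve):
    """Earned schedule ES: the time at which cumulative planned value
    equals the current earned value, found by linear interpolation."""
    for t in range(1, len(pv_curve)):
        if pv_curve[t] >= ev:
            return (t - 1) + (ev - pv_curve[t - 1]) / (pv_curve[t] - pv_curve[t - 1])
    return float(len(pv_curve) - 1)

def es1_duration_forecast(at, pd_, es):
    """ES1 duration forecast: AT + (PD - ES) / PF with PF = 1."""
    return at + (pd_ - es)

# Hypothetical cumulative planned value per month; PD = 5 months
pv = [0.0, 10.0, 25.0, 45.0, 70.0, 100.0]
ev, at = 35.0, 3.0          # earned value observed at actual time AT = 3
es = earned_schedule(ev, pv)
print(es, es1_duration_forecast(at, 5.0, es))
```

Dividing the forecast by PD gives the dimensionless ETC(t)/PD-style target the models predict.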

Findings

For the testing set of the optimal model (NGBoost), the better percentage mean (better%) of the prediction error (based on projects with a lower error percentage) of the NGBoost compared to two KNN and ES1 single models, as well as the total mean absolute percentage error (MAPE) and mean lags (MeLa) (indicating model stability) were 100, 83.33, 5.62 and 3.17%, respectively. Notably, the total MAPE and MeLa for the NGBoost model testing set, which had ten EVM-based input variables, were 6.74 and 5.20%, respectively. The ensemble artificial intelligence (AI) models exhibited a much lower MAPE than ES1. Additionally, ES1 was less stable in prediction than NGBoost. The possibility of excessive and unusual MAPE and MeLa values occurred only in the two single models. However, on some data sets, ES1 outperformed AI models. NGBoost also outperformed other models, especially single models for most developing countries, and was more accurate than previously presented optimized models. In addition, sensitivity analysis was conducted on the NGBoost predicted outputs of 30 projects using the SHapley Additive exPlanations (SHAP) method. All variables demonstrated an effect on ETC(t)/PD. The results revealed that the most influential input variables in order of importance were actual time (AT) to PD, regulatory quality (RQ), earned duration (ED) to PD, schedule cost index (SCI), planned complete percentage, rule of law (RL), actual complete percentage (ACP) and ETC(t) of the ES optimal equation to PD. The probabilistic hybrid model was selected based on the outputs predicted by the NGBoost and XGBoost models and the MAPE values from three AI models. The 95% prediction interval of the NGBoost–XGBoost model revealed that 96.10 and 98.60% of the actual output values of the testing and training sets are within this interval, respectively.

Research limitations/implications

Due to the use of projects performed in different countries, it was not possible to distribute the questionnaire to the managers and stakeholders of 30 projects in six developing countries. Due to the low number of EVM-based projects in various references, it was unfeasible to utilize other types of projects. Future prospects include evaluating the accuracy and stability of NGBoost for timely and non-fluctuating projects (mostly in developed countries), considering a greater number of legal/institutional variables as input, using legal/institutional/internal/inflation inputs for complex projects with extremely high uncertainty (such as bridge and road construction) and integrating these inputs and NGBoost with new technologies (such as blockchain, radio frequency identification (RFID) systems, building information modeling (BIM) and Internet of things (IoT)).

Practical implications

The legal/institutional recommendations made to governments are strict control of prices, adequate supervision, removal of additional rules, removal of unfair regulations, clarification of the future trend of a law change, strict monitoring of property rights, simplification of the processes for obtaining permits and elimination of unnecessary changes, particularly in developing countries and at the onset of irregular projects with limited information and numerous uncertainties. Furthermore, the managers and stakeholders of this group of projects were informed at an early stage of the significance of seven construction variables (institutional/legal external risks, internal factors and inflation): using time series (dynamic) models to predict AD, accurate calculation of progress percentage variables, the effectiveness of building type in non-residential projects, regular updating of inflation during implementation, the effectiveness of employer type in the early stage of public projects in addition to the late stage of private projects, and allocating reserve duration (buffer) in order to respond to institutional/legal risks.

Originality/value

Ensemble methods were optimized in 70% of the reviewed references. To the authors' knowledge, NGBoost, from the set of ensemble methods, has not previously been used to estimate construction project duration and delays. NGBoost is an effective method for considering uncertainties in irregular projects, which are often encountered in developing countries. Furthermore, existing AD estimation models fail to incorporate RQ and RL from the World Bank's worldwide governance indicators (WGI) as risk-based inputs. In addition, the various WGI, EVM and inflation variables have not been combined with substantial degrees of institutional delay risks as inputs. Consequently, due to the existence of critical and complex risks in different countries, it is vital to consider legal and institutional factors. This is especially recommended if an in-depth, accurate and reality-based method like SHAP is used for analysis.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 16 May 2023

Fátima García-Martínez, Diego Carou, Francisco de Arriba-Pérez and Silvia García-Méndez

Abstract

Purpose

Material extrusion is one of the most commonly used approaches among the additive manufacturing processes available. Despite its popularity and related technical advancements, process reliability and quality assurance remain only partially solved. In particular, the surface roughness caused by this process is a key concern. To address this constraint, experimental plans have been exploited to optimize surface roughness in recent years. However, this empirical trial-and-error process is extremely time- and resource-consuming. Thus, this study aims to avoid using large experimental programs to optimize surface roughness in material extrusion.

Design/methodology/approach

This research provides an in-depth analysis of the effect of several printing parameters: layer height, printing temperature, printing speed and wall thickness. The proposed data-driven predictive modeling approach takes advantage of Machine Learning (ML) models to automatically predict surface roughness based on the data gathered from the literature and the experimental data generated for testing.

Findings

Using ten-fold cross-validation of data gathered from the literature, the proposed ML solution attains a 0.93 correlation with a mean absolute percentage error of 13%. When testing with our own data, the correlation diminishes to 0.79 and the mean absolute percentage error reduces to 8%. Thus, the solution for predicting surface roughness in extrusion-based printing offers competitive results regarding the variability of the analyzed factors.
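The ten-fold cross-validation above partitions the data so that each sample is tested exactly once while the remaining nine folds train the model; a minimal index-splitting sketch (fold sizes and counts here are illustrative):

```python
def k_fold_indices(n, k=10):
    """Split n sample indices into k roughly equal, disjoint folds.
    Each fold serves once as the test set; the rest form the training set."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)  # spread the remainder
        folds.append(list(range(start, start + size)))
        start += size
    return folds

folds = k_fold_indices(100, k=10)
print([len(f) for f in folds])
```

Reported cross-validated scores (such as the 0.93 correlation) are then averages over the k held-out folds.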

Research limitations/implications

There are limitations in obtaining large volumes of reliable data, and the variability of the material extrusion process is relatively high.

Originality/value

Although ML is not a novel methodology in additive manufacturing, the use of published data from multiple sources has barely been exploited to train predictive models. As available manufacturing data continue to increase on a daily basis, the ability to learn from these large volumes of data is critical in future manufacturing and science. Specifically, the power of ML helps model surface roughness with limited experimental tests.

Details

Rapid Prototyping Journal, vol. 29 no. 8
Type: Research Article
ISSN: 1355-2546

Article
Publication date: 2 April 2024

Yixue Shen, Naomi Brookes, Luis Lattuf Flores and Julia Brettschneider

Abstract

Purpose

In recent years, there has been a growing interest in the potential of data analytics to enhance project delivery. Yet many argue that its application in projects is still lagging behind other disciplines. This paper aims to provide a review of the current use of data analytics in project delivery encompassing both academic research and practice to accelerate current understanding and use this to formulate questions and goals for future research.

Design/methodology/approach

We address this aim through a systematic review of the status of data analytics in project delivery. Fusing the methodology of an integrative literature review with the recently established practice of including both white and grey literature yields an approach tailored to the state of the domain. It serves to delineate a research agenda informed by current developments in both academic research and industrial practice.

Findings

The literature review reveals a dearth of work in both academic research and practice relating to data analytics in project delivery and characterises this situation as having “more gap than knowledge.” Some work does exist on the application of machine learning to predicting project delivery, though this is restricted to disparate, single-context studies that do not reach extendible findings on algorithm selection or key predictive characteristics. Grey literature addresses the potential benefits of data analytics in project delivery, but in a manner reliant on “thought-experiments” and devoid of empirical examples.

Originality/value

Based on the review, we articulate a research agenda to create knowledge fundamental to the effective use of data analytics in project delivery. This agenda is structured around the functional framework devised by this investigation and highlights both organisational and data-analytic challenges. Specifically, we express this structure in the form of an “onion-skin” model for the conceptual structuring of data analytics in projects. We conclude with a discussion of whether and how today’s project studies research community can respond to the totality of these challenges. This paper provides a blueprint for a bridge connecting data analytics and project management.

Details

International Journal of Managing Projects in Business, vol. 17 no. 2
Type: Research Article
ISSN: 1753-8378
