Search results

1–10 of over 14,000
Open Access
Article
Publication date: 13 August 2020

Mariam AlKandari and Imtiaz Ahmad

Solar power forecasting will have a significant impact on the future of large-scale renewable energy plants. Predicting photovoltaic power generation depends heavily on climate…


Abstract

Solar power forecasting will have a significant impact on the future of large-scale renewable energy plants. Predicting photovoltaic power generation depends heavily on climate conditions, which fluctuate over time. In this research, we propose a hybrid model that combines machine-learning methods with the Theta statistical method for more accurate prediction of future solar power generation from renewable energy plants. The machine-learning models include long short-term memory (LSTM), gated recurrent unit (GRU), AutoEncoder LSTM (Auto-LSTM) and a newly proposed Auto-GRU. To enhance the accuracy of the proposed Machine Learning and Statistical Hybrid Model (MLSHM), we employ two diversity techniques, i.e. structural diversity and data diversity. To combine the predictions of the ensemble members in the proposed MLSHM, we exploit four combining methods: simple averaging, weighted averaging with linear weights, weighted averaging with non-linear weights, and combination through inverse variance. The proposed MLSHM scheme was validated on two real time-series datasets, namely Shagaya in Kuwait and Cocoa in the USA. The experiments show that the proposed MLSHM, using all the combination methods, achieved higher accuracy than the traditional individual models. Results demonstrate that a hybrid model combining machine-learning methods with a statistical method outperformed a hybrid model that combines only machine-learning models without a statistical method.
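As a rough illustration of the four combining methods listed in the abstract, the following Python sketch averages ensemble member forecasts with equal, linear, non-linear and inverse-variance weights. The weighting formulas and the validation-error inputs are assumptions for illustration, not the paper's exact definitions.

```python
import numpy as np

def combine_predictions(preds, val_errors=None, method="simple"):
    """Combine ensemble member forecasts (rows = members, cols = time steps).

    Illustrative sketch of the four combination strategies named in the
    abstract; the paper's exact weighting formulas may differ.
    """
    preds = np.asarray(preds, dtype=float)
    if method == "simple":                      # plain mean of all members
        return preds.mean(axis=0)
    err = np.asarray(val_errors, dtype=float)   # per-member validation error
    if method == "weighted_linear":             # weight ~ 1 / error
        w = 1.0 / err
    elif method == "weighted_nonlinear":        # weight ~ exp(-error)
        w = np.exp(-err)
    elif method == "inverse_variance":          # weight ~ 1 / squared error
        w = 1.0 / err**2
    else:
        raise ValueError(f"unknown method: {method}")
    w = w / w.sum()                             # normalize weights to sum to 1
    return w @ preds                            # weighted average per time step

# Example: three members (e.g. LSTM, GRU, Theta) forecasting four steps.
member_preds = [[5.1, 5.3, 4.9, 5.0],
                [4.8, 5.0, 4.7, 4.9],
                [5.4, 5.6, 5.2, 5.3]]
print(combine_predictions(member_preds, val_errors=[0.30, 0.25, 0.40],
                          method="inverse_variance"))
```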

Details

Applied Computing and Informatics, vol. 20 no. 3/4
Type: Research Article
ISSN: 2634-1964


Article
Publication date: 29 September 2023

Oliver Csernyava, Jozsef Pavo and Zsolt Badics

This study aims to model and investigate low-loss wave-propagation modes across random media. The objective is to achieve better channel properties for applying radio links…

Abstract

Purpose

This study aims to model and investigate low-loss wave-propagation modes across random media. The objective is to achieve better channel properties for radio links through random vegetation (e.g. forest) using a beamforming approach, thereby obtaining the link between the statistical parameters of the media and the channel properties.

Design/methodology/approach

A beamforming approach is used to obtain low-loss propagation across random media constructed of long cylinders, i.e. a simplified two-dimensional (2D) model of agroforests. The statistical properties of eigenmode radio wave propagation are studied following a Monte Carlo method. An error quantity is defined to represent the robustness of an eigenmode, and it is shown to follow a known lognormal distribution, thereby providing a base for further statistical investigations.
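The eigenmode computation itself is beyond a short example, but the statistical step, collecting an error quantity over Monte Carlo realizations of the trunk diameters and checking a lognormal fit, can be sketched as follows. The error functional and all numeric values are placeholders, not the paper's model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def eigenmode_error(trunk_diameters):
    # Placeholder for the paper's error quantity: here just a synthetic
    # positive functional of the perturbed trunk diameters.
    return float(np.abs(trunk_diameters - trunk_diameters.mean()).sum())

# Monte Carlo over random forests: perturb tree trunk diameters and
# collect the error quantity for each realization.
samples = []
for _ in range(2000):
    diameters = rng.normal(loc=0.30, scale=0.05, size=100)  # metres, assumed
    samples.append(eigenmode_error(diameters))

# Fit a lognormal distribution and check the fit (Kolmogorov-Smirnov).
shape, loc, scale = stats.lognorm.fit(samples, floc=0.0)
ks_stat, p_value = stats.kstest(samples, "lognorm", args=(shape, loc, scale))
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
```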

Findings

In this study, it is shown, based on a mathematical model, that radio wave propagation eigenmodes exist. The algorithm presented can find modes of propagation that are less affected by the statistical variation of the media than the regular beams used in radio wave communication techniques. It is illustrated that a suitably chosen eigenmode waveform is not significantly perturbed by the natural variation of the tree trunk diameters.

Originality/value

As a new approach to obtain low-loss propagation in random media at microwave frequencies, the presented mathematical model can calculate scattering-free wave-propagation eigenmodes. A robustness quantity is defined for a specific eigenmode, considering a 2D simplified statistical forest example. This new robustness quantity is useful for performing computationally low-cost optimization problems to find eigenmodes for more complex vegetation models.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 42 no. 5
Type: Research Article
ISSN: 0332-1649


Article
Publication date: 11 October 2023

Chinthaka Niroshan Atapattu, Niluka Domingo and Monty Sutrisna

Cost overrun in infrastructure projects is a constant concern, with a need for a proper solution. The current estimation practice needs improvement to reduce cost overruns. This…

Abstract

Purpose

Cost overrun in infrastructure projects is a constant concern, with a need for a proper solution. The current estimation practice needs improvement to reduce cost overruns. This study aimed to find possible statistical modelling techniques that could be used to develop cost models to produce more reliable cost estimates.

Design/methodology/approach

A bibliographic literature review was conducted using a two-stage selection method to compile the relevant publications from Scopus. Then, VOSviewer (Visualisation of Similarities) was used to develop visualisation maps for keyword co-occurrence analysis and yearly trends in research topics.
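For context, the counting that underlies a VOSviewer co-occurrence map reduces to tallying keyword pairs per publication. A minimal sketch with invented records, not the study's actual Scopus data:

```python
from itertools import combinations
from collections import Counter

# Author keywords per publication (illustrative records only).
papers = [
    {"cost overrun", "regression analysis", "infrastructure"},
    {"cost estimation", "artificial neural network", "infrastructure"},
    {"regression analysis", "case-based reasoning", "cost estimation"},
]

# Count how often each keyword pair appears in the same paper -- the raw
# input behind a VOSviewer co-occurrence map.
cooccurrence = Counter()
for keywords in papers:
    for pair in combinations(sorted(keywords), 2):
        cooccurrence[pair] += 1

for pair, count in cooccurrence.most_common(5):
    print(pair, count)
```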

Findings

The study found seven primary techniques used as cost models in construction projects: regression analysis (RA), artificial neural network (ANN), case-based reasoning (CBR), fuzzy logic, Monte-Carlo simulation (MCS), support vector machine (SVM) and reference class forecasting (RCF). RA, ANN and CBR were the most researched techniques. Furthermore, it was observed that the model's performance could be improved by combining two or more techniques into one model.

Research limitations/implications

The research was limited to the findings from the bibliometric literature review.

Practical implications

The findings provided an assessment of statistical techniques that the industry can adopt to improve the traditional estimation practice of infrastructure projects.

Originality/value

This study mapped the research carried out on cost-modelling techniques and analysed the trends. It also reviewed the performance of the models developed for infrastructure projects. The findings could inform further research to develop more reliable, better-performing cost models using statistical modelling techniques.

Details

Smart and Sustainable Built Environment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2046-6099


Open Access
Article
Publication date: 25 April 2023

Manuela Cazzaro and Paola Maddalena Chiodini

Although the Net Promoter Score (NPS) index is simple, it has weaknesses that make its interpretation misleading. The main criticism is that identical index values can…


Abstract

Purpose

Although the Net Promoter Score (NPS) index is simple, it has weaknesses that make its interpretation misleading. The main criticism is that identical index values can correspond to different levels of customer loyalty. This makes it difficult to determine whether the company is improving or deteriorating across two different years. The authors describe the application of statistical tools to establish whether identical values may or may not be considered similar under statistical hypotheses.

Design/methodology/approach

Equal NPSs with a “similar” component composition should have a two-way table satisfying the marginal homogeneity hypothesis. The authors compare the marginals using a cumulative marginal logit model that assumes a proportional odds structure, i.e. the same effect for each logit; marginal homogeneity corresponds to a null effect. If the marginal homogeneity hypothesis is rejected, the cumulative odds ratio becomes a tool for measuring the proportionality between the odds.
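A minimal numerical sketch of these cumulative odds ratios, using an invented detractor/passive/promoter composition in which two years share the same NPS but differ in mix. This is plain arithmetic, not the authors' cumulative marginal logit fit:

```python
import numpy as np

# Respondent counts by NPS class at two time points (illustrative data).
# Order: detractors, passives, promoters.
year1 = np.array([20, 30, 50])
year2 = np.array([30, 10, 60])  # both years give NPS = 30, with different mixes

def cumulative_odds(counts):
    """Odds of being at or below each cut-point of the ordered scale."""
    p = np.cumsum(counts) / counts.sum()
    return p[:-1] / (1.0 - p[:-1])   # drop the last (degenerate) cut-point

# Cumulative odds ratios between the two years; values far from 1 speak
# against marginal homogeneity even though the NPS values coincide.
ratios = cumulative_odds(year1) / cumulative_odds(year2)
print(ratios)
```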

Findings

The authors propose an algorithm that helps managers in their decision-making process. The authors' methodology provides a statistical tool to recognize customer base compositions. The authors suggest a statistical test of the homogeneity of the marginal distributions of the table representing the index compositions at two times. Through the calculation of cumulative odds ratios, the authors test whether equal NPS values truly reflect equivalent customer bases.

Originality/value

The authors' contribution provides a statistical alternative that can be easily implemented by business operators to address the known shortcomings of the index in the customer satisfaction context. This paper confirms that although a single number summarizes and communicates a complex situation very quickly, the number is ambiguous and unreliable if not accompanied by other tools.

Details

The TQM Journal, vol. 35 no. 9
Type: Research Article
ISSN: 1754-2731


Article
Publication date: 7 February 2022

Muralidhar Vaman Kamath, Shrilaxmi Prashanth, Mithesh Kumar and Adithya Tantri

The compressive strength of concrete depends on many interdependent parameters; its exact prediction is not that simple because of complex processes involved in strength…

Abstract

Purpose

The compressive strength of concrete depends on many interdependent parameters; its exact prediction is not simple because of the complex processes involved in strength development. This study aims to predict the compressive strength of normal concrete and high-performance concrete using four datasets.

Design/methodology/approach

In this paper, five established individual machine learning (ML) regression models have been compared: Decision Tree Regression, Random Forest Regression, Lasso Regression, Ridge Regression and Multiple Linear Regression. Four datasets were studied: two from previous research and two obtained from laboratory testing.

Findings

Five statistical indicators, namely the coefficient of determination (R2), mean absolute error, root mean squared error, Nash–Sutcliffe efficiency and mean absolute percentage error, were used to compare the performance of the models. The models were further compared with previous studies using these indicators. Lastly, sensitivity and parametric analyses were carried out to understand the effect of each predictor variable on model performance.
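A hedged sketch of such a comparison on synthetic data is shown below; the five models and five indicators follow the abstract, while the dataset, hyperparameters and metric implementations are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, Ridge, LinearRegression
from sklearn.metrics import mean_absolute_error, r2_score

# Synthetic stand-in for a concrete-mix dataset (features ~ mix proportions).
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "Decision Tree": DecisionTreeRegressor(random_state=0),
    "Random Forest": RandomForestRegressor(random_state=0),
    "Lasso": Lasso(),
    "Ridge": Ridge(),
    "Multiple Linear": LinearRegression(),
}

for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = np.sqrt(np.mean((y_te - pred) ** 2))
    # Nash-Sutcliffe efficiency: 1 - SSE/SST (same formula as R2 here).
    nse = 1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
    mape = np.mean(np.abs((y_te - pred) / y_te)) * 100
    print(f"{name:16s} R2={r2_score(y_te, pred):.3f} "
          f"MAE={mean_absolute_error(y_te, pred):.2f} RMSE={rmse:.2f} "
          f"NSE={nse:.3f} MAPE={mape:.1f}%")
```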

Originality/value

The findings of this paper will allow readers to understand the factors involved in selecting machine learning models and concrete datasets. In so doing, we hope that this research advances the toolset needed to predict compressive strength.

Details

Journal of Engineering, Design and Technology, vol. 22 no. 2
Type: Research Article
ISSN: 1726-0531


Open Access
Article
Publication date: 15 February 2024

Di Kang, Steven W. Kirkpatrick, Zhipeng Zhang, Xiang Liu and Zheyong Bian

Accurately estimating the severity of derailment is a crucial step in quantifying train derailment consequences and, thereby, mitigating its impacts. The purpose of this paper is…

Abstract

Purpose

Accurately estimating the severity of derailment is a crucial step in quantifying train derailment consequences and, thereby, mitigating its impacts. The purpose of this paper is to propose a simplified approach aimed at addressing this research gap by developing a physics-informed 1-D model. The model is used to simulate train dynamics through a time-stepping algorithm, incorporating derailment data after the point of derailment.

Design/methodology/approach

In this study, a simplified approach is adopted that applies a 1-D kinematic analysis with data obtained from various derailments. These include the length and weight of the rail cars behind the point of derailment, the train braking effects, derailment blockage forces, the grade of the track and the train rolling and aerodynamic resistance. Since train braking effects and derailment blockage forces are not always available for historical or potential train derailments, it is also necessary to fit the historical data and find optimal parameters to estimate these two variables. Using these fitted parameters, a detailed comparison can be performed between the physics-informed 1-D model and previous statistical models to predict the derailment severity.
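A highly simplified sketch of such a 1-D time-stepping loop appears below. The force terms follow the abstract's list (braking, blockage, grade, rolling and aerodynamic resistance), but every numeric value is invented rather than taken from the paper's fitted parameters:

```python
# Minimal 1-D time-stepping sketch of the cars behind the point of derailment.
# All parameter values are illustrative, not the paper's fitted ones.
g = 9.81                 # m/s^2
mass = 90_000.0          # kg per car
n_cars = 20
M = n_cars * mass        # total mass behind the point of derailment
grade = -0.005           # track grade (negative = downhill)
C_roll = 0.002           # rolling resistance coefficient
C_aero = 8.0             # aerodynamic drag factor, N/(m/s)^2
F_brake = 250_000.0      # braking force, N (fitted in the paper)
F_block = 400_000.0      # derailment blockage force, N (fitted in the paper)

v = 25.0                 # initial speed, m/s
x = 0.0                  # distance travelled after the point of derailment
dt = 0.01                # time step, s

while v > 0.0:
    # Decelerating forces: braking, blockage, resistance; grade helps or hinders.
    F = -(F_brake + F_block + C_roll * M * g + C_aero * v**2) + M * g * grade
    v = max(0.0, v + (F / M) * dt)   # explicit Euler update
    x += v * dt

print(f"ride-down distance ~ {x:.0f} m")
```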

Findings

The results show that the proposed model outperforms the Truncated Geometric model (the latest statistical model used in prior research) in estimating derailment severity. The proposed model contributes to the understanding and prevention of train derailments and hazmat release consequences, offering improved accuracy for certain scenarios and train types.

Originality/value

This paper presents a simplified physics-informed 1-D model, which could help understand the derailment mechanism and, thus, is expected to estimate train derailment severity more accurately for certain scenarios and train types compared with the latest statistical model. The performance of the braking response and the 1-D model is verified by comparing known ride-down profiles with estimated ones. This validation process ensures that both the braking response and the 1-D model accurately represent the expected behavior.

Details

Smart and Resilient Transportation, vol. 6 no. 1
Type: Research Article
ISSN: 2632-0487


Article
Publication date: 17 April 2024

Jahanzaib Alvi and Imtiaz Arif

The crux of this paper is to unveil efficient features and practical tools that can predict credit default.

Abstract

Purpose

The crux of this paper is to unveil efficient features and practical tools that can predict credit default.

Design/methodology/approach

Annual data of non-financial listed companies were taken from 2000 to 2020, along with 71 financial ratios. The dataset was bifurcated into three panels with three default assumptions. Logistic regression (LR) and k-nearest neighbor (KNN) binary classification algorithms were used to estimate credit default in this research.
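A minimal sketch of this LR-versus-KNN comparison on synthetic stand-in data follows; the 71 features echo the abstract's ratio count, while everything else (class balance, hyperparameters) is assumed:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for 71 financial ratios with a binary default label.
X, y = make_classification(n_samples=2000, n_features=71, n_informative=15,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Scaling matters for KNN (distance-based) and helps LR converge.
for name, clf in [("LR", LogisticRegression(max_iter=1000)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    pipe = make_pipeline(StandardScaler(), clf)
    pipe.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, pipe.predict(X_te)))
```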

Findings

The study's findings revealed that the features used in Model 3 (Case 3) were comparatively the most efficient. Results also showed that KNN achieved higher accuracy than LR, demonstrating the superiority of KNN over LR on this task.

Research limitations/implications

Using only two classifiers limits the comprehensiveness of the comparison, and the research was based on financial data only, leaving sizeable room for including non-financial parameters in default estimation. Both limitations may be directions for future research in this domain.

Originality/value

This study introduces efficient features and tools for credit default prediction using financial data, demonstrating KNN’s superior accuracy over LR and suggesting future research directions.

Details

Kybernetes, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0368-492X


Article
Publication date: 10 June 2024

Zhangtao Peng, Qian Fang, Qing Ai, Xiaomo Jiang, Hui Wang, Xingchun Huang and Yong Yuan

A risk-based method is proposed to identify the dominant influencing factors of secondary lining cracking in an operating mountain tunnel with weak surrounding rock.

Abstract

Purpose

A risk-based method is proposed to identify the dominant influencing factors of secondary lining cracking in an operating mountain tunnel with weak surrounding rock.

Design/methodology/approach

Based on the inspection data from a mountain tunnel in Southwest China, a lognormal proportional hazard model is established to describe the statistical distribution of secondary lining cracks. Then, the model parameters are obtained using the Bayesian regression method, and the importance of the influencing factors can be ranked by the absolute values of the parameters.
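As a rough stand-in for the Bayesian fit, the sketch below estimates a lognormal regression by maximum likelihood on synthetic data and ranks the four factors by absolute coefficient. The factor names follow the abstract; all data and values are invented:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)

# Illustrative covariates per crack record, standardized so that the
# |coefficient| values are comparable across factors.
n = 300
X = rng.normal(size=(n, 4))
true_beta = np.array([0.8, 0.5, 0.3, 0.1])
# Lognormal response: log(severity) = X @ beta + Gaussian noise.
log_y = X @ true_beta + rng.normal(scale=0.5, size=n)

def neg_log_lik(params):
    beta, log_sigma = params[:4], params[4]
    sigma = np.exp(log_sigma)
    z = (log_y - X @ beta) / sigma
    return -np.sum(norm.logpdf(z) - log_sigma)   # lognormal regression likelihood

res = minimize(neg_log_lik, x0=np.zeros(5), method="BFGS")
beta_hat = res.x[:4]

# Rank factors by |coefficient|, mirroring the paper's importance ordering.
names = ["crack location", "rock mass grade", "lining age", "void behind lining"]
for name, b in sorted(zip(names, beta_hat), key=lambda t: -abs(t[1])):
    print(f"{name:20s} |beta| = {abs(b):.2f}")
```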

Findings

The results show that the order of importance of the influencing factors of secondary lining cracks is as follows: location of the crack on the tunnel profile, rock mass grade of the surrounding rock, time to completion of the secondary lining, and void behind the secondary lining. Accordingly, the location of the crack on the tunnel profile and rock mass grade of the surrounding rock are the two most important influencing factors of secondary lining cracks in the investigated mountain tunnel, and appropriate maintenance measures should be focused on these two aspects.

Originality/value

This study provides a general and effective reference for identifying the dominant influencing factors of secondary lining cracks to guide the targeted maintenance in mountain tunnels.

Details

International Journal of Structural Integrity, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1757-9864


Book part
Publication date: 14 December 2023

Steven A. Harrast, Lori Olsen and Yan (Tricia) Sun

Prior research (Harrast, Olsen, & Sun, 2023) analyzes the eight emerging topics to be included in future CPA exams and discusses their importance to career success and appropriate…

Abstract

Prior research (Harrast, Olsen, & Sun, 2023) analyzes the eight emerging topics to be included in future CPA exams and discusses their importance to career success and appropriate teaching locus in light of survey evidence. That research finds that the general topic of data analytics is the most important of the eight emerging topics. To further understand the topics most important to career success, this study analyzes the subtopics underlying the eight emerging topics. The results show that advanced Excel analysis tools, data visualization, and data extraction, transformation and loading (ETL) are the most important data analytics subskills for career success according to professionals, and that these topics should be both introduced and emphasized in the accounting curriculum. The results provide useful information for educators to prioritize general emerging topics and specific subtopics in the accounting curriculum by taking into account the most pressing needs of the profession.

Article
Publication date: 11 June 2024

Shakeel Dilawar, Ahsan Khan, Asif Ur Rehman, Syed Zahid Husain and Syed Husain Imran Jaffery

The purpose of this study was to use the bridge curvature method (BCM) to quantify stress, while multiscale modeling with adaptive coarsening predicted distortions based on…

Abstract

Purpose

The purpose of this study was to use the bridge curvature method (BCM) to quantify stress, while multiscale modeling with adaptive coarsening predicted distortions based on experimentally validated models. The Taguchi method and the response surface method were used to optimize process parameters (energy density, hatch spacing, scanning speed and beam diameter).

Design/methodology/approach

Laser powder bed fusion (LPBF) offers significant design freedom but suffers from residual stresses due to rapid melting and solidification. This study presents a novel approach combining multiscale modeling and statistical optimization to minimize residual stress in SS316L.

Findings

Optimal parameters were identified through simulations and validated with experiments, achieving an 8% deviation. This approach significantly reduced printing costs compared to traditional trial-and-error methods. The analysis revealed a non-monotonic relationship between residual stress and energy density: stress initially increased and then decreased as hatch spacing and scanning speed increased (both of which lower energy density). Additionally, beam diameter had a minimal impact compared to the other energy density parameters.
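The rise-then-fall trend described here is what a second-order response surface captures. Below is a hedged sketch of the response-surface fitting step on synthetic data in coded units; the coefficients and data are invented, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative design points: hatch spacing and scanning speed in coded units.
h = rng.uniform(-1, 1, 60)
v = rng.uniform(-1, 1, 60)
# Synthetic residual stress with a non-monotonic (quadratic) trend plus noise.
stress = 300 + 40*h - 25*v - 30*h**2 - 20*v**2 + 10*h*v + rng.normal(0, 5, 60)

# Second-order response surface: stress ~ b0 + b1 h + b2 v + b3 h^2 + b4 v^2 + b5 h v
A = np.column_stack([np.ones_like(h), h, v, h**2, v**2, h*v])
coef, *_ = np.linalg.lstsq(A, stress, rcond=None)
print("fitted coefficients:", np.round(coef, 1))

# Stationary point of the fitted surface (here a maximum, matching the
# rise-then-fall trend described in the findings).
b1, b2, b3, b4, b5 = coef[1:]
H = np.array([[2*b3, b5], [b5, 2*b4]])          # Hessian of the quadratic
h_opt, v_opt = np.linalg.solve(H, [-b1, -b2])   # solve grad = 0
print(f"stationary point: h = {h_opt:.2f}, v = {v_opt:.2f} (coded units)")
```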

Originality/value

This work offers a unique framework for optimizing LPBF processes by combining multiscale modeling with statistical techniques. The identified optimal parameters and insights into the individual and combined effects of energy density parameters provide valuable guidance for mitigating residual stress in SS316L, leading to improved part quality and performance.

Details

Rapid Prototyping Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1355-2546

