Search results

1 – 10 of 13
Article
Publication date: 20 June 2024

Hugo Gobato Souto and Amir Moradi

Abstract

Purpose

This study aims to critically evaluate the competitiveness of Transformer-based models in financial forecasting, specifically in the context of stock realized volatility forecasting. It seeks to challenge and extend the assertions of Zeng et al. (2023) regarding the purported limitations of these models in handling temporal information in financial time series.

Design/methodology/approach

Employing a robust methodological framework, the study systematically compares a range of Transformer models, including first-generation and advanced iterations like Informer, Autoformer, and PatchTST, against benchmark models (HAR, NBEATSx, NHITS, and TimesNet). The evaluation encompasses 80 different stocks, four error metrics, four statistical tests, and three robustness tests designed to reflect diverse market conditions and data availability scenarios.
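The HAR benchmark named above regresses next-period realized volatility on daily, weekly and monthly averages of past realized volatility. A minimal sketch of building the HAR regressors (pure Python; the 5- and 22-day windows are the conventional choices in the HAR literature, not details taken from this paper):

```python
def har_regressors(rv, weekly=5, monthly=22):
    """Build HAR regressors: for each day t with enough history, return
    (RV_t, mean of last `weekly` RVs, mean of last `monthly` RVs),
    aligned with the next-day target RV_{t+1}."""
    rows, targets = [], []
    for t in range(monthly - 1, len(rv) - 1):
        daily = rv[t]
        week = sum(rv[t - weekly + 1 : t + 1]) / weekly
        month = sum(rv[t - monthly + 1 : t + 1]) / monthly
        rows.append((daily, week, month))
        targets.append(rv[t + 1])
    return rows, targets
```

The three regressors would then feed an ordinary least-squares fit; the Transformer models above replace this linear mapping with learned attention over the history.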

Findings

The research uncovers that while first-generation Transformer models, like TFT, underperform in financial forecasting, second-generation models like Informer, Autoformer and PatchTST demonstrate remarkable efficacy, especially in scenarios characterized by limited historical data and market volatility. The study also highlights the nuanced performance of these models across different forecasting horizons and error metrics, showcasing their potential as robust tools in financial forecasting, which contradicts the findings of Zeng et al. (2023).

Originality/value

This paper contributes to the financial forecasting literature by providing a comprehensive analysis of the applicability of Transformer-based models in this domain. It offers new insights into the capabilities of these models, especially their adaptability to different market conditions and forecasting requirements, challenging the existing skepticism created by Zeng et al. (2023) about their utility in financial forecasting.

Details

China Finance Review International, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2044-1398

Article
Publication date: 5 July 2024

Aditya Thangjam, Sanjita Jaipuria and Pradeep Kumar Dadabada

Abstract

Purpose

The purpose of this study is to propose a systematic model selection procedure for long-term load forecasting (LTLF) for ex-ante and ex-post cases considering uncertainty in exogenous predictors.

Design/methodology/approach

The different variants of regression models, namely, Polynomial Regression (PR), Generalised Additive Model (GAM), Quantile Polynomial Regression (QPR) and Quantile Spline Regression (QSR), incorporating uncertainty in exogenous predictors like population, Real Gross State Product (RGSP) and Real Per Capita Income (RPCI), temperature and indicators of breakpoints and calendar effects, are considered for LTLF. Initially, the Backward Feature Elimination procedure is used to identify the optimal set of predictors for LTLF. Then, the consistency in model accuracies is evaluated using point and probabilistic forecast error metrics for ex-ante and ex-post cases.
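The Backward Feature Elimination step above can be sketched as a greedy loop that repeatedly drops the predictor whose removal most improves a validation score. This generic skeleton is a sketch of the technique, not the paper's implementation; the `score` callable and feature names are illustrative:

```python
def backward_eliminate(features, score, min_features=1):
    """Greedy backward feature elimination.

    `score(subset)` returns a validation score (higher is better) for a
    model fitted on that subset; features are dropped only while doing
    so strictly improves the score."""
    current = list(features)
    best = score(current)
    while len(current) > min_features:
        # score every one-feature-removed candidate subset
        trials = [(score([f for f in current if f != drop]), drop)
                  for drop in current]
        trial_best, drop = max(trials)
        if trial_best <= best:   # no strict improvement: stop
            break
        best = trial_best
        current.remove(drop)
    return current, best
```

In practice `score` would refit one of the regression models (PR, GAM, QPR or QSR) on the candidate subset and return, say, negative validation RMSE.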

Findings

From this study, it is found that the PR model outperformed in the ex-ante condition, while the QPR model outperformed in the ex-post condition. Further, the QPR model performed consistently across validation and testing periods. Overall, the QPR model excelled in capturing uncertainty in exogenous predictors, thereby reducing over-forecast error and the risk of overinvestment.

Research limitations/implications

These findings can help utilities to align model selection strategies with their risk tolerance.

Originality/value

To propose the systematic model selection procedure in this study, the consistent performance of the PR, GAM, QPR and QSR models is evaluated using the point forecast accuracy metrics Mean Absolute Percentage Error and Root Mean Squared Error and the probabilistic forecast accuracy metric Pinball Score, for ex-ante and ex-post cases, considering uncertainty in the exogenous predictors RGSP, RPCI, population and temperature.
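The Pinball Score mentioned above is the standard quantile (pinball) loss averaged over observations; a minimal implementation for a single quantile level (a sketch, not the paper's code):

```python
def pinball_score(actual, forecast, q):
    """Average pinball loss of quantile-q forecasts (lower is better).
    Under-forecasts are weighted by q, over-forecasts by 1 - q."""
    total = 0.0
    for y, f in zip(actual, forecast):
        total += q * (y - f) if y >= f else (1 - q) * (f - y)
    return total / len(actual)
```

At q = 0.5 this reduces to half the mean absolute error, which is why it is a natural probabilistic companion to the point metrics above.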

Details

Journal of Modelling in Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 22 August 2024

Iman Bashtani and Javad Abolfazli Esfahani

Abstract

Purpose

This study aims to introduce a novel machine learning feature vector (MLFV) method that leverages machine learning to overcome time-consuming computational fluid dynamics (CFD) simulations, rapidly predicting turbulent flow characteristics with acceptable accuracy.

Design/methodology/approach

In this method, CFD snapshots are encoded in a tensor as the input training data. The MLFV then learns relationships in the data with a rod filter, named the feature vector, which learns features by defining functions on it. To demonstrate the accuracy of the MLFV, the method is used to predict the velocity, temperature and turbulent kinetic energy fields of turbulent flow passing over an innovative nature-inspired Dolphin turbulator, based on only ten CFD snapshots.

Findings

The results indicate that the MLFV and CFD contours, alongside scatter plots, show good agreement between predicted and solved data, with R2 ≃ 1. The error percentage contours and histograms also reveal the high precision of the predictions, with MAPE = 7.90E-02, 1.45E-02, 7.32E-02 and NRMSE = 1.30E-04, 1.61E-03, 4.54E-05 for the predicted velocity, temperature and turbulent kinetic energy fields at Re = 20,000, respectively.
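The MAPE and NRMSE metrics reported above can be computed as follows (a sketch; note that NRMSE conventions differ, and the normalisation by data range used here is one common choice that the abstract does not confirm):

```python
import math

def mape(actual, predicted):
    """Mean absolute percentage error, as a fraction (multiply by 100 for %)."""
    return sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def nrmse(actual, predicted):
    """RMSE normalised by the range of the actual data (one common
    convention; the paper does not state which normalisation it uses)."""
    rmse = math.sqrt(sum((a - p) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))
    return rmse / (max(actual) - min(actual))
```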

Practical implications

The method can have state-of-the-art applications in a wide range of CFD simulations with the ability to train based on small data, which is practical and logical regarding the number of required tests.

Originality/value

The paper introduces a novel, innovative and super-fast method named MLFV to address the time-consuming challenges associated with the traditional CFD approach to predict the physics of turbulent heat and fluid flow in real time with the superiority of training based on small data with acceptable accuracy.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0961-5539

Open Access
Article
Publication date: 9 November 2023

Abdulmohsen S. Almohsen, Naif M. Alsanabani, Abdullah M. Alsugair and Khalid S. Al-Gahtani

Abstract

Purpose

The variance between the winning bid and the owner's estimated cost (OEC) is one of the construction management risks in the pre-tendering phase. The study aims to enhance the quality of the owner's estimation for predicting precisely the contract cost at the pre-tendering phase and avoiding future issues that arise through the construction phase.

Design/methodology/approach

This paper integrated artificial neural networks (ANN), deep neural networks (DNN) and time series (TS) techniques to accurately estimate the ratio of a low bid to the OEC (R) for different size contracts and three types of contracts (building, electric and mechanic), based on 94 contracts from King Saud University. The ANN and DNN models were evaluated using mean absolute percentage error (MAPE), mean sum square error (MSSE) and root mean sum square error (RMSSE).

Findings

The main finding is that the ANN provides high accuracy, with MAPE, MSSE and RMSSE of 2.94%, 0.0015 and 0.039, respectively. The DNN's precision was high, with an RMSSE of 0.15 on average.

Practical implications

The owner and consultant are expected to use the study's findings to create more accuracy of the owner's estimate and decrease the difference between the owner's estimate and the lowest submitted offer for better decision-making.

Originality/value

This study fills the knowledge gap by developing an ANN model to handle missing TS data and forecasting the difference between a low bid and an OEC at the pre-tendering phase.

Details

Engineering, Construction and Architectural Management, vol. 31 no. 13
Type: Research Article
ISSN: 0969-9988

Book part
Publication date: 22 July 2024

Bhavya Advani, Anshita Sachan, Udit Kumar Sahu and Ashis Kumar Pradhan

Abstract

A major concern for policymakers and researchers is to ascertain the movement of price levels and employment rates. Predicting the trends of these variables will assist the government in making policies to stabilize the economy. The objective of this chapter is to forecast the unemployment rate and Consumer Price Index (CPI) for the period 2022 to 2031 for the Indian economy. For this purpose, the authors analyse the prediction capability of the univariate auto-regressive integrated moving average (ARIMA) model and the vector autoregressive (VAR) model. The dataset for India's annual CPI and unemployment rate pertains to a 30-year time period from 1991 to 2021. The results show that the inflation forecasts derived from the ARIMA model are more precise than those of the VAR model. In contrast, unemployment rate forecasts obtained from the VAR model are more reliable than those of the ARIMA model. It is also observed that predicted unemployment rates hover around 5.7% in the forthcoming years, while the forecasted inflation rate witnesses an increasing trend.
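As a minimal illustration of the univariate forecasting machinery involved, an AR(1) — the simplest special case of the ARIMA family used in this chapter — can be fitted in closed form by least squares. This sketch is generic (no intercept, for brevity) and is not the chapter's actual specification:

```python
def fit_ar1(series):
    """Least-squares AR(1) coefficient without intercept:
    minimise sum over t of (y_t - phi * y_{t-1})^2."""
    num = sum(series[t - 1] * series[t] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def forecast_ar1(series, phi, steps):
    """Iterate y_{t+1} = phi * y_t forward from the last observation."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = phi * last
        out.append(last)
    return out
```

A full ARIMA adds differencing and moving-average terms, and a VAR generalises the same recursion to a vector of series with a coefficient matrix.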

Details

Modeling Economic Growth in Contemporary India
Type: Book
ISBN: 978-1-80382-752-0

Article
Publication date: 6 May 2024

Issah Ibrahim and David Lowther

Abstract

Purpose

Evaluating the multiphysics performance of an electric motor can be a computationally intensive process, especially where several complex subsystems of the motor are coupled together. For example, evaluating acoustic noise requires the coupling of the electromagnetic, structural and acoustic models of the electric motor. Where skewed poles are considered in the design, the problem becomes a purely three-dimensional (3D) multiphysics problem, which could increase the computational burden astronomically. This study, therefore, aims to introduce surrogate models in the design process to reduce the computational cost associated with solving such 3D-coupled multiphysics problems.

Design/methodology/approach

The procedure involves using the finite element (FE) method to generate a database of several skewed rotor pole surface-mounted permanent magnet synchronous motors and their corresponding electromagnetic, structural and acoustic performances. Then, a surrogate model is fitted to the data to generate mapping functions that could be used in place of the time-consuming FE simulations.
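In the simplest one-dimensional case, such a mapping function can be a piecewise-linear interpolant over the FE samples, so new designs are evaluated without rerunning the solver. The sketch below is illustrative only: the skew-angle/sound-pressure pairing is hypothetical data, not results from the paper, and the actual surrogate is fitted over many more inputs:

```python
def make_surrogate(samples):
    """Build a piecewise-linear surrogate from (input, output) FE samples,
    so new designs can be evaluated without rerunning the FE solver."""
    pts = sorted(samples)

    def predict(x):
        if x <= pts[0][0]:
            return pts[0][1]      # clamp below the sampled range
        if x >= pts[-1][0]:
            return pts[-1][1]     # clamp above the sampled range
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

    return predict

# hypothetical (skew angle in degrees, sound pressure level) FE samples
noise = make_surrogate([(0, 72.0), (5, 66.0), (10, 63.0)])
```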

Findings

It was established that the surrogate models showed promising results in predicting the multiphysics performance of skewed pole surface-mounted permanent magnet motors. As such, such models could be used to handle the skewing aspects, which has always been a major design challenge due to the scarcity of simulation tools with stepwise skewing capability.

Originality/value

The main contribution involves the use of surrogate models to replace FE simulations during the design cycle of skewed pole surface-mounted permanent magnet motors without compromising the integrity of the electromagnetic, structural, and acoustic results of the motor.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 43 no. 3
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 25 June 2024

Junseo Bae

Abstract

Purpose

The main objectives of this study are to (1) develop and test a cost contingency learning model that can generalize initially estimated contingency amounts by retrospectively analyzing the multiple project changes experienced and (2) uncover the hidden link of the learning networks using a curve-fitting technique for the post-construction evaluation of cost contingency amounts to cover cost risk for future projects.

Design/methodology/approach

Based on a total of 1,434 datapoints collected from design-bid-build (DBB) and design-build (DB) transportation projects, a post-construction cost contingency learning model was developed using feedforward neural networks (FNNs). The developed model generalizes cost contingencies under the two project delivery methods (i.e. DBB and DB). The learning outputs of generalized contingency amounts were curve-fitted with the post-construction schedule and cost information, specifically aiming at uncovering the hidden link of the FNNs. Two different bridge projects completed under DBB and DB were employed as illustrative examples to demonstrate how the proposed modeling framework could be implemented.

Findings

With zero or negative values of change growth experienced, it was concluded that cost contingencies were overallocated at the contract stage. On the other hand, with positive values of change growth experienced, the set cost contingencies were found to be insufficient from the post-construction standpoint. Taken together, this study proposed a tangible post-construction evaluation technique that can produce not only plausible ranges of cost contingencies but also exact amounts of contingency under DBB and DB contracts.

Originality/value

As the first of its kind, the proposed modeling framework provides agency engineers and decision-makers with tangible assessments of cost contingency coupled with experienced risks at the post-construction stage. Use of the proposed model will help them evaluate the allocation of appropriate contingency amounts. If an agency allocates a cost contingency benchmarked from similar projects on aspects of the base estimate and experienced risks, a set contingency can be defended more reliably. The main findings of this study contribute to post-construction cost contingency verification, enabling agency engineers and decision-makers to systematically evaluate set cost contingencies during the post-construction assessment stage and to achieve an enhanced level of confidence in future cost contingency plans.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 31 May 2024

Shikha Pandey, Sumit Gandhi and Yogesh Iyer Murthy

Abstract

Purpose

The purpose of this study is to compare prediction models for the half-cell potential (HCP) of RCC slabs cathodically protected using pure magnesium anodes and subjected to chloride ingress. The models for HCP, based on 1,134 experimental data set values, are developed and compared using ANFIS, artificial neural network (ANN) and integrated ANN-GA algorithms.

Design/methodology/approach

In this study, RCC slabs, 1000 mm × 1000 mm × 100 mm were cast. Five slabs were cast with 3.5% NaCl by weight of cement, and five more were cast without NaCl. The distance of the point under consideration from the anode in the x- and y-axes, temperature, relative humidity and age of the slab in days were the input parameters, while the HCP values with reference to the Standard Calomel Electrode were the output. Experimental values consisting of 80 HCP values per slab per day were collected for 270 days and were averaged for both cases to generate the prediction model.

Findings

In this study, the premise and consequent parameters are trained, validated and tested using ANFIS, ANN and ANN as the fitness function of GA. The MAPE, RMSE and MAE of the ANFIS model were 24.57, 1702.601 and 871.762, respectively. Amongst the ANN algorithms, the Levenberg–Marquardt (LM) algorithm outperforms the other methods, with an overall R-value of 0.983. GA with ANN as the objective function proves to be the best means for developing the prediction model.

Originality/value

Based on the original experimental values, the performance of ANFIS, ANN and GA with ANN as objective function provides excellent results.

Details

Anti-Corrosion Methods and Materials, vol. 71 no. 5
Type: Research Article
ISSN: 0003-5599

Open Access
Article
Publication date: 20 August 2024

Quang Phung Duy, Oanh Nguyen Thi, Phuong Hao Le Thi, Hai Duong Pham Hoang, Khanh Linh Luong and Kim Ngan Nguyen Thi

Abstract

Purpose

The goal of the study is to offer important insights into the dynamics of the cryptocurrency market by analyzing pricing data for Bitcoin. Using quantitative analytic methods, the study employs a Generalized Autoregressive Conditional Heteroskedasticity (GARCH) model and an Autoregressive Integrated Moving Average (ARIMA) model. The study examines how predictable Bitcoin price swings and market volatility were between 2021 and 2023.

Design/methodology/approach

The data used in this study are the daily closing prices of Bitcoin from Jan 17th, 2021 to Dec 17th, 2023, corresponding to a total of 1,065 observations. The estimation process is run using three years of data (2021–2023), while the remaining period (Jan 1st 2024 to Jan 17th 2024) is used for forecasting. The ARIMA-GARCH method is a robust framework for forecasting time series data with non-seasonal components. The model was selected based on the minimum corrected Akaike Information Criterion (AICc) and maximum log-likelihood. Model adequacy was checked using residual plots and the Ljung–Box test.
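The AICc criterion used above for model selection penalises the maximised log-likelihood by the parameter count, with a small-sample correction term; a minimal sketch (k counts estimated parameters, n the sample size):

```python
def aic(log_likelihood, k):
    """Akaike Information Criterion: lower is better."""
    return 2 * k - 2 * log_likelihood

def aicc(log_likelihood, k, n):
    """AIC with the small-sample correction term 2k(k+1)/(n-k-1)."""
    return aic(log_likelihood, k) + 2 * k * (k + 1) / (n - k - 1)
```

As n grows, the correction term vanishes and AICc converges to AIC, which is why the correction matters most for short series relative to model size.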

Findings

Using the Box–Jenkins method, various AR and MA lags were tested to determine the most optimal lags. ARIMA(12,1,12) is the most appropriate model among those tested, selected using AIC. As financial time series such as Bitcoin returns can be volatile, this volatility is modelled using GARCH(1,1).
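The GARCH(1,1) model referenced above updates the conditional variance from the previous squared shock and the previous variance; a sketch of the recursion (parameter values in the test are illustrative, not the paper's estimates):

```python
def garch11_variances(residuals, omega, alpha, beta, sigma2_0):
    """Conditional variance path of a GARCH(1,1) process:
    sigma2_t = omega + alpha * e_{t-1}^2 + beta * sigma2_{t-1},
    started from sigma2_0. Covariance stationarity needs alpha + beta < 1."""
    sigma2 = [sigma2_0]
    for e in residuals[:-1]:
        sigma2.append(omega + alpha * e ** 2 + beta * sigma2[-1])
    return sigma2
```

In an ARIMA-GARCH pipeline, `residuals` would be the innovations left over from the fitted ARIMA mean equation.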

Originality/value

The study used partially processed secondary data, fit for time series analysis, with the ARIMA(12,1,12)-GARCH(1,1) model, hence yielding reliable and conclusive results.

Details

Business Analyst Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0973-211X

Article
Publication date: 3 September 2024

Jamal A.A. Numan and Izham Mohamad Yusoff

Abstract

Purpose

Due to the lack of consensus on influential variables for real estate appraisal, which varies from one country to another based on the national preferences and customs of each country, this study aims to identify the most influential variables affecting condominium apartment real estate appraisal within the context of Al Bireh city, Palestine.

Design/methodology/approach

The methodology adopts a cross-sectional quantitative approach, entailing the administration of an online questionnaire survey to 103 buyers and appraisers. The questionnaire aims to evaluate 32 variables concerning their impact on condominium real estate appraisal. Out of these, 25 are derived from three specific previous studies, and the remaining 7 are identified through various studies or by the authors, taking into account the local context and the geopolitical situation in Palestine. Respondents assign scores to these variables on a five-point Likert scale, ranging from 1 to 5, where 1 indicates less influence and 5 signifies the most influence. Variables with an arithmetic mean score exceeding 4 are deemed the most influential.
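The selection rule described above — keep variables whose arithmetic mean Likert score exceeds 4 — can be expressed directly (a sketch; the variable names in the test are illustrative, though "parking" does appear in the paper's findings):

```python
def influential_variables(scores_by_variable, threshold=4.0):
    """Return variables whose arithmetic mean score on the 1-5 Likert
    scale strictly exceeds `threshold`, sorted by mean (descending)."""
    means = {v: sum(s) / len(s) for v, s in scores_by_variable.items()}
    keep = [(m, v) for v, m in means.items() if m > threshold]
    return [v for m, v in sorted(keep, reverse=True)]
```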

Findings

The findings underscore 16 and 17 influential variables as perceived by buyers and appraisers, respectively. Notably, 13 variables are common, including aspects such as parking, elevator, neighborhood, floor apartments, finishing quality, construction material, condition, building apartments, area, open sides, building floors, colonies and age.

Research limitations/implications

The primary constraint of this study is its dependence on insights solely from buyers and appraisers, disregarding input from other stakeholders like investors or developers. The questionnaire lacks vital respondent characteristics, such as gender and occupation, impeding the analysis of variable dependence on participant attributes. Although some additional influential variables are suggested through the responses of an open-ended question, the questionnaire is not repeated, leaving their influence unassessed. This study's focus on Al Bireh city limits the opportunity for result comparison with other cities, diminishing its generalizability.

Practical implications

The implications of this research are twofold: to provide stakeholders with a checklist for variables influencing apartment price value and to guide data collection related to the real estate appraisal sector, facilitating its use as input in advanced appraisal methods such as artificial intelligence with a view to improving overall performance. Obtaining an informed, mature and accurate appraisal has direct economic, business and financial impacts at the level of policymakers and individuals.

Originality/value

To the best of the authors' knowledge, this study is the inaugural endeavor in Palestine focusing on identifying pivotal factors influencing condominium apartment appraisal values. This study concludes by presenting a checklist comprising the most influential variables, offering utility to various stakeholders, including buyers, appraisers and developers. In addition, the questionnaire incorporates an open-ended question, soliciting respondents' input on additional variables they believe impact the appraisal process.

Details

Journal of Facilities Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1472-5967
