Search results

1 – 10 of 876
Article
Publication date: 10 August 2023

Zvi Schwartz, Jing Ma and Timothy Webb


Abstract

Purpose

Mean absolute percentage error (MAPE) is the primary forecast evaluation metric in hospitality and tourism research; however, its main shortcoming is that it is asymmetric. The asymmetry occurs because over- and under-forecasts introduce bias into forecast evaluation. This study aims to explore the nature of the asymmetry and to design a new measure, one that reduces the asymmetric properties while maintaining MAPE’s scale-free and intuitive interpretation characteristics.

Design/methodology/approach

The study proposes and tests a new forecasting accuracy measure for hospitality revenue management (RM). A computer simulation is used to assess and demonstrate the problem of asymmetry when forecasting with MAPE, and the ability of the new measure (MSapeMER, that is, Mean of Selectively applied Absolute Percentage Error or Magnitude of Error Relative to the estimate) to reduce it. The MSapeMER’s effectiveness is empirically validated using a large set of hotel forecasts.
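The abstract does not spell out the formula, but the name suggests an error that is scaled by the actual value for one direction of miss and by the forecast (the estimate) for the other. The Python sketch below is purely an illustrative reading of that idea, not the authors' published definition: it scales each absolute error by the larger of the actual and the forecast, which caps the penalty on over-forecasts that drives MAPE's asymmetry.

```python
import numpy as np

def mape(actual, forecast):
    """Classic MAPE: absolute error scaled by the actual value."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs(actual - forecast) / actual) * 100

def msape_mer_sketch(actual, forecast):
    """Illustrative 'selective' metric: APE for under-forecasts, MER
    (error relative to the estimate) for over-forecasts, i.e. the error
    is always scaled by the larger of the two values.
    NOTE: an assumed reading of the metric's name, not the published formula."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    denom = np.where(forecast > actual, forecast, actual)
    return np.mean(np.abs(actual - forecast) / denom) * 100

# Same absolute error of 50 rooms, but MAPE penalises the over-forecast twice as hard:
actual   = np.array([ 50.0, 100.0])
forecast = np.array([100.0,  50.0])       # over-forecast, then under-forecast
print(mape(actual, forecast))             # 75.0 -> (100% + 50%) / 2
print(msape_mer_sketch(actual, forecast)) # 50.0 -> (50% + 50%) / 2, scaled by max(A, F)
```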

Findings

The study demonstrates the ability of the MSapeMER to reduce the asymmetry bias generated by MAPE. Furthermore, this study demonstrates that MSapeMER is more effective than previous attempts to correct for asymmetry bias. The results show, via simulation and empirical investigation, that the error metric is more stable and less swayed by the presence of over- and under-forecasts.

Research limitations/implications

It is recommended that hospitality RM researchers and professionals adopt the MSapeMER when using MAPE to evaluate forecasting performance. The MSapeMER removes the potential bias that MAPE invites, given how it is calculated in the presence of over- and under-forecasts, so forecasting evaluations are less likely to be distorted by them.

Practical implications

Hospitality RM practitioners should adopt this measure when MAPE is used, to reduce biased decisions driven by the “asymmetry of MAPE.”

Originality/value

The MAPE error metric exhibits an asymmetry problem, and this paper proposes a more effective solution to reduce biased results, with two major methodological contributions. It is the first to systematically study the characteristics of MAPE’s asymmetry while proposing and testing a measure that considerably reduces the amount of asymmetry. This is a critical contribution because MAPE is the primary forecasting metric in hospitality and tourism studies. The second methodological contribution is a procedure developed to “quantify” the asymmetry. The approach is demonstrated and allows future research to compare asymmetric characteristics among various accuracy measures.

Details

International Journal of Contemporary Hospitality Management, vol. 36 no. 6
Type: Research Article
ISSN: 0959-6119


Article
Publication date: 4 July 2016

Gülşah Hançerlioğulları, Alper Şen and Esra Ağca Aktunç



Abstract

Purpose

The purpose of this paper is to investigate the impact of demand uncertainty on inventory turnover performance through empirical modeling. In particular, the authors use the inaccuracy of quarterly sales forecasts as a proxy for demand uncertainty and study its impact on firm-level inventory turnover ratios.

Design/methodology/approach

The authors use regression analysis to study the effect of various measures on inventory performance, drawing on a sample of financial data for 304 publicly listed US retail firms over the 25-year period from 1985 to 2009.

Findings

Controlling for the effects of retail segment and year, it is found that inventory turnover is negatively correlated with the mean absolute percentage error of quarterly sales forecasts and with gross margin, and positively correlated with capital intensity and sales surprise. These four variables explain 73.7 percent of the variation across firms and over time, and 93.4 percent of the within-firm variation in the data.
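As a rough illustration of the kind of specification these findings describe, the sketch below regresses log inventory turnover on forecast error (MAPE), gross margin, capital intensity and sales surprise with segment and year fixed effects. The data, variable names and coefficients are synthetic stand-ins, not the paper's sample.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for a firm-year panel; column names are illustrative only.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "segment": rng.choice(["apparel", "grocery", "electronics"], n),
    "year": rng.integers(1985, 2010, n),
    "mape": rng.uniform(0.05, 0.60, n),            # error of quarterly sales forecasts
    "margin": rng.uniform(0.10, 0.50, n),          # gross margin
    "capital_intensity": rng.uniform(0.1, 0.9, n),
    "surprise": rng.normal(1.0, 0.2, n),           # realised / forecast sales
})
# Turnover falls with forecast error and margin, rises with capital intensity and surprise.
df["turnover"] = np.exp(2 - 1.5 * df["mape"] - 2 * df["margin"]
                        + 0.8 * df["capital_intensity"] + 0.5 * df["surprise"]
                        + rng.normal(0, 0.1, n))

# Log-linear regression with retail-segment and year fixed effects,
# mirroring the controls described in the abstract.
fit = smf.ols("np.log(turnover) ~ mape + margin + capital_intensity + surprise"
              " + C(segment) + C(year)", data=df).fit()
print(fit.params[["mape", "margin", "capital_intensity", "surprise"]])
```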

Practical implications

In addition to conducting an empirical investigation for the sources of variation in a major operational metric, the results in this study can also be used to benchmark a retailer’s inventory performance against its competitors.

Originality/value

The authors develop a new proxy to measure the demand uncertainty that a firm faces and show that this measure may help to explain the variation in inventory performance.

Details

International Journal of Physical Distribution & Logistics Management, vol. 46 no. 6/7
Type: Research Article
ISSN: 0960-0035


Article
Publication date: 1 March 2016

Daniel W. Williams and Shayne C. Kavanagh


Abstract

This study examines forecast accuracy associated with the forecast of 55 revenue data series of 18 local governments. The last 18 months (6 quarters, or 2 years) of each series are held out for accuracy evaluation. Results show that forecast software, damped trend methods and simple exponential smoothing methods perform best with monthly and quarterly data, and that use of monthly or quarterly data is marginally better than use of annualized data. For monthly data, there is no advantage to converting dollar values to real dollars before forecasting and reconverting using a forecasted index. With annual data, naïve methods can outperform exponential smoothing methods for some types of data, and real dollar conversion generally outperforms nominal dollars. The study suggests benchmark forecast errors and recommends a process for selecting a forecast method.
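A minimal sketch of the hold-out design described above, using statsmodels: fit simple and damped-trend exponential smoothing on all but the last 18 months of a monthly series, then compare hold-out MAPE. The series here is synthetic; the study's actual data are local-government revenues.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import SimpleExpSmoothing, ExponentialSmoothing

# Synthetic monthly revenue series standing in for one of the 55 series.
rng = np.random.default_rng(1)
idx = pd.date_range("2000-01-01", periods=120, freq="MS")
y = pd.Series(1000 + 2 * np.arange(120) + rng.normal(0, 30, 120), index=idx)

train, test = y[:-18], y[-18:]          # hold out the last 18 months

ses = SimpleExpSmoothing(train).fit()
damped = ExponentialSmoothing(train, trend="add", damped_trend=True).fit()

def mape(actual, forecast):
    return float(np.mean(np.abs(actual - forecast) / actual) * 100)

print("SES hold-out MAPE:        ", mape(test, ses.forecast(18)))
print("Damped-trend hold-out MAPE:", mape(test, damped.forecast(18)))
```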

Details

Journal of Public Budgeting, Accounting & Financial Management, vol. 28 no. 4
Type: Research Article
ISSN: 1096-3367

Article
Publication date: 16 October 2020

Julia S. Mehlitz and Benjamin R. Auer


Abstract

Purpose

Motivated by the growing importance of the expected shortfall in banking and finance, this study aims to compare the performance of popular non-parametric estimators of the expected shortfall (i.e. different variants of historical, outlier-adjusted and kernel methods) to each other, selected parametric benchmarks and estimates based on the idea of forecast combination.
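For orientation, the sketch below implements a plain non-parametric (historical) expected shortfall estimator alongside a normal parametric benchmark; the paper's specific historical, outlier-adjusted and kernel variants are not detailed in the abstract, so this is only a generic illustration.

```python
import numpy as np
from scipy import stats

def es_historical(losses, alpha=0.975):
    """Non-parametric (historical) ES: mean loss beyond the empirical alpha-quantile."""
    losses = np.asarray(losses, float)
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def es_normal(losses, alpha=0.975):
    """Parametric benchmark: ES of a fitted normal distribution (upper loss tail)."""
    mu, sigma = np.mean(losses), np.std(losses, ddof=1)
    z = stats.norm.ppf(alpha)
    return mu + sigma * stats.norm.pdf(z) / (1 - alpha)

# Heavy-tailed simulated losses, loosely mirroring the kind of distributional
# settings a simulation study of this type would span.
rng = np.random.default_rng(42)
losses = rng.standard_t(df=4, size=1000)
print("historical ES:", es_historical(losses))
print("normal ES:    ", es_normal(losses))
```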

Design/methodology/approach

Within a multidimensional simulation setup (spanned by different distributional settings, sample sizes and confidence levels), the authors rank the estimators based on classic error measures, as well as an innovative performance profile technique, which the authors adapt from the mathematical programming literature.

Findings

The rich set of results supports academics and practitioners in the search for an answer to the question of which estimators are preferable under which circumstances. This is because no estimator or combination of estimators ranks first in all considered settings.

Originality/value

To the best of their knowledge, the authors are the first to provide a structured simulation-based comparison of non-parametric expected shortfall estimators, study the effects of estimator averaging and apply the mentioned profiling technique in risk management.

Details

The Journal of Risk Finance, vol. 21 no. 4
Type: Research Article
ISSN: 1526-5943


Article
Publication date: 27 March 2024

Xiaomei Liu, Bin Ma, Meina Gao and Lin Chen



Abstract

Purpose

A time-varying grey Fourier model (TVGFM(1,1,N)) is proposed for the simulation of variable-amplitude seasonal fluctuation time series, as traditional grey models cannot capture the time-varying trend well.

Design/methodology/approach

The proposed model couples a Fourier series and linear time-varying terms as the grey action, to describe the characteristics of variable amplitude and seasonality. The truncated Fourier order N is preselected from an alternative order set determined by the Nyquist-Shannon sampling theorem and the principle of simplicity; the optimal Fourier order is then determined by the hold-out method to improve the robustness of the proposed model. Initial value correction and the multiple transformation are also studied to improve precision.
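For readers unfamiliar with grey models, the sketch below implements the classical GM(1,1) baseline that the proposed model extends: the accumulated series, least-squares estimation of the development coefficient and a constant grey action, and the time-response function. The TVGFM(1,1,N) replaces that constant grey action with Fourier-series and linear time-varying terms; that extension is not reproduced here.

```python
import numpy as np

def gm11_forecast(x0, horizon):
    """Classical GM(1,1) with a constant grey action b; the paper's TVGFM(1,1,N)
    replaces b with Fourier plus linear time-varying terms."""
    x0 = np.asarray(x0, float)
    n = len(x0)
    x1 = np.cumsum(x0)                                   # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                        # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]     # development coefficient, grey action
    k = np.arange(n + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # time-response function
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])  # inverse accumulation
    return x0_hat[n:]                                    # out-of-sample forecasts

# Example on a short, nearly exponential series (illustrative values only).
print(gm11_forecast([2.87, 3.28, 3.34, 3.81, 4.17], horizon=3))
```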

Findings

The new model has a broader applicability range as a result of the new grey action, attaining higher fitting and forecasting accuracy. A numerical experiment on a generated monthly time series indicates that the proposed model can accurately fit a variable-amplitude seasonal sequence, with a mean absolute percentage error (MAPE) of only 0.01%, and complex simulations based on the Monte Carlo method confirm the validity of the proposed model. The results for monthly electricity consumption in China's primary industry demonstrate that the proposed model captures the time-varying trend and performs well, with both MAPEF and MAPET below 5%. Moreover, the proposed TVGFM(1,1,N) model is superior to the benchmark models: the grey polynomial model (GMP(1,1,N)), the grey Fourier model (GFM(1,1,N)), the seasonal grey model (SGM(1,1)), the seasonal autoregressive integrated moving average (SARIMA) model and support vector regression (SVR).

Originality/value

Parameter estimation and forecasting for the newly proposed TVGFM are studied, and its good fitting and forecasting accuracy for time-varying-amplitude seasonal fluctuation series is demonstrated through numerical simulations and a case study.

Details

Grey Systems: Theory and Application, vol. 14 no. 3
Type: Research Article
ISSN: 2043-9377


Article
Publication date: 4 January 2022

Satish Kumar, Tushar Kolekar, Ketan Kotecha, Shruti Patil and Arunkumar Bongale


Abstract

Purpose

Excessive tool wear is responsible for damage or breakage of the tool, workpiece, or machining center. Thus, it is crucial to examine tool conditions during the machining process to improve the tool's useful functional life and the surface quality of the final product. AI-based tool wear prediction techniques have proven effective in estimating the Remaining Useful Life (RUL) of the cutting tool; however, the models' predictions need improvement in terms of accuracy.

Design/methodology/approach

This paper presents a methodology that fuses a feature selection technique with state-of-the-art deep learning models. The authors use NASA milling data sets along with vibration signals for tool wear prediction and performance analysis in 15 different fault scenarios. Multiple steps are used for feature selection and ranking. Different Long Short-Term Memory (LSTM) approaches are used to improve the overall prediction accuracy of the model for tool wear. The LSTM models' performance is evaluated using R-square, Mean Absolute Error (MAE), Root Mean Square Error (RMSE) and Mean Absolute Percentage Error (MAPE) metrics.
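The four reported scores are standard regression metrics; a minimal sketch of how they might be computed for a tool-wear prediction is given below, using scikit-learn (the abstract does not specify the implementation, and the wear values here are purely illustrative).

```python
import numpy as np
from sklearn.metrics import (r2_score, mean_absolute_error,
                             mean_squared_error, mean_absolute_percentage_error)

def report_metrics(y_true, y_pred):
    """The four scores reported in the paper for each LSTM variant."""
    return {
        "R2":   r2_score(y_true, y_pred),
        "MAE":  mean_absolute_error(y_true, y_pred),
        "RMSE": np.sqrt(mean_squared_error(y_true, y_pred)),
        "MAPE": mean_absolute_percentage_error(y_true, y_pred) * 100,  # as a percentage
    }

# Illustrative tool-wear values (e.g. flank wear in mm) and model predictions.
y_true = np.array([0.10, 0.18, 0.27, 0.35, 0.44])
y_pred = np.array([0.12, 0.17, 0.25, 0.37, 0.41])
print(report_metrics(y_true, y_pred))
```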

Findings

The R-square of the hybrid model is consistently high, and the model has low MAE, MAPE and RMSE values. The average R-square scores for the LSTM, Bidirectional, Encoder–Decoder and Hybrid LSTM models are 80.43, 84.74, 94.20 and 97.85%, respectively, and the corresponding average MAPE values are 23.46, 22.200, 9.5739 and 6.2124%. The hybrid model shows high accuracy compared to the remaining LSTM models.

Originality/value

The low variance, Spearman Correlation Coefficient and Random Forest Regression methods are used to select the most significant feature vectors for training the miscellaneous LSTM model versions and highlight the best approach. The selected features pass to different LSTM models like Bidirectional, Encoder–Decoder and Hybrid LSTM for tool wear prediction. The Hybrid LSTM approach shows a significant improvement in tool wear prediction.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 7
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 4 December 2017

Gabriel Nani, Isaac Mensah and Theophilus Adjei-Kumi


Abstract

Purpose

A major concern for construction professionals at the rural road agency in Ghana is the problem of fixing contract duration for bridge construction projects in rural areas. The purpose of the study was to develop a tool for construction professionals to forecast duration for bridge projects.

Design/methodology/approach

In all, 100 questionnaires were distributed to professionals at the Department of Feeder Roads to ascertain their views on the work items in a bill of quantities (BOQ) that impact significantly on the duration of bridge construction projects. Historical data for 30 completed bridge projects were also collected from the same Department; the data comprised the executed work items in the BOQs and the actual durations used in completing the works. The qualitative data were analysed using the relative importance index, and the quantitative data were processed and analysed using both the stepwise regression method and the artificial neural network (ANN) technique.
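As a rough sketch of the two modelling steps described (stepwise selection of BOQ work items followed by regression and ANN fitting), the code below uses scikit-learn's forward sequential selection and an MLP in place of the exact stepwise and back-propagation procedures used in the paper; the data and column names are synthetic stand-ins.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 30 completed bridges: BOQ quantities and actual duration.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.uniform(10, 500, size=(30, 6)),
                 columns=["insitu_concrete", "steel_weight", "gravel_subbase",
                          "aggregate_haulage", "formwork", "excavation"])
y = (0.5 * X["insitu_concrete"] + 0.3 * X["steel_weight"]
     + 0.2 * X["gravel_subbase"] + 0.1 * X["aggregate_haulage"]
     + rng.normal(0, 20, 30))                     # duration in days (illustrative)

# Forward selection of work items, standing in for the stepwise method used.
selector = SequentialFeatureSelector(LinearRegression(), n_features_to_select=4,
                                     direction="forward").fit(X, y)
X_sel = X.loc[:, selector.get_support()]
print("selected work items:", X_sel.columns.tolist())

reg = LinearRegression().fit(X_sel, y)            # regression model
ann = make_pipeline(StandardScaler(),             # feed-forward network counterpart
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                 random_state=0)).fit(X_sel, y)
```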

Findings

The identified predictors, namely, in-situ concrete, weight of prefabricated steel components, gravel sub-base and haulage of aggregates, used as independent variables resulted in the development of a regression model with a mean absolute percentage error (MAPE) of 25 per cent and an ANN model with a feed forward back propagation algorithm with an MAPE of 26 per cent at the validation stage. The study has shown that both regression and ANN models are appropriate for predicting the duration of a new bridge construction project.

Research limitations/implications

The predictors used in the developed models are limited to work items in the BOQs of completed bridge construction projects, and the sample size is small.

Practical implications

The study has developed a working tool for practitioners at the agency to forecast contract duration for bridge projects prior to its commencement.

Originality/value

The study has quantified the relationship between the work items in BOQs and the duration of bridge construction projects using the stepwise regression method and the ANN techniques.

Details

Journal of Engineering, Design and Technology, vol. 15 no. 6
Type: Research Article
ISSN: 1726-0531


Article
Publication date: 27 January 2022

Dimitrios D. Kantianis


Abstract

Purpose

This research aims to develop conceptual phase building project cost forecasting models by exploring the relationship of existing plan shape complexity indices and general design morphology parameters with total construction cost.

Design/methodology/approach

Plan shape indices proposed to date in the literature for measuring building design complexity are critically reviewed. Building morphology is also dictated by town planning restrictions such as the plot coverage ratio or the number of storeys. This study analyses historical data collected from 49 residential building projects to develop multiple linear regression (MLR) and artificial neural network (ANN) models for forecasting construction cost. Existing plan shape coefficients are calculated to evaluate the geometrical complexity of the sampled projects. In total, ten regression-based cost estimating equations are derived from stepwise backward and forward methods, and their predictive accuracy is contrasted with performance levels reported in past studies and with ANN models developed in this research with a multilayer perceptron architecture.

Findings

Analysis of plan shape indices revealed that 85.7% of the examined past projects possess a high degree of design complexity, hence resulting in expensive initial decisions. This highlights the need for more effective early design stage decision-making supported by new building economic tools. The most accurate regression model, with a mean absolute percentage error (MAPE) of 19.2%, predicts the log of total cost from the wall-to-floor index and the total building envelope surface. Other explanatory variables resulting in MAPE values in the order of 20%–22% are total volume, volume above ground level, gross floor area below ground level, gross floor area per storey and total number of storeys. The overall MAPE of the regression-based equations is 24.3%, whilst the ANN models are slightly more accurate, with MAPE scores of 21.8% and 21.6% for one and two hidden layers, respectively. The most accurate forecasting model in the research is the ANN with two hidden layers and the sigmoid activation function, which predicts total building cost from total building volume (MAPE of 19.1%).
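Purely as an illustration of the two best-performing specifications reported (a log-cost regression on the wall-to-floor index and envelope surface, and a two-hidden-layer sigmoid ANN on total volume), the sketch below fits both forms on synthetic data; the variable names follow the abstract, but the values and units are invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 49 residential projects.
rng = np.random.default_rng(1)
wall_floor = rng.uniform(0.3, 0.8, 49)           # wall-to-floor index
envelope = rng.uniform(500, 3000, 49)            # envelope surface, m2
volume = rng.uniform(1000, 12000, 49)            # total volume, m3
cost_k = (0.25 * volume * (1 + 0.5 * wall_floor) # total cost in thousands (made up)
          + 0.05 * envelope + rng.normal(0, 100, 49))

# Best regression form reported: log(total cost) ~ wall-to-floor index + envelope surface.
ols = LinearRegression().fit(np.column_stack([wall_floor, envelope]), np.log(cost_k))

# Best ANN form reported: two sigmoid hidden layers predicting total cost from volume.
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8, 8), activation="logistic",
                                 max_iter=20000, random_state=0))
ann.fit(volume.reshape(-1, 1), cost_k)

mape = lambda a, f: np.mean(np.abs(a - f) / a) * 100
print("regression MAPE:", mape(cost_k, np.exp(ols.predict(np.column_stack([wall_floor, envelope])))))
print("ANN MAPE:       ", mape(cost_k, ann.predict(volume.reshape(-1, 1))))
```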

Originality/value

This paper introduces MLR-based and ANN-based conceptual construction cost forecasting models which are founded solely on building morphology design parameters and compare favourably with previous studies, with an average predictive error of less than 25%. This paper is expected to be beneficial to both practitioners and academics in the built environment, supporting more effective cost planning of building projects. The methodology suggested can further be implemented in other countries, provided that accurate and relevant data from historical projects are used.

Details

Journal of Financial Management of Property and Construction, vol. 27 no. 3
Type: Research Article
ISSN: 1366-4387


Article
Publication date: 26 April 2023

Shavkatjon Tulkinov


Abstract

Purpose

Electricity plays an essential role in nations' economic development. However, coal and renewables currently play an important part in electricity production in major world economies. The current study aims to forecast the electricity production from coal and renewables in the USA, China and Japan.

Design/methodology/approach

Two intelligent grey forecasting models – optimized discrete grey forecasting model DGM (1,1,α), and optimized even grey forecasting model EGM (1,1,α,θ) – are used to forecast electricity production. Also, the accuracy of the forecasts is measured through the mean absolute percentage error (MAPE).

Findings

Coal-powered electricity production is decreasing, while renewable energy production is increasing in the major economies (MEs). China's coal-fired electricity production continues to grow. The forecasts generated by the two grey models are more accurate than those generated by the classical EGM (1,1) and DGM (1,1) models and by exponential triple smoothing (ETS).

Originality/value

The study confirms the reliability and validity of grey forecasting models to predict electricity production in the MEs.

Details

Grey Systems: Theory and Application, vol. 13 no. 3
Type: Research Article
ISSN: 2043-9377


Article
Publication date: 17 August 2021

Md Vaseem Chavhan, M. Ramesh Naidu and Hayavadana Jamakhandi


Abstract

Purpose

This paper aims to propose artificial neural network (ANN) and regression models for estimating the thread consumption of multilayered seam assemblies stitched with lock stitch 301.

Design/methodology/approach

In the present study, generalized regression and neural network models are developed by considering the fabric types (woven, nonwoven and multilayer combinations thereof) together with basic sewing parameters: sewing thread linear density, stitch density, needle count and fabric assembly thickness. A feed-forward backpropagation network is used to build the ANN, and the training function trainlm of MATLAB is used to adjust the weight and bias values according to Levenberg-Marquardt optimization. The performance of the networks is measured in terms of the mean squared error, and the layer output is set according to the sigmoid transfer function.

Findings

The proposed ANN and regression models are able to predict thread consumption with greater accuracy for multilayered seam assemblies. The predictability of thread consumption from available geometrical models, regression models and industrial empirical techniques is compared with that of the proposed linear regression, quadratic regression and neural network models. The proposed quadratic regression model showed a good correlation with practical thread consumption values and greater prediction accuracy, with an overall 4.3% error, compared to the other techniques for the given multilayer substrates. Further, the developed ANN also showed good accuracy in predicting thread consumption.
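A quadratic regression of this kind can be sketched with degree-2 polynomial features over the sewing parameters listed in the abstract; the example below uses synthetic data and illustrative units, not the study's measurements, and is not the authors' fitted model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Synthetic stand-in: predictors follow the abstract (thread linear density, stitch
# density, needle count, assembly thickness); thread consumption values are made up.
rng = np.random.default_rng(2)
X = np.column_stack([rng.uniform(20, 60, 80),     # thread linear density (tex)
                     rng.uniform(3, 6, 80),       # stitches per cm
                     rng.uniform(70, 110, 80),    # needle count (Nm)
                     rng.uniform(0.5, 4.0, 80)])  # assembly thickness (mm)
y = (2.0 + 0.8 * X[:, 1] + 1.5 * X[:, 3]
     + 0.2 * X[:, 1] * X[:, 3] + rng.normal(0, 0.2, 80))   # thread consumed per seam length

# Quadratic regression: degree-2 polynomial terms (squares and interactions).
quad = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                     LinearRegression()).fit(X, y)

ape = np.abs(y - quad.predict(X)) / y * 100
# The paper reports an overall error of about 4.3% on its real data; this figure
# is only the in-sample MAPE of the synthetic illustration above.
print(f"in-sample MAPE: {ape.mean():.1f}%")
```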

Originality/value

Estimating the thread consumed during stitching is a prerequisite for inventory management in the garment industry, especially with the introduction of costly high-performance sewing threads. In practice, different types of fabrics are stitched in multilayer combinations at different locations of the stitched product. The ANN and regression models are developed for multilayered seam assemblies of woven and nonwoven fabric blend compositions for better prediction of thread consumption.

Details

Research Journal of Textile and Apparel, vol. 26 no. 4
Type: Research Article
ISSN: 1560-6074

