Abstract
Purpose
To provide an overview of how a number of frequently used smoothing‐based forecasting techniques can be modelled for use in dynamic analysis of production‐inventory systems.
Design/methodology/approach
The smoothing techniques are modelled using transfer functions and state space representation. Basic control theory is used for analysing the dynamic properties.
Findings
A set of expressions is derived for the smoothing techniques, and their dynamic properties are identified.
Practical implications
Dynamic properties are important in many applications. It is shown that the different smoothing techniques can have very different influences on the dynamic behaviour and therefore should be considered as a factor when smoothing parameters are decided on.
Originality/value
Dynamic behaviour of production‐inventory systems can be analysed using control theory based on, e.g. transfer functions or state space models. In this paper a set of models for five common smoothing techniques are analysed and their respective dynamic properties are highlighted.
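The dynamic behaviour discussed above can be illustrated with a small sketch (not from the paper): treating single exponential smoothing as a first-order discrete transfer function and examining its step response for different smoothing constants. The recursion and constants here are standard SES, not the paper's specific models.

```python
import numpy as np

# SES as a discrete transfer function: F(z) = alpha / (z - (1 - alpha)),
# i.e. forecast f[t] = (1 - alpha) * f[t-1] + alpha * y[t-1].
def step_response(alpha, n=20):
    """Response of the SES filter to a unit step in demand."""
    f = np.zeros(n)
    for t in range(1, n):
        f[t] = (1 - alpha) * f[t - 1] + alpha * 1.0
    return f

slow = step_response(0.1)   # heavy smoothing: slow, sluggish response
fast = step_response(0.7)   # light smoothing: fast but noise-sensitive
```

The step response makes the trade-off visible: a small smoothing constant filters noise but reacts slowly to genuine demand changes, which is exactly the kind of dynamic property control-theoretic analysis exposes.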
Mark T. Leung, Rolando Quintana and An-Sing Chen
Demand forecasting has long been an imperative tenet in production planning especially in a make-to-order environment where a typical manufacturer has to balance the issues of…
Abstract
Demand forecasting has long been an imperative tenet in production planning, especially in a make-to-order environment where a typical manufacturer has to balance the issues of holding excessive safety stocks and experiencing possible stockouts. Many studies provide pragmatic paradigms to generate demand forecasts (mainly based on smoothing forecasting models). At the same time, artificial neural networks (ANNs) have been emerging as alternatives. In this chapter, we propose a two-stage forecasting approach, which combines the strengths of a neural network with a more conventional exponential smoothing model. In the first stage of this approach, a smoothing model estimates the series of demand forecasts. In the second stage, a general regression neural network (GRNN) is applied to learn and then correct the errors of the estimates. Our empirical study evaluates the use of different static and dynamic smoothing models and calibrates their synergies with GRNN. Various statistical tests are performed to compare the performances of the two-stage models (with error correction by neural network) with those of the original single-stage models (without error correction by neural network). Comparisons with the single-stage GRNN are also included. Statistical results show that neural network correction improves the forecasts made by all examined smoothing models and can outperform the single-stage GRNN in most cases. Relative performances at different levels of demand lumpiness are also examined.
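A minimal sketch of the two-stage idea described above, assuming SES for stage one and a Gaussian-kernel GRNN (a Nadaraya–Watson estimator) learning the residuals; the toy series, parameters and feature choice are illustrative, not the chapter's actual setup.

```python
import numpy as np

def ses(y, alpha=0.3):
    """Single exponential smoothing; returns one-step-ahead forecasts."""
    f = np.empty_like(y, dtype=float)
    f[0] = y[0]
    for t in range(1, len(y)):
        f[t] = alpha * y[t - 1] + (1 - alpha) * f[t - 1]
    return f

def grnn_predict(x_train, e_train, x_query, sigma=1.0):
    """GRNN prediction = Gaussian-kernel weighted average of
    training targets (Nadaraya-Watson estimator)."""
    w = np.exp(-(x_train - x_query) ** 2 / (2 * sigma ** 2))
    return np.sum(w * e_train) / np.sum(w)

# toy demand series
rng = np.random.default_rng(0)
y = 100 + 10 * np.sin(np.arange(40) / 3) + rng.normal(0, 2, 40)

# stage 1: smoothing forecasts; stage 2: GRNN learns the residuals
f = ses(y)
e = y - f                 # stage-1 errors
x = f                     # feature: the stage-1 forecast itself
corrected = np.array([
    f[t] + grnn_predict(x[:t], e[:t], x[t], sigma=2.0)
    for t in range(10, len(y))
])
```

The key design choice is that the neural network never forecasts demand directly; it only models the systematic part of the smoothing model's errors and adds that correction back.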
The work of Scott, Bruce and Cooper on small firm growth and development is reviewed. It is shown that by adapting exponential smoothing forecasting procedures it is possible to…
Abstract
The work of Scott, Bruce and Cooper on small firm growth and development is reviewed. It is shown that by adapting exponential smoothing forecasting procedures it is possible to monitor the commercial health of a small firm. This is achieved by ‘tracking’ key indicators and producing an exception message when a signal exceeds certain predetermined control limits. The procedure is equally effective for either a step or ramp change in the underlying input data. This suggested approach requires little sophistication in either data or technique and has a practical application to small firm management, while adding to our understanding of the process of growth of small businesses.
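The monitoring idea above — smoothing a key indicator and raising an exception when a signal breaches control limits — can be sketched with a standard tracking signal (smoothed error over smoothed absolute error, in the style of Trigg); the constants and limit here are illustrative assumptions, not the paper's values.

```python
def tracking_signal(y, alpha=0.2, beta=0.1, limit=0.5):
    """Monitor a key indicator with exponential smoothing; flag an
    exception when |smoothed error / smoothed absolute error|
    exceeds the control limit."""
    forecast, e_bar, mad = y[0], 0.0, 1e-9
    flags = []
    for t in range(1, len(y)):
        err = y[t] - forecast
        e_bar = beta * err + (1 - beta) * e_bar
        mad = beta * abs(err) + (1 - beta) * mad
        flags.append(abs(e_bar / mad) > limit)
        forecast = alpha * y[t] + (1 - alpha) * forecast
    return flags

# stable indicator followed by a step change
y = [100.0] * 12 + [130.0] * 8
flags = tracking_signal(y)
```

A step change drives the smoothed error and smoothed absolute error in the same direction, pushing the ratio toward 1 and triggering the exception message; a ramp does the same more gradually, which is why the approach handles both.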
D.M.K.N. Seneviratna and R.M. Kapila Tharanga Rathnayaka
The Coronavirus (COVID-19) is one of the major pandemic diseases caused by a newly discovered virus that has been directly affecting the human respiratory system. Because of the…
Abstract
Purpose
The Coronavirus (COVID-19) is one of the major pandemic diseases, caused by a newly discovered virus that directly affects the human respiratory system. Because of the gradually increasing magnitude of the COVID-19 pandemic across the world, it has been sparking emergencies and critical issues in healthcare systems around the world. However, predicting the exact number of daily reported new COVID cases is one of the most serious issues faced by governments today. So, the purpose of this study is to propose a novel hybrid grey exponential smoothing model (HGESM) to predict the transmission dynamics of the COVID-19 outbreak properly.
Design/methodology/approach
Because of the complications related to traditional time series approaches, the proposed HGESM model is designed to handle exponential data patterns in multidisciplinary systems. The proposed methodology consists of two parts: a double exponential smoothing approach and a grey exponential smoothing modelling approach. The empirical analysis of this study was carried out on the basis of the third outbreak of COVID-19 cases in Sri Lanka, from 1st March 2021 to 15th June 2021. Of the total 90 daily observations, the first 85% of daily confirmed cases were used for training, and the remaining 15% of the sample were used for testing.
Findings
The new proposed HGESM is highly accurate (error less than 10%), with the lowest root mean square error values in one-step-ahead forecasting. Moreover, mean absolute deviation accuracy testing results confirmed that the new proposed model gives more significant results than other time-series predictions with limited samples.
Originality/value
The findings suggest that the new proposed HGESM is more suitable and effective for short-term forecasting of time series with an exponential trend.
Matthew Lindsey and Robert Pavur
When forecasting intermittent demand the method derived by Croston (1972) is often cited. Previous research favorably compared Croston's forecasting method for demand with simple…
Abstract
When forecasting intermittent demand the method derived by Croston (1972) is often cited. Previous research favorably compared Croston's forecasting method for demand with simple exponential smoothing assuming a nonzero demand occurs as a Bernoulli process with a constant probability. In practice, however, the assumption of a constant probability for the occurrence of nonzero demand is often violated. This research investigates Croston's method under violation of the assumption of a constant probability of nonzero demand. In a simulation study, forecasts derived using single exponential smoothing (SES) are compared to forecasts using a modification of Croston's method utilizing double exponential smoothing to forecast the time between nonzero demands assuming a normal distribution for demand size with different standard deviation levels. This methodology may be applicable to forecasting intermittent demand at the beginning or end of a product's life cycle.
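Croston's method as summarised above can be sketched as follows — separately smoothing the nonzero demand sizes and the intervals between them; the smoothing constant and toy series are illustrative, and this is the basic method, not the double-smoothing modification the study investigates.

```python
import numpy as np

def croston(y, alpha=0.1):
    """Croston (1972): separately smooth nonzero demand size and the
    interval between nonzero demands; forecast = size / interval."""
    size = interval = None
    q = 1  # periods since the last nonzero demand
    forecasts = []
    for d in y:
        # forecast before seeing this period's demand
        forecasts.append(0.0 if size is None else size / interval)
        if d > 0:
            if size is None:            # initialise on first demand
                size, interval = d, q
            else:                       # smooth size and interval
                size = alpha * d + (1 - alpha) * size
                interval = alpha * q + (1 - alpha) * interval
            q = 1
        else:
            q += 1
    return np.array(forecasts)

demand = np.array([0, 0, 5, 0, 0, 0, 4, 0, 6, 0])
f = croston(demand)
```

Unlike SES applied to the raw series, the forecast only updates when a nonzero demand occurs, which is what makes the method robust to long runs of zeros.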
Matthew Lindsey and Robert Pavur
Research in the area of forecasting and stock inventory control for intermittent demand is designed to provide robust models for the underlying demand which appears at random…
Abstract
Research in the area of forecasting and stock inventory control for intermittent demand is designed to provide robust models for the underlying demand, which appears at random, with some time periods having no demand at all. Croston's method is a popular technique for these models; it uses two single exponential smoothing (SES) models, which involve smoothing constants. A key issue is the choice of these values, owing to the sensitivity of the forecasts to changes in demand. Suggested selections of the smoothing constants include values between 0.1 and 0.3. Since an ARIMA model has been shown to be equivalent to SES, an optimal smoothing constant for SES can be selected from the ARIMA model. This chapter conducts simulations to investigate whether using an optimal smoothing constant rather than the suggested smoothing constants is important. Since SES is designed to be an adaptive method, data are simulated which vary between slow and fast demand.
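The idea of an "optimal" smoothing constant can be sketched directly: SES is equivalent to an ARIMA(0,1,1) model, and fitting that model's moving-average term amounts to minimising the one-step-ahead squared error over the smoothing constant, so a simple grid search recovers it. The series below is synthetic and the grid is an assumption for illustration.

```python
import numpy as np

def ses_sse(y, alpha):
    """Sum of one-step-ahead squared errors for SES with constant alpha."""
    f, sse = y[0], 0.0
    for obs in y[1:]:
        sse += (obs - f) ** 2
        f = alpha * obs + (1 - alpha) * f
    return sse

# noisy random-walk demand: the kind of series SES / ARIMA(0,1,1) suits
rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(0, 1, 200)) + rng.normal(0, 3, 200)

alphas = np.linspace(0.01, 0.99, 99)
best_alpha = min(alphas, key=lambda a: ses_sse(y, a))
```

Comparing `best_alpha` against the conventional 0.1–0.3 range on simulated slow and fast demand is exactly the kind of experiment the chapter describes.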
Daniel W. Williams and Shayne C. Kavanagh
This study examines forecast accuracy associated with the forecast of 55 revenue data series of 18 local governments. The last 18 months (6 quarters; or 2 years) of the data are…
Abstract
This study examines forecast accuracy for 55 revenue data series from 18 local governments. The last 18 months (6 quarters, or 2 years for annual data) of each series are held out for accuracy evaluation. Results show that forecast software, damped trend methods, and simple exponential smoothing methods perform best with monthly and quarterly data, and use of monthly or quarterly data is marginally better than annualized data. For monthly data, there is no advantage to converting dollar values to real dollars before forecasting and reconverting using a forecasted index. With annual data, naïve methods can outperform exponential smoothing methods for some types of data, and real dollar conversion generally outperforms nominal dollars. The study suggests benchmark forecast errors and recommends a process for selecting a forecast method.
Maryam Doroodi and Alireza Mokhtar
The purpose of this paper is to predict the amount of energy consumption by using a suitable statistical method in some sectors and energy carriers, which has shown a significant…
Abstract
Purpose
The purpose of this paper is to predict the amount of energy consumption by using a suitable statistical method in some sectors and energy carriers, which has shown a significant correlation with greenhouse gas emissions.
Design/methodology/approach
After studying the correlation between energy consumption rates in different energy consumption sectors and some energy carriers and greenhouse gas emissions (CO2, SO2, NOX and SPM), the most effective factors on pollution emission will first be identified and then forecast for the next 20 years (2015 to 2034). Furthermore, to determine the appropriate forecasting method, two approaches, titled "trend analysis" and "double exponential smoothing", will be applied to data collected from 1967 to 2014, and their forecasting capabilities will be compared using the MSD, MAD and MAPE indices as well as a comparison of the actual and projected time series. After predicting the energy consumption in the sectors and energy carriers, the growth rate of consumption over the next 20 years is also calculated.
Findings
The correlation study shows that four energy sectors (industry, agriculture, transportation and household-general-commercial) and two energy carriers (electricity and natural gas) show remarkable correlation with greenhouse gas emissions. For predicting the energy consumption in the mentioned sectors and carriers, the double exponential smoothing method proves the more capable. The study shows that, among the demand sectors, industry will account for the highest consumption rate, and electricity will experience the highest rate among the energy carriers. In fact, producing this amount of electricity causes emissions of greenhouse gases.
Research limitations/implications
Access to the data and categorized data was one of the main limitations.
Practical implications
By identifying the sectors and energy carriers that have the highest consumption growth rate in the next 20 years, it can be said that greenhouse gas emissions, which show remarkable correlation with these sectors and carriers, will also increase dramatically. So, their stricter control seems to be necessary. On the other hand, to control a particular greenhouse gas, it is possible to focus on the amount of energy consumed in the sectors and carriers that have a significant correlation with this pollutant. These results will lead to more targeted policies to reduce greenhouse gas emissions.
Social implications
The tendency of communities toward industrialization along with population growth will doubtlessly lead to more consumption of fossil fuels. An immediate aftermath of burning fuels is greenhouse gas emission, resulting in destructive effects on the environment and ecosystems. Identifying the factors affecting the pollutants resulting from consumption of fossil fuels is significant in controlling the emissions.
Originality/value
Such analyses help policymakers make more informed and targeted decisions to reduce greenhouse gas emissions and make safer and more appropriate policies and investments.
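The method comparison in the design above can be sketched with Brown's double exponential smoothing against a fitted linear trend, scored by MAPE; the synthetic series, smoothing constant, and scoring are illustrative assumptions, not the paper's energy data or exact procedure.

```python
import numpy as np

def brown_des(y, alpha=0.3):
    """Brown's double exponential smoothing: one-step-ahead forecasts."""
    s1 = s2 = y[0]
    f = [y[0]]
    for obs in y[:-1]:
        s1 = alpha * obs + (1 - alpha) * s1     # first smoothing pass
        s2 = alpha * s1 + (1 - alpha) * s2      # second smoothing pass
        level = 2 * s1 - s2
        trend = alpha / (1 - alpha) * (s1 - s2)
        f.append(level + trend)
    return np.array(f)

def mape(actual, forecast):
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

# synthetic consumption series with a linear trend
t = np.arange(48)
y = 50 + 2.0 * t + np.random.default_rng(2).normal(0, 1.5, 48)

f_des = brown_des(y)
f_trend = np.polyval(np.polyfit(t, y, 1), t)  # "trend analysis" fit
```

Scoring both forecast series with MAPE (and analogously MAD and MSD) and comparing projected against actual values mirrors the selection procedure the paper describes.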
John L. Stanton and Stephen L. Baglione
Product success is contingent on forecasting when a product is needed and how it should be offered. Forecasting accuracy is contingent on the correct forecasting technique. Using…
Abstract
Product success is contingent on forecasting when a product is needed and how it should be offered. Forecasting accuracy is contingent on the correct forecasting technique. Using supermarket data across two product categories, this chapter shows that using a bevy of forecasting methods improves forecasting accuracy. Accuracy is measured by the mean absolute percentage error. The optimal method for one consumer goods product may be different than for another. The best model varied from sophisticated methods, such as autoregressive integrated moving average (ARIMA) and Holt–Winters, to a random walk model. Forecasters must be proficient in multiple statistical techniques, since the best technique varies within a category, variety, and product size.
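The "bevy of methods" idea above can be sketched as a small method-selection loop: fit several candidates per series, hold out recent periods, and pick the method with the lowest MAPE. The two methods, toy series, and holdout length here are illustrative assumptions, not the chapter's data or full model set.

```python
import numpy as np

def ses_forecast(train, h, alpha=0.3):
    """Flat h-step forecast from single exponential smoothing."""
    level = train[0]
    for obs in train[1:]:
        level = alpha * obs + (1 - alpha) * level
    return np.full(h, level)

def naive_forecast(train, h):
    """Random walk: repeat the last observation."""
    return np.full(h, train[-1])

def mape(actual, forecast):
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

methods = {"SES": ses_forecast, "random walk": naive_forecast}

# two toy weekly sales series for different product categories
rng = np.random.default_rng(3)
series = {
    "category A": 200 + rng.normal(0, 5, 52),                    # stable
    "category B": 150 + np.arange(52.0) + rng.normal(0, 5, 52),  # trending
}

best = {}
for name, y in series.items():
    train, test = y[:-6], y[-6:]                 # hold out the last 6 weeks
    scores = {m: mape(test, f(train, len(test))) for m, f in methods.items()}
    best[name] = min(scores, key=scores.get)
```

The point of the loop is exactly the chapter's finding: the winning method differs by series, so the selection has to be re-run per category rather than fixed once.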
Kenneth D. Lawrence, Gary K. Kleinman and Sheila M. Lawrence
This research examines the use of a number of time series model structures of a moderate allocation mutual fund, PRWCX. PRWCX was rated as the top fund in its category during the…
Abstract
This research examines the use of a number of time series model structures for a moderate allocation mutual fund, PRWCX. PRWCX was rated as the top fund in its category during the past five years. The fund invests at least 50% of its total assets in securities that the fund manager believes have above-average potential for capital growth. The remaining assets are generally invested in convertible securities, corporate and government debt, bank loans, and foreign securities. Forecasting the total NAV of such a moderate allocation mutual fund, composed of an extremely large number of investments, requires a method that produces accurate results. The models considered are exponential smoothing (single, double, and Winters' method), trend models (linear, quadratic, and exponential), and Box–Jenkins models.