Search results
11–20 of over 113,000 results
Joanne S. Utley and J. Gaylord May
Abstract
This study examines the use of forecast combination to improve the accuracy of forecasts of cumulative demand. A forecast combination methodology based on least absolute value (LAV) regression analysis is developed and is applied to partially accumulated demand data from an actual manufacturing operation. The accuracy of the proposed model is compared with the accuracy of common alternative approaches that use partial demand data. Results indicate that the proposed methodology outperforms the alternative approaches.
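The core of such a combination scheme can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it estimates combination weights for component forecasts by least absolute value (L1) regression, approximated here via iteratively reweighted least squares; the function name, the IRLS shortcut and the toy data are all assumptions.

```python
import numpy as np

def lav_combine(forecasts, actuals, iters=50, eps=1e-8):
    """Estimate forecast-combination weights by least absolute value (L1)
    regression, approximated with iteratively reweighted least squares.

    forecasts: (n, k) array, one column per component forecast
    actuals:   (n,) array of observed demand
    """
    w = np.ones(len(actuals))                 # observation weights
    beta = np.zeros(forecasts.shape[1])
    for _ in range(iters):
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(forecasts * sw[:, None],
                                   actuals * sw, rcond=None)
        resid = actuals - forecasts @ beta
        w = 1.0 / np.maximum(np.abs(resid), eps)  # downweight large residuals
    return beta

# Two component forecasts whose true combination weights are 0.3 and 0.7.
f1 = np.array([10.0, 12.0, 11.0, 13.0, 14.0, 12.0])
f2 = np.array([11.0, 13.0, 12.0, 14.0, 15.0, 10.0])
actual = 0.3 * f1 + 0.7 * f2
weights = lav_combine(np.column_stack([f1, f2]), actual)  # ≈ [0.3, 0.7]
```

Unlike ordinary least squares, the L1 criterion penalizes large residuals linearly rather than quadratically, which is why LAV-based combination is less sensitive to occasional demand outliers.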
Abstract
Purpose
This study attempts to discover effective strategies for mobile commerce applications (apps) to grow their consumer base by releasing app strategic updates. Drawing on the landscape search model from strategy research, this study conceptualizes mobile app update strategy as three interdependent decisions, i.e. what business elements are changed in an app strategic update, how substantial the changes are and when strategic updates are released relative to the competitive environment.
Design/methodology/approach
Using a field data set of 1,500 strategic updates of seven rival apps in the mobile travel market, this study integrated fuzzy set qualitative comparative analysis (fsQCA) with econometric analysis to analyze how app strategic update decisions interdependently influence app performance.
Findings
This study identified three effective mobile app update strategies and one ineffective strategy from the mixed-method analysis, which verified the complex interdependency of app strategic update decisions. A general takeaway from these strategies is that a complex strategy problem on the mobile platform must be solved with respect to the constraints and capabilities of mobile technology.
Originality/value
This study moves beyond a linear view of the relationship between app update frequency and app performance and provides a holistic view of how and why app strategic update decisions mutually influence one another in their impact on app performance. This work makes contributions by identifying interdependency as a conceptual bridge between strategy and mobile app literature and developing an empirically testable version of the landscape search model.
Abstract
Purpose
In contrast to point forecasts, interval forecasts provide information on future variability. This research thus aimed to develop interval prediction models by addressing two significant issues: (1) a simple average with an additive property is commonly used to derive combined forecasts, but this unreasonably ignores the interaction among sequences used as sources of information, and (2) the time series often does not conform to any statistical assumptions.
Design/methodology/approach
To develop an interval prediction model, the fuzzy integral was applied to nonlinearly combine forecasts generated by a set of grey prediction models, and a sequence including the combined forecasts was then used to construct a neural network. All required parameters relevant to the construction of an interval model were optimally determined by the genetic algorithm.
Findings
The empirical results for tourism demand showed that the proposed non-additive interval model outperformed the other interval prediction models considered.
Practical implications
The private and public sectors in economies with high tourism dependency can benefit from the proposed model by using the forecasts to help them formulate tourism strategies.
Originality/value
In light of the usefulness of combined point forecasts and interval model forecasting, this research contributed to the development of non-additive interval prediction models on the basis of combined forecasts generated by grey prediction models.
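For readers unfamiliar with grey prediction, the textbook GM(1,1) model that typically underlies such combinations can be sketched as follows. This is a generic GM(1,1), not the paper's optimized model; the function name and the test series are illustrative only.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Fit a standard GM(1,1) grey model to series x0 and forecast `steps` ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                          # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background (mean) sequence
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey parameters
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
    x0_hat = np.empty_like(x1_hat)
    x0_hat[0] = x0[0]
    x0_hat[1:] = np.diff(x1_hat)                # inverse accumulation
    return x0_hat[len(x0):]

# A roughly exponential series (10% growth); the one-step-ahead
# forecast lands near the true continuation of about 146.4.
next_value = gm11_forecast([100.0, 110.0, 121.0, 133.1], steps=1)
```

A set of such GM(1,1) variants provides the component forecasts, and the fuzzy integral (with parameters tuned by the genetic algorithm) replaces the simple average when the components interact.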
Drago Podobnik and Slavko Dolinšek
Abstract
Purpose
The purpose of this paper is to present the usefulness of the combination of the European Foundation for Quality Management excellence model and the balanced scorecard integrated into the management model for competitiveness and performance development.
Design/methodology/approach
The presented model is the result of business research in which a comparative analysis of the two models was carried out. Both models were studied thoroughly from different points of view, an approach that made it possible to identify the strengths, weaknesses and similarities of the two models.
Findings
On the basis of the robustness of both models, a combination was formed and integrated into a management model that is likely to be better, more effective and simpler to use in practice, and that will support an increase in competitiveness and performance development.
Research limitations/implications
The research focused on a close examination of the interactions presented in the integrated management model. Throughout the research, consideration was given to the problem of its external validity, which is somewhat limited; analytical generalisation is therefore discussed. A number of cases can be found in which a combination of both models has been used in different ways. The form of combination presented here in the integral management model was implemented in 2005 in a major Slovenian international company.
Originality/value
The originality lies in the particular approach to comparative analysis and in the result, which represents a combination of the two models integrated in an original manner into an integral model of management. Companies that have not yet introduced one of the researched management tools, as well as those that already have, will be able to use the results of this research for further upgrading or consolidation toward a combination of both models. The synergetic effects of the interaction between the two management approaches will have a positive effect on company competitiveness.
Abstract
Past research has shown that forecast combination typically improves demand forecast accuracy even when only two component forecasts are used; however, systematic bias in the component forecasts can reduce the effectiveness of combination. This study proposes a methodology for combining demand forecasts that are biased. Data from an actual manufacturing shop are used to develop the methodology and compare its accuracy with the accuracy of the standard approach of correcting for bias prior to combination. Results indicate that the proposed methodology outperforms the standard approach.
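The "standard approach" used as the benchmark here, correcting each forecast for its systematic bias before combining, can be sketched as below. This is a minimal illustration under assumptions (mean-error bias estimate, equal-weight average, made-up numbers), not the paper's exact procedure.

```python
import numpy as np

def debias_then_combine(f1_new, f2_new, f1_hist, f2_hist, actual_hist):
    """Correct each forecast for its mean historical bias, then average --
    the conventional baseline the proposed methodology is compared against."""
    bias1 = np.mean(np.asarray(f1_hist) - np.asarray(actual_hist))
    bias2 = np.mean(np.asarray(f2_hist) - np.asarray(actual_hist))
    return 0.5 * ((f1_new - bias1) + (f2_new - bias2))

# Forecast 1 historically runs 3 units high, forecast 2 runs 2 units low.
actual_hist = [100.0, 102.0, 101.0]
f1_hist = [103.0, 105.0, 104.0]   # +3 bias
f2_hist = [98.0, 100.0, 99.0]     # -2 bias
combined = debias_then_combine(108.0, 103.0, f1_hist, f2_hist, actual_hist)
# combined → 105.0
```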
Mehrnaz Ahmadi and Mehdi Khashei
Abstract
Purpose
The purpose of this paper is to propose a new linear-nonlinear data preprocessing-based hybrid model to achieve a more accurate result at a lower cost for wind power forecasting. For this purpose, a decomposed based series-parallel hybrid model (PKF-ARIMA-FMLP) is proposed which can model linear/nonlinear and certain/uncertain patterns in underlying data simultaneously.
Design/methodology/approach
To design the proposed model, the underlying data are first divided into two categories of linear and nonlinear patterns by the proposed Kalman filter (PKF) technique. The linear patterns are then modeled by linear-fuzzy nonlinear series (LLFN) hybrid models to detect linearity/nonlinearity and certainty/uncertainty in the underlying data simultaneously; the nonlinear decomposed patterns are modeled analogously by linear-fuzzy nonlinear series (NLFN) hybrid models. Finally, the weight of each component (e.g. PKF, LLFN and NLFN) is calculated by the least squares algorithm and the results are combined in a parallel structure, so that the linear and nonlinear patterns are modeled at the lowest cost and with the highest accuracy.
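The least-squares weighting step for the parallel components can be sketched generically. The helper below is hypothetical, not the paper's code; it simply solves for the weights that minimize the squared error of the weighted sum of component outputs.

```python
import numpy as np

def parallel_weights(components, target):
    """Least-squares weights w minimizing ||components @ w - target||^2,
    one weight per parallel component (column)."""
    w, *_ = np.linalg.lstsq(components, target, rcond=None)
    return w

# Two component outputs whose true mixing weights are 0.6 and 0.4.
c1 = np.array([1.0, 2.0, 3.0, 4.0])
c2 = np.array([4.0, 3.0, 2.0, 1.0])
target = 0.6 * c1 + 0.4 * c2
w = parallel_weights(np.column_stack([c1, c2]), target)  # ≈ [0.6, 0.4]
```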
Findings
The effectiveness and predictive capability of the proposed model are examined and compared with its components, base models, single models, series component combination-based hybrid models, parallel component combination-based hybrid models and decomposed-based single models. Numerical results show that the proposed linear-nonlinear data preprocessing-based hybrid models improve the performance of single, hybrid and single decomposed-based prediction methods by approximately 66.29%, 52.10% and 38.13%, respectively, for predicting wind power time series in the test data.
Originality/value
The combination of single linear and nonlinear models has expanded because real-world data typically contain linear and nonlinear patterns simultaneously. The main idea of linear-nonlinear hybridization is to combine the benefits of these models to identify the linear and nonlinear patterns in the data in series, parallel or series-parallel structures, reducing the limitations of single models and leading to higher accuracy, more comprehensive coverage and less risky predictions. Although the literature shows that combining linear and nonlinear models can improve prediction results by detecting most of the linear and nonlinear patterns in the underlying data, separating the patterns into linear and nonlinear classes before they enter the linear and nonlinear models can improve performance further, and no previous paper has considered this separation. With this new data preprocessing-based method, modeling error can be reduced and higher accuracy achieved at a lower cost.
Abstract
Computer simulation in construction planning has been the subject of research for the last few decades. The present paper describes simulation models geared toward improving the productivity of concreting operations. It is primarily concerned with a study of the sensitivity of concreting operations to a set of possible resource combinations. Thirteen models are examined for the two well-known methods of concreting: (1) crane and bucket; and (2) pump. Concreting of slabs, beams and columns is considered. The simulation software Micro-CYCLONE is used to generate the models. Sensitivity parameters considered in the resource combinations include the number of truck-mixers, buckets and labourers in concrete-placing crews. The simulation models developed are compared and the results discussed. The results enable planners to see how the resource quantities and capacities in one cycle affect those in another period for cyclic operations such as concreting. It can be concluded that the maximum number of resources, the interaction of work crews caused by work-space limitations, and the interaction of equipment shared with other activities of the project may impose limitations.
Refet S. Gürkaynak, Burçin Kısacıkoğlu and Barbara Rossi
Abstract
Recently, it has been suggested that macroeconomic forecasts from estimated dynamic stochastic general equilibrium (DSGE) models tend to be more accurate out-of-sample than random walk forecasts or Bayesian vector autoregression (VAR) forecasts. Del Negro and Schorfheide (2013) in particular suggest that the DSGE model forecast should become the benchmark for forecasting horse-races. We compare the real-time forecasting accuracy of the Smets and Wouters (2007) DSGE model with that of several reduced-form time series models. We first demonstrate that none of the forecasting models is efficient. Our second finding is that there is no single best forecasting method. For example, typically simple AR models are most accurate at short horizons and DSGE models are most accurate at long horizons when forecasting output growth, while for inflation forecasts the results are reversed. Moreover, the relative accuracy of all models tends to evolve over time. Third, we show that there is no support to the common practice of using large-scale Bayesian VAR models as the forecast benchmark when evaluating DSGE models. Indeed, low-dimensional unrestricted AR and VAR forecasts may forecast more accurately.
Abstract
The increasing availability of financial data has opened new opportunities for quantitative modeling. It has also exposed limitations of existing frameworks, such as the low accuracy of simplified analytical models and the insufficient interpretability and stability of adaptive data-driven algorithms. I make the case that boosting (a novel ensemble learning technique) can serve as a simple and robust framework for combining the best features of analytical and data-driven models. Boosting-based frameworks for typical financial and econometric applications are outlined. The implementation of a standard boosting procedure is illustrated in the context of symbolic volatility forecasting for the IBM stock time series. It is shown that the boosted collection of generalized autoregressive conditional heteroskedasticity (GARCH)-type models is systematically more accurate than both the best single model in the collection and the widely used GARCH(1,1) model.
Jingshuai Zhang, Yuanxin Ouyang, Weizhu Xie, Wenge Rong and Zhang Xiong
Abstract
Purpose
The purpose of this paper is to propose an approach to incorporate contextual information into collaborative filtering (CF) based on the restricted Boltzmann machine (RBM) and deep belief networks (DBNs). Traditionally, neither the RBM nor its derivative model has been applied to modeling contextual information. In this work, the authors analyze the RBM and explore how to utilize a user’s occupation information to enhance recommendation accuracy.
Design/methodology/approach
The proposed approach is based on the RBM. The authors employ user occupation information as a context to design a context-aware RBM and stack the context-aware RBM to construct DBNs for recommendations.
Findings
The experiments on the MovieLens data sets show that the user occupation-aware RBM outperforms other CF models, and combinations of different context-aware models by mutual information can obtain better accuracy. Moreover, the context-aware DBNs model is superior to baseline methods, indicating that deep networks have more qualifications for extracting preference features.
Originality/value
To improve recommendation accuracy through modeling contextual information, the authors propose context-aware CF approaches based on the RBM. Additionally, the authors attempt to introduce hybrid weights based on information entropy to combine context-aware models. Furthermore, the authors stack the RBM to construct a context-aware multilayer network model. The results of the experiments not only convey that the context-aware RBM has potential in terms of contextual information but also demonstrate that the combination method, the hybrid recommendation and the multilayer neural network extension have significant benefits for the recommendation quality.