Search results

1 – 10 of over 7000

Abstract

This article surveys recent developments in the evaluation of point and density forecasts in the context of forecasts made by vector autoregressions. Specific emphasis is placed on highlighting the parts of the existing literature applicable to direct multistep forecasts and those applicable to iterated multistep forecasts. This literature includes advances in the evaluation of forecasts in population (based on true, unknown model coefficients) and in the finite sample (based on estimated model coefficients). The article then examines, in Monte Carlo experiments, the finite-sample properties of some tests of equal forecast accuracy, focusing on the comparison of VAR forecasts to AR forecasts. These experiments show that the tests behave as the theory predicts. For example, using critical values obtained by bootstrap methods, tests of equal accuracy in population have empirical size approximately equal to nominal size.
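As a rough, hedged sketch of the kind of equal-accuracy comparison described above, the following computes a simple Diebold–Mariano-type statistic on squared-error loss differentials; the data and both forecast series are simulated, and the bootstrap critical values and variance corrections discussed in the article are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
y = rng.standard_normal(T)                 # simulated realized values
f_ar = y + 0.5 * rng.standard_normal(T)    # stand-in "AR" forecasts
f_var = y + 0.5 * rng.standard_normal(T)   # stand-in "VAR" forecasts

# Loss differential under squared-error loss
d = (y - f_ar) ** 2 - (y - f_var) ** 2

# Simple DM-type statistic: mean differential over its standard error
dm = d.mean() / (d.std(ddof=1) / np.sqrt(T))
print(f"DM-type statistic: {dm:.3f}")
```

Under equal accuracy the statistic is approximately standard normal in large samples; the bootstrap-based critical values studied in the article refine this approximation in finite samples.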

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Book part
Publication date: 6 January 2016

Laura E. Jackson, M. Ayhan Kose, Christopher Otrok and Michael T. Owyang

Abstract

We compare methods to measure comovement in business cycle data using multi-level dynamic factor models. To do so, we employ a Monte Carlo procedure to evaluate model performance for different specifications of factor models across three different estimation procedures. We consider three general factor model specifications used in applied work. The first is a single-factor model, the second a two-level factor model, and the third a three-level factor model. Our estimation procedures are the Bayesian approach of Otrok and Whiteman (1998), the Bayesian state-space approach of Kim and Nelson (1998) and a frequentist principal components approach. The latter serves as a benchmark to measure any potential gains from the more computationally intensive Bayesian procedures. We then apply the three methods to a novel dataset on house prices in advanced and emerging markets from Cesa-Bianchi, Cespedes, and Rebucci (2015) and interpret the empirical results in light of the Monte Carlo results.
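As a hedged illustration of the frequentist principal-components benchmark mentioned above, the sketch below extracts the first principal component from data simulated under a single-factor model; the loadings, noise scale and sample sizes are invented, and the Bayesian estimators are not shown:

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 200, 20
factor = rng.standard_normal(T)                 # latent common factor
loadings = rng.uniform(0.5, 1.5, N)
X = np.outer(factor, loadings) + 0.5 * rng.standard_normal((T, N))

# Principal-components estimate of the factor: first left singular
# vector of the demeaned panel, scaled by its singular value
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
f_hat = U[:, 0] * S[0]

# The sign of the factor is unidentified, so report absolute correlation
corr = abs(np.corrcoef(f_hat, factor)[0, 1])
print(f"correlation with true factor: {corr:.3f}")
```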

Details

Dynamic Factor Models
Type: Book
ISBN: 978-1-78560-353-2

Article
Publication date: 3 August 2012

Anand Prakash, Sanjay Kumar Jha and Rajendra Prasad Mohanty

Abstract

Purpose

The purpose of this paper is to propose the idea of linking the use of the Monte Carlo simulation with scenario planning to assist strategy makers in formulating strategy in the face of uncertainty relating to service quality gaps for life insurance business, where discontinuities always remain for need‐based selling.

Design/methodology/approach

The paper briefly reviews some applications of scenario planning. Scenario planning emphasizes the development of a strategic plan that is robust across different scenarios. The paper provides considerable evidence to suggest a new strategic approach using Monte Carlo simulation for scenario planning.

Findings

The paper highlights which service quality gap attributes, treated as risks, have the greatest and least impact under best-case, worst-case, and most-likely-case occurrences.
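Three-point (best, most likely, worst) scenario inputs of the kind described are often simulated with a triangular distribution; a minimal sketch with invented service-quality-gap scores, not the paper's survey data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical three-point estimate for one service-quality-gap score
worst, most_likely, best = 1.0, 3.0, 5.0

draws = rng.triangular(worst, most_likely, best, size=100_000)

# Percentiles summarize the simulated scenario range
p10, p50, p90 = np.percentile(draws, [10, 50, 90])
print(f"P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f}")
```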

Research limitations/implications

This study suffers from methodological limitations associated with convenience sampling and anonymous survey‐based research.

Practical implications

The approach using Monte Carlo simulation increases the credibility of the scenario to an acceptable level, so that it will be used by managers and other decision makers.

Social implications

The paper provides thorough documentation of scenario planning, studying the impact of risk and uncertainty on service quality gaps to support rational decision making in the management of services, so that managers can better justify and communicate their arguments.

Originality/value

The paper offers empirical understanding of the application of Monte Carlo simulation to scenario planning and identifies key drivers which impact most and least on service quality gap.

Details

Journal of Strategy and Management, vol. 5 no. 3
Type: Research Article
ISSN: 1755-425X

Article
Publication date: 18 June 2021

Chuanyuan Zhou, Zhenyu Liu, Chan Qiu and Jianrong Tan

Abstract

Purpose

The conventional statistical method of three-dimensional tolerance analysis, such as Monte Carlo simulation, requires numerous pseudo-random numbers and consumes enormous computation to achieve adequate calculation accuracy. The purpose of this paper is to propose a novel method to overcome these problems.

Design/methodology/approach

With the combination of the quasi-Monte Carlo method and the unified Jacobian-torsor model, this paper proposes a three-dimensional tolerance analysis method based on edge sampling. By setting reasonable evaluation criteria, samples representing relatively small deviations are excluded, while those representing deviations that approach yet still comply with the tolerance requirements are kept.
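To see why quasi-Monte Carlo points can need fewer samples than pseudo-random ones, the sketch below compares a hand-rolled Halton sequence with uniform pseudo-random draws on a toy two-dimensional linear stack-up; the Jacobian-torsor model and the edge-sampling criterion themselves are not reproduced here:

```python
import numpy as np

def halton(n, base):
    """First n points of the van der Corput sequence in the given base."""
    seq = np.zeros(n)
    for i in range(n):
        f, r, k = 1.0, 0.0, i + 1
        while k > 0:
            f /= base
            r += f * (k % base)
            k //= base
        seq[i] = r
    return seq

n = 512
# Two tolerance dimensions, each uniform on [-0.1, 0.1] mm
u_qmc = np.column_stack([halton(n, 2), halton(n, 3)])
dev_qmc = (u_qmc - 0.5) * 0.2

rng = np.random.default_rng(3)
dev_mc = rng.uniform(-0.1, 0.1, size=(n, 2))

# Toy linear stack-up: closing dimension is the sum of the two deviations
stack_qmc = dev_qmc.sum(axis=1)
stack_mc = dev_mc.sum(axis=1)
print(f"QMC mean: {stack_qmc.mean():+.5f}   MC mean: {stack_mc.mean():+.5f}")
```

The low-discrepancy points cover the tolerance box more evenly, so sample statistics of the closing dimension stabilize with fewer draws.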

Findings

The case study illustrates the effectiveness and superiority of the proposed method: it can simultaneously reduce the sample size, diminish the computations, predict wider tolerance ranges and improve the accuracy of three-dimensional tolerance analysis of precision assemblies.

Research limitations/implications

The proposed method may be applied only when the dimensional and geometric tolerances are interpreted in the three-dimensional tolerance representation model.

Practical implications

The proposed tolerance analysis method can evaluate the impact of manufacturing errors on the product structure quantitatively and provide a theoretical basis for structural design, process planning and manufacture inspection.

Originality/value

The paper is original in proposing edge sampling as a strategy for generating deviation samples in tolerance analysis.

Details

Assembly Automation, vol. 41 no. 4
Type: Research Article
ISSN: 0144-5154

Book part
Publication date: 19 December 2012

Lee C. Adkins and Mary N. Gade

Abstract

Monte Carlo simulations are a very powerful way to demonstrate the basic sampling properties of various statistics in econometrics. The commercial software package Stata makes these methods accessible to a wide audience of students and practitioners. The purpose of this chapter is to present a self-contained primer for conducting Monte Carlo exercises as part of an introductory econometrics course. More experienced econometricians who are new to Stata may find this useful as well. Many examples are given that can be used as templates for various exercises. Examples include linear regression, confidence intervals, the size and power of t-tests, lagged dependent variable models, heteroskedastic and autocorrelated regression models, instrumental variables estimators, binary choice, censored regression, and nonlinear regression models. Stata do-files for all examples are available from the authors' website http://learneconometrics.com/pdf/MCstata/.
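The chapter's exercises are written as Stata do-files; purely as an illustration of the same idea in Python, the sketch below estimates the empirical size of a two-sided t-test by Monte Carlo (the sample size, replication count and critical value are arbitrary choices, not the chapter's):

```python
import numpy as np

rng = np.random.default_rng(4)
reps, n = 2000, 50
t_crit = 2.0096  # approximate two-sided 5% critical value of t(49)

rejections = 0
for _ in range(reps):
    x = rng.standard_normal(n)   # the null (mean = 0) is true by construction
    t = x.mean() / (x.std(ddof=1) / np.sqrt(n))
    rejections += abs(t) > t_crit

size = rejections / reps
print(f"empirical size: {size:.3f}")  # should be near the nominal 0.05
```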

Details

30th Anniversary Edition
Type: Book
ISBN: 978-1-78190-309-4

Article
Publication date: 1 December 1992

Alec H. Evans

Abstract

Discusses the shortcomings of the so‐called conventional method of appraising development sites at the pre‐purchase stage. Demonstrates how multiple‐outcome simulations, Monte Carlo analysis in particular, can overcome many disadvantages. Illustrates by means of a practical example. Uses the simpler residual valuation, rather than cashflow appraisal. Concludes that the Monte Carlo method gives a much more graphical impression than a deterministic method, and is well worth the effort to appreciate.
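A multiple-outcome residual valuation of the kind described can be sketched as follows; every figure and distribution here is invented for illustration, not taken from the article's example:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 50_000

# Invented distributions for the appraisal inputs
gdv = rng.normal(10_000_000, 500_000, n)   # gross development value
build = rng.normal(5_500_000, 300_000, n)  # build costs
profit = 0.15 * gdv                        # developer's profit at 15% of GDV

# Residual land value for each simulated outcome
residual = gdv - build - profit

p5, p50, p95 = np.percentile(residual, [5, 50, 95])
print(f"residual land value: P5={p5:,.0f}  P50={p50:,.0f}  P95={p95:,.0f}")
```

Instead of the single point estimate a deterministic residual valuation gives, the simulation yields a distribution of outcomes, which is the "more graphical impression" the article argues for.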

Details

Journal of Property Finance, vol. 3 no. 2
Type: Research Article
ISSN: 0958-868X

Article
Publication date: 29 March 2011

Wei Zhan

Abstract

Purpose

The purpose of this paper is to demonstrate the effectiveness of Six Sigma as an innovative tool in software design optimization. The problem of reducing simulation time for characterizing a type of DC motor is studied in this paper. The case study illustrates how Six Sigma tools such as the design of experiments (DOE) method can be used to improve a simulation process.

Design/methodology/approach

A first-principles model for the motor is used for simulation in MATLAB®. Each parameter in the model is assumed to have a known distribution. Using the random number generator in MATLAB®, Monte Carlo analysis is conducted. To reduce simulation time, several factors in the simulation process are identified. A two-level full factorial DOE matrix is constructed. The Monte Carlo analysis is carried out for each parameter set in the DOE matrix. Based on the simulation results and the DOE analysis, the Simulink model is identified as the main contributor to the computational time of the simulation. Several steps are taken to reduce the computational time related to the Simulink model. The improved model requires only one-fourth of the original computational time.
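The two-level full factorial step can be sketched as below (Python rather than the paper's MATLAB; the factor names and the response model, in which one factor dominates runtime as the Simulink model did in the paper, are invented):

```python
import itertools
import numpy as np

factors = ["solver_step", "model_detail", "logging"]  # hypothetical factors
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Invented response: simulated runtime dominated by model_detail
rng = np.random.default_rng(5)
runtime = (10
           + 1.0 * design[:, 0]
           + 6.0 * design[:, 1]
           + 0.5 * design[:, 2]
           + 0.1 * rng.standard_normal(len(design)))

# Main effect of each factor: mean(high) minus mean(low)
effects = {f: runtime[design[:, j] == 1].mean()
              - runtime[design[:, j] == -1].mean()
           for j, f in enumerate(factors)}
print(effects)
```

Ranking the main effects identifies which factor to attack first, which is the role DOE played in the paper's simulation-time reduction.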

Findings

The paper illustrates that Six Sigma tools can be applied to algorithm and software development processes for optimization purposes. Statistical analysis can be conducted in the simulation environment to provide valuable information.

Practical implications

As an example, the improved simulation process is used to derive statistical information related to the speed vs torque curve and the response time as part of the motor characteristics. The findings suggest that, with an optimized simulation model, a large number of statistical analyses can be conducted in the simulation environment to provide practical information. This approach can be used effectively in the early stages of product design, e.g. during the feasibility study.

Originality/value

In industry, most DOE is conducted using real test data, which is usually time-consuming and cost-inefficient. This paper combines mathematical modelling and statistical analysis to optimize a simulation model using DOE. The novel approach used in this paper can be applied to many other software optimization problems. It is expected that this approach will broaden the application of Six Sigma in industry.

Details

International Journal of Lean Six Sigma, vol. 2 no. 1
Type: Research Article
ISSN: 2040-4166

Article
Publication date: 29 April 2021

Jian Guo, Junlin Chen and Yujie Xie

Abstract

Purpose

This paper explores the impact of both government subsidies and decision makers' loss-averse behavior on the determination of transportation build-operate-transfer (BOT) concession periods based on cumulative prospect theory (CPT). The prospect value of a transportation project under traffic risk can be formulated according to the value function for gains and losses and the decision weight for gains and losses. As extra income for investors, a government subsidy is designed for the highly risky aspects of BOT transportation projects: uncertain initial traffic volumes and fluctuating growth rates.

Design/methodology/approach

A decision-making model determining the concession period of a transportation BOT project is proposed by using the Monte-Carlo simulation method based on CPT, and the effects of risky behaviors of private investors on concession period decision making are analyzed. A subsidy method related to the internal rate-of-return (IRR) corresponding to a specific initial traffic volume and growth rate is proposed. The case of an actual BOT highway project is examined to illustrate how the proposed method can be used to determine the concession period of a transportation BOT project considering decision makers' loss-averse behavior and government subsidy. Contingency analysis is discussed to cope with possible misestimation of key factors such as initial traffic volume and cost coefficients. Sensitivity analysis is employed to investigate the impact of CPT parameters on the concession period decisions. An actual BOT case that failed to attract private capital is introduced to show the practical application. The results are then interpreted to conclude this paper.
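The CPT value function and decision weights referred to above are commonly specified with the Tversky and Kahneman (1992) functional forms; a minimal sketch using their widely cited parameter estimates, which need not match the paper's calibration:

```python
import numpy as np

# Tversky-Kahneman (1992) parameter estimates
ALPHA, BETA, LAMBDA, GAMMA = 0.88, 0.88, 2.25, 0.61

def value(x):
    """CPT value function: concave for gains, convex and steeper for losses."""
    return np.where(x >= 0, np.abs(x) ** ALPHA, -LAMBDA * np.abs(x) ** BETA)

def weight(p, gamma=GAMMA):
    """Inverse-S-shaped probability weighting of cumulative probabilities."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Loss aversion: a 100-unit loss looms larger than a 100-unit gain
print(value(np.array([100.0, -100.0])))
```

Applied to simulated toll revenues, these two pieces make a loss-averse investor's prospect value fall short of expected profit, which is consistent with the paper's finding that such investors demand longer concession periods.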

Findings

Based on comparisons drawn between a concession period decision-making model that considers the psychological behaviors of decision makers and one that does not, the authors conclude that the concession period based on CPT is distinctly different from that of the loss-neutral model. The concession period based on CPT is longer than the loss-neutral concession period; that is, loss-averse private investors tend to ask for long concession periods to make up for losses they will face in the future. Government subsidies serve as extra income for investors, allowing appointed profits to be secured sooner. On the benefit side of the contingency variables, close attention must be paid to the normal state of the initial traffic volume, the average annual traffic growth rate and bias degree, and the government subsidy over the project life span. On the cost side, the annual operating cost variable has a significant impact on the length of the predicted concession period, while the large-scale cost variable has only a minor impact.

Originality/value

With an actual BOT highway project, the determination of transportation BOT concession periods based on the psychological behaviors of decision makers is analyzed in this paper. As the psychological behaviors of decision makers heavily impact the decision-making process, the authors analyze their impacts on concession period decision making. Government subsidy is specifically designed for various states of initial traffic volume and fluctuating growth rates to cope with corresponding high risks and mitigate private investors' loss-averse behaviors. Contingency analysis and sensitivity analysis are discussed as the estimated values of parameters may not be authentic in actual situations.

Details

Engineering, Construction and Architectural Management, vol. 29 no. 3
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 19 July 2019

Slawomir Koziel and Adrian Bekasiewicz

Abstract

Purpose

This paper aims to investigate the strategy for low-cost yield optimization of miniaturized microstrip couplers using variable-fidelity electromagnetic (EM) simulations.

Design/methodology/approach

The usefulness of data-driven models constructed from structure frequency responses, formulated in the form of suitably defined characteristic points, for statistical analysis is investigated. Reformulation of the characteristics leads to a less nonlinear functional landscape and reduces the number of training samples required for accurate modeling. Further reduction of the cost associated with construction of the data-driven model is achieved using variable-fidelity methods. A numerical case study is provided, demonstrating the feasibility of feature-based modeling for low-cost statistical analysis and yield optimization.

Findings

It is possible, through reformulation of the structure frequency responses in the form of suitably defined feature points, to reduce the number of training samples required for its data-driven modeling. The approximation model can be used as an accurate evaluation engine for a low-cost Monte Carlo analysis. Yield optimization can be realized through minimization of yield within the data-driven model bounds and subsequent model re-set around the optimized design.
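Once a cheap surrogate is available, Monte Carlo yield estimation reduces to counting in-spec samples; in this sketch the quadratic "surrogate", the tolerance spread and the specification threshold are all invented stand-ins for the feature-based coupler model:

```python
import numpy as np

rng = np.random.default_rng(6)

def surrogate(x):
    """Invented cheap surrogate: response vs two geometry deviations."""
    return 1.0 + 4.0 * (x ** 2).sum(axis=1)  # nominal response is 1.0

# Manufacturing tolerances: deviations ~ N(0, 0.05) on each dimension
dev = rng.normal(0.0, 0.05, size=(100_000, 2))

# Yield: fraction of samples whose surrogate response meets the spec
spec_ok = surrogate(dev) < 1.03
yield_est = spec_ok.mean()
print(f"estimated yield: {yield_est:.3f}")
```

Because each surrogate evaluation is trivially cheap, sample counts that would be prohibitive with full EM simulation become routine, which is the cost argument the abstract makes.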

Research limitations/implications

The investigated technique exceeds the capabilities of conventional Monte Carlo-based approaches to statistical analysis in terms of computational cost, without compromising accuracy with respect to conventional EM-based Monte Carlo analysis.

Originality/value

The proposed tolerance-aware design approach proved useful for rapid yield optimization of compact microstrip couplers represented by EM-simulation models, which is extremely challenging with conventional approaches due to the tremendous number of EM evaluations required for statistical analysis.

Details

Engineering Computations, vol. 36 no. 9
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 16 October 2009

Rahman Farnoosh and Ebrahimi Morteza

Abstract

Purpose

The purpose of this paper is to provide a Monte Carlo variance reduction method based on control variates to solve Fredholm integral equations of the second kind.

Design/methodology/approach

A numerical algorithm consisting of the combined use of the successive substitution method and Monte Carlo simulation is established for the solution of Fredholm integral equations of the second kind.
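The control-variates device itself can be shown on a plain one-dimensional integral; this sketch estimates the integral of e^x over [0, 1] using g(x) = 1 + x, whose mean under U(0, 1) is known to be 3/2, and is not the paper's successive-substitution scheme for integral equations:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
u = rng.uniform(size=n)

f = np.exp(u)   # integrand: true integral is e - 1
g = 1.0 + u     # control variate with known mean 3/2

# Near-optimal coefficient: beta = Cov(f, g) / Var(g)
beta = np.cov(f, g)[0, 1] / g.var(ddof=1)
est_plain = f.mean()
est_cv = f.mean() - beta * (g.mean() - 1.5)

print(f"plain MC: {est_plain:.5f}   control variate: {est_cv:.5f}   "
      f"exact: {np.e - 1:.5f}")
```

Because g tracks f closely, subtracting the known-mean control removes most of the sampling noise, which is the variance reduction the abstract reports for the integral-equation setting.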

Findings

Owing to the application of the present method, the variance of the solution is reduced. Therefore, this method achieves several orders of magnitude improvement in accuracy over the conventional Monte Carlo method.

Practical implications

Numerical tests are performed to show the efficiency and accuracy of the present method. Numerical experiments show that an excellent estimate of the solution can be obtained within a couple of minutes of CPU time on a Pentium IV 2.4 GHz PC.

Originality/value

This paper provides a new efficient method to solve Fredholm integral equations of the second kind and discusses basic advantages of the present method.

Details

Kybernetes, vol. 38 no. 9
Type: Research Article
ISSN: 0368-492X
