Search results

1 – 10 of 432
Book part
Publication date: 24 October 2018

Harro Maas

Abstract

In this chapter, I take a talk show in which Coen Teulings, then Director of the official Dutch Bureau for Economic Forecasting and Policy Analysis (CPB), was interviewed about its economic forecasts in the immediate aftermath of the financial crisis of 2008 as a point of entry into an examination of how personal experience and judgment enter, and are essential for, the production and presentation of economic forecasts. During the interview it transpired that CPB did not rely on its macroeconomic models but on personal experience, encapsulated in “hand-made” monitors, to observe the unfolding crisis; monitors that were, in Teulings’ words, used to “feel the pulse” of the Dutch economy. I take this metaphor as a cue to present several historical episodes in which models, numbers, and a certain feel for economic phenomena were meant to make CPB economists’ research more precise. These episodes are linked with a story about vain attempts by CPB director Teulings to drive the personal out of economic forecasting. The crisis forced him to recognize that personal experience mattered more for the precision of economic forecasts than theoretical deepening. The crisis thus both challenged the belief in the supremacy of theory-driven, computer-based forecasting and helped foster the view that precision is inevitably linked to judgment, experience, and observation, not seated in increased attention to high theory; scientifically sound knowledge proved less useful than the technically unqualified experiential knowledge of quacks.

Details

Including a Symposium on Mary Morgan: Curiosity, Imagination, and Surprise
Type: Book
ISBN: 978-1-78756-423-7

Book part
Publication date: 21 December 2010

Tore Selland Kleppe, Jun Yu and H.J. Skaug

Abstract

In this chapter we develop and implement a method for maximum simulated likelihood estimation of the continuous-time stochastic volatility model with constant elasticity of volatility. The approach requires observations neither on option prices nor on volatility. To integrate latent volatility out of the joint density of return and volatility, a modified efficient importance sampling technique is used after the continuous-time model is approximated using the Euler–Maruyama scheme. The Monte Carlo studies show that the method works well, and the empirical applications illustrate its usefulness. Empirical results provide strong evidence against the Heston model.
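The Euler–Maruyama discretization mentioned in the abstract can be illustrated with a minimal simulation sketch of a constant-elasticity-of-volatility model. All parameter values and names here are illustrative assumptions, not the chapter's own specification or code:

```python
import numpy as np

def simulate_cev_sv(mu=0.05, kappa=2.0, theta=0.04, xi=0.3, gamma=0.5,
                    v0=0.04, dt=1/252, n_steps=252, rho=-0.5, seed=0):
    """Euler-Maruyama discretization of a CEV stochastic volatility model.

    Latent variance: dV_t = kappa*(theta - V_t) dt + xi * V_t**gamma dW2_t
    Log return:      r_t  = (mu - V_t/2) dt + sqrt(V_t) dW1_t
    with corr(dW1, dW2) = rho. Volatility is latent: only r is "observed".
    """
    rng = np.random.default_rng(seed)
    v = np.empty(n_steps + 1)
    r = np.empty(n_steps)
    v[0] = v0
    for t in range(n_steps):
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal()
        r[t] = (mu - v[t] / 2) * dt + np.sqrt(v[t] * dt) * z1
        # Euler step for the variance; floor at a tiny value to keep sqrt valid
        v[t + 1] = max(v[t] + kappa * (theta - v[t]) * dt
                       + xi * v[t]**gamma * np.sqrt(dt) * z2, 1e-10)
    return r, v
```

In the estimation method described above, the latent path `v` would be integrated out of the joint density of `(r, v)` by efficient importance sampling; the sketch only shows the discretized data-generating step. Setting `gamma=0.5` recovers a Heston-type square-root variance process as a special case.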

Details

Maximum Simulated Likelihood Methods and Applications
Type: Book
ISBN: 978-0-85724-150-4

Book part
Publication date: 23 June 2016

Jean-Jacques Forneron and Serena Ng

Abstract

This paper considers properties of an optimization-based sampler for targeting the posterior distribution when the likelihood is intractable. It uses auxiliary statistics to summarize information in the data and does not directly evaluate the likelihood associated with the specified parametric model. Our reverse sampler approximates the desired posterior distribution by first solving a sequence of simulated minimum distance problems. The solutions are then reweighted by an importance ratio that depends on the prior and the volume of the Jacobian matrix. By a change-of-variable argument, the output consists of draws from the desired posterior distribution. Optimization always results in acceptable draws. Hence, when the minimum distance problem is not too difficult to solve, combining importance sampling with optimization can be much faster than the method of Approximate Bayesian Computation that bypasses optimization.
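The reverse-sampler logic can be sketched in a toy normal location model, where the simulated minimum distance problem has a closed-form solution and the Jacobian of the auxiliary statistic is 1, so the importance weight reduces to the prior. The model, names, and prior are illustrative assumptions, not the paper's code:

```python
import numpy as np

def reverse_sampler(y_obs, n_draws=2000, prior_mu=0.0, prior_sd=2.0, seed=0):
    """Toy reverse sampler for y ~ N(theta, 1) with auxiliary statistic ybar.

    For each simulated noise path eps_b, the simulated-minimum-distance
    problem  min_theta (mean(theta + eps_b) - mean(y_obs))**2  is solved
    exactly by theta_b = mean(y_obs) - mean(eps_b). The Jacobian of the
    auxiliary statistic w.r.t. theta equals 1 here, so the importance
    weight is just the (normal) prior density evaluated at theta_b.
    """
    rng = np.random.default_rng(seed)
    n = len(y_obs)
    ybar = np.mean(y_obs)
    eps = rng.standard_normal((n_draws, n))
    theta = ybar - eps.mean(axis=1)          # exact SMD solutions, all "accepted"
    w = np.exp(-0.5 * ((theta - prior_mu) / prior_sd) ** 2)
    w /= w.sum()                             # self-normalized importance weights
    return theta, w
```

Because every optimization yields an acceptable draw, there is no rejection step as in ABC; the weighted draws `(theta, w)` approximate the posterior, and here the weighted mean can be checked against the conjugate normal-normal posterior mean.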

Details

Essays in Honor of Aman Ullah
Type: Book
ISBN: 978-1-78560-786-8

Book part
Publication date: 1 January 1991

Details

Operations Research for Libraries and Information Agencies: Techniques for the Evaluation of Management Decision Alternatives
Type: Book
ISBN: 978-0-12424-520-4

Book part
Publication date: 9 July 2010

Akos Rona-Tas and Stefanie Hiss

Abstract

Both consumer and corporate credit rating agencies played a major role in the US subprime mortgage crisis. Equifax, Experian, and TransUnion deployed a formalized scoring system to assess individuals in mortgage origination, while mortgage pools were then assessed for securitization by Moody's, S&P, and Fitch, relying on expert judgment aided by formal models. What can we learn about the limits of formalization from the crisis? We discuss five problems responsible for the rating failures – reactivity, endogeneity, learning, correlated outcomes, and conflict of interest – and compare the way consumer and corporate rating agencies tackled these difficulties. We conclude with some policy lessons.

Details

Markets on Trial: The Economic Sociology of the U.S. Financial Crisis: Part A
Type: Book
ISBN: 978-0-85724-205-1

Details

Conceptualising Risk Assessment and Management across the Public Sector
Type: Book
ISBN: 978-1-80043-693-0

Details

Structural Road Accident Models
Type: Book
ISBN: 978-0-08-043061-4

Book part
Publication date: 22 August 2014

Alisa G. Brink, Eric Gooden and Meha Kohli Mishra

Abstract

There has been much discussion regarding the necessity of moving away from precise (rules-based) standards toward less precise (principles-based) standards. This study examines the impact of the proposed shift by using a controlled experiment to evaluate the influence of rule precision and information ambiguity on reporting decisions in the presence of monetary incentives to report aggressively. Using motivated reasoning theory as a framework, we predict that the malleability inherent in both rule precision and information ambiguity amplifies biased reasoning in a manner consistent with individuals’ pecuniary incentives. In contrast, consistent with research on ambiguity aversion, we predict that high levels of ambiguity will actually attenuate aggressive reporting. Our results support these predictions. Specifically, we find an interactive effect between rule precision and information ambiguity on self-interested reporting decisions at moderate levels of ambiguity. However, consistent with ambiguity aversion, we find decreased self-interested reporting at high levels of ambiguity relative to moderate ambiguity. This study should be of interest to preparers, auditors, and regulators interested in identifying situations that amplify or diminish aggressive reporting.

Details

Advances in Accounting Behavioral Research
Type: Book
ISBN: 978-1-78350-445-9

Book part
Publication date: 1 December 2016

Roman Liesenfeld, Jean-François Richard and Jan Vogler

We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and…

Abstract

We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and non-Gaussian response variables. The class of models under consideration includes specifications for discrete choices, event counts and limited-dependent variables (truncation, censoring, and sample selection) among others. Our algorithm relies upon a novel implementation of efficient importance sampling (EIS) specifically designed to exploit typical sparsity of high-dimensional spatial precision (or covariance) matrices. It is numerically very accurate and computationally feasible even for very high-dimensional latent processes. Thus, maximum likelihood (ML) estimation of high-dimensional non-Gaussian spatial models, hitherto considered to be computationally prohibitive, becomes feasible. We illustrate our approach with ML estimation of a spatial probit for US presidential voting decisions and spatial count data models (Poisson and Negbin) for firm location choices.
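The key computational step behind such methods — drawing a high-dimensional latent Gaussian field from its sparse precision matrix via a Cholesky factor — can be sketched as follows. A dense factorization is used here for brevity; the names and the tridiagonal example precision are illustrative assumptions, not the chapter's algorithm:

```python
import numpy as np

def sample_gmrf(mu, Q, n_samples=1, seed=0):
    """Draw from N(mu, Q^{-1}) using the Cholesky factor of the precision Q.

    For spatial models Q is typically sparse (e.g. a CAR/SAR precision),
    and a sparse Cholesky factorization keeps this step cheap even for
    very high-dimensional latent processes; np.linalg.cholesky is dense
    and stands in for that sparse solver in this small sketch.
    """
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(Q)                     # Q = L @ L.T
    z = rng.standard_normal((len(mu), n_samples))
    # If x = mu + L^{-T} z, then Cov(x) = L^{-T} L^{-1} = Q^{-1}
    x = mu[:, None] + np.linalg.solve(L.T, z)
    return x.T

# Tridiagonal (sparse) precision on a line graph of 100 sites
n = 100
Q = 2.01 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
draws = sample_gmrf(np.zeros(n), Q, n_samples=2000)
```

In an efficient importance sampling scheme, `Q` and `mu` would come from a local Gaussian approximation to the conditional posterior of the latent process, and the sparsity of `Q` is exactly what makes the factorization, and hence the likelihood evaluation, feasible in high dimensions.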

Details

Spatial Econometrics: Qualitative and Limited Dependent Variables
Type: Book
ISBN: 978-1-78560-986-2

Details

Transportation and Traffic Theory in the 21st Century
Type: Book
ISBN: 978-0-080-43926-6
