Search results

1 – 10 of over 1000
Book part
Publication date: 19 November 2014

Garland Durham and John Geweke

Abstract

Massively parallel desktop computing capabilities now well within the reach of individual academics modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and that works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
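The abstract stresses that the simulator needs only two user-supplied ingredients: simulation from the prior and evaluation of the prior and data densities. As a rough illustration of that structure, here is a minimal data-tempered sequential Monte Carlo sketch in Python, with correction, selection, and mutation phases and the marginal likelihood accumulated as a by-product. The toy Gaussian model, the prior, and all tuning constants are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def logsumexp(a):
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

def log_prior(theta):
    # theta ~ N(0, 10^2), an assumed prior for the toy model
    return -0.5 * (theta / 10.0) ** 2 - np.log(10.0) - 0.5 * np.log(2 * np.pi)

def log_lik(theta, y):
    # y_t | theta ~ N(theta, 1): sum_t log p(y_t | theta) for each particle
    return -0.5 * ((y[None, :] - theta[:, None]) ** 2).sum(axis=1) \
           - 0.5 * len(y) * np.log(2 * np.pi)

def smc_posterior(y, n_particles=4096, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 10.0, n_particles)          # simulate from the prior
    log_w = np.full(n_particles, -np.log(n_particles))  # normalized log weights
    log_ml = 0.0                                        # log marginal likelihood
    for t in range(len(y)):
        # Correction phase: reweight by the new observation's data density
        incr = -0.5 * (y[t] - theta) ** 2 - 0.5 * np.log(2 * np.pi)
        step = logsumexp(log_w + incr)   # estimates log p(y_t | y_1:t-1)
        log_ml += step
        log_w = log_w + incr - step
        # Selection phase: resample when the effective sample size collapses
        if 1.0 / np.exp(2 * log_w).sum() < n_particles / 2:
            idx = rng.choice(n_particles, n_particles, p=np.exp(log_w))
            theta = theta[idx]
            log_w = np.full(n_particles, -np.log(n_particles))
            # Mutation phase: one random-walk Metropolis sweep targeting
            # the partial posterior p(theta | y_1:t)
            prop = theta + rng.normal(0.0, 0.5, n_particles)
            log_a = (log_prior(prop) + log_lik(prop, y[:t + 1])
                     - log_prior(theta) - log_lik(theta, y[:t + 1]))
            accept = np.log(rng.uniform(size=n_particles)) < log_a
            theta = np.where(accept, prop, theta)
    return theta, np.exp(log_w), log_ml

y = np.random.default_rng(42).normal(1.5, 1.0, 80)      # synthetic data
theta, w, log_ml = smc_posterior(y)
print("posterior mean ~", np.sum(w * theta), "; log marginal likelihood ~", log_ml)
```

Every particle-level operation above is embarrassingly parallel, which is the property that makes this family of simulators a good fit for the massively parallel hardware the paper targets.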

Book part
Publication date: 1 December 2008

Zhen Wei

Abstract

Survival (default) data are frequently encountered in financial (especially credit risk), medical, educational, and other fields, where the "default" can be interpreted as a company's failure to meet its debt payments, the death of a patient in a medical study, or the inability to pass an educational test.

This paper introduces the basic ideas of Cox's original proportional hazards model and extends the model within a general framework of statistical data-mining procedures. By employing regularization, basis expansion, boosting, bagging, Markov chain Monte Carlo (MCMC), and many other tools, we effectively calibrate a large and flexible class of proportional hazards models.

The proposed methods have important applications in credit risk. For example, a regularized model of default correlation can be used to price credit basket products, and frailty factor models can explain contagion effects in the defaults of multiple firms in the credit market.

Details

Econometrics and Risk Management
Type: Book
ISBN: 978-1-84855-196-1
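The abstract's full toolkit (basis expansion, boosting, bagging, MCMC, frailty factors) goes well beyond a snippet, but the regularization ingredient can be sketched on the plain Cox model. Below is a minimal Python sketch that fits a Cox proportional hazards model with a ridge penalty by gradient descent on the negative log partial likelihood (Breslow-style, assuming no tied event times); the synthetic data, function names, and tuning values are assumptions for illustration only, not the paper's method.

```python
import numpy as np

def cox_ridge_fit(X, time, event, lam=1.0, lr=0.1, steps=500):
    # Cox proportional hazards with an L2 (ridge) penalty, fit by
    # gradient descent on the negative log partial likelihood.
    order = np.argsort(-time)            # sort by decreasing survival time
    X, event = X[order], event[order].astype(bool)
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(steps):
        eta = X @ beta
        eta -= eta.max()                 # numerical stability; ratios unchanged
        w = np.exp(eta)
        cum_w = np.cumsum(w)             # risk-set sums over {j : time_j >= time_i}
        cum_wx = np.cumsum(w[:, None] * X, axis=0)
        # gradient of -log PL: sum over events of (risk-set mean of x) - x_i
        grad = (cum_wx[event] / cum_w[event, None] - X[event]).sum(axis=0)
        grad += 2 * lam * beta           # ridge penalty term
        beta -= lr * grad / n
    return beta

# Tiny synthetic example: exponential survival times whose hazard
# depends on two covariates, with independent censoring (all assumed).
rng = np.random.default_rng(7)
X = rng.normal(size=(500, 2))
true_beta = np.array([0.8, -0.5])
t_event = rng.exponential(1.0 / np.exp(X @ true_beta))
t_cens = rng.exponential(2.0, 500)
time = np.minimum(t_event, t_cens)
event = (t_event <= t_cens).astype(int)
print("estimated beta:", cox_ridge_fit(X, time, event, lam=0.1))
```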

Details

Dynamic General Equilibrium Modelling for Forecasting and Policy: A Practical Guide and Documentation of MONASH
Type: Book
ISBN: 978-0-44451-260-4

Details

Dynamic General Equilibrium Modelling for Forecasting and Policy: A Practical Guide and Documentation of MONASH
Type: Book
ISBN: 978-0-44451-260-4

Book part
Publication date: 1 January 2004

Nathan Lael Joseph, David S. Brée and Efstathios Kalyvas

Abstract

Are the learning procedures of genetic algorithms (GAs) able to generate optimal architectures for artificial neural networks (ANNs) applied to high-frequency data? In this experimental study, GAs are used to identify the best architecture for ANNs, which then undergo further training to forecast daily excess stock returns. No ANN architecture was able to outperform a random walk, despite evidence of non-linearity in the excess returns. This failure is attributed to the absence of suitable ANN structures, and it implies that researchers should be cautious when drawing inferences from ANN results based on high-frequency data.

Details

Applications of Artificial Intelligence in Finance and Economics
Type: Book
ISBN: 978-1-84950-303-7
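The experiment's core loop, a GA proposing ANN architectures whose fitness is out-of-sample forecast error, can be sketched compactly. The following Python sketch encodes the hidden-layer size as a bit string and evolves it against validation error on synthetic data; the encoding, network, data, and GA settings are toy assumptions, not the authors' design.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "excess return" data: a linear signal in lagged features plus noise.
n = 400
x = rng.normal(size=(n, 3))
y = x @ np.array([0.4, -0.2, 0.1]) + 0.5 * rng.normal(size=n)
x_tr, y_tr, x_va, y_va = x[:300], y[:300], x[300:], y[300:]

def train_mlp(n_hidden, steps=300, lr=0.05):
    # One-hidden-layer tanh network trained by full-batch gradient descent;
    # returns validation MSE as the architecture's (negated) fitness.
    W1 = rng.normal(0, 0.5, (x_tr.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, n_hidden)
    b2 = 0.0
    for _ in range(steps):
        h = np.tanh(x_tr @ W1 + b1)
        err = h @ W2 + b2 - y_tr
        dh = np.outer(err, W2) * (1 - h ** 2)        # backprop through tanh
        W2 -= lr * h.T @ err / len(y_tr)
        b2 -= lr * err.mean()
        W1 -= lr * x_tr.T @ dh / len(y_tr)
        b1 -= lr * dh.mean(axis=0)
    h_va = np.tanh(x_va @ W1 + b1)
    return np.mean((h_va @ W2 + b2 - y_va) ** 2)

def fitness(bits):
    n_hidden = int("".join(map(str, bits)), 2) + 1   # decode: 1..32 hidden units
    return -train_mlp(n_hidden)                      # GA maximizes fitness

def evolve(pop_size=12, n_gen=8, n_bits=5):
    pop = rng.integers(0, 2, (pop_size, n_bits))
    for _ in range(n_gen):
        fit = np.array([fitness(ind) for ind in pop])
        # Tournament selection of parents
        parents = np.array([pop[i] if fit[i] > fit[j] else pop[j]
                            for i, j in rng.integers(0, pop_size, (pop_size, 2))])
        # One-point crossover on paired parents
        children = parents.copy()
        for k in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_bits)
            children[k, cut:], children[k + 1, cut:] = \
                parents[k + 1, cut:].copy(), parents[k, cut:].copy()
        # Bit-flip mutation
        flip = rng.random(children.shape) < 0.05
        pop = np.where(flip, 1 - children, children)
    fit = np.array([fitness(ind) for ind in pop])
    return int("".join(map(str, pop[fit.argmax()])), 2) + 1

print("best hidden-layer size found:", evolve())
```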

Details

Dynamic General Equilibrium Modelling for Forecasting and Policy: A Practical Guide and Documentation of MONASH
Type: Book
ISBN: 978-0-44451-260-4

Details

Algorithms, Blockchain & Cryptocurrency: Implications for the Future of the Workplace
Type: Book
ISBN: 978-1-83867-495-3

Details

Overlapping Generations: Methods, Models and Morphology
Type: Book
ISBN: 978-1-83753-052-6

Book part
Publication date: 4 September 2023

Stephen E. Spear and Warren Young

Details

Overlapping Generations: Methods, Models and Morphology
Type: Book
ISBN: 978-1-83753-052-6

Book part
Publication date: 1 January 2001

Details

Dynamic General Equilibrium Modelling for Forecasting and Policy: A Practical Guide and Documentation of MONASH
Type: Book
ISBN: 978-0-44451-260-4
