Search results

1 – 10 of over 3000
Book part
Publication date: 1 January 2008

Michiel de Pooter, Francesco Ravazzolo, Rene Segers and Herman K. van Dijk

Abstract

Several lessons learnt from a Bayesian analysis of basic macroeconomic time-series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic models, to forecasting with near-random walk models and to clustering of several economic series in a small number of groups within a data panel. Two canonical models are used: a linear regression model with autocorrelation and a simple variance components model. Several well-known time-series models like unit root and error correction models and further state space and panel data models are shown to be simple generalizations of these two canonical models for the purpose of posterior inference. A Bayesian model averaging procedure is presented in order to deal with models with substantial probability both near and at the boundary of the parameter region. Analytical, graphical, and empirical results using U.S. macroeconomic data, in particular on GDP growth, are presented.
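The model averaging step described in the abstract can be sketched numerically: posterior model probabilities are proportional to marginal likelihood times prior, and forecasts are combined with those weights. A minimal numpy illustration in which the log marginal likelihoods, priors, and forecasts are made-up placeholders, not values from the chapter:

```python
import numpy as np

# Hypothetical log marginal likelihoods for three candidate specifications
# (illustrative placeholders, not results from the chapter).
log_ml = np.array([-101.2, -100.5, -103.0])   # log p(y | M_k) for three models
prior = np.ones(3) / 3.0                      # equal prior model probabilities

# p(M_k | y) is proportional to p(y | M_k) p(M_k); subtract the largest
# log marginal likelihood before exponentiating for numerical stability.
w = np.exp(log_ml - log_ml.max()) * prior
post = w / w.sum()

# The averaged forecast weights each model's forecast by its posterior probability.
forecasts = np.array([2.1, 1.8, 2.4])         # hypothetical GDP growth forecasts
bma_forecast = float(post @ forecasts)
```

The averaged forecast is a convex combination of the individual forecasts, which is what lets the procedure blend models with mass near and at the boundary of the parameter region.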

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Article
Publication date: 1 April 2000

Nobuhiko Terui

Abstract

In market share analysis, it is well recognized that predicted market shares are often inadmissible: some predicted shares fall outside the range [0, 1], and the predicted shares do not always sum to one, the so-called "logical inconsistency". Based on the Bayesian VAR model, this paper proposes a dynamic market share model that guarantees logical consistency. The proposed method makes it possible to forecast not only the market shares themselves but also various dynamic market share relations across brands or companies. Daily scanner data from the Nikkei POS information system are analyzed with the proposed method.
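One standard device for enforcing logical consistency of this kind is to forecast unconstrained "attraction" scores (for example, log share ratios) and map them back to shares with a softmax, which guarantees each share lies in [0, 1] and the shares sum to one. A minimal sketch with illustrative numbers, not the paper's Bayesian VAR estimates:

```python
import numpy as np

def shares_from_attractions(a):
    """Map unconstrained attraction scores to shares in [0, 1] that sum to one."""
    e = np.exp(a - a.max())     # subtract the max for numerical stability
    return e / e.sum()

# Hypothetical predicted attractions for three brands (e.g. predicted log share
# ratios); the numbers are illustrative, not Nikkei POS estimates.
attractions = np.array([0.4, -0.1, 0.7])
pred_shares = shares_from_attractions(attractions)
```

Whatever values the underlying VAR predicts for the attractions, the resulting shares are logically consistent by construction.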

Details

Marketing Intelligence & Planning, vol. 18 no. 2
Type: Research Article
ISSN: 0263-4503

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Book part
Publication date: 6 January 2016

Antonello D’Agostino, Domenico Giannone, Michele Lenza and Michele Modugno

Abstract

We develop a framework for measuring and monitoring business cycles in real time. Following a long tradition in macroeconometrics, inference is based on a variety of indicators of economic activity, treated as imperfect measures of an underlying index of business cycle conditions. We extend existing approaches by allowing for heterogeneous lead–lag patterns of the various indicators along the business cycle. The framework is well suited for high-frequency monitoring of current economic conditions in real time (nowcasting), since inference can be conducted in the presence of mixed-frequency data and irregular patterns of data availability. Our assessment of the underlying index of business cycle conditions is accurate and more timely than popular alternatives, including the Chicago Fed National Activity Index (CFNAI). A formal real-time forecasting evaluation shows that the framework produces well-calibrated probability nowcasts that resemble the consensus assessment of the Survey of Professional Forecasters.
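The mixed-frequency, ragged-edge aspect can be illustrated with a one-factor state-space model filtered by a Kalman filter that simply skips missing observations. This is a simplified sketch with made-up parameters, not the authors' specification:

```python
import numpy as np

# One-factor state-space sketch: two noisy indicators load on a latent AR(1)
# "business cycle" factor; the second indicator is observed only sporadically.
# All parameter values are illustrative.
rng = np.random.default_rng(1)

phi = 0.9                       # factor persistence
lam = np.array([1.0, 0.7])      # indicator loadings
sig = np.array([0.5, 0.8])      # indicator noise standard deviations
T = 50

# Simulate the factor and the indicators, then hide part of the data.
f = np.zeros(T)
for t in range(1, T):
    f[t] = phi * f[t - 1] + rng.normal()
y = np.outer(f, lam) + rng.normal(size=(T, 2)) * sig
y[::3, 1] = np.nan              # ragged-edge availability for indicator 2

# Univariate Kalman filter that skips missing observations.
a, P = 0.0, 1.0
f_hat = np.zeros(T)
for t in range(T):
    a, P = phi * a, phi ** 2 * P + 1.0           # predict
    for i in range(2):
        if np.isnan(y[t, i]):
            continue                             # no update when unobserved
        S = lam[i] ** 2 * P + sig[i] ** 2        # innovation variance
        K = P * lam[i] / S                       # Kalman gain
        a += K * (y[t, i] - lam[i] * a)
        P *= 1 - K * lam[i]
    f_hat[t] = a                                 # filtered index estimate
```

The filtered series plays the role of the underlying index of business cycle conditions; updating sequentially over indicators is valid here because the measurement errors are assumed independent.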

Article
Publication date: 1 July 2006

George Chang

Abstract

Purpose

The purpose of this paper is to investigate whether the Markov mixture of normals (MMN) model is a viable approach to modeling financial returns.

Design/methodology/approach

This paper adopts a full Bayesian estimation approach based on Gibbs sampling and the latent state variable simulation algorithm developed by Chib.

Findings

Using data from the S&P 500 index, the paper first demonstrates that the MMN model is able to capture the unconditional features of the S&P 500 daily returns. It further conducts formal model comparisons to examine the performance of the Markov mixture structures relative to two well‐known alternatives, the GARCH and the t‐GARCH models. The results clearly indicate that MMN models are viable alternatives to modeling financial returns.
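The ability of a Markov mixture of normals to generate the fat tails seen in daily returns can be checked by simulation. A sketch with illustrative regime parameters, not estimates for the S&P 500:

```python
import numpy as np

# Simulate returns from a two-state Markov mixture of normals (calm vs.
# volatile regime) and measure the excess kurtosis the switching generates.
rng = np.random.default_rng(42)

P = np.array([[0.98, 0.02],     # row s: transition probabilities from state s
              [0.05, 0.95]])
mu = np.array([0.05, -0.10])    # regime means (illustrative)
sigma = np.array([0.7, 2.0])    # regime volatilities (illustrative)

T = 50_000
s = 0
r = np.empty(T)
for t in range(T):
    s = int(rng.random() < P[s, 1])          # evolve the Markov chain
    r[t] = mu[s] + sigma[s] * rng.normal()   # draw a return from the active regime

z = (r - r.mean()) / r.std()
excess_kurtosis = (z ** 4).mean() - 3.0      # > 0: fatter tails than one normal
```

With these transition probabilities the chain spends most of its time in the calm regime but occasionally switches, producing the volatility clustering and heavy tails that a single normal cannot capture.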

Research limitations/implications

The univariate MMN structure in this paper can be generalized to a multivariate setting, which can provide a flexible yet practical approach to modeling multiple time series of assets returns.

Practical implications

Given the encouraging empirical performance of the MMN models, it is hoped that they will prove successful in financial applications of interest such as Value-at-Risk and option pricing.

Originality/value

The paper explicitly formulates the Gibbs sampling procedures for estimating MMN models in a Bayesian framework. It also shows empirically that MMN models are able to capture the stylized features of financial returns. The MMN models and their estimation method can be applied to other financial data, especially data in which tail probability is of major interest or concern.

Details

Studies in Economics and Finance, vol. 23 no. 2
Type: Research Article
ISSN: 1086-7376

Abstract

This article surveys recent developments in the evaluation of point and density forecasts in the context of forecasts made by vector autoregressions. Specific emphasis is placed on highlighting those parts of the existing literature that are applicable to direct multistep forecasts and those parts that are applicable to iterated multistep forecasts. This literature includes advancements in the evaluation of forecasts in population (based on true, unknown model coefficients) and the evaluation of forecasts in the finite sample (based on estimated model coefficients). The article then examines in Monte Carlo experiments the finite-sample properties of some tests of equal forecast accuracy, focusing on the comparison of VAR forecasts to AR forecasts. These experiments show the tests to behave as should be expected given the theory. For example, using critical values obtained by bootstrap methods, tests of equal accuracy in population have empirical size about equal to nominal size.
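A basic test of equal forecast accuracy of the kind examined here is a Diebold-Mariano-style statistic on a squared-error loss differential. A sketch using simulated placeholder forecast errors; the chapter's bootstrap critical values are not reproduced:

```python
import numpy as np

# Diebold-Mariano-style statistic for equal forecast accuracy under
# squared-error loss, comparing hypothetical AR and VAR one-step forecast
# errors. The errors are simulated placeholders, not the chapter's results.
rng = np.random.default_rng(7)

T = 200
e_ar = rng.normal(0.0, 1.0, T)      # AR model forecast errors (placeholder)
e_var = rng.normal(0.0, 1.0, T)     # VAR model forecast errors (placeholder)

d = e_ar ** 2 - e_var ** 2          # loss differential series
dm = d.mean() / (d.std(ddof=1) / np.sqrt(T))   # t-statistic for H0: E[d] = 0
```

For multistep forecasts the denominator would use a HAC (long-run) variance of the loss differential, and, as the article emphasizes, bootstrap critical values are often preferable to the asymptotic normal ones, especially when comparing nested AR and VAR models.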

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Article
Publication date: 13 May 2014

Timothy Hart and Paul Zandbergen

Abstract

Purpose

The purpose of this paper is to examine the effects of user-defined parameter settings (e.g. interpolation method, grid cell size, and bandwidth) on the predictive accuracy of crime hotspot maps produced from kernel density estimation (KDE).

Design/methodology/approach

The influence of variations in parameter settings on prospective KDE maps is examined across two types of interpersonal violence (aggravated assault and robbery) and two types of property crime (commercial burglary and motor vehicle theft).

Findings

Results show that the interpolation method has a considerable effect on predictive accuracy, grid cell size has little to no effect, and bandwidth has some effect.
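The role of the bandwidth parameter can be seen in a bare-bones grid-based Gaussian KDE. A sketch on synthetic incident coordinates; the kernel sum is left unnormalized, since hotspot maps only need relative intensity:

```python
import numpy as np

# Synthetic incident locations: one tight cluster plus uniform background.
rng = np.random.default_rng(3)
cluster = rng.normal([50.0, 50.0], 2.0, size=(80, 2))
background = rng.uniform(0.0, 100.0, size=(20, 2))
pts = np.vstack([cluster, background])

def kde_grid(points, cell=2.0, bandwidth=5.0, extent=100.0):
    """Evaluate an (unnormalized) Gaussian kernel density on a square grid."""
    xs = np.arange(0.0, extent, cell)
    gx, gy = np.meshgrid(xs, xs)
    grid = np.stack([gx.ravel(), gy.ravel()], axis=1)
    # Squared distance from every grid cell to every incident point.
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * bandwidth ** 2)).sum(axis=1).reshape(gx.shape)

narrow = kde_grid(pts, bandwidth=2.0)    # sharply peaked surface
wide = kde_grid(pts, bandwidth=10.0)     # heavily smoothed surface
```

The narrow-bandwidth surface concentrates intensity at the cluster while the wide one smears it out; which setting predicts future incidents better is exactly the trade-off the study measures.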

Originality/value

The current study advances the knowledge and understanding of prospective hotspot crime mapping as it answers the calls by Chainey et al. (2008) and others to further investigate the methods used to predict crime.

Details

Policing: An International Journal of Police Strategies & Management, vol. 37 no. 2
Type: Research Article
ISSN: 1363-951X

Article
Publication date: 27 June 2008

Gladys D.C. Barriga, Linda Lee Ho and Vicente G. Cancho

Abstract

Purpose

The purpose of this paper is to present designs for an accelerated life test (ALT).

Design/methodology/approach

Bayesian methods and Markov chain Monte Carlo (MCMC) simulation were used.

Findings

The paper proposes a Bayesian method based on MCMC for ALT under the exponentiated-Weibull (EW) distribution (for lifetime) and the Arrhenius model (relating the stress variable to the parameters). It concludes that this is a reasonable alternative to classical statistical methods, since the proposed method is simple to implement, does not require advanced computational understanding, and allows inferences on the parameters to be made easily. Using the predictive density of a future observation, a procedure was developed to plan an ALT and to verify whether the conformance fraction of the manufacturing process reaches a desired level of quality. This procedure is useful for statistical process control in many industrial applications.
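The lifetime model can be sketched directly: an exponentiated-Weibull CDF whose scale parameter follows an Arrhenius relation in temperature. All coefficient values below are hypothetical illustrations, not the paper's estimates:

```python
import numpy as np

def ew_cdf(t, alpha, theta, scale):
    """Exponentiated-Weibull CDF: F(t) = [1 - exp(-(t / scale)^alpha)]^theta."""
    return (1.0 - np.exp(-(t / scale) ** alpha)) ** theta

def arrhenius_scale(temp_kelvin, a=2.0, b=4000.0):
    """Arrhenius relation: characteristic life exp(a + b / T), larger when cooler.
    The coefficients a and b are hypothetical."""
    return np.exp(a + b / temp_kelvin)

# At a fixed time, the failure probability is higher under the hotter (stress)
# condition, which is what lets short accelerated tests inform use-condition life.
t = 500.0
p_use = ew_cdf(t, alpha=1.5, theta=0.8, scale=arrhenius_scale(300.0))     # use temp
p_stress = ew_cdf(t, alpha=1.5, theta=0.8, scale=arrhenius_scale(350.0))  # test temp
```

In the paper's Bayesian treatment the parameters of both functions carry posterior distributions, and test plans are evaluated through the predictive density built from those draws.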

Research limitations/implications

The results may be applied in semiconductor manufacturing.

Originality/value

The Exponentiated‐Weibull‐Arrhenius model has never before been used to plan an ALT.

Details

International Journal of Quality & Reliability Management, vol. 25 no. 6
Type: Research Article
ISSN: 0265-671X

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Article
Publication date: 29 November 2019

A. George Assaf and Mike G. Tsionas

Abstract

Purpose

This paper aims to present several Bayesian specification tests for both in- and out-of-sample situations.

Design/methodology/approach

The authors focus on the Bayesian equivalents of the frequentist approach for testing heteroskedasticity, autocorrelation and functional form specification. For out-of-sample diagnostics, the authors consider several tests to evaluate the predictive ability of the model.
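A common Bayesian counterpart to such specification tests is the posterior predictive check: compare an observed discrepancy statistic with its distribution under data replicated from posterior draws. A sketch in which the "observed" residuals and the posterior draws are simulated placeholders, not the authors' hotel application:

```python
import numpy as np

rng = np.random.default_rng(11)

y = rng.normal(0.0, 1.0, 100)          # observed model residuals (placeholder)
obs_stat = np.abs(y).max()             # discrepancy: largest absolute residual

# Hypothetical posterior draws of the residual scale; replicate a dataset of
# the same size for each draw and recompute the discrepancy.
sigma_draws = rng.uniform(0.9, 1.1, 500)
rep_stats = np.array([np.abs(rng.normal(0.0, s, 100)).max() for s in sigma_draws])

# Posterior predictive p-value; values near 0 or 1 signal misspecification.
ppp = (rep_stats >= obs_stat).mean()
```

Swapping in a different discrepancy statistic (e.g. one sensitive to heteroskedasticity or autocorrelation) targets the check at the corresponding specification failure.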

Findings

The authors demonstrate the performance of these tests using an application on the relationship between price and occupancy rate from the hotel industry. For purposes of comparison, the authors also provide evidence from traditional frequentist tests.

Research limitations/implications

There certainly exist other issues and diagnostic tests that are not covered in this paper. The issues that are addressed, however, are critically important and can be applied to most modeling situations.

Originality/value

With the increased use of the Bayesian approach in various modeling contexts, this paper serves as an important guide for diagnostic testing in Bayesian analysis. Diagnostic analysis is essential and should always accompany the estimation of regression models.

Details

International Journal of Contemporary Hospitality Management, vol. 32 no. 4
Type: Research Article
ISSN: 0959-6119
