Search results

1 – 10 of over 3000
Book part
Publication date: 24 March 2006

Ngai Hang Chan and Wilfredo Palma

Since the seminal works by Granger and Joyeux (1980) and Hosking (1981), estimations of long-memory time series models have been receiving considerable attention and a number of…

Abstract

Since the seminal works by Granger and Joyeux (1980) and Hosking (1981), estimations of long-memory time series models have been receiving considerable attention and a number of parameter estimation procedures have been proposed. This paper gives an overview of this plethora of methodologies with special focus on likelihood-based techniques. Broadly speaking, likelihood-based techniques can be classified into the following categories: the exact maximum likelihood (ML) estimation (Sowell, 1992; Dahlhaus, 1989), ML estimates based on autoregressive approximations (Granger & Joyeux, 1980; Li & McLeod, 1986), Whittle estimates (Fox & Taqqu, 1986; Giraitis & Surgailis, 1990), Whittle estimates with autoregressive truncation (Beran, 1994a), approximate estimates based on the Durbin–Levinson algorithm (Haslett & Raftery, 1989), state-space-based maximum likelihood estimates for ARFIMA models (Chan & Palma, 1998), and estimation of stochastic volatility models (Ghysels, Harvey, & Renault, 1996; Breidt, Crato, & de Lima, 1998; Chan & Petris, 2000) among others. Given the diversified applications of these techniques in different areas, this review aims at providing a succinct survey of these methodologies as well as an overview of important related problems such as the ML estimation with missing data (Palma & Chan, 1997), influence of subsets of observations on estimates and the estimation of seasonal long-memory models (Palma & Chan, 2005). Performances and asymptotic properties of these techniques are compared and examined. Inter-connections and finite sample performances among these procedures are studied. Finally, applications to financial time series of these methodologies are discussed.
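As a concrete illustration of one of the likelihood-based techniques surveyed here, the short Python sketch below implements a Whittle estimator for the memory parameter d of an ARFIMA(0, d, 0) process by matching the periodogram to the model spectral density at the Fourier frequencies. The function names and the simulation check are illustrative assumptions, not code from the chapter.

```python
# Minimal sketch of Whittle estimation of d for an ARFIMA(0, d, 0) process.
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_d(x):
    """Estimate d by minimising the concentrated Whittle objective."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = (n - 1) // 2
    lam = 2 * np.pi * np.arange(1, m + 1) / n          # Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    I = np.abs(dft) ** 2 / (2 * np.pi * n)             # periodogram

    def objective(d):
        g = np.abs(2 * np.sin(lam / 2)) ** (-2 * d)    # ARFIMA(0,d,0) spectral shape
        return np.log(np.mean(I / g)) + np.mean(np.log(g))

    return minimize_scalar(objective, bounds=(-0.49, 0.49), method="bounded").x

# quick check on simulated (truncated) fractional noise with d = 0.3
rng = np.random.default_rng(0)
n, d_true = 4000, 0.3
k = np.arange(1, n)
psi = np.cumprod(np.r_[1.0, (k - 1 + d_true) / k])     # MA weights of (1-B)^(-d)
x = np.convolve(rng.standard_normal(n), psi)[:n]
print("Whittle estimate of d:", round(whittle_d(x), 3))
```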

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
ISBN: 978-1-84950-388-4

Book part
Publication date: 13 December 2013

Bertrand Candelon, Elena-Ivona Dumitrescu, Christophe Hurlin and Franz C. Palm

In this article we propose a multivariate dynamic probit model. Our model can be viewed as a nonlinear VAR model for the latent variables associated with correlated binary…

Abstract

In this article we propose a multivariate dynamic probit model. Our model can be viewed as a nonlinear VAR model for the latent variables associated with correlated binary time-series data. To estimate it, we implement an exact maximum likelihood approach, hence providing a solution to the problem generally encountered in the formulation of multivariate probit models. Our framework allows us to study the predictive relationships among the binary processes under analysis. Finally, an empirical study of three financial crises is conducted.
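As a hedged sketch of why an exact likelihood is tractable here, the Python fragment below evaluates the likelihood of a bivariate dynamic probit in which the dynamics enter through the lagged observed indicators, so each observation contributes a bivariate normal orthant probability. This is a deliberate simplification of the latent-variable VAR described in the abstract, and the parameter names (c, A, rho) are assumptions for illustration.

```python
# Exact log-likelihood of a simplified bivariate dynamic probit.
import numpy as np
from scipy.stats import multivariate_normal

def loglik(params, y):
    """y: (T, 2) array of 0/1 outcomes."""
    c = params[0:2]                       # intercepts
    A = params[2:6].reshape(2, 2)         # effect of y_{t-1} on the latent indices
    rho = np.tanh(params[6])              # keep the error correlation in (-1, 1)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    ll = 0.0
    for t in range(1, len(y)):
        mu = c + A @ y[t - 1]             # latent means given the past
        s = 2 * y[t] - 1                  # +1 / -1 signs of the outcomes
        # P(y_t | y_{t-1}) is a bivariate normal orthant probability
        ll += np.log(multivariate_normal.cdf(s * mu, mean=[0, 0],
                                             cov=np.outer(s, s) * cov))
    return ll

# Fit by maximising loglik, e.g. with scipy.optimize.minimize on its negative.
```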

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Keywords

Book part
Publication date: 30 August 2019

Timothy Cogley and Richard Startz

Standard estimation of ARMA models in which the AR and MA roots nearly cancel, so that individual coefficients are only weakly identified, often produces inferential ranges for…

Abstract

Standard estimation of ARMA models in which the AR and MA roots nearly cancel, so that individual coefficients are only weakly identified, often produces inferential ranges for individual coefficients that give a spurious appearance of accuracy. We remedy this problem with a model that uses a simple mixture prior. The posterior mixing probability is derived using Bayesian methods, but we show that the method works well in both Bayesian and frequentist setups. In particular, we show that our mixture procedure weights standard results heavily when given data from a well-identified ARMA model (which does not exhibit near root cancellation) and weights heavily an uninformative inferential region when given data from a weakly-identified ARMA model (with near root cancellation). When our procedure is applied to a well-identified process the investigator gets the “usual results,” so there is no important statistical cost to using our procedure. On the other hand, when our procedure is applied to a weakly identified process, the investigator learns that the data tell us little about the parameters – and is thus protected against making spurious inferences. We recommend that mixture models be computed routinely when inference about ARMA coefficients is of interest.
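To make the mixture idea concrete, the rough Python sketch below weighs the usual ARMA(1,1) results against a white-noise model (the limit in which the AR and MA roots cancel exactly) using a posterior model probability, with marginal likelihoods crudely approximated by BIC. This is not the paper's procedure, which uses a proper mixture prior; it only shows a posterior mixing weight shrinking when the data come from the weakly identified case.

```python
# BIC-based posterior weight on the "usual" ARMA(1,1) results.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def mixing_probability(x, prior_arma=0.5):
    bic_arma = ARIMA(x, order=(1, 0, 1)).fit().bic
    bic_wn = ARIMA(x, order=(0, 0, 0)).fit().bic
    # exp(-BIC/2) approximates the marginal likelihood up to a constant
    w = np.exp(-0.5 * np.array([bic_arma, bic_wn]))
    w *= np.array([prior_arma, 1 - prior_arma])
    return w[0] / w.sum()                 # posterior weight on ARMA(1,1)

rng = np.random.default_rng(1)
x_wn = rng.standard_normal(300)           # white noise: exact root cancellation
print("weight on ARMA(1,1):", round(mixing_probability(x_wn), 2))
```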

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part A
Type: Book
ISBN: 978-1-78973-241-2

Keywords

Article
Publication date: 31 July 2009

Donald E. Hutto, Thomas Mazzuchi and Shahram Sarkani

The purpose of this paper is to provide maintenance personnel with a methodology for using masked field reliability data to determine the probability of each subassembly failure.


Abstract

Purpose

The purpose of this paper is to provide maintenance personnel with a methodology for using masked field reliability data to determine the probability of each subassembly failure.

Design/methodology/approach

The paper compares an iterative maximum likelihood estimation method and a Bayesian methodology for handling masked data collected from 227 identical radar power supplies. The power supply consists of several subassemblies, hereafter referred to as shop replaceable assemblies (SRAs).

Findings

The study examined two approaches for dealing with masking: an iterative maximum likelihood estimation procedure, IMLEP, and a Bayesian approach implemented with the application WinBUGS. It indicates that IMLEP and WinBUGS perform similarly in estimating the parameters of the SRA distributions when there is no masking, and that they also provide similar results under masking. However, the study indicates that WinBUGS may perform better than IMLEP when the competing risk responsible for a failure accounts for a smaller percentage of the total failures. Further study, expanding the number of SRAs into which the item under study is organized, is required to confirm this conclusion.

Research limitations/implications

If an item is composed of various subassemblies and the failure of the first subassembly causes the item to fail, then the item is referred to in the literature as a series system. If the subassembly failures are statistically independent, then the item can be represented by a competing risk model and the probability distributions of the subassemblies can be ascertained from the item's failure data. When the item's cause of failure is not known, the data are referred to in the literature as masked. Since competing risk theory requires both a cause of failure and a time of failure, any masked data must be addressed within the competing risk model.
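The minimal Python sketch below shows how masked causes enter a competing-risk likelihood for a series system, assuming, purely for illustration, exponential lifetimes for each SRA: a masked observation contributes the sum of the candidate hazards times the system survival. Neither IMLEP nor the WinBUGS model from the paper is reproduced here, and the data are made up.

```python
# Maximum likelihood for exponential competing risks with masked causes.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(log_rates, times, cause_sets):
    """times[i]: failure time; cause_sets[i]: set of SRAs that may have failed."""
    lam = np.exp(log_rates)               # failure rate of each SRA
    total = lam.sum()
    ll = 0.0
    for t, S in zip(times, cause_sets):
        # density of the series system failing at t with the true cause in S:
        # (sum of the candidate hazards) * P(no SRA has failed before t)
        ll += np.log(lam[list(S)].sum()) - total * t
    return -ll

# tiny illustrative data set: 3 SRAs, some failures partially or fully masked
times = np.array([120.0, 350.0, 80.0, 500.0, 210.0])
cause_sets = [{0}, {1}, {0, 2}, {2}, {0, 1, 2}]
fit = minimize(neg_loglik, x0=np.zeros(3), args=(times, cause_sets))
print("estimated failure rates per SRA:", np.round(np.exp(fit.x), 4))
```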

Practical implications

This study indicates that competing risk theory can be applied to the equipment field failure data to determine a SRA's probability of failure and thereby provide an efficient sequence of replacing suspect failed SRAs.

Originality/value

The analysis of masked failure data is an important area that has received only limited study in the literature because of the limited availability of failure data. This paper contributes to the research by providing the complete historical equipment usage data for the item under study, gathered over a period of approximately seven years.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 7
Type: Research Article
ISSN: 0265-671X

Keywords

Details

Functional Structure and Approximation in Econometrics
Type: Book
ISBN: 978-0-44450-861-4

Details

Economics, Econometrics and the LINK: Essays in Honor of Lawrence R. Klein
Type: Book
ISBN: 978-0-44481-787-7

Article
Publication date: 1 January 2001

S.L. Byers and K. Ben Nowman

Refers to previous research on the empirical testing of continuous-time, two-factor short-term interest rate models by Chan, Karolyi, Longstaff and Sanders (1992), Vasicek (1977) and…

Abstract

Refers to previous research on the empirical testing of continuous-time, two-factor short-term interest rate models by Chan, Karolyi, Longstaff and Sanders (1992), Vasicek (1977) and Cox, Ingersoll and Ross (1985); and the Nowman (1997, 2000) Gaussian estimation approach. Applies these ideas to monthly interbank and Euro-currency data for a variety of periods and currencies to compare the explanatory/forecasting power of each model with the unrestricted model. Presents the results, which show that volatility levels and forecasting performance vary across the models and markets tested.
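As a brief, hedged illustration of the unrestricted model in this literature, the Python sketch below estimates the CKLS specification dr = (alpha + beta*r) dt + sigma*r^gamma dW by quasi-maximum likelihood on a simple Euler discretisation of monthly data; the Vasicek (gamma = 0) and CIR (gamma = 1/2) models are nested as restrictions. Nowman's Gaussian estimation works with the exact conditional mean instead, so this is only a simplified stand-in.

```python
# Quasi-ML for the Euler-discretised CKLS short-rate model.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, r, dt=1.0 / 12.0):
    alpha, beta, log_sigma, gamma = params
    sigma = np.exp(log_sigma)
    mean = r[:-1] + (alpha + beta * r[:-1]) * dt          # conditional mean
    var = sigma ** 2 * r[:-1] ** (2 * gamma) * dt         # conditional variance
    resid = r[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid ** 2 / var)

# usage with a monthly short-rate series r (a numpy array of positive rates):
# fit = minimize(neg_loglik, x0=[0.01, -0.1, np.log(0.05), 0.5], args=(r,))
# alpha, beta, gamma = fit.x[0], fit.x[1], fit.x[3]
```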

Details

Managerial Finance, vol. 27 no. 1/2
Type: Research Article
ISSN: 0307-4358

Keywords

Article
Publication date: 22 February 2013

Jianfeng Zhang and Wenxiu Hu

The purpose of this paper is to examine whether realized volatility can provide additional information on the volatility process to the GARCH and EGARCH model, based on the data…

Abstract

Purpose

The purpose of this paper is to examine whether realized volatility can provide additional information on the volatility process to the GARCH and EGARCH model, based on the data of Chinese stock market.

Design/methodology/approach

The realized volatility is defined as the squared overnight return, plus the squared close-to-open return over the break between the morning and afternoon sessions, plus the sum of the squared f-minute returns within trading hours on the relevant trading day. The methodology is a GARCH (EGARCH) model with added explanatory variables in the variance equation. The estimation method is exact maximum likelihood, using the BHHH algorithm for optimization.
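The short Python sketch below assembles a realised-volatility measure along the lines just described: the squared overnight return, plus the squared return over the break between the morning and afternoon sessions, plus the sum of squared f-minute returns within trading hours. The array layout, the helper name, and the choice f = 30 minutes are assumptions for illustration only.

```python
# Realised variance from intraday prices of a two-session trading day.
import numpy as np

def realized_variance(prev_close, morning, afternoon, f_minutes=30, bar_minutes=1):
    """morning / afternoon: arrays of intraday prices sampled every bar_minutes."""
    step = f_minutes // bar_minutes
    logp_m, logp_a = np.log(morning), np.log(afternoon)
    overnight = (logp_m[0] - np.log(prev_close)) ** 2      # squared overnight return
    lunch_break = (logp_a[0] - logp_m[-1]) ** 2            # squared mid-day break return
    intraday = 0.0
    for session in (logp_m, logp_a):
        r = np.diff(session[::step])                       # f-minute log-returns
        intraday += np.sum(r ** 2)
    return overnight + lunch_break + intraday
```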

Findings

There are some stocks for which realized volatility measures add information to the volatility process, but there are still quite a number of stocks for which they do not contain any additional information. The 30-minute realized volatility measures outperform measures constructed on other time intervals. Firm size, turnover rate, and amplitude also partially explain the difference in realized volatility's explanatory power across firms.

Research limitations/implications

When analyzing the factors that determine the role of realized volatility, ownership structure and ultimate ownership are not taken into account because of the difficulty of obtaining the data; only the turnover ratio, amplitude, and size are considered.

Originality/value

This study is the first to extend this line of inquiry on realized volatility to the emerging Chinese stock market. Given the unique institutional setting in China, the results are of practical relevance for warrant pricing by domestic investors in the Chinese market.

Details

International Journal of Managerial Finance, vol. 9 no. 1
Type: Research Article
ISSN: 1743-9132

Keywords

Book part
Publication date: 29 March 2006

Borus Jungbacker and Siem Jan Koopman

In this chapter, we aim to measure the actual volatility within a model-based framework using high-frequency data. In the empirical finance literature, it is widely discussed that…

Abstract

In this chapter, we aim to measure the actual volatility within a model-based framework using high-frequency data. In the empirical finance literature, it is widely discussed that tick-by-tick prices are subject to market micro-structure effects such as bid-ask bounces and trade information. These market micro-structure effects become more and more apparent as prices or returns are sampled at smaller and smaller time intervals. An increasingly popular measure for the variability of spot prices on a particular day is realised volatility that is typically defined as the sum of squared intra-daily log-returns. Recent theoretical results have shown that realised volatility is a consistent estimator of actual volatility, but when it is subject to micro-structure noise and the sampling frequency increases, the estimator diverges. Parametric and nonparametric methods can be adopted to account for the micro-structure bias. Here, we measure actual volatility using a model that takes account of micro-structure noise together with intra-daily volatility patterns and stochastic volatility. The coefficients of this model are estimated by maximum likelihood methods that are based on importance sampling techniques. It is shown that such Monte Carlo techniques can be employed successfully for our purposes in a feasible way. As far as we know, this is a first attempt to model the basic components of the mean and variance of high-frequency prices simultaneously. An illustration is given for three months of tick-by-tick transaction prices of the IBM stock traded at the New York Stock Exchange.
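The small simulation below, a hedged illustration rather than the chapter's method, makes the divergence point concrete: adding i.i.d. micro-structure noise to an efficient log-price inflates realised volatility as the sampling interval shrinks, since each squared return picks up roughly twice the noise variance. All numbers are invented.

```python
# Realised volatility under i.i.d. micro-structure noise at rising frequency.
import numpy as np

rng = np.random.default_rng(0)
n = 23400                                  # one trading day on a 1-second grid
true_daily_var = 0.0001
efficient = np.cumsum(rng.normal(0.0, np.sqrt(true_daily_var / n), n))
observed = efficient + rng.normal(0.0, 0.0005, n)      # bid-ask-type noise

for seconds in (1800, 300, 60, 5, 1):                  # coarser -> finer sampling
    r = np.diff(observed[::seconds])
    print(f"{seconds:>5}s sampling: RV = {np.sum(r**2):.5f}  (true var = {true_daily_var})")
```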

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
ISBN: 978-0-76231-274-0

Book part
Publication date: 18 November 2014

Ted D. Englebrecht, Xiaoyan Chu and Yingxu Kuang

Dissatisfaction with the current federal tax system is fostering serious interest in several tax reform plans such as a value-added tax (VAT), a flat tax, and a national retail…

Abstract

Dissatisfaction with the current federal tax system is fostering serious interest in several tax reform plans such as a value-added tax (VAT), a flat tax, and a national retail sales tax. Recently, one of the former Republican presidential candidates, Herman Cain, initiated a 999 tax plan. As illustrated on Cain’s official website, the 999 plan intends to replace current federal taxes with a 9% business flat tax, a 9% individual flat tax, and a 9% national sales tax. We examine the distributional effects of the 999 tax plan, as well as the current system it intends to replace, under both annual income and lifetime income approaches. Global measures of progressivity and bootstrap-t confidence intervals suggest that the current federal tax system is progressive while Cain’s 999 tax plan is regressive under the annual income approach. Under the lifetime income approach, both the current federal tax system and Cain’s 999 tax plan show progressivity. However, the current federal tax system is more progressive. The findings in this study suggest that Cain’s 999 tax plan should be considered more seriously and further analysis of the 999 tax plan is warranted.
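As a loosely related illustration (the chapter does not say which global progressivity measures it uses), the Python sketch below computes the Suits index for a stylised 9% consumption tax on made-up data, with a simple percentile bootstrap rather than the chapter's bootstrap-t intervals; a negative index indicates regressivity under the annual income approach.

```python
# Suits index of tax progressivity with a rough bootstrap interval.
import numpy as np

def suits_index(income, tax):
    order = np.argsort(income)
    y = np.r_[0.0, np.cumsum(income[order]) / income.sum()]   # accumulated income share
    t = np.r_[0.0, np.cumsum(tax[order]) / tax.sum()]         # accumulated tax share
    area = np.sum((t[1:] + t[:-1]) / 2 * np.diff(y))          # trapezoid rule
    return 1.0 - area / 0.5                 # > 0 progressive, < 0 regressive

rng = np.random.default_rng(2)
income = np.sort(rng.lognormal(10.0, 0.8, 5000))              # toy household incomes
consume_share = np.linspace(0.95, 0.60, income.size)          # spending share falls with income
tax = 0.09 * consume_share * income                           # stylised 9% consumption tax

boot = []
for _ in range(200):
    idx = rng.integers(0, income.size, income.size)
    boot.append(suits_index(income[idx], tax[idx]))
print("Suits index:", round(suits_index(income, tax), 3),
      "95% CI ~", np.round(np.percentile(boot, [2.5, 97.5]), 3))
```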

Details

Advances in Taxation
Type: Book
ISBN: 978-1-78441-120-6

Keywords
