Search results

1 – 10 of 170
Book part
Publication date: 1 December 2016

Roman Liesenfeld, Jean-François Richard and Jan Vogler

We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and…

Abstract

We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and non-Gaussian response variables. The class of models under consideration includes specifications for discrete choices, event counts and limited-dependent variables (truncation, censoring, and sample selection) among others. Our algorithm relies upon a novel implementation of efficient importance sampling (EIS) specifically designed to exploit typical sparsity of high-dimensional spatial precision (or covariance) matrices. It is numerically very accurate and computationally feasible even for very high-dimensional latent processes. Thus, maximum likelihood (ML) estimation of high-dimensional non-Gaussian spatial models, hitherto considered to be computationally prohibitive, becomes feasible. We illustrate our approach with ML estimation of a spatial probit for US presidential voting decisions and spatial count data models (Poisson and Negbin) for firm location choices.
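For intuition, the likelihood being evaluated is the high-dimensional integral L(θ) = ∫ ∏_i p(y_i | z_i) p(z) dz over the latent Gaussian field z. The sketch below estimates this integral for a spatial Poisson count model by naive Monte Carlo over prior draws; it is not the authors' EIS algorithm, which replaces these prior draws with an optimized sparse Gaussian importance sampler, and all parameter values are hypothetical.

```python
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)

# SAR-type precision on a small lattice: Q = (I - rho*W)'(I - rho*W) is sparse
m = 15
n = m * m                                    # 225 latent sites
W = sp.lil_matrix((n, n))
for i in range(m):
    for j in range(m):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ii, jj = i + di, j + dj
            if 0 <= ii < m and 0 <= jj < m:
                W[i * m + j, ii * m + jj] = 0.25   # rook-contiguity weights
A = sp.identity(n) - 0.6 * W.tocsr()         # hypothetical rho = 0.6
Q = (A.T @ A).toarray()                      # densified here for clarity only;
                                             # the paper works with sparse Q

# Simulate counts driven by a latent Gaussian field z ~ N(0, Q^{-1})
L = np.linalg.cholesky(Q)
z_true = np.linalg.solve(L.T, rng.standard_normal(n))
y = rng.poisson(np.exp(1.0 + z_true))

# Naive Monte Carlo estimate of L(theta) = E_z[ prod_i p(y_i | z_i) ]
S = 500
logw = np.empty(S)
for s in range(S):
    z = np.linalg.solve(L.T, rng.standard_normal(n))
    lam = np.exp(1.0 + z)
    logw[s] = np.sum(y * np.log(lam) - lam)  # Poisson log-pmf up to log(y!)
loglik = logw.max() + np.log(np.mean(np.exp(logw - logw.max())))
print(f"naive MC log-likelihood (up to constants): {loglik:.1f}")
```

The naive sampler's variance grows rapidly with the dimension of z, which is exactly the problem the paper's EIS construction addresses.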

Details

Spatial Econometrics: Qualitative and Limited Dependent Variables
Type: Book
ISBN: 978-1-78560-986-2


Article
Publication date: 11 September 2017

Seon-Hi Shin, Charles L. Slater and Steve Ortiz

The purpose of this paper is to examine what factors affect student achievement in reading and mathematics. The research questions addressed the perceptions of school principals…

Abstract

Purpose

The purpose of this paper is to examine what factors affect student achievement in reading and mathematics. The research questions addressed the perceptions of school principals and background characteristics related to student achievement in Korea and the USA with respect to differences among students in low, middle and high quantiles.

Design/methodology/approach

Data were taken from the Program for International Student Assessment 2012. Scores in reading and mathematics were analyzed in conjunction with a principal survey. The quantile regression method was used for data analysis at three quantile points, with t-statistics used to test for significance. The predictor set consisted of seven school-leadership variables and four to six additional covariates.
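As a rough illustration of this estimation strategy, the sketch below fits quantile regressions at low, middle and high quantile points and inspects the t-statistics; the data and variables are simulated stand-ins, not the PISA 2012 data or the paper's predictor set.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
# Simulated stand-ins for school-leadership predictors and covariates
X = rng.normal(size=(n, 3))
y = 500 + X @ np.array([-15.0, 8.0, 5.0]) + rng.normal(scale=40, size=n)
Xc = sm.add_constant(X)

# One fit per quantile point; coefficients may differ across the distribution
for q in (0.10, 0.50, 0.90):
    res = sm.QuantReg(y, Xc).fit(q=q)
    print(q, np.round(res.params, 2), np.round(res.tvalues, 2))
```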

Findings

The most important finding for the USA was a relationship between organizational hindrance (HND) and low student achievement for the middle and upper quantiles in mathematics and for all quantiles in reading. The HND variable included poor teacher-student relations, low expectations of students, overly strict enforcement of rules, lack of attention to student needs, resistance to change, lateness to class, and lack of preparation. The most important finding for Korea was that teacher attitude (TCHATT) was significantly associated with student reading achievement across all groups and with mathematics achievement in the low group.

Research limitations/implications

This study adds to knowledge about school capacity and suggests that the leadership role of the principal is to overcome negative environmental factors and create a positive organization.

Originality/value

The non-Gaussian approach to regression analysis allowed us to identify significant differences that we might otherwise not have found.

Details

International Journal of Educational Management, vol. 31 no. 7
Type: Research Article
ISSN: 0951-354X


Book part
Publication date: 11 August 2016

Kousik Guhathakurta, Basabi Bhattacharya and A. Roy Chowdhury

It has long been argued that the distributions of empirical returns do not follow the log-normal distribution upon which many celebrated results of finance are based, including…

Abstract

It has long been argued that the distributions of empirical returns do not follow the log-normal distribution upon which many celebrated results of finance are based, including the Black–Scholes option-pricing model. Borland (2002) succeeds in obtaining alternative closed-form solutions for European options based on the Tsallis distribution, which allows for statistical feedback as a model of the underlying stock returns. Motivated by this, we simulate two distinct time series based on initial data from NIFTY daily close values, one based on a Gaussian return distribution and the other on a non-Gaussian distribution. Using techniques of non-linear dynamics, we examine the underlying dynamic characteristics of both simulated time series and compare them with the characteristics of the actual data. Our findings give a definite edge to the non-Gaussian model over the Gaussian one.
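To make the simulation design concrete: for 1 < q < 3, a q-Gaussian of the Tsallis framework is a rescaled Student-t distribution, so the two return generators can be sketched as follows. The drift, volatility and q below are hypothetical placeholders, not values calibrated to NIFTY data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 2500                        # roughly ten years of daily returns
mu, sigma = 5e-4, 0.015         # hypothetical drift and volatility

# Gaussian benchmark returns
r_gauss = rng.normal(mu, sigma, size=n)

# Tsallis returns: for 1 < q < 3, a q-Gaussian is a rescaled Student-t
q = 1.4
nu = (3 - q) / (q - 1)          # implied degrees of freedom (here nu = 4)
t_draws = stats.t.rvs(df=nu, size=n, random_state=rng)
r_tsallis = mu + sigma * t_draws * np.sqrt((nu - 2) / nu)   # unit-variance scaling

# The non-Gaussian series produces far more 4-sigma moves
for name, r in (("Gaussian", r_gauss), ("q-Gaussian", r_tsallis)):
    z = (r - r.mean()) / r.std()
    print(name, np.mean(np.abs(z) > 4))
```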

Details

The Spread of Financial Sophistication through Emerging Markets Worldwide
Type: Book
ISBN: 978-1-78635-155-5


Book part
Publication date: 19 November 2012

Naceur Naguez and Jean-Luc Prigent

Purpose – The purpose of this chapter is to estimate non-Gaussian distributions by means of Johnson distributions. An empirical illustration on hedge fund returns is…

Abstract

Purpose – The purpose of this chapter is to estimate non-Gaussian distributions by means of Johnson distributions. An empirical illustration on hedge fund returns is detailed.

Methodology/approach – To fit non-Gaussian distributions, the chapter introduces the family of Johnson distributions and its general extensions. We use both parametric and non-parametric approaches. As a first step, we analyze the serial correlation in our sample of hedge fund returns and unsmooth the series to correct for it. Then, we estimate the distribution with the standard Johnson system of laws. Finally, we search for a more general distribution of Johnson type, using a non-parametric approach.
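As a sketch of the parametric step, the S_U member of the Johnson system can be fitted by maximum likelihood with SciPy; the series below is a simulated heavy-tailed stand-in, not the hedge fund index data used in the chapter.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated heavy-tailed stand-in for an unsmoothed hedge-fund return series
returns = stats.t.rvs(df=4, loc=0.005, scale=0.02, size=600, random_state=rng)

# Fit the Johnson S_U distribution by maximum likelihood
a, b, loc, scale = stats.johnsonsu.fit(returns)
print(f"gamma={a:.3f}, delta={b:.3f}, xi={loc:.4f}, lambda={scale:.4f}")

# Kolmogorov-Smirnov comparison: S_U fit versus a Gaussian benchmark
print(stats.kstest(returns, "johnsonsu", args=(a, b, loc, scale)).pvalue)
print(stats.kstest(returns, "norm", args=(returns.mean(), returns.std())).pvalue)
```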

Findings – We use data from the indexes Credit Suisse/Tremont Hedge Fund (CSFB/Tremont) provided by Credit Suisse. For the parametric approach, we find that the SU Johnson distribution is the most appropriate, except for the Managed Futures. For the non-parametric approach, we determine the best polynomial approximation of the function characterizing the transformation from the initial Gaussian law to the generalized Johnson distribution.

Originality/value of chapter – These findings are novel since we use an extension of the Johnson distributions to better fit non-Gaussian distributions, in particular in the case of hedge fund returns. We illustrate the power of this methodology, which can be further developed in the multidimensional case.

Details

Recent Developments in Alternative Finance: Empirical Assessments and Economic Implications
Type: Book
ISBN: 978-1-78190-399-5


Article
Publication date: 24 September 2020

Diego Ferreira, Andreza Aparecida Palma and Marcos Minoru Hasegawa

This paper analyzes the potential presence of time-varying asymmetries in the preference parameters of the Central Bank of Brazil during the inflation targeting regime.

Abstract

Purpose

This paper analyzes the potential presence of time-varying asymmetries in the preference parameters of the Central Bank of Brazil during the inflation targeting regime.

Design/methodology/approach

Given the econometric issues inherent in classical time-varying parameter (TVP) regressions, a Bayesian estimation procedure is implemented in order to provide more robust parameter estimates. A stochastic volatility specification is also included to account for potential conditional heteroskedasticity.
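For reference, one common specification of a TVP regression with stochastic volatility (in the spirit of Nakajima, 2011; the exact model estimated here may differ) is the state-space system

$$
\begin{aligned}
y_t &= x_t'\beta_t + \exp(h_t/2)\,\varepsilon_t, & \varepsilon_t &\sim \mathcal{N}(0,1),\\
\beta_t &= \beta_{t-1} + u_t, & u_t &\sim \mathcal{N}(0,\Sigma_\beta),\\
h_t &= h_{t-1} + \eta_t, & \eta_t &\sim \mathcal{N}(0,\sigma_\eta^2),
\end{aligned}
$$

where the latent log-volatility $h_t$ enters non-linearly; this is what makes the model non-Gaussian in the states and motivates MCMC rather than plain Kalman-filter inference.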

Findings

The results show that the reduced-form and structural parameters were not constant over the period considered. Moreover, the subsequent analysis of the preference parameters provided evidence of short periods in which asymmetry was an important feature of the conduct of monetary policy in Brazil. During most of the sample period, however, the loss function was found to be symmetric.

Originality/value

This paper aims to contribute to the rather scarce monetary debate on time-varying central bank preferences. The study of Lopes and Aragón (2014) is, to the best of the authors' knowledge, the only study for Brazil specifically considering TVPs. The authors applied Kalman filter estimation to data from 2000:M1 to 2011:M12. Despite the similar TVP structure, the present paper extends the latter study by controlling for stochastic volatility. Ignoring conditional heteroskedasticity might lead to spurious movements in time-varying variables and inaccurate inference (Hamilton, 2010). Thus, the stochastic volatility specification is included to take this issue into account. The authors follow the theoretical scheme put forward by Surico (2007) and Aragón and Portugal (2010), in which the economy is modeled from a New Keynesian perspective and the central bank loss function is assumed to be asymmetric in its responses to inflation and output deviations from their targets. On the empirical side, the authors propose a TVP univariate regression with stochastic volatility for the Brazilian reduced-form reaction function, following closely the Bayesian econometric procedure developed by Nakajima (2011). Given the nonlinear, non-Gaussian nature of the TVP regression with stochastic volatility, the choice of a nonlinear Bayesian approach using the Markov chain Monte Carlo (MCMC) method is justified by the intractability of the associated likelihood function (Primiceri, 2005). Finally, based on the theoretical model specification, the authors recover the central bank preference parameters so as to further evaluate the degree of asymmetry and its potential time variation under the inflation targeting regime.

Details

Journal of Economic Studies, vol. 48 no. 4
Type: Research Article
ISSN: 0144-3585


Book part
Publication date: 18 April 2018

Mohammed Quddus

Purpose – Time-series regression models are applied to analyse transport safety data for three purposes: (1) to develop a relationship between transport accidents (or incidents…

Abstract

Purpose – Time-series regression models are applied to analyse transport safety data for three purposes: (1) to develop a relationship between transport accidents (or incidents) and various time-varying factors, with the aim of identifying the most important ones; (2) to develop a time-series accident model for forecasting future accidents given the values of future time-varying factors and (3) to evaluate the impact of a system-wide policy, education or engineering intervention on accident counts. Regression models for analysing transport safety data are well established, especially for cross-sectional and panel datasets. There is, however, a dearth of research on time-series regression models in the transport safety literature. The purpose of this chapter is to examine the existing literature, in safety analysis and in wider applications, with the aim of identifying time-series regression models that are applicable to analysing disaggregated accident counts.

Methodology/Approach – There are two main issues in modelling time-series accident counts: (1) flexibly addressing the serial autocorrelation inherent in time-series processes of accident counts and (2) the fact that the conditional distribution of accident counts (conditioned on past observations and covariates) follows a Poisson-type distribution. Various time-series regression models are explored to identify those most suitable for analysing disaggregated time-series accident datasets. A recently developed time-series regression model – the generalised linear autoregressive moving average (GLARMA) model – is identified as the best suited to analysing safety data.
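A minimal sketch of the GLARMA mechanism, shown as a simulated GLARMA(1,0) Poisson process; the covariates and parameter values are hypothetical, not the airprox application described in the findings.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 240                                    # e.g. 20 years of monthly counts
x = np.column_stack([np.ones(n),           # intercept
                     np.sin(2 * np.pi * np.arange(n) / 12)])  # seasonality
beta = np.array([1.5, 0.3])                # hypothetical coefficients
phi = 0.5                                  # GLARMA(1,0) dependence parameter

y = np.zeros(n, dtype=int)
Z = np.zeros(n)                            # serially dependent component
e = np.zeros(n)                            # Pearson residuals driving the recursion
for t in range(n):
    if t > 0:
        Z[t] = phi * (Z[t - 1] + e[t - 1])
    mu = np.exp(x[t] @ beta + Z[t])        # conditional Poisson mean
    y[t] = rng.poisson(mu)
    e[t] = (y[t] - mu) / np.sqrt(mu)
```

Estimation maximizes the Poisson likelihood implied by the same recursion. In a log-linear specification with log ATMs as a covariate, a coefficient of about 1.83 would read directly as the elasticity reported in the findings.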

Findings – The GLARMA model was applied to a time-series dataset of airproxes (aircraft proximity events), which indicate airspace safety in the United Kingdom. The aim was to evaluate the impact of an airspace intervention (the introduction of reduced vertical separation minima, RVSM) on airspace safety while controlling for other factors, such as air transport movements (ATMs) and seasonality. The results indicate that the GLARMA model is more appropriate than a generalised linear model (e.g., Poisson or Poisson-Gamma), and that the introduction of RVSM reduced airprox events by 15%. In addition, a 1% increase in ATMs within UK airspace was found to lead to a 1.83% increase in monthly airproxes.

Practical applications – The methodology developed in this chapter is applicable to many time-series processes of accident counts. The models recommended in this chapter could be used to identify different time-varying factors and to evaluate the effectiveness of various policy and engineering interventions on transport safety or similar data (e.g., crimes).

Originality/value of paper – The GLARMA model has not previously been fully explored for modelling time-series safety data. This new class of model is applied here to a dataset to evaluate the effectiveness of an intervention. The model recommended in this chapter would greatly benefit researchers and analysts working with time-series data.

Details

Safe Mobility: Challenges, Methodology and Solutions
Type: Book
ISBN: 978-1-78635-223-1


Article
Publication date: 12 October 2012

Yaoqi Guo, Jianbo Huang and Hui Cheng

Recently, many scholars have been paying more attention to studying the existence and application of multifractality. However, most research concentrates on the multifractal…

Abstract

Purpose

Recently, many scholars have been paying more attention to studying the existence and application of multifractality. However, most research concentrates on the multifractal features of returns or volume separately, ignoring the correlation between them. The purpose of this paper, therefore, is to test empirically the multifractal features of the price-volume correlation in the Chinese metal futures market and then to conduct a comparative analysis along both the time and space dimensions, in order to better understand metal futures market behavior.

Design/methodology/approach

This paper gives an empirical test by means of multifractal detrended cross‐correlation analysis (MF‐DCCA) approach, which is a technique employed in statistical physics to detect multifractal features of two cross‐correlated nonstationary time series.
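A compact sketch of the MF-DCCA computation, following the standard detrended cross-correlation recipe on simulated stand-in series (not the authors' data or code):

```python
import numpy as np

def mfdcca(x, y, scales, qs, order=1):
    """Return generalized cross-correlation exponents h_xy(q) as the
    log-log slopes of the fluctuation functions F_xy(q, s) against s."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())  # profiles
    F = np.zeros((len(qs), len(scales)))
    for si, s in enumerate(scales):
        f2 = []
        t = np.arange(s)
        for v in range(len(X) // s):
            xi, yi = X[v*s:(v+1)*s], Y[v*s:(v+1)*s]
            rx = xi - np.polyval(np.polyfit(t, xi, order), t)  # local detrending
            ry = yi - np.polyval(np.polyfit(t, yi, order), t)
            f2.append(np.mean(np.abs(rx * ry)))
        f2 = np.asarray(f2)
        for qi, q in enumerate(qs):
            F[qi, si] = (np.exp(0.5 * np.mean(np.log(f2))) if q == 0
                         else np.mean(f2 ** (q / 2)) ** (1 / q))
    return np.array([np.polyfit(np.log(scales), np.log(F[qi]), 1)[0]
                     for qi in range(len(qs))])

# Two correlated heavy-tailed stand-ins for returns and volume changes
rng = np.random.default_rng(5)
z = rng.standard_t(df=4, size=4096)
x = z + 0.3 * rng.standard_normal(4096)
y = np.abs(z) + 0.3 * rng.standard_normal(4096)
scales = np.unique(np.logspace(4, 9, 12, base=2).astype(int))
print(mfdcca(x, y, scales, qs=[-4, -2, 0, 2, 4]))
```

An exponent h_xy(q) that varies with q signals multifractal cross-correlation; a flat h_xy(q) would indicate monofractality.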

Findings

Empirical results show that the price-volume correlation in the Chinese metal futures market is multifractal, and that long-range correlations and a non-Gaussian probability distribution are the main sources of this multifractality. A comparative analysis further finds that, although the Chinese metal futures market is becoming more and more effective, its effectiveness is still lower than that of the mature LME metal futures markets. The futures market thus needs further development.

Originality/value

The paper's conclusions would help to understand the nonlinear dependency relationship and potential dynamics mechanism in price‐volume correlation.

Article
Publication date: 1 August 1994

T.A. Spedding and P.L. Rawlings

Control charts and process capability calculations remain fundamental techniques for statistical process control. However, it has long been realized that the accuracy of these…


Abstract

Control charts and process capability calculations remain fundamental techniques for statistical process control. However, it has long been realized that the accuracy of these calculations can be significantly affected when sampling from a non-Gaussian population. Many quality practitioners are conscious of these problems but are not aware of the effects such problems might have on the integrity of their results. Considers non-normality with respect to the use of traditional control charts and process capability calculations, so that users may be aware of the errors involved when sampling from a non-Gaussian population. Use is made of the Johnson system of distributions as a simulation technique to investigate the effects of non-normality on control charts and process capability calculations. An alternative technique is suggested for process capability calculations, which alleviates the problems of non-normality while retaining computational efficiency.
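To illustrate the simulation idea, one can draw from a skewed, heavy-tailed Johnson S_U population and compare the out-of-spec rate implied by a normal-theory capability index with the actual rate; the specification limits and shape parameters below are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
LSL, USL = -3.0, 3.0                 # hypothetical specification limits
gamma, delta = 1.0, 1.5              # Johnson S_U shape parameters (skewed, heavy-tailed)
data = stats.johnsonsu.rvs(gamma, delta, size=200_000, random_state=rng)

# Normal-theory capability index versus the true out-of-spec rate
cpk = min(USL - data.mean(), data.mean() - LSL) / (3 * data.std())
implied = stats.norm.sf(3 * cpk)     # nearer-tail rate implied by Cpk under normality
actual = np.mean((data < LSL) | (data > USL))
print(f"Cpk = {cpk:.2f}; normality implies about {implied:.2e} out of spec")
print(f"actual out-of-spec rate for the Johnson population: {actual:.2e}")
```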

Details

International Journal of Quality & Reliability Management, vol. 11 no. 6
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 1 June 2000

C.A.N. Dias and J.R.D. Petreche

In marine structures, the long‐term non‐stationary response of flexible lines, due to random environmental loads, may be regarded as successive short‐term stationary processes in…

Abstract

In marine structures, the long-term non-stationary response of flexible lines, due to random environmental loads, may be regarded as successive short-term stationary processes in which current, wind and ocean wave conditions remain constant. The power spectrum of each stationary process can be characterized by its linear and non-linear energy components: the linear energy defines a Gaussian process, and the additional non-linear energy characterizes a non-Gaussian process. Within this scope, digital bispectral analysis has enabled the non-linear stationary response of flexible lines to be described in the frequency domain, so that the complex coefficients of a quadratic frequency-domain model can be estimated. The real, symmetric matrix constructed from these coefficients has eigenvalues and eigenvectors that describe the characteristic function of the response, from which the probability density function can be obtained using a fast Fourier transform algorithm. The foundations of the method presented here have previously been treated, in a similar but purely algebraic manner, to obtain in closed form the asymptotic probability function applicable to the response of non-linear systems.
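A toy sketch of the final inversion step, recovering a density from its characteristic function; plain quadrature is used here for transparency, whereas the paper evaluates the same integral with an FFT, and the Gaussian test case is illustrative only.

```python
import numpy as np

def pdf_from_cf(phi, x_grid, T=40.0, num=4001):
    # f(x) = (1 / 2*pi) * integral over t of exp(-i*t*x) * phi(t) dt,
    # approximated by a Riemann sum over a truncated t-grid
    t = np.linspace(-T, T, num)
    dt = t[1] - t[0]
    integrand = phi(t)[None, :] * np.exp(-1j * np.outer(x_grid, t))
    return (integrand.sum(axis=1) * dt).real / (2 * np.pi)

# Sanity check: the characteristic function of a standard Gaussian is exp(-t**2/2)
x = np.linspace(-4, 4, 81)
f = pdf_from_cf(lambda t: np.exp(-t**2 / 2), x)
exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print(f"max abs error: {np.max(np.abs(f - exact)):.2e}")
```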

Details

Engineering Computations, vol. 17 no. 4
Type: Research Article
ISSN: 0264-4401


Article
Publication date: 6 March 2007

Hongtao Guo, Guojun Wu and Zhijie Xiao

The purpose of this article is to estimate value at risk (VaR) using quantile regression and provide a risk analysis for defaultable bond portfolios.


Abstract

Purpose

The purpose of this article is to estimate value at risk (VaR) using quantile regression and provide a risk analysis for defaultable bond portfolios.

Design/methodology/approach

The method proposed is based on quantile regression, pioneered by Koenker and Bassett. The quantile regression approach allows for a general treatment of the error distribution and is robust to distributions with heavy tails.
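A minimal sketch of the approach on simulated data: regress heavy-tailed portfolio returns on the information variables at the 5% quantile and read off the conditional VaR. All series and coefficients below are hypothetical stand-ins, not the article's dataset.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 1500
# Hypothetical information variables and heavy-tailed bond-portfolio returns
short_rate = 0.02 + 0.01 * rng.standard_normal(n)
term_spread = 0.01 + 0.005 * rng.standard_normal(n)
ret = -0.5 * short_rate + 0.3 * term_spread + 0.02 * rng.standard_t(df=5, size=n)

# Koenker-Bassett quantile regression at the 5th percentile of returns
X = sm.add_constant(np.column_stack([short_rate, term_spread]))
res = sm.QuantReg(ret, X).fit(q=0.05)

# Conditional 95% VaR given today's covariates (sign flipped to a loss figure)
x_today = np.array([1.0, 0.025, 0.012])
print(f"95% VaR estimate: {-(x_today @ res.params):.4f}")
```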

Findings

This article provides a risk analysis for defaultable bond portfolios using the quantile regression method. In the proposed model, we use information variables such as short-term interest rates and term spreads as covariates to improve estimation accuracy. The study also finds that confidence intervals constructed around the estimated VaRs can be very wide under volatile market conditions, making the estimated VaRs less reliable precisely when their accurate measurement is most needed.

Originality/value

Provides a risk analysis for defaultable bonds using the quantile regression approach.

Details

The Journal of Risk Finance, vol. 8 no. 2
Type: Research Article
ISSN: 1526-5943

