Search results

1 – 10 of over 12,000
Article
Publication date: 16 October 2020

Julia S. Mehlitz and Benjamin R. Auer


Abstract

Purpose

Motivated by the growing importance of the expected shortfall in banking and finance, this study aims to compare the performance of popular non-parametric estimators of the expected shortfall (i.e. different variants of historical, outlier-adjusted and kernel methods) with each other, with selected parametric benchmarks and with estimates based on the idea of forecast combination.
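For orientation, a minimal sketch of what a plain historical expected-shortfall estimate and a crude kernel-smoothed variant compute, assuming simulated heavy-tailed returns; the function names, the Gaussian smoothing of the tail indicator and the bandwidth rule are illustrative choices, not the authors' estimators:

```python
import numpy as np
from scipy.stats import norm

def historical_es(returns, alpha=0.975):
    """Plain historical expected shortfall: average loss at or beyond the empirical VaR."""
    losses = -np.asarray(returns)                     # positive numbers are losses
    var = np.quantile(losses, alpha)                  # empirical value-at-risk at level alpha
    return losses[losses >= var].mean()

def smoothed_es(returns, alpha=0.975, bandwidth=None):
    """Crude kernel-smoothed variant: replace the hard tail indicator 1{loss >= VaR}
    with a Gaussian-cdf weight (illustrative only, not the paper's kernel estimators)."""
    losses = -np.asarray(returns)
    if bandwidth is None:                             # Silverman-style rule of thumb
        bandwidth = 1.06 * losses.std(ddof=1) * len(losses) ** (-1 / 5)
    var = np.quantile(losses, alpha)
    weights = norm.cdf((losses - var) / bandwidth)    # smooth "beyond VaR" indicator
    return np.sum(weights * losses) / np.sum(weights)

rng = np.random.default_rng(0)
sample = 0.01 * rng.standard_t(df=5, size=1_000)      # heavy-tailed daily returns
print(historical_es(sample), smoothed_es(sample))
```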

Design/methodology/approach

Within a multidimensional simulation setup (spanned by different distributional settings, sample sizes and confidence levels), the authors rank the estimators based on classic error measures, as well as an innovative performance profile technique, which the authors adapt from the mathematical programming literature.
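The performance-profile idea adapted from the mathematical programming literature (in the spirit of Dolan and Moré benchmarking profiles) can be sketched as follows; the error matrix, estimator count and thresholds are invented purely for illustration:

```python
import numpy as np

def performance_profile(errors, taus):
    """errors[i, j]: error of estimator j in simulation setting i.
    Returns rho[j, k]: share of settings in which estimator j is within a factor
    taus[k] of the best (smallest) error achieved in that setting."""
    errors = np.asarray(errors, dtype=float)
    ratios = errors / errors.min(axis=1, keepdims=True)   # performance ratios >= 1
    return np.array([[np.mean(ratios[:, j] <= t) for t in taus]
                     for j in range(errors.shape[1])])

# Invented error matrix: five settings (rows) x three estimators (columns)
errors = [[1.0, 1.2, 2.0],
          [0.8, 0.9, 1.5],
          [1.1, 1.0, 1.3],
          [2.0, 1.4, 1.2],
          [0.5, 0.7, 0.6]]
taus = np.linspace(1.0, 2.0, 11)
rho = performance_profile(errors, taus)
print(rho)        # row j traces estimator j's profile; higher curves are better
```

An estimator whose profile curve sits above the others over the whole range of thresholds would dominate; the abstract's point is that no such uniformly dominant estimator exists across settings.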

Findings

The rich set of results supports academics and practitioners in the search for an answer to the question of which estimators are preferable under which circumstances, because no single estimator or combination of estimators ranks first in all considered settings.

Originality/value

To the best of their knowledge, the authors are the first to provide a structured simulation-based comparison of non-parametric expected shortfall estimators, study the effects of estimator averaging and apply the mentioned profiling technique in risk management.

Details

The Journal of Risk Finance, vol. 21 no. 4
Type: Research Article
ISSN: 1526-5943


Book part
Publication date: 19 December 2012

Eric Hillebrand and Tae-Hwy Lee


Abstract

We examine the Stein-rule shrinkage estimator for possible improvements in estimation and forecasting when there are many predictors in a linear time series model. We consider the Stein-rule estimator of Hill and Judge (1987) that shrinks the unrestricted unbiased ordinary least squares (OLS) estimator toward a restricted biased principal component (PC) estimator. Since the Stein-rule estimator combines the OLS and PC estimators, it is a model-averaging estimator and produces a combined forecast. The conditions under which the improvement can be achieved depend on several unknown parameters that determine the degree of the Stein-rule shrinkage. We conduct Monte Carlo simulations to examine these parameter regions. The overall picture that emerges is that the Stein-rule shrinkage estimator can dominate both OLS and principal components estimators within an intermediate range of the signal-to-noise ratio. If the signal-to-noise ratio is low, the PC estimator is superior. If the signal-to-noise ratio is high, the OLS estimator is superior. In out-of-sample forecasting with AR(1) predictors, the Stein-rule shrinkage estimator can dominate both OLS and PC estimators when the predictors exhibit low persistence.
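A schematic sketch of shrinking an unrestricted OLS fit toward a restricted principal-component fit appears below; the fixed combination weight stands in for the data-driven Stein-rule shrinkage factor of Hill and Judge (1987), which the chapter studies but which is not reproduced here:

```python
import numpy as np

def pc_estimator(X, y, k):
    """Restricted estimator: regress y on the first k principal components of X,
    then map the coefficients back to the original (centred) predictor space."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T                                    # component scores
    gamma = np.linalg.lstsq(Z, y - y.mean(), rcond=None)[0]
    return Vt[:k].T @ gamma

def shrinkage_combination(X, y, k, weight):
    """Convex combination of unrestricted OLS and restricted PC coefficients.
    'weight' in [0, 1] plays the role of the Stein-rule shrinkage intensity."""
    Xc = X - X.mean(axis=0)
    beta_ols = np.linalg.lstsq(Xc, y - y.mean(), rcond=None)[0]
    return weight * pc_estimator(X, y, k) + (1 - weight) * beta_ols

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
beta_true = np.r_[np.ones(3), np.zeros(7)]
y = X @ beta_true + rng.normal(scale=2.0, size=200)
print(shrinkage_combination(X, y, k=3, weight=0.5))
```

Because the combined estimator is a weighted average of the two component estimators, it also produces a combined forecast, which is the sense in which the chapter treats it as a model-averaging estimator.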

Details

30th Anniversary Edition
Type: Book
ISBN: 978-1-78190-309-4


Book part
Publication date: 21 November 2014

Purevdorj Tuvaandorj and Victoria Zinde-Walsh


Abstract

We consider conditional distribution and conditional density functionals in the space of generalized functions. The approach follows Phillips (1985, 1991, 1995), who employed generalized functions to overcome non-differentiability in order to develop expansions. We obtain the limit of the kernel estimators for weakly dependent data, even under non-differentiability of the distribution function; the limit Gaussian process is characterized as a random functional (random generalized function) on a suitable function space. An alternative, simple-to-compute estimator based on the empirical distribution function is proposed for the generalized random functional. Limit properties are established for test statistics based on this estimator. A Monte Carlo experiment demonstrates good finite sample performance of the statistics for testing logit and probit specification in binary choice models.
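As a rough point of reference, a standard Nadaraya-Watson style kernel estimate of a conditional distribution function is sketched below; it illustrates only the classical smoothing step, not the generalized-function limit theory or the empirical-distribution-based estimator developed in the chapter:

```python
import numpy as np

def nw_conditional_cdf(x_data, y_data, x0, y0, bandwidth):
    """Nadaraya-Watson style estimate of F(y0 | x0) = P(Y <= y0 | X = x0):
    a kernel-weighted average of the indicators 1{Y_i <= y0}."""
    u = (np.asarray(x_data) - x0) / bandwidth
    w = np.exp(-0.5 * u ** 2)                         # Gaussian kernel weights in x
    ind = (np.asarray(y_data) <= y0).astype(float)
    return np.sum(w * ind) / np.sum(w)

rng = np.random.default_rng(2)
x = rng.uniform(-2, 2, size=2_000)
y = 0.5 * x + rng.normal(scale=1.0, size=2_000)       # Y | X = x is N(0.5 x, 1)
est = nw_conditional_cdf(x, y, x0=1.0, y0=0.5, bandwidth=0.3)
print(est)                                            # should be close to 0.5 for this design
```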

Details

Essays in Honor of Peter C. B. Phillips
Type: Book
ISBN: 978-1-78441-183-1


Article
Publication date: 9 November 2012

Octavio Andrés González‐Estrada, Juan José Ródenas, Stéphane Pierre Alain Bordas, Marc Duflot, Pierre Kerfriden and Eugenio Giner



Abstract

Purpose

The purpose of this paper is to assess how the statical admissibility of the recovered solution and its ability to represent the singular part of the solution affect the accuracy and the local and global effectivity of recovery-based error estimators for enriched finite element methods (e.g. the extended finite element method, XFEM).

Design/methodology/approach

The authors study the performance of two recovery techniques. The first is a recently developed superconvergent patch recovery procedure with equilibration and enrichment (SPR‐CX). The second is known as the extended moving least squares recovery (XMLS), which enriches the recovered solutions but does not enforce equilibrium constraints. Both are extended recovery techniques as the polynomial basis used in the recovery process is enriched with singular terms for a better description of the singular nature of the solution.

Findings

Numerical results comparing the convergence and the effectivity index of both techniques with those obtained without the enrichment enhancement clearly show the need for the use of extended recovery techniques in Zienkiewicz‐Zhu type error estimators for this class of problems. The results also reveal significant improvements in the effectivities yielded by statically admissible recovered solutions.
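For readers unfamiliar with the effectivity index used in such comparisons, it is simply the ratio of the estimated error norm to the exact (or reference) error norm, with values close to one indicating a reliable estimator; a minimal sketch with made-up element contributions:

```python
import numpy as np

def global_effectivity(estimated_sq_errors, exact_sq_errors):
    """Global effectivity index of a ZZ-type error estimator: ratio of the estimated
    to the exact error in the energy norm, built from element-wise squared contributions.
    Values near 1 mean the estimator tracks the true error well."""
    return np.sqrt(np.sum(estimated_sq_errors)) / np.sqrt(np.sum(exact_sq_errors))

# Made-up element-level squared error contributions, for illustration only
estimated = np.array([1.1e-3, 4.0e-4, 2.2e-3, 9.0e-4])
exact = np.array([1.0e-3, 5.0e-4, 2.0e-3, 1.0e-3])
print(global_effectivity(estimated, exact))
```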

Originality/value

The paper shows that both extended recovery procedures and statical admissibility are key to an accurate assessment of the quality of enriched finite element approximations.

Article
Publication date: 10 September 2018

Muneer Shaik and Maheswaran S.


Abstract

Purpose

The purpose of this paper is twofold: first, to propose a new robust volatility ratio (RVR) that compares the intraday high–low volatility estimator with the intraday open–close volatility estimator; and second, to empirically test the proposed RVR on the cross-sectional (CS) average of the constituent stocks of India’s BSE Sensex and the US’s Dow Jones Industrial Average index to find evidence of “excess volatility.”

Design/methodology/approach

The authors model the proposed RVR by assuming that the logarithm of the price process follows a Brownian motion. They show theoretically that the RVR is unbiased when the drift parameter is zero and that it is an even function of a non-zero drift parameter.
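The two ingredients of such a ratio can be illustrated with the classical Parkinson high–low estimator and the squared open-to-close return; the simple ratio and the simulated Brownian prices below are stand-ins, since the paper's exact RVR formula and its drift correction are not reproduced here:

```python
import numpy as np

def high_low_variance(high, low):
    """Parkinson (1980) range-based variance estimate per day."""
    return (np.log(high / low) ** 2) / (4.0 * np.log(2.0))

def open_close_variance(open_, close):
    """Squared open-to-close log return as a simple variance estimate per day."""
    return np.log(close / open_) ** 2

def volatility_ratio(high, low, open_, close):
    """Illustrative ratio of the two estimators averaged over the sample;
    values far from 1 would hint at 'excess volatility' in this toy setup."""
    return np.mean(high_low_variance(high, low)) / np.mean(open_close_variance(open_, close))

# Simulate intraday prices without drift, then record open/high/low/close per day
rng = np.random.default_rng(3)
n_days, steps = 250, 390
increments = rng.normal(scale=0.01 / np.sqrt(steps), size=(n_days, steps))
paths = 100.0 * np.exp(np.cumsum(increments, axis=1))
o, c = paths[:, 0], paths[:, -1]
h, l = paths.max(axis=1), paths.min(axis=1)
print(volatility_ratio(h, l, o, c))   # near 1 under the Brownian null, up to discretisation
```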

Findings

The empirical results based on the RVR support the existence of “excess volatility” in the CS average of the constituent stocks of India’s BSE Sensex and the US’s Dow Jones index. In particular, based on multiple k-day time window analysis, the CS average of the individual constituent stocks of the BSE Sensex is found to be more excessively volatile than that of the US’s Dow Jones index during the study period from January 2008 to September 2016.

Practical implications

The study has implications for policy makers and practitioners who would like to understand the volatility behavior of asset returns through the proposed RVR. In general, the proposed model can be used as a specification tool to determine whether stock prices follow random walk behavior or are excessively volatile.

Originality/value

The authors contribute to the existing volatility literature in finance by proposing a new RVR based on extreme values of asset prices and absolute returns. They implement the bootstrap technique on the RVR to obtain estimates of the mean and standard error for multiple k-day time windows. The RVR can capture excess volatility by comparing two independent volatility estimators. This is possibly the first study to examine the CS average of all the constituent stocks of the BSE Sensex based on the RVR.

Details

Journal of Economic Studies, vol. 45 no. 4
Type: Research Article
ISSN: 0144-3585


Book part
Publication date: 13 October 2009

Andreas Kleine and Regina Schlindwein


Abstract

DEA is a favored method for investigating the efficiency of institutions that provide educational services. We measure the efficiency of German universities, especially from the students’ perspective. Since 1998, the Centrum für Hochschulentwicklung (CHE) has evaluated German universities annually. The CHE ranking assigns universities to three ranking groups for different indicators but does not create a full hierarchy, so universities ranked in the same group cannot be differentiated. Based on the CHE data set, especially the surveys among students, we evaluate teaching performance from the students’ point of view using data envelopment analysis (DEA). DEA enables us to identify departments that are efficient from the students’ perspective, in the sense that they provide a high quality of education. For the performance evaluation, we apply a DEA bootstrap approach, which incorporates stochastic influences in the data and yields confidence intervals for the efficiency scores. Based on the data generated by the bootstrap procedure, we identify stochastically efficient departments, which serve as benchmarks for improving teaching performance.
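A minimal sketch of an input-oriented CCR efficiency score with a naive resampling bootstrap is shown below; the department data are invented, and the smoothed DEA bootstrap typically used for consistent confidence intervals (in the Simar and Wilson spirit) is more involved than this resampling shortcut:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: (n_units, n_inputs), Y: (n_units, n_outputs). LP variables: [theta, lambda_1..n]."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                              # minimise theta
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])             # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])              # -sum_j lam_j y_rj <= -y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

def bootstrap_ci(X, Y, o, n_boot=200, seed=0):
    """Naive resampling bootstrap for unit o's efficiency score (illustrative shortcut)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    scores = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        idx[0] = o                                           # keep the evaluated unit in the set
        scores.append(ccr_efficiency(X[idx], Y[idx], 0))
    return np.percentile(scores, [2.5, 97.5])

# Hypothetical departments: inputs (staff, budget), output (student satisfaction score)
X = np.array([[20, 1.0], [35, 1.8], [25, 1.1], [40, 2.5], [30, 1.5]], dtype=float)
Y = np.array([[70.0], [90.0], [85.0], [95.0], [80.0]])
print(ccr_efficiency(X, Y, 2), bootstrap_ci(X, Y, 2))
```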

Details

Financial Modeling Applications and Data Envelopment Applications
Type: Book
ISBN: 978-1-84855-878-6

Article
Publication date: 12 January 2015

Viktoria Goebel



Abstract

Purpose

The purpose of this paper is to identify a measure of intellectual capital (IC) value which offers new research opportunities for empirical investigations and to examine the determinants of IC value.

Design/methodology/approach

In total, 4,488 firm-years of German companies are investigated to compare three measures of IC value: market-to-book, Tobin’s q, and long-run value-to-book (LRVTB).
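The first two candidate measures are simple ratios; the numbers and the common simplified Tobin's q approximation below are illustrative assumptions, and the LRVTB measure additionally requires the regression-based long-run value decomposition borrowed from the mergers and acquisitions literature, which is not shown:

```python
# Two of the three candidate measures as simple ratios (illustrative figures, EUR millions)
market_cap = 5_200.0          # market value of equity
book_equity = 2_000.0         # book value of equity
book_debt = 3_000.0           # book value of debt
total_assets = 5_000.0        # book value of total assets

market_to_book = market_cap / book_equity
tobins_q = (market_cap + book_debt) / total_assets   # common simplified approximation

print(f"market-to-book = {market_to_book:.2f}, Tobin's q = {tobins_q:.2f}")
```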

Findings

LRVTB is observed to be the IC value measure with the highest explanatory value. This measure provides an approach to empirically test previously untested hypotheses on IC value. The results on testing determinants of IC value indicate that IC value is positively related to leverage and motivational payments to employees and negatively associated with company size. In contrast, recognised intangible assets, research and development (R & D), company age and concentrated ownership show no significant effects.

Research limitations/implications

The findings on IC value measures contribute to IC research as they offer a way to estimate IC value for testing IC-related hypotheses. The findings on IC determinants have implications for IC management as the relevant determinants can be considered for IC value creation.

Originality/value

This paper responds to the challenge posed by previous IC research to develop more creative quantitative approaches to estimate IC value (Marr et al., 2003; Mouritsen, 2006) in order to test IC-related hypotheses by innovatively applying a measure from mergers and acquisitions research to IC.

Details

Journal of Intellectual Capital, vol. 16 no. 1
Type: Research Article
ISSN: 1469-1930


Article
Publication date: 9 May 2016

Anukal Chiralaksanakul


Abstract

Purpose

The purpose of this paper is to investigate the impact of the bias error that results from using Monte Carlo simulation to value American-style options.

Design/methodology/approach

The authors develop an analytical approximation formula to quantify the bias error under the assumption of conditionally independent and identically distributed samples of asset prices. The bias arises from the nested optimization and expectation calculation. The formula is then used both to quantify the bias numerically and as an objective function for bias minimization under a given budget of samples.
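The source of the bias can be illustrated in a few lines: because the nested step takes a maximum over a noisy inner-simulation estimate, Jensen's inequality pushes the estimate upward. The toy numbers below are assumptions, and the demo is not the paper's approximation formula:

```python
import numpy as np

# True continuation and immediate-exercise values at some decision point (made up)
true_cont, true_exer = 10.0, 10.0
true_value = max(true_cont, true_exer)               # 10.0

rng = np.random.default_rng(4)
n_outer, n_inner = 10_000, 16
# Inner simulation: noisy estimate of the continuation value from n_inner samples
cont_hat = true_cont + rng.normal(scale=4.0 / np.sqrt(n_inner), size=n_outer)
value_hat = np.maximum(cont_hat, true_exer)          # nested max over a noisy estimate

bias = value_hat.mean() - true_value
print(f"average estimate {value_hat.mean():.3f} vs true {true_value}, bias {bias:.3f}")
# E[max(noisy continuation, exercise)] >= max(E[continuation], exercise): the estimate is
# biased upward, and the bias shrinks as n_inner (the inner sample size) grows.
```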

Findings

Monte Carlo methods used in the valuation of American-style options can result in a bias error ranging from 2 to 10 per cent of the option value. The bias error can be reduced by up to 50 per cent, either by using a better sampling scheme or by allocating the sample size more efficiently.

Research limitations/implications

The running time of the proposed procedure can be improved by using a specialized algorithm to solve the sample size allocation problem instead of the commercially available solver MINOS. Other sampling procedures for bias reduction may be extended and applied to this multi-stage problem.

Practical implications

The methodology can help to more accurately approximate the option value.

Originality/value

The paper provides a method for developing an analytical approximation of the bias error and a numerical experiment to test the methodology.

Details

Journal of Modelling in Management, vol. 11 no. 2
Type: Research Article
ISSN: 1746-5664


Book part
Publication date: 19 December 2012

Monalisa Sen, Anil K. Bera and Yu-Hsien Kao


Abstract

In this chapter we investigate the finite sample properties of a Hausman test for the spatial error model (SEM) proposed by Pace and LeSage (2008). In particular, we demonstrate that the power of their test could be very low against a natural alternative like the spatial autoregressive (SAR) model.
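For context, a Hausman statistic contrasts two estimators of the same parameters and compares the quadratic form of their difference with a chi-squared distribution; the coefficient vectors and covariance matrices below are made up, and the spatial-model specifics of Pace and LeSage (2008) are not reproduced:

```python
import numpy as np
from scipy import stats

def hausman(b_consistent, b_efficient, V_consistent, V_efficient):
    """Generic Hausman statistic H = d' (V_c - V_e)^{-1} d with d = b_c - b_e,
    compared against a chi-squared distribution with len(d) degrees of freedom."""
    d = np.asarray(b_consistent) - np.asarray(b_efficient)
    V_diff = np.asarray(V_consistent) - np.asarray(V_efficient)
    stat = float(d @ np.linalg.solve(V_diff, d))     # assumes V_diff is positive definite
    p_value = stats.chi2.sf(stat, df=d.size)
    return stat, p_value

# Made-up estimates: two coefficients from two estimators of the same model
stat, p = hausman([1.05, -0.48], [1.00, -0.50],
                  np.diag([0.010, 0.012]), np.diag([0.006, 0.007]))
print(stat, p)
```

A test built this way detects misspecification only insofar as the two estimators diverge under the alternative, which is the mechanism behind the low power against the SAR alternative documented in the chapter.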

Article
Publication date: 1 July 1998

Philip K. M'Pherson


Abstract

An approach to providing an inclusive model for business value analysis is presented, where “inclusive” means that broadly defined stakeholder values are added to conventional financial criteria. A value-oriented model is an interesting extension of accounting and information provision, with information seen as the carrier of value. Modelling and evaluation can be thought of as an information process that is central to steering a business towards its goals and maximising its value. The analytical expression of the model deals rigorously with both financial and intangible values, serves as a test‐bed for strategic explorations, can cover environmental and ethical risk, and can optimise a business with respect to both financial and inclusive criteria. Gathering the information and operating a model of this kind is akin to providing a strategic information service that integrates all the information into a single strategic picture. The model is a valuable information resource.

Details

Aslib Proceedings, vol. 50 no. 7
Type: Research Article
ISSN: 0001-253X
