Search results

1 – 10 of 65
Open Access
Article
Publication date: 19 March 2019

Ako Doffou

Abstract

Purpose

This paper aims to test three parametric models in pricing and hedging higher-order moment swaps. Using vanilla option prices from the volatility surface of the Euro Stoxx 50 Index, the paper shows that the pricing accuracy of these models is very satisfactory under four different pricing error functions. The result is that taking a position in a third moment swap considerably improves the performance of the standard hedge of a variance swap based on a static position in the log-contract and a dynamic trading strategy. The position in the third moment swap is taken by running a Monte Carlo simulation.

Design/methodology/approach

This paper undertook empirical tests of three parametric models. The aim of the paper is twofold: assess the pricing accuracy of these models and show how the classical hedge of the variance swap in terms of a position in a log-contract and a dynamic trading strategy can be significantly enhanced by using third-order moment swaps. The pricing accuracy was measured under four different pricing error functions. A Monte Carlo simulation was run to take a position in the third moment swap.
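The Monte Carlo step can be sketched as follows. This is a minimal illustration under geometric Brownian motion (not one of the three parametric models tested in the paper), with hypothetical contract parameters; under GBM the third-moment leg is close to zero, and it is the skew generated by the paper's models that makes the contract informative.

```python
import numpy as np

def simulate_moment_swap_legs(sigma=0.20, mu=0.0, n_steps=252,
                              n_paths=20_000, seed=0):
    """Monte Carlo estimate of the floating legs of a variance swap and a
    third-moment swap over one year, assuming GBM dynamics -- an
    illustrative stand-in for the paper's parametric models."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    # Log-returns under GBM: (mu - sigma^2/2) dt + sigma sqrt(dt) Z
    z = rng.standard_normal((n_paths, n_steps))
    r = (mu - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z
    realized_var = (r ** 2).sum(axis=1)      # variance-swap floating leg
    realized_third = (r ** 3).sum(axis=1)    # third-moment-swap floating leg
    return realized_var.mean(), realized_third.mean()

var_leg, third_leg = simulate_moment_swap_legs()
print(var_leg, third_leg)  # variance leg near sigma^2 = 0.04; third leg near zero under GBM
```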

Findings

The results of the paper are twofold: the pricing accuracy of the Heston (1993) model and that of two Levy models with stochastic time and stochastic volatility are satisfactory; taking a position in third-order moment swaps can significantly improve the performance of the standard hedge of a variance swap.

Research limitations/implications

The limitation is that these empirical tests are conducted on three existing parametric models. More critical insights might have been revealed had the tests been conducted on a brand-new derivatives pricing model.

Originality/value

This work is 100 per cent original, and it undertook empirical tests of the pricing and hedging accuracy of three existing parametric models.

Details

Studies in Economics and Finance, vol. 36 no. 2
Type: Research Article
ISSN: 1086-7376

Article
Publication date: 1 June 2000

A. Savini

Abstract

Gives introductory remarks on Chapter 1 of this group of 31 papers, from the ISEF 1999 Proceedings, on methodologies for field analysis in the electromagnetics community. Observes that the implementation of theory in computer packages contributes to its clarification. Discusses the areas covered by some of the papers, such as artificial intelligence using fuzzy logic. Includes applications such as permanent magnets and looks at eddy-current problems. States that the finite element method is currently the most popular method for field computation. Closes by pointing out the amalgam of topics.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 19 no. 2
Type: Research Article
ISSN: 0332-1649

Book part
Publication date: 11 August 2016

Karoll Gómez Portilla

Abstract

This chapter focuses on examining how changes in the liquidity differential between nominal and TIPS yields influence optimal portfolio allocations in U.S. Treasury securities. Based on a nonparametric estimation technique, and comparing the optimal allocation decisions of a mean-variance and a CRRA investor when investment opportunities are time-varying, I present evidence that the liquidity risk premium is a significant risk factor in a portfolio allocation context. In fact, I find that a conditional allocation strategy translates into improved in-sample and out-of-sample asset allocation and performance. The analysis of the portfolio allocation to U.S. government bonds is particularly important for central banks, especially in developing countries, given that they have collectively accumulated large holdings of U.S. securities over the last 15 years.
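The mean-variance side of the comparison can be sketched as follows; the expected excess returns, covariance matrix, and risk-aversion value are hypothetical placeholders, not estimates from the chapter.

```python
import numpy as np

def mean_variance_weights(mu, cov, gamma):
    """Unconstrained mean-variance optimum w = (1/gamma) * Sigma^{-1} mu
    for excess returns mu, covariance Sigma and risk aversion gamma;
    the remainder (1 - sum(w)) sits in the riskless asset."""
    return np.linalg.solve(cov, mu) / gamma

# Hypothetical moments for two Treasury positions (nominal, TIPS);
# not estimates from the chapter
mu = np.array([0.020, 0.015])
cov = np.array([[0.0025, 0.0010],
                [0.0010, 0.0016]])
w = mean_variance_weights(mu, cov, gamma=5.0)
print(w)
```

A conditional strategy of the kind the chapter studies would re-estimate `mu` and `cov` each period from state variables such as the liquidity differential, rather than holding them fixed.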

Details

The Spread of Financial Sophistication through Emerging Markets Worldwide
Type: Book
ISBN: 978-1-78635-155-5

Article
Publication date: 17 March 2014

Vassilis Polimenis and Ioannis Papantonis

Abstract

Purpose

This paper aims to enhance a co-skew-based risk measurement methodology initially introduced in Polimenis, by extending it for the joint estimation of the jump betas for two stocks.

Design/methodology/approach

The authors introduce the possibility of idiosyncratic jumps and analyze the robustness of the estimated sensitivities when two stocks are jointly fit to the same set of latent jump factors. When individual stock skews differ substantially from those of the market, the requirement that the individual skew be exactly matched places a strain on the single-stock estimation system.

Findings

The authors argue that, once this restrictive requirement is relaxed in an enhanced joint framework, the system calibrates to a more robust solution in terms of uncovering the true magnitude of the latent parameters of the model, at the same time revealing information about the level of idiosyncratic skews in individual stock return distributions.

Research limitations/implications

Allowing for idiosyncratic skews relaxes the demands placed on the estimation system and hence improves its explanatory power by focusing on matching systematic skew that is more informational. Furthermore, allowing for stock-specific jumps that are not related to the market is a realistic assumption. There is now evidence that idiosyncratic risks are priced as well, and this has been a major drawback and criticism in using CAPM to assess risk premia.

Practical implications

Since jumps in stock prices incorporate the most valuable information, quantifying a stock's exposure to jump events can have important practical implications for financial risk management, portfolio construction and option pricing.

Originality/value

This approach boosts the “signal-to-noise” ratio by utilizing co-skew moments, so that the diffusive component is filtered out through higher-order cumulants. Without making any distributional assumptions, the authors are able not only to capture the asymmetric sensitivity of a stock to latent upward and downward systematic jump risks, but also to uncover the magnitude of idiosyncratic stock skewness. Since cumulants in a Levy process evolve linearly in time, this approach is horizon independent and hence can be deployed at all frequencies.
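The co-skew moments the method relies on can be illustrated with a simple sample estimator; the one-factor jump structure, jump beta, and parameter values below are hypothetical, not the authors' calibration.

```python
import numpy as np

def coskew(x, y):
    """Sample third cross-cumulant E[(x - Ex)^2 (y - Ey)]: the co-skew
    moment of x with y. Symmetric (diffusive) noise in x contributes
    nothing in expectation, so jump skewness passes through."""
    xc, yc = x - x.mean(), y - y.mean()
    return np.mean(xc ** 2 * yc)

# Hypothetical one-factor jump structure: a right-skewed systematic jump
# factor plus symmetric diffusive noise, with jump beta 1.5
rng = np.random.default_rng(1)
n = 200_000
jumps = rng.exponential(0.02, n) - 0.02      # centred, skewed factor
noise = rng.standard_normal(n) * 0.01        # symmetric diffusive component
stock = 1.5 * jumps + noise

print(coskew(stock, jumps))                  # ~ beta^2 * cum3(jumps); noise filtered out
print(coskew(noise, jumps))                  # ~ 0
```

This is the "signal-to-noise" point in miniature: the Gaussian component drops out of the third cross-cumulant, leaving the jump loading visible.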

Details

The Journal of Risk Finance, vol. 15 no. 2
Type: Research Article
ISSN: 1526-5943

Book part
Publication date: 30 November 2011

Massimo Guidolin

Abstract

I review the burgeoning literature on applications of Markov regime switching models in empirical finance. In particular, distinct attention is devoted to the ability of Markov switching models to fit the data, to filter unknown regimes and states on the basis of the data, and to provide a powerful tool for testing hypotheses formulated in light of financial theories, as well as to their forecasting performance with reference to both point and density predictions. The review covers papers concerning a multiplicity of sub-fields in financial economics, ranging from empirical analyses of stock returns and the term structure of default-free interest rates to the dynamics of exchange rates and the joint process of stock and bond returns.
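The regime-filtering step can be sketched with a minimal two-state Hamilton filter; the transition matrix, state means and volatilities, and the simulated return series below are hypothetical illustrations, not results from the chapter.

```python
import numpy as np

def hamilton_filter(y, trans, means, stds):
    """Filtered regime probabilities P(s_t = k | y_1, ..., y_t) for a
    Gaussian Markov-switching model (a minimal Hamilton filter)."""
    n_states = trans.shape[0]
    probs = np.full(n_states, 1.0 / n_states)        # flat initial distribution
    out = np.empty((len(y), n_states))
    for t, yt in enumerate(y):
        pred = trans.T @ probs                       # one-step-ahead state probabilities
        lik = np.exp(-0.5 * ((yt - means) / stds) ** 2) / stds  # Gaussian density, up to a constant
        joint = pred * lik
        probs = joint / joint.sum()                  # Bayes update on observing y_t
        out[t] = probs
    return out

# Hypothetical two-regime daily-return model: calm vs. volatile
trans = np.array([[0.95, 0.05],                      # row i: P(next state | current state i)
                  [0.10, 0.90]])
means = np.array([0.0005, -0.001])
stds = np.array([0.008, 0.025])

rng = np.random.default_rng(2)
y = np.concatenate([rng.normal(means[0], stds[0], 100),   # calm stretch
                    rng.normal(means[1], stds[1], 100)])  # volatile stretch
filtered = hamilton_filter(y, trans, means, stds)
```

Running the filter on the simulated series assigns high probability to the calm state in the first half of the sample and to the volatile state in the second.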

Details

Missing Data Methods: Time-Series Methods and Applications
Type: Book
ISBN: 978-1-78052-526-6

Book part
Publication date: 17 October 2014

J. Barkley Rosser

Abstract

Political economies evolve institutionally and technologically over time. This means that to understand evolutionary political economy one must understand the nature of the evolutionary process in its full complexity. From the time of Darwin and Spencer, natural selection has been seen as the foundation of evolution. This view has remained even as views of how evolution operates more broadly have changed. One aspect of evolution that natural selection may not fully explain is the appearance of higher-order structures, an aspect associated with the idea of emergence. In recent decades it has been argued that self-organization dynamics may explain such emergence, while being constrained, if not overshadowed, by natural selection. Just as the balance between these aspects is debated within organic evolutionary theory, it also arises in the evolution of political economy, as between such examples of self-organizing emergence as the Mengerian analysis of the appearance of commodity money in primitive societies and the natural selection that operates in the competition between firms in markets.

Details

Entangled Political Economy
Type: Book
ISBN: 978-1-78441-102-2

Article
Publication date: 16 April 2018

Jacek Ptaszny and Marcin Hatłas

Abstract

Purpose

The purpose of this paper is to evaluate the efficiency of the fast multipole boundary element method (FMBEM) in the analysis of stress and effective properties of 3D linear elastic structures with cavities. In particular, a comparison between the FMBEM and the finite element method (FEM) is performed in terms of accuracy, model size and computation time.

Design/methodology/approach

The developed FMBEM uses eight-node Serendipity boundary elements with numerical integration based on the adaptive subdivision of elements. Multipole and local expansions and translations involve solid harmonics. The proposed model is used to analyse a solid body with two interacting spherical cavities, and to predict the homogenized response of a porous material under linear displacement boundary condition. The FEM results are generated in commercial codes Ansys and MSC Patran/Nastran, and the results are compared in terms of accuracy, model size and execution time. Analytical solutions available in the literature are also considered.

Findings

FMBEM and FEM approximate the geometry with similar accuracy and provide similar results. However, FMBEM requires a model size that is smaller by an order of magnitude in terms of the number of degrees of freedom. The problems under consideration can be solved by using FMBEM within the time comparable to the FEM with an iterative solver.

Research limitations/implications

The present results are limited to linear elasticity.

Originality/value

This work is a step towards a comprehensive efficiency evaluation of the FMBEM applied to selected problems of micromechanics, by comparison with commercial FEM codes.

Article
Publication date: 1 February 2016

Philip Blonski and Simon Christian Blonski

Abstract

Purpose

The purpose of this study is to question the undifferentiated treatment of individual traders as “dumb noise traders”. We question this verdict by conducting an analysis of the cognitive competence of individual investors.

Design/methodology/approach

The authors let experts (both experienced researchers as well as practitioners) assess the mathematical and verbal reasoning demands of investment tasks investigated in previous studies.

Findings

Based on this assessment, this paper concludes that individual investors are able to perform a number of complex cognitive actions, especially those demanding higher-order verbal reasoning. However, they seem to reach cognitive limitations with tasks demanding greater mathematical reasoning ability. This is especially unfortunate, as tasks requiring higher mathematical reasoning are considered to be more relevant to performance. These findings have important implications for future regulatory measures.

Research limitations/implications

This study has two non-trivial limitations. First, indirect measurement of mental requirements does not allow the authors to make definite statements about the cognitive competence of individual investors. To do so, it would be necessary to conduct laboratory experiments that directly measure investors' performance on different investment and other cognitively demanding tasks. However, such data are not available for retail investors in this market, to the best of the authors' knowledge. We therefore think that our approach is a valuable first step toward understanding investors' cognitive competence using the data available at this moment. Second, the number of analyzed (and available) tasks is rather low (n = 10), which limits the power of the tests and restricts the authors from using more profound (deductive) statistical analyses.

Practical implications

This paper proposes to illustrate information in key investor documents mostly verbally (e.g. as proposed by Rieger, 2009); to compel exchanges and issuers of retail derivatives to raise awareness of the results of the reviewed studies and our conclusions; and to offer online math training especially designed for individual investors to better prepare them for different trading activities, as such training has been shown to be as effective as face-to-face training (Frederickson et al., 2005; Karr et al., 2003).

Social implications

This study can only be considered as a first step toward understanding the cognitive limitations of individual investors indirectly and could be transferred to other market areas as well.

Originality/value

This study is the first to combine the assessment of outstanding researchers in this field with the results of previous studies. In doing so, this paper provides an overarching framework of interpretation for these studies.

Details

Qualitative Research in Financial Markets, vol. 8 no. 1
Type: Research Article
ISSN: 1755-4179

Article
Publication date: 27 September 2011

Isao Ishida, Michael McAleer and Kosuke Oya

Abstract

Purpose

The purpose of this paper is to propose a new method for estimating continuous‐time stochastic volatility (SV) models for the S&P 500 stock index process using intraday high‐frequency observations of both the S&P 500 index and the Chicago Board Options Exchange (CBOE) implied (or expected) volatility index (VIX).

Design/methodology/approach

A primary purpose of the paper is to provide a framework for using intraday high‐frequency data on both indices, in particular for improving the estimation accuracy of the leverage parameter, that is, the correlation between the two Brownian motions driving the diffusive components of the price process and its spot variance process, respectively.

Findings

Finite sample simulation results show that the proposed estimator delivers more accurate estimates of the leverage parameter than do existing methods.

Research limitations/implications

The focus of the paper is on the Heston and non‐Heston leverage parameters.

Practical implications

Finite sample simulation results show that the proposed estimator delivers more accurate estimates of the leverage parameter than do existing methods.

Social implications

The research findings are important for the analysis of ultra high‐frequency financial data.

Originality/value

The paper provides a framework for using intraday high‐frequency data on both indices, in particular for improving the estimation accuracy of the leverage parameter, that is, the correlation between the two Brownian motions driving the diffusive components of the price process and its spot variance process, respectively.
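The leverage parameter can be illustrated with a crude sample estimator: the correlation between index returns and increments of squared VIX as a proxy for spot-variance increments. The series below are simulated with a hypothetical true leverage of -0.7; this is only the underlying idea, not the paper's model-based estimator.

```python
import numpy as np

def leverage_estimate(index_returns, vix_levels):
    """Sample correlation between index returns and increments of squared
    VIX (a rough proxy for spot-variance increments). VIX is quoted as an
    annualized percentage volatility, hence the /100."""
    dvar = np.diff((vix_levels / 100.0) ** 2)
    return np.corrcoef(index_returns[1:], dvar)[0, 1]

# Simulate return and variance shocks with true leverage rho = -0.7
rng = np.random.default_rng(4)
n, rho = 5_000, -0.7
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)
returns = 0.01 * z1                          # hypothetical index returns
vix = 20.0 + np.cumsum(0.02 * z2)            # hypothetical VIX-like level series

print(leverage_estimate(returns, vix))       # close to -0.7
```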

Details

Managerial Finance, vol. 37 no. 11
Type: Research Article
ISSN: 0307-4358

Book part
Publication date: 5 November 2016

Ioanna D. Constantiou, Arisa Shollo and Morten Thanning Vendelø

Abstract

An ongoing debate in the field of organizational decision-making concerns the use of intuition versus analytical rationality in decision-making. To contribute to this debate, we use a rich empirical dataset built from a longitudinal study of information technology project prioritization in a large financial institution to investigate how managers make space for the use of intuition in decision-making. Our findings show that, during project prioritization meetings, senior decision makers apply three techniques when they make space for intuition in decision processes: bringing in project intangibles, co-promoting intuitive judgments, and associating intuitive judgments with shared group context.

Details

Uncertainty and Strategic Decision Making
Type: Book
ISBN: 978-1-78635-170-8
