
Search results

1 – 10 of 108
Article
Publication date: 6 March 2017

Speeding up estimation of the Hurst exponent by a two-stage procedure from a large to small range

Yen-Ching Chang

Abstract

Purpose

The Hurst exponent is important for distinguishing between fractal signals and for explaining their significance. For estimators of the Hurst exponent, accuracy and efficiency are two unavoidable considerations. The main purpose of this study is to improve the execution efficiency of existing estimators, especially the fast maximum likelihood estimator (MLE), which has optimal accuracy.

Design/methodology/approach

A two-stage procedure combining a quicker method and a more accurate one is developed to estimate the Hurst exponent, narrowing the search from a large range to a small one. For the best possible accuracy, the data-induction method is currently the ideal first-stage estimator and the fast MLE is the best candidate for the second-stage estimator.
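
The two-stage idea, narrowing the admissible range with a quick estimator before running a more accurate one, can be sketched in a few lines of Python. This is a hedged illustration only: an aggregated-variance estimator stands in for the paper's data-induction method, a local Whittle objective stands in for the fast MLE, and an fGn-like input with an n^0.65 bandwidth is assumed.

import numpy as np
from scipy.optimize import minimize_scalar

def aggregated_variance_hurst(x, scales=(4, 8, 16, 32, 64)):
    # Stage 1: quick estimate; for fGn the variance of block means ~ m**(2H - 2)
    x = np.asarray(x, float)
    v = [np.var(x[: len(x) // m * m].reshape(-1, m).mean(axis=1)) for m in scales]
    slope, _ = np.polyfit(np.log(scales), np.log(v), 1)
    return 1.0 + slope / 2.0

def local_whittle_objective(H, x, m):
    # Stage 2 stand-in: concentrated local-Whittle objective, valid near w = 0
    # where the fGn spectral density behaves like G * w**(1 - 2H)
    x = np.asarray(x, float)
    n = len(x)
    I = np.abs(np.fft.rfft(x - x.mean())) ** 2 / (2 * np.pi * n)   # periodogram
    w = 2 * np.pi * np.arange(1, m + 1) / n
    return np.log(np.mean(w ** (2 * H - 1) * I[1:m + 1])) - (2 * H - 1) * np.mean(np.log(w))

def two_stage_hurst(x, width=0.1):
    # Narrow the admissible range around the stage-1 guess, then refine in stage 2
    h0 = aggregated_variance_hurst(x)
    m = int(len(x) ** 0.65)                        # bandwidth choice is an assumption
    res = minimize_scalar(local_whittle_objective, args=(x, m),
                          bounds=(max(0.01, h0 - width), min(0.99, h0 + width)),
                          method="bounded")
    return h0, res.x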

Findings

For signals modeled as discrete-time fractional Gaussian noise, the proposed two-stage estimator can save up to 41.18 per cent of the computational time of the fast MLE while remaining almost as accurate; for signals modeled as discrete-time fractional Brownian motion it can save about 35.29 per cent, except for smaller data sizes.

Originality/value

The proposed two-stage estimation procedure is a novel idea. Other fields of parameter estimation can be expected to apply the concept of a two-stage procedure to raise computational performance while remaining almost as accurate as the more accurate of the two estimators.

Details

Engineering Computations, vol. 34 no. 1
Type: Research Article
DOI: https://doi.org/10.1108/EC-01-2016-0036
ISSN: 0264-4401

Keywords

  • Hurst exponent
  • Fractional Brownian motion
  • Data-induction method
  • Fast maximum likelihood estimator
  • Fractional Gaussian noise
  • Maximum likelihood estimator

Article
Publication date: 9 September 2014

Long-range dependence in Indian stock market: a study of Indian sectoral indices

Dilip Kumar

Abstract

Purpose

The purpose of this paper is to test the efficient market hypothesis for major Indian sectoral indices by means of a long-memory approach in both the time domain and the frequency domain. The paper also tests the accuracy of the detrended fluctuation analysis (DFA) approach and the local Whittle (LW) approach by means of Monte Carlo simulation experiments.

Design/methodology/approach

The author applies the DFA approach to compute the scaling exponent in the time domain. The robustness of the results is tested by computing the scaling exponent in the frequency domain with the LW estimator. The author applies a moving sub-sample approach to DFA to study the evolution of market efficiency in the Indian sectoral indices.
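
A minimal Python sketch of first-order DFA (not the author's code; the box sizes and the linear detrending order are illustrative assumptions):

import numpy as np

def dfa_hurst(x, scales=None):
    # Integrate the series into a profile, detrend it in boxes of size s with a
    # least-squares line, and read the scaling exponent off the log-log slope of
    # the fluctuation function F(s).
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())
    if scales is None:
        scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 12).astype(int))
    F = []
    for s in scales:
        boxes = y[: len(y) // s * s].reshape(-1, s)
        t = np.arange(s)
        resid = [box - np.polyval(np.polyfit(t, box, 1), t) for box in boxes]
        F.append(np.sqrt(np.mean(np.concatenate(resid) ** 2)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope                                   # ~H for fGn-like inputs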

Findings

The Monte Carlo simulation experiments indicate that the DFA approach and the LW approach provide good estimates of the scaling exponent as the sample size increases. The author also finds that the efficiency characteristics of Indian sectoral indices and their stages of development are dynamic in nature.

Originality/value

This paper has both methodological and empirical originality. On the methodological side, the author tests the small-sample properties of the DFA and LW approaches using simulated series of fractional Gaussian noise and finds that both approaches possess superior properties in capturing the scaling behavior of asset prices. On the empirical side, the author studies the evolution of long-range dependence characteristics in Indian sectoral indices.

Details

International Journal of Emerging Markets, vol. 9 no. 4
Type: Research Article
DOI: https://doi.org/10.1108/IJoEM-09-2011-0090
ISSN: 1746-8809

Keywords

  • Hurst exponent
  • Detrended fluctuation analysis (DFA)
  • Local Whittle
  • Long-range dependence

Article
Publication date: 9 March 2010

A novel prediction‐based collision resolution algorithm

Shuangmao Yang and Wei Guo

Abstract

Purpose

The purpose of this paper is to examine the performance of the traditional collision resolution algorithm (CRA) under self-similar traffic and to present a prediction-based CRA for wireless media access. The method is then evaluated through experiments.

Design/methodology/approach

Traditional traffic models are mostly based on the Poisson model or the Bernoulli process, but traffic measurements over the past decade have found the coexistence of both long- and short-range dependence in network traffic. On the other hand, CRA is an effective strategy for improving the performance of multiple access protocols, and it achieves the highest capacity among all known multiple access protocols under the Poisson traffic model. In this paper, a CRA model is built in OPNET to study the effects of different traffic traces, such as the fractional autoregressive integrated moving average process with a non-Gaussian white driving sequence and real traffic data captured at a well-attended ACM conference. The performance of the traditional CRA is compared under the self-similar traffic model and the Poisson model, and a novel CRA based on time-series prediction theory is designed for self-similar traffic models.
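
As background, a hedged Python sketch of the classic binary-splitting collision resolution step that algorithms of this family build on; this is neither the paper's FCFS variant nor its prediction-based CRA.

import random

def resolve_collision(n):
    # Number of slots needed to resolve a collision among n packets with the
    # basic binary-splitting (tree) algorithm: each collided packet flips a fair
    # coin; the "heads" subgroup retransmits first and is resolved recursively,
    # then the remaining subgroup is resolved.
    if n <= 1:
        return 1                                   # idle or successful slot
    heads = sum(random.random() < 0.5 for _ in range(n))
    return 1 + resolve_collision(heads) + resolve_collision(n - heads)

# Average number of slots to clear an initial 5-packet collision
print(sum(resolve_collision(5) for _ in range(10_000)) / 10_000)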

Findings

The traditional Poisson traffic model gives the best performance under the traditional CRA, while the performance of self-similar traffic under the traditional CRA is too poor for an actual network environment. For example, the Poisson traffic model obtains the largest throughput, the smallest delay and the smallest number of collision resolutions under the traditional CRA. This paper demonstrates that the first-come first-served (FCFS) algorithm must be improved under self-similar traffic models. The novel prediction-based collision resolution strategy provides better performance under self-similar traffic: its throughput, delay and number of collision resolutions are all better than those of the traditional CRA.

Originality/value

This paper presents a prediction-based CRA for self-similar traffic, which combines FCFS with prediction theory, and also provides a new method for resolving packet collisions.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 29 no. 2
Type: Research Article
DOI: https://doi.org/10.1108/03321641011014878
ISSN: 0332-1649

Keywords

  • Collisions
  • Programming and algorithm theory
  • Predictor‐corrector methods
  • Wireless

Article
Publication date: 2 October 2009

The efficiency of African equity markets

David G. McMillan and Pako Thupayagale

Abstract

Purpose

In order to assess the informational efficiency of African equity markets (AEMs), the purpose of this paper is to examine long memory in both equity returns and volatility using auto‐regressive fractionally integrated moving average (ARFIMA)‐FIGARCH/hyperbolic GARCH (HYGARCH) models.

Design/methodology/approach

In order to test for long memory, the behaviour of the auto-correlation function for 11 AEMs is examined. Following this graphical analysis, the authors estimate ARFIMA-FIGARCH and ARFIMA-HYGARCH models, which are specifically designed to capture long-memory dynamics.
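
A hedged sketch of the graphical step only (the ARFIMA-FIGARCH/HYGARCH estimation itself is not reproduced here; the use of statsmodels, the function names and the lag horizon are illustrative assumptions):

import numpy as np
from statsmodels.tsa.stattools import acf

def long_memory_signature(returns, nlags=250):
    # Sample ACF of returns and of squared returns (a volatility proxy); a slow,
    # roughly hyperbolic decay of the squared-return ACF is the usual graphical
    # hint of long memory in volatility.
    r = np.asarray(returns, float)
    return acf(r, nlags=nlags, fft=True), acf(r ** 2, nlags=nlags, fft=True)

def log_log_slope(rho):
    # Rough decay check: regress log ACF on log lag over the positive portion
    # of the ACF (lag 0 excluded); a shallow slope suggests hyperbolic decay.
    lags = np.arange(1, len(rho))
    keep = rho[1:] > 0
    return np.polyfit(np.log(lags[keep]), np.log(rho[1:][keep]), 1)[0]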

Findings

The results show that these markets (largely) display a predictable component in returns, while evidence of long memory in volatility is very mixed. In comparison, results for the UK and US control markets show short memory in returns, while evidence of long memory in volatility is mixed. These results show that the behaviour of equity market returns and risks differs across markets, which may have implications for portfolio diversification and risk management strategies.

Practical implications

The results of the analysis may have important implications for portfolio diversification and risk management strategies.

Originality/value

The importance of this paper lies in its being the first to systematically analyse long-memory dynamics for a range of AEMs. African markets are becoming increasingly important as a source of international portfolio diversification and risk management. Hence, the results have implications for international portfolio construction, asset pricing and hedging.

Details

Studies in Economics and Finance, vol. 26 no. 4
Type: Research Article
DOI: https://doi.org/10.1108/10867370910995726
ISSN: 1086-7376

Keywords

  • Equity capital
  • Financial markets
  • Financial marketing
  • Africa
  • Portfolio investment

Article
Publication date: 1 October 2006

Formal calculus for real‐valued fractional Brownian motions prospects in systems science

Guy Jumarie

Abstract

Purpose

To define the main elements of a formal calculus which deals with fractional Brownian motion (fBm), and to examine its prospects of applications in systems science.

Design/methodology/approach

The approach is based on a generalization of Maruyama's notation. The key is the new fractional-order Taylor series f(x + h) = E_α(h^α D^α) f(x), where E_α(·) is the Mittag-Leffler function.
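
For reference, the Mittag-Leffler function and the operator expansion behind this formula are (standard definitions restated for clarity, not additional results from the paper):

E_\alpha(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)},
\qquad
f(x+h) = E_\alpha\!\left(h^{\alpha} D^{\alpha}\right) f(x)
       = \sum_{k=0}^{\infty} \frac{h^{\alpha k}}{\Gamma(\alpha k + 1)} \left(D^{\alpha}\right)^{k} f(x).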

Findings

As illustrative applications of this formal calculus in systems science, the paper considers the linear quadratic Gaussian problem with fractal noises, the analysis of the equilibrium position of a system disturbed by a local fractal time, and a growth model involving fractal noises. It then examines what happens when the maximum entropy principle is applied to systems involving fBms (or, for short, fractals).

Research limitations/implications

The framework of this paper is applied mathematics and engineering mathematics, and the results so obtained allow the practical analysis of stochastic dynamics subject to fractional noises.

Practical implications

The direct prospect of application of this approach is the analysis of some stock markets dynamics and some biological systems.

Originality/value

The fractional Taylor's series is new and thus so are all its implications.

Details

Kybernetes, vol. 35 no. 9
Type: Research Article
DOI: https://doi.org/10.1108/03684920610662430
ISSN: 0368-492X

Keywords

  • Cybernetics
  • Calculus
  • Optimal control
  • Systems theory

Article
Publication date: 28 January 2014

Multiple-period market risk prediction under long memory: when VaR is higher than expected

Harald Kinateder and Niklas Wagner

Abstract

Purpose

The paper aims to model multiple-period market risk forecasts under long memory persistence in market volatility.

Design/methodology/approach

The paper proposes volatility forecasts based on a combination of the GARCH(1,1) model with potentially fat-tailed and skewed innovations and a long-memory specification of the slowly declining influence of past volatility shocks. As the square-root-of-time rule is known to be mis-specified, the GARCH setting of Drost and Nijman is used as the benchmark model. The empirical study of equity market risk is based on daily returns during the period January 1975 to December 2010. The out-of-sample accuracy of VaR predictions is studied for 5, 10, 20 and 60 trading days.
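
A minimal Python sketch of the general idea under stated assumptions: an h^H power-of-time scaling with a fixed Hurst exponent and a fixed Student-t tail parameter, using the arch package for the GARCH(1,1) fit. This is a simplification for illustration, not the paper's exact long-memory specification.

import numpy as np
from arch import arch_model
from scipy import stats

def long_memory_var(returns, horizon=10, hurst=0.6, alpha=0.01, nu=6):
    # One-day GARCH(1,1)-t volatility forecast scaled to an h-day horizon with
    # an h**H rule (the square-root-of-time rule is the special case H = 0.5).
    r = 100 * np.asarray(returns, float)           # per cent returns help the optimizer
    res = arch_model(r, vol="GARCH", p=1, q=1, dist="t").fit(disp="off")
    sigma_1 = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0]) / 100
    sigma_h = sigma_1 * horizon ** hurst           # long-memory volatility scaling
    q = stats.t.ppf(alpha, nu) * np.sqrt((nu - 2) / nu)   # unit-variance t quantile
    return -q * sigma_h                            # positive number: loss at level alpha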

Findings

The long memory scaling approach remarkably improves VaR forecasts for the longer horizons. This result is only in part due to higher predicted risk levels. Ex post calibration to equal unconditional VaR levels illustrates that the approach also enhances efficiency in allocating VaR capital through time.

Practical implications

The improved VaR forecasts show that one should account for long memory when calibrating risk models.

Originality/value

The paper models single-period returns rather than choosing the simpler approach of modeling lower-frequency multiple-period returns for long-run volatility forecasting. The approach considers long memory in volatility and has two main advantages: it yields a consistent set of volatility predictions for various horizons and VaR forecasting accuracy is improved.

Details

The Journal of Risk Finance, vol. 15 no. 1
Type: Research Article
DOI: https://doi.org/10.1108/JRF-07-2013-0051
ISSN: 1526-5943

Keywords

  • GARCH
  • Hurst exponent
  • Long memory
  • Multiple-period value-at-risk
  • Square-root-of-time rule
  • Volatility scaling

Article
Publication date: 1 June 1999

Elements for a theory of fractals in dynamic systems involving human factors

Guy Jumarie

Abstract

The theory addressed in this paper is that we should not be surprised to come across fractals in the analysis of some dynamic systems involving human factors; moreover, in substance, the presence of fractals in human behaviour is acceptable. For the convenience of the reader, a preliminary background on some models of fractional Brownian motion found in the literature is given, and then the main features of the complex-valued model, via a random walk in the complex plane, recently introduced by the author are recalled. The practical meaning of the model is exhibited. The parallel of the central limit theorem here is Lévy's stability. If it is supposed that human decision-makers work via an observation process which combines the Heisenberg principle with a quantization principle in the measurement, then fractal dynamics appears to be quite in order. The relation with the theory of relative information is exhibited. The conjecture is then the following: could this model explain why fractals appear in finance, for instance?

Details

Kybernetes, vol. 28 no. 4
Type: Research Article
DOI: https://doi.org/10.1108/03684929910267743
ISSN: 0368-492X

Keywords

  • Behavioural sciences
  • Cybernetics
  • Fractals

Article
Publication date: 3 May 2016

An autoregressive approach to modeling commodity prices as a quasi-fractional Brownian motion

Calum G. Turvey and Paitoon Wongsasutthikul

Abstract

Purpose

The purpose of this paper is to argue that a stationary-differenced autoregressive (AR) process with lag greater than 1, AR(q > 1), has certain properties that are consistent with a fractional Brownian motion (fBm). The authors are interested in investigating approaches to identifying the existence of persistent memory of one form or another for the purpose of simulating commodity (and other asset) prices. The authors show, in theory and with an application to agricultural commodity prices, the relationship between AR(q) and quasi-fBm.

Design/methodology/approach

In this paper the authors develop mathematical relationships in support of using AR(q > 1) processes for simulating quasi-fBm.

Findings

From theory the authors show that any AR(q) process is a stationary, self-similar process with a lag structure that captures the essential elements of scaling and a fractional power law. The authors illustrate the approach through various means and apply the quasi-fractional AR(q) process to agricultural commodity prices.

Research limitations/implications

While the results can be applied to most time series of commodity prices, the authors limit the evaluation to the Gaussian case. Thus the approach does not apply to infinite-variance models.

Practical implications

Using the structure of an AR(q > 1) model to simulate quasi-fBm is a simple approach that can be applied with ease using conventional Monte Carlo methods.
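
A minimal Monte Carlo sketch under stated assumptions (statsmodels for the AR fit; the lag order, Gaussian innovations, horizon and path count are illustrative, not the authors' calibration):

import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def simulate_ar_q_paths(log_prices, q=5, horizon=250, n_paths=1000, seed=0):
    # Fit AR(q) to first-differenced log prices and simulate future log-price
    # paths by plain Monte Carlo, in the spirit of the quasi-fBm AR(q > 1) idea.
    lp = np.asarray(log_prices, float)
    dx = np.diff(lp)
    fit = AutoReg(dx, lags=q).fit()
    params = np.asarray(fit.params)
    const, phi, sigma = params[0], params[1:], np.sqrt(fit.sigma2)
    rng = np.random.default_rng(seed)
    paths = np.empty((n_paths, horizon))
    for i in range(n_paths):
        hist = list(dx[-q:])                       # last q increments, oldest first
        level = lp[-1]
        for t in range(horizon):
            step = const + np.dot(phi, hist[::-1]) + rng.normal(0.0, sigma)
            hist = hist[1:] + [step]
            level += step
            paths[i, t] = level
    return paths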

Originality/value

The authors believe that the approach to simulating quasi-fBm using standard AR(q > 1) models is original. The approach is intuitive and can be applied easily.

Details

Agricultural Finance Review, vol. 76 no. 1
Type: Research Article
DOI: https://doi.org/10.1108/AFR-01-2016-0004
ISSN: 0002-1466

Keywords

  • Brownian motion
  • Commodity futures
  • Hurst coefficient
  • Quasi-fractional Brownian motion

Article
Publication date: 9 November 2020

A clean energy forecasting model based on artificial intelligence and fractional derivative grey Bernoulli models

Yonghong Zhang, Shuhua Mao and Yuxiao Kang

Abstract

Purpose

With the massive use of fossil energy polluting the natural environment, clean energy has gradually become the focus of future energy development. The purpose of this article is to propose a new hybrid forecasting model to forecast the production and consumption of clean energy.

Design/methodology/approach

First, the memory characteristics of clean energy production and consumption were analyzed by the rescaled range (R/S) method. Second, the original series was decomposed by the ensemble empirical mode decomposition (EEMD) algorithm into several components and a residual with different characteristics; the residual was predicted by the fractional derivative grey Bernoulli model [FDGBM(p, 1)], and the other components were predicted using artificial intelligence (AI) models (least-squares support vector regression [LSSVR] and an artificial neural network [ANN]). Finally, the fitted values of each part were added to obtain the predicted value of the original series.
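
A hedged Python sketch of the first stage only, the R/S memory analysis (window sizes are illustrative; the EEMD, FDGBM(p, 1), LSSVR and ANN components are not reproduced here):

import numpy as np

def rs_hurst(x, window_sizes=(16, 32, 64, 128, 256)):
    # For each window size, compute the range of the cumulative mean-adjusted
    # sum divided by the sample standard deviation, then read H off the
    # log-log slope of the average R/S statistic against the window length.
    x = np.asarray(x, float)
    used, rs_mean = [], []
    for s in window_sizes:
        if s > len(x) // 2:
            continue
        vals = []
        for start in range(0, len(x) - s + 1, s):
            w = x[start:start + s]
            z = np.cumsum(w - w.mean())
            sd = w.std(ddof=1)
            if sd > 0:
                vals.append((z.max() - z.min()) / sd)
        if vals:
            used.append(s)
            rs_mean.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(used), np.log(rs_mean), 1)
    return slope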

Findings

This study found that clean energy series have memory characteristics. The hybrid models EEMD–FDGBM(p, 1)–LSSVR and EEMD–FDGBM(p, 1)–ANN significantly outperformed the other models in predicting clean energy production and consumption.

Originality/value

Considering that clean energy has complex nonlinear and memory characteristics, this paper combines the EEMD method with the FDGBM(p, 1) and AI models to establish hybrid models for predicting the consumption and output of clean energy.

Details

Grey Systems: Theory and Application, vol. ahead-of-print no. ahead-of-print
Type: Research Article
DOI: https://doi.org/10.1108/GS-08-2020-0101
ISSN: 2043-9377

Keywords

  • Clean energy
  • EEMD decomposition
  • FDGBM (P, 1)
  • Artificial intelligence models

Book part
Publication date: 24 March 2006

Estimation of Long-Memory Time Series Models: A Survey of Different Likelihood-Based Methods

Ngai Hang Chan and Wilfredo Palma

Abstract

Since the seminal works by Granger and Joyeux (1980) and Hosking (1981), the estimation of long-memory time series models has received considerable attention and a number of parameter estimation procedures have been proposed. This paper gives an overview of this plethora of methodologies, with special focus on likelihood-based techniques. Broadly speaking, likelihood-based techniques can be classified into the following categories: exact maximum likelihood (ML) estimation (Sowell, 1992; Dahlhaus, 1989), ML estimates based on autoregressive approximations (Granger & Joyeux, 1980; Li & McLeod, 1986), Whittle estimates (Fox & Taqqu, 1986; Giraitis & Surgailis, 1990), Whittle estimates with autoregressive truncation (Beran, 1994a), approximate estimates based on the Durbin–Levinson algorithm (Haslett & Raftery, 1989), state-space-based maximum likelihood estimates for ARFIMA models (Chan & Palma, 1998), and estimation of stochastic volatility models (Ghysels, Harvey, & Renault, 1996; Breidt, Crato, & de Lima, 1998; Chan & Petris, 2000), among others. Given the diversified applications of these techniques in different areas, this review aims to provide a succinct survey of these methodologies as well as an overview of important related problems, such as ML estimation with missing data (Palma & Chan, 1997), the influence of subsets of observations on estimates, and the estimation of seasonal long-memory models (Palma & Chan, 2005). Performances and asymptotic properties of these techniques are compared and examined, their inter-connections and finite-sample performances are studied and, finally, applications of these methodologies to financial time series are discussed.
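
As one concrete instance of the Whittle branch of this survey, a hedged Python sketch for the simplest ARFIMA(0, d, 0) case (the spectral-density form and the concentrated objective are standard; the function name and bounds are illustrative):

import numpy as np
from scipy.optimize import minimize_scalar

def whittle_d(x):
    # Whittle estimate of the fractional-differencing parameter d for an
    # ARFIMA(0, d, 0) series: minimize the concentrated Whittle objective
    # with spectral density f(w; d) = G * (2 * sin(w / 2)) ** (-2 * d).
    x = np.asarray(x, float)
    n = len(x)
    I = np.abs(np.fft.rfft(x - x.mean())) ** 2 / (2 * np.pi * n)   # periodogram
    w = 2 * np.pi * np.arange(1, n // 2 + 1) / n                   # Fourier frequencies
    I = I[1:n // 2 + 1]

    def objective(d):
        g = (2.0 * np.sin(w / 2.0)) ** (-2.0 * d)
        return np.log(np.mean(I / g)) + np.mean(np.log(g))

    return minimize_scalar(objective, bounds=(-0.49, 0.49), method="bounded").x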

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
DOI: https://doi.org/10.1016/S0731-9053(05)20023-3
ISBN: 978-1-84950-388-4
