Search results

1 – 10 of over 83,000
Article
Publication date: 20 November 2020

Lydie Myriam Marcelle Amelot, Ushad Subadar Agathee and Yuvraj Sunecher

Abstract

Purpose

This study constructs time series models, artificial neural networks (ANNs) and statistical topologies to examine volatility and forecast foreign exchange rates. The Mauritian forex market is used as a case study, and daily data on the nominal spot rate (over a five-year period spanning 2014 to 2018) for EUR/MUR, GBP/MUR, CAD/MUR and AUD/MUR are applied for the predictions.

Design/methodology/approach

Autoregressive integrated moving average (ARIMA) and generalized autoregressive conditional heteroskedasticity (GARCH) models are used as the basis for the time series modelling in the analysis, along with the non-linear autoregressive network with exogenous inputs (NARX) neural network backpropagation algorithm using different training functions, namely the Levenberg–Marquardt (LM), Bayesian regularization and scaled conjugate gradient (SCG) algorithms. The study also features a hybrid kernel principal component analysis (KPCA) with support vector regression (SVR) as an additional statistical tool for financial market forecasting. Mean squared error (MSE) and root mean square error (RMSE) are employed as indicators of model performance.
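
As a rough illustration of the time series side of such a pipeline, the sketch below fits an ARIMA model and a GARCH(1,1) model to a daily exchange-rate series and scores their out-of-sample forecasts by RMSE. The file name, column names, model orders and hold-out size are assumptions for the sketch, not details taken from the study.

```python
# Sketch: ARIMA vs GARCH(1,1) out-of-sample forecasts for a daily FX series,
# scored by RMSE. File/column names and model orders are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model
from sklearn.metrics import mean_squared_error

rates = pd.read_csv("eur_mur_daily.csv", parse_dates=["date"], index_col="date")["rate"]
returns = 100 * np.log(rates).diff().dropna()        # log returns in percent

train, test = returns[:-250], returns[-250:]          # hold out roughly one trading year

# ARIMA on returns (order chosen arbitrarily for the sketch)
arima_fit = ARIMA(train, order=(1, 0, 1)).fit()
arima_pred = arima_fit.forecast(steps=len(test))

# GARCH(1,1) with constant mean; forecast of the conditional mean
garch_fit = arch_model(train, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
garch_pred = garch_fit.forecast(horizon=len(test)).mean.iloc[-1].values

for name, pred in [("ARIMA", arima_pred), ("GARCH", garch_pred)]:
    rmse = np.sqrt(mean_squared_error(test.values, np.asarray(pred)))
    print(f"{name}: RMSE = {rmse:.4f}")
```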

Findings

The results demonstrated that the GARCH model performed better than the ARIMA model in terms of volatility clustering and prediction. For the NARX model, the LM and Bayesian regularization training algorithms proved the most appropriate for forecasting the different currency exchange rates, as they yielded the lowest MSE and RMSE among the training functions. The NARX and KPCA–SVR topologies also outperformed the linear time series models, a result attributed to their grounding in the structural risk minimization principle. Finally, the comparison between the NARX model and KPCA–SVR showed that the NARX model outperformed the statistical prediction model. Overall, the study concluded that the NARX topology achieves better prediction performance than the time series and statistical models.

Research limitations/implications

The foreign exchange market is considered unstable owing to uncertainties in the economic environment of any country; thus, accurate forecasting of foreign exchange rates is crucial for any foreign exchange activity. The study has an important economic implication, as it will help researchers, investors, traders, speculators, financial analysts, users of financial news in banking and financial institutions, money changers, non-banking financial companies and stock exchange institutions in Mauritius to take investment decisions on international portfolios. Moreover, currency rate instability might raise transaction costs and diminish the returns from international trade. Exchange rate volatility raises the need to implement highly organized risk management measures so as to disclose the future trend and movement of foreign currencies, which could act as essential guidance for foreign exchange participants. In this way, they will be more alert before conducting any forex transactions, including hedging, asset pricing or speculation, and can take corrective actions, preventing potential losses and gaining more profit.

Originality/value

This is one of the first studies applying artificial intelligence (AI) while making use of time series modelling, the NARX neural network backpropagation algorithm and hybrid KPCA–SVR to predict forex using multiple currencies in the foreign exchange market in Mauritius.

Details

African Journal of Economic and Management Studies, vol. 12 no. 1
Type: Research Article
ISSN: 2040-0705

Article
Publication date: 30 March 2010

Ricardo de A. Araújo

Abstract

Purpose

The purpose of this paper is to present a new quantum‐inspired evolutionary hybrid intelligent (QIEHI) approach, in order to overcome the random walk dilemma for stock market prediction.

Design/methodology/approach

The proposed QIEHI method is inspired by Takens' theorem and performs a quantum-inspired evolutionary search for the minimum necessary dimension (time lags) embedded in the problem, determining the characteristic phase space that generates the financial time series phenomenon. The approach presented in this paper consists of a quantum-inspired intelligent model composed of an artificial neural network (ANN) with a modified quantum-inspired evolutionary algorithm (MQIEA), which is able to evolve the complete ANN architecture and parameters (pruning process), the ANN training algorithm (used to further improve the ANN parameters supplied by the MQIEA) and the most suitable time lags, to better describe the time series phenomenon.
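
The lag-selection idea can be illustrated without the quantum-inspired machinery. The sketch below builds Takens-style delay embeddings for candidate lag sets and scores each with a small feed-forward network, keeping the set with the lowest validation MSE; it is a plain grid search on a synthetic series, not the authors' MQIEA, and all names and settings are illustrative.

```python
# Sketch: pick a set of time lags for a series by embedding it (Takens-style)
# and scoring each candidate lag set with a small MLP. This is a plain search,
# not the paper's quantum-inspired evolutionary algorithm.
import numpy as np
from itertools import combinations
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def embed(series, lags):
    """Return (X, y) where each row of X holds the value `lag` steps before y."""
    max_lag = max(lags)
    X = np.column_stack([series[max_lag - lag:len(series) - lag] for lag in lags])
    y = series[max_lag:]
    return X, y

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=600))               # stand-in for a price series

best = None
for lag_set in combinations(range(1, 8), 3):            # candidate lag triples
    X, y = embed(series, lag_set)
    split = int(0.8 * len(y))
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    model.fit(X[:split], y[:split])
    mse = mean_squared_error(y[split:], model.predict(X[split:]))
    if best is None or mse < best[1]:
        best = (lag_set, mse)

print("selected lags:", best[0], "validation MSE:", round(best[1], 4))
```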

Findings

This paper finds that the proposed QIEHI method first chooses the better prediction model and then performs a behavioral statistical test to adjust the time phase distortions that appear in financial time series. An experimental analysis is conducted with the proposed approach using six real-world stock market time series, and the results obtained are discussed and compared, according to a group of relevant performance metrics, with results found with multilayer perceptron networks and the previously introduced time-delay added evolutionary forecasting method.

Originality/value

The paper usefully demonstrates how the proposed QIEHI method chooses the best prediction model for the time series representation and performs a behavioral statistical test to adjust the time phase distortions that frequently appear in financial time series.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 3 no. 1
Type: Research Article
ISSN: 1756-378X

Book part
Publication date: 9 November 2023

Michał Bernardelli and Mariusz Próchniak

Abstract

Research Background

The relationship between economic growth and the character of monetary policy is one of the most frequently studied issues in policymaking. However, the number of studies incorporating a dynamic time warping approach to analyse the similarity of macroeconomic variables is relatively small.

The Purpose of the Chapter

The study aims at assessing the mutual similarity among various variables representing the financial sector (including the monetary policy by the central bank) and the real sector (e.g. economic growth, industrial production, household consumption expenditure), as well as cross-similarity between both sectors.

Methodology

The analysis is based on the dynamic time warping (DTW) method, which allows various dimensions of change in the considered variables to be captured. This method is almost absent from the literature comparing financial and economic time series, and its application constitutes the main value added of the research. The analysis includes five variables representing the financial sector and five from the real sector. The study covers four countries (Czechia, Hungary, Poland and Romania) and the 2010–2022 period (quarterly data).
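
A minimal dynamic time warping implementation, shown below, makes the similarity measure concrete: it fills the standard cumulative-cost matrix and returns the alignment cost of two z-scored quarterly series. The two example series are synthetic stand-ins; the chapter's actual variables and country data are not reproduced here.

```python
# Sketch: classic dynamic time warping (DTW) distance between two series.
# Series are z-scored first so the comparison reflects shape, not level.
import numpy as np

def dtw_distance(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Synthetic stand-ins for, e.g., GDP growth and a banking-sector variable
quarters = np.arange(52)                              # 2010-2022, quarterly
gdp = np.sin(quarters / 4.0)
roe = np.sin((quarters - 2) / 4.0) + 0.1 * np.random.default_rng(1).normal(size=52)

print("DTW distance:", round(dtw_distance(gdp, roe), 3))
```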

Findings

The results show that the variables representing the financial sector, including those reflecting monetary policy, are weakly correlated with each other, whereas the variables representing the real economy have a solid mutual similarity. As regards individual variables, GDP fluctuations show relatively substantial similarity to ROE fluctuations, especially in Czechia and Hungary. In Hungary and Romania, CAR fluctuations are consistent with GDP fluctuations, while in Poland and Hungary there is a relatively strong similarity between the economy's monetisation and economic growth. Comparing the individual countries, two clusters can be identified: one includes Poland and Czechia, while the other covers Hungary and Romania.

Details

Modeling Economic Growth in Contemporary Poland
Type: Book
ISBN: 978-1-83753-655-9

Article
Publication date: 17 August 2012

Heping Pan

Abstract

Purpose

The purpose of this study is to discover and model the asymmetry in the price volatility of financial markets, in particular the foreign exchange markets as the first underlying applications.

Design/methodology/approach

The volatility of a financial market price is usually defined as the standard deviation or variance of the price or price returns. This standard definition of volatility is split here into an upper part and a lower part, termed Yang volatility and Yin volatility. Because the definition of yin-yang volatility depends on the time scale, the notion of a scale space of price-time is also introduced.
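
One way to read the split is as upside and downside semideviations of returns at a chosen time scale; the sketch below computes both from daily returns aggregated to that scale. The variable names, the weekly aggregation rule and the synthetic price path are assumptions for illustration, not the author's exact definitions.

```python
# Sketch: split volatility into an "upper" (yang) and "lower" (yin) part,
# computed as semideviations of returns above/below their mean at a given scale.
import numpy as np
import pandas as pd

def yin_yang_volatility(prices: pd.Series, scale: str = "W"):
    returns = np.log(prices).diff().dropna()
    returns = returns.resample(scale).sum()            # aggregate to the chosen time scale
    mu = returns.mean()
    up, down = returns[returns > mu] - mu, mu - returns[returns < mu]
    yang = np.sqrt((up ** 2).sum() / len(returns))     # upside semideviation
    yin = np.sqrt((down ** 2).sum() / len(returns))    # downside semideviation
    return yang, yin

# Synthetic daily price path as a stand-in for an FX rate
idx = pd.date_range("2020-01-01", periods=500, freq="B")
steps = np.random.default_rng(7).normal(0, 0.006, 500)
prices = pd.Series(100 * np.exp(np.cumsum(steps)), index=idx)

yang, yin = yin_yang_volatility(prices, scale="W")
print(f"yang (upside) = {yang:.4f}, yin (downside) = {yin:.4f}, asymmetry = {yang - yin:+.4f}")
```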

Findings

It turns out that the duality of yin-yang volatility expresses not only the asymmetry of price volatility but also information about the trend. The yin-yang volatilities in the scale space of price-time provide a complete representation of the information about multi-level trends and asymmetric volatilities. Such a representation is useful for designing strategies in market risk management and technical trading. A trading robot (a complete automated trading system) was developed using yin-yang volatility, and its performance is shown to be non-trivial. The notion and model of yin-yang volatility open up new possibilities to rewrite option pricing formulas and GARCH models, as well as to develop new comprehensive models for foreign exchange markets.

Research limitations/implications

The asymmetry of price volatility and the magnitude of volatility in the scale space of price-time have yet to be united in a more coherent model.

Practical implications

The new model of yin‐yang volatility and scale space of price‐time provides a new theoretical structure for financial market risk. It is likely to enable a new generation of core technologies for market risk management and technical trading strategies.

Originality/value

This work is original. The new notion and model of yin-yang volatility in the scale space of price-time opens up the core structure of financial market risk. It is likely to enable new possibilities such as: a new portfolio theory with a new objective function minimizing the sum of the absolute yin-volatilities of the asset returns, a new option pricing theory using yin-yang volatility in place of symmetric volatility, a new GARCH model aiming to model the dynamics of yin-yang volatility instead of symmetric volatility, and new technical trading strategies as shown in the paper.

Article
Publication date: 24 August 2021

N. Prabakaran, Rajasekaran Palaniappan, R. Kannadasan, Satya Vinay Dudi and V. Sasidhar

Abstract

Purpose

We propose a Machine Learning (ML) approach that is trained on the available financial data, learns the trends in the data and then uses the acquired knowledge for more accurate forecasting of financial series. This work provides more precise results than older financial series forecasting algorithms. The LSTM Classic is used to forecast the momentum of the Financial Series Index and is also applied to its commodities. The network is trained and evaluated for accuracy with various sizes of data sets, i.e. weekly historical data of MCX, GOLD and COPPER, and the results are calculated.

Design/methodology/approach

We design an LSTM model for script price forecasting from the perspective of minimizing MSE. The approach we follow is: (1) acquire the dataset; (2) define the training and testing columns in the dataset; (3) transform the input values using a scaler; (4) define the custom loss function; (5) build and compile the model; (6) visualise the improvements in results.
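
These six steps map naturally onto a short Keras script. The sketch below follows them with a simple custom loss that penalises errors more heavily when the predicted direction of the price change is wrong; the data file, column name, window length and loss weighting are all assumptions, not details from the paper.

```python
# Sketch of the six-step LSTM workflow with a custom loss (illustrative names/values).
import numpy as np
import pandas as pd
import tensorflow as tf
from sklearn.preprocessing import StandardScaler

# (1) acquire the dataset and (2) choose the column to forecast
close = pd.read_csv("mcx_weekly.csv")["close"].values.reshape(-1, 1)
returns = np.diff(close, axis=0)                      # work on price changes so sign matters

# (3) scale inputs and build sliding windows
scaler = StandardScaler()
scaled = scaler.fit_transform(returns)
window = 10
X = np.array([scaled[i:i + window] for i in range(len(scaled) - window)])
y = scaled[window:]
split = int(0.8 * len(X))

# (4) custom loss: squared error, weighted up when the predicted direction is wrong
def directional_mse(y_true, y_pred):
    err = tf.square(y_true - y_pred)
    wrong = tf.cast(tf.not_equal(tf.sign(y_true), tf.sign(y_pred)), tf.float32)
    return tf.reduce_mean(err * (1.0 + wrong))

# (5) build and compile the model
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss=directional_mse)

history = model.fit(X[:split], y[:split], epochs=20,
                    validation_data=(X[split:], y[split:]), verbose=0)

# (6) inspect the improvement in loss per epoch
for epoch, loss in enumerate(history.history["val_loss"], start=1):
    print(f"epoch {epoch}: validation loss {loss:.5f}")
```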

Findings

Financial series trading is one of the oldest practices, in which a trader trades financial scripts, does business and earns wealth from companies that sell a part of their business on a trading platform. Forecasting financial script prices is a complex task that involves extensive human–computer interaction. Due to the correlated nature of financial series prices, conventional batch-processing methods such as artificial neural networks and convolutional neural networks cannot be utilised efficiently for financial market analysis. We propose an online learning algorithm that utilises an upgraded variant of recurrent neural networks called long short-term memory Classic (LSTM Classic). The LSTM Classic differs from a normal LSTM in that it has a customised loss function. It avoids long-term dependency issues because of its unique internal storage unit structure, which helps it forecast financial time series. The Financial Series Index is a combination of various commodities (time series), which makes it more reliable than an individual financial time series, as it does not change drastically even if some of its commodities are affected. This work provides more precise results than older financial series forecasting algorithms.

Originality/value

We built the customised-loss-function model using the LSTM scheme and experimented on the MCX index as well as its commodities; improvements in results are calculated for every epoch run over all rows in the dataset, and the improvement in loss can be visualised for each epoch. A further improvement to the model would recognise that the relationship between price difference and directional loss is specific to each financial script; deeper evaluation could identify the best combination for a particular stock to obtain better results.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 14 no. 4
Type: Research Article
ISSN: 1756-378X

Book part
Publication date: 30 November 2011

Massimo Guidolin

Abstract

I review the burgeoning literature on applications of Markov regime switching models in empirical finance. In particular, distinct attention is devoted to the ability of Markov switching models to fit the data, to filter unknown regimes and states on the basis of the data and to provide a powerful tool for testing hypotheses formulated in light of financial theories, as well as to their forecasting performance with reference to both point and density predictions. The review covers papers from a multiplicity of sub-fields in financial economics, ranging from empirical analyses of stock returns and the term structure of default-free interest rates to the dynamics of exchange rates and the joint process of stock and bond returns.
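
For readers unfamiliar with the mechanics, the sketch below fits a two-regime Markov switching model to a simulated return series with statsmodels and prints the smoothed regime probabilities. The data and the specification (two regimes, switching mean and variance) are just an example, not a model from any of the reviewed papers.

```python
# Sketch: two-regime Markov switching model (switching mean and variance)
# fitted to a simulated return series with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
returns = pd.Series(
    np.concatenate([rng.normal(0.05, 0.5, 300),    # calm regime
                    rng.normal(-0.10, 2.0, 100),   # crisis regime
                    rng.normal(0.05, 0.5, 300)]),
    index=pd.date_range("2000-01-03", periods=700, freq="B"),
)

model = sm.tsa.MarkovRegression(returns, k_regimes=2, trend="c", switching_variance=True)
result = model.fit()

print(result.summary())
# Smoothed (full-sample) probability of being in regime 1 at each date
print(result.smoothed_marginal_probabilities[1].head())
```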

Details

Missing Data Methods: Time-Series Methods and Applications
Type: Book
ISBN: 978-1-78052-526-6

Article
Publication date: 2 August 2011

Abdullahi D. Ahmed and Abu N.M. Wahid

Abstract

Purpose

This paper aims to use the newly developed panel data cointegration analysis and the dynamic time series modeling approach to examine the linkages between financial structure (market‐based vs bank‐based) and economic growth in African economies.

Design/methodology/approach

The research investigates the dynamic relationship between financial structure and economic growth in a panel of seven African developing countries over the period 1986–2007. The paper uses various indicators/measures of financial structure and the financial system, employs traditional time-series causality analysis as well as the newly developed panel unit root and cointegration techniques, and estimates the finance-growth relationship using fully modified OLS (FMOLS) for a heterogeneous panel.
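
The country-by-country time-series leg of such an analysis can be illustrated with standard tools: the sketch below runs an Engle–Granger cointegration test and a Granger causality test between a financial-development proxy and output for one country. The file and series names are placeholders, and the paper's panel unit root and panel FMOLS estimators are not reproduced here.

```python
# Sketch: country-level cointegration and Granger causality between a
# financial-development indicator and output (column names are placeholders).
import pandas as pd
from statsmodels.tsa.stattools import coint, grangercausalitytests

df = pd.read_csv("country_annual.csv", index_col="year")   # e.g. 1986-2007 annual data
finance = df["private_credit_to_gdp"]
output = df["log_real_gdp"]

# Engle-Granger test: null hypothesis of no cointegration
t_stat, p_value, _ = coint(output, finance)
print(f"cointegration test: t = {t_stat:.3f}, p = {p_value:.3f}")

# Granger causality: does the finance indicator help predict output growth? (up to 2 lags)
growth = pd.concat([output.diff(), finance.diff()], axis=1).dropna()
grangercausalitytests(growth, maxlag=2)
```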

Findings

From the dynamic heterogeneous panel approach, the paper first finds that a market-based financial system is important for explaining output growth through enhancing efficiency and productivity. Second, the authors' empirical evidence supports the view that higher levels of banking system development are positively associated with capital accumulation growth and lead to faster rates of economic growth.

Originality/value

Panel cointegration, group-mean panel FMOLS and country-by-country time-series investigations indicate that the market-based financial system is important for explaining output growth through enhancing efficiency and productivity, whereas the development of the banking system is significantly associated with capital accumulation growth. Further results from the time-series approach show evidence of unidirectional causality running from both market-oriented and bank-oriented financial systems to economic growth.

Details

Journal of Economic Studies, vol. 38 no. 3
Type: Research Article
ISSN: 0144-3585

Book part
Publication date: 28 March 2022

Kristina Bojare

Abstract

Introduction: The Great Financial Crisis of 2008 highlighted the importance of financial cycle fluctuations. While the regulatory response was to mandate higher bank capital requirements during the financial cycle upswing, academic research focussed on identifying the best-performing early warning indicators to forecast financial cycle fluctuations, which have proven to be often unrelated to business cycle changes. To safeguard the global financial system against financial cycle fluctuations, the Basel Committee on Banking Supervision, based on the first strands of empirical evidence, proposed the credit-to-GDP gap as the headline indicator tied to the countercyclical capital buffer. However, later research on this indicator identified certain concerns, among them subpar performance for economies with short available data series.

Aim of the Study: To this end, this study aims to analyse various financial cycle indicators from the unique perspective of their potential viability under limited historical data availability.

Methods: For this purpose, a meta-study of existing research is carried out, as well as an empirical study comparing the performance of selected indicators for a sample of six countries in the Central, Eastern and South-Eastern European region, where long data series are not available.

Main Findings: It was found that certain approaches, among them calculation of the raw credit growth rate and application of the Hamilton filter, can supplement or possibly even outperform the Basel credit-to-GDP gap indicator under limited data availability.
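
The Hamilton filter mentioned here is itself straightforward to reproduce: regress the credit-to-GDP ratio h quarters ahead on a constant and its p most recent values, and take the residual as the cyclical component (h = 8 and p = 4 are the settings commonly used for quarterly data). The sketch below is a generic implementation on a placeholder series, not the chapter's exact calculation.

```python
# Sketch: Hamilton (2018) regression filter for a quarterly credit-to-GDP ratio.
# The "gap" is the residual from regressing y[t+h] on a constant and y[t..t-p+1].
import numpy as np
import pandas as pd
import statsmodels.api as sm

def hamilton_filter(y: pd.Series, h: int = 8, p: int = 4) -> pd.Series:
    lagged = pd.concat([y.shift(h + i) for i in range(p)], axis=1)
    lagged.columns = [f"lag_{h + i}" for i in range(p)]
    data = pd.concat([y, lagged], axis=1).dropna()
    X = sm.add_constant(data.iloc[:, 1:])
    result = sm.OLS(data.iloc[:, 0], X).fit()
    return result.resid                    # cyclical component ("gap")

# Placeholder series: credit-to-GDP ratio, quarterly, short sample
idx = pd.period_range("2005Q1", "2022Q4", freq="Q")
ratio = pd.Series(60 + np.cumsum(np.random.default_rng(3).normal(0, 0.8, len(idx))), index=idx)

gap = hamilton_filter(ratio)
print(gap.tail())
```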

Conclusion: The author concludes that for limited time series, the Basel credit-to-GDP gap can potentially be outperformed by other indicators, and that further research in this currently under-studied field is warranted.

Originality of the Paper: Using various financial cycle indicators that have already proven their early-warning predictive power in previous research, this study focusses on their potential viability under limited historical data availability. The findings may supplement policy-makers' toolkits with complementary indicators where no long time series is available for financial cycle estimation, for example in countries that entered market economies relatively late.

Details

Managing Risk and Decision Making in Times of Economic Distress, Part B
Type: Book
ISBN: 978-1-80262-971-2

Article
Publication date: 25 March 2022

Fatemeh Yazdani, Mehdi Khashei and Seyed Reza Hejazi

Abstract

Purpose

This paper aims to detect the most profitable, i.e. optimal, turning points (TPs) from the history of a time series using a binary integer programming (BIP) model. The TP prediction problem is one of the most popular yet challenging topics in financial planning. Predicting profitable TPs earns profit by offering the opportunity to buy low and sell high. TPs detected from the history of the time series are used as the prediction model's input, and according to the literature, the predicted TPs' profitability depends on the detected TPs' profitability. Research on improving the profitability of detection methods has therefore never stopped. Nevertheless, to the best of our knowledge, none of the existing methods can detect the optimal TPs.

Design/methodology/approach

The objective function of our model maximizes the profit of adopting all the trading strategies. The decision variables represent whether or not to detect the breakpoints as TPs. The assumptions of the model are as follows: short-selling is possible; the time value of money is not considered; detection of consecutive buying (selling) TPs is not possible.
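
A toy version of such a formulation, under the same assumptions (short-selling allowed, no time value of money, positions alternate), can be written as a small binary program: hold a binary long/short position in each period, pay a transaction cost whenever the position flips, and read the turning points off the position changes. The sketch below uses PuLP and random prices; it is a simplified illustration, not the authors' exact model.

```python
# Sketch: toy binary program for "optimal" turning points.
# z[t] = 1 means long, 0 means short in period t; a position flip is a turning
# point and costs `fee`. Period-t profit is (2*z[t]-1) * (p[t+1]-p[t]).
import numpy as np
import pulp

rng = np.random.default_rng(5)
p = np.cumsum(rng.normal(0, 1, 60)) + 100        # toy price path
fee = 0.5                                        # transaction cost per flip (only input parameter)
T = len(p) - 1

prob = pulp.LpProblem("turning_points", pulp.LpMaximize)
z = [pulp.LpVariable(f"z_{t}", cat="Binary") for t in range(T)]
flip = [pulp.LpVariable(f"flip_{t}", cat="Binary") for t in range(1, T)]

# flip[t] >= |z[t] - z[t-1]| (linearised absolute difference of binaries)
for t in range(1, T):
    prob += flip[t - 1] >= z[t] - z[t - 1]
    prob += flip[t - 1] >= z[t - 1] - z[t]

profit = pulp.lpSum((2 * z[t] - 1) * float(p[t + 1] - p[t]) for t in range(T))
prob += profit - fee * pulp.lpSum(flip)          # objective: profit minus transaction costs

prob.solve(pulp.PULP_CBC_CMD(msg=False))
positions = [int(v.value()) for v in z]
turning_points = [t for t in range(1, T) if positions[t] != positions[t - 1]]
print("objective:", pulp.value(prob.objective))
print("turning points at periods:", turning_points)
```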

Findings

Empirical results with 20 data sets from the Shanghai Stock Exchange indicate that the model detects the optimal TPs.

Originality/value

The proposed model, in contrast to the other methods, can detect the optimal TPs. Additionally, it requires the transaction cost as its only input parameter, which reduces the computations required.

Details

Journal of Modelling in Management, vol. 18 no. 5
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 4 February 2021

Erkki K. Laitinen

Abstract

Purpose

The purpose of this study is to analyze business-failure-process risk from two perspectives. First, a simplified model of the loss-generation process in a failing firm is developed to show that the linear system embedded in accounting makes financial ratios depend linearly on each other. Second, a simplified model of the development of risk during the failure process is developed to introduce a new concept, the failure-process-risk line (FPRL), to assess the systematic failure risk of a firm. Empirical evidence from Finnish firms is used to test two hypotheses.

Design/methodology/approach

This study makes use of simple mathematical modeling to depict the loss-generation process and the development of failure risk during the failure process. Hypotheses are extracted from the mathematical results for empirical testing. Time-series data from 13,082 non-failing and 515 failing Finnish firms are used to test the hypotheses. Analysis of variance F statistics and the Mann–Whitney U test are used in testing the hypotheses.
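
Both tests named here are available in SciPy; the sketch below compares a statistic (for example, a ratio's time-series correlation) across failing and non-failing groups with an ANOVA F test and a Mann-Whitney U test. The group values are fabricated placeholders, with only the sample sizes taken from the abstract.

```python
# Sketch: compare a statistic (e.g. a time-series correlation of ratios)
# between failing and non-failing firms with an ANOVA F test and Mann-Whitney U.
import numpy as np
from scipy.stats import f_oneway, mannwhitneyu

rng = np.random.default_rng(11)
failing = rng.normal(0.65, 0.15, 515)          # placeholder values for failing firms
non_failing = rng.normal(0.40, 0.20, 13_082)   # placeholder values for non-failing firms

f_stat, f_p = f_oneway(failing, non_failing)
u_stat, u_p = mannwhitneyu(failing, non_failing, alternative="two-sided")

print(f"ANOVA F = {f_stat:.2f} (p = {f_p:.3g})")
print(f"Mann-Whitney U = {u_stat:.0f} (p = {u_p:.3g})")
```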

Findings

The findings show that the linear time-series correlations are generally higher in failing than in non-failing firms because of the loss-generation process. The FPRL efficiently depicted the systematic failure-process risk through the beta coefficient, which efficiently discriminated between failing and non-failing firms. The difference between the last-period risk estimate and the FPRL was largely determined by the approximated growth rate of the periodic failure risk.

Research limitations/implications

The loss-generation process is based on a simple cash-based approach ignoring the growth of the firm. In future research, the model could be generalized to a growing firm in an accrual-based framework. The failure-process risk is assumed to grow at a constant rate. In further studies, more general models could be applied. Empirical analyses are based on simple statistical methods and tests. More advanced methods could be used to analyze the data.

Practical implications

This study shows that the failure process makes the time-series correlations between financial ratios increase, making their failure signals consistent and allowing the use of static classification models to assess failure risk. The beta coefficient is a useful tool for reflecting systematic failure-process risk and can be used in practice to warn a firm about an ongoing failure process.

Originality/value

To the best of the author's knowledge, this is the first study to analyze business-failure-process risk systematically. It is the first to introduce a mathematical loss-generation process and the FPRL, based on the beta coefficient, for assessing systematic failure risk.

Details

Journal of Financial Reporting and Accounting, vol. 19 no. 4
Type: Research Article
ISSN: 1985-2517
