Search results

1 – 10 of 814
Book part
Publication date: 19 November 2014

Garland Durham and John Geweke

Massively parallel desktop computing capabilities now well within the reach of individual academics modify the environment for posterior simulation in fundamental and potentially…

Abstract

Massively parallel desktop computing capabilities now well within the reach of individual academics modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
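
To make the recipe concrete, here is a minimal sketch of a sequential posterior simulator of the kind described, applied to a toy normal-mean model: it uses only draws from the prior and evaluations of prior and data densities, and it accumulates a marginal likelihood estimate as a by-product. The model, particle count, resampling threshold and random-walk move are illustrative assumptions, not the authors' algorithm or tuning.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Toy model (an illustrative assumption): y_t ~ N(mu, 1) with prior mu ~ N(0, 10^2).
y = rng.normal(1.5, 1.0, size=200)                # simulated data
prior = stats.norm(0.0, 10.0)

N = 2000                                          # number of particles
mu = rng.normal(0.0, 10.0, N)                     # 1) simulate from the prior
W = np.full(N, 1.0 / N)                           # normalized particle weights
log_ml = 0.0                                      # running log marginal likelihood

def log_post(m, t):
    """Log prior plus log likelihood of the first t observations, vectorized over particles."""
    return prior.logpdf(m) + stats.norm.logpdf(y[:t], m[:, None], 1.0).sum(axis=1)

for t in range(len(y)):
    inc = stats.norm.pdf(y[t], mu, 1.0)           # 2) likelihood of the new observation
    log_ml += np.log(np.sum(W * inc))             #    estimate of p(y_t | y_1..t-1)
    W *= inc
    W /= W.sum()
    if 1.0 / np.sum(W**2) < N / 2:                # 3) resample when the effective sample size drops
        mu = rng.choice(mu, size=N, p=W)
        W[:] = 1.0 / N
        prop = mu + 0.5 * rng.standard_normal(N)  # 4) Metropolis move to rejuvenate particles
        accept = np.log(rng.random(N)) < log_post(prop, t + 1) - log_post(mu, t + 1)
        mu = np.where(accept, prop, mu)

print("posterior mean:", np.sum(W * mu), " log marginal likelihood:", log_ml)
```

The reweighting, density evaluations and Metropolis moves are independent across particles, which is why this design maps naturally onto massively parallel hardware.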

Article
Publication date: 1 November 2001

J.G. Marakis, J. Chamiço, G. Brenner and F. Durst

Notes that, in a full‐scale application of the Monte Carlo method for combined heat transfer analysis, problems usually arise from the large computing requirements. Here the…

Abstract

Notes that, in a full-scale application of the Monte Carlo method for combined heat transfer analysis, problems usually arise from the large computing requirements. Here the method to overcome this difficulty is the parallel execution of the Monte Carlo method in a distributed computing environment. Addresses the problem of determining the temperature field formed under the assumption of radiative equilibrium in an enclosure idealizing an industrial furnace. The medium contained in this enclosure absorbs, emits and scatters thermal radiation anisotropically. Discusses two topics in detail: first, the efficiency of the parallelization of the developed code, and second, the influence of the scattering behavior of the medium. The parallelization method adopted for the first topic is the decomposition of the statistical sample and its subsequent distribution among the available processors. The measured high efficiencies showed that this method is particularly suited to the target architecture of this study, which is a dedicated network of workstations supporting the message-passing paradigm. For the second topic, the results showed that taking isotropic scattering into account, as opposed to neglecting scattering, has a pronounced impact on the temperature distribution inside the enclosure. In contrast, the consideration of the sharply forward scattering that is characteristic of all real combustion particles leaves the predicted temperature field almost indistinguishable from the absorbing/emitting case.
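
The parallelization strategy described, decomposing the statistical sample and distributing it among the available processors, can be sketched in a few lines; here Python's multiprocessing stands in for the message-passing workstation network of the paper, and the integrand is a toy placeholder rather than an actual photon-tracing kernel.

```python
import numpy as np
from multiprocessing import Pool

def partial_estimate(args):
    """Evaluate a private batch of Monte Carlo samples and return its partial sum."""
    n_samples, seed = args
    rng = np.random.default_rng(seed)
    # Stand-in integrand over the unit cube; a radiative transfer code would instead
    # trace photon bundles through the enclosure here.
    x = rng.random((n_samples, 3))
    return np.exp(-x.sum(axis=1)).sum()

if __name__ == "__main__":
    n_total, n_workers = 4_000_000, 8
    chunks = [(n_total // n_workers, seed) for seed in range(n_workers)]
    with Pool(n_workers) as pool:                       # sample decomposition:
        partials = pool.map(partial_estimate, chunks)   # each worker gets its own batch
    estimate = sum(partials) / n_total                  # combine the partial sums
    print(f"Monte Carlo estimate: {estimate:.6f}")
```

Because the batches are statistically independent, the only communication is the final reduction of the partial sums, which is what makes this decomposition scale so well.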

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 11 no. 7
Type: Research Article
ISSN: 0961-5539


Book part
Publication date: 19 November 2014

Martin Burda

The BEKK GARCH class of models presents a popular set of tools for applied analysis of dynamic conditional covariances. Within this class the analyst faces a range of model…

Abstract

The BEKK GARCH class of models presents a popular set of tools for applied analysis of dynamic conditional covariances. Within this class the analyst faces a range of model choices that trade off flexibility with parameter parsimony. In the most flexible unrestricted BEKK the parameter dimensionality increases quickly with the number of variables. Covariance targeting decreases model dimensionality but induces a set of nonlinear constraints on the underlying parameter space that are difficult to implement. Recently, the rotated BEKK (RBEKK) has been proposed whereby a targeted BEKK model is applied after the spectral decomposition of the conditional covariance matrix. An easily estimable RBEKK implies a full albeit constrained BEKK for the unrotated returns. However, the degree of the implied restrictiveness is currently unknown. In this paper, we suggest a Bayesian approach to estimation of the BEKK model with targeting based on Constrained Hamiltonian Monte Carlo (CHMC). We take advantage of suitable parallelization of the problem within CHMC utilizing the newly available computing power of multi-core CPUs and Graphical Processing Units (GPUs) that enables us to deal effectively with the inherent nonlinear constraints posed by covariance targeting in relatively high dimensions. Using parallel CHMC we perform a model comparison in terms of predictive ability of the targeted BEKK with the RBEKK in the context of an application concerning a multivariate dynamic volatility analysis of a Dow Jones Industrial returns portfolio. Although the RBEKK does improve over a diagonal BEKK restriction, it is clearly dominated by the full targeted BEKK model.
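
As a brief illustration of what covariance targeting means for the BEKK recursion, the sketch below fixes the intercept so that the model's unconditional covariance equals the sample covariance; the requirement that this implied intercept be positive semidefinite is exactly the kind of nonlinear constraint the chapter handles with constrained HMC. The dimension, simulated returns and parameter matrices are illustrative assumptions, not the chapter's specification or estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: n assets, T returns, and arbitrary (small) BEKK parameter matrices.
n, T = 3, 500
returns = rng.standard_normal((T, n)) * 0.01
S = np.cov(returns, rowvar=False)                 # sample covariance (targeting moment)
A = 0.25 * np.eye(n)                              # ARCH loading (assumed values)
B = 0.90 * np.eye(n)                              # GARCH loading (assumed values)

# Covariance targeting: choose the intercept so the unconditional covariance equals S.
intercept = S - A.T @ S @ A - B.T @ S @ B
if np.linalg.eigvalsh(intercept).min() < 0:
    raise ValueError("Targeting constraint violated: implied intercept is not PSD.")

# Conditional covariance recursion H_t = intercept + A' r_{t-1} r_{t-1}' A + B' H_{t-1} B.
H = S.copy()
for t in range(1, T):
    r = returns[t - 1][:, None]
    H = intercept + A.T @ (r @ r.T) @ A + B.T @ H @ B
print("last conditional covariance:\n", H)
```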

Details

Bayesian Model Comparison
Type: Book
ISBN: 978-1-78441-185-5


Article
Publication date: 3 August 2020

Swapnali Makdey, Rajendra Patrikar and Mohammad Farukh Hashmi

A “spin-diode” is the spintronics equivalent of an electrical diode: applying an external magnetic field greater than the limit of spin-diode BT flips the spin-diode between an…

Abstract

Purpose

A “spin-diode” is the spintronics equivalent of an electrical diode: applying an external magnetic field greater than the spin-diode threshold BT flips the spin-diode between an insulating state and a conducting state [1]. While conventional electrical diodes are two-terminal devices in which the electrical current between the two terminals is modulated by an electric field, these two-terminal magnetoresistive devices can generally be referred to as “spin-diodes”, in which a magnetic field modulates the electrical current between the two terminals.

Design/methodology/approach

Current modulation and rectification are important subjects in electronics as well as in spintronics. A spin diode is a two-terminal magnetoresistive device whose resistance changes in response to an applied magnetic field; this magnetoresistance arises from a variety of phenomena and with varying magnitudes and directions.

Findings

In this paper, an efficient rectifying spin diode is introduced. The resulting spin diode is formed from graphene, gallium and indium quantum dots, and antimony-doped molybdenum disulfide. Converting an alternating bias voltage to direct current is the main achievement of this model device, with the additional benefit of a rectified spin current. Non-equilibrium density functional theory with a Monte Carlo sampling method is used to evaluate the flow of electrons and the rectification ratio of the system.
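
As a toy illustration of the last step, the sketch below uses plain Monte Carlo integration of a Landauer-type current expression to obtain a rectification ratio; the transmission function, bias window and temperature are invented stand-ins and bear no relation to the actual density functional calculation for the graphene/quantum-dot/MoS2 device.

```python
import numpy as np

rng = np.random.default_rng(2)
kT = 0.025  # thermal energy in eV (room temperature, an assumed value)

def fermi(E, mu):
    return 1.0 / (1.0 + np.exp((E - mu) / kT))

def transmission(E, bias):
    """Toy, bias-asymmetric transmission (a made-up stand-in for the real device)."""
    return 1.0 / (1.0 + np.exp(-(E - 0.1 * np.sign(bias)) / 0.05))

def current(bias, n_samples=200_000, window=(-1.0, 1.0)):
    """Monte Carlo estimate of I(V) proportional to the integral of T(E)[f_L(E) - f_R(E)] dE."""
    E = rng.uniform(*window, n_samples)
    integrand = transmission(E, bias) * (fermi(E, +bias / 2) - fermi(E, -bias / 2))
    return (window[1] - window[0]) * integrand.mean()

I_fwd, I_rev = current(+0.3), current(-0.3)
print(f"I(+0.3 V) = {I_fwd:.4f}, I(-0.3 V) = {I_rev:.4f}, "
      f"rectification ratio = {abs(I_fwd / I_rev):.2f}")
```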

Originality/value

The results indicate that a spin diode displaying both spin-current and charge-current rectification should be possible and may find practical application in nanoscale devices that combine logic and memory functions.

Details

Circuit World, vol. 47 no. 4
Type: Research Article
ISSN: 0305-6120


Article
Publication date: 16 October 2009

Dongqing Zhang, Xuanxi Ning and Xueni Liu

As the conventional multistep‐ahead prediction may be unsuitable in some cases, the purpose of this paper is to propose a novel method based on joint probability distributions…


Abstract

Purpose

As the conventional multistep-ahead prediction may be unsuitable in some cases, the purpose of this paper is to propose a novel method based on joint probability distributions, which provides the most probable estimate of the predicted trajectory.

Design/methodology/approach

Many real time series can be modeled as hidden Markov models. In order to predict these time series online, the sequential Monte Carlo (SMC) method is applied for joint multistep-ahead prediction.

Findings

The data of monthly national air passengers in China are analyzed, and the experimental results demonstrate that the method proposed and the corresponding online algorithms are effective.

Research limitations/implications

In this paper, the SMC method is applied for joint multistep-ahead prediction. However, as the prediction step increases, the number of particles required grows exponentially, which means that the prediction horizon cannot be too large.
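
A minimal sketch of joint multistep-ahead prediction with a particle filter, on a toy linear-Gaussian state-space model: filtered particles are propagated forward as whole trajectories, and the "most probable" trajectory is taken as the sample with the highest estimated joint density. The model, the kernel density heuristic and all tuning values are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(3)

# Toy hidden Markov (state-space) model, purely illustrative:
#   x_t = 0.9 x_{t-1} + w_t,  w_t ~ N(0, 0.5^2);   y_t = x_t + v_t,  v_t ~ N(0, 1)
T, N, K = 100, 2000, 3                      # observations, particles, prediction steps
x_true = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + 0.5 * rng.standard_normal()
    y[t] = x_true[t] + rng.standard_normal()

# Bootstrap particle filter up to time T (filtering step).
particles = rng.normal(0, 1, N)
for t in range(1, T):
    particles = 0.9 * particles + 0.5 * rng.standard_normal(N)
    w = norm.pdf(y[t], particles, 1.0)
    particles = rng.choice(particles, size=N, p=w / w.sum())

# Joint K-step-ahead prediction: simulate whole trajectories from the filtered particles,
# then take the trajectory with the highest estimated joint density as the most probable path.
traj = np.empty((K, N))
state = particles.copy()
for k in range(K):
    state = 0.9 * state + 0.5 * rng.standard_normal(N)
    traj[k] = state
kde = gaussian_kde(traj)                    # joint density over the K-step trajectory
best = traj[:, np.argmax(kde(traj))]
print("most probable K-step trajectory:", best)
```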

Practical implications

Very useful advice for researchers who study time-series forecasts.

Originality/value

A novel method of multistep-ahead prediction based on joint probability distributions is proposed, and the SMC method is applied to predict time series online. This paper is aimed at researchers who focus on time-series forecasts.

Details

Kybernetes, vol. 38 no. 10
Type: Research Article
ISSN: 0368-492X


Article
Publication date: 1 March 2004

Xingbin Yu and Chanan Singh

This paper proposes a method of probabilistic security analysis of power systems including protection system failures. Protection system failure is the main cause of cascading…


Abstract

This paper proposes a method of probabilistic security analysis of power systems that includes protection system failures. Protection system failure is the main cause of cascading outages. A protection system reliability model including two major protection failure modes is adopted to demonstrate the effects of different protection failure modes on power system reliability. The mechanism and scheme of the protection system have been analyzed for their contribution to cascading outages as well as to system stability after a fault occurs. All contingencies and responses in the power system are depicted in their inherent stochastic manner; therefore, all simulations in this paper contain the features of a real power system. A non-sequential Monte Carlo simulation approach is used to implement the stochastic properties of component contingencies and protection system failures. The WSCC-9 bus system is used as the security test system. The security index “probability of stability” is calculated to quantify the vulnerability of a power system under cascading outages.
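
A schematic sketch of non-sequential Monte Carlo sampling for such a security study: each replication draws one stochastic system snapshot, including two protection failure modes, and a stability check is applied to estimate the "probability of stability" index. The outage probabilities and the stand-in stability rule are illustrative assumptions; a study like this one would instead run transient stability simulations on the WSCC-9 system.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative component data (not the WSCC-9 parameters): outage and protection-failure
# probabilities for a handful of protected lines.
n_lines, n_samples = 9, 200_000
p_fault = np.full(n_lines, 0.02)          # probability a fault occurs on each line
p_fail_to_trip = np.full(n_lines, 0.05)   # protection failure mode 1: breaker fails to clear
p_undesired_trip = np.full(n_lines, 0.01) # protection failure mode 2: unwanted tripping

stable = 0
for _ in range(n_samples):
    # Non-sequential sampling: draw one system snapshot, with no chronology.
    faulted = rng.random(n_lines) < p_fault
    stuck = faulted & (rng.random(n_lines) < p_fail_to_trip)
    sympathetic = rng.random(n_lines) < p_undesired_trip
    lines_out = faulted | sympathetic | stuck
    # Stand-in stability rule (a real study would run a transient stability simulation):
    # the snapshot counts as stable if at most one line is lost and no breaker is stuck.
    if lines_out.sum() <= 1 and not stuck.any():
        stable += 1

print("probability of stability ~", stable / n_samples)
```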

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 23 no. 1
Type: Research Article
ISSN: 0332-1649


Article
Publication date: 5 July 2022

Firano Zakaria and Anass Benbachir

One of the crucial issues in contemporary finance is the prediction of the volatility of financial assets. In this paper, the authors are interested in modelling the…

Abstract

Purpose

One of the crucial issues in contemporary finance is the prediction of the volatility of financial assets. In this paper, the authors are interested in modelling the stochastic volatility of the MAD/EURO and MAD/USD exchange rates.

Design/methodology/approach

For this purpose, the authors have adopted a Bayesian approach based on the MCMC (Markov chain Monte Carlo) algorithm, which permits reproduction of the main stylized empirical facts of the assets studied. The data used in this study are the daily historical series of the MAD/EURO and MAD/USD exchange rates covering the period from February 2, 2000, to March 3, 2017, which represents 4,456 observations.
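
For readers who want a concrete starting point, below is a minimal sketch of one way to estimate a basic stochastic volatility model by MCMC, namely a particle-marginal Metropolis-Hastings sampler with random-walk proposals and flat priors, run on simulated returns. The actual MAD/EURO and MAD/USD data, the priors and the sampler used in the paper are not reproduced; every numerical choice here is an assumption.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated daily returns from a basic SV model (a stand-in for the exchange rate series):
#   h_t = mu + phi (h_{t-1} - mu) + sigma eta_t,   r_t = exp(h_t / 2) eps_t
T = 500
mu_true, phi_true, sig_true = -9.0, 0.97, 0.15
h = np.empty(T); h[0] = mu_true
for t in range(1, T):
    h[t] = mu_true + phi_true * (h[t - 1] - mu_true) + sig_true * rng.standard_normal()
r = np.exp(h / 2) * rng.standard_normal(T)

def pf_loglik(mu, phi, sigma, n_part=300):
    """Bootstrap particle filter estimate of the SV log-likelihood."""
    x = rng.normal(mu, sigma / np.sqrt(1 - phi**2), n_part)   # stationary initialization
    ll = 0.0
    for t in range(T):
        x = mu + phi * (x - mu) + sigma * rng.standard_normal(n_part)
        logw = -0.5 * (np.log(2 * np.pi) + x + r[t] ** 2 * np.exp(-x))
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())
        x = rng.choice(x, size=n_part, p=w / w.sum())
    return ll

# Particle-marginal Metropolis-Hastings with random-walk proposals and flat priors.
theta = np.array([-9.0, 0.95, 0.2])              # starting values for (mu, phi, sigma)
ll = pf_loglik(*theta)
draws = []
for it in range(1000):
    prop = theta + np.array([0.05, 0.01, 0.02]) * rng.standard_normal(3)
    if abs(prop[1]) < 1 and prop[2] > 0:         # stay inside the admissible parameter space
        ll_prop = pf_loglik(*prop)
        if np.log(rng.random()) < ll_prop - ll:
            theta, ll = prop, ll_prop
    draws.append(theta.copy())
print("posterior means (mu, phi, sigma):", np.mean(draws, axis=0))
```

Histograms, posterior densities and cumulative averages of the kind reported in the paper can then be computed directly from the stored draws.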

Findings

With the aid of this approach, the authors were able to estimate all the random parameters of the stochastic volatility model, which permits prediction of future exchange rates. The authors have also simulated the histograms, the posterior densities and the cumulative averages of the model parameters. The predictive capacity of the stochastic volatility model for Morocco can facilitate the management of the exchange rate under a more flexible exchange regime and ensure better targeting of monetary and exchange policies.

Originality/value

To the best of the authors’ knowledge, the novelty of the paper lies in the production of a tool for predicting the evolution of the Moroccan exchange rate and in the design of a tool for the monetary authorities, who today take a proactive approach to exchange rate management. Cyclical policies such as monetary policy and exchange rate policy will introduce this type of modelling into the decision-making process to achieve better stabilization of the macroeconomic and financial framework.

Details

Journal of Modelling in Management, vol. 18 no. 5
Type: Research Article
ISSN: 1746-5664


Article
Publication date: 11 August 2023

Jianhui Liu, Ziyang Zhang, Longxiang Zhu, Jie Wang and Yingbao He

Due to the limitation of experimental conditions and budget, fatigue data of mechanical components are often scarce in practical engineering, which leads to low reliability of…

Abstract

Purpose

Due to the limitations of experimental conditions and budget, fatigue data for mechanical components are often scarce in practical engineering, which leads to low reliability of the fatigue data and reduces the accuracy of fatigue life prediction. Therefore, this study aims to expand the available fatigue data and verify its reliability, enabling life prediction analysis at different stress levels.

Design/methodology/approach

First, the principle of consistent fatigue life probability percentiles and a perturbation optimization technique are used to realize the equivalent conversion of small-sample fatigue life test data at different stress levels. Meanwhile, the failure model is checked with a goodness-of-fit test, and a Monte Carlo method based on the data distribution characteristics, together with a directional-sampling numerical simulation strategy, is used to extend the equivalent data. Furthermore, the relationship between effective stress and characteristic life is analyzed using a combination of the Weibull distribution and the Stromeyer equation, and an iterative sequence is established to obtain the predicted life.
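
A rough sketch of the data-extension step described above: fit a distribution to a small sample of (equivalent) fatigue lives, check the fit, and draw additional synthetic lives by Monte Carlo. The sample values, the two-parameter Weibull choice and the use of SciPy's fitter are illustrative assumptions; the paper's perturbation optimization, directional sampling and Stromeyer-based life prediction are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Illustrative small sample of equivalent fatigue lives (cycles) at one stress level.
lives = np.array([1.2e5, 1.8e5, 2.1e5, 2.6e5, 3.4e5, 4.0e5, 5.1e5])

# Goodness-of-fit check and parameter estimation for a two-parameter Weibull model.
shape, loc, scale = stats.weibull_min.fit(lives, floc=0.0)
ks_stat, p_value = stats.kstest(lives, "weibull_min", args=(shape, loc, scale))
print(f"Weibull shape={shape:.2f}, characteristic life={scale:.3g}, KS p-value={p_value:.2f}")

# Monte Carlo extension: draw a larger synthetic sample from the fitted distribution.
extended = stats.weibull_min.rvs(shape, loc, scale, size=1000, random_state=rng)
print("extended-sample median life:", np.median(extended))
```

The characteristic life obtained at each stress level would then feed an effective stress versus characteristic life relation of the Stromeyer type, as the abstract describes.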

Findings

The TC4–DT titanium alloy is selected to assess the accuracy and reliability of the proposed method, and the results show that the predicted life obtained with the proposed method lies within the double dispersion band, indicating high accuracy.

Originality/value

The purpose of this study is to provide a reference for the expansion of small sample fatigue test data, verification of data reliability and prediction of fatigue life data. In addition, the proposed method provides a theoretical basis for engineering applications.

Details

International Journal of Structural Integrity, vol. 14 no. 5
Type: Research Article
ISSN: 1757-9864


Article
Publication date: 10 May 2018

Shoudong Chen, Yan-lin Sun and Yang Liu

In the process of discussing the relationship between volume and price in the stock market, the purpose of this paper is to consider how to take the flow of foreign capital into…

Abstract

Purpose

In the process of discussing the relationship between volume and price in the stock market, the purpose of this paper is to consider how to take the flow of foreign capital into account and to determine whether the inclusion of volume information really contributes to the prediction of stock price volatility.

Design/methodology/approach

By comparing the relative advantages and disadvantages of the two mainstream non-parametric methods, and taking the characteristics of the volume time series into consideration, the stochastic volatility with volume (SV-VOL) model based on the APF-LW simulation method is ultimately adopted to explore and implement a more efficient estimation algorithm. Volume is incorporated into the model and quantified, which solves the problem of insufficient use of volume information in previous research and thereby extends the SV model.

Findings

Through the sequential Monte Carlo (SMC) algorithm, effective estimation of the SV-VOL model is realized in code. It is found that stock market volume information is helpful for predicting the volatility of the stock price. Exchange market volume information affects stock returns and the price-volume relationship indirectly, through net capital flows into the stock market. The current exchange rate devaluation and fluctuation are not conducive to the restoration and recovery of the stock market.

Research limitations/implications

Whether the inclusion of volume information really contributes to predicting the volatility of the stock price, and how to incorporate exchange market volume information, are still at an exploratory stage. This paper tries to determine the information weight of the exchange market volume according to the direct and indirect channels from the perspective of causality. The relevant practices and conclusions need to be tested and refined.

Practical implications

Previous studies have neglected the influence of the information contained in the exchange market volume on the volatility of stock prices. To a certain extent, this research makes a useful supplement to the existing research, especially in the aspects of research problems, research paradigms, research methods and research conclusions.

Originality/value

The SV model with volume information not only effectively solves the problem of inefficient use of the information contained in volume in traditional practice, but also further improves the estimation accuracy of the model by introducing exchange market volume information through weighted processing, which is a useful supplement to the existing literature. The SMC algorithm realized in code is helpful for the further advancement and development of non-parametric algorithms. This paper has also made a useful attempt to determine the weight of exchange market volume information, and some useful conclusions are drawn.

Details

China Finance Review International, vol. 8 no. 3
Type: Research Article
ISSN: 2044-1398


Article
Publication date: 19 June 2017

Janusz Marian Bedkowski and Timo Röhling

This paper aims to focus on real-world mobile systems, and thus proposes a relevant contribution to the special issue on “Real-world mobile robot systems”. This work on 3D laser…

Abstract

Purpose

This paper aims to focus on real-world mobile systems, and thus proposes a relevant contribution to the special issue on “Real-world mobile robot systems”. This work on 3D laser semantic mobile mapping and particle filter localization, dedicated to robots patrolling urban sites, is elaborated with a focus on the application of parallel computing to semantic mapping and particle filter localization. The real robotic application of patrolling urban sites is the goal; thus, it has been shown that the crucial robotic components have reached a high Technology Readiness Level (TRL).

Design/methodology/approach

Three different robotic platforms equipped with different 3D laser measurement systems were compared. Each system provides different data according to the measured distance, density of points and noise; thus, the influence of the data on the final semantic maps has been compared. The realistic problem is to use these semantic maps for robot localization; thus, the influence of the different maps on particle filter localization has been elaborated. A new approach has been proposed for particle filter localization based on 3D semantic information, and the behavior of the particle filter in different realistic conditions has been elaborated. The process of using the proposed robotic components for patrolling an urban site, such as the robot checking for geometrical changes in the environment, has been detailed.
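
For readers unfamiliar with the localization component, the sketch below shows the basic particle filter cycle (predict from odometry, weight against a map, resample) on a toy 2D grid whose cells carry semantic labels; the map, motion noise and weighting rule are illustrative assumptions, and the paper's GPU implementation and 3D semantic likelihood are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy semantic grid map: each cell holds a class label (0 = free, 1 = wall, 2 = vegetation).
semantic_map = rng.integers(0, 3, size=(50, 50))

N = 500                                           # particles: (x, y, heading)
particles = np.column_stack([rng.uniform(0, 50, N), rng.uniform(0, 50, N),
                             rng.uniform(-np.pi, np.pi, N)])
weights = np.full(N, 1.0 / N)

def predict(p, v, omega, dt=0.1):
    """Propagate particles with a noisy unicycle motion model (odometry step)."""
    noise = rng.normal(0.0, [0.05, 0.05, 0.02], size=(len(p), 3))
    p[:, 0] += (v * dt) * np.cos(p[:, 2]) + noise[:, 0]
    p[:, 1] += (v * dt) * np.sin(p[:, 2]) + noise[:, 1]
    p[:, 2] += omega * dt + noise[:, 2]
    return p

def update(p, w, observed_label):
    """Weight particles by how well the map label at their position matches the observation."""
    ix = np.clip(p[:, 0].astype(int), 0, 49)
    iy = np.clip(p[:, 1].astype(int), 0, 49)
    likelihood = np.where(semantic_map[iy, ix] == observed_label, 0.9, 0.1)
    w = w * likelihood
    return w / w.sum()

for step in range(20):                            # one patrol segment with simulated inputs
    particles = predict(particles, v=1.0, omega=0.05)
    weights = update(particles, weights, observed_label=rng.integers(0, 3))
    if 1.0 / np.sum(weights**2) < N / 2:          # resample when the effective sample size is low
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

print("position estimate (x, y):", np.average(particles[:, :2], axis=0, weights=weights))
```

Because each particle is predicted and weighted independently, the loop parallelizes naturally over particles, which is what the paper's GPU implementation exploits.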

Findings

The focus on real-world mobile systems requires a different point of view for scientific work. This study is focused on robust and reliable solutions that could be integrated with real applications. Thus, a new parallel computing approach for semantic mapping and particle filter localization has been proposed. Based on the literature, semantic 3D particle filter localization has not yet been elaborated; thus, innovative solutions for solving this issue have been proposed. The semantic mapping framework itself has already been published; for this reason, the authors claim that their applied studies during real-world trials with such a mapping system are added value relevant to this special issue.

Research limitations/implications

The main problem is the compromise between computing power and the energy consumed by heavy calculations; thus, the main focus is on using modern GPGPUs, specifically the NVIDIA Pascal parallel processor architecture. Recent advances in GPGPUs show great potential for mobile robotic applications, so this study focuses on increasing mapping and localization capabilities by improving the algorithms. The current limitation relates to the number of particles processed by a single processor, with 500 particles in real time being the achieved performance. The implication is that multi-GPU architectures can be used to increase the number of processed particles; thus, further studies are required.

Practical implications

The research focus is on real-world mobile systems; thus, the practical aspects of the work are crucial. The main practical application is semantic mapping, which could be used for many robotic applications. The authors claim that their particle filter localization is ready to be integrated with real robotic platforms using a modern 3D laser measurement system, and that their system can therefore improve existing autonomous robotic platforms. The proposed components can be used for detecting geometrical changes in the scene; thus, many practical functionalities can be applied, such as detection of cars, detection of opened/closed gates, etc. […] These functionalities are crucial elements of the safety and security domain.

Social implications

Improvement of the safety and security domain is a crucial aspect of modern society. Protecting critical infrastructure plays an important role; thus, introducing autonomous mobile platforms capable of supporting the human operators of safety and security systems could have a positive impact from many points of view.

Originality/value

This study elaborates a novel approach to particle filter localization based on 3D data and semantic mapping. This original work could have a great impact on the mobile robotics domain; the study claims that many algorithmic and implementation issues were solved under real-task experiments. The originality of this work stems from the use of modern advanced robotic systems, which form a relevant set of technologies for proper evaluation of the proposed approach. Such a combination of experimental hardware, original algorithms and implementation is definitely an added value.

Details

Industrial Robot: An International Journal, vol. 44 no. 4
Type: Research Article
ISSN: 0143-991X

