Search results
11–20 of over 13,000
Abstract
Purpose
The purpose of this paper is to define the relevant concepts and establish theorems for the second moment process in stochastic analysis on time scales.
Design/methodology/approach
The study of stochastic analysis currently covers two special cases, namely the discrete and the continuous case. In some applications, however, the underlying domain is a general time scale, so a new framework, stochastic analysis on time scales, is needed. By applying time scales theory to the second moment process, a conceptual foundation for stochastic analysis on time scales is established.
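To make the time-scales idea concrete, the following sketch computes the delta derivative on a finite time scale mixing a fine grid (continuous-like) with integer jumps. The representation of the time scale as a finite point set and the test function f(t) = t² are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def delta_derivative(T, f):
    """Delta derivative of f on a finite time scale T (sorted points).

    At a right-scattered point t, with forward jump sigma(t) > t and
    graininess mu(t) = sigma(t) - t, the delta derivative is
    (f(sigma(t)) - f(t)) / mu(t). This single formula covers both the
    discrete difference quotient and, as mu -> 0, the ordinary
    derivative. A sketch for illustration only.
    """
    T = np.asarray(T, dtype=float)
    sigma = np.append(T[1:], T[-1])   # forward jump operator
    mu = sigma - T                    # graininess
    fT = f(T)
    d = np.empty(len(T))
    d[:-1] = (fT[1:] - fT[:-1]) / mu[:-1]
    d[-1] = np.nan                    # undefined at the maximum of T
    return d

# mixed time scale: a fine grid (continuous-like) then integer steps
T = np.concatenate([np.linspace(0.0, 1.0, 101), [2.0, 3.0, 4.0]])
d = delta_derivative(T, lambda t: t ** 2)
print(round(d[-2], 1))  # at t = 3: (16 - 9) / 1 = 7.0
```

On the fine grid the delta derivative approximates f'(t) = 2t, while on the integer part it is exactly the forward difference quotient.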
Findings
The need for a theory of stochastic analysis on time scales is recognized.
Practical implications
This theory is expected to be very useful in related fields in the future.
Originality/value
Stochastic analysis on time scales expands the fields in which stochastic analysis can be applied and will be helpful to related fields.
Abstract
This paper investigates forecasting US Treasury bond and Dollar Eurocurrency rates using the stochastic unit root (STUR) model of Leybourne et al. (1996) and the stochastic cointegration (SC) model of Harris et al. (2002, 2006). Both models have time-varying parameter representations and are conceptually attractive for modelling interest rates, as both allow for conditional heteroscedasticity. I find that, for many of the series considered, STUR and SC models generate statistically significant gains in out-of-sample forecasting accuracy relative to simple orthodox models. The results highlight the usefulness of these extensions and raise some issues for future research.
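As an illustration of the model class, the following sketch simulates one common STUR parameterization, y_t = a_t·y_{t-1} + ε_t with a_t = exp(ω_t) and ω_t a zero-mean random walk, so the autoregressive root hovers stochastically around unity. The parameter values are illustrative, and this is a sketch rather than the exact Leybourne et al. (1996) specification.

```python
import numpy as np

def simulate_stur(n, sigma_eps=1.0, sigma_omega=0.01, seed=0):
    """Simulate a stochastic unit root (STUR) process.

    Sketch of one common parameterization: y_t = a_t * y_{t-1} + eps_t,
    with a_t = exp(omega_t) and omega_t a zero-mean random walk, so the
    root fluctuates randomly around 1 (an exact unit root when
    sigma_omega = 0).
    """
    rng = np.random.default_rng(seed)
    omega = np.cumsum(rng.normal(0.0, sigma_omega, size=n))
    eps = rng.normal(0.0, sigma_eps, size=n)
    y = np.empty(n)
    y[0] = eps[0]
    for t in range(1, n):
        y[t] = np.exp(omega[t]) * y[t - 1] + eps[t]
    return y

path = simulate_stur(500)
print(path.shape)  # (500,)
```

Setting `sigma_omega = 0` recovers an ordinary random walk, which is the fixed-unit-root benchmark the STUR model relaxes.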
Diep Duong and Norman R. Swanson
Abstract
The topic of volatility measurement and estimation is central to financial and more generally time-series econometrics. In this chapter, we begin by surveying models of volatility, both discrete and continuous, and then we summarize some selected empirical findings from the literature. In particular, in the first sections of this chapter, we discuss important developments in volatility models, with focus on time-varying and stochastic volatility as well as nonparametric volatility estimation. The models discussed share the common feature that volatilities are unobserved and belong to the class of missing variables. We then provide empirical evidence on “small” and “large” jumps from the perspective of their contribution to overall realized variation, using high-frequency price return data on 25 stocks in the DOW 30. Our “small” and “large” jump variations are constructed at three truncation levels, using extant methodology of Barndorff-Nielsen and Shephard (2006), Andersen, Bollerslev, and Diebold (2007), and Aït-Sahalia and Jacod (2009a, 2009b, 2009c). Evidence of jumps is found in around 22.8% of the days during the 1993–2000 period, much higher than the corresponding figure of 9.4% during the 2001–2008 period. Although the overall role of jumps is lessening, the role of large jumps has not decreased, and indeed, the relative role of large jumps, as a proportion of overall jumps, has actually increased in the 2000s.
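The decomposition underlying the jump evidence can be sketched as follows: realized variance (RV) picks up both diffusive volatility and jumps, while bipower variation (BV) is robust to jumps, so max(RV − BV, 0) estimates the jump contribution, in the spirit of Barndorff-Nielsen and Shephard. The simulated returns and the size of the injected jump below are illustrative assumptions, not the DOW 30 data or truncation methodology used in the chapter.

```python
import numpy as np

def jump_variation(returns):
    """Split realized variation into continuous and jump parts.

    RV = sum of squared returns; BV = (pi/2) * sum of products of
    adjacent absolute returns, which is robust to rare jumps; the
    jump component is estimated as max(RV - BV, 0).
    """
    r = np.asarray(returns, dtype=float)
    rv = np.sum(r ** 2)
    bv = (np.pi / 2.0) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
    jump = max(rv - bv, 0.0)
    return rv, bv, jump

rng = np.random.default_rng(1)
r = rng.normal(0.0, 0.01, 390)    # intraday returns, no jumps
r_j = r.copy()
r_j[200] += 0.2                   # inject one large jump (exaggerated)
rv0, bv0, j0 = jump_variation(r)
rv1, bv1, j1 = jump_variation(r_j)
print(f"jump share without/with jump: {j0:.5f} / {j1:.5f}")
```

The jump component barely registers on the pure-diffusion path but rises sharply once the single large jump is injected.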
Abstract
Introduction Operations research, i.e. the application of scientific methodology to operational problems in the search for improved understanding and control, can be said to have started with the application of mathematical tools to military problems of supply, bombing and strategy during the Second World War. Post‐war, these tools were applied to business problems, particularly production scheduling, inventory control and physical distribution, because of the acute shortages of goods and the numerical aspects of these problems.
Abstract
A new approach to living organisms with irreversibly perturbed homeostasis, based on the integrated moving average IMA (0,1,1) and autoregressive‐integrated moving average ARIMA (1,2,1) linear stochastic models of non‐stationary photon emission processes, is proposed. The approach consists of introducing a stochastic formulation of the transfer function and memory functional into a general description of non‐equilibrium states of perturbed organisms. A memory function‐based quantitative measure of perturbed biohomeostasis is also proposed and discussed.
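A minimal sketch of the IMA(0,1,1) building block mentioned above: the first difference of the process follows an MA(1), i.e. x_t − x_{t−1} = e_t + θ·e_{t−1}. The parameter values (θ = 0.5, unit noise variance) are illustrative, not values fitted to photon-emission data.

```python
import numpy as np

def simulate_ima_011(n, theta=0.5, sigma=1.0, seed=0):
    """Simulate an IMA(0,1,1) process by cumulating MA(1) increments.

    The increments x_t - x_{t-1} = e_t + theta * e_{t-1} are
    stationary, while x itself is non-stationary (integrated of
    order 1). Parameters here are illustrative only.
    """
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, sigma, size=n)
    diffs = e.copy()
    diffs[1:] += theta * e[:-1]
    return np.cumsum(diffs)

x = simulate_ima_011(1000)
d = np.diff(x)
# lag-1 autocorrelation of the differences should be near
# theta / (1 + theta**2) = 0.4 for theta = 0.5
rho1 = np.corrcoef(d[1:], d[:-1])[0, 1]
print(round(rho1, 2))
```

The sample lag-1 autocorrelation of the differenced series close to θ/(1 + θ²) is the standard diagnostic that the MA(1) structure of the increments has been recovered.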
Mircea Fratila, Rindra Ramarotafika, Abdelkader Benabou, Stéphane Clénet and Abdelmonaïm Tounzi
Abstract
Purpose
To take account of the uncertainties introduced in the magnetic properties during the manufacturing process, the present work aims to focus on the stochastic modelling of iron losses in electrical machine stators.
Design/methodology/approach
The investigated samples are composed of 28 slinky stators, coming from the same production chain. The stochastic modelling approach is first described. Thereafter, the Monte‐Carlo sampling method is used to calculate, in post‐processing, the iron loss density in a PMSM that is modelled by the finite element method.
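The Monte-Carlo propagation step can be sketched as follows. The two-term loss formula P = k_h·f·B² + k_e·(fB)² and all coefficient statistics below are hypothetical stand-ins chosen for illustration; they are not the paper's fitted stochastic model or its finite element post-processing.

```python
import numpy as np

def iron_loss_mc(n_samples=10_000, f=50.0, B=1.5, seed=0):
    """Monte-Carlo propagation of manufacturing variability into iron
    losses.

    Sketch only: a two-term (hysteresis + eddy-current) loss model
    P = k_h * f * B**2 + k_e * (f * B)**2, with both coefficients
    drawn from Gaussian distributions mimicking sample-to-sample
    variability across a production chain. All numbers are
    hypothetical.
    """
    rng = np.random.default_rng(seed)
    k_h = rng.normal(150.0, 15.0, n_samples)   # hysteresis coefficient
    k_e = rng.normal(0.8, 0.08, n_samples)     # eddy-current coefficient
    losses = k_h * f * B ** 2 + k_e * (f * B) ** 2
    return losses

p = iron_loss_mc()
print(f"mean = {p.mean():.0f}, std = {p.std():.0f} (arbitrary units)")
```

Because the loss is linear in the Gaussian coefficients here, the output distribution is itself Gaussian, matching the kind of distributional summary reported in the Findings.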
Findings
The interest of such an approach is demonstrated by calculating the main statistical characteristics of the loss variability; the losses are Gaussian distributed for both the A and Ω formulations.
Originality/value
The originality of the approach lies in the fact that the global influence of the manufacturing process (cutting, assembly, …) on the magnetic properties of the considered samples is taken into account when computing the iron losses.
Ramzi Lajili, Olivier Bareille, Mohamed Lamjed Bouazizi, Mohamed Ichchou and Noureddine Bouhaddi
Abstract
Purpose
This paper aims to propose numerical-based and experiment-based identification processes, accounting for uncertainties to identify structural parameters, in a wave propagation framework.
Design/methodology/approach
A variant of the inhomogeneous wave correlation (IWC) method is proposed. It consists in identifying propagation parameters, such as the wavenumber and the wave attenuation, from frequency response functions. The latter can be computed numerically or experimentally; the identification process is accordingly called numerical-based or experiment-based. The proposed variant of the IWC method is then combined with the Latin hypercube sampling method for uncertainty propagation, yielding stochastic processes that allow more realistic identification.
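The Latin hypercube sampling step can be sketched as follows. This is a generic implementation on the unit hypercube; the structural parameters actually sampled in the paper are not reproduced here.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample on [0, 1)^n_dims.

    Each axis is divided into n_samples equal strata; an independent
    random permutation per dimension guarantees every stratum is hit
    exactly once, giving better space coverage than plain Monte-Carlo
    for the same sample budget.
    """
    rng = np.random.default_rng(seed)
    # jitter within each stratum, then shuffle the strata per dimension
    u = rng.uniform(size=(n_samples, n_dims))
    perms = np.array([rng.permutation(n_samples)
                      for _ in range(n_dims)]).T
    return (perms + u) / n_samples

X = latin_hypercube(100, 3)
print(X.shape)  # (100, 3)
```

The stratification is what makes the subsequent uncertainty propagation cheaper than crude Monte-Carlo: far fewer model evaluations are needed for comparable coverage of the parameter space.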
Findings
The proposed variant of the IWC method accurately identifies the propagation parameters of isotropic and composite beams, whatever the type of identification process in which it is included: numerical-based or experiment-based. Its efficiency is demonstrated with respect to an analytical model and the McDaniel method, considered as references. Application of the stochastic identification processes shows good agreement between simulation-based and experiment-based results, and shows that all identified parameters except damping are affected by uncertainties.
Originality/value
The proposed variant of the IWC method is an accurate alternative for structural identification over wide frequency ranges. A numerical-based identification process can reduce the cost of experiments without significant loss of accuracy. Statistical investigation of the randomness of the identified parameters illustrates the robustness of the identification against uncertainties.
Robert J. Elliott, Tak Kuen Siu and Alex Badescu
Abstract
Purpose
The purpose of this paper is to consider a discrete‐time, Markov, regime‐switching, affine term‐structure model for valuing bonds and other interest rate securities. The proposed model incorporates the impact of structural changes in (macro)‐economic conditions on interest‐rate dynamics.
Design/methodology/approach
The market in the proposed model is, in general, incomplete. A modified version of the Esscher transform, namely, a double Esscher transform, is used to specify a price kernel so that both market and economic risks are taken into account.
Findings
The authors derive a simple way to obtain exponential affine forms of bond prices using backward induction. They also consider a continuous‐time extension of the model and derive exponential affine forms of bond prices using the concept of stochastic flows.
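The backward-induction idea can be sketched in a toy setting. The plain two-regime Markov short rate and the assumed risk-neutral measure below replace the paper's affine factor dynamics and double Esscher pricing kernel; all numbers are illustrative.

```python
import numpy as np

def bond_prices(rates, P, maturity):
    """Zero-coupon bond prices in a toy discrete-time regime-switching
    short-rate model, computed by backward induction.

    rates[s] : one-period short rate in regime s
    P[s, s'] : regime transition probabilities (assumed risk-neutral)
    Returns price[n, s] for maturities n = 0..maturity, where s is the
    current regime.
    """
    n_regimes = len(rates)
    price = np.zeros((maturity + 1, n_regimes))
    price[0] = 1.0                           # a maturing bond pays 1
    for n in range(1, maturity + 1):
        # discount one period, then average over the next regime
        price[n] = np.exp(-rates) * (P @ price[n - 1])
    return price

rates = np.array([0.02, 0.06])               # calm vs stressed regime
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])
prices = bond_prices(rates, P, 10)
print(prices[10])  # 10-period prices, one per current regime
```

Bond prices conditional on the low-rate regime exceed those conditional on the high-rate regime, and the recursion makes clear why regime-dependent prices retain an exponential affine structure maturity by maturity.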
Originality/value
The methods and results presented in the paper are new.