Search results
1 – 10 of 459
Abstract
How to describe, reliably and effectively, the quality of a continuous stream is a problem facing many companies in the chemical industry. The average quality estimated from an average of individual test sample values is most useful if its distribution is known or can be estimated. A simple extension of the Central Limit Theorem is given that provides a means of estimating this distribution which, when coupled with the variogram method of variance estimation of Saunders et al., enables the calculation of probabilities of closeness of the lot mean to the nominal value. The result “sits well” with the current emphasis placed on the importance of “conformance to target”.
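The closeness-probability calculation the abstract describes can be sketched as follows (a minimal illustration, not the authors' procedure; the variance of the lot mean is assumed to have been estimated already, e.g. by a variogram-based method, and the normal approximation comes from the CLT):

```python
from math import erf, sqrt

def prob_within(nominal, est_mean, var_of_mean, delta):
    """P(|lot mean - nominal| <= delta) under a normal approximation
    with the given estimated mean and variance of the lot mean."""
    se = sqrt(var_of_mean)
    phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF
    return (phi((nominal + delta - est_mean) / se)
            - phi((nominal - delta - est_mean) / se))

# Illustrative numbers: estimated lot mean 10.02, Var(mean) = 0.01, nominal 10.0
p = prob_within(10.0, 10.02, 0.01, 0.2)
```

With the mean on target and a tolerance of three standard errors, the probability is the familiar 99.73%.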
Abstract
By using the central limit theorem, it is possible to consider the Fourier transform of a stochastic process with independent increments as a Gaussian random variable whose mathematical expectation and variance are, respectively, integrals of the mean and the variance of this process. One can then use this result to analyse the statistical properties of linear feedback systems in the frequency domain, that is to say in terms of transfer functions. The advantage of this approach is that one can deal with linear systems subject to squared white noises in a very simple manner.
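The Gaussian-limit idea can be checked numerically (a sketch under assumed parameters, not the paper's derivation): the Fourier coefficient is a weighted sum of independent increments, so its sample average should approach the corresponding integral of the increment means.

```python
import cmath, random

def fourier_coeff(increments, omega, dt):
    """Riemann-sum approximation of the integral of exp(-i*omega*t) dX(t)."""
    return sum(cmath.exp(-1j * omega * k * dt) * dx
               for k, dx in enumerate(increments))

random.seed(0)
dt, n_steps, mu = 0.01, 1000, 2.0           # horizon T = n_steps * dt = 10
samples = []
for _ in range(200):
    # independent increments with mean mu*dt and variance dt
    incs = [mu * dt + random.gauss(0.0, dt ** 0.5) for _ in range(n_steps)]
    samples.append(fourier_coeff(incs, omega=0.0, dt=dt).real)

avg = sum(samples) / len(samples)           # approximates mu * T = 20 at omega = 0
```

At `omega = 0` the coefficient reduces to the total increment sum; the same Gaussian limit holds frequency by frequency.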
Abstract
Purpose
The purpose of this paper is to investigate the fatigue crack growth (FCG) under random loading using analytical methods.
Design/methodology/approach
For this purpose, two methods were used: the cycle-by-cycle technique and the central limit theorem (CLT). The Walker equation was used to account for the stress ratio effect on the FCG rate. To validate the results, three random loading groups with different loading levels and bandwidths were considered, and the results of the analysis, such as the mean lifetime of the specimen and the average crack length, were compared with the test results in terms of the number of loading cycles.
Findings
The comparison indicated good agreement between the results of the analysis and the test. Further, the reliability and probability-of-failure diagrams of the specimen were obtained for each loading group and compared with one another.
Originality/value
Applying the cycle-by-cycle and CLT methods to calculate the fatigue reliability of a CT specimen under random loading with the Walker equation, and comparing their results with each other, has not been reported in other research. In addition, the effect of the loading frequency bandwidth on lifetime was studied.
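A bare-bones sketch of the cycle-by-cycle technique with a Walker-type rate law (all material constants, the geometry factor and the loading statistics below are illustrative assumptions, not the paper's values):

```python
import math, random

def walker_rate(dK, R, C=1e-9, m=3.0, gamma=0.5):
    """da/dN = C * (dK / (1 - R)**(1 - gamma))**m  (Walker form of the rate law)."""
    return C * (dK / (1.0 - R) ** (1.0 - gamma)) ** m

def grow_crack(a0, a_crit, cycles, Y=1.12):
    """Advance the crack one load cycle at a time; return (cycles used, final length)."""
    a, n = a0, 0
    for s_max, s_min in cycles:
        dK = Y * (s_max - s_min) * math.sqrt(math.pi * a)  # MPa*sqrt(m)
        a += walker_rate(dK, R=s_min / s_max)
        n += 1
        if a >= a_crit:
            break
    return n, a

random.seed(1)
loading = []
for _ in range(30000):                       # one random loading history
    s_max = random.uniform(80.0, 120.0)      # peak stress, MPa
    loading.append((s_max, s_max * random.uniform(0.0, 0.3)))

n_fail, a_final = grow_crack(a0=0.001, a_crit=0.02, cycles=loading)
```

Repeating this over many random loading histories yields the lifetime distribution from which reliability curves can be drawn.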
Abstract
In this paper, we consider a two color multi-drawing urn model. At each discrete time step, we draw uniformly at random a sample of
Abstract
The invariance principle known as the functional central limit theorem is fundamental to understanding the convergence of a Markov chain to a diffusion process and has been applied to many areas of financial economics. The derivation of that principle from the properties of stochastic processes is involved (Billingsley, 1999). Following the approach of Gikhman and Skorokhod (2000), this study uses the properties of probability measures to simplify the proof of the principle. If the sequences of finite-dimensional distributions converge weakly, then they are tight. Using this property, the invariance principle was proved directly. As applications of this principle, I derived the statistic of the unit root process and the convergence of the Cox et al. (1979) binomial option pricing model to the continuous-time Black Scholes (1972) option pricing model.
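The binomial-to-Black-Scholes convergence mentioned at the end can be verified numerically; the following is a standard Cox-Ross-Rubinstein sketch with illustrative parameters, not code from the paper:

```python
from math import exp, sqrt, log, erf

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call."""
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

def crr_call(S, K, r, sigma, T, n):
    """Cox-Ross-Rubinstein binomial price with n steps (backward induction)."""
    dt = T / n
    u, d = exp(sigma * sqrt(dt)), exp(-sigma * sqrt(dt))
    p = (exp(r * dt) - d) / (u - d)          # risk-neutral up probability
    disc = exp(-r * dt)
    values = [max(S * u ** j * d ** (n - j) - K, 0.0) for j in range(n + 1)]
    for step in range(n, 0, -1):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(step)]
    return values[0]

bs = bs_call(100, 100, 0.05, 0.2, 1.0)
approx = crr_call(100, 100, 0.05, 0.2, 1.0, 500)
```

As the number of steps grows, the binomial price converges to the Black-Scholes value, which is precisely an instance of the invariance principle at work.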
Abstract
There are many phenomena in information science which, when quantified, yield variants of the hyperbolic frequency distribution and are characterized by long straggling tails. Conventional statistical sampling theory offers no general techniques for the confident analysis of finite discrete distributions of this kind. A sampling theorem, based on the binomial probability distribution, but otherwise distribution‐free, is derived and some of its applications are illustrated numerically. One application answers Kendall's question: How many additional periodicals would be found on doubling the period of publication within which periodicals contributing to a specified topic are enumerated? The effects of sampling truncated distributions are also briefly discussed.
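The flavour of Kendall's question can be shown with a toy Monte Carlo. This is not the paper's binomial sampling theorem; the Poisson contribution model and the long-tailed rate values are assumptions for illustration only:

```python
import math, random

def newly_found_on_doubling(rates, T, rng):
    """Count sources seen in a window of length T, and sources first seen
    only in a second, disjoint window of the same length (window doubled)."""
    def seen(rate, length):
        # Poisson process: P(at least one contribution) = 1 - exp(-rate*length)
        return rng.random() < 1.0 - math.exp(-rate * length)
    first = [seen(r, T) for r in rates]
    extra = sum(1 for r, s in zip(rates, first) if not s and seen(r, T))
    return sum(first), extra

rng = random.Random(3)
# long straggling tail: many rare contributors, a few prolific ones
rates = [0.05] * 900 + [1.0] * 90 + [5.0] * 10
found, extra = newly_found_on_doubling(rates, T=1.0, rng=rng)
```

Most of the newly found periodicals come from the straggling tail of rare contributors, which is why truncation effects matter in such enumerations.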
Abstract
The theory addressed in this paper is that we should not be surprised to come across fractals in the analysis of some dynamic systems involving human factors; moreover, in substance, the presence of fractals in human behaviour is acceptable. For the convenience of the reader, a brief background on some models of fractional Brownian motion found in the literature is given, and then the main features of the complex-valued model, via a random walk in the complex plane, recently introduced by the author are recalled. The practical meaning of the model is exhibited. The parallel of the central limit theorem here is Lévy's stability. If it is supposed that human decision-makers work via an observation process which combines the Heisenberg principle and a quantization principle in the measurement, then fractal dynamics appears to be quite in order. The relation with the theory of relative information is exhibited. The conjecture is then the following: could this model explain why fractals appear in finance, for instance?
T.A. Spedding and P.L. Rawlings
Abstract
Control charts and process capability calculations remain fundamental techniques for statistical process control. However, it has long been realized that the accuracy of these calculations can be significantly affected when sampling from a non‐Gaussian population. Many quality practitioners are conscious of these problems but are not aware of the effects such problems might have on the integrity of their results. Considers non‐normality with respect to the use of traditional control charts and process capability calculations, so that users may be aware of the errors involved when sampling from a non‐Gaussian population. Use is made of the Johnson system of distributions as a simulation technique to investigate the effects of non‐normality on control charts and process capability calculations. An alternative technique is suggested for process capability calculations which alleviates the problems of non‐normality while retaining computational efficiency.
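The effect is easy to demonstrate by simulation. The sketch below uses a lognormal population as a simple skewed stand-in (the lognormal is the Johnson S_L family, though this is not the paper's full Johnson-system machinery); sample sizes and seeds are arbitrary:

```python
import random, statistics

def xbar_false_alarm_rate(draw, n=5, subgroups=20000, seed=42):
    """Fraction of in-control subgroup means falling outside 3-sigma limits."""
    rng = random.Random(seed)
    means = [statistics.fmean(draw(rng) for _ in range(n))
             for _ in range(subgroups)]
    mu = statistics.fmean(means)
    sigma = statistics.stdev(means)
    lcl, ucl = mu - 3 * sigma, mu + 3 * sigma
    return sum(1 for m in means if m < lcl or m > ucl) / subgroups

normal_rate = xbar_false_alarm_rate(lambda r: r.gauss(0.0, 1.0))
skewed_rate = xbar_false_alarm_rate(lambda r: r.lognormvariate(0.0, 1.0))
```

For the Gaussian population the rate sits near the nominal 0.27%; for the skewed population it drifts away, which is exactly the integrity problem the abstract warns about.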
Abstract
Purpose
The main purpose of this paper is to present an optimal economic solution for a different adaptive sampling method that is highly intuitive in its nature: the normal sampling intervals (NSI) method.
Design/methodology/approach
Considering costs associated with sampling, false alarms and imperfect operation per unit of time, the paper presents a new optimal simple solution that minimizes the expected total cost per cycle. The NSI method involves the density function of the standard normal variable, assuming that the distribution of the sample averages is normal or approximately normal (on the basis of the central limit theorem). It depends on a single scale parameter, while other methods depend on several parameters.
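An adaptive rule of this flavour can be sketched as follows. This is a hypothetical reading, not the paper's exact NSI formula: the next sampling interval is taken proportional to the standard normal density at the current standardized sample mean, governed by a single scale parameter `k`, so the chart samples sooner when the statistic drifts from centre.

```python
from math import exp, pi, sqrt

def next_interval(z, k=8.0):
    """Hypothetical NSI-style rule: interval = k * phi(z) / phi(0),
    where phi is the standard normal density and k is the scale parameter."""
    phi = lambda x: exp(-0.5 * x * x) / sqrt(2.0 * pi)
    return k * phi(z) / phi(0.0)

on_target = next_interval(0.0)   # longest interval (= k) when on target
drifted = next_interval(2.0)     # much shorter when the mean is 2 sigma out
```

The single-parameter form is what makes such a rule easy to optimize economically against sampling and false-alarm costs.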
Findings
When the expected total cost of the new NSI method is compared with those of the fixed (FSI) and variable sampling intervals (VSI) methods in identical situations, it is generally lower (sometimes much lower), and it remains lower over a wider range of quality shifts. This feature is particularly important because, in practice, the magnitude of the shift that occurs is not known, so this greater robustness of performance is relevant.
Practical implications
In practice, minimizing total expected costs is an important consideration for companies with respect to quality and statistical process control (SPC). The paper holds that the NSI method has great potential, particularly given that industrial processes increasingly rely on automated systems for the collection and analysis of samples, so there are no special additional costs associated with labour, management or administration.
Originality/value
The great advantages of the NSI method are its highly intuitive nature and the fact that it generally achieves much better results than the FSI and VSI methods. An optimal economic solution for the NSI method is presented in this paper.
Abstract
Many investors have been disappointed with the practical results of portfolio insurance programs, which attempt to achieve option‐like results through dynamic hedging. This article takes the simplest form of dynamic hedging, constant proportion portfolio insurance (CPPI), as a model for developing an improved approach. The author uses Monte Carlo simulation to model the multi‐period median growth in discretionary wealth. In addition, he constructs leverage policies that mitigate the practical drawbacks of dynamic hedging. The article also shows that self‐imposed ex ante borrowing constraints (not the ex post constraint imposed by a margin call) can, under certain conditions, improve the performance of dynamic hedging with respect to median terminal wealth.
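A CPPI strategy with an ex ante leverage cap can be simulated in a few lines, in the spirit of the article (the multiplier, floor, lognormal return model and all parameter values below are illustrative assumptions, not the author's):

```python
import math, random, statistics

def cppi_terminal_wealth(rng, w0=100.0, floor=80.0, m=3.0, max_leverage=1.5,
                         mu=0.07, sigma=0.18, rf=0.02, steps=12, dt=1 / 12):
    """One simulated path: exposure = m * cushion, capped by a self-imposed
    ex ante borrowing limit; the remainder earns (or pays) the riskless rate."""
    w = w0
    for _ in range(steps):
        cushion = max(w - floor, 0.0)
        exposure = min(m * cushion, max_leverage * w)   # ex ante leverage cap
        risky_ret = math.exp((mu - 0.5 * sigma ** 2) * dt
                             + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)) - 1.0
        w += exposure * risky_ret + (w - exposure) * rf * dt
    return w

rng = random.Random(7)
paths = [cppi_terminal_wealth(rng) for _ in range(5000)]
median_w = statistics.median(paths)
```

Repeating the experiment while varying the multiplier and the leverage cap is the kind of comparison of median terminal wealth the article performs.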