Search results

1 – 10 of 459
Article
Publication date: 1 April 1989

Neil S. Barnett

Abstract

How to describe, reliably and effectively, the quality of a continuous stream is a problem facing many companies in the chemical industry. The average quality estimated from an average of individual test sample values is most useful if its distribution is known or can be estimated. A simple extension of the Central Limit Theorem is given that provides a means of estimating this distribution which, when coupled with the variogram method of variance estimation of Saunders et al., enables the calculation of probabilities of closeness of the lot mean to the nominal value. The result “sits well” with the current emphasis on the importance of “conformance to target”.
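
A minimal sketch of the closeness-probability calculation the abstract describes, assuming the lot-mean distribution is approximately normal by the CLT; the ordinary sample variance stands in for the variogram-based variance estimate of Saunders et al., and all numbers are illustrative:

```python
# Sketch: probability that a lot mean lies within a tolerance of the
# nominal target, assuming an approximately normal lot-mean distribution
# (CLT). The sample variance replaces the variogram-based estimate.
import numpy as np
from scipy.stats import norm

def prob_close_to_target(samples, target, tol):
    """P(|lot mean - target| <= tol) under a normal approximation."""
    samples = np.asarray(samples, dtype=float)
    se = samples.std(ddof=1) / np.sqrt(samples.size)  # std. error of the mean
    mean = samples.mean()
    return norm.cdf(target + tol, mean, se) - norm.cdf(target - tol, mean, se)

rng = np.random.default_rng(0)
tests = rng.normal(100.2, 1.5, size=50)          # illustrative test values
print(prob_close_to_target(tests, target=100.0, tol=0.5))
```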

Details

International Journal of Quality & Reliability Management, vol. 6 no. 4
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 March 1999

Guy Jumarie

Abstract

By using the central limit theorem, the Fourier transform of a stochastic process with independent increments can be considered a Gaussian random variable whose mathematical expectation and variance are, respectively, the integrals of the mean and the variance of the process. This result can then be used to analyze the statistical properties of linear feedback systems in the frequency domain, that is, in terms of transfer functions. The advantage of this approach is that linear systems subject to squared white noises can be handled in a very simple manner.
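
An illustrative check (not the paper's derivation) of the claim that the Fourier transform of a process with independent increments behaves like a Gaussian random variable: at a fixed frequency the transform is a sum of many independent terms, so the CLT applies even when the increments themselves are non-Gaussian. The frequency, step counts and the uniform increment law are assumptions:

```python
# Empirical check: the (real part of the) Fourier transform of a process
# with independent, non-Gaussian increments is approximately Gaussian.
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_trials, dt, omega = 1000, 5000, 0.01, 2.0
t = np.arange(n_steps) * dt
vals = np.empty(n_trials)
for i in range(n_trials):
    increments = rng.uniform(-1.0, 1.0, n_steps)   # independent, non-Gaussian
    vals[i] = np.sum(increments * np.cos(omega * t)) * dt  # Re of FT at omega
# Near-zero skewness and excess kurtosis indicate near-Gaussian behaviour.
z = (vals - vals.mean()) / vals.std()
print("skewness:", (z**3).mean(), "excess kurtosis:", (z**4).mean() - 3)
```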

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 18 no. 1
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 4 September 2019

Mohammad Salari

Abstract

Purpose

The purpose of this paper is to investigate fatigue crack growth (FCG) under random loading using analytical methods.

Design/methodology/approach

For this purpose, two methods were used: the cycle-by-cycle technique and the central limit theorem (CLT). The Walker equation was used to account for the effect of the stress ratio on the FCG rate. To validate the results across three random loading groups with different loading levels and bandwidths, the analytical results, such as the mean lifetime of the specimen and the average crack length, were compared with the test results in terms of the number of loading cycles.
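
A minimal sketch of the cycle-by-cycle idea under the Walker equation; the constants C, m and gamma, the geometry factor Y, and the random load history are illustrative assumptions, not the paper's calibrated values:

```python
# Cycle-by-cycle fatigue crack growth under random loading with the
# Walker equation. All constants and the load model are assumed.
import numpy as np

C, m, gamma = 5e-10, 3.0, 0.5       # Walker constants (assumed, MPa*sqrt(m) units)
Y = 1.12                             # geometry factor (assumed)
a, a_crit = 5e-3, 25e-3              # initial / critical crack length [m]

rng = np.random.default_rng(2)
cycles = 0
while a < a_crit and cycles < 10**7:
    s_max = rng.rayleigh(120.0)                       # random peak stress [MPa]
    s_min = s_max * rng.uniform(0.0, 0.5)             # random valley [MPa]
    R = s_min / s_max                                 # stress ratio
    dK = Y * (s_max - s_min) * np.sqrt(np.pi * a)     # stress intensity range
    # Walker equation: da/dN = C * (dK / (1 - R)**(1 - gamma))**m
    a += C * (dK / (1.0 - R) ** (1.0 - gamma)) ** m
    cycles += 1
print(f"estimated life: ~{cycles} cycles")
```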

Findings

The comparison indicated good agreement between the analytical and test results. Further, reliability and probability-of-failure diagrams were obtained for each loading group and compared with one another.

Originality/value

Applying the cycle-by-cycle and CLT methods to calculate the fatigue reliability of a CT specimen under random loading using the Walker equation, and comparing their results with each other, has not been reported in previous research. This study also examined the effect of the loading frequency bandwidth on lifetime.

Details

International Journal of Structural Integrity, vol. 11 no. 2
Type: Research Article
ISSN: 1757-9864

Open Access
Article
Publication date: 11 January 2019

Aguech Rafik and Selmi Olfa

Abstract

In this paper, we consider a two-color multi-drawing urn model. At each discrete time step, we draw uniformly at random a sample of m balls (m ≥ 1) and note their colors; they are then returned to the urn together with a random number of balls depending on the sample’s composition. The replacement rule is a 2 × 2 matrix depending on bounded discrete positive random variables. Using a stochastic approximation algorithm and martingale methods, we investigate the asymptotic behavior of the urn after many draws.
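
A toy simulation of a two-color multi-drawing urn; the Polya-type replacement rule below (one new ball of each color per ball of that color in the sample) is an illustrative special case, not the paper's general random 2 × 2 matrix:

```python
# Toy two-color multi-drawing urn: draw m balls without replacement,
# return them plus extra balls per a Polya-type rule, track proportions.
import numpy as np

rng = np.random.default_rng(3)
white, black, m = 5, 5, 3
for _ in range(20000):
    drawn_white = rng.hypergeometric(white, black, m)  # white balls in the sample
    white += drawn_white            # reinforcement: one new ball per ball drawn,
    black += m - drawn_white        # matching the sampled colors (assumed rule)
print("white proportion after many draws:", white / (white + black))
```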

Details

Arab Journal of Mathematical Sciences, vol. 26 no. 1/2
Type: Research Article
ISSN: 1319-5166

Open Access
Article
Publication date: 28 February 2014

Sang Il Han

Abstract

The invariance principle, known as the functional central limit theorem, is fundamental to understanding the convergence of a Markov chain to a diffusion process and has been applied to many areas of financial economics. Deriving that principle from the properties of stochastic processes is laborious (Billingsley, 1999). Following the approach of Gikhman and Skorokhod (2000), this study uses the properties of probability measures to simplify the proof of the principle. If sequences of finite-dimensional distributions converge weakly, then they are tight. Using this property, the invariance principle is proved directly. As applications of this principle, I derive the statistic of the unit root process and the convergence of the Cox et al. (1979) binomial option pricing model to the continuous-time Black-Scholes (1972) option pricing model.
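
A sketch of the convergence the abstract mentions: the Cox-Ross-Rubinstein binomial call price approaching the Black-Scholes price as the number of steps grows. The contract parameters are illustrative:

```python
# Cox-Ross-Rubinstein binomial call price converging to Black-Scholes.
import math

def bs_call(S, K, r, sigma, T):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def crr_call(S, K, r, sigma, T, n):
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt)); d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)        # risk-neutral up probability
    logp, logq = math.log(p), math.log(1.0 - p)
    total = 0.0
    for j in range(n + 1):                       # binomial terminal distribution
        logw = (math.lgamma(n + 1) - math.lgamma(j + 1) - math.lgamma(n - j + 1)
                + j * logp + (n - j) * logq)
        total += math.exp(logw) * max(S * u**j * d**(n - j) - K, 0.0)
    return math.exp(-r * T) * total

S, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
for n in (10, 100, 1000):
    print(n, round(crr_call(S, K, r, sigma, T, n), 4), "vs BS",
          round(bs_call(S, K, r, sigma, T), 4))
```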

Details

Journal of Derivatives and Quantitative Studies, vol. 22 no. 1
Type: Research Article
ISSN: 2713-6647

Article
Publication date: 1 January 1975

B.C. BROOKES

Abstract

There are many phenomena in information science which, when quantified, yield variants of the hyperbolic frequency distribution and are characterized by long straggling tails. Conventional statistical sampling theory offers no general techniques for the confident analysis of finite discrete distributions of this kind. A sampling theorem, based on the binomial probability distribution, but otherwise distribution‐free, is derived and some of its applications are illustrated numerically. One application answers Kendall's question: How many additional periodicals would be found on doubling the period of publication within which periodicals contributing to a specified topic are enumerated? The effects of sampling truncated distributions are also briefly discussed.
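
A simulation sketch of Kendall's question (not the paper's binomial sampling theorem): how many additional periodicals appear when the enumeration period doubles, with periodical productivities following an assumed hyperbolic (Zipf-like) law:

```python
# How many additional periodicals appear when the period doubles?
# Productivities follow an assumed hyperbolic (Zipf-like) law.
import numpy as np

rng = np.random.default_rng(4)
rates = 5.0 / np.arange(1, 5001)           # contribution rate per periodical
counts_T = rng.poisson(rates)              # articles contributed in period T
counts_2T = counts_T + rng.poisson(rates)  # articles contributed in period 2T
seen_T, seen_2T = np.count_nonzero(counts_T), np.count_nonzero(counts_2T)
print(f"seen in T: {seen_T}; seen in 2T: {seen_2T}; additional: {seen_2T - seen_T}")
```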

Details

Journal of Documentation, vol. 31 no. 1
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 1 June 1999

Guy Jumarie

Abstract

The thesis addressed in this paper is that we should not be surprised to come across fractals in the analysis of some dynamic systems involving human factors; moreover, in substance, fractals in human behaviour are acceptable. For the convenience of the reader, a preliminary background to some models of fractional Brownian motion found in the literature is given, and the main features of the complex-valued model, via a random walk in the complex plane, recently introduced by the author are then recalled. The practical meaning of the model is exhibited. The parallel of the central limit theorem here is Lévy stability. If it is supposed that human decision-makers work via an observation process which combines the Heisenberg principle with a quantization principle in the measurement, then fractal dynamics appears to be quite in order. The relation with the theory of relative information is exhibited. The conjecture is then the following: could this model explain why fractals appear in finance, for instance?
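
A hedged sketch of one ingredient the abstract surveys, fractional Brownian motion: paths are sampled exactly via Cholesky factorisation of the fBm covariance; the paper's complex-plane random-walk model is not reproduced here:

```python
# Exact sampling of fractional Brownian motion B_H(t) via Cholesky
# factorisation of its covariance 0.5 * (s^2H + t^2H - |t - s|^2H).
import numpy as np

def fbm_path(n, H, T=1.0, seed=5):
    rng = np.random.default_rng(seed)
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s**(2 * H) + u**(2 * H) - np.abs(s - u)**(2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # jitter for stability
    return t, L @ rng.standard_normal(n)

t, path = fbm_path(500, H=0.7)   # H > 0.5: persistent (long-memory) path
print(path[:5])
```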

Details

Kybernetes, vol. 28 no. 4
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 1 August 1994

T.A. Spedding and P.L. Rawlings

Abstract

Control charts and process capability calculations remain fundamental techniques for statistical process control. However, it has long been realized that the accuracy of these calculations can be significantly affected when sampling from a non‐Gaussian population. Many quality practitioners are conscious of these problems but are not aware of the effects such problems might have on the integrity of their results. Considers non‐normality with respect to the use of traditional control charts and process capability calculations, so that users may be aware of the errors involved when sampling from a non‐Gaussian population. Use is made of the Johnson system of distributions as a simulation technique to investigate the effects of non‐normality on control charts and process capability calculations. An alternative technique is suggested for process capability calculations which alleviates the problems of non‐normality while retaining computational efficiency.
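
A minimal sketch of the kind of simulation study the abstract describes: the false-alarm rate of a 3-sigma X-bar chart when the population is a skewed member of the Johnson system, compared with the nominal 0.27%. The Johnson SU shape parameters are illustrative:

```python
# False-alarm rate of a 3-sigma X-bar chart under a skewed Johnson SU
# population vs the nominal 0.27%. Shape parameters are illustrative.
import numpy as np
from scipy.stats import johnsonsu

rng = np.random.default_rng(6)
pop = johnsonsu(a=-2.0, b=1.5)                    # skewed Johnson SU member
n, n_subgroups = 5, 200_000
xbar = pop.rvs((n_subgroups, n), random_state=rng).mean(axis=1)
mu, sigma = pop.mean(), pop.std()
lcl, ucl = mu - 3 * sigma / np.sqrt(n), mu + 3 * sigma / np.sqrt(n)
alarm = np.mean((xbar < lcl) | (xbar > ucl))
print(f"observed false-alarm rate: {alarm:.4f} (nominal 0.0027)")
```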

Details

International Journal of Quality & Reliability Management, vol. 11 no. 6
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 9 January 2009

J. Rodrigues Dias

Abstract

Purpose

The main purpose of this paper is to present an optimal economic solution for a different adaptive sampling method that is highly intuitive in nature: the normal sampling intervals (NSI) method.

Design/methodology/approach

Considering the costs associated with sampling, false alarms and imperfect operation per unit of time, the paper presents a new, simple optimal solution that minimizes the expected total cost per cycle. The NSI method involves the density function of the standard normal variable, assuming that the distribution of sample averages is normal or approximately normal (on the basis of the central limit theorem). It depends on a single scale parameter, whereas other methods depend on several parameters.
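
A heavily hedged sketch of an adaptive-interval rule in the spirit of the NSI method: the next sampling interval is scaled by the standard normal density at the current standardized sample mean, so sampling tightens as the mean drifts from target. The single scale parameter k and the exact functional form are assumptions, not the paper's formula:

```python
# Assumed NSI-style rule: the next sampling interval shrinks with the
# standard normal density at the standardized sample mean z.
import math

def next_interval(z, k=8.0):
    """Interval proportional to phi(z), normalised so z = 0 gives k."""
    return k * math.exp(-0.5 * z * z)   # = k * phi(z) / phi(0)

for z in (0.0, 1.0, 2.0, 3.0):
    print(f"z = {z:.1f} -> next interval = {next_interval(z):.2f}")
```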

Findings

When the expected total cost of the new NSI method is compared with that of the fixed (FSI) and variable sampling intervals (VSI) methods in identical situations, it is generally lower (and may be much lower), and it remains lower across a wider range of quality changes. This feature is particularly important because, in practice, the degree of change that occurs is not known, so this greater robustness in terms of performance is relevant.

Practical implications

In practice, minimizing total expected costs is an important consideration for companies concerned with quality and statistical process control (SPC). The paper holds that the NSI method has great potential, particularly since industrial processes increasingly rely on automated systems for the collection and analysis of samples, so there are no special additional costs associated with labour, management or administration.

Originality/value

The great advantages of the NSI method are its highly intuitive nature and the generally much better results it achieves compared with the FSI and VSI methods. An optimal economic solution for the NSI method is presented in this paper.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 1
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 March 2001

JARROD WILCOX

Abstract

Many investors have been disappointed with the practical results of portfolio insurance programs, which attempt to achieve option‐like results through dynamic hedging. This article takes the simplest form of dynamic hedging, constant proportion portfolio insurance (CPPI), as a model for developing an improved approach. The author uses Monte Carlo simulation to model the multi‐period median growth in discretionary wealth. In addition, he constructs leverage policies that mitigate the practical drawbacks of dynamic hedging. The article also shows that self‐imposed ex ante borrowing constraints (not the ex post constraint imposed by a margin call) can, under certain conditions, improve the performance of dynamic hedging with respect to median terminal wealth.
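
A minimal CPPI sketch with Monte Carlo median terminal wealth; the multiplier, floor and market parameters are illustrative, and the paper's leverage policies are not reproduced. The exposure cap at current wealth plays the role of a self-imposed ex ante no-borrowing constraint:

```python
# CPPI with Monte Carlo median terminal wealth. Exposure is capped at
# current wealth: a self-imposed ex ante no-borrowing constraint.
import numpy as np

rng = np.random.default_rng(7)
n_paths, n_steps = 10_000, 252
dt, mu, sigma, rf = 1.0 / 252, 0.07, 0.18, 0.02
multiplier, floor = 4.0, 0.85            # protect 85% of initial wealth

wealth = np.ones(n_paths)
for _ in range(n_steps):
    cushion = np.maximum(wealth - floor, 0.0)
    exposure = np.minimum(multiplier * cushion, wealth)     # no leverage
    r_risky = rng.normal(mu * dt, sigma * np.sqrt(dt), n_paths)
    wealth = wealth + exposure * r_risky + (wealth - exposure) * rf * dt
print("median terminal wealth:", np.median(wealth))
```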

Details

The Journal of Risk Finance, vol. 2 no. 4
Type: Research Article
ISSN: 1526-5943
