Search results

1 – 10 of over 3000
Article

Muhammad Aslam, Abdur Razzaque Mughal and Munir Ahmad

Abstract

Purpose

The purpose of this paper is to propose group acceptance sampling plans for the case where the lifetime of the submitted product follows the Pareto distribution.

Design/methodology/approach

The single‐point approach (only the consumer's risk) is used to find the plan parameters of the proposed plan for specified values of the consumer's risk, producer's risk, acceptance number, number of testers and experiment time.

Findings

Tables are constructed using the Poisson and the weighted Poisson distributions. Extensive tables are provided for practical use.

Research limitations/implications

The tables in this paper can be used only when the lifetime of a product follows the Pareto distribution of the second kind.

Practical implications

The results can be used to test products while saving experiment cost and time. The weighted Poisson distribution yields a smaller group size (sample size) than the plans in the literature.

Social implications

By implementing the proposed plan, the experiment cost can be minimized.

Originality/value

The novelty of this paper is that the Poisson and weighted Poisson distributions, rather than the binomial distribution, are used to find the plan parameters of the proposed plan when the lifetime of the submitted product follows the Pareto distribution of the second kind.
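The plan-parameter search described in this abstract rests on a lot acceptance probability. For orientation only, here is a minimal sketch under the plain Poisson model (the paper's weighted Poisson tables and Pareto lifetime assumptions are not reproduced, and the function name is illustrative):

```python
from math import exp, factorial

def poisson_oc(n: int, c: int, p: float) -> float:
    """Lot acceptance probability P(defectives <= c) under the
    Poisson approximation with mean n * p."""
    lam = n * p
    return sum(exp(-lam) * lam**d / factorial(d) for d in range(c + 1))

# Example: 50 items on test, acceptance number c = 2, true fraction
# defective 1% -> the lot is accepted with probability ~0.986.
pa = poisson_oc(n=50, c=2, p=0.01)
```

A plan search then scans the sample size (or number of groups) until this probability satisfies the specified consumer's risk at the quality level of interest.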

Details

International Journal of Quality & Reliability Management, vol. 28 no. 8
Type: Research Article
ISSN: 0265-671X

Article

Stephen J. Bensman

Abstract

Purpose

The purpose of this article is to analyze the historical significance of Donald J. Urquhart, who established the National Lending Library for Science and Technology (NLL) that later was merged into the British Library Lending Division (BLLD), now called the British Library Document Supply Centre (BLDSC).

Design/methodology/approach

The paper presents a short history of the probabilistic revolution, particularly as it developed in the UK in the form of biometric statistics due to Darwin's theory of evolution. It focuses on the overthrow of the normal paradigm, according to which frequency distributions in nature and society conform to the normal law of error. The paper discusses the importance of the Poisson distribution and its utilization in the construction of stochastic models that better describe reality. Here the focus is on the compound Poisson distribution in the form of the negative binomial distribution (NBD). The paper then shows how Urquhart extended the probabilistic revolution to librarianship by using the Poisson as the probabilistic model in his analyses of the 1956 external loans made by the Science Museum Library (SML) as well as in his management of the scientific and technical (sci/tech) journal collection of the NLL. Thanks to this, Urquhart can be considered as playing a pivotal role in the creation of bibliometrics or the statistical bases of modern library and information science. The paper relates how Urquhart's son and daughter‐in‐law, John A. and Norma C. Urquhart, completed Urquhart's probabilistic breakthrough by advancing for the first time the NBD as the model for library use in a study executed at the University of Newcastle upon Tyne, connecting bibliometrics with biometrics. It concludes with a discussion of Urquhart's Law and its probabilistic implications for the use of sci/tech journals in a library system.

Findings

By being the first librarian to apply probability to the analysis of sci/tech journal use, Urquhart was instrumental in the creation of modern library and information science. His findings force a probabilistic re‐conceptualization of sci/tech journal use in a library system that has great implications for the transition of sci/tech journals from locally held paper copies to shared electronic databases.

Originality/value

Urquhart's significance is considered from the perspective of the development of science as a whole as well as library and information science in particular.

Details

Interlending & Document Supply, vol. 35 no. 2
Type: Research Article
ISSN: 0264-1615

Article

Kandasamy Subramani and Venugopal Haridoss

Abstract

Purpose

The purpose of this paper is to present the single sampling attribute plan for given acceptance quality level (AQL) and limiting quality level (LQL) involving minimum sum of risks using weighted Poisson distribution.

Design/methodology/approach

For the given AQL and LQL, the sum of the producer's and consumer's risks has been obtained. Based on the weighted Poisson distribution, this sum has been derived along with the acceptance number and the rejection number. Also, the operating characteristic function for the single sampling attribute plan, using the weighted Poisson distribution, has been derived.

Findings

In the final inspection, the producer and the consumer represent the same party, so the sum of these two risks should be minimized. In this paper, the sum of risks has been tabulated using the weighted Poisson distribution for different operating ratios. These tabulated values are smaller than the corresponding sums of risks derived using the Poisson distribution.
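The minimum-risk-sum criterion in these Findings can be sketched under the ordinary Poisson model (the paper's weighted Poisson tables are not reproduced here; names and parameter values are illustrative):

```python
from math import exp, factorial

def accept_prob(n: int, c: int, p: float) -> float:
    """P(accept) = P(defectives <= c) under a Poisson model with mean n * p."""
    lam = n * p
    return sum(exp(-lam) * lam**d / factorial(d) for d in range(c + 1))

def risk_sum(n: int, c: int, aql: float, lql: float) -> float:
    producer = 1.0 - accept_prob(n, c, aql)  # alpha: good lots rejected
    consumer = accept_prob(n, c, lql)        # beta: bad lots accepted
    return producer + consumer

# For a fixed sample size, choose the acceptance number that
# minimises the combined risk alpha + beta.
best_c = min(range(10), key=lambda c: risk_sum(100, c, aql=0.01, lql=0.05))
```

Tabulating this minimum over different operating ratios LQL/AQL is what the paper does, with the weighted Poisson in place of the plain Poisson pmf above.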

Originality/value

The sampling plan presented in this paper is particularly useful for testing the quality of finished products in shop floor situations.

Details

International Journal of Quality & Reliability Management, vol. 30 no. 1
Type: Research Article
ISSN: 0265-671X

Article

Younès Mouatassim

Abstract

Purpose

The purpose of this paper is to introduce the zero‐modified distributions in the calculation of operational value‐at‐risk.

Design/methodology/approach

These distributions are preferred when an excess of zeros is observed. In operational risk, this phenomenon may be due to the scarcity of data, the existence of extreme values and/or the threshold from which banks start to collect losses. In this article, the paper focuses on the analysis of damage to physical assets.

Findings

The results show that the basic Poisson distribution underestimates the dispersion and thus leads to an underestimation of the capital charge, whereas zero‐modified Poisson distributions model the frequency well. In addition, the basic negative binomial and its related zero‐modified distributions, in their turn, offer a good prediction of count events. To choose the distribution that best fits the frequency, the paper uses Vuong's test. Its results indicate that the zero‐modified Poisson distributions, the basic negative binomial and its related zero‐modified distributions are equivalent. This conclusion is confirmed by the calculated capital charge, since the differences between the six aggregations are not significant except for that of the basic Poisson distribution.
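A zero-modified count model of the kind discussed here mixes extra probability mass at zero with an ordinary Poisson body. A minimal sketch (the paper's exact parametrisation and fitted values are not reproduced; names are illustrative):

```python
from math import exp, factorial

def zmp_pmf(x: int, lam: float, pi0: float) -> float:
    """Zero-modified Poisson: extra mass pi0 at zero on top of a
    Poisson(lam) body (pi0 > 0 inflates zeros; pi0 = 0 is plain Poisson)."""
    base = exp(-lam) * lam**x / factorial(x)
    return pi0 + (1 - pi0) * base if x == 0 else (1 - pi0) * base

# The pmf still sums to one, but P(X = 0) is inflated relative to the
# plain Poisson with the same lam.
total = sum(zmp_pmf(x, lam=2.0, pi0=0.3) for x in range(100))
```

The inflated zero mass is exactly what accommodates the collection threshold and data scarcity mentioned in the abstract.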

Originality/value

Recently, zero‐modified formulations have become widely used in many fields because of the low frequency of the events concerned. This article aims to describe the frequency of operational risk using zero‐modified distributions.

Article

Venugopal Haridoss and Kandasamy Subramani

Abstract

Purpose

The purpose of this paper is to present the optimal double sampling attribute plan using the weighted Poisson distribution.

Design/methodology/approach

For the given AQL and LQL, the sum of the producer's and consumer's risks has been obtained. Based on the weighted Poisson distribution, this sum has been optimized.

Findings

In the final inspection, the producer and the consumer represent the same party, so the sum of these two risks should be minimized. In this paper, the sum of risks has been tabulated using the weighted Poisson distribution for different operating ratios. These tabulated values are smaller than the corresponding sums of risks derived using the Poisson distribution.

Originality/value

The sampling plan presented in this paper is particularly useful for testing the quality of finished products in shop floor situations.

Details

International Journal of Quality & Reliability Management, vol. 33 no. 1
Type: Research Article
ISSN: 0265-671X

Article

Richard L. Henshel

Abstract

Briefly reviews the standard Poisson distribution and then examines a set of derivative, modified Poisson distributions for testing hypotheses derived from positive deviation‐amplifying feedback models, which do not lend themselves to ordinary statistically based hypothesis testing. The “reinforcement” or “contagious” Poisson offers promise for a subset of such models, in particular those models with data in the form of rates (rather than magnitudes). The practical difficulty lies in distinguishing reinforcement effects from initial heterogeneity, since both can form negative binomial distributions, with look‐alike data. Illustrates these difficulties, and also opportunities, for various feedback models employing the self‐fulfilling prophecy, and especially for confidence loops, which incorporate particular self‐fulfilling prophecies as part of a larger dynamic process. Describes an actual methodology for testing hypotheses regarding confidence loops with the aid of a “reinforcement” Poisson distribution, as well as its place within sociocybernetics.
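The identifiability problem noted in this abstract — heterogeneity and reinforcement both producing a negative binomial — can be seen from the classical construction in which a Poisson rate varies across units according to a gamma distribution. A numerical sketch of that mixture identity (parametrisation and names are illustrative):

```python
from math import exp, factorial, gamma, comb

def gamma_pdf(lam: float, shape: float, rate: float) -> float:
    return rate**shape * lam ** (shape - 1) * exp(-rate * lam) / gamma(shape)

def mixed_poisson_pmf(x: int, shape: float, rate: float,
                      grid: int = 20000, upper: float = 60.0) -> float:
    """P(X = x) when X | lam ~ Poisson(lam) and lam ~ Gamma(shape, rate),
    evaluated by simple numerical integration over lam."""
    h = upper / grid
    total = 0.0
    for i in range(1, grid):
        lam = i * h
        total += exp(-lam) * lam**x / factorial(x) * gamma_pdf(lam, shape, rate)
    return total * h

def nb_pmf(x: int, r: int, p: float) -> float:
    """The same distribution in closed form: negative binomial."""
    return comb(x + r - 1, x) * (1 - p) ** r * p**x

# With rate = (1 - p) / p the gamma mixture collapses to NB(r, p):
r, p = 3, 0.4
rate = (1 - p) / p
```

Because a contagious ("reinforcement") Poisson process yields the same negative binomial family, the fitted distribution alone cannot tell the two mechanisms apart, which is exactly the difficulty the article discusses.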

Book part

Virginia M. Miori

Abstract

The challenge of truckload routing is increased in complexity by the introduction of stochastic demand. Typically, this demand is assumed to follow a Poisson distribution. In this chapter, we cluster the demand data using data mining techniques to establish a more suitable distribution for predicting demand. We then examine this stochastic truckload demand using an econometric discrete choice model known as a count data model. Using actual truckload demand data and data from the Bureau of Transportation Statistics, we perform count data regressions. Two outcomes are produced from every regression run: the predicted demand between every origin and destination, and the likelihood that this demand will occur. Together they allow us to generate an expected value forecast of truckload demand as input to a truckload routing formulation. The negative binomial distribution produces an improved forecast over the Poisson distribution.
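The reported advantage of the negative binomial over the Poisson comes from overdispersion: its variance exceeds its mean, while a Poisson forces the two to be equal. A small sketch (parametrisation and values are illustrative, not the chapter's fitted model):

```python
from math import comb

def nb_pmf(x: int, r: int, p: float) -> float:
    """Negative binomial pmf, counting x 'failures' with r, p parameters;
    mean = r*p/(1-p), variance = r*p/(1-p)**2."""
    return comb(x + r - 1, x) * (1 - p) ** r * p**x

r, p = 4, 0.4
mean = r * p / (1 - p)        # ~2.67 expected loads per lane
var = r * p / (1 - p) ** 2    # ~4.44 -- exceeds the mean,
# whereas a Poisson with the same mean would force variance == mean.
```

When lane-level demand counts are more variable than a Poisson allows, the extra variance parameter is what lets the count data regression fit — and hence forecast — better.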

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-84855-548-8

Article

Nataliya Chukhrova and Arne Johannssen

Abstract

Purpose

In acceptance sampling, the hypergeometric operating characteristic (OC) function (the so-called type-A OC) is usually approximated by the binomial or Poisson OC function, which reduces computational effort but does not provide sufficient approximation accuracy. The purpose of this paper is to examine binomial- and Poisson-type approximations to the hypergeometric distribution, in order to find a simple but accurate approximation that can be successfully applied in acceptance sampling.

Design/methodology/approach

The authors present a new binomial-type approximation for the type-A OC function, and derive its properties. Further, the authors compare this approximation via an extensive numerical study with other common approximations in terms of variation distance and relative efficiency under various conditions on the parameters including limiting cases.

Findings

The introduced approximation generates the best numerical results over a wide range of parameter values, retains the arithmetic simplicity of the binomial distribution, and achieves the accuracy required for acceptance sampling problems. Additionally, it can considerably reduce the computational effort relative to the type-A OC function and is therefore strongly recommended for calculating sampling plans.

Originality/value

The newly presented approximation provides a remarkably close fit to the type-A OC function, is discrete and needs no continuity correction, and is skewed in the same direction by roughly the same amount as the exact OC. Because it contains fewer factorials, this OC in general involves lower powers than the type-A OC function. Moreover, the binomial-type approximation is easy to implement in conventional statistical computing packages.
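For context, the classical binomial approximation that the paper improves upon replaces sampling without replacement by sampling with replacement at p = D/N. A sketch of the exact type-A OC against that classical approximation (the paper's new approximation itself is not reproduced; names are illustrative):

```python
from math import comb

def hyper_oc(N: int, D: int, n: int, c: int) -> float:
    """Exact type-A OC: P(at most c defectives in a sample of n drawn
    without replacement from a lot of N items containing D defectives)."""
    return sum(comb(D, d) * comb(N - D, n - d) for d in range(c + 1)) / comb(N, n)

def binom_oc(N: int, D: int, n: int, c: int) -> float:
    """Classical binomial approximation with p = D / N."""
    p = D / N
    return sum(comb(n, d) * p**d * (1 - p) ** (n - d) for d in range(c + 1))

# For a small sampling fraction n/N the two OC values nearly coincide.
exact = hyper_oc(N=2000, D=40, n=50, c=2)
approx = binom_oc(N=2000, D=40, n=50, c=2)
```

The gap between the two widens as the sampling fraction n/N grows, which is the regime where a better binomial-type approximation pays off.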

Details

International Journal of Quality & Reliability Management, vol. 36 no. 4
Type: Research Article
ISSN: 0265-671X

Article

Thankappan Vasanthi and Ganapathy Arulmozhi

Abstract

Purpose

The purpose of this paper is to use Bayesian probability theory to analyze a software reliability model with multiple types of faults. The probability that all faults are detected and corrected after a series of independent software test-and-correction cycles is presented. This in turn has a number of applications, such as deciding how long to test software and estimating the cost of testing.

Design/methodology/approach

The use of Bayesian probabilistic models, when compared to traditional point forecast estimation models, provides tools for risk estimation and allows decision makers to combine historical data with subjective expert estimates. Probability evaluation is done both prior to and after observing the number of faults detected in each cycle. The conditions under which these two measures, the conditional and unconditional probabilities, coincide are also shown. Expressions are derived to evaluate the probability that, after a series of sequential independent reviews has been completed, no class of fault remains in the software system, assuming a Poisson or binomial prior distribution.

Findings

From the results in Sections 4 and 5 it can be observed that the conditional and unconditional probabilities are the same if the prior distribution is Poisson or binomial. In these cases the confidence that all faults have been removed is not a function of the number of faults observed during the successive reviews, but only of the number of reviews, the detection probabilities and the mean of the prior distribution. This is a remarkable result because it gives a circumstance in which the statistical confidence from a Bayesian analysis is actually independent of all observed data. From the result in Section 4 it can be seen that an exponential formula can be used to evaluate the probability that no fault remains when a Poisson prior distribution is combined with a multinomial detection process in each review cycle.

Originality/value

The paper is part of research work for a PhD degree.

Details

International Journal of Quality & Reliability Management, vol. 30 no. 1
Type: Research Article
ISSN: 0265-671X

Details

Understanding Financial Risk Management, Second Edition
Type: Book
ISBN: 978-1-78973-794-3
