Search results

1 – 10 of over 1000
Article
Publication date: 2 November 2012

Younès Mouatassim

Abstract

Purpose

The purpose of this paper is to introduce the zero‐modified distributions in the calculation of operational value‐at‐risk.

Design/methodology/approach

This kind of distribution is preferred when an excess of zeroes is observed. In operational risk, this phenomenon may be due to the scarcity of data, the existence of extreme values and/or the threshold above which banks start to collect losses. The paper focuses on the analysis of damage to physical assets.

Findings

The results show that the basic Poisson distribution underestimates the dispersion and therefore leads to an underestimated capital charge. Zero‐modified Poisson distributions, by contrast, model the frequency well, and the basic negative binomial and its related zero‐modified distributions likewise offer good predictions of count events. To choose the distribution that best fits the frequency, the paper uses Vuong's test. Its results indicate that the zero‐modified Poisson distributions, the basic negative binomial and its related zero‐modified distributions are equivalent. This conclusion is confirmed by the calculated capital charges, since the differences between the six aggregations are not significant, except for that of the basic Poisson distribution.

Originality/value

Zero‐modified formulations have recently been widely used in many fields because of the low frequency of the events studied. This article describes the frequency of operational risk losses using zero‐modified distributions.
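
As a rough illustration of the zero-modified approach described in this abstract, the sketch below fits both a basic Poisson and a zero-inflated Poisson (one common zero-modified form) to a small sample of loss counts. The data, starting values and dispersion comparison are illustrative assumptions, not the paper's own calculations.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import poisson

    # Hypothetical monthly loss counts with an excess of zeroes (illustrative only).
    counts = np.array([0, 0, 0, 0, 0, 0, 1, 0, 0, 2, 0, 0, 0, 3, 0, 0, 1, 0, 0, 5])

    # Basic Poisson: the MLE of the rate is simply the sample mean.
    lam_basic = counts.mean()

    # Zero-inflated Poisson: P(0) = pi + (1 - pi) e^(-lam); P(k) = (1 - pi) Pois(k; lam) for k >= 1.
    def zip_negloglik(params):
        pi, lam = params
        p_zero = pi + (1 - pi) * np.exp(-lam)
        p_pos = (1 - pi) * poisson.pmf(counts[counts > 0], lam)
        return -((counts == 0).sum() * np.log(p_zero) + np.log(p_pos).sum())

    res = minimize(zip_negloglik, x0=[0.3, 1.0],
                   bounds=[(1e-6, 1 - 1e-6), (1e-6, None)])
    pi_hat, lam_zip = res.x

    print(f"sample mean = {counts.mean():.2f}, sample variance = {counts.var(ddof=1):.2f}")
    print(f"basic Poisson rate = {lam_basic:.2f} (implies variance = {lam_basic:.2f})")
    print(f"zero-inflated Poisson: pi = {pi_hat:.2f}, lambda = {lam_zip:.2f}")

With overdispersed data like this, the Poisson fit forces the variance to equal the mean, which is the underestimation of dispersion the Findings section refers to.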

Article
Publication date: 1 March 1987

JEAN TAGUE and ISOLA AJIFERUKE

Abstract

Two dynamic models of library circulation, the Markov model originally proposed by Morse and the mixed Poisson model proposed by Burrell and Cane, are applied to a large eleven‐year university circulation data set. Goodness of fit tests indicate that neither model fits the data. In both cases, the set of non‐circulating items is larger than that predicted by the model.
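
A minimal sketch of the kind of goodness-of-fit check described above, using assumed circulation counts rather than the authors' eleven-year data set; it shows how an excess of non-circulating items appears as a poor Poisson fit at zero.

    import numpy as np
    from scipy.stats import poisson, chisquare

    # Assumed counts of items borrowed k = 0, 1, 2, 3 and 4-or-more times in a year.
    observed = np.array([620, 180, 90, 40, 70])
    n = observed.sum()

    # Crude sample mean, scoring the open-ended "4+" bucket as 5 borrowings.
    lam = (observed * np.array([0, 1, 2, 3, 5])).sum() / n

    expected = n * np.append(poisson.pmf([0, 1, 2, 3], lam), poisson.sf(3, lam))
    stat, p = chisquare(observed, expected, ddof=1)   # one fitted parameter
    print(f"lambda = {lam:.2f}, chi-square = {stat:.1f}, p = {p:.3g}")
    print(f"non-circulating items: observed = {observed[0]}, expected = {expected[0]:.0f}")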

Details

Journal of Documentation, vol. 43 no. 3
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 1 April 1974

P.R. BIRD

Abstract

Most documentation systems allocate a variable number of descriptors to their documents. From a consideration of indexing as a stochastic process it is suggested that the distribution of indexing depth in such a system might represent samples of a (truncated) mixed Poisson process. Examination of five different systems showed that indexing depth does appear to be distributed in this manner, since a reasonable fit to negative binomial distributions can be made statistically. Factors in the art of indexing which influence the distribution are discussed. As a first approximation the distribution of indexing depth, i, of a system, or of any subset of descriptors in it, is simple Poisson, p(i) = e^(−m) m^i/i!, where m is the average depth of indexing. The results contradict previous reports that a log‐normal distribution of indexing depth is to be expected.
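
The first-approximation formula above can be evaluated directly; the sketch below also places a negative binomial with the same mean alongside it to show the heavier tail such mixed models allow. The depth m and the negative binomial shape are illustrative values, not fitted to any of the five systems examined.

    from math import exp, factorial
    from scipy.stats import nbinom

    m = 6.0              # assumed average indexing depth
    r = 4.0              # negative binomial shape (illustrative)
    p = r / (r + m)      # scipy's nbinom parameterisation gives mean r(1 - p)/p = m

    print("depth   Poisson p(i)=e^(-m) m^i/i!   negative binomial")
    for i in range(21):
        p_pois = exp(-m) * m**i / factorial(i)
        p_nb = nbinom.pmf(i, r, p)
        print(f"{i:5d}   {p_pois:26.4f}   {p_nb:17.4f}")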

Details

Journal of Documentation, vol. 30 no. 4
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 5 June 2007

Stephen J. Bensman

Abstract

Purpose

The purpose of this article is to analyze the historical significance of Donald J. Urquhart, who established the National Lending Library for Science and Technology (NLL) that later was merged into the British Library Lending Division (BLLD), now called the British Library Document Supply Centre (BLDSC).

Design/methodology/approach

The paper presents a short history of the probabilistic revolution, particularly as it developed in the UK in the form of biometric statistics due to Darwin's theory of evolution. It focuses on the overthrow of the normal paradigm, according to which frequency distributions in nature and society conform to the normal law of error. The paper discusses the importance of the Poisson distribution and its utilization in the construction of stochastic models that better describe reality. Here the focus is on the compound Poisson distribution in the form of the negative binomial distribution (NBD). The paper then shows how Urquhart extended the probabilistic revolution to librarianship by using the Poisson as the probabilistic model in his analyses of the 1956 external loans made by the Science Museum Library (SML) as well as in his management of the scientific and technical (sci/tech) journal collection of the NLL. Thanks to this, Urquhart can be considered as playing a pivotal role in the creation of bibliometrics or the statistical bases of modern library and information science. The paper relates how Urquhart's son and daughter‐in‐law, John A. and Norma C. Urquhart, completed Urquhart's probabilistic breakthrough by advancing for the first time the NBD as the model for library use in a study executed at the University of Newcastle upon Tyne, connecting bibliometrics with biometrics. It concludes with a discussion of Urquhart's Law and its probabilistic implications for the use of sci/tech journals in a library system.

Findings

By being the first librarian to apply probability to the analysis of sci/tech journal use, Urquhart was instrumental in the creation of modern library and information science. His findings force a probabilistic re‐conceptualization of sci/tech journal use in a library system that has great implications for the transition of sci/tech journals from locally held paper copies to shared electronic databases.

Originality/value

Urquhart's significance is considered from the perspective of the development of science as a whole as well as library and information science in particular.

Details

Interlending & Document Supply, vol. 35 no. 2
Type: Research Article
ISSN: 0264-1615

Keywords

Article
Publication date: 1 January 1983

A. BAGUST

Abstract

Starting from a basis laid by Burrell, this paper develops a stochastic model of library borrowing using the Negative Binomial distribution. This shows an improvement over previous characterizations for academic libraries and accords well with new data obtained at Huddersfield Public Library. Evidence concerning the process of issue decay is presented and employed to obtain an optimum stock turnover rate for any collection in its ‘steady state’. A method is then given by which simple relegation tests can be constructed to maintain such an optimum turnover. Although not the ‘final word’ in circulation modelling, the negative binomial distribution extends the range of model applicability into the area of high‐volume, rapid‐movement collections with some success.
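
A minimal sketch of a method-of-moments negative binomial fit to borrowing counts of the kind such a model describes; the issue counts are invented for illustration and the parameterisation follows scipy's nbinom convention rather than the paper's notation.

    import numpy as np
    from scipy.stats import nbinom

    # Invented yearly issue counts per title (many low-use titles, a few heavily used).
    issues = np.array([0]*300 + [1]*150 + [2]*90 + [3]*60 + [4]*40 + [5]*25 + [8]*10)

    mean, var = issues.mean(), issues.var(ddof=1)
    if var <= mean:
        raise ValueError("no overdispersion; a plain Poisson model would suffice")

    p = mean / var               # nbinom success probability
    r = mean**2 / (var - mean)   # nbinom shape parameter

    print(f"mean = {mean:.2f}, variance = {var:.2f}  ->  r = {r:.2f}, p = {p:.2f}")
    print("fitted probability of no issues in a year:", round(nbinom.pmf(0, r, p), 3))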

Details

Journal of Documentation, vol. 39 no. 1
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 1 February 1997

M. Xie and T.N. Goh

Abstract

Control charts based on the geometric distribution have been shown to be useful when it is a better approximation of the underlying distribution than the Poisson distribution; the traditional c‐chart, if used, will cause too many false alarms. It is noted that, for the geometric distribution, control limits set at k times the standard deviation, as used previously, cause frequent false alarms and cannot yield any reasonable lower control limit. Studies the use of probability limits to resolve these problems, and discusses the use of the geometric distribution for process control of high‐yield processes.
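
The contrast between k-sigma limits and probability limits can be seen in a few lines; the defect probability and false-alarm rate below are assumed values for a high-yield process, not figures from the paper.

    from scipy.stats import geom

    p = 0.001         # assumed defect probability for a high-yield process
    alpha = 0.0027    # overall false-alarm rate matching a traditional 3-sigma chart

    # scipy's geom is the number of items inspected up to and including the first defect.
    mean, sd = geom.mean(p), geom.std(p)

    # k-sigma limits: the lower limit goes negative and is therefore unusable.
    lcl_sigma, ucl_sigma = mean - 3 * sd, mean + 3 * sd

    # Probability limits: exact alpha/2 tail quantiles of the geometric distribution.
    lcl_prob, ucl_prob = geom.ppf(alpha / 2, p), geom.ppf(1 - alpha / 2, p)

    print(f"3-sigma limits: ({lcl_sigma:.0f}, {ucl_sigma:.0f})   <- negative LCL")
    print(f"probability limits: ({lcl_prob:.0f}, {ucl_prob:.0f})")
    print("false-alarm rate beyond the 3-sigma UCL:", round(geom.sf(ucl_sigma, p), 4))

With these assumed values the 3-sigma upper limit is exceeded noticeably more often than the nominal rate, which is the false-alarm problem the abstract describes, while the probability limits give a usable lower limit.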

Details

International Journal of Quality & Reliability Management, vol. 14 no. 1
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 7 October 2021

Sandra García-Bustos, Nadia Cárdenas-Escobar, Ana Debón and César Pincay

Abstract

Purpose

The study aims to design a control chart based on an exponentially weighted moving average (EWMA) of the Pearson residuals of a negative binomial regression model, in order to detect possible anomalies in mortality data.

Design/methodology/approach

In order to evaluate the performance of the proposed chart, the authors considered official historical records of child deaths in Ecuador. A negative binomial regression model was fitted to the data, and a chart of the Pearson residuals was designed. The parameters of the chart were obtained by simulation, as was the performance of the chart in detecting changes in the mean number of deaths.
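
A minimal sketch of the chart construction described above (not the authors' code): it fits a negative binomial GLM to simulated yearly death counts, takes the Pearson residuals and tracks them with an EWMA at λ = 0.05. The data, dispersion parameter and control-limit constant are all illustrative assumptions.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    years = np.arange(1990, 2016)
    t = years - years.min()
    X = sm.add_constant(t)                        # intercept plus a simple trend covariate
    mu = np.exp(6.0 - 0.03 * t)                   # hypothetical mean yearly deaths
    deaths = rng.negative_binomial(n=10, p=10 / (10 + mu))

    # Negative binomial GLM (dispersion alpha assumed known here for simplicity).
    fit = sm.GLM(deaths, X, family=sm.families.NegativeBinomial(alpha=0.1)).fit()
    resid = fit.resid_pearson

    # EWMA of the Pearson residuals with lambda = 0.05 and asymptotic +/- L sigma limits.
    lam, L = 0.05, 3.0
    z = np.zeros_like(resid)
    for i in range(len(resid)):
        z[i] = lam * resid[i] + (1 - lam) * (z[i - 1] if i else 0.0)
    sigma_z = resid.std(ddof=1) * np.sqrt(lam / (2 - lam))
    print("years signalled:", years[np.abs(z) > L * sigma_z])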

Findings

When the chart was plotted, outliers were detected in child deaths in the years 1990–1995, 2001–2006 and 2013–2015, which could indicate underreporting or excessive growth in mortality. In the performance analysis, the value λ = 0.05 gave the fastest detection of changes in the mean number of deaths.

Originality/value

The proposed charts perform better than EWMA charts of deviance residuals, with the notable advantage that Pearson residuals are much easier to interpret and calculate. Finally, the authors point out that although this paper applies the control charts only to Ecuadorian infant mortality, the methodology can be used to monitor mortality in any geographical area or to detect outbreaks of infectious diseases.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 10
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 1 March 1989

MICHAEL J. NELSON

Abstract

Distributions of index terms have been used in modelling information retrieval systems and databases. Most previous models used some form of the Zipf distribution. This work uses a probability model of the occurrence of index terms to derive discrete distributions which are mixtures of Poisson and negative binomial distributions. These distributions, the generalised inverse Gaussian‐Poisson and the Generalised Waring, give better fits than the simpler Zipf distribution, particularly in the tails of the distribution where the high‐frequency terms are found. They have the advantage of being more explanatory and can incorporate a time parameter if necessary.
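
The "mixture of Poissons" idea behind these distributions can be illustrated with the simplest case: a Poisson whose rate varies across terms according to a gamma distribution yields negative binomial term frequencies. The mixing parameters below are illustrative, not fitted to any database.

    import numpy as np
    from scipy.stats import gamma, nbinom

    rng = np.random.default_rng(0)
    shape, scale = 0.8, 5.0                       # gamma mixing distribution over term rates

    rates = gamma.rvs(shape, scale=scale, size=100_000, random_state=rng)
    freqs = rng.poisson(rates)                    # simulated occurrence counts per term

    # The equivalent negative binomial has r = shape and p = 1 / (1 + scale).
    r, p = shape, 1.0 / (1.0 + scale)
    print("freq   simulated   negative binomial pmf")
    for k in range(6):
        print(f"{k:4d}   {(freqs == k).mean():9.4f}   {nbinom.pmf(k, r, p):21.4f}")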

Details

Journal of Documentation, vol. 45 no. 3
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 1 January 1982

QUENTIN BURRELL

Abstract

Alternative forms of the desirability distribution for library materials, as defined by the author in an earlier work, are discussed. It is demonstrated that, while several different distributions may adequately describe an observed circulation frequency distribution over a fairly short time period (one year, say), the long‐term implications may be quite different. Some of the statistical aspects are discussed with an eye to ensuring that the most appropriate model is used.

Details

Journal of Documentation, vol. 38 no. 1
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 2 November 2012

Wael Hemrit and Mounira Ben Arab

Abstract

Purpose

The purpose of this paper is to examine the determinants of operational losses in insurance companies.

Design/methodology/approach

Using the most common estimates of the frequency and severity of losses that affected business lines during 2009, the paper integrates a quantitative aspect that reflects the mode of organization of the insurance company. The paper focuses on the frequency and severity of losses, as estimated by insurers, for each category of operational risk event that took place in 2009.

Findings

The paper finds that the frequency of operational losses is positively related to the Market Share (MARKSHARE) and the Rate of Geographic Location (RAGELOC). However, the occurrence of loss is negatively related to the Variety of Insurance Activities (VARIACT). The paper also finds a decrease in the frequency of losses associated with a large number of employees, indicating a significant relationship between the Human Factor (HF) and the occurrence of operational losses. In terms of severity, the empirical study shows that the probability of zero intensity of operational losses is negatively influenced by the Market Share (MARKSHARE) and the Rate of Geographic Location (RAGELOC). In the same framework, the Variety of Insurance Activities (VARIACT) has a negative effect on the probability of high operational loss severity.

Originality/value

Despite the absence of quantitative operational risk data, this article opens a new research perspective for estimating the frequency and severity of operational losses in the insurance sector in Tunisia.

Details

The Journal of Risk Finance, vol. 13 no. 5
Type: Research Article
ISSN: 1526-5943

Keywords
