Search results

1 – 10 of over 1000
Article
Publication date: 20 September 2021

Marwa Kh. Hassan

The purpose of this study is to obtain the modified maximum likelihood estimator of the stress–strength model using ranked set sampling, to obtain the asymptotic and…

Abstract

Purpose

The purpose of this study is to obtain the modified maximum likelihood estimator of the stress–strength model using ranked set sampling, to obtain the asymptotic and bootstrap confidence intervals of P[Y < X], to compare the performance of the author's estimates with the estimates under simple random sampling, and to apply the author's estimates to head and neck cancer data.

Design/methodology/approach

The maximum likelihood estimator of R = P[Y < X], where X and Y are two independent inverse Weibull random variables with a common shape parameter (which governs the shape of the distribution) and different scale parameters (which govern its dispersion), is derived under ranked set sampling, together with the asymptotic and bootstrap confidence intervals. A Monte Carlo simulation shows that this estimator performs better than the estimator under simple random sampling, and that the asymptotic and bootstrap confidence intervals under ranked set sampling are better than the corresponding intervals under simple random sampling. The application to head and neck cancer data shows that the estimator of R = P[Y < X] indicates treatment with radiotherapy alone to be more efficient than treatment with combined radiotherapy and chemotherapy, with the ranked set sampling estimates again outperforming their simple random sampling counterparts.
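
For orientation only, a minimal sketch (not the author's ranked-set-sampling estimator): under one common inverse Weibull parameterisation, F(x; α, λ) = exp(−λx^(−α)), the stress–strength reliability has the closed form R = λ_X/(λ_X + λ_Y), which the simple-random-sampling simulation below checks; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, lam_x, lam_y = 2.0, 3.0, 1.5          # illustrative values, not from the paper

def rinvweibull(lam, size):
    # Inverse-transform sampling: if U ~ Uniform(0, 1), then
    # X = (-lam / log(U))**(1/alpha) has CDF exp(-lam * x**(-alpha)).
    u = rng.uniform(size=size)
    return (-lam / np.log(u)) ** (1.0 / alpha)

x = rinvweibull(lam_x, 200_000)              # strength X
y = rinvweibull(lam_y, 200_000)              # stress Y
print("Monte Carlo estimate of R:", np.mean(y < x))
print("Closed form lam_x/(lam_x+lam_y):", lam_x / (lam_x + lam_y))
```

A larger λ_X shifts the strength distribution stochastically upward, so R rises toward one; this is the quantity the paper estimates more precisely under ranked set sampling.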

Findings

Ranked set sampling is more effective than simple random sampling for inference on the stress-strength model based on the inverse Weibull distribution.

Originality/value

This study sheds light on the application of the author's estimates to head and neck cancer data.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 7
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 24 August 2021

Soumya Roy, Biswabrata Pradhan and Annesha Purakayastha

This article considers Inverse Gaussian distribution as the basic lifetime model for the test units. The unknown model parameters are estimated using the method of moments, the…

Abstract

Purpose

This article considers the Inverse Gaussian distribution as the basic lifetime model for the test units. The unknown model parameters are estimated using the method of moments, the method of maximum likelihood and Bayesian methods. As part of the maximum likelihood analysis, the article employs an expectation-maximization algorithm to simplify the numerical computation. Bayesian estimates are then obtained using the Metropolis–Hastings algorithm. The article also presents the design of optimal censoring schemes using a design criterion based on the precision of a particular system lifetime quantile; the optimal censoring schemes are obtained after taking budget constraints into account.
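
A hedged sketch of the general ingredients named above, not the article's implementation: the standard log-likelihood for Progressive Type-I Interval censored data under an Inverse Gaussian lifetime model, sampled with a plain random-walk Metropolis step. The inspection times, counts, flat prior and proposal scale are all illustrative.

```python
import numpy as np
from scipy.stats import invgauss

t = np.array([1.0, 2.0, 3.0, 4.0])      # inspection times
d = np.array([4, 7, 5, 3])              # failures observed in (t_{i-1}, t_i]
r = np.array([2, 1, 0, 8])              # survivors withdrawn at t_i

def ig_cdf(x, mu, lam):
    # IG(mean=mu, shape=lam) via SciPy: invgauss(mu/lam, scale=lam) is the usual mapping
    return invgauss.cdf(x, mu / lam, scale=lam)

def loglik(mu, lam):
    if mu <= 0 or lam <= 0:
        return -np.inf
    F = ig_cdf(t, mu, lam)
    p = np.clip(np.diff(np.concatenate(([0.0], F))), 1e-300, None)   # interval probabilities
    s = np.clip(1.0 - F, 1e-300, None)                               # survival at withdrawals
    return np.sum(d * np.log(p)) + np.sum(r * np.log(s))

rng = np.random.default_rng(1)
theta = np.array([2.0, 2.0])            # (mu, lambda) starting values
current = loglik(*theta)
draws = []
for _ in range(5000):
    prop = theta + rng.normal(scale=0.2, size=2)      # random-walk proposal
    cand = loglik(*prop)
    if np.log(rng.uniform()) < cand - current:        # flat prior on (0, inf)^2 assumed
        theta, current = prop, cand
    draws.append(theta)
draws = np.array(draws[1000:])                        # discard burn-in
print("posterior medians (mu, lambda):", np.median(draws, axis=0))
```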

Design/methodology/approach

This article first presents classical and Bayesian statistical inference for Progressive Type-I Interval censored data. Subsequently, this article considers the design of optimal Progressive Type-I Interval censoring schemes after incorporating budget constraints.

Findings

A real dataset is analyzed to demonstrate the methods developed in this article. The adequacy of the lifetime model is assessed using a simulation-based goodness-of-fit test, and the performance of the various estimators is studied in a detailed simulation experiment. The maximum likelihood estimator is observed to outperform the method of moments estimator, while the posterior median fares best among the Bayesian estimators even in the absence of any subjective prior information. It is also observed that budget constraints have real implications for the optimal design of censoring schemes.

Originality/value

The proposed methodology may be used for analyzing any Progressive Type-I Interval Censored data for any lifetime model. The methodology adopted to obtain the optimal censoring schemes may be particularly useful for reliability engineers in real-life applications.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 8
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 13 February 2019

Preeti Wanti Srivastava and Manisha Manisha

Zero-failure reliability testing aims at demonstrating whether the product has achieved the desired reliability target with zero failure and high confidence level at a given time…

Abstract

Purpose

Zero-failure reliability testing aims at demonstrating, at a high confidence level, whether the product has achieved the desired reliability target with zero failures by a given time. Incorporating accelerated degradation testing into a zero-failure reliability demonstration test (RDT) induces early failures in high-reliability items developed within a short period of time so that they can survive in a fiercely competitive market. The paper aims to discuss these issues.

Design/methodology/approach

The triangular cyclic stress profile uses one test chamber, thus saving experimental cost. The model parameters are estimated using the maximum likelihood method. The optimum plan consists in finding the optimum number of cycles, number of specimens, stress change point(s) and stress rates.

Findings

The optimum plan consists in finding the optimum number of cycles, number of specimens, stress change point(s) and stress rates by minimizing the asymptotic variance of the estimate of a quantile of the lifetime distribution at the use condition, subject to the constraint that the total testing (experimental) cost does not exceed a pre-specified budget. Confidence intervals for the design parameters have been obtained and a sensitivity analysis carried out. The results of the sensitivity analysis show that the plan is robust to small deviations from the true values of the baseline parameters.
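
Schematically, the budget-constrained optimisation described above has the shape sketched below; `avar_quantile` and `test_cost` are hypothetical stand-ins, not the paper's expressions, and the design vector is only indicative.

```python
import numpy as np
from scipy.optimize import minimize

BUDGET = 100.0

def avar_quantile(x):
    # stand-in objective: asymptotic variance of a quantile estimate that
    # shrinks with more cycles/specimens (hypothetical form, for shape only)
    n_cycles, n_spec, change_pt, rate = x
    return 1.0 / (n_cycles * n_spec) + 0.1 * (change_pt - 0.5) ** 2 + 0.05 / rate

def test_cost(x):
    # stand-in cost model for the total experimental cost
    n_cycles, n_spec, change_pt, rate = x
    return 2.0 * n_cycles + 5.0 * n_spec + 10.0 * rate

res = minimize(
    avar_quantile,
    x0=np.array([5.0, 10.0, 0.5, 1.0]),
    method="SLSQP",
    bounds=[(1, 50), (1, 50), (0.1, 0.9), (0.1, 5.0)],
    constraints=[{"type": "ineq", "fun": lambda x: BUDGET - test_cost(x)}],  # cost <= budget
)
print("design:", res.x, "objective:", res.fun)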

Originality/value

For some highly reliable products, even accelerated life testing yields little failure data in a feasible amount of time. In such cases accelerated degradation testing is carried out, wherein failure, termed soft failure, is defined in terms of a performance characteristic of the product exceeding its critical (threshold) value.

Details

International Journal of Quality & Reliability Management, vol. 36 no. 3
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 29 September 2022

Rani Kumari, Chandrakant Lodhi, Yogesh Mani Tripathi and Rajesh Kumar Sinha

Inferences for multicomponent reliability are derived for a family of inverted exponentiated densities having a common scale and different shape parameters.

Abstract

Purpose

Inferences for multicomponent reliability are derived for a family of inverted exponentiated densities having a common scale and different shape parameters.

Design/methodology/approach

Different estimates of multicomponent reliability are derived from a frequentist viewpoint. Two bootstrap confidence intervals of this parametric function are also constructed.

Findings

From a Monte Carlo simulation study, the authors find that estimates obtained from the maximum product of spacings and right-tail Anderson–Darling procedures provide better point and interval estimates of the reliability. The maximum likelihood estimate also competes well with these estimates.
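
For readers unfamiliar with the maximum product of spacings (MPS) idea mentioned above, a minimal illustration: maximise the sum of log spacings of the fitted CDF evaluated at the ordered sample. The inverted exponential CDF F(x) = exp(−λ/x) is used here only as a simple stand-in, not as the paper's inverted exponentiated family.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)
lam_true = 2.0
x = np.sort(lam_true / rng.exponential(size=100))   # X = lam/E has CDF exp(-lam/x)

def neg_log_spacings(lam):
    # spacings D_i = F(x_(i)) - F(x_(i-1)), with F(x_(0)) = 0 and F(x_(n+1)) = 1
    if lam <= 0:
        return np.inf
    F = np.concatenate(([0.0], np.exp(-lam / x), [1.0]))
    return -np.sum(np.log(np.clip(np.diff(F), 1e-300, None)))

res = minimize_scalar(neg_log_spacings, bounds=(1e-3, 50.0), method="bounded")
print("MPS estimate of lambda:", res.x)
```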

Originality/value

In the literature, several distributions have been introduced and studied for lifetime analysis. Among others, exponentiated distributions have found wide application in such studies. In this regard, the authors obtain various frequentist estimates of the multicomponent reliability by considering inverted exponentiated distributions.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 4
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 1 August 1997

Richard L. Henshel

Briefly reviews the standard Poisson distribution and then examines a set of derivative, modified Poisson distributions for testing hypotheses derived from positive…

Abstract

Briefly reviews the standard Poisson distribution and then examines a set of derivative, modified Poisson distributions for testing hypotheses derived from positive deviation‐amplifying feedback models, which do not lend themselves to ordinary statistically based hypothesis testing. The “reinforcement” or “contagious” Poisson offers promise for a subset of such models, in particular those models with data in the form of rates (rather than magnitudes). The practical difficulty lies in distinguishing reinforcement effects from initial heterogeneity, since both can form negative binomial distributions, with look‐alike data. Illustrates these difficulties, and also opportunities, for various feedback models employing the self‐fulfilling prophecy, and especially for confidence loops, which incorporate particular self‐fulfilling prophecies as part of a larger dynamic process. Describes an actual methodology for testing hypotheses regarding confidence loops with the aid of a “reinforcement” Poisson distribution, as well as its place within sociocybernetics.
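
A small numerical illustration of the identification problem described above, with illustrative parameters: pure heterogeneity (a gamma-mixed Poisson) already yields negative binomial counts, so the marginal count distribution alone cannot separate it from a reinforcement ("contagious") process producing the same negative binomial.

```python
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(6)
r, mean = 2.0, 3.0                                         # illustrative parameters
rates = rng.gamma(shape=r, scale=mean / r, size=100_000)   # unit-level heterogeneity
counts = rng.poisson(rates)                                # mixed Poisson counts

p = r / (r + mean)                                         # matching negative binomial
k = np.arange(0, 15)
empirical = np.array([(counts == v).mean() for v in k])
print(np.round(empirical, 4))
print(np.round(nbinom.pmf(k, r, p), 4))                    # essentially identical
```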

Article
Publication date: 12 October 2015

Kelbesa Abdisa Megersa

The study of the link between debt and growth has been full of debates, both in theory and empirics. However, there is a growing consensus that the relationship is sensitive to…

Abstract

Purpose

The study of the link between debt and growth has been full of debate, both in theory and in empirics. However, there is a growing consensus that the relationship is sensitive to the level of debt. The purpose of this paper is to address the question of non-linearity in the long-term relationship between public debt and economic growth. Specifically, the author sets out to test whether there exists an established "Laffer curve" type relationship, where debt contributes to economic growth up to a certain point (maximal threshold) and then starts to have a negative effect on growth.

Design/methodology/approach

To carry out the tests, the author has used a methodology that delivers a superior test of inverse U-shapes (Lind and Mehlum, 2010), in addition to the traditional test based on a regression with a quadratic specification.
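
For orientation, a minimal sketch of the quadratic-specification slope check that such U-shape tests build on (this is not the Lind and Mehlum test itself, and the data are simulated): regress growth on debt and debt squared, then examine the fitted slope at the lowest and highest observed debt levels.

```python
import numpy as np

rng = np.random.default_rng(3)
debt = rng.uniform(10, 120, size=200)                    # debt-to-GDP (%), simulated
growth = 2 + 0.06 * debt - 0.0005 * debt**2 + rng.normal(0, 0.5, 200)

X = np.column_stack([np.ones_like(debt), debt, debt**2])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)        # OLS estimates
resid = growth - X @ beta
sigma2 = resid @ resid / (len(debt) - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)                    # OLS covariance matrix

def slope(d):
    g = np.array([0.0, 1.0, 2.0 * d])                    # gradient of b1 + 2*b2*d
    return g @ beta, np.sqrt(g @ cov @ g)                # slope and its standard error

for d in (debt.min(), debt.max()):
    s, se = slope(d)
    print(f"slope at debt={d:5.1f}: {s:+.4f} (s.e. {se:.4f})")
print("turning point of the bell:", -beta[1] / (2 * beta[2]))
```

An inverse-U pattern requires a significantly positive slope at the low end and a significantly negative slope at the high end, which is the condition the Lind and Mehlum procedure formalises.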

Findings

The results in the paper present evidence of a bell-shaped relationship between economic growth and total public debt in a panel of low-income Sub-Saharan African economies. This supports the hypothesis that debt has some positive contribution to economic growth in low-income countries, albeit up to a point.

Practical implications

The overall result supports the claim that public debt may start to be a drag on economic growth if it goes on increasing beyond the level where it would be sustainable.

Originality/value

This paper leads the way by implementing a robust test of non-linearity (the "inverse-U" test) in the analysis of the debt-growth nexus and the Laffer curve in Sub-Saharan Africa.

Details

Journal of Economic Studies, vol. 42 no. 5
Type: Research Article
ISSN: 0144-3585

Keywords

Article
Publication date: 16 January 2017

Sharif Mozumder, Michael Dempsey and M. Humayun Kabir

The purpose of the paper is to back-test value-at-risk (VaR) models for conditional distributions belonging to a Generalized Hyperbolic (GH) family of Lévy processes – Variance…

Abstract

Purpose

The purpose of the paper is to back-test value-at-risk (VaR) models for conditional distributions belonging to the Generalized Hyperbolic (GH) family of Lévy processes – Variance Gamma, Normal Inverse Gaussian, Hyperbolic distribution and GH – and to compare their risk-management features with a traditional unconditional extreme value (EV) approach, using futures contract return data for the S&P500, FTSE100, DAX, Hang Seng and Nikkei 225 indices.

Design/methodology/approach

The authors apply tail-based and Lévy-based calibration to estimate the parameters of the models as part of the initial data analysis. While the authors utilize the peaks-over-threshold approach for the generalized Pareto distribution, the conditional maximum likelihood method is followed for the Lévy models. As the Lévy models do not have closed-form expressions for VaR, the authors follow a bootstrap method to determine the VaR and its confidence intervals. Finally, for back-testing, they use both static calibration (on the entire dataset) and dynamic calibration (on a four-year rolling window) to test the unconditional, independence and conditional coverage hypotheses at the 95 and 99 per cent VaR levels.
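
As background for the unconditional EV benchmark, a minimal peaks-over-threshold sketch: fit a generalized Pareto distribution to losses above a high threshold and plug the estimates into the standard POT quantile formula. The return series and threshold choice below are illustrative, not the paper's data.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
losses = -0.01 * rng.standard_t(df=4, size=5000)      # simulated heavy-tailed daily losses

u = np.quantile(losses, 0.95)                         # high threshold
exceed = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0)           # shape/scale of excess losses

def pot_var(p, n=len(losses), n_u=len(exceed)):
    # VaR_p = u + (beta/xi) * [((n/n_u) * (1 - p))**(-xi) - 1]
    return u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

for p in (0.95, 0.99):
    print(f"{p:.0%} VaR (loss quantile): {pot_var(p):.4f}")
```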

Findings

Both the EV and Lévy models provide a conservative proportion of violations for the VaR forecasts. Whether a model targets the tail or fits the entire distribution has little effect on either the VaR calculation or the model's back-testing performance.

Originality/value

To the best of the authors' knowledge, this is the first study to explore the back-testing performance of Lévy-based VaR models. The authors apply various calibration and bootstrap techniques to test the unconditional, independence and conditional coverage hypotheses for the VaRs.

Details

The Journal of Risk Finance, vol. 18 no. 1
Type: Research Article
ISSN: 1526-5943

Keywords

Article
Publication date: 6 March 2017

Zbigniew Bulinski and Helcio R.B. Orlande

This paper aims to present development and application of the Bayesian inverse approach for retrieving parameters of non-linear diffusion coefficient based on the integral…

Abstract

Purpose

This paper aims to present the development and application of a Bayesian inverse approach for retrieving the parameters of a non-linear diffusion coefficient based on integral information.

Design/methodology/approach

The Bayes formula was used to construct the posterior distribution of the unknown parameters of the non-linear diffusion coefficient. The resulting a posteriori distribution of the sought parameters was integrated using the Markov chain Monte Carlo method to obtain the expected values of the estimated diffusivity parameters as well as their confidence intervals. The unsteady non-linear diffusion equation was discretised with the Global Radial Basis Function Collocation method and solved in time using the Crank–Nicolson technique.
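
A minimal finite-difference sketch of the time-stepping ingredient (the paper itself uses RBF collocation in space): one Crank–Nicolson step for u_t = ∂/∂x(D(u) ∂u/∂x) with the non-linear diffusivity lagged at the previous time level; D(u), the grid and the initial profile are assumed toy choices.

```python
import numpy as np

def D(u):                      # toy concentration-dependent diffusivity
    return 0.1 + 0.5 * u**2

def cn_step(u, dx, dt):
    n = u.size
    Dh = D(0.5 * (u[:-1] + u[1:]))        # diffusivity at half-nodes, lagged at t_n
    A = np.zeros((n, n))
    for i in range(1, n - 1):             # interior nodes; Dirichlet ends kept fixed
        A[i, i - 1] = Dh[i - 1] / dx**2
        A[i, i + 1] = Dh[i] / dx**2
        A[i, i] = -(Dh[i - 1] + Dh[i]) / dx**2
    I = np.eye(n)
    # Crank-Nicolson: (I - dt/2 A) u^{n+1} = (I + dt/2 A) u^n
    return np.linalg.solve(I - 0.5 * dt * A, (I + 0.5 * dt * A) @ u)

x = np.linspace(0.0, 1.0, 51)
u = np.exp(-100 * (x - 0.5) ** 2)         # initial concentration profile
for _ in range(200):
    u = cn_step(u, dx=x[1] - x[0], dt=1e-3)
print("peak concentration after diffusion:", u.max())
```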

Findings

A number of manufactured analytical solutions of the non-linear diffusion problem were used to verify the accuracy of the developed inverse approach. Reasonably good agreement, even for highly correlated parameters, was obtained. The technique was therefore used to compute the concentration-dependent diffusion coefficient of water in paper.

Originality/value

An original inverse technique, which efficiently couples a meshless solution of the diffusion problem with the Bayesian inverse methodology, is presented in the paper. The methodology was extensively verified and applied to a real-life problem.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 27 no. 3
Type: Research Article
ISSN: 0961-5539

Keywords

Article
Publication date: 6 March 2017

Wit Stryczniewicz, Janusz Zmywaczyk and Andrzej Jaroslaw Panas

The paper aims to discuss the inverse heat conduction methodology in solution of a certain parameter identification problem. The problem itself concerns determination of the…

Abstract

Purpose

The paper aims to discuss the inverse heat conduction methodology in the solution of a certain parameter identification problem. The problem concerns determination of the thermophysical properties of a thin-layer coating by applying the laser flash apparatus.

Design/methodology/approach

The modelled laser flash diffusivity data from the three-layer sample investigation are used as input for the subsequent parameter estimation procedure. Assuming known middle-layer (i.e. substrate) properties, the thermal diffusivity (TD) of the side layers' material is determined. The estimation technique utilises the finite element method for the numerical solution of the direct, 2D axisymmetric heat conduction problem.

Findings

The paper presents the methodology developed for three-layer sample studies and the results of testing and evaluating the estimation technique on simulated data. The multi-parameter identification procedure yields the out-of-plane thin-layer material diffusivity from the inverse problem solution.

Research limitations/implications

The presentation itself is limited to numerical simulation data, but it should be underlined that the thermophysical parameters of flake graphite have been used in the numerical tests.

Practical implications

The developed methodology is planned to be applied in detailed experimental studies of flake graphite.

Originality/value

In the course of the present study, a methodology for determining the TD of a thin coating layer was developed. Although it was developed for the graphite coating investigation, it is intended to be universally applicable to any thin–thick composite structure study.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 27 no. 3
Type: Research Article
ISSN: 0961-5539

Keywords

Article
Publication date: 17 July 2020

Preeti Wanti Srivastava, Manisha Manisha and Manju Agarwal

Degradation measurement of some products requires destructive inspection; that is, the degradation of each unit can be observed only once. For example, observation on the…

Abstract

Purpose

Degradation measurement of some products requires destructive inspection; that is, the degradation of each unit can be observed only once. For example, observation of the mechanical strength of interconnection bonds or of the dielectric strength of insulators requires destruction of the unit. Testing high-reliability items under normal operating conditions yields only a small amount of degradation in a reasonable length of time. To overcome this problem, the items are tested at a higher than normal stress level – an approach called an accelerated destructive degradation test (ADDT). The present paper deals with the formulation of a constant-stress ADDT (CSADDT) plan with test specimens subjected to stresses induced by temperature and voltage.

Design/methodology/approach

The stress–life relationship between temperature and voltage is described using the Zhurkov–Arrhenius model. A fractional factorial experiment has been used to determine the optimal number of stress combinations. The product's degradation path follows a Wiener process. The model parameters are estimated using the method of maximum likelihood. The optimum plan consists in finding the optimum allocations at each inspection time corresponding to each stress combination using a variance optimality criterion.
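
A hedged sketch of the destructive one-shot likelihood implied by a Wiener degradation path: a unit inspected once at time t under absolute temperature T yields a reading distributed Normal(η(T)·t, σ²t). The single-stress Arrhenius-type drift η(T) = exp(a − b/T) and all data below are illustrative simplifications of the paper's Zhurkov–Arrhenius model with temperature and voltage.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
T = rng.choice([350.0, 375.0, 400.0], size=60)        # absolute test temperatures (K)
t = rng.choice([50.0, 100.0, 150.0], size=60)         # destructive inspection times
a_true, b_true, sigma_true = 4.0, 2.0, 0.3            # b expressed in thousands of kelvin
y = rng.normal(np.exp(a_true - 1000.0 * b_true / T) * t, sigma_true * np.sqrt(t))

def neg_loglik(theta):
    a, b, log_sigma = theta                           # b again in thousands of kelvin
    mu = np.exp(a - 1000.0 * b / T) * t               # Wiener drift times elapsed time
    sd = np.exp(log_sigma) * np.sqrt(t)               # Wiener variance grows linearly in t
    return -np.sum(norm.logpdf(y, loc=mu, scale=sd))

fit = minimize(neg_loglik, x0=np.array([3.0, 1.5, 0.0]), method="Nelder-Mead")
print("MLE (a, b, sigma):", fit.x[0], fit.x[1], np.exp(fit.x[2]))
```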

Findings

The method developed has been explained using a numerical example wherein point estimates and confidence intervals for the model parameters have been obtained and a likelihood ratio test has been used to test for the presence of an interaction effect. It has been found that both the temperature and the interaction between temperature and voltage influence the quantile lifetime of the product. A sensitivity analysis is also carried out.

Originality/value

Most of the work in the literature on the design of ADDT plans focuses on only a single stress factor. An interaction exists among two or more stress factors if the effect of one factor on a response depends on the levels of the other factors. In this paper, an optimal CSADDT plan is studied with one main effect and one interaction effect. The method developed can help engineers study the effect of elevated temperature and its interaction with another stress factor, say voltage, on the quantile lifetime of a high-reliability unit likely to last for several years.

Details

International Journal of Quality & Reliability Management, vol. 38 no. 3
Type: Research Article
ISSN: 0265-671X

Keywords
