Search results

1 – 10 of 810
Article

Hare Krishna and Manish Malik

Abstract

Purpose

This paper focuses on the study and estimation of the reliability characteristics of the Maxwell distribution under a Type-II censoring scheme.

Design/methodology/approach

Maximum likelihood and Bayes estimation methods have been used to estimate the reliability characteristics. Monte Carlo simulation is used to compare the efficiency of the estimates obtained by the two methods.
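As a rough sketch of the maximum likelihood step (using SciPy's parameterisation of the Maxwell distribution and invented sample sizes, not the paper's setup), the scale parameter can be fitted numerically to a Type-II censored sample:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import maxwell

rng = np.random.default_rng(42)
n, r = 30, 20                  # n units on test, stop at the r-th failure
true_scale = 2.0               # hypothetical value, for illustration only

# Observe only the r smallest lifetimes (Type-II censoring)
x = np.sort(maxwell.rvs(scale=true_scale, size=n, random_state=rng))[:r]

def neg_log_lik(a):
    # r observed failures plus (n - r) units still surviving at x[-1]
    return -(maxwell.logpdf(x, scale=a).sum()
             + (n - r) * maxwell.logsf(x[-1], scale=a))

res = minimize_scalar(neg_log_lik, bounds=(0.1, 10.0), method="bounded")
print(f"MLE of the Maxwell scale parameter: {res.x:.3f}")
```

The censored units contribute through the log-survival term, which is what distinguishes this likelihood from the complete-sample case.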

Findings

With prior information on the parameter of the Maxwell distribution, Bayes estimation provides better estimates of the reliability characteristics; otherwise, maximum likelihood estimation is good enough for reliability practitioners to use.

Practical implications

When items are costly, a Type-II censoring scheme can be used to save the cost of the experiment, and the discussed methods provide the means to estimate the reliability characteristics of the proposed lifetime model under this scheme.

Originality/value

The study is useful for researchers and practitioners in reliability theory and also for scientists in physics and chemistry, where Maxwell distribution is widely used.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 2
Type: Research Article
ISSN: 0265-671X

Article

Hare Krishna and Ranjeet Sharma

Abstract

Purpose

The purpose of this paper is to consider a General System Configuration (GSC), of which all the popular system configurations are particular cases. In reliability engineering one comes across various system configurations, for example series, parallel and k-out-of-m system models, which consist of a number of components.

Design/methodology/approach

The paper gives a general approach to express the reliability properties of the whole system in terms of component parameters. The reliability of a GSC is expressed as a polynomial of the component reliability. Lifetime data on components have been used to estimate the system reliability characteristics through classical and Bayes estimation procedures.
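As a concrete instance of such a polynomial (a standard textbook case, not the paper's GSC notation), the reliability of a k-out-of-m system of independent components with common reliability p is a polynomial in p, with series (k = m) and parallel (k = 1) systems as special cases:

```python
from math import comb

def k_out_of_m_reliability(p, k, m):
    """P(at least k of m independent components, each with reliability p, work)."""
    return sum(comb(m, j) * p**j * (1 - p)**(m - j) for j in range(k, m + 1))

p = 0.9                                   # hypothetical component reliability
print(k_out_of_m_reliability(p, 3, 3))    # series: p**3
print(k_out_of_m_reliability(p, 1, 3))    # parallel: 1 - (1 - p)**3
print(k_out_of_m_reliability(p, 2, 3))    # 2-out-of-3 majority system
```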

Findings

The underlying lifetime distribution is assumed to be Weibull and, in view of cost constraints, Type-II censored data have been used.

Practical implications

The paper is useful for reliability practitioners as well as theoreticians. It provides an easy method to estimate the reliability of any system configuration.

Originality/value

Three types of estimation procedures for a general system configuration have been developed for the first time. The lifetimes of components are assumed to follow the widely used Weibull distribution, of which the popular exponential distribution is a particular case.

Details

International Journal of Quality & Reliability Management, vol. 25 no. 7
Type: Research Article
ISSN: 0265-671X

Article

Mayank Kumar Jha, Sanku Dey and Yogesh Mani Tripathi

Abstract

Purpose

The purpose of this paper is to estimate the multicomponent reliability by assuming the unit-Gompertz (UG) distribution. Both stress and strength are assumed to follow a UG distribution with a common scale parameter.

Design/methodology/approach

The reliability of a multicomponent stress–strength system is obtained by maximum likelihood estimation (MLE) and by Bayesian methods. Bayes estimates of system reliability are obtained by using Lindley's approximation and the Metropolis–Hastings (M–H) algorithm when all the parameters are unknown. The highest posterior density (HPD) credible interval is obtained by using the M–H algorithm. Besides, the uniformly minimum variance unbiased estimator (UMVUE) and exact Bayes estimates of system reliability are obtained when the common scale parameter is known, and the results are compared for both small and large samples.
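The quantity being estimated can be illustrated by simulation. The sketch below assumes the unit-Gompertz CDF F(x) = exp(−α(x^−β − 1)) on (0, 1) and invented parameter values, and estimates R_(s,k), the probability that at least s of k strengths exceed the stress:

```python
import numpy as np

rng = np.random.default_rng(0)

def unit_gompertz_rvs(alpha, beta, size):
    # Inverse-CDF sampling from F(x) = exp(-alpha * (x**-beta - 1))
    u = rng.uniform(size=size)
    return (1.0 - np.log(u) / alpha) ** (-1.0 / beta)

s, k = 2, 4                            # system works if >= s of k components do
alpha_strength, alpha_stress, beta = 2.0, 1.0, 1.5   # hypothetical values
N = 100_000

X = unit_gompertz_rvs(alpha_strength, beta, (N, k))  # k strengths per trial
Y = unit_gompertz_rvs(alpha_stress, beta, (N, 1))    # one common stress
R_hat = np.mean((X > Y).sum(axis=1) >= s)
print(f"estimated R_(s,k): {R_hat:.3f}")
```

Under this CDF, with a common β, the transformed variable U = X^−β − 1 is exponential with rate α, so R_(2,4) for these parameter values works out to 16/21 ≈ 0.762, which the simulation should reproduce.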

Findings

Based on the simulation results, the authors observe that the Bayes method provides better estimates than MLE. The proposed asymptotic and HPD intervals show satisfactory coverage probabilities; moreover, the average length of the HPD intervals tends to remain shorter than that of the corresponding asymptotic intervals. Overall, the authors observe that better estimates of the reliability may be achieved when the common scale parameter is known.

Originality/value

Most of the lifetime distributions used in reliability analysis, such as the exponential, Lindley, gamma, lognormal, Weibull and Chen distributions, exhibit only constant, monotonically increasing, decreasing or bathtub-shaped hazard rates. However, in many applications in reliability and survival analysis the most realistic hazard rates are upside-down bathtub- and bathtub-shaped, which are found in the unit-Gompertz distribution. Furthermore, when reliability is measured as a percentage or ratio, it is important to have models defined on the unit interval in order to obtain plausible results. Therefore, the authors study the multicomponent stress–strength reliability under the unit-Gompertz distribution by comparing the MLEs, Bayes estimators and UMVUEs.

Details

International Journal of Quality & Reliability Management, vol. 37 no. 3
Type: Research Article
ISSN: 0265-671X

Article

D.R. Barot and M.N. Patel

Abstract

Purpose

This paper deals with the estimation of empirical Bayes exact confidence limits for the reliability indexes of a cold standby series system with (n+k−1) units under a general progressive Type-II censoring scheme.

Design/methodology/approach

Assuming that the lifetimes of the units in the system are independent and identically distributed exponential random variables, the exact confidence limits of the reliability indexes are derived using an empirical Bayes approach with an exponential prior distribution on the failure rate parameter. The accuracy of these confidence limits is examined in terms of their coverage probabilities by means of Monte Carlo simulation.
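For comparison, under the simpler ordinary Type-II censoring scheme (and without the empirical Bayes layer of the paper), exact confidence limits for an exponential mean life follow from a classical chi-square pivot; the sketch below uses invented test data:

```python
import numpy as np
from scipy.stats import chi2, expon

rng = np.random.default_rng(1)
n, r, theta = 25, 15, 100.0    # hypothetical: mean life theta, stop at r-th failure

x = np.sort(expon.rvs(scale=theta, size=n, random_state=rng))[:r]
T = x.sum() + (n - r) * x[-1]  # total time on test

# Pivot: 2*T/theta follows a chi-square law with 2r degrees of freedom
alpha = 0.05
lower = 2 * T / chi2.ppf(1 - alpha / 2, 2 * r)
upper = 2 * T / chi2.ppf(alpha / 2, 2 * r)
print(f"95% exact confidence limits for theta: ({lower:.1f}, {upper:.1f})")
```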

Findings

The simulation results show that the exact confidence limits of the reliability indexes of a cold standby series system are accurate. The approach is therefore good enough for reliability practitioners to use in order to improve system reliability.

Practical implications

When items are costly, a general progressive Type-II censoring scheme is used to reduce the total test time and the associated cost of an experiment. The proposed method provides the means to estimate the exact confidence limits of the reliability indexes of the proposed cold standby series system under this scheme.

Originality/value

The application of the proposed technique will help reliability engineers, managers and system engineers in the various industrial and other settings where cold standby series systems are widely used.

Details

International Journal of Quality & Reliability Management, vol. 31 no. 3
Type: Research Article
ISSN: 0265-671X

Article

Jared Charles Allen, Alasdair M. Goodwill, Kyle Watters and Eric Beauregard

Abstract

Purpose

The purpose of this paper is to discuss and demonstrate “best practices” for creating quantitative behavioural investigative advice (i.e. statements to assist police with psychological and behavioural aspects of investigations) where complex statistical modelling is not available.

Design/methodology/approach

Using a sample of 361 serial stranger sexual offenses and a cross-validation approach, the paper demonstrates prediction of offender characteristics using base rates and using Bayes' Theorem. The paper predicts four dichotomous offender characteristic variables, first using simple base rates, then using Bayes' Theorem with 16 categorical crime-scene variables as predictors.
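The Bayes' Theorem step can be sketched with invented numbers (these are not the paper's data): a base rate for an offender characteristic is updated by the likelihood of each observed crime-scene indicator, assuming the indicators are conditionally independent:

```python
base_rate = 0.30                   # hypothetical P(characteristic present)

# Hypothetical P(indicator | characteristic present), P(indicator | absent)
likelihoods = [(0.70, 0.40),       # crime-scene indicator 1
               (0.55, 0.25)]       # crime-scene indicator 2

num = base_rate                    # running product for "present"
den = 1.0 - base_rate              # running product for "absent"
for p_present, p_absent in likelihoods:
    num *= p_present
    den *= p_absent

posterior = num / (num + den)      # Bayes' Theorem
print(f"posterior P(characteristic present): {posterior:.3f}")
```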

Findings

Both methods consistently predict better than chance. By incorporating more information, analyses based on Bayes' Theorem (74.6 per cent accurate) predict with 11.1 percentage points more accuracy overall than analyses based on base rates (63.5 per cent accurate), and provide improved advising estimates in line with best practices.

Originality/value

The study demonstrates how useful predictions of offender characteristics can be acquired from crime information without large (i.e. >500 cases) data sets or “trained” statistical models. Advising statements are constructed for discussion, and results are discussed in terms of the pragmatic usefulness of the methods for police investigations.

Details

Policing: An International Journal of Police Strategies & Management, vol. 37 no. 1
Type: Research Article
ISSN: 1363-951X

Book part

Elías Moreno and Luís Raúl Pericchi

Abstract

We put forward the idea that intrinsic priors are becoming the center of a dominant cluster of methodologies for objective Bayesian model selection.

The intrinsic method and its applications have been developed over the last two decades and have stimulated closely related methods. The intrinsic methodology can be thought of as the long-sought approach to objective Bayesian model selection and hypothesis testing.

In this paper we review the foundations of the intrinsic priors, their general properties, and some of their applications.

Details

Bayesian Model Comparison
Type: Book
ISBN: 978-1-78441-185-5

Book part

Badi H. Baltagi, Georges Bresson and Jean-Michel Etienne

Abstract

This chapter proposes semiparametric estimation of the relationship between the growth rate of GDP per capita and the growth rates of physical capital, human capital and labor, as well as other covariates and common trends, for a panel of 23 OECD countries observed over the period 1971–2015. The observed differentiated behaviors by country reveal strong heterogeneity. This is the motivation behind using a mixed fixed- and random-coefficients model to estimate this relationship. In particular, this chapter uses a semiparametric specification with random intercepts and slope coefficients. Motivated by Lee and Wand (2016), the authors estimate a mean field variational Bayes semiparametric model with random coefficients for this panel of countries. Results reveal nonparametric specifications for the common trends. The use of this flexible methodology may enrich the empirical growth literature, underlining a large diversity of responses across variables and countries.

Book part

Arnold Zellner

Abstract

After briefly reviewing the past history of Bayesian econometrics and Alan Greenspan's (2004) recent description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. Then a review of some recent Bayesian econometric research and needs is presented. Finally, some thoughts are presented that relate to the future of Bayesian econometrics.

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Article

B.D. Bunday and I.D. Al‐Ayoubi

Abstract

The contents and function of a computer package to fit reliability models for computer software are outlined. Parameters in the models are first estimated by maximum likelihood procedures. Bayesian estimation methods are also used and are shown to give estimates with a smaller variance than their MLE counterparts. An example of the application to a particular set of failure times is given.
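The variance comparison can be illustrated with a simple assumed setup (an exponential failure-time model with a conjugate gamma prior, not necessarily the package's models): across repeated samples, the posterior-mean estimate of the failure rate varies less than the MLE when the prior is centred near the truth:

```python
import numpy as np

rng = np.random.default_rng(7)
true_rate, n, reps = 0.5, 10, 2000
a0, b0 = 2.0, 4.0                          # gamma prior with mean a0/b0 = 0.5

mle_estimates, bayes_estimates = [], []
for _ in range(reps):
    t = rng.exponential(1.0 / true_rate, size=n)       # n inter-failure times
    mle_estimates.append(n / t.sum())                  # MLE of the failure rate
    bayes_estimates.append((a0 + n) / (b0 + t.sum()))  # posterior mean

print(f"MLE variance:   {np.var(mle_estimates):.4f}")
print(f"Bayes variance: {np.var(bayes_estimates):.4f}")
```

The posterior mean shrinks toward the prior mean, which is what reduces its sampling variance relative to the MLE.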

Details

International Journal of Quality & Reliability Management, vol. 7 no. 5
Type: Research Article
ISSN: 0265-671X

Book part

Simon Washington, Amir Pooyan Afghari and Mohammed Mazharul Haque

Abstract

Purpose – The purpose of this chapter is to review the methodological and empirical underpinnings of transport network screening, or management, as it relates to improving road safety. As jurisdictions around the world are charged with transport network management in order to reduce externalities associated with road crashes, identifying potential blackspots or hotspots is an important if not critical function and responsibility of transport agencies.

Methodology – Key references from within the literature are summarised and discussed, along with a discussion of the evolution of thinking around hotspot identification and management. The theoretical developments that correspond with the evolution in thinking are provided, sprinkled with examples along the way.

Findings – Hotspot identification methodologies have evolved considerably over the past 30 or so years, correcting for methodological deficiencies along the way. Despite vast and significant advancements, identifying hotspots remains a reactive approach to managing road safety, relying on crashes to accrue in order to mitigate their occurrence. The most fruitful directions for future research will be in the establishment of reliable relationships between surrogate measures of road safety, such as ‘near misses’, and actual crashes, so that safety can be proactively managed without the need for crashes to accrue.

Research implications – Research in hotspot identification will continue; however, it is likely to shift over time to both closer to ‘real-time’ crash risk detection and considering safety improvements using surrogate measures of road safety – described in Chapter 17.

Practical implications – There are two types of errors made in hotspot detection – identifying a ‘risky’ site as ‘safe’ and identifying a ‘safe’ site as ‘risky’. In the former case no investments will be made to improve safety, while in the latter case ineffective or inefficient safety improvements could be made. To minimise these errors, transport network safety managers should be applying the current state of the practice methods for hotspot detection. Moreover, transport network safety managers should be eager to transition to proactive methods of network safety management to avoid the need for crashes to occur. While in its infancy, the use of surrogate measures of safety holds significant promise for the future.

Details

Safe Mobility: Challenges, Methodology and Solutions
Type: Book
ISBN: 978-1-78635-223-1
