Search results

1 – 10 of over 3000
Article
Publication date: 10 August 2015

D. R. Prajapati and Sukhraj Singh

Abstract

Purpose

The purpose of this paper is to counter autocorrelation by designing the chart using warning limits. Various optimal schemes of the modified chart are proposed for various sample sizes (n) at correlation levels (Φ) of 0.00, 0.475 and 0.95. These optimal schemes of the modified chart are compared with the double sampling (DS) chart suggested by Costa and Claro (2008).

Design/methodology/approach

The performance of the chart is measured in terms of the average run length (ARL), that is, the average number of samples taken before an out-of-control signal is observed. Because of the autocorrelation among the data, the performance of the chart is suspect. The ARLs at various sets of chart parameters are computed by simulation, using MATLAB. The suggested optimal schemes are simpler, with a limited number of parameters and a smaller sample size (n = 4); this simplicity makes them very helpful in quality control.
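
The abstract's ARL-by-simulation idea can be sketched outside MATLAB as well. The following Python sketch is not the authors' code: the AR(1) noise model, the control-limit width L and all parameter values are illustrative assumptions, chosen only to show how an ARL estimate for an X-bar chart on autocorrelated data might be simulated.

```python
import numpy as np

def simulate_arl(phi=0.475, n=4, L=3.0, shift=0.0, n_runs=200, rng=None):
    """Estimate the ARL of an X-bar chart on AR(1) data by simulation.

    phi   : level of autocorrelation (0.00, 0.475 or 0.95 in the paper)
    n     : sample size (the paper's optimal schemes use n = 4)
    L     : control-limit width in multiples of sigma (assumed value)
    shift : shift in the process mean, in units of sigma
    """
    rng = rng or np.random.default_rng(0)
    run_lengths = []
    for _ in range(n_runs):
        x, t = 0.0, 0
        while True:
            t += 1
            # draw one subgroup of n autocorrelated observations
            sample = np.empty(n)
            for i in range(n):
                x = phi * x + rng.normal()   # AR(1) recursion
                sample[i] = x + shift
            # signal when the subgroup mean leaves the +/- L*sigma/sqrt(n) limits
            if abs(sample.mean()) > L / np.sqrt(n) or t > 100_000:
                run_lengths.append(t)
                break
    return float(np.mean(run_lengths))
```

For phi = 0 this reduces to the textbook independent-data case; larger phi inflates the variance of the subgroup means, which is exactly why the chart's performance degrades under autocorrelation.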

Findings

The suggested optimal schemes of the modified chart are compared with the DS chart suggested by Costa and Claro (2008). It is concluded that the modified chart outperforms the DS chart at various levels of correlation (Φ) and shifts in the process mean. The simplicity of the modified chart's design makes it versatile for many industries.

Research limitations/implications

Both schemes are optimized by assuming the normal distribution, but this assumption may be relaxed to design these schemes for autocorrelated data. The optimal schemes for the chart can be developed for variable sample sizes and variable sampling intervals, and can also be explored for cumulative sum and exponentially weighted moving average charts.

Practical implications

The correlation among the process outputs of any industry can be found, and the suggested control chart parameters corresponding to that level of correlation can be applied. The understandable and robust design of the modified chart makes it usable for industrial quality control.

Social implications

The rejection level of products in industry can be reduced by designing better control chart schemes, which will also reduce the loss to society, as suggested by Taguchi (1985).

Originality/value

Although it is an extension of previous work, it can be applied to various manufacturing industries as well as service industries where the data are positively correlated and normally distributed.

Details

The TQM Journal, vol. 27 no. 5
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 26 July 2013

Sukhraj Singh and D.R. Prajapati

Abstract

Purpose

The purpose of this paper is to study the performance of the X‐bar chart on the basis of average run lengths (ARLs) for positively correlated data. The ARLs at various sets of parameters of the X‐bar chart are computed by simulation. The performance of the chart at various shifts in the process mean is compared with the X‐bar chart and the residual chart suggested by Zang. The suggested optimal schemes are also compared with the variable parameters (VP) chart and the double sampling (DS) X‐bar chart suggested by Costa and Machado.

Design/methodology/approach

Positively correlated observations having a normal distribution are generated with the help of MATLAB. The performance of the X‐bar chart, in terms of ARLs at various shifts in the process mean, is compared with the X‐bar chart and the residual chart suggested by Zang. The optimal schemes are also compared with the VP X‐bar chart and the DS X‐bar chart suggested by Costa and Machado.

Findings

The suggested optimal schemes of the X‐bar chart perform better at various shifts in the process mean than the X‐bar chart and the residual chart suggested by Zang. Although the suggested schemes detect shifts later than the VP and DS X‐bar charts proposed by Costa and Machado, they involve a much smaller number of parameters to be adjusted, so the adjustment time for the optimal schemes is very small compared to the VP and DS charts.

Research limitations/implications

The optimal schemes of the X‐bar chart are developed for normally distributed autocorrelated data, but the assumption of normality may be relaxed to design these schemes for non-normal autocorrelated data. Moreover, the optimal schemes for the chart can be developed for variable sample sizes and variable sampling intervals.

Originality/value

Although it is an extension of previous work, it can be applied to various manufacturing industries as well as service industries where the data are positively correlated and normally distributed.

Details

International Journal of Quality & Reliability Management, vol. 30 no. 7
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 29 April 2014

Manuel do Carmo, Paulo Infante and Jorge M Mendes

Abstract

Purpose

The purpose of this paper is to measure the performance of a sampling method through the average number of samples drawn in control.

Design/methodology/approach

By matching the adjusted average time to signal (AATS) of the sampling methods, using the AATS of one of them as the reference, the paper obtains the design parameters of the others. It is thus possible to obtain, in control, the average number of samples required so that the AATS of each sampling method equals the AATS of the reference method.

Findings

The average number of samples drawn in control is a more robust performance measure for comparing sampling methods, because in many cases the period during which the process is in control is greater than the out-of-control period. With this performance measure the paper compares different sampling methods through the average total cost per cycle in systems with Weibull lifetime distributions: three systems with an increasing hazard rate (shape parameter β = 2, 4 and 7) and one system with a decreasing failure rate (β = 0.8).

Practical implications

In a usual production cycle, where the in-control period is much larger than the out-of-control period, and particularly if the sampling and false-alarm costs are high relative to malfunction costs, the authors think this methodology allows a more careful choice of the appropriate sampling method.

Originality/value

The statistical performance of different sampling methods is compared using the average number of samples that need to be inspected while the process is in control. In particular, the paper compares the statistical and economic performance of different sampling methods in contexts not previously considered in the literature. The paper also presents an approximation for the average time between the instant a failure occurs and the first sample with the process out of control.

Details

International Journal of Quality & Reliability Management, vol. 31 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 January 1993

Kenneth R. Tillery, Arthur L. Rutledge and R. Anthony Inman

Abstract

A continuous reorientation towards quality by US firms should be facilitated by the infusion of business and engineering school graduates having a solid conceptual foundation in quality and quality management. Examines a sample of the textbooks used in the standard undergraduate course in production and operations management to assess the coverage of, and orientation towards, quality and quality management. While the traditional quality model is still dominant, expanded coverage providing a more balanced view of the operational and strategic faces of quality is seen.

Details

International Journal of Quality & Reliability Management, vol. 10 no. 1
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 31 January 2022

Simone Massulini Acosta and Angelo Marcio Oliveira Sant'Anna

Abstract

Purpose

Process monitoring is a way to manage the quality characteristics of products in manufacturing processes. Several process monitoring approaches based on machine learning algorithms have been proposed in the literature and have gained the attention of many researchers. In this paper, the authors developed machine learning-based control charts for monitoring the fraction of non-conforming products in smart manufacturing. This study proposes a relevance vector machine with a Bayesian sparse kernel, optimized by a differential evolution algorithm, for efficient monitoring in manufacturing.

Design/methodology/approach

A new approach to data analysis, modelling and monitoring in the manufacturing industry was carried out. This study developed a relevance vector machine using the Bayesian sparse kernel technique to improve on the support vector machine, which is used for both regression and classification problems. The authors compared the performance of the proposed relevance vector machine with other machine learning algorithms, such as the support vector machine, artificial neural network and beta regression model. The proposed approach was evaluated under different shift scenarios of the average run length using Monte Carlo simulation.

Findings

The authors analyse a real case study in a manufacturing company based on the best machine learning algorithms. The results indicate that the proposed relevance vector machine-based process monitoring is an excellent quality tool for monitoring defective products in the manufacturing process. A comparative analysis with four machine learning models is used to evaluate the performance of the proposed approach. The relevance vector machine has slightly better performance than the support vector machine, artificial neural network and beta regression models.

Originality/value

This research differs from others by providing approaches for monitoring defective products. Machine learning-based control charts are used to monitor product failures in the smart manufacturing process. The key contribution of this study is to develop different models for fault detection and to identify any change point in the manufacturing process. Moreover, the authors' research indicates that machine learning models are adequate tools for modelling and monitoring the fraction of non-conforming products in the industrial process.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 3
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 11 January 2022

Daniel Ashagrie Tegegne, Daniel Kitaw Azene and Eshetie Berhan Atanaw

Abstract

Purpose

This study aims to design a multivariate control chart that improves the applicability of the traditional Hotelling T2 chart. This new type of multivariate control chart displays sufficient information about the states and relationships of the variables in the production process. It is used to make better quality control decisions during the production process.

Design/methodology/approach

Multivariate data are collected at equal time intervals and are represented by the nodes of a graph. The edges connecting the nodes represent the sequence of operation. Each node is plotted on the control chart based on its Hotelling T2 statistical distance. The changing behavior of each pair of input and output nodes is studied by a neural network. A case study from the cement industry is conducted to validate the control chart.
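
As a rough illustration of the statistical distance used to plot each node, the Hotelling T2 statistic can be computed as in the minimal Python sketch below. This is not the authors' implementation; it assumes, for illustration only, that the in-control mean and covariance are estimated from the observations themselves.

```python
import numpy as np

def hotelling_t2(samples):
    """Hotelling T^2 distance of each multivariate observation from the
    sample mean, using the sample covariance matrix.

    samples : (m, p) array, one row per time-indexed observation (node).
    Returns an (m,) array of T^2 statistics, one per node.
    """
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)       # (p, p) sample covariance
    cov_inv = np.linalg.inv(cov)
    diff = samples - mean
    # T^2_i = (x_i - xbar)^T S^{-1} (x_i - xbar), computed row by row
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
```

Each T2 value would then determine where the corresponding node sits on the chart, with the graph edges preserving the operation sequence between consecutive nodes.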

Findings

The finding of this paper is that the points and lines in the classic Hotelling T2 chart are effectively substituted by the nodes and edges of the graph, respectively. Nodes and edges have dimension and color and represent several attributes; as a result, this control chart displays much more information than the traditional Hotelling T2 control chart. The pattern of the plot shows whether the process is normal or not, and the effect of the sequence of operation is visible in the control chart. The frequency of occurrence of a node is indicated by its size. The decision to change a product feature is assisted by finding the shortest path between nodes. Moreover, consecutive nodes have different behaviors, and that behavior change is recognized by the neural network.

Originality/value

Modifying the classical Hotelling T2 control chart by integrating it with concepts from graph theory and neural networks is the first of its kind.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 7
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 17 January 2023

Salimeh Sadat Aghili, Mohsen Torabian, Mohammad Hassan Behzadi and Asghar Seif

Abstract

Purpose

The purpose of this paper is to develop a double-objective economic statistical design (ESD) of the (X) control chart under Weibull failure properties with the Linex asymmetric loss function. The authors express the probability of type II error (β) as the statistical objective and the expected cost as the economic objective.

Design/methodology/approach

The design used in this study is based on a double-objective economic statistical design of the (X) control chart with a Weibull shock model, applying Banerjee and Rahim's model for non-uniform and uniform schemes with the Linex asymmetric loss function. The resulting average cost and β for uniform and non-uniform schemes with the Linex loss function are compared with those of the same schemes without the loss function.
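
For reference, the Linex asymmetric loss function has the standard form L(Δ) = b(e^{aΔ} − aΔ − 1). A minimal sketch follows; the parameter values a and b are illustrative, not taken from the paper.

```python
import numpy as np

def linex_loss(delta, a=1.0, b=1.0):
    """Linex asymmetric loss: L(delta) = b * (exp(a*delta) - a*delta - 1).

    delta : deviation (error); a, b are shape and scale parameters
    (illustrative defaults). For a > 0 the loss rises exponentially
    for positive deviations but only roughly linearly for negative
    ones, which is what makes the loss asymmetric.
    """
    return b * (np.exp(a * delta) - a * delta - 1.0)
```

The asymmetry is the reason under- and over-estimating the process deviation are penalized differently in the cost model.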

Findings

Numerical results indicate that it is not possible to reduce the type II error and the cost at the same time: reducing the type II error increases the cost, and reducing the cost increases the type II error, both of which are very important. The choice between them depends on the needs of the industry and on which objective has higher priority. These designs define a Pareto-optimal front of solutions that increases the flexibility and adaptability of the X control chart in practice. When non-uniform schemes are used instead of uniform schemes, the average cost per unit time decreases on average, and when the loss function is applied, the average cost per unit time increases on average. This quantity also increases for double-objective schemes with the loss function compared to schemes without it, in both the uniform and non-uniform cases. The reason is that the model underestimated the costs before the loss function was used.

Practical implications

This research adds to the body of knowledge on flexibility in process quality control. The article may be of interest to quality systems experts in factories where the choice between reducing cost and reducing the statistical factor can affect the production process.

Originality/value

The cost functions for double-objective uniform and non-uniform sampling schemes with the Weibull shock model based on the Linex loss function are presented for the first time.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 8
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 5 February 2018

Olatunde Adebayo Adeoti

Abstract

Purpose

The purpose of this paper is to propose a double exponentially weighted moving average control chart using repetitive sampling (RS-DEWMA) for a normally distributed process variable, to improve the efficiency of detecting small process mean shifts.

Design/methodology/approach

The algorithm for implementing the proposed chart is developed, and the formulae for the in-control and out-of-control average run lengths (ARLs) are derived. Tables of ARLs are presented for various process mean shifts. The performance of the proposed chart is investigated in terms of the average run length for small process mean shifts and compared with the existing DEWMA control chart. Numerical examples illustrate the design and implementation of the proposed chart.
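
The DEWMA statistic and a repetitive-sampling decision rule can be sketched as follows. This is a hedged illustration, not the paper's derivation: the limit widths L_in and L_out are assumed values, and the variance expression is the standard large-t result for the double-smoothed recursion, not the paper's derived formula.

```python
import numpy as np

def dewma_monitor(xs, lam=0.2, L_out=3.0, L_in=1.5, mu0=0.0, sigma=1.0):
    """Sketch of DEWMA monitoring with a repetitive-sampling decision rule.

    The DEWMA statistic double-smooths the observations:
        z_t = lam*x_t + (1-lam)*z_{t-1}
        w_t = lam*z_t + (1-lam)*w_{t-1}
    A point outside the outer limits signals out of control; a point
    between the inner and outer limits triggers a repeat sample; a point
    inside the inner limits leaves the process declared in control.
    """
    # asymptotic (large-t) variance of the DEWMA statistic w_t
    var_w = sigma**2 * lam * (2 - 2*lam + lam**2) / (2 - lam)**3
    sd_w = np.sqrt(var_w)
    z = w = mu0
    decisions = []
    for x in xs:
        z = lam * x + (1 - lam) * z
        w = lam * z + (1 - lam) * w
        dev = abs(w - mu0)
        if dev > L_out * sd_w:
            decisions.append('out of control')
        elif dev > L_in * sd_w:
            decisions.append('repeat sample')
        else:
            decisions.append('in control')
    return decisions
```

The 'repeat sample' band is what distinguishes the repetitive-sampling scheme from a plain DEWMA chart, which would use only the outer limits.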

Findings

The proposed control chart is more efficient than the existing DEWMA control chart in detecting small process mean shifts, as it consistently gives smaller ARL values and detects the process shift quickly. However, the performance of the proposed chart deteriorates somewhat for large smoothing constants.

Practical implications

The application of repetitive sampling in the control chart literature is gaining wide acceptance. The design and implementation of the RS-DEWMA control chart offers process control personnel a new approach to detecting small process mean shifts.

Originality/value

This paper fills a gap in the literature by examining the performance of the repetitive sampling DEWMA control chart. The use of the repetitive sampling technique in control charts is discussed in the literature; however, its use based on the DEWMA statistic has not been considered in this context.

Details

International Journal of Quality & Reliability Management, vol. 35 no. 2
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 June 1943

H. Rissik

Abstract

THE first part of this article, published in last month's issue of AIRCRAFT ENGINEERING, outlined the operation of the non‐statistical method of sampling inspection commonly met with in purchasing specifications, and explained the inability of such a sampling clause to discriminate effectively between good and bad quality product. The present issue describes the practical applications of statistically designed sampling inspection procedures, giving adequate quality assurance wherever 100 per cent inspection of the product is either inapplicable or uneconomic.

Details

Aircraft Engineering and Aerospace Technology, vol. 15 no. 6
Type: Research Article
ISSN: 0002-2667

Article
Publication date: 1 December 1997

Su‐Fen Yang

Abstract

Cost models for the design of control charts based on Duncan's approach have been studied in recent years. Presents a double assignable‐cause cost model in terms of Taguchi's loss imparted to society from the time a product is shipped, using a renewal theory approach. The expressions for the expected cycle length and the expected cost per cycle are easier to obtain by the proposed approach, and the cost model, including the customer's voice, reveals the importance of quality. Sensitivity analysis performed on a large number of numerical examples illustrates that the cost of repair or replacement and the customers' tolerance, which are related to the loss function, are critical when designing economically based X‐ and S control charts.

Details

International Journal of Quality & Reliability Management, vol. 14 no. 9
Type: Research Article
ISSN: 0265-671X
