Search results

1–10 of over 56,000
Article
Publication date: 22 May 2009

D.R. Prajapati and P.B. Mahapatra

Abstract

Purpose

The purpose of this paper is to introduce a new design of R chart that detects smaller shifts in process dispersion while remaining simple enough to be applied at the shopfloor level.

Design/methodology/approach

A new R chart is proposed that overcomes the limitations of the Shewhart, CUSUM and EWMA range charts. The concept of the chart is based on the chi‐square (χ²) distribution. Although CUSUM and EWMA charts are very useful for detecting small shifts in the mean or standard deviation, they can detect a process shift only when there is a single, sustained shift in the process average or standard deviation.

Findings

It was found that the proposed chart performs significantly better than the conventional (Shewhart) R chart and the CUSUM range schemes proposed by Chang and Gan for most process shifts in standard deviation; the ARLs of the proposed R chart exceed those of the CUSUM schemes in only ten cases out of 40. The performance of the proposed R chart was also compared with the variance chart proposed by Chang and Gan for various shifts in standard deviation: for a sample size of three, the comparisons show that the proposed R chart outperforms Chang's variance chart at all shift ratios. The proposed scheme is simple and accurate, and it greatly reduces many of the difficulties associated with the operation and design of CUSUM and EWMA control charts, while its performance characteristics (ARLs) remain closely comparable with those of FIR CUSUM, simple CUSUM and other variance charts. It can be concluded that, instead of considering many parameters, it is better to use a single sample size and a single set of control limits, because a control chart loses its simplicity as the number of parameters grows, and practitioners may find a complex chart difficult to apply in production processes. Moreover, CUSUM control charts are effective only when there is a single, sustained shift in the process dispersion.

Research limitations/implications

Considerable effort has been devoted to developing the new range charts for monitoring process dispersion. Various assumptions and factors affecting the performance of the R chart have been identified and taken into account. In the proposed design the observations are assumed to be independent of one another; the observations could instead be modeled as autocorrelated with previous observations, and the performance of the proposed R chart under autocorrelation remains to be studied.

Originality/value

The research findings could be applied in various manufacturing and service industries, as the proposed chart is more effective than the conventional (Shewhart) R chart and simpler than CUSUM charts.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 22 May 2009

Moustafa Omar Ahmed Abu‐Shawiesh

Abstract

Purpose

This paper seeks to propose a univariate robust control chart for location and the necessary table of factors for computing the control limits and the central line as an alternative to the Shewhart control chart.

Design/methodology/approach

The proposed method is based on two robust estimators, namely, the sample median, MD, to estimate the process mean, μ, and the median absolute deviation from the sample median, MAD, to estimate the process standard deviation, σ. A numerical example was given and a simulation study was conducted in order to illustrate the performance of the proposed method and compare it with that of the traditional Shewhart control chart.
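
As a rough illustration of the idea (not the paper's exact construction), the sketch below computes MD- and MAD-based control limits for subgroup data in Python, assuming the usual consistency constant 1.4826 that makes MAD estimate σ under normality; the paper instead tabulates exact control-limit factors, and the function name and numbers here are illustrative only.

```python
import numpy as np

def mdmad_limits(subgroups, L=3.0):
    """Sketch of a robust location chart in the spirit of the paper:
    center line from subgroup medians (MD), spread from the median
    absolute deviation (MAD). The 1.4826 factor makes MAD a consistent
    estimator of sigma under normality; the paper's tabulated factors
    may differ from this approximation."""
    subgroups = np.asarray(subgroups, dtype=float)   # shape (k, n)
    n = subgroups.shape[1]
    medians = np.median(subgroups, axis=1)           # MD of each subgroup
    mads = np.median(np.abs(subgroups - medians[:, None]), axis=1)
    center = np.mean(medians)                        # central line
    sigma_hat = 1.4826 * np.mean(mads)               # robust sigma estimate
    half_width = L * sigma_hat / np.sqrt(n)          # 3-sigma-style limits
    return center - half_width, center, center + half_width

# Usage: 20 subgroups of size 5 with one gross outlier that would
# inflate a mean/S chart but barely moves the medians and MADs.
rng = np.random.default_rng(0)
data = rng.normal(10.0, 1.0, size=(20, 5))
data[3, 2] = 25.0
print(mdmad_limits(data))
```

The design point is that a single contaminated observation shifts MD and MAD very little, whereas it would inflate the sample mean and S and so widen the classical limits.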

Findings

The proposed robust MDMAD control chart gives better performance than the traditional Shewhart control chart if the underlying distribution of chance causes is non‐normal. It has good properties for heavy‐tailed distribution functions and moderate sample sizes and it compares favorably with the traditional Shewhart control chart.

Originality/value

The most common statistical process control (SPC) tool is the traditional Shewhart control chart. The chart is used to monitor the process mean based on the assumptions that the underlying distribution of the quality characteristic is normal and that there is no major contamination due to outliers. The sample mean, X̄, and the sample standard deviation, S, are the most efficient location and scale estimators under the normal distribution and are commonly used to construct the control chart, but they might not be the best choices when one or both assumptions are not met. Therefore, alternatives to the control chart come into play. The literature shows that the sample median, MD, and the median absolute deviation from the sample median, MAD, are indeed more resistant to departures from normality and to the presence of outliers.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 7 September 2015

Angus Jeang

Abstract

Purpose

The purpose of this paper is to build a curve that portrays quality level, expressed as a standard deviation, as a function of production-process elements such as operating time and cumulative units produced.

Design/methodology/approach

The Cobb-Douglas multiplicative power model is introduced to represent the proposed function, simultaneously describing the learning process for productivity and quality. The experimental apparatus consisted of a reflective mirror, path paper, an iPod Touch and a pen, arranged as shown in Plate 1. The students were instructed to draw a line with the pen along the middle of the rail line on the path paper, viewing it only indirectly through the mirror. The iPod Touch acted as a stopwatch to record the time taken to complete each experiment. The path paper is shown in Figure 1. The statistical analysis was performed with SAS.

Findings

This study presented an experiment in which subjects drew a line on a path while looking through a mirror. Using the Cobb-Douglas model, the standard deviation S was regressed as S = 0.3366 · x1^(−0.347) · x2^(−0.011).
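
Since the abstract reports only the fitted model, the following sketch shows how such a Cobb-Douglas fit is typically obtained: take logarithms to linearize S = a·x1^b1·x2^b2 and solve by ordinary least squares. The paper used SAS; the data below are synthetic, and treating x1 as cumulative units and x2 as operating time is an assumption.

```python
import numpy as np

# Hypothetical data generated around the reported fit.
rng = np.random.default_rng(1)
x1 = np.arange(1, 31, dtype=float)                 # cumulative units (assumed role)
x2 = rng.uniform(20, 60, size=30)                  # operating time (assumed role)
s = 0.3366 * x1**-0.347 * x2**-0.011 * rng.lognormal(0.0, 0.05, size=30)

# Log-linearize: log s = log a + b1*log x1 + b2*log x2, then least squares.
X = np.column_stack([np.ones_like(x1), np.log(x1), np.log(x2)])
coef, *_ = np.linalg.lstsq(X, np.log(s), rcond=None)
a, b1, b2 = np.exp(coef[0]), coef[1], coef[2]
print(f"S = {a:.4f} * x1^{b1:.3f} * x2^{b2:.3f}")
```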

Research limitations/implications

All units produced are assumed acceptable in quality, regardless of the magnitude of the standard deviation of the produced quality level. As in Porteus (1986), a fixed probabilistic distribution is assumed. Operator fatigue is ignored in the presented curve, although in practice operators tire easily when attending to quality and productivity simultaneously. The initial value of operating time or standard deviation for the first unit should be estimated from a subject who has been trained for a sufficient period of time; this condition does hold in the present experiment.

Practical implications

The economic order (production) quantity model with learning effects in a production system could be considered. The approach could also be extended to a wider framework, such as multistage and multivariate production systems and supply chains.

Social implications

For a life cycle application, the criteria considered in resolving the production problem should not only be limited to the costs involved in the production process, but also the quality-related costs incurred after the goods are delivered to customers.

Originality/value

Previous work on the learning process has not addressed quality-related learning. This study finds the relationship of quality to production volume and production time simultaneously.

Details

International Journal of Quality & Reliability Management, vol. 32 no. 8
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 3 January 2017

Ravichandran Joghee

Abstract

Purpose

The purpose of this paper is to develop a new Six Sigma quality control (SSQC) chart for the benefit of Six Sigma practitioners. A step-by-step procedure for the construction of the chart is also given.

Design/methodology/approach

Under the assumption of normality, the construction of the SSQC chart is proposed, in which the population mean and standard deviation are drawn from the process specification from the perspective of Six Sigma quality (SSQ). The chart uses the concept of a target range to restrict the shift in the process mean to within ±1.5 standard deviations. The chart is useful for monitoring the process to ensure it stays within the specification limits with minimum variation (shift).
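
The abstract does not give the construction details, so the following is only a sketch under stated assumptions: Six Sigma quality is read as a specification half-width of 6σ (so σ = (USL − LSL)/12), the target range is the center ±1.5σ, and the outer limits add a conventional three-standard-error allowance for subgroup means. The function name and the exact limit formulas are hypothetical, not the paper's.

```python
import numpy as np

def ssqc_parameters(lsl, usl, n):
    """Sketch: derive chart parameters from the specification alone.
    Assumes Six Sigma quality (half-width = 6 sigma) and a +/-1.5 sigma
    target range for the mean; the paper's construction may differ."""
    target = (lsl + usl) / 2.0
    sigma = (usl - lsl) / 12.0               # spec half-width = 6 sigma
    shift = 1.5 * sigma                      # allowed mean shift
    se = sigma / np.sqrt(n)                  # standard error of subgroup mean
    return {
        "center": target,
        "target_range": (target - shift, target + shift),
        "control_limits": (target - shift - 3 * se, target + shift + 3 * se),
    }

print(ssqc_parameters(lsl=9.0, usl=21.0, n=5))
```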

Findings

A step-by-step procedure is given for the construction of the proposed SSQC chart. It is easily understood, and its application is simple for Six Sigma practitioners. The proposed chart prompts timely improvements in process mean and variation. An illustrative example shows the improved performance of the proposed procedure.

Research limitations/implications

The proposed approach assumes a normal population described by a known specification of the process/product characteristic, which may not hold in all cases. This may call for a thorough study of the population before applying the chart.

Practical implications

The proposed SSQC chart is quite new for practitioners. The paper assumes that the population standard deviation is known and is drawn from the specification of the process/product characteristic. The proposed chart helps in fine-tuning the process mean and bringing the process standard deviation to a satisfactory level from the perspective of SSQ.

Originality/value

The paper is the first of its kind, and Six Sigma practitioners will find its application of interest.

Details

International Journal of Quality & Reliability Management, vol. 34 no. 1
Type: Research Article
ISSN: 0265-671X

Book part
Publication date: 5 May 2017

Amitava Mitra

Abstract

Processes, in practice, may involve more than one quality characteristic of interest. It is quite possible for such quality characteristics not to be independent of each other, since the magnitude of one characteristic may influence the magnitudes of the others. Under this setting, it is of interest to determine the optimal settings of the process parameters (usually the process mean and process standard deviation of each quality characteristic) under various objectives. Some of the objectives may conflict with each other. In general, the decision-maker may be able to prioritize the objectives. Using such a prioritized scheme, it is of interest to determine the optimal settings of the process mean and standard deviation for each monitored quality characteristic. Such solutions could be labeled "satisficing" solutions. Sensitivity analyses of the decision variables to the chosen objectives and parameter values are also investigated.

Article
Publication date: 6 March 2017

Chung-Ho Chen and Chao-Yu Chou

Abstract

Purpose

The quality level setting problem determines the optimal process mean, standard deviation and specification limits of a product/process characteristic to minimize the expected total cost associated with products. Traditionally, the product/process characteristic is assumed to be normally distributed; however, this may not be true. This paper aims to explore the quality level setting problem when the probability distribution of the process characteristic deviates from normality.

Design/methodology/approach

Burr developed a density function that can represent a wide range of normal and non-normal distributions. It can be applied to investigate the effect of non-normality in studies of statistical quality control, for example the design of control charts and sampling plans. The quality level setting problem is examined by introducing Burr's density function as the underlying probability distribution of the product/process characteristic, so that the effect of non-normality on the determination of the optimal process mean, standard deviation and specification limits can be studied. The expected total cost associated with products includes the quality loss of conforming products, the rework cost of non-conforming products and the scrap cost of non-conforming products.
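
As a sketch of this cost structure (not the paper's model), the snippet below evaluates the expected total cost under a Burr XII characteristic using SciPy's burr12 distribution: quadratic quality loss over the conforming region, rework cost for the upper tail and scrap cost for the lower tail. Which tail is reworked versus scrapped, and every numeric value, are assumptions.

```python
from scipy import stats
from scipy.integrate import quad

def expected_total_cost(c, d, loc, scale, lsl, usl, target,
                        k_loss, rework_cost, scrap_cost):
    """Sketch of the abstract's cost structure under a Burr XII
    characteristic. Tail assignments and all parameters are assumed."""
    dist = stats.burr12(c, d, loc=loc, scale=scale)
    # Quadratic quality loss of conforming product over [LSL, USL].
    loss, _ = quad(lambda x: k_loss * (x - target) ** 2 * dist.pdf(x), lsl, usl)
    rework = rework_cost * dist.sf(usl)   # above USL: reworked
    scrap = scrap_cost * dist.cdf(lsl)    # below LSL: scrapped
    return loss + rework + scrap

print(expected_total_cost(c=3.0, d=2.0, loc=8.0, scale=4.0,
                          lsl=9.0, usl=15.0, target=12.0,
                          k_loss=1.0, rework_cost=5.0, scrap_cost=8.0))
```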

Findings

Numerical results show that the expected total cost associated with products is significantly influenced by the parameter of Burr’s density function, the target value of product/process characteristic, quality loss coefficient, unit rework cost and unit scrap cost.

Research limitations/implications

The major assumption of the proposed model is that the lower specification limit must be positive for practical applications, which restricts the feasible solution space for different combinations of process mean and standard deviation.

Social implications

The proposed model can be applied in industry and business to promote product/service quality assurance for the customer.

Originality/value

The authors adopt the Burr distribution to determine the optimum process mean, standard deviation and specification limits under non-normality. To the best of their knowledge, this is a new method for determining the optimum process and product policy, and it can be widely applied.

Details

Engineering Computations, vol. 34 no. 1
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 15 November 2019

Abbas Al-Refaie, Mays Haddadin and Alaa Qabaja

Abstract

Purpose

The purpose of this paper is to propose an approach to determine the optimal parameters and tolerances in concurrent product and process design in the early design stages utilizing fuzzy goal programming. A wheelchair design is provided for illustration.

Design/methodology/approach

The product design is developed on the basis of both customer and functionality requirements. The critical product components are then determined. The design and analysis of experiments are performed using simulation, and probability distributions are then adopted to determine the values of the desired responses under each combination of critical product parameters and tolerances. Nonlinear regression models are then developed and inserted as constraints in the complete optimization model. Preferences on product specifications and process settings, as well as process capability index ranges, are also set as model constraints. The combined objective function is finally formulated to minimize the sum of positive and negative deviations from the desired targets and to maximize process capability. The optimization model is applied to determine the optimal wheelchair design.
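
To make the deviation-minimizing objective concrete, here is a heavily simplified, hypothetical sketch: two design parameters, two nonlinear regression surrogates standing in for the paper's fitted response models, and a weighted sum of positive and negative deviations from the targets minimized over bounded parameters. The actual model is a fuzzy goal program with capability-index constraints; none of the numbers below come from the paper.

```python
import numpy as np
from scipy.optimize import minimize

targets = np.array([50.0, 10.0])      # desired response targets (hypothetical)
weights = np.array([1.0, 2.0])        # goal priorities (hypothetical)

def responses(p):
    # Illustrative nonlinear regression surrogates, not from the paper.
    y1 = 30.0 + 4.0 * p[0] - 0.2 * p[0] ** 2 + 1.5 * p[1]
    y2 = 2.0 + 0.8 * p[0] + 0.5 * p[1] ** 2
    return np.array([y1, y2])

def objective(p):
    # |y - target| combines the positive and negative deviations.
    return float(weights @ np.abs(responses(p) - targets))

res = minimize(objective, x0=[5.0, 2.0], method="Powell",
               bounds=[(0.0, 10.0), (0.0, 5.0)])   # parameter bounds
print(res.x, responses(res.x))
```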

Findings

The results showed that the proposed approach is effective in determining the optimal values of the design parameters and tolerances of the critical components of the wheelchair with their related process means and standard deviations that enhance desired multiple quality responses under uncertainty.

Practical implications

This work provides a general methodology that can be applied for concurrent optimization of product design and process design in a wide range of business applications. Moreover, the methodology is beneficial when uncertainty exists in quality responses and the parameters and tolerances of product design and its critical processes.

Originality/value

Fuzziness is rarely considered at the research and development stage. This research uses membership functions for the parameters and tolerances of a product and its related processes rather than crisp values. Moreover, the presented optimization model considers multiple objective functions, the sum of deviations and process capability. Finally, the indirect quality responses are calculated from best-fit probability distributions rather than assuming a normal distribution.

Details

International Journal of Quality & Reliability Management, vol. 37 no. 2
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 July 1994

Matoteng M. Ncube

Abstract

Combined Shewhart‐cumulative score (cuscore) quality control schemes are available for controlling the mean of a continuous production process. In many industrial applications it is important to control the process variability as well. The proposed combined Shewhart‐cuscore procedure for detecting shifts in process variability builds on the procedures developed by Ncube and Woodall (1984) for monitoring shifts in the process mean of continuous production processes. It is shown, in the one‐sided case, by average run length comparisons, that the proposed schemes perform significantly better than comparable Shewhart procedures and, in some cases, better than cusum schemes for some process-variability quality characteristics.
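
The abstract gives no formulas, so the following is only a schematic of a combined scheme of this general shape: signal immediately on a Shewhart violation, otherwise accumulate a simple score and signal when it reaches a threshold. The actual cuscore weights follow Ncube and Woodall (1984); the +1/0 scoring rule and every limit below are placeholders.

```python
def combined_shewhart_cuscore(ranges, control_limit, score_line, h):
    """Schematic combined scheme for variability: a Shewhart rule on
    each subgroup range plus a cumulative score. Placeholder scoring."""
    score = 0
    for i, r in enumerate(ranges):
        if r > control_limit:
            return i                                  # Shewhart signal
        score = score + 1 if r > score_line else 0    # reset below the line
        if score >= h:
            return i                                  # cumulative-score signal
    return None                                       # no signal

# Three successive ranges above the scoring line trigger a signal.
print(combined_shewhart_cuscore([2.1, 2.4, 2.6, 2.7, 2.8],
                                control_limit=3.5, score_line=2.5, h=3))
```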

Details

International Journal of Quality & Reliability Management, vol. 11 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 15 March 2011

D.R. Prajapati

Abstract

Purpose

The concept of the proposed R chart is based on the sum of chi squares (χ²). The average run lengths (ARLs) of the proposed R chart are computed and compared with the ARLs of a standard R chart, the Shewhart variance chart proposed by Chang and Gan, a CUSUM range chart (with and without the FIR feature) proposed by Chang and Gan, and an EWMA range chart proposed by Crowder and Hamilton, for various chart parameters. This paper aims to show that only FIR CUSUM schemes perform better than the proposed R chart, while the other CUSUM and EWMA schemes are less efficient.

Design/methodology/approach

The concept of the proposed R chart is based on the sum of chi squares (χ²). The proposed R chart divides the plot area into three regions: an outright rejection region, an outright acceptance region and a transition region. The null hypothesis is rejected if a point falls beyond the control limit and accepted if it falls below the warning limit. When a point falls beyond the warning limit but not beyond the control limit, the decision is based on the individual observations of the previous H samples, which are used to evaluate the statistic U, the sum of chi squares. The null hypothesis is rejected if U exceeds a predefined value U* and accepted otherwise.
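
The decision logic described above is concrete enough to sketch. In the snippet below, U = Σ (n_i − 1)·s_i²/σ0² is one natural reading of "sum of chi squares" over the previous H samples; the paper's exact statistic, limits and U* come from its design tables and may differ.

```python
import numpy as np

def r_chart_decision(sample_history, warning_limit, control_limit,
                     H, U_star, sigma0):
    """Sketch of the three-region rule: reject beyond the control limit,
    accept below the warning limit, and in between decide via U, a
    chi-square-type sum over the previous H subgroups (assumed form)."""
    current = np.asarray(sample_history[-1], dtype=float)
    R = current.max() - current.min()
    if R > control_limit:
        return "out of control"        # outright rejection region
    if R < warning_limit:
        return "in control"            # outright acceptance region
    # Transition region: U = sum of (n_i - 1) * s_i^2 / sigma0^2.
    U = sum((len(g) - 1) * np.var(g, ddof=1) / sigma0**2
            for g in sample_history[-H:])
    return "out of control" if U > U_star else "in control"

history = [[10.1, 9.8, 10.3, 10.0],
           [10.4, 9.7, 10.2, 10.1],
           [10.9, 9.2, 10.8, 9.4]]    # last range falls between the limits
print(r_chart_decision(history, warning_limit=1.2, control_limit=2.0,
                       H=3, U_star=12.0, sigma0=0.4))
```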

Findings

The comparisons also show that the CUSUM, EWMA and proposed R charts outperform the Shewhart R chart by a substantial amount. Only the FIR CUSUM schemes perform better than the proposed R chart, which ranks second; the other CUSUM and EWMA schemes are less efficient.

Research limitations/implications

CUSUM and EWMA charts can detect a small shift in the process average, but they are not efficient at detecting a large shift. Many researchers have also pointed out that the applicability of these charts is largely limited to the chemical industries. Another limitation of CUSUM and EWMA charts is that they can detect a shift only when there is a single, sustained shift in the process average; if the shift is not sustained, they will not be effective.

Practical implications

Many difficulties related to the operation and design of CUSUM and EWMA control charts are greatly reduced by the simple and accurate proposed scheme. The performance characteristics (ARLs) of the proposed charts compare closely with those of FIR CUSUM, CUSUM, EWMA and other charts. Instead of the many chart parameters used in CUSUM and EWMA charts, it is better to adopt a simpler and more effective scheme, because a control chart loses its simplicity with multiple parameters, and practitioners may find such charts difficult to use in production processes.

Originality/value

The proposed chart is a modification of the Shewhart range chart and, as the paper shows, is more effective than it.

Details

International Journal of Quality & Reliability Management, vol. 28 no. 3
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 6 March 2017

Arash Geramian, Arash Shahin, Sara Bandarrigian and Yaser Shojaie

Abstract

Purpose

The average quadratic quality loss function (QQLF) measures the quality of a given process using the mean shift from its target value and the variance. While it has a target parameter for the mean, it lacks a variance target that could register the progress of a process across different quality levels, above or below the standard level; it therefore appears too general. Hence, this research initially supposes that every process is located in one of two quality spaces, above or below the standard level. The purpose of this paper is to propose a two-criterion QQLF in which each criterion is specifically suited to one of the quality spaces.

Design/methodology/approach

Since 1.33 is the accepted standard or satisfactory value for the two most important process capability indices, Cp and Cpk, the spaces above and below it are taken as the high- and low-quality spaces. The indices are then integrated into the traditional QQLF of the nominal-the-best (NTB) type to develop a two-criterion QQLF in which each criterion is better suited to its quality space. The two criteria have also been embedded in the plan-do-check-act (PDCA) cycle to support continuous improvement. Finally, the proposed function is examined against the traditional one at Feiz Hospital in the province of Isfahan, Iran.
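
For orientation, the sketch below computes the ingredients the paper combines: the traditional NTB quadratic loss L = k[(x̄ − T)² + s²] and the capability indices Cp and Cpk, with the 1.33 benchmark used to classify the quality space. The paper's revised two-criterion loss itself is not reproduced here, and the data are synthetic.

```python
import numpy as np

def qqlf_and_capability(x, target, lsl, usl, k=1.0):
    """Traditional NTB loss plus Cp/Cpk; 1.33 splits the quality spaces."""
    x = np.asarray(x, dtype=float)
    mean, s = x.mean(), x.std(ddof=1)
    loss = k * ((mean - target) ** 2 + s ** 2)        # quadratic quality loss
    cp = (usl - lsl) / (6.0 * s)                      # potential capability
    cpk = min(usl - mean, mean - lsl) / (3.0 * s)     # actual capability
    space = "high-quality space" if min(cp, cpk) >= 1.33 else "low-quality space"
    return loss, cp, cpk, space

rng = np.random.default_rng(2)
sample = rng.normal(10.2, 0.8, size=100)
print(qqlf_and_capability(sample, target=10.0, lsl=7.0, usl=13.0))
```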

Findings

Results indicate that the internal process of the studied case lies in the lower quality space, so the first criterion of the revised QQLF gives a more relevant evaluation of that process than the traditional function. Moreover, both proposed criteria have been embedded in the PDCA cycle.

Research limitations/implications

The two-criterion QQLF is formulated only for observations from normal, symmetric distributions and is offered solely for NTB characteristics; these are limitations of the study.

Practical implications

Two more relevant quality loss criteria have been formulated, applicable to any process (service or manufacturing). To show the breadth of the proposed method, even in service institutions, the emergency function of Feiz Hospital was examined.

Originality/value

The traditional loss function of the NTB type merely and implicitly targets zero defects for the variance; in effect it calculates the quality loss of all processes, wherever they lie in the quality spaces, with the same measure. This study instead gives practitioners the opportunity to target excellent or merely satisfactory levels.

Details

Benchmarking: An International Journal, vol. 24 no. 2
Type: Research Article
ISSN: 1463-5771
