Search results

1 – 10 of over 1000
Article
Publication date: 23 May 2008

D.R. Prajapati and P.B. Mahapatra

Abstract

Purpose

The purpose of this paper is to introduce a new design of the chart to catch smaller shifts in the process average while maintaining the simplicity of the Shewhart chart, so that it may be applied at the shopfloor level.

Design/methodology/approach

In this paper, a new chart with two strategies is proposed which can overcome the limitations of the Shewhart, CUSUM and EWMA charts. The Shewhart chart uses only two control limits to decide between the Null Hypothesis (H0) and the Alternative Hypothesis (H1); in the new chart, two more limits at “K” times the sample standard deviation on either side of the center line have been introduced, termed warning limits. The first strategy is based on the chi‐square distribution (CSQ), while the second is based on the average of sample means (ASM).
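
The warning-limit idea can be sketched in a few lines. The function name, the limit multipliers L and K, and the warning-zone rule below are illustrative assumptions, not the authors' tuned design:

```python
import numpy as np

def classify_sample(xbar, mu0, sigma, n, L=3.0, K=1.5):
    """Classify a sample mean against control and warning limits.

    Control limits sit at L*sigma/sqrt(n) from the centre line and
    warning limits at K*sigma/sqrt(n), with K < L. The values of L and
    K are illustrative, not the paper's optimized parameters.
    """
    se = sigma / np.sqrt(n)        # standard error of the sample mean
    z = abs(xbar - mu0) / se
    if z > L:
        return "out of control"    # beyond a control limit: reject H0
    if z > K:
        return "warning zone"      # consult the supplementary statistic
    return "in control"            # accept H0

print(classify_sample(10.8, mu0=10.0, sigma=1.0, n=4))
```

A point in the warning zone would then trigger the supplementary CSQ or ASM statistic rather than an immediate out-of-control signal.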

Findings

The proposed chart with “strategy ASM” shows lower average run length (ARL) values than the variable parameter (VP) chart for most cases. The VP chart performs slightly better than the new chart, but only at large sample sizes (n) of 12 and 16. The VSS chart also shows lower ARLs, but only at very large sample sizes, which should be avoided because, as far as possible, samples should be taken from a lot produced under identical conditions. The inherent strength of the new chart is its simplicity: it uses only a fixed sample size and a fixed sampling interval, so it can be applied without difficulty at the shopfloor level, whereas setting the various chart parameters of the VP and VSS charts is very difficult.

Research limitations/implications

Considerable effort has been expended to develop the new strategies for monitoring the process mean. Various assumptions and factors affecting the performance of the chart have been identified and taken into account. The proposed design assumes that observations are independent of one another; the observations may instead be assumed to be auto‐correlated with previous observations, and the performance of the proposed chart under auto‐correlation may be studied.

Originality/value

The research findings could be applied to various manufacturing and service industries, as the proposed chart is more effective than the Shewhart chart and simpler than the VP, VSS and CUSUM charts.

Details

International Journal of Quality & Reliability Management, vol. 25 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 20 April 2010

D.R. Prajapati and P.B. Mahapatra

Abstract

Purpose

The purpose of this paper is to make an economic comparison of the proposed chart with the economic and economic‐statistical designs of the multivariate exponentially weighted moving average (MEWMA) control chart proposed by Linderman and Love, using the Lorenzen‐Vance cost model.

Design/methodology/approach

The paper discusses the economic design of the proposed chart using the Lorenzen‐Vance cost model. The sampling interval (h) and expected cost per hour (C) depend on the various chart parameters used in this model, so any change in a chart parameter changes both the sampling interval and the expected cost. It is therefore suggested that the Lorenzen and Vance cost model (equation 1) be used to compute the sampling interval and expected cost per hour for the proposed chart.

Findings

The economic design of the proposed chart has been compared with the economic and economic‐statistical designs of the multivariate exponentially weighted moving average (MEWMA) control chart proposed by Linderman and Love. The proposed chart is found to perform better than the MEWMA chart of Linderman and Love for sample sizes of 7, 9 and 10 under the first set of parameters. The proposed chart also shows a lower expected cost per hour than the MEWMA chart for sample sizes of 2 and 3, and for shifts of 2 and 3, under the second set of parameters.

Research limitations/implications

Considerable effort has been made to develop the proposed chart for monitoring the process mean. Although optimal sampling intervals are calculated for only two sets of parameters, for shifts in the process average of 1, 2 and 3, they can be computed for any set of parameters using the Lorenzen‐Vance cost model.

Originality/value

The research findings could be applied to various manufacturing and service industries, as the proposed chart is more effective than the Shewhart and EWMA charts.

Details

International Journal of Quality & Reliability Management, vol. 27 no. 4
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 22 May 2009

D.R. Prajapati and P.B. Mahapatra

Abstract

Purpose

The purpose of this paper is to introduce a new design of an R chart to catch smaller shifts in the process dispersion while maintaining simplicity, so that it may be applied at the shopfloor level.

Design/methodology/approach

A new R chart is proposed here which can overcome the limitations of the Shewhart, CUSUM and EWMA range charts. The concept of this R chart is based on the chi‐square (χ2) distribution. Although CUSUM and EWMA charts are very useful for catching small shifts in the mean or standard deviation, they can catch a process shift only when there is a single, sustained shift in the process average or standard deviation.

Findings

It was found that the proposed chart performs significantly better than the conventional (Shewhart) R chart and the CUSUM range schemes proposed by Chang and Gan for most process shifts in standard deviation; the ARLs of the proposed R chart are higher than those of the CUSUM schemes in only ten cases out of 40. The performance of the proposed R chart has also been compared with the variance chart proposed by Chang and Gan for various shifts in standard deviation. Comparing ARLs for a sample size of three, it can be concluded that the proposed R chart is much better than Chang's variance chart for all shift ratios at that sample size. Many difficulties related to the operation and design of CUSUM and EWMA control charts are greatly reduced by the simple and accurate proposed R chart scheme, whose performance characteristics (ARLs) are very comparable with those of FIR CUSUM, simple CUSUM and other variance charts. It can be concluded that, instead of considering many parameters, it is better to use a single sample size and a single set of control limits, because a control chart loses its simplicity as the number of parameters grows, and practitioners may find such charts difficult to apply in production processes. Moreover, CUSUM control charts are not effective unless there is a single, sustained shift in the process dispersion.

Research limitations/implications

Considerable effort has been devoted to developing the new range charts for monitoring the process dispersion. Various assumptions and factors affecting the performance of the R chart have been identified and taken into account. The proposed design assumes that observations are independent of one another; the observations may instead be assumed to be auto‐correlated with previous observations, and the performance of the proposed R chart under auto‐correlation may be studied.

Originality/value

The research findings could be applied to various manufacturing and service industries, as the proposed chart is more effective than the conventional (Shewhart) R chart and simpler than CUSUM charts.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 4 January 2013

George J. Besseris

Abstract

Purpose

The purpose of this paper is to propose a methodology that may assist quality professionals in assessing process variation with a combination of tools based on simple robust statistics. The technique targets an alternative way of screening for and detecting common and special causes in individuals' control charts (ICC).

Design/methodology/approach

The technique uses the classical box plot to detect and filter out outliers attributed to special causes. The runs test is then used to partition the data streak at points where the p‐value exceeds an assigned critical value; the transition between partitions is where the onset of a common cause is suspected.
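
As a rough illustration of the two-stage idea, the sketch below filters outliers with Tukey's box-plot fences and computes a Wald-Wolfowitz runs-test p-value about the median. The function names, the fence multiplier and the normal approximation are assumptions, not the author's exact procedure:

```python
import numpy as np
from scipy.stats import norm

def boxplot_filter(x, k=1.5):
    """Split data into in-fence values and box-plot outliers
    (special-cause suspects) using Tukey's k*IQR fences."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    keep = (x >= lo) & (x <= hi)
    return x[keep], x[~keep]

def runs_test_pvalue(x):
    """Two-sided Wald-Wolfowitz runs test about the median,
    using the large-sample normal approximation."""
    med = np.median(x)
    above = (x > med)[x != med]          # drop ties with the median
    n1, n2 = above.sum(), (~above).sum()
    runs = 1 + np.count_nonzero(above[1:] != above[:-1])
    mu = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    z = (runs - mu) / np.sqrt(var)
    return 2 * norm.sf(abs(z))
```

In use, one would first trim the data with `boxplot_filter` and then scan the cleaned streak with the runs test to locate candidate partition points.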

Findings

The approach presented is supplemented with a case study from foundry operations in large‐scale can‐making operations. It is demonstrated how the magnesium content of an aluminium alloy is trimmed against special causes and then the location of the common causes is identified in a step‐by‐step fashion.

Research limitations/implications

The proposed method is useful when the collected data do not appear to follow a known reference distribution. Since initial monitoring of a process rarely abides by normality, the technique saves the time spent re-running control charting that often points to misleading assignable causes, because the outliers are identified through the box plot in a single pass.

Practical implications

The technique quickly identifies and eliminates the outlying values that tend to cause major instability in a process. Moreover, the onset of non‐assignable data points is detected expediently, without remodelling the inspected data series each time or testing against a score of multifarious test rules. The ingredients of the method have been well researched in the past; therefore, they may be implemented immediately without further need to prove their worth.

Originality/value

The method combines two distinct robust tools in a unique manner to help fortify quality monitoring against data-model inconsistencies. The technique is suitable for controlling processes that generate numerical readings; as such, the approach is projected to be useful for industrial as well as service operations.

Article
Publication date: 15 March 2011

D.R. Prajapati

Abstract

Purpose

The concept of the proposed R chart is based on the sum of chi‐squares (χ2). The average run lengths (ARLs) of the proposed R chart are computed and compared with those of a standard R chart, the Shewhart variance chart proposed by Chang and Gan, a CUSUM range chart (with and without the FIR feature) proposed by Chang and Gan, and an EWMA range chart proposed by Crowder and Hamilton, for various chart parameters. This paper aims to show that only the FIR CUSUM schemes perform better than the proposed R chart; the other CUSUM and EWMA schemes are less efficient.

Design/methodology/approach

The concept of the proposed R chart is based on the sum of chi‐squares (χ2). The proposed R chart divides the plot area into three regions: an outright rejection region, an outright acceptance region and a transition region. The null hypothesis is rejected if a point falls beyond the control limit and accepted if it falls below the warning limit. When a point falls beyond the warning limit but not beyond the control limit, the decision is based on the individual observations of the previous H samples, which are used to evaluate the statistic U, the sum of chi‐squares. The null hypothesis is rejected if U exceeds a predefined value (U*) and accepted otherwise.
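
The three-region rule can be sketched as follows. The abstract does not give the exact form of U, so the sum-of-squares statistic below is an illustrative assumption, as are the function name and threshold values:

```python
import numpy as np

def decide(point, ucl, wl, recent_obs, sigma0, u_star):
    """Three-region decision rule sketched from the description above.

    `point` is the plotted range statistic; `recent_obs` holds the
    individual observations of the previous H samples. The form of U
    used here, U = sum((x - mean)^2) / sigma0^2 (chi-square under the
    null for normal data), is an assumed stand-in for the paper's
    statistic.
    """
    if point > ucl:
        return "reject H0"         # outright rejection region
    if point <= wl:
        return "accept H0"         # outright acceptance region
    # transition region: fall back on the pooled statistic U
    obs = np.concatenate(recent_obs)
    u = np.sum((obs - obs.mean()) ** 2) / sigma0 ** 2
    return "reject H0" if u > u_star else "accept H0"
```

Only points landing between the warning and control limits incur the extra computation, which is what keeps the chart simple in routine operation.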

Findings

The comparisons also show that the CUSUM, EWMA and proposed R charts outperform the Shewhart R chart by a substantial margin. It is concluded that only the FIR CUSUM schemes perform better than the proposed R chart, which ranks second; the other CUSUM and EWMA schemes are less efficient.

Research limitations/implications

CUSUM and EWMA charts can catch a small shift in the process average, but they are not efficient at catching a large shift. Many researchers have also pointed out that these charts' applicability is limited to the chemical industries. Another limitation of CUSUM and EWMA charts is that they can catch a shift only when there is a single, sustained shift in the process average; if the shift is not sustained, they will not be effective.

Practical implications

Many difficulties related to the operation and design of CUSUM and EWMA control charts are greatly reduced by the simple and accurate proposed scheme. The performance characteristics (ARLs) of the proposed charts described in this paper are very comparable with those of the FIR CUSUM, CUSUM, EWMA and other charts. It can be concluded that, instead of considering the many chart parameters used in CUSUM and EWMA charts, it is better to adopt a simple and more effective scheme, because a control chart loses its simplicity with multiple parameters, and practitioners may experience difficulty using such charts in production processes.

Originality/value

The proposed chart is a modification of the Shewhart range chart but is more effective, as shown in the paper.

Details

International Journal of Quality & Reliability Management, vol. 28 no. 3
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 6 June 2016

D.R. Prajapati and Sukhraj Singh

Abstract

Purpose

Process outputs from most industries are found to be correlated, and the performance of the X-bar chart deteriorates as the level of correlation increases. The purpose of this paper is to compute the level of correlation among observations of tablet weights from a pharmaceutical industry by using a modified X-bar chart.

Design/methodology/approach

The design of the modified X-bar chart is based upon the sum of χ2s, using warning limits, and the performance of the chart is measured in terms of average run lengths (ARLs). The ARLs at various sets of parameters of the modified X-bar chart are computed using MATLAB software at the given mean and standard deviation.
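
A Monte-Carlo sketch of how ARLs might be estimated for correlated data is given below (in Python rather than MATLAB). The AR(1) model for the correlation, the 3-sigma Shewhart-style limit and all parameter values are illustrative assumptions, not the authors' scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

def arl_xbar(phi, shift=0.0, n=4, L=3.0, runs=200, max_len=10000):
    """Monte-Carlo ARL of an X-bar chart on AR(1) observations.

    phi is the lag-1 autocorrelation of successive observations; n=4
    follows the sample size used in the paper, but the chart simulated
    here is a plain 3-sigma X-bar chart, not the modified chart itself.
    """
    lengths = []
    for _ in range(runs):
        x = 0.0                       # AR(1) state, reset per run
        for t in range(1, max_len + 1):
            sample = np.empty(n)
            for i in range(n):        # one correlated sample of size n
                x = phi * x + rng.normal() * np.sqrt(1 - phi ** 2)
                sample[i] = x + shift
            if abs(sample.mean()) > L / np.sqrt(n):
                lengths.append(t)     # first out-of-limit signal
                break
        else:
            lengths.append(max_len)   # censored run
    return float(np.mean(lengths))
```

With phi = 0 and no shift this recovers the familiar in-control ARL of roughly 370; raising phi inflates the variance of the sample mean, which is one way the deterioration under correlation shows up.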

Findings

The performance of the modified X-bar chart is computed for a sample size of four, and ARLs of optimal schemes of the X-bar chart for this sample size are computed. Various optimal schemes of the modified X-bar chart for a sample size (n) of four at correlation levels (Φ) of 0.00, 0.25, 0.50, 0.75 and 1.00 are presented in this paper. Samples of tablet weights were taken from a pharmaceutical industry, and the level of correlation among the weight observations was computed; in this case study, the observations closely resemble the simulated observations at a correlation level of 0.75. The performance of the modified X-bar chart for a sample size (n) of four at correlation levels (Φ) of 0.50 and 0.75 is also compared with the conventional (Shewhart) X-bar chart, and it is concluded that the modified X-bar chart performs better.

Research limitations/implications

All the schemes are optimized by assuming the normal distribution, but this assumption may be relaxed to design these schemes for autocorrelated data. The optimal schemes for the modified X-bar chart can also be used in other industries where the manufacturing time of products is small, and the scheme may be used with any sample size suitable for the industry.

Practical implications

The optimal scheme of the modified X-bar chart for a sample size (n) of four is used according to the computed level of correlation in the observations. The simple design of the modified X-bar chart makes it more useful at the shop-floor level for the many industries where correlation exists: the correlation among an industry's process outputs can be found, and the control chart parameters suggested for that level of correlation can be used.

Social implications

The design of the modified X-bar chart uses very few parameters, so it can be applied at the shop-floor level with ease. The rejection level of products in industry can be reduced by designing better control chart schemes, which will also reduce the loss to society, as suggested by Taguchi (1985).

Originality/value

Although it is an extension of previous work, the chart can be applied to various manufacturing and service industries where the data are correlated and normally distributed.

Details

International Journal of Quality & Reliability Management, vol. 33 no. 6
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 27 July 2012

Anupam Das, J. Maiti and R.N. Banerjee

Abstract

Purpose

Monitoring a process, detecting faults and determining their root causes are essential for producing consistently good-quality end products with improved yield. The history of process monitoring and fault detection (PMFD) strategies can be traced back to the 1930s; since then, various tools, techniques and approaches have been developed and applied in diversified fields. The purpose of this paper is to review, categorize, describe and compare the various PMFD strategies.

Design/methodology/approach

A taxonomy was developed to categorize PMFD strategies, with categorization based on the type of techniques employed in devising them. PMFD strategies are then discussed in detail, with emphasis on their areas of application, and comparative evaluations of the strategies are carried out on some commonly identified issues. A general framework common to all PMFD strategies is presented and, lastly, the future scope of research is discussed.

Findings

The techniques employed for PMFD are primarily of three types: data-driven techniques, such as statistical model-based and artificial intelligence-based techniques; a priori knowledge-based techniques; and hybrid models, with a huge dominance of the first type. The factors that should be considered in developing a PMFD strategy are ease of development, diagnostic ability, fault detection speed, robustness to noise, generalization capability and handling of nonlinearity. The review reveals that no single strategy can efficiently address all aspects of process monitoring and fault detection, and that techniques from various PMFD strategies need to be meshed to devise a more efficient one.

Research limitations/implications

The review documents the existing strategies for PMFD with an emphasis on the nature of the strategies, their data requirements, model-building steps, applicability and scope for amalgamation. It helps future researchers and practitioners choose appropriate techniques for PMFD studies in a given situation, and gives them a comprehensive but precise report on the PMFD strategies available in the literature to date.

Originality/value

The review starts by identifying key indicators of PMFD, and a taxonomy is proposed. An analysis identifies the pattern of published articles on PMFD, followed by the evolution of PMFD strategies. Finally, a general framework for PMFD strategies is given for future researchers and practitioners.

Details

International Journal of Quality & Reliability Management, vol. 29 no. 7
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 11 July 2022

Sunil Kumar Jauhar, Hossein Zolfagharinia and Saman Hassanzadeh Amin

Abstract

Purpose

This research is about embedding service-based supply chain management (SCM) concepts in the education sector. Due to Canada's competitive education sector, the authors focus on Canadian universities.

Design/methodology/approach

The authors develop a framework for evaluating and forecasting university performance using data envelopment analysis (DEA) and artificial neural networks (ANNs) to assist education policymakers. The application of the proposed framework is illustrated with information from 16 Canadian universities, investigating their teaching and research performance.
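
The DEA stage of such a framework can be sketched with a textbook input-oriented CCR model solved as a linear program. The matrices, function name and tiny example are assumptions for illustration, not the authors' data or exact model:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k via the envelopment LP.

    X: (inputs x units) matrix, Y: (outputs x units) matrix. Returns
    theta in (0, 1], where 1 means unit k lies on the efficient
    frontier. This is a textbook DEA sketch, not the paper's model.
    """
    m, n_units = X.shape
    s = Y.shape[0]
    # decision vector: [theta, lambda_1 .. lambda_n]
    c = np.zeros(1 + n_units)
    c[0] = 1.0                       # minimise theta
    A_ub = np.zeros((m + s, 1 + n_units))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, k]           # sum λ_j x_j <= theta * x_k
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y                # sum λ_j y_j >= y_k
    b_ub[m:] = -Y[:, k]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (1 + n_units))
    return res.fun
```

The resulting efficiency scores per university could then serve as training targets for the ANN prediction stage.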

Findings

The major findings are (1) applying the service SCM concept to develop a performance evaluation and prediction framework, (2) demonstrating the application of DEA-ANN for computing and predicting the efficiency of service SCM in Canadian universities, and (3) generating insights to enable universities to improve their research and teaching performances considering critical inputs and outputs.

Research limitations/implications

This paper presents a new framework for universities' performance assessment and performance prediction. DEA and ANN are integrated to aid decision-makers in evaluating the performances of universities.

Practical implications

The findings suggest that higher education policymakers should monitor attrition rates at graduate and undergraduate levels and provide financial support to facilitate research and concentrate on Ph.D. programs. Additionally, the sensitivity analysis indicates that selecting inputs and outputs is critical in determining university rankings.

Originality/value

This research proposes a new integrated DEA and ANN framework to assess and forecast future teaching and research efficiencies applying the service supply chain concept. The findings offer policymakers insights such as paying close attention to the attrition rates of undergraduate and postgraduate programs. In addition, prioritizing internal research support and concentrating on Ph.D. programs is recommended.

Details

Benchmarking: An International Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1463-5771

Article
Publication date: 12 August 2021

Hale Yalcin and Sema Dube

Abstract

Purpose

The authors examine whether Turkish fund managers employ liquidity timing along with market return timing, and whether additional economic and market factors could affect their timing abilities, to help explain the contradictory results in the literature vis-a-vis market timing ability.

Design/methodology/approach

The authors apply panel data analyses, with interaction terms and incorporating structural breaks, to monthly data for 96 of the 131 Turkish variable mutual funds with available data for the 2011–2018 sample period. They employ the Amihud (2002) illiquidity measure to study market liquidity timing ability, along with how additional economic and market factors affect this ability.
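
The Amihud (2002) measure itself is simple to compute: the average over periods of absolute return divided by traded volume. A minimal sketch, with an assumed scaling factor:

```python
import numpy as np

def amihud_illiquidity(returns, volumes):
    """Amihud (2002) illiquidity: mean of |return| / trading volume.

    `returns` are periodic (e.g. daily) returns and `volumes` the
    matching currency trading volumes; the 1e6 scaling factor is an
    illustrative convention, not prescribed by the paper.
    """
    r = np.asarray(returns, dtype=float)
    v = np.asarray(volumes, dtype=float)
    mask = v > 0                     # skip zero-volume periods
    return 1e6 * np.mean(np.abs(r[mask]) / v[mask])
```

Higher values indicate that small trading volumes move prices more, i.e. lower liquidity; a fund manager "times" liquidity by adjusting exposure as this measure varies.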

Findings

The authors find that liquidity timing is the performance-enhancing method employed by Turkish variable fund managers in conjunction with market timing, and that evidence for market timing may depend on whether structural breaks, which may be present in returns, are incorporated in the analysis. The authors also find that economic, technology and market-related factors affect the timing abilities of fund managers.

Research limitations/implications

Conclusions are for Turkey, for the sample period studied, and for the control factors selected based on literature.

Practical implications

It is important to understand the role of market liquidity in making investment decisions, and the paper contributes toward an understanding of how managers design their timing strategies to enhance portfolio performance, as well as how additional factors affect their ability to time market returns and liquidity. This is also important for evaluating fund managers' performance in terms of contribution to portfolio value.

Originality/value

To the authors' knowledge, this is the first study on Turkish markets to employ liquidity timing in panel data analyses using interaction terms and structural breaks, distinguishing the extent of liquidity timing from return timing while incorporating the effect of additional factors on timing ability.

Article
Publication date: 3 September 2018

Rosembergue Pereira Souza, Luiz Fernando Rust da Costa Carmo and Luci Pirmez

Abstract

Purpose

The purpose of this paper is to present a procedure for finding unusual patterns in accredited tests using a rapid processing method for analyzing video records. The procedure uses the temporal differencing technique for object tracking and considers only frames not identified as statistically redundant.

Design/methodology/approach

An accreditation organization is responsible for accrediting facilities to undertake testing and calibration activities, and periodically evaluates accredited testing facilities. These evaluations can use video records and photographs of the tests performed by a facility to judge conformity to technical requirements. To validate the proposed procedure, a real-world data set of video records from accredited testing facilities in the field of vehicle safety in Brazil was used, and the processing time of the proposed procedure was compared with the time needed to process the video records in a traditional fashion.
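
A minimal sketch of temporal differencing with redundant-frame skipping, as described above; the thresholds, greyscale frame format and function name are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def flag_active_frames(frames, motion_thresh=10.0, redundancy_thresh=1.0):
    """Temporal differencing over a clip, skipping redundant frames.

    `frames` is a sequence of greyscale arrays. A frame whose mean
    absolute difference from the last kept frame falls below
    `redundancy_thresh` is treated as statistically redundant and
    skipped; a difference above `motion_thresh` flags motion.
    """
    kept, active = [], []
    prev = None
    for idx, f in enumerate(frames):
        f = f.astype(float)
        if prev is not None:
            mad = np.mean(np.abs(f - prev))
            if mad < redundancy_thresh:
                continue           # redundant frame: do not reprocess
            if mad > motion_thresh:
                active.append(idx) # motion detected in this frame
        kept.append(idx)
        prev = f
    return kept, active
```

Skipping the redundant frames is what yields the faster processing time reported in the findings, since only a fraction of the video needs full analysis.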

Findings

With an appropriate threshold value, the proposed procedure successfully identified video records of fraudulent services, and its processing time was faster than that of a traditional method.

Originality/value

Manually evaluating video records is time-consuming and tedious. This paper proposes a procedure to rapidly find unusual patterns in videos of accredited tests with a minimum of manual effort.

Details

International Journal of Quality & Reliability Management, vol. 35 no. 8
Type: Research Article
ISSN: 0265-671X
