Search results
1 – 10 of 209

Seyedeh Zahra Hosseinifard and Babak Abbasi
Abstract
Purpose
In profile monitoring, which is a growing research area in the field of statistical process control, the relationship between response and explanatory variables is monitored over time. The purpose of this paper is to focus on the process capability analysis of linear profiles. Process capability indices give a quick indication of the capability of a manufacturing process.
Design/methodology/approach
In this paper, the proportion of non‐conformance criterion is employed to estimate the process capability index. The paper considers cases where the specification limits are constant or are a function of the explanatory variable X. Moreover, both fixed and random design schemes for profile data acquisition (i.e. for the explanatory variable) are considered. Profiles with deterministic design points are usually used in calibration applications; however, there are other applications where the design points within a profile are i.i.d. random variables from a given distribution.
Findings
Simulation studies using simple linear profile processes, for both fixed and random explanatory variables and with constant and functional specification limits, are conducted to assess the efficacy of the proposed method.
Originality/value
There are many cases in industries, such as the semiconductor industry, where quality characteristics take the form of profiles. There is no method in the literature to analyze process capability for these processes, although quite a few methods for monitoring profiles have recently been presented. The proposed methods provide a framework for quality engineers and production engineers to evaluate and analyze the capability of profile processes.
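The proportion-of-nonconformance idea described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' exact procedure: the profile model, design points, specification half-width and the mapping from nonconformance proportion to an index are all assumptions made for demonstration.

```python
# Hedged sketch: capability of a simple linear profile estimated from the
# proportion of non-conforming profiles. All parameters are illustrative.
import random
from statistics import NormalDist

random.seed(1)
A, B, SIGMA = 3.0, 2.0, 0.2             # true intercept, slope, error SD
X = [0.2, 0.4, 0.6, 0.8, 1.0]           # fixed (equal) design points

def lsl(x): return 3.0 + 2.0 * x - 0.5  # functional spec limits around the
def usl(x): return 3.0 + 2.0 * x + 0.5  # true line (half-width 0.5)

def nonconforming():
    """Simulate one profile; True if any point falls outside spec."""
    return any(not (lsl(x) <= A + B * x + random.gauss(0, SIGMA) <= usl(x))
               for x in X)

N = 20_000
p_nc = sum(nonconforming() for _ in range(N)) / N

# Map the nonconformance proportion to an equivalent index in the spirit of
# yield-based capability measures: C = Phi^{-1}(1 - p/2) / 3.
C = NormalDist().inv_cdf(1 - p_nc / 2) / 3
print(f"p_nc = {p_nc:.4f}, equivalent index = {C:.2f}")
```

With five points at 2.5 standard deviations from each limit, roughly 6 per cent of profiles are nonconforming, giving an equivalent index well below 1.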
Details
Keywords
George N Kenyon, R. Samual Sale, Kurt Hozak and Paul Chiou
Abstract
Purpose
The purpose of this paper is to develop a yield-based process capability index (PCI), Cpy, to overcome the shortcomings of existing PCIs that limit their use and lead to inaccurate measures of quality conformance under a variety of common conditions.
Design/methodology/approach
Cpy is developed conceptually to flexibly and accurately reflect conformance, and is then used to numerically measure the inaccuracies of Cpk.
Findings
Cpy overcomes many of the problems associated with existing PCIs, including Cpk. The degree of non-normality of the process distribution, the quality (sigma) level, and whether the process is centered or shifted left or right all affect the direction and size of the process capability error produced by Cpk. The accuracy of Cpk can be greatly affected by process data that deviate even slightly from normality.
Practical implications
Cpy offers numerous advantages compared to existing PCIs. It accurately reflects process conformance regardless of the process distribution. It is applicable even if the process has multiple characteristics, and with both variable and attribute data. Its calculation is relatively simple, and the necessary data are likely already captured by most organizations.
Originality/value
The main contributions are the development of a new PCI, Cpy; a conceptual analysis of its advantages; and a numerical analysis of the improved accuracy of Cpy as compared to Cpk, for shifted and non-shifted process means, for normal, nearly normal, and highly non-normal distributions, over a range of process variability levels.
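The kind of Cpk error described in these findings is easy to demonstrate. The sketch below is illustrative (it does not reproduce the paper's Cpy): it draws from a highly skewed exponential process, computes the classical Cpk, and compares the actual yield with the yield Cpk implies under its normality assumption.

```python
# Illustrative sketch: Cpk misstates yield when the process is non-normal.
# Spec limits and the exponential process model are assumptions for demo.
import random
from statistics import NormalDist, mean, stdev

random.seed(7)
LSL, USL = 0.0, 6.0
data = [random.expovariate(1.0) for _ in range(100_000)]  # skewed, mean 1

mu, sd = mean(data), stdev(data)
cpk = min(USL - mu, mu - LSL) / (3 * sd)                  # classical Cpk
yield_actual = sum(LSL <= x <= USL for x in data) / len(data)
# Yield that Cpk implicitly claims, under the normality assumption:
yield_implied = NormalDist(mu, sd).cdf(USL) - NormalDist(mu, sd).cdf(LSL)
print(cpk, yield_actual, yield_implied)
```

Here Cpk comes out near 1/3, which under normality would imply roughly 84 per cent yield, while the actual yield of the exponential process exceeds 99.7 per cent: the classical index badly misjudges conformance.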
Patrick Barber, Andrew Graves, Mark Hall, Darryl Sheath and Cyril Tomkins
Abstract
A methodology was developed to measure cost of quality failures in two major road projects, largely based upon a work‐shadowing method. Shows how the initial data were collected and categorised into definable groups and how the costs were estimated for each of these categories. The findings suggest that, if the projects examined are typical, the cost of failures may be a significant percentage of total costs, and that conventional means of identifying them may not be reliable. Moreover, the costs will not be easy to eradicate without widespread changes in attitudes and norms of behaviour within the industry and improved managerial co‐ordination of activities throughout the supply chain.
Hamidreza Izadbakhsh, Rassoul Noorossana and Seyed Taghi Akhavan Niaki
Abstract
Purpose
The purpose of this paper is to apply a Poisson generalized linear model (PGLM) with a log link, instead of multinomial logistic regression, to monitor multinomial logistic profiles in Phase I. Estimating the coefficients thereby becomes easier and more accurate.
Design/methodology/approach
A simulation study is used to assess the performance of the proposed algorithm with four different control charts for monitoring.
Findings
The proposed algorithm is faster and more accurate than the previous algorithms. Simulation results also indicate that the likelihood ratio test method is able to detect out-of-control parameters more efficiently.
Originality/value
The PGLM with a log link has not previously been used to monitor multinomial profiles in Phase I.
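The core fitting step the abstract relies on, estimating a Poisson GLM with a log link, can be sketched in pure stdlib Python. This is a hedged illustration of the model class only (the paper's Phase I charting procedure is not reproduced); the coefficients, design and sample size are invented, and the sampler and Newton solver are generic textbook versions.

```python
# Hedged sketch: fit y_i ~ Poisson(exp(b0 + b1*x_i)) by Newton's method
# (equivalently IRLS). All data-generating parameters are illustrative.
import math, random

def poisson(lam):
    """Knuth's Poisson sampler (stdlib random has no Poisson variate)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

random.seed(3)
B0, B1 = 0.5, 1.2                        # true coefficients (assumed)
xs = [i / 50 for i in range(200)]
ys = [poisson(math.exp(B0 + B1 * x)) for x in xs]

# Newton iterations; the 2x2 normal equations are solved by hand.
b0, b1 = math.log(sum(ys) / len(ys)), 0.0
for _ in range(25):
    g0 = g1 = h00 = h01 = h11 = 0.0
    for x, y in zip(xs, ys):
        mu = math.exp(b0 + b1 * x)       # log link: mu = exp(eta)
        g0 += y - mu;  g1 += (y - mu) * x
        h00 += mu;     h01 += mu * x;  h11 += mu * x * x
    det = h00 * h11 - h01 * h01
    b0 += (h11 * g0 - h01 * g1) / det
    b1 += (h00 * g1 - h01 * g0) / det
print(round(b0, 3), round(b1, 3))
```

The estimates land close to the true (0.5, 1.2), showing why coefficient estimation under the log link is straightforward compared with fitting a full multinomial model.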
James Tannock and Sittichai Saelem
Abstract
Purpose
Many authors have suggested that disruption and associated costs result from poor quality performance in manufacturing. The purpose of this paper is to define and quantify the disruption costs associated with a simple manufacturing scenario using a simulation approach.
Design/methodology/approach
A manufacturing cell incorporating inspection and rework was simulated, and a validation exercise carried out. Using results from the simulation study, the authors then formulate the concept of a cost category for disruption cost, which is compatible with the traditional prevention‐appraisal‐failure (PAF) model for quality costs.
Findings
Comparative graphs of disruption costs and PAF cost elements are presented. The simulated disruption cost is compared with these traditional cost categories and found to represent a significant additional cost at higher levels of non‐conformance.
Research limitations/implications
The results presented in this paper are derived from a discrete‐event simulation exercise, using a model of a simplified generic manufacturing cell. They are believed to be indicative of costs that would occur in practical situations, but are not validated with empirical data. Further work would include such validation.
Practical implications
This is a theoretical paper, which attempts to extend a useful and well established cost model that has been widely accepted in industry.
Originality/value
The originality of this paper lies in the definition of the concept of disruption cost as a separate category of quality cost. The simulation work indicates the potential size and behaviour of the disruption cost compared with the traditional PAF cost categories.
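The disruption-cost idea can be illustrated with a far simpler queue than the paper's simulated cell. In the assumed toy model below, rework incurs a direct internal-failure (PAF) cost, but it also delays every later job; that queueing delay, costed separately, plays the role of disruption cost. All rates and costs are invented for demonstration.

```python
# Minimal sketch (illustrative parameters, not the paper's model): a single
# station where rework both costs rework time (PAF) and delays later jobs
# (disruption).
import random

def cell_costs(p_nc, jobs=20_000, arrive=1.05, proc=1.0, rework=1.0,
               c_rework=1.0, c_delay=1.0, seed=11):
    random.seed(seed)
    free_at, paf, disruption = 0.0, 0.0, 0.0
    for i in range(jobs):
        t_arrive = i * arrive
        start = max(t_arrive, free_at)
        disruption += c_delay * (start - t_arrive)   # queueing-delay cost
        service = proc + (rework if random.random() < p_nc else 0.0)
        if service > proc:
            paf += c_rework * rework                 # direct rework (PAF) cost
        free_at = start + service
    return paf, disruption

for p in (0.02, 0.10):
    paf, dis = cell_costs(p)
    print(p, round(paf), round(dis))
```

At 2 per cent non-conformance the cell keeps up and disruption stays modest; at 10 per cent the rework load exceeds the spare capacity, the backlog grows without bound, and disruption dwarfs the direct rework cost, matching the finding that disruption becomes significant at higher non-conformance levels.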
Vineet Kaushik and Sanjay Dhir
Abstract
Purpose
The purpose of this paper is to study, explore and rank the non-conforming factors in apparels purchased from e-shops.
Design/methodology/approach
Data were collected by visiting colleges and interacting with people there, and through structured online questionnaires (n=222). Exploratory factor analysis was performed using the R software. The identified factors were then ranked using the AHP methodology; 12 experts from various fashion institutes participated in identifying the factors.
Findings
Based upon the results of the exploratory study, non-conforming factors such as "visual variation", "functional inconvenience", "cloth attribute variation", "haptic variation", "aesthetic variation" and "fit variation" were identified, and a priority ranking of the factors and sub-factors was carried out.
Research limitations/implications
The sample primarily comprised young adults (19–27 years), most of them female (71.6 per cent); other demographic groups could be considered. The research is limited to online apparel retailers. Advanced methods of prioritisation could also be used.
Practical implications
The paper can be useful to online apparel retailers, vendors and manufacturers to understand the factors that may be important for improving their business.
Originality/value
No previous study identifies the non-conformance factors related to online apparel retailing.
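The AHP ranking step mentioned in this abstract works from a pairwise comparison matrix. The sketch below uses the common geometric-mean approximation of the priority vector and Saaty's consistency ratio; the factor subset and every pairwise judgement are invented for illustration, not the 12 experts' actual data.

```python
# Illustrative AHP priority ranking (factor names from the abstract; the
# pairwise judgements are assumed, not the study's data).
import math

factors = ["visual variation", "functional inconvenience", "fit variation"]
# Saaty-scale matrix: A[i][j] = relative importance of factor i over j.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]

# Geometric-mean approximation of the principal eigenvector -> weights.
gm = [math.prod(row) ** (1 / len(row)) for row in A]
w = [g / sum(gm) for g in gm]

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
n = len(A)
lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
CR = ((lam - n) / (n - 1)) / 0.58
print([round(x, 3) for x in w], round(CR, 3))
```

A CR below 0.1 is conventionally taken to mean the expert judgements are consistent enough for the ranking to be trusted.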
Abstract
Zero acceptance number plans (c = 0 plans) are sometimes described as the only appropriate method of acceptance sampling in an environment in which zero defects is a meaningful concept. Considers the operating characteristic curves of such plans, and evaluates their effectiveness in the presence of low levels of defects. Suggests a confidence interval approach to establish sample size for c = 0 plans. Refers to alternative approaches.
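The confidence-interval approach to setting sample size for a c = 0 plan can be sketched directly: if a sample of n items contains zero defects, requiring (1 - p)^n <= alpha gives (1 - alpha) confidence that the lot fraction defective is at most p. The code below implements that standard relation; it is a generic illustration, not necessarily the exact derivation the article uses.

```python
# c = 0 plan sample size from the confidence-interval argument:
# zero defects in n items, with (1-p)**n <= alpha, gives 1-alpha confidence
# that the true defect rate is at most p.
import math

def c0_sample_size(p, alpha=0.05):
    """Smallest n such that zero defects in n items bounds the defect
    rate by p at confidence 1 - alpha."""
    return math.ceil(math.log(alpha) / math.log(1 - p))

for p in (0.10, 0.01, 0.001):
    print(p, c0_sample_size(p))
```

For example, 29 defect-free items give 95 per cent confidence that the defect rate is under 10 per cent; bounding it by 0.1 per cent already requires about 3,000 items, which is why c = 0 plans struggle at the low defect levels the abstract discusses.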
Abstract
Taguchi’s quality loss function is here proposed as a tool to evaluate costs connected to non‐conformity in manufacturing industries. First, Taguchi’s quadratic cost function is compared with the traditionally adopted zero‐defects cost function, and the differential effects of the two functions are analysed. Both conceptual and practical issues are considered in this analysis. The results agree in suggesting that the quadratic cost function, compared with the zero‐defects function, is conceptually more consistent with the most advanced definitions of industrial quality, and practically more suitable for supporting continuous improvement processes with a long‐term, customer‐oriented and profit‐oriented perspective. Yet, for a full utilisation of this tool inside quality cost reporting systems, a number of problems still have to be resolved. These problems are also highlighted and brought to the attention of researchers in this field.
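The comparison the abstract describes can be made concrete with Taguchi's standard quadratic loss, L(y) = k(y - T)^2, against the zero-defects step cost (zero inside specification, a fixed cost A outside). The target, half-width and scrap cost below are illustrative; k is calibrated in the usual way so the two losses agree at the specification limit.

```python
# Taguchi quadratic loss vs zero-defects step cost (parameters illustrative).
# k = A / d**2 makes the quadratic loss equal the scrap cost A at the limits.
T, d, A = 10.0, 0.5, 4.0        # target, spec half-width, scrap cost
k = A / d ** 2

def taguchi_loss(y):
    return k * (y - T) ** 2     # loss grows continuously with deviation

def step_loss(y):
    return 0.0 if abs(y - T) <= d else A   # zero-defects view: in spec = free

for y in (10.0, 10.25, 10.5, 10.6):
    print(y, taguchi_loss(y), step_loss(y))
```

The quadratic loss charges something for every deviation from target, which is what makes it compatible with continuous improvement, while the step cost sees no difference between a unit at target and one just inside the limit.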
Abstract
Control charts for statistical quality control have been the subject of academic study for many years. Various analytical approaches to economic control chart design have been advanced, although none has found wide use in practice. Describes a simulation approach to the investigation of control chart economics. Simulation can provide guidance on chart design issues such as sample size, sampling interval and the use of alternative chart alarm rules. Applies the method to the economic comparison between variables control charting and other inspection strategies such as 100 per cent inspection. Presents some generalized results, allowing comparison to be made for various scenarios. Emphasizes the importance of process capability in the choice of quality control strategy and demonstrates the economic advantages of control charting where special or assignable causes exist.
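The simulation approach the abstract advocates starts from quantities like the average run length (ARL) of a chart, which feed any economic comparison of sampling designs. The toy sketch below estimates the in-control and out-of-control ARL of a standard 3-sigma X-bar chart by simulation; the subgroup size, shift magnitude and replication count are assumptions for illustration, and the paper's full cost model is not reproduced.

```python
# Toy sketch: simulated ARL of a 3-sigma X-bar chart, in control and after
# a 1-sigma mean shift (subgroup size and shift are illustrative).
import random
from statistics import mean

random.seed(5)
n = 4                          # subgroup size
limit = 3 / n ** 0.5           # 3-sigma control limits on the subgroup mean

def run_length(shift):
    """Number of subgroups sampled until the chart signals."""
    t = 0
    while True:
        t += 1
        xbar = mean(random.gauss(shift, 1) for _ in range(n))
        if abs(xbar) > limit:
            return t

arl0 = mean(run_length(0.0) for _ in range(2000))   # theory: about 370
arl1 = mean(run_length(1.0) for _ in range(2000))   # theory: about 6.3
print(round(arl0), round(arl1))
```

Multiplying such run lengths by sampling, false-alarm and out-of-control operating costs is the basic route from simulation output to an economic comparison between control charting and alternatives such as 100 per cent inspection.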