Search results

1 – 10 of over 58,000
Article
Publication date: 6 March 2017

Arash Geramian, Arash Shahin, Sara Bandarrigian and Yaser Shojaie

Abstract

Purpose

The average quadratic quality loss function (QQLF) measures the quality of a process through the shift of its mean from the target value and through its variance. While it includes a target parameter for the mean, it lacks a revisable target for the variance that could register progress of the process across different quality levels, above or below the standard level; it is therefore too general. Hence, this research initially assumes that every process lies in one of two quality spaces, above or below the standard level. The purpose of this paper is to propose a two-criterion QQLF in which each criterion is tailored to one of these quality spaces.
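The nominal-the-best quality loss the abstract describes can be sketched in a few lines; the loss coefficient k and target T below are illustrative values, not taken from the paper:

```python
# Average quadratic quality loss function (QQLF), nominal-the-best form:
#   E[L] = k * ((mu - T)^2 + sigma^2)
# k (loss coefficient) and the target are illustrative, not the paper's values.

def average_quadratic_loss(mu: float, sigma: float, target: float, k: float = 1.0) -> float:
    """Expected quadratic loss from mean shift plus variance."""
    return k * ((mu - target) ** 2 + sigma ** 2)

# A process centred on target loses only through its variance:
on_target = average_quadratic_loss(mu=10.0, sigma=0.5, target=10.0)  # 0.25
shifted = average_quadratic_loss(mu=10.3, sigma=0.5, target=10.0)    # ~0.34
```

Note how the function has an explicit target for the mean but, as the abstract points out, no analogous target for the variance: any nonzero sigma contributes loss regardless of the quality level already achieved.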

Design/methodology/approach

Since 1.33 is conventionally regarded as the standard or satisfactory value for the two most important process capability indices, Cp and Cpk, the regions above and below it are taken as the high- and low-quality spaces. The indices are then integrated into the traditional QQLF of the nominal-the-best (NTB) type to develop a two-criterion QQLF in which each criterion suits one quality space. The two criteria have also been embedded in the plan-do-check-act (PDCA) cycle to support continuous improvement. Finally, the proposed function has been examined against the traditional one at Feiz Hospital in Isfahan province, Iran.
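The two capability indices named above have standard textbook definitions; a minimal sketch with illustrative specification limits (the 1.33 benchmark is the satisfactory value the abstract cites):

```python
# Standard definitions of the process capability indices Cp and Cpk.
# The spec limits and sigma below are illustrative.

def cp(usl: float, lsl: float, sigma: float) -> float:
    """Potential capability: specification width over six-sigma process spread."""
    return (usl - lsl) / (6.0 * sigma)

def cpk(usl: float, lsl: float, mu: float, sigma: float) -> float:
    """Actual capability: penalises an off-centre process mean."""
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

centred = cp(16.0, 4.0, 1.5)                 # 12 / 9 ~ 1.333, the benchmark
actual = cpk(16.0, 4.0, mu=10.0, sigma=1.5)  # equals Cp when centred
```

When the mean drifts off centre, Cpk drops below Cp, which is what lets the abstract's scheme distinguish the high- and low-quality spaces around 1.33.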

Findings

Results indicate that the internal process of the studied case lies in the lower quality space, so the first criterion of the revised QQLF evaluates that process more relevantly than the traditional function does. Moreover, both proposed criteria have been embedded in the PDCA cycle.

Research limitations/implications

Formulating the two-criterion QQLF only for normally and symmetrically distributed observations, and offering it solely for NTB characteristics, are limitations of this study.

Practical implications

Two more relevant quality loss criteria have been formulated for any process, whether service or manufacturing. To show that the proposed method also applies in service institutions, the emergency function of Feiz Hospital has been examined.

Originality/value

The traditional NTB loss function implicitly targets zero variance and calculates the quality loss of processes in different quality spaces with the same measure. This study, however, gives practitioners the opportunity to set either excellent or merely satisfactory targets.

Details

Benchmarking: An International Journal, vol. 24 no. 2
Type: Research Article
ISSN: 1463-5771

Keywords

Article
Publication date: 17 April 2008

Seungwook Park

Abstract

Process capability indices have been widely used to measure process capability and performance. This paper proposes a new process capability index based on the actual dollar loss caused by defects. The new index is similar to Taguchi's loss function and fully incorporates the distribution of the quality attribute in a process. Its strength is that it applies to non-normal or asymmetric distributions. Numerical examples show the superiority of the new index over Cp, Cpk and Cpm, the most widely used process capability indices.
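A minimal sketch of the idea behind such an index: integrate a loss function over the actual, possibly non-normal, distribution of the quality attribute. The lognormal data and the Taguchi-style quadratic loss below are illustrative stand-ins; the paper's exact index formula is not reproduced:

```python
import random

def expected_taguchi_loss(samples, target, k):
    """Monte Carlo estimate of the expected dollar loss E[k * (y - target)^2],
    taken over the empirical distribution rather than a normality assumption."""
    return k * sum((y - target) ** 2 for y in samples) / len(samples)

random.seed(0)
# An asymmetric (lognormal) quality attribute, which a normal-theory
# index like Cp would mischaracterise:
skewed = [random.lognormvariate(0.0, 0.4) for _ in range(50_000)]
loss = expected_taguchi_loss(skewed, target=1.0, k=2.0)
```

Because the estimate is computed directly from the samples, it remains meaningful for skewed distributions, which is the strength the abstract claims for the new index.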

Details

Asian Journal on Quality, vol. 9 no. 1
Type: Research Article
ISSN: 1598-2688

Keywords

Article
Publication date: 1 May 2005

Avinandan Mukherjee and Prithwiraj Nath

Abstract

Purpose

The purpose of this paper is to propose and empirically assess three comparative approaches to measuring service quality: the modified gap model, TOPSIS and the loss function. The paper argues for the use of TOPSIS, from the decision sciences, and the loss function, from operations research and engineering, as alternative approaches to the gap model.

Design/methodology/approach

The empirical evidence is provided by large sample consumer data on the service quality for leading Indian commercial banks. The service quality evaluations obtained from these three distinct methods are compared and tested for their mutual agreement.

Findings

Findings show that the rankings obtained from the different methods are statistically in agreement, suggesting that the alternative approaches can measure service quality equally well. Nevertheless, they should not be used interchangeably.

Research limitations/implications

Research shows that a single measure of overall service quality based on gap model is over‐simplistic. It would be more useful to explore a richer profile of customer service quality provided by different measurement approaches. Each methodology has its own advantages and disadvantages, and should be used based on its suitability for a particular application.

Practical implications

This research offers managers a framework for service quality improvement that measures service quality gaps, selects an optimal combination of attribute levels to deliver customer satisfaction, and focuses on reducing future losses caused by poor quality.

Originality/value

Extant marketing literature is replete with gap model applications for measuring service quality. Drawing from interdisciplinary literature, alternatives are provided to the traditional gap model, which show equally good measurement with greater suitability of application under certain conditions.

Details

Journal of Services Marketing, vol. 19 no. 3
Type: Research Article
ISSN: 0887-6045

Keywords

Article
Publication date: 17 January 2023

Salimeh Sadat Aghili, Mohsen Torabian, Mohammad Hassan Behzadi and Asghar Seif

Abstract

Purpose

The purpose of this paper is to develop a double-objective economic statistical design (ESD) of the X̄ control chart under Weibull failure properties with the Linex asymmetric loss function. The authors express the probability of a type II error (β) as the statistical objective and the expected cost as the economic objective.
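The Linex loss is the standard linear-exponential form, asymmetric so that errors on one side of the target cost more than equal errors on the other; a minimal sketch with illustrative coefficients a and b (not taken from the paper):

```python
import math

def linex_loss(delta: float, a: float = 1.0, b: float = 1.0) -> float:
    """Linex loss L(delta) = b * (exp(a*delta) - a*delta - 1).
    delta is the deviation from target; a controls asymmetry, b the scale."""
    return b * (math.exp(a * delta) - a * delta - 1.0)

zero = linex_loss(0.0)     # no deviation, no loss
over = linex_loss(0.5)     # with a > 0, positive deviations...
under = linex_loss(-0.5)   # ...cost more than equal negative ones
```

The exponential branch grows sharply on one side while the other side is nearly linear, which is what makes the function suitable when over- and under-shooting the target carry unequal economic consequences.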

Design/methodology/approach

The design used in this study is a double-objective economic statistical design of the X̄ control chart with a Weibull shock model, applying Banerjee and Rahim's model to uniform and non-uniform sampling schemes with the Linex asymmetric loss function. The resulting average cost and β of the uniform and non-uniform schemes with the Linex loss function are compared with those of the same schemes without a loss function.

Findings

Numerical results indicate that the type II error and the cost cannot be reduced simultaneously: reducing the type II error increases the cost, and reducing the cost increases the type II error. Both objectives matter, and which takes priority is chosen according to the needs of the industry. These designs define a Pareto-optimal front of solutions that increases the flexibility and adaptability of the X̄ control chart in practice. Replacing uniform schemes with non-uniform schemes decreases the average cost per unit time, while applying the loss function increases it; this quantity also increases for double-objective schemes with the loss function compared with schemes without it, in both the uniform and non-uniform cases. The reason is that, before the loss function was applied, the model underestimated the costs.

Practical implications

This research adds to the body of knowledge related to flexibility in process quality control. This article may be of interest to quality systems experts in factories where the choice between cost reduction and statistical factor reduction can affect the production process.

Originality/value

The cost functions for double-objective uniform and non-uniform sampling schemes with the Weibull shock model based on the Linex loss function are presented for the first time.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 8
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 11 September 2007

Jeh‐Nan Pan

Abstract

Purpose

The purpose of this research is to provide a new loss function‐based risk assessment method so the likelihood and consequence resulting from the failure of a manufacturing or environmental system can be evaluated simultaneously.

Design/methodology/approach

Instead of using separate risk matrices for occurrence and consequence to evaluate manufacturing and environmental risks, an integrated approach is proposed that explores the relationship between the process capability indices Cp, Cpk and Cpm and three loss functions: Taguchi's loss function, the inverted normal loss function (INLF) and the revised inverted normal loss function (RINLF).
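The contrast between Taguchi's unbounded quadratic loss and the bounded inverted normal loss can be sketched directly; the constants K and gamma below are illustrative, and the paper's revised (RINLF) variant is not reproduced:

```python
import math

def taguchi_loss(y: float, target: float, k: float = 1.0) -> float:
    """Quadratic loss: grows without bound as y moves away from target."""
    return k * (y - target) ** 2

def inverted_normal_loss(y: float, target: float, K: float = 1.0,
                         gamma: float = 1.0) -> float:
    """Inverted normal loss: saturates at the maximum loss K."""
    return K * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

# Far from target, the quadratic loss explodes while the INLF levels off:
far_quadratic = taguchi_loss(6.0, 1.0)           # 25.0
far_inlf = inverted_normal_loss(6.0, 1.0)        # just under 1.0 (its cap K)
```

The bounded shape is what makes inverted-normal-style losses attractive for risk assessment: a single catastrophic deviation cannot dominate the expected-loss calculation without limit.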

Findings

The new method of quantitative risk assessment, linking the likelihood and expected loss of failure, is illustrated by two numerical examples. The results suggest that the revised inverted normal loss function (RINLF) be used in assessing manufacturing and environmental risks.

Practical implications

It gives decision‐makers a concrete tool to assess the likelihood and consequence of their processes. Linking the process capability indices and loss functions is particularly promising, as this provides a useful risk assessment tool for practitioners who want to reduce hazardous waste and manufacturing losses from their facilities.

Originality/value

The manufacturing and environmental risks are determined by pairing the process capability indices with the loss function. From the loss function-based estimation, one can quantify the consequence of a manufacturing loss and obtain the severity rating in an objective way.

Details

International Journal of Quality & Reliability Management, vol. 24 no. 8
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 1 June 1997

Saeed Maghsoodloo and Lien‐Hai Huang

Abstract

Mixed bivariate vectors occur when a sampling unit has two different types of response. This is a common occurrence in many manufacturing processes. The traditional optimization approach for such a problem is to analyse each response separately, determine the vital factors for that response, and then choose optimal factor settings by making trade-off adjustments among all factors. Develops a general mixed bivariate model that considers the correlation between the two responses in a general quality loss function. First develops a general quality loss function to evaluate societal losses for a vector response, then develops signal-to-noise ratios as performance measures for the three different mixed responses: (smaller-the-better, larger-the-better), (smaller-the-better, nominal-the-best) and (larger-the-better, nominal-the-best). Introduces simulation to evaluate the efficiency of the performance measures developed herein.
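The univariate signal-to-noise ratios underlying the three mixed cases have standard Taguchi forms; a sketch of those building blocks (the paper's bivariate, correlation-aware extensions are not reproduced):

```python
import math

# Standard Taguchi signal-to-noise ratios (in dB); higher is better for all.

def sn_smaller_the_better(ys):
    """STB: penalises large observed values."""
    return -10.0 * math.log10(sum(y ** 2 for y in ys) / len(ys))

def sn_larger_the_better(ys):
    """LTB: penalises small observed values."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

def sn_nominal_the_best(ys):
    """NTB: rewards a high mean-to-standard-deviation ratio."""
    n = len(ys)
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    return 10.0 * math.log10(mean ** 2 / var)
```

A mixed response pair such as (smaller-the-better, nominal-the-best) applies one ratio to each component; the paper's contribution is combining them while accounting for the correlation between the two responses.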

Details

Benchmarking for Quality Management & Technology, vol. 4 no. 2
Type: Research Article
ISSN: 1351-3036

Keywords

Article
Publication date: 31 July 2009

Yuan Mao Huang and Ching‐Shin Shiau

Abstract

Purpose

The purpose of this paper is to provide an optimal tolerance allocation model for assemblies that considers the manufacturing cost, the quality loss and the design reliability index under various distributions, enhancing existing models. Results of two case studies are presented.

Design/methodology/approach

The paper develops a model that considers the manufacturing cost, Taguchi's asymmetric quadratic quality loss and the design reliability index for the optimal tolerance allocation of assemblies. Normally distributed dimensional variables are first used for testing and compared with data from prior research. Then, lognormally distributed dimensional variables with mean shift and correlation are applied and investigated.
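A minimal sketch of the trade-off such a model optimises: tightening a tolerance raises manufacturing cost but lowers the expected quality loss. The reciprocal cost model and every coefficient below are illustrative placeholders, not the paper's:

```python
# Illustrative trade-off: total cost of a single tolerance t, where
# manufacturing cost falls as t loosens (a + b/t) and the expected
# Taguchi quality loss grows as t loosens (sigma taken as t/3).

def total_cost(t: float, a: float = 2.0, b: float = 1.0, k: float = 4.0) -> float:
    manufacturing = a + b / t            # tighter tolerance -> costlier to make
    quality_loss = k * (t / 3.0) ** 2    # looser tolerance -> more loss
    return manufacturing + quality_loss

# Scan candidate tolerances for the cheapest allocation:
candidates = [0.1 * i for i in range(1, 31)]
best = min(candidates, key=total_cost)
```

Real models, like the paper's, optimise many correlated tolerances at once under a reliability constraint, but the one-dimensional scan shows why an interior optimum exists at all: both extremes are expensive.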

Findings

The results obtained for a lognormal and a normal distribution of the dimensions are similar, but the tolerance under a lognormal distribution is slightly smaller than under a normal distribution. The reliability obtained by Monte Carlo simulation is higher under the lognormal distribution than under the normal distribution. This paper shows that the effects of the mean shift, the correlation coefficient and the replacement cost on the total cost are significant, and designers should pay attention to them during tolerance optimization. Optimum tolerances for the components of a compressor are recommended.

Research limitations/implications

The model is limited to component dimensions following normal and lognormal distributions. It should be enhanced with more data on dimension distributions and on the cost of assembly components.

Practical implications

Two case studies are presented: one is an assembly of two pieces, and the other is a compressor with many components.

Originality/value

This model provides an optimal tolerance allocation method for assemblies with the lowest manufacturing cost, the minimum quality loss, and the required reliability index for the normal distribution and lognormal distribution.

Details

Assembly Automation, vol. 29 no. 3
Type: Research Article
ISSN: 0144-5154

Keywords

Article
Publication date: 1 October 1998

Rune M. Moen

Abstract

Measuring quality costs has been emphasized as an important part of quality improvement since the early 1950s. A chapter on quality costs seems to be almost compulsory in every book pertaining to total quality management, business process improvement, and similar topics. There is no doubt that measuring quality costs is useful for directing improvement efforts; the problem is that the concept is not as valid today as it used to be. While customer requirements and production systems have changed considerably during the last decades, quality cost measurement is advocated in nearly the same way as it was 40 years ago. This work presents a new customer- and process-focused poor-quality cost model that enables the provider of a product or service to focus on the elements that really matter to its customers. The input to the model is customer requirements and the output is expected poor-quality costs estimated through the Taguchi loss function. Quality function deployment is used to translate the voice of the customer into key process parameters, that is, process parameters having a direct influence on the fulfilment of customer requirements. The quality function deployment matrix is also used to estimate intangible costs. Traditional cost categories have been altered, and the expected loss for each cost category is estimated from actual process performance and stepwise quadratic loss functions with multiple intervals. The intended use of the model is as a top management decision-making tool able to link quality improvement to customer satisfaction and loyalty.
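A stepwise quadratic loss with multiple intervals can be sketched as a piecewise function: a different quadratic coefficient applies in each deviation band, so loss grows faster once performance leaves the zone customers tolerate. The interval bounds and coefficients below are invented for illustration; the paper derives its own from QFD data:

```python
# Stepwise quadratic loss: each (bound, k) pair gives the quadratic
# coefficient that applies up to that absolute deviation from target.

def stepwise_quadratic_loss(y: float, target: float, steps) -> float:
    """steps: list of (upper_abs_deviation, k), sorted by deviation; the
    final entry should use float('inf') so every deviation is covered."""
    dev = abs(y - target)
    for bound, k in steps:
        if dev <= bound:
            return k * dev ** 2
    return steps[-1][1] * dev ** 2  # fallback, unreachable with an inf bound

STEPS = [(1.0, 0.5), (2.0, 2.0), (float("inf"), 8.0)]
near = stepwise_quadratic_loss(10.5, 10.0, STEPS)  # inside tolerance: 0.125
far = stepwise_quadratic_loss(13.0, 10.0, STEPS)   # far out: 72.0
```

Compared with a single quadratic, the stepped coefficients let the model reflect that small deviations cost little while deviations past a customer-tolerance boundary escalate sharply.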

Details

The TQM Magazine, vol. 10 no. 5
Type: Research Article
ISSN: 0954-478X

Keywords

Article
Publication date: 1 August 1994

Geanie W. Margavio, Ross L. Fink and Thomas M. Margavio

Abstract

Quality improvement decisions are the catalyst for substantial technological improvements being made in the manufacturing sector. The new technology, however, has developed faster than techniques for evaluating capital investments in such improvements. This is largely because the benefits of quality improvement technology are difficult to quantify. The Taguchi loss function is incorporated into a net present value capital budgeting technique to provide an estimate of these benefits. Describes the loss function in relation to key quality costs: appraisal and prevention costs, and internal and external failure costs. External failure cost savings are generated by reducing variability in the manufacturing process. These savings are then compared with the cost of the quality improving technology. Results indicate that these savings can be substantial, depending on the achieved reduction in the process variability, the cost of capital, and on the estimate of the cost of processing a customer’s return of the product.
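The capital-budgeting idea can be sketched as discounting the annual external-failure savings implied by a Taguchi-loss variance reduction and comparing them with the technology's up-front cost; all figures below are illustrative:

```python
# Illustrative NPV of a quality-improvement investment: annual savings come
# from reduced process variance via the Taguchi loss (centred process, so
# expected unit loss is k * sigma^2).

def annual_taguchi_loss(sigma: float, k: float, volume: int) -> float:
    """Expected yearly external-failure loss: volume * k * sigma^2."""
    return volume * k * sigma ** 2

def npv_of_variance_reduction(cost: float, sigma_before: float,
                              sigma_after: float, k: float, volume: int,
                              rate: float, years: int) -> float:
    saving = (annual_taguchi_loss(sigma_before, k, volume)
              - annual_taguchi_loss(sigma_after, k, volume))
    present_value = sum(saving / (1.0 + rate) ** t for t in range(1, years + 1))
    return present_value - cost

npv = npv_of_variance_reduction(cost=100_000.0, sigma_before=0.4,
                                sigma_after=0.2, k=5.0, volume=200_000,
                                rate=0.10, years=5)
```

As the abstract notes, the result is sensitive to the achieved variance reduction and the cost of capital: halving sigma quarters the expected loss, which is why the savings can justify substantial investment.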

Details

International Journal of Quality & Reliability Management, vol. 11 no. 6
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 5 October 2015

Satyendra Kumar Sharma and Vinod Kumar

Abstract

Purpose

Selection of a logistics service provider (LSP), also known as a third-party logistics (3PL) provider, is a critical decision because logistics affects both the top and the bottom line. Companies treat logistics as a cost driver, and many important decision criteria are left out of the LSP selection decision. 3PL selection is a multi-criteria decision-making process. The purpose of this paper is to develop an integrated approach combining quality function deployment (QFD) and the Taguchi loss function (TLF) to select the optimal 3PL.

Design/methodology/approach

Multiple criteria are derived from the company requirements using the house of quality. The 3PL service attributes are developed using QFD, and the relative importance of the attributes is assessed. TLFs are used to measure the performance of each 3PL on each decision variable. Composite weighted loss scores are used to rank the 3PLs.
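The ranking step can be sketched as weighted Taguchi losses summed per provider: lower composite loss means a better 3PL. The providers, criteria, weights, and larger-the-better loss form below are invented for illustration:

```python
# Composite weighted Taguchi-loss ranking of 3PL candidates.
# Scores are percentages of a target level of 100 (larger is better),
# so loss is the squared shortfall from target.

def taguchi_loss_pct(performance: float, target: float = 100.0,
                     k: float = 1.0) -> float:
    """Loss for a larger-the-better score expressed as % of target."""
    return k * (target - performance) ** 2

def composite_loss(scores: dict, weights: dict) -> float:
    """QFD-weighted sum of per-criterion losses; lower is better."""
    return sum(weights[c] * taguchi_loss_pct(s) for c, s in scores.items())

weights = {"cost": 0.5, "delivery": 0.3, "flexibility": 0.2}
providers = {
    "3PL-A": {"cost": 90.0, "delivery": 95.0, "flexibility": 80.0},
    "3PL-B": {"cost": 95.0, "delivery": 85.0, "flexibility": 90.0},
}
ranking = sorted(providers, key=lambda p: composite_loss(providers[p], weights))
```

The weights are where QFD enters: they would come from the house of quality rather than being set by hand, so the single numerical loss score stays traceable to customer requirements.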

Findings

QFD is an effective tool for connecting the attributes used in a decision problem to the decision maker's requirements. In total, 15 criteria were used, and the TLF measures performance on each of them.

Practical implications

The proposed model provides a methodology for making informed 3PL selection decisions and may be converted into a decision support system.

Originality/value

The approach proposed in this paper is novel: it connects the 3PL selection problem to practice by identifying the relevant criteria and provides a single numerical value in terms of Taguchi loss.

Details

Benchmarking: An International Journal, vol. 22 no. 7
Type: Research Article
ISSN: 1463-5771

Keywords
