Search results

1 – 10 of 36
Article
Publication date: 1 October 1998

Abstract

This article has been withdrawn as it was published elsewhere and accidentally duplicated. The original article can be seen here: 10.1108/02656719810196360. When citing the article, please cite: Nigel P. Grigg, (1998), “Statistical process control in UK food production: an overview”, International Journal of Quality & Reliability Management, Vol. 15 Iss: 2, pp. 223 - 238.

Details

British Food Journal, vol. 100 no. 8
Type: Research Article
ISSN: 0007-070X

Article
Publication date: 29 September 2020

Nigel P. Grigg

Abstract

Purpose

The purpose of this paper is to present a literature review demonstrating that quality and its management are increasingly definable as a balancing act between value, risk and cost throughout the value stream, from product/service design to production and delivery, and purchaser decision-making. An original framework is presented showing this interplay across the value stream, referred to as the QVRC framework.

Design/methodology/approach

Content analysis is combined with bibliometric analytics, displayed via temporal graphs and citation networks. Reviewed literature is transdisciplinary, encompassing marketing, operations/quality and psychology sources. Core quality management methodologies are positioned on the framework illustrating their relative contribution to value, risk and cost management.

Findings

The QVRC framework is developed and used as a basis for classifying models and methodologies associated with quality management. A set of propositions is developed which, together with the framework, sets an agenda for further research.

Research limitations/implications

No literature review can capture the richness of discourses on terms as pervasive as value, risk and cost. This paper aims to present a systematic and reliable sampling of such literature.

Practical implications

The resulting model can be applied to management tools, and to products and services.

Originality/value

Researchers, particularly in marketing, have developed models of value, risk and cost in terms of products and services. However, delivering products that provide the appropriate value, risk and cost trade-off is an operations management problem. This is the first paper to combine value, risk and cost across the value stream showing how this interplay extends beyond product.

Details

International Journal of Quality & Reliability Management, vol. 38 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 February 2003

Susan M. Ogden and Nigel P. Grigg

Abstract

In 1979, the UK set the standard on which the universally recognised ISO 9000 series was based. Part of the rationale for the creation of a generic quality assurance standard was that it would supplant the need for independent customer inspections, avoid duplications of audits, and coordinate the various national approaches to quality standards. Ironically, however, as the award has grown internationally, there has been a corresponding growth in the number and type of quality standards available to UK organisations. This paper reviews the development of sector‐based quality assurance standards in the UK leisure, hospitality and food industries and draws conclusions on the extent to which the various standards can be aligned. It is found that whereas industry‐specific standards in the food industry dovetail with generic standards, there is a degree of overlap in the hospitality and leisure sectors.

Details

The TQM Magazine, vol. 15 no. 1
Type: Research Article
ISSN: 0954-478X

Article
Publication date: 1 November 1999

Nigel P. Grigg and Lesley Walls

Abstract

Presents a synthesis of the early findings from an ongoing project researching the issues surrounding the use of SPC in a food packing environment. A cognitive mapping approach has been utilised to make sense of the complex and varied data resulting from the survey, case studies and interviews carried out to date. This methodological approach is described, and its application illustrated in relation to the research topic. Argues that SPC is one weapon in an arsenal of quality management techniques that food companies can use to consolidate or improve their position in an increasingly competitive marketplace. Once successfully adopted SPC can offer proven operational and financial benefits, but the ability of the organisation to successfully achieve implementation will depend upon a number of organisational factors. Finally, presents the agenda for further research which outlines how this ongoing project is intended to be taken forward from this point.

Details

British Food Journal, vol. 101 no. 10
Type: Research Article
ISSN: 0007-070X

Article
Publication date: 3 October 2008

Max Saunders, Robin S. Mann and Nigel P. Grigg

Abstract

Purpose

The purpose of this paper is to examine the international use of business excellence (BE) models and the practices used by BE framework (BEF) custodians to encourage use.

Design/methodology/approach

A literature review, three surveys, a series of focus groups and key informant interviews were conducted. The study involved input from 16 countries and was part of a larger study of how BEFs are designed, reviewed, promoted and deployed within and across nations.

Findings

Only two of the 16 BEF custodians had a formal measurement system in place to objectively measure the use of their BEF by organisations over time. Use of the Australian BEF, at 1.3 percent of organisations, was lower than previously estimated, and global use was estimated at between 4 and 15 percent of organisations. The three most effective practices for assisting organisations in applying BE were tours of best or good practice organisations, publications on BE, and an online service/database of BE information.

Research limitations/implications

While the primary focus was on the Australian context, the findings draw upon a range of international sources and hence are of relevance to all BEF custodians.

Practical implications

The findings from the project were used to redesign the ABEF, and are expected to help inform national BE strategies worldwide.

Originality/value

The paper updates the current situation regarding the utilisation of BE in 16 countries, with a focus on Australia.

Details

The TQM Journal, vol. 20 no. 6
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 23 May 2008

Nihal P. Jayamaha, Nigel P. Grigg and Robin S. Mann

Abstract

Purpose

The purpose of this paper is to empirically assess the validity of Baldrige Criteria for Performance Excellence (CPE) for New Zealand organisations and to identify methodological gaps.

Design/methodology/approach

By means of data collected from a sample of 91 New Zealand organisations through a self‐assessment instrument (as a proxy for the CPE), a structural equation model was studied using the partial least squares method. The measurement validity of the CPE, as well as the implied causal relationships in the CPE framework, was tested. A sensitivity analysis was conducted to gain additional insights.

Findings

The measurement validity of the CPE was established; of the 13 implied causal relationships in the CPE framework, 11 were statistically significant, which compared favourably with past studies. The results endorse some salient features of quality management: reliance on measurement, analysis, and knowledge management; the involvement of people; and the role of leadership in setting direction.

Research limitations/implications

As the study was based on a small sample, this model needs to be tested with other data sets. The study revealed the need to meta‐analyse past measurement and structural models as well as measurement instruments.

Practical implications

The study endorsed the reliability and validity of a well designed, well administered, self‐assessment instrument.

Originality/value

As the first New Zealand CPE validity study, the paper introduces the partial least squares method and shows some of its relevant versatile features, introducing some measurement perspectives not conceptualised before in CPE validation studies.

Details

International Journal of Quality & Reliability Management, vol. 25 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 25 February 2014

Zafar Iqbal, Nigel P. Grigg, K. Govinderaju and Nicola Campbell-Allen

Abstract

Purpose

Quality function deployment (QFD) is a methodology to translate the "voice of the customer" into engineering/technical specifications (HOWs) to be followed in the design of products or services. For the method to be effective, QFD practitioners need to be able to accurately differentiate between the final weights (FWs) that have been assigned to HOWs in the house of quality matrix. The paper aims to introduce a statistical testing procedure to determine whether the FWs of HOWs are significantly different, and to investigate the robustness of different rating scales used in QFD practice in contributing to these differences.

Design/methodology/approach

Using a range of published QFD examples, the paper uses a parametric bootstrap testing procedure to test the significance of the differences between the FWs by generating simulated random samples based on a theoretical probability model. The paper then determines the significance or otherwise of the differences between: the two most extreme FWs and all pairs of FWs. Finally, the paper checks the robustness of different attribute rating scales (linear vs non-linear) in the context of these testing procedures.
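A loose sketch of a parametric bootstrap check in this spirit is shown below; the house-of-quality numbers, the Poisson model for customer ratings and the choice of test statistic are all illustrative assumptions, not the authors' data or exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical house-of-quality inputs (illustrative only):
# rows = customer requirements (WHATs), columns = technical attributes (HOWs).
relationships = np.array([[9, 3, 1],
                          [3, 9, 3],
                          [1, 3, 9]])
importance = np.array([5, 4, 3])          # customer importance ratings (1-5 scale)

observed_fw = importance @ relationships  # final weights (FWs) of the HOWs

# Parametric bootstrap: resample the importance ratings from an assumed
# probability model (here, Poisson centred on the observed ratings and
# clipped to the 1-5 scale) and recompute the FWs each time.
n_boot = 10_000
diffs = np.empty(n_boot)
for b in range(n_boot):
    sim = np.clip(rng.poisson(importance), 1, 5)
    fw = sim @ relationships
    # spread between the two most extreme simulated FWs
    diffs[b] = fw.max() - fw.min()

# How unusual is the observed extreme difference under the assumed model?
observed_diff = observed_fw.max() - observed_fw.min()
p_value = np.mean(diffs >= observed_diff)
print(observed_fw, observed_diff, round(p_value, 3))
```

With these illustrative inputs the first two HOWs tie on FW, so only the gap to the third attribute is a candidate for a significant difference.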

Findings

The paper demonstrates that not all of the differences that exist between the FWs of HOW attributes are in fact significant. In the absence of such a procedure, there is no reliable analytical basis for QFD practitioners to determine whether FWs are significantly different, and they may wrongly prioritise one engineering attribute over another.

Originality/value

This is the first article to test the significance of the differences between FWs of HOWs and to determine the robustness of different strengths of scales used in the relationship matrix.

Details

International Journal of Quality & Reliability Management, vol. 31 no. 2
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 2 March 2015

Zafar Iqbal, Nigel Peter Grigg, K. Govindaraju and Nicola Marie Campbell-Allen

Abstract

Purpose

Quality function deployment (QFD) is a planning methodology to improve products, services and their associated processes by ensuring that the voice of the customer has been effectively deployed through specified and prioritised technical attributes (TAs). The purpose of this paper is to enhance the prioritisation of TAs in two ways: through a computer simulation significance test, and through a computer simulation confidence interval. Both are based on permutation sampling, bootstrap sampling and parametric bootstrap sampling of given empirical data.

Design/methodology/approach

The authors present a theoretical case for the use of permutation sampling, bootstrap sampling and parametric bootstrap sampling. Using a published case study, the authors demonstrate how these can be applied to given empirical data to generate a theoretical population. From this the authors describe a procedure for deciding which TAs have significantly different priority, and also estimate confidence intervals from the simulated theoretical populations.
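The three sampling approaches named above can be sketched side by side as follows; the final-weight samples for the two technical attributes are invented for illustration and do not come from the authors' case study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical final-weight samples for two technical attributes (TAs);
# purely illustrative numbers, not the published case-study data.
ta1 = np.array([62, 58, 65, 60, 63, 59, 61, 64], dtype=float)
ta2 = np.array([55, 57, 54, 58, 56, 53, 57, 55], dtype=float)
n_sim = 10_000
observed = ta1.mean() - ta2.mean()

# 1. Permutation sampling: pool the values and reshuffle the labels to
#    build the null distribution of the mean difference.
pooled = np.concatenate([ta1, ta2])
perm = np.empty(n_sim)
for i in range(n_sim):
    rng.shuffle(pooled)
    perm[i] = pooled[:len(ta1)].mean() - pooled[len(ta1):].mean()
p_perm = np.mean(np.abs(perm) >= abs(observed))

# 2. Bootstrap sampling: resample each TA with replacement to get a
#    percentile confidence interval for the difference in means.
boot = np.array([rng.choice(ta1, len(ta1)).mean() -
                 rng.choice(ta2, len(ta2)).mean() for _ in range(n_sim)])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

# 3. Parametric bootstrap: draw from normal models fitted to each TA
#    and take means of simulated samples of the same size.
pb = (rng.normal(ta1.mean(), ta1.std(ddof=1), (n_sim, len(ta1))).mean(axis=1)
      - rng.normal(ta2.mean(), ta2.std(ddof=1), (n_sim, len(ta2))).mean(axis=1))
pb_low, pb_high = np.percentile(pb, [2.5, 97.5])

print(p_perm, (ci_low, ci_high), (pb_low, pb_high))
```

As the abstract notes, the three approaches give similar but not identical answers, because each imposes a different assumption about how the theoretical population is generated.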

Findings

The authors demonstrate that parametric bootstrap is not the only method useful for simulating theoretical populations: permutation sampling and bootstrap sampling can also be employed to generate them. The authors obtain results from all three approaches and explain why the results of permutation sampling, bootstrap sampling and parametric bootstrap sampling differ. Practitioners can employ any of the approaches, depending on how much variation in FWs is required by the quality assurance division.

Originality/value

Using these methods provides QFD practitioners with a robust and reliable method for determining which TAs should be selected for attention in product and service design. The explicit selection of TAs will help to achieve maximum customer satisfaction, and save time and money, which are the ultimate objectives of QFD.

Details

International Journal of Productivity and Performance Management, vol. 64 no. 3
Type: Research Article
ISSN: 1741-0401

Article
Publication date: 1 October 2001

Nigel P. Grigg and Catherine McAlinden

Abstract

Traditional criticisms of the ISO 9000 standards, that they are generic, procedurally‐oriented, expensive and burdensome, are particularly applicable within the food industry. Their lack of fit with industry priorities and requirements, moreover, has created a growth in uptake of alternative “bespoke” standards in the UK, designed to better meet the needs of the industry and demands of the retail customer. The year 2000 revision of ISO 9000 may serve to redefine the role of this standard in the food industry, whereby it can augment such standards and provide a template for Business Excellence. This paper presents an analysis of industry trends in relation to quality standards, and discusses the potential role of ISO 9000:2000 within this sector based upon published data from ISO, industry survey data, and interviews with a major UK food certification body and with technical managers from food companies in the UK and overseas. Implications of such trends are presented in relation to the auditing of UK companies.

Details

British Food Journal, vol. 103 no. 9
Type: Research Article
ISSN: 0007-070X

Article
Publication date: 1 September 2000

Nigel P. Grigg and Jane Williams

Abstract

In July 1999, a consultation paper was issued by the DTI relating to the modernisation of Part V of the Weights and Measures Act 1985. This was in response to issues concerning the complexity of the legislation, the burden it places on traders, and its appropriateness in the modern trading environment. The research described in this paper was begun shortly before publication of the document, with the aim of establishing the views of Trading Standards Officers, who are responsible for enforcing the legislation and bringing prosecutions, on current problems with the legislation and its enforcement. The research was carried out via a survey, the results of which were analysed using exploratory principal components analysis, t‐tests and correlation analysis. Path analysis was used as a final stage to produce a model of the factors that significantly influence officers' perceptions of the legislation and its enforcement.
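The exploratory principal components step in an analysis pipeline like the one above can be sketched roughly as follows; the survey responses, the number of items and the Kaiser eigenvalue-greater-than-one retention rule are illustrative assumptions, and the study's t-test, correlation and path-analysis stages are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical Likert-scale survey responses (20 respondents x 4 items);
# illustrative data only, not the study's actual survey results.
responses = rng.integers(1, 6, size=(20, 4)).astype(float)

# Exploratory PCA on the correlation matrix of the items.
corr = np.corrcoef(responses, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]            # largest component first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()          # proportion of variance explained
loadings = eigvecs * np.sqrt(eigvals)        # component loadings

# A common retention rule: keep components with eigenvalue > 1 (Kaiser).
retained = int(np.sum(eigvals > 1))
print(np.round(explained, 3), retained)
```

The retained components would then serve as candidate factors for the correlation and path-analysis stages that the abstract describes.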

Details

British Food Journal, vol. 102 no. 8
Type: Research Article
ISSN: 0007-070X
