Search results

1 – 10 of over 74,000
Article
Publication date: 1 August 1999

M. Xie and T.N. Goh

Abstract

When objective decisions are to be made, statistical methods should be applied to the objective information available in the form of data collected about a product or process. Statistical techniques such as control charts, process capability indices and design of experiments have been used in the manufacturing industry for many years. There are a number of practical and managerial issues related to the application of statistical techniques in studies aimed at improving process and product quality. This paper summarises the thoughts and discussions from a recent Internet conference on this issue. Statistical process control techniques and their role in process improvement are discussed first, and some issues related to the interpretation and use of experimental design techniques are also summarised. The focus is on continuous quality improvement using statistical techniques.
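The abstract above mentions control charts, process capability indices and design of experiments. As a minimal illustration of one of these techniques (a process capability calculation, not taken from the paper; the specification limits and measurements below are hypothetical), a sketch in Python:

```python
import numpy as np

def process_capability(data, lsl, usl):
    """Compute Cp and Cpk for a measured characteristic.

    Cp  = (USL - LSL) / (6 * sigma)                  -- potential capability
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)  -- actual capability
    """
    mean = np.mean(data)
    sigma = np.std(data, ddof=1)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical example: shaft diameters with specification limits 9.9-10.1 mm
rng = np.random.default_rng(0)
diameters = rng.normal(loc=10.02, scale=0.02, size=50)
cp, cpk = process_capability(diameters, lsl=9.9, usl=10.1)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```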

Details

The TQM Magazine, vol. 11 no. 4
Type: Research Article
ISSN: 0954-478X

Article
Publication date: 1 July 2006

Nuno R.P. Costa, António R. Pires and Celma O. Ribeiro

Abstract

Purpose

The purpose of this paper is to focus on the application of design of experiments (DOE) using industrial equipment, reinforcing the idea that non-statistical aspects in planning and conducting experiments are as important as formal design and analysis.

Design/methodology/approach

Two case studies are presented to illustrate typical industrial applications and difficulties. Drawing on these case studies and the literature, the paper presents guidelines for planning, conducting and analysing experiments, covering both technical and organizational aspects.
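As a minimal sketch of the kind of formal design the guidelines refer to (a two-level full factorial plan with main-effect estimation; the factors, levels and responses are hypothetical, not from the case studies):

```python
from itertools import product
import numpy as np

# Hypothetical 2^3 full factorial design: three process factors at two coded levels
factors = ["temperature", "pressure", "cycle_time"]
design = np.array(list(product([-1, 1], repeat=len(factors))))  # 8 runs

# Hypothetical responses, measured after running the experiments in random order
response = np.array([71.0, 74.2, 68.5, 80.1, 72.3, 75.0, 69.8, 82.6])

# Main effect of each factor: mean response at +1 minus mean response at -1
for j, name in enumerate(factors):
    effect = response[design[:, j] == 1].mean() - response[design[:, j] == -1].mean()
    print(f"{name:12s} main effect = {effect:+.2f}")
```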

Findings

Solving problems in industry, including in companies recognized as competent in their industrial sector, is not just a question of applying the right technique. The ceramic industry case study illustrates how important non-statistical issues are in DOE application. The paint industry case study illustrates how strongly the results depend on incorporating the presented guidelines into practice. Moreover, both case studies consolidate a fundamental advantage of DOE: experimentation provides more knowledge about products, processes and technologies, even when the study itself is unsuccessful.

Research limitations/implications

Unsuccessful case studies are very useful for identifying pitfalls and other limitations. This paper highlights difficulties arising from non-statistical aspects, although unsuccessful case studies due to statistical issues can also be found. Papers illustrating inadequate application of statistical techniques are therefore welcome.

Practical implications

Successful DOE implementation depends on both statistical and non-statistical aspects. Although neither should be neglected, technical skills and technological knowledge about processes and products, management understanding of the potential of statistical techniques, and knowledge of statistical fundamentals and DOE techniques must all be ensured for successful application in industrial settings.

Originality/value

This paper highlights the non-statistical aspects rather than the statistical ones. To overcome the difficulties identified, structured guidelines were designed to support DOE application in industrial settings.

Details

The TQM Magazine, vol. 18 no. 4
Type: Research Article
ISSN: 0954-478X

Article
Publication date: 1 January 1987

John S. Oakland and Amrik Sohal

Abstract

The work described in this paper is part of a large study of the barriers to acceptance of production management techniques in UK manufacturing industry. The first part of this study is described here; it: (i) establishes the use being made of proven traditional techniques of production management and operational research/statistical techniques by British production managers; and (ii) begins to investigate the barriers to acceptance of the techniques. The results reveal that in UK industry there is low usage of many of the techniques, particularly the highly quantitative ones. The major barrier preventing usage is lack of knowledge; training in production management has been found to be an extremely important factor in the usage of all the techniques examined.

Details

International Journal of Operations & Production Management, vol. 7 no. 1
Type: Research Article
ISSN: 0144-3577

Article
Publication date: 1 May 2005

Miltiadis Makrymichalos, Jiju Antony, Frenie Antony and Maneesh Kumar

Abstract

Purpose

The purpose of this paper is to demonstrate the vital linkage between six sigma and statistical thinking. The paper also explains the key characteristics required for statistical thinking and some of the common barriers in the implementation of the key principles of statistical thinking.

Design/methodology/approach

The objectives of the paper have been achieved in several ways. The paper provides the key principles of statistical thinking and then discusses the possible reasons for lack of statistical thinking in modern organizations. The paper then illustrates the linkage between the statistical principles and six sigma. The tools and techniques of six sigma used within statistical thinking are also highlighted in the paper.

Findings

The key findings of this work include the relationship between two powerful methodologies, six sigma and statistical thinking; the reasons for the lack of application of statistical thinking in organizations; the future role of managers and engineers in companies with regard to the era of statistical thinking; and the commonalities in the tools and techniques applied by the two methodologies.
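For concreteness, the standard six sigma convention linking defect counts to a sigma level, via defects per million opportunities (DPMO) and the conventional 1.5-sigma shift, can be sketched as follows (the data are hypothetical and this is not a result from the paper):

```python
from statistics import NormalDist

def sigma_level(defects, units, opportunities_per_unit):
    """Convert observed defects to DPMO and a short-term sigma level.

    Uses the conventional 1.5-sigma shift: sigma = z(1 - DPMO/1e6) + 1.5
    """
    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    z = NormalDist().inv_cdf(1 - dpmo / 1_000_000)
    return dpmo, z + 1.5

# Hypothetical example: 45 defects observed in 5,000 units with 4 opportunities each
dpmo, sigma = sigma_level(defects=45, units=5_000, opportunities_per_unit=4)
print(f"DPMO = {dpmo:.0f}, sigma level = {sigma:.2f}")
```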

Research limitations/implications

The paper needs more justification through surveys and case examples, and this will be the next step of this study. In fact, one of the co-authors is currently conducting a survey of UK organizations to investigate the relationship between statistical thinking and six sigma.

Practical implications

The paper is very practical in nature and yields great value to those who are currently embarking on a six sigma program, especially at senior manager and executive levels.

Originality/value

Very little has been published in the field of statistical thinking in the UK academic world. In fact, there is a cognitive gap in this field, and this paper forms a good platform for further research to bridge that gap.

Details

Managerial Auditing Journal, vol. 20 no. 4
Type: Research Article
ISSN: 0268-6902

Article
Publication date: 25 July 2019

Yinhua Liu, Rui Sun and Sun Jin

Abstract

Purpose

Driven by the development in sensing techniques and information and communications technology, and their applications in the manufacturing system, data-driven quality control methods play an essential role in the quality improvement of assembly products. This paper aims to review the development of data-driven modeling methods for process monitoring and fault diagnosis in multi-station assembly systems. Furthermore, the authors discuss the applications of the methods proposed and present suggestions for future studies in data mining for quality control in product assembly.

Design/methodology/approach

This paper provides an outline of data-driven process monitoring and fault diagnosis methods for reduction in variation. The development of statistical process monitoring techniques and diagnosis methods, such as pattern matching, estimation-based analysis and artificial intelligence-based diagnostics, is introduced.
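A minimal sketch of one such statistical process monitoring technique, Hotelling's T² applied to multivariate station measurements (the data and control limit here are hypothetical, not taken from the review):

```python
import numpy as np

# Hypothetical in-control reference data: 200 assemblies x 4 key dimensions
rng = np.random.default_rng(1)
reference = rng.normal(size=(200, 4))
mu = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def t2(x):
    """Hotelling's T^2 distance of a new observation from the in-control mean."""
    d = x - mu
    return float(d @ cov_inv @ d)

# Monitor new assemblies: flag any observation whose T^2 exceeds the chosen limit
new_obs = rng.normal(loc=[0, 0, 1.5, 0], size=(5, 4))  # third dimension shifted
limit = 14.9  # illustrative upper control limit (in practice derived from the chi-square/F distribution)
for i, x in enumerate(new_obs):
    flag = "ALARM" if t2(x) > limit else "ok"
    print(f"assembly {i}: T2 = {t2(x):.2f} [{flag}]")
```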

Findings

A classification structure for data-driven process control techniques and the limitations of their applications in multi-station assembly processes are discussed. From the perspective of the engineering requirements of real, dynamic, nonlinear and uncertain assembly systems, future trends in sensing system location, data mining and data fusion techniques for variation reduction are suggested.

Originality/value

This paper reveals the development of process monitoring and fault diagnosis techniques, and their applications in variation reduction in multi-station assembly.

Details

Assembly Automation, vol. 39 no. 4
Type: Research Article
ISSN: 0144-5154

Article
Publication date: 27 July 2012

Anupam Das, J. Maiti and R.N. Banerjee

Abstract

Purpose

Monitoring of a process, leading to the detection of faults and determination of their root causes, is essential for the production of consistently good quality end products with improved yield. The history of process monitoring fault detection (PMFD) strategies can be traced back to the 1930s. Since then, various tools, techniques and approaches have been developed and applied in diverse fields. The purpose of this paper is to review, categorize, describe and compare the various PMFD strategies.

Design/methodology/approach

A taxonomy was developed to categorize PMFD strategies, based on the type of technique employed in devising each strategy. The PMFD strategies are then discussed in detail, with emphasis on their areas of application, and comparatively evaluated against a set of commonly identified issues. A general framework common to all PMFD strategies is presented and, lastly, the future scope of research is discussed.

Findings

The techniques employed for PMFD are primarily of three types: data-driven techniques, such as statistical model-based and artificial intelligence-based techniques; a priori knowledge-based techniques; and hybrid models, with a strong dominance of the first type. The factors that should be considered in developing a PMFD strategy are ease of development, diagnostic ability, fault detection speed, robustness to noise, generalization capability, and handling of nonlinearity. The review reveals that no single strategy can address all aspects of process monitoring and fault detection efficiently, and that the different techniques from the various PMFD strategies need to be combined to devise a more efficient PMFD strategy.
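As a minimal sketch of a data-driven strategy of the first type named above, a PCA residual-based (Q statistic) fault detector, with hypothetical data and an empirical threshold rather than anything from the review:

```python
import numpy as np

# Hypothetical normal-operation data: 300 samples x 6 correlated process variables
rng = np.random.default_rng(2)
base = rng.normal(size=(300, 2))
normal_data = base @ rng.normal(size=(2, 6)) + 0.1 * rng.normal(size=(300, 6))

# Fit PCA on centred data and keep the components explaining most of the variance
mean = normal_data.mean(axis=0)
X = normal_data - mean
U, s, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:2].T  # loading matrix of the retained principal components

def spe(x):
    """Squared prediction error (Q statistic): residual not explained by the PCA model."""
    r = (x - mean) - P @ (P.T @ (x - mean))
    return float(r @ r)

threshold = np.quantile([spe(x) for x in normal_data], 0.99)  # empirical 99% limit
faulty = normal_data[0] + np.array([0, 0, 0, 2.0, 0, 0])      # injected sensor fault
print(f"SPE normal = {spe(normal_data[0]):.3f}, faulty = {spe(faulty):.3f}, limit = {threshold:.3f}")
```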

Research limitations/implications

The review documents the existing strategies for PMFD with an emphasis on finding out the nature of the strategies, data requirements, model building steps, applicability and scope for amalgamation. The review helps future researchers and practitioners to choose appropriate techniques for PMFD studies for a given situation. Further, future researchers will get a comprehensive but precise report on PMFD strategies available in the literature to date.

Originality/value

The review starts by identifying key indicators of PMFD, and a taxonomy is proposed. An analysis was conducted to identify the pattern of published articles on PMFD, followed by the evolution of PMFD strategies. Finally, a general framework for PMFD strategies is given for future researchers and practitioners.

Details

International Journal of Quality & Reliability Management, vol. 29 no. 7
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 25 September 2009

Hussein A. Abdou and John Pointon

Abstract

Purpose

The main aims of this paper are: first, to investigate how decisions are currently made within the Egyptian public sector environment; and, second, to determine whether the decision making can be significantly improved through the use of credit scoring models. A subsidiary aim is to analyze the impact of different proportions of sub‐samples of accepted credit applicants on both efficient decision making and the optimal choice of credit scoring techniques.

Design/methodology/approach

Following an investigative phase to identify relevant variables in the sector, the research proceeds to an evaluative phase, in which an analysis is undertaken of real data sets (comprising 1,262 applicants) provided by the commercial public sector banks in Egypt. Two types of neural nets are used, and correspondingly two types of conventional techniques are applied. The use of two evaluative measures/criteria is investigated: the average correct classification (ACC) rate and the estimated misclassification cost (EMC) under different misclassification cost (MC) ratios.
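As a rough illustration of the two evaluative criteria (the confusion matrix, cost ratio and the simplified per-applicant form of the EMC below are assumptions for this sketch, not the paper's definitions or data):

```python
def acc_and_emc(tp, tn, fp, fn, cost_fn=5.0, cost_fp=1.0):
    """ACC rate and a simple per-applicant estimated misclassification cost.

    ACC = (TP + TN) / N
    EMC here is taken as the average cost per applicant, weighting the two
    error types by a chosen misclassification-cost (MC) ratio cost_fn : cost_fp.
    """
    n = tp + tn + fp + fn
    acc = (tp + tn) / n
    emc = (cost_fn * fn + cost_fp * fp) / n
    return acc, emc

# Hypothetical scoring model evaluated on 1,000 applicants under a 5:1 MC ratio
acc, emc = acc_and_emc(tp=620, tn=280, fp=60, fn=40, cost_fn=5.0, cost_fp=1.0)
print(f"ACC = {acc:.1%}, EMC = {emc:.3f} per applicant")
```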

Findings

The currently used approach is based on personal judgement. Statistical scoring techniques are shown to provide more efficient classification results than the currently used judgemental techniques. Furthermore, neural net models give better ACC rates, but the optimal choice of techniques depends on the MC ratio. The probabilistic neural net (PNN) is preferred for a lower cost ratio, whilst the multiple discriminant analysis (MDA) is the preferred choice for a higher ratio. Thus, there is a role for MDA as well as neural nets. There is evidence of statistically significant differences between advanced scoring models and conventional models.

Research limitations/implications

Future research could investigate the use of further evaluative measures, such as the area under the ROC curve and GINI coefficient techniques and more statistical techniques, such as genetic and fuzzy programming. The plan is to enlarge the data set.

Practical implications

There is a huge financial benefit from applying these scoring models to Egyptian public sector banks, for at present only judgemental techniques are being applied in credit evaluation processes. Hence, these techniques can be introduced to support the bank credit decision makers.

Originality/value

This paper reveals a set of key variables culturally relevant to the Egyptian environment, and provides an evaluation of personal loans in the Egyptian public sector banking environment, in which (to the best of the authors' knowledge) no other authors have studied the use of sophisticated statistical credit scoring techniques.

Details

International Journal of Managerial Finance, vol. 5 no. 4
Type: Research Article
ISSN: 1743-9132

Article
Publication date: 1 October 1994

Michael Wood

Abstract

The techniques of statistical process control (SPC) are designed to monitor production processes in order to prevent the production of waste and improve the quality of future output. The emphasis is on the prevention of problems before they occur instead of simply revealing and correcting past mistakes. SPC is now increasingly used for service processes as well as the manufacturing processes for which it was originally developed. This raises the question of whether the same benefits can be achieved, and whether the techniques need to be refined in any way, if they are to be equally useful in the service arena. Looks at a number of examples of the application of SPC techniques to service processes. Argues that there are features of many service processes which have implications for the way SPC should be applied. Proposes a set of guidelines for systems for the statistical monitoring of service processes. Argues that standard SPC techniques can yield substantial benefits for service processes, provided that users remember these guidelines. In particular, argues that the use of the word “control”, and so the phrase “statistical process control”, is often inappropriate. Finally, suggests that some of the conclusions may be equally applicable to many production processes.
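As a minimal sketch of applying a standard SPC technique to a service process, an attribute p-chart on hypothetical daily error proportions (not data from the paper):

```python
import numpy as np

# Hypothetical service data: invoices processed per day and the number containing errors
processed = np.array([120, 135, 118, 142, 130, 125, 138, 128, 133, 127])
errors = np.array([6, 4, 7, 5, 3, 9, 4, 6, 5, 14])

p_bar = errors.sum() / processed.sum()            # overall error proportion
sigma = np.sqrt(p_bar * (1 - p_bar) / processed)  # per-day standard error
ucl = p_bar + 3 * sigma                           # upper control limit (3-sigma)
lcl = np.clip(p_bar - 3 * sigma, 0, None)         # lower limit, floored at zero

p = errors / processed
for day, (prop, hi, lo) in enumerate(zip(p, ucl, lcl), start=1):
    status = "investigate" if prop > hi or prop < lo else "in control"
    print(f"day {day:2d}: p = {prop:.3f} (limits {lo:.3f}-{hi:.3f}) -> {status}")
```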

Details

International Journal of Service Industry Management, vol. 5 no. 4
Type: Research Article
ISSN: 0956-4233

Article
Publication date: 24 April 2007

Nigel Peter Grigg and Lesley Walls

Abstract

Purpose

The paper aims to describe a recently completed research project on the use of statistical quality control (SQC) methods in the context of food and drinks manufacturing. It discusses issues surrounding the successful uptake of such methods, including organisational motivation, possible application, costs and benefits, critical success factors and the central importance of prerequisite statistical thinking (ST).

Design/methodology/approach

A three stage, mixed methods approach was adopted, incorporating surveys augmented by case studies and key informant interviews with industry managers and providers of relevant industry training. All data were combined to produce the final model.

Findings

The paper finds that SQC methods are of relevance in the industry, providing the process is appropriate and management have a basic awareness of the fundamentals of ST. Certain organisational and external factors were found to progressively reduce the effectiveness with which such methods are introduced and sustained. The paper ends with discussion of an original model, developed from the research, which illustrates the “filters” that tend to reduce the effectiveness with which methods are used in the industry, with a discussion of how each can be overcome.

Research limitations/implications

The research in this paper is focused on the European food manufacturing and legislative context, and predominantly the UK. Low survey response numbers necessitated a nonparametric approach to the survey analysis.

Practical implications

The paper shows that the filters model is of generic applicability and interest to current and future managers, and to other researchers in this area.

Originality/value

The paper addresses the “how to” of SQC, and examines an industry where there is not yet widespread literature on the benefits of SQC methods. It presents an original model of the barriers to effective use of statistical methods within a process knowledge and improvement cycle.

Details

International Journal of Quality & Reliability Management, vol. 24 no. 4
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 26 April 2013

Erastus Karanja and Jigish Zaveri

Abstract

Purpose

MIS researchers have consistently adopted the survey-based research method when investigating MIS and related phenomena, making it one of the most widely used research methods in MIS research. This study seeks to revisit some of the inherent characteristics of the survey-based research method with the aim of improving the quality, replication, and validation of results in MIS survey-based studies. Additionally, this study provides information on the most prevalent analytical and statistical tools used in MIS survey research studies.

Design/methodology/approach

In this research, the authors adopt the content analysis technique. The choice of content analysis is premised on the desire to investigate the sources of survey data, units of analysis, research methods, and statistical tools used in MIS research with the aim of improving empirical research in the MIS discipline.
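A minimal sketch of the kind of tallying such a content analysis involves (the coding scheme and records below are hypothetical, not the study's data):

```python
from collections import Counter

# Hypothetical coding sheet: each surveyed article coded for data source, unit of analysis and tool(s)
coded_articles = [
    {"source": "US", "unit": "organization", "tools": ["SEM", "PLS"]},
    {"source": "US", "unit": "individual", "tools": ["LISREL"]},
    {"source": "EU", "unit": "organization", "tools": ["regression"]},
    {"source": "US", "unit": "organization", "tools": ["PLS"]},
]

tool_counts = Counter(t for a in coded_articles for t in a["tools"])
source_counts = Counter(a["source"] for a in coded_articles)
print("statistical tools:", tool_counts.most_common())
print("data sources:", source_counts.most_common())
```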

Findings

The results show the prevalent sources of data, the dominant units of analysis, the most commonly used analytical research methods, and the statistical tools adopted by many MIS researchers. The results indicate that many MIS researchers get their data from US sources, although researchers are increasingly acquiring data from other countries. Also, the results reveal that most MIS survey researchers are using SEM, LISREL, and PLS statistical methods and tools.

Practical implications

The paper concludes with recommendations and implications on how to inform and retool upcoming and existing researchers on the current and future MIS research tools and methods. Editors should ensure that MIS researchers provide as much information as possible about the sources of data, the dominant units of analysis, the analytical research methods used, and the statistical tools adopted; these will demonstrate the rigor of the research process and enable replication, validation, and extension of the research works.

Originality/value

The paper presents the results of a content analysis of 749 survey-based research articles published between 1990 and 2010 in nine mainstream MIS journals. Prior studies have broadly addressed aspects of MIS research methodology, such as investigating MIS research methods, ranking them, and generating a taxonomy of MIS research methodology. The results of this study make a case for reporting both the analytical method(s) and the statistical tools used by MIS researchers, to aid in replicating, validating, and extending the resultant findings of their survey-based research.

Details

Journal of Systems and Information Technology, vol. 15 no. 2
Type: Research Article
ISSN: 1328-7265
