Search results

1 – 10 of over 9000
Article
Publication date: 2 October 2023

Nicole King and Ian Asquith


Abstract

Purpose

This study aims to evaluate the quality of information recorded in Behaviour Monitoring Charts (BMC) for Behaviours that Challenge (BtC) in dementia in an older adult inpatient dementia service in the North of England (Aim I) and to understand staff perceptions and experiences of completing BMC for BtC in dementia (Aim II).

Design/methodology/approach

Descriptive statistics and graphs were used to analyse and interpret quantitative data gathered from BMC (Aim I) and Likert-scale survey responses (Aim II). Thematic analysis (Braun and Clarke, 2006) was used to analyse and interpret qualitative data collected from responses to open-ended survey questions and, separately, focus group discussions (Aim II).

Findings

Analysis of the BMCs revealed that some of the data recorded relating to antecedents, behaviours and consequences lacked richness and used vague language (e.g. "gave reassurance"), which limited its clinical utility. Overall, participants and respondents found BMCs to be problematic. For them, completing BMCs was not viewed as worthwhile, the processes that followed their completion were unclear, and the charts left staff feeling disempowered in the systemic hierarchy of an inpatient setting.

Originality/value

Functional analysis of BMCs helps identify and inform appropriately tailored interventions for BtC in dementia. Understanding how BMCs are used and how staff perceive them provides a unique opportunity to improve them. Improving BMCs will support better functional analysis of BtC, thus allowing for more tailored interventions to meet the needs of people with dementia.

Details

Working with Older People, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1366-3666


Open Access
Article
Publication date: 17 August 2021

Abeer A. Zaki, Nesma A. Saleh and Mahmoud A. Mahmoud


Abstract

Purpose

This study aims to assess the effect of updating the Phase I data – to enhance the parameters' estimates – on the detection power of control charts designed to monitor social networks.

Design/methodology/approach

A dynamic version of the degree corrected stochastic block model (DCSBM) is used to model the network. Both the Shewhart and exponentially weighted moving average (EWMA) control charts are used to monitor the model parameters. A performance comparison is conducted for each chart when designed using both fixed and moving windows of networks.
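
For readers unfamiliar with the two monitoring schemes being compared, the sketch below applies Shewhart and EWMA charts to a generic scalar network statistic whose in-control mean and standard deviation are estimated from a fixed Phase I window. It is an illustration only, not the authors' DCSBM implementation, and all parameter values are arbitrary.

```python
# Minimal sketch (not the authors' DCSBM implementation): Shewhart and EWMA
# monitoring of a scalar network statistic, with in-control parameters
# estimated from a fixed window of Phase I observations.
import numpy as np

def shewhart_limits(phase1, L=3.0):
    """Fixed-window Shewhart limits from Phase I observations."""
    mu, sigma = phase1.mean(), phase1.std(ddof=1)
    return mu - L * sigma, mu + L * sigma

def ewma_first_signal(phase1, phase2, lam=0.2, L=2.7):
    """Index of the first EWMA out-of-control signal, or None if no signal."""
    mu, sigma = phase1.mean(), phase1.std(ddof=1)
    z = mu
    for t, x in enumerate(phase2, start=1):
        z = lam * x + (1 - lam) * z
        # exact (time-varying) EWMA variance at time t
        var_z = sigma**2 * lam / (2 - lam) * (1 - (1 - lam) ** (2 * t))
        if abs(z - mu) > L * np.sqrt(var_z):
            return t
    return None

rng = np.random.default_rng(0)
phase1 = rng.normal(10.0, 2.0, size=200)   # in-control Phase I statistics
phase2 = rng.normal(11.0, 2.0, size=100)   # Phase II data with a small shift
print(shewhart_limits(phase1))
print(ewma_first_signal(phase1, phase2))
```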

Findings

Our results show that continuously updating the parameters' estimates during the monitoring phase delays the Shewhart chart's detection of network anomalies compared with the fixed window approach, while the EWMA chart's performance is either unaffected or worse, depending on the updating technique. Generally, the EWMA chart performs uniformly better than the Shewhart chart for all shift sizes. We recommend using the EWMA chart when monitoring networks modeled with the DCSBM, with a sufficiently small to moderate fixed window size used to estimate the unknown model parameters.

Originality/value

This study shows that the widespread recommendation in the literature to continuously update the Phase I data during the monitoring phase in order to enhance control chart performance cannot generally be extended to social network monitoring, especially when using the DCSBM. That is to say, the effect of continuously updating the parameters' estimates depends highly on the nature of the process being monitored.

Details

Review of Economics and Political Science, vol. 6 no. 4
Type: Research Article
ISSN: 2356-9980


Article
Publication date: 3 August 2015

Anupam Das, S. C. Mondal, J. J. Thakkar and J. Maiti


Abstract

Purpose

The purpose of this paper is to build a monitoring scheme in order to detect and subsequently eliminate abnormal behavior of the concerned casting process so as to produce worm wheels with good quality characteristics.

Design/methodology/approach

In this study, a process monitoring strategy has been devised for a centrifugal casting process using a data-based multivariate statistical technique, namely partial least squares regression (PLSR).
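
As a rough illustration of what a PLSR-based monitoring scheme can look like, the sketch below fits a PLS regression model on in-control data and charts the squared prediction error of new batches against an empirical limit. It is a generic sketch with synthetic data, not the authors' casting model; the variable names and the 99th-percentile limit are arbitrary choices.

```python
# Generic PLSR monitoring sketch: fit the model on in-control data and chart
# the squared prediction error (SPE) of new batches against an empirical limit.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X_ref = rng.normal(size=(100, 6))                                     # in-control process variables
y_ref = X_ref[:, :2].sum(axis=1) + rng.normal(scale=0.1, size=100)    # quality characteristic

pls = PLSRegression(n_components=2).fit(X_ref, y_ref)
spe_ref = (y_ref - pls.predict(X_ref).ravel()) ** 2
limit = np.percentile(spe_ref, 99)                                    # empirical control limit

X_new = rng.normal(size=(5, 6))
y_new = X_new[:, :2].sum(axis=1) + np.array([0.0, 0.0, 0.0, 2.0, 0.0])  # one faulty batch
spe_new = (y_new - pls.predict(X_new).ravel()) ** 2
print(spe_new > limit)                                                # the fourth batch should signal
```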

Findings

Based on a case study, the PLSR model constructed for this study appears to mimic the actual process quite well, as is evident from the various performance criteria (prediction and analysis of variance results).

Practical implications

The practical implication of the study involves the development of a software application with a back-end database, interfaced with a computer program based on the PLSR algorithm, for estimating the model parameters and the control limit of the monitoring chart. This would enable easy, real-time detection of faults.

Originality/value

This study concerns the application of a PLSR-based monitoring strategy to a centrifugal casting process engaged in the production of worm wheels.

Details

International Journal of Quality & Reliability Management, vol. 32 no. 7
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 27 July 2012

Anupam Das, J. Maiti and R.N. Banerjee



Abstract

Purpose

Monitoring of a process, leading to the detection of faults and determination of their root causes, is essential for the production of consistently good quality end products with improved yield. The history of process monitoring and fault detection (PMFD) strategies can be traced back to the 1930s. Since then, various tools, techniques and approaches have been developed and applied in diverse fields. The purpose of this paper is to review, categorize, describe and compare the various PMFD strategies.

Design/methodology/approach

A taxonomy was developed to categorize PMFD strategies, based on the type of techniques employed to devise them. The PMFD strategies were then discussed in detail, with emphasis on their areas of application. Comparative evaluations of the PMFD strategies based on commonly identified issues were also carried out. A general framework common to all PMFD strategies is presented, and lastly the future scope of research is discussed.

Findings

The techniques employed for PMFD are primarily of three types: data-driven techniques, such as statistical model-based and artificial intelligence-based techniques; a priori knowledge-based techniques; and hybrid models, with the first type strongly dominating. The factors that should be considered in developing a PMFD strategy are ease of development, diagnostic ability, fault detection speed, robustness to noise, generalization capability, and handling of nonlinearity. The review reveals that no single strategy can address all aspects of process monitoring and fault detection efficiently, and that the different techniques from the various PMFD strategies need to be combined to devise a more efficient PMFD strategy.

Research limitations/implications

The review documents the existing strategies for PMFD with an emphasis on the nature of the strategies, their data requirements, model-building steps, applicability and scope for amalgamation. The review helps future researchers and practitioners choose appropriate techniques for PMFD studies in a given situation. Further, future researchers will get a comprehensive but concise report on the PMFD strategies available in the literature to date.

Originality/value

The review starts by identifying key indicators of PMFD for review, and a taxonomy is proposed. An analysis is conducted to identify the pattern of published articles on PMFD, followed by the evolution of PMFD strategies. Finally, a general framework for PMFD strategies is given for future researchers and practitioners.

Details

International Journal of Quality & Reliability Management, vol. 29 no. 7
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 9 October 2019

Soham Chakraborty and Pathik Mandal


Abstract

Purpose

Modeling a process and drawing inferences about it using growth models are problems of enormous practical importance. The growth behavior of the melting point (MP) during hydrogenation is found to be nonlinear. The purpose of this paper is to propose a control chart-based method for on-line detection of a growth process becoming dead.

Design/methodology/approach

The nonlinear growth kinetics of MP during hydrogenation is modeled as a random walk with drift. In earlier work, the random walk model was developed based on a linear approximation and the control chart was constructed from this approximate model. Here, an alternative model that does not rely on any such approximation is proposed. The variable drift component of the random walk is estimated using an innovative method of instrumental variable estimation. The model thus obtained is then used to construct a new control chart.
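
The sketch below conveys the basic idea of charting the increments of a random-walk-with-drift growth process to flag a batch whose growth has stopped. It uses a plain mean and standard deviation estimate of the drift rather than the authors' instrumental variable method, and the three-in-a-row rule is purely illustrative.

```python
# Simplified sketch: estimate the in-control drift of the melting-point random
# walk from reference batches, then flag a monitored batch as "dead" when its
# per-step increments stay below the lower control limit.
import numpy as np

rng = np.random.default_rng(2)

def simulate_batch(n, drift, sigma=0.05):
    """Random walk with drift: x_t = x_{t-1} + drift + noise_t."""
    return np.cumsum(drift + rng.normal(scale=sigma, size=n))

reference = np.concatenate([np.diff(simulate_batch(50, drift=0.2)) for _ in range(20)])
mu, sigma = reference.mean(), reference.std(ddof=1)
lcl = mu - 3 * sigma                                   # lower limit on per-step growth

dead = np.diff(simulate_batch(50, drift=0.0))          # a batch whose growth has stopped
run = 0
for t, d in enumerate(dead, start=1):
    run = run + 1 if d < lcl else 0
    if run >= 3:                                       # three successive low increments
        print(f"batch flagged as dead at step {t}")
        break
```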

Findings

It is shown that both the control charts are able to detect dead batches satisfactorily, but the new chart is superior to the earlier one.

Originality/value

The authors are not aware of any relevant literature that provides an implementable and practitioner-friendly approach to modeling the usually cumbersome variance function using the signal-to-noise ratio and then using it to estimate the parameters of a nonlinear dynamic growth model.

Details

International Journal of Quality & Reliability Management, vol. 36 no. 10
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 12 October 2021

Shovan Chowdhury, Amarjit Kundu and Bidhan Modok


Abstract

Purpose

As an alternative to the standard p and np charts and their various modifications, beta control charts are used in the literature for monitoring proportion data. These charts generally use the average of proportions to set up the control limits, assuming the in-control parameters are known. The purpose of the paper is to propose a control chart for detecting shift(s) in the percentiles of a beta-distributed process when the in-control parameters are unknown. Such situations arise when a specific percentile of the proportion of conforming or non-conforming units is the quality parameter of interest.

Design/methodology/approach

A parametric bootstrap method is used to develop the control chart for monitoring percentiles of a beta-distributed process when the in-control parameters are unknown. Extensive Monte Carlo simulations are conducted for various combinations of percentiles, false-alarm rates and sample sizes to evaluate the in-control performance of the proposed bootstrap control charts in terms of average run length (ARL). The out-of-control behavior and performance of the proposed bootstrap percentile chart is thoroughly investigated for several choices of shifts in the parameters of the beta distribution. The proposed chart is finally applied to two skewed data sets for illustration.
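
To make the bootstrap construction concrete, here is a stripped-down sketch of one way such limits can be obtained for a single subgroup percentile. The subgroup size, target percentile and false-alarm rate are arbitrary, and the scheme is a simplification, not the authors' exact chart.

```python
# Simplified parametric bootstrap limits for a subgroup percentile of
# beta-distributed proportions (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
p, n, B, alpha = 0.10, 30, 2000, 0.005        # percentile, subgroup size, bootstrap reps, false-alarm rate

phase1 = stats.beta.rvs(2.0, 8.0, size=200, random_state=rng)      # in-control proportions
a_hat, b_hat, _, _ = stats.beta.fit(phase1, floc=0, fscale=1)       # MLE with support fixed to (0, 1)

# bootstrap the sampling distribution of the subgroup p-th percentile
boot = stats.beta.rvs(a_hat, b_hat, size=(B, n), random_state=rng)
boot_pct = np.percentile(boot, 100 * p, axis=1)
lcl, ucl = np.quantile(boot_pct, [alpha / 2, 1 - alpha / 2])

subgroup = stats.beta.rvs(2.0, 4.0, size=n, random_state=rng)       # process with a parameter shift
stat = np.percentile(subgroup, 100 * p)
print(f"LCL={lcl:.4f} UCL={ucl:.4f} statistic={stat:.4f} signal={not lcl <= stat <= ucl}")
```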

Findings

The simulated values of the in-control ARL are found to be close to the theoretical results, implying that the proposed chart for percentiles performs well with both positively and negatively skewed data. Also, the out-of-control ARL values for the percentiles decrease sharply with both downward and upward small, medium and large shifts in the parameters, indicating that the chart is effective in detecting shifts in the parameters. However, the speed of detection varies depending on the type of shift, the parameters and the percentile being considered. The proposed chart is found to be effective in comparison with the Shewhart-type chart and the bootstrap-based unit gamma chart.

Originality/value

It is worthwhile to mention that the beta control charts proposed in the literature use the average of proportions to set up the control limits. In practice, however, a specific percentile of the proportion of conforming or non-conforming items can be more useful as the quality parameter of interest than the average. To the best of our knowledge, no research in the literature addresses beta control charts for percentiles of proportions. Moreover, the proposed control chart assumes the in-control parameters to be unknown, and hence captures the additional variability introduced into the monitoring scheme through parameter estimation. In this sense, the proposed chart is original and unique.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 10
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 31 January 2022

Simone Massulini Acosta and Angelo Marcio Oliveira Sant'Anna


Abstract

Purpose

Process monitoring is a way to manage the quality characteristics of products in manufacturing processes. Several process monitoring approaches based on machine learning algorithms have been proposed in the literature and have gained the attention of many researchers. In this paper, the authors developed machine learning-based control charts for monitoring the fraction of non-conforming products in smart manufacturing. This study proposed a relevance vector machine using a Bayesian sparse kernel, optimized by a differential evolution algorithm, for efficient monitoring in manufacturing.

Design/methodology/approach

A new approach to data analysis, modelling and monitoring in the manufacturing industry was developed. This study developed a relevance vector machine using the Bayesian sparse kernel technique to improve on the support vector machine, which is used for both regression and classification problems. The authors compared the performance of the proposed relevance vector machine with other machine learning algorithms, such as the support vector machine, artificial neural network and beta regression model. The proposed approach was evaluated in terms of average run length under different shift scenarios using Monte Carlo simulation.
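
Since the relevance vector machine is not part of mainstream libraries such as scikit-learn, the sketch below uses a support vector regressor, one of the comparison models named above, as a stand-in to show the general pattern of an ML-based chart for the fraction non-conforming: fit on in-control data and chart the prediction residuals. All data and limits are synthetic and illustrative.

```python
# Illustrative ML-based monitoring of the fraction non-conforming, using an SVR
# stand-in (the paper's relevance vector machine is not implemented here).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 4))                                    # process variables
frac = 1 / (1 + np.exp(-(X[:, 0] - X[:, 1])))                    # true fraction non-conforming
frac = np.clip(frac + rng.normal(scale=0.02, size=300), 0, 1)

model = SVR(kernel="rbf", C=10.0).fit(X[:200], frac[:200])       # Phase I fit
resid = frac[:200] - model.predict(X[:200])
ucl = resid.mean() + 3 * resid.std(ddof=1)                       # 3-sigma residual limits
lcl = resid.mean() - 3 * resid.std(ddof=1)

new_resid = frac[200:] - model.predict(X[200:])                  # Phase II monitoring
print(np.where((new_resid > ucl) | (new_resid < lcl))[0])        # indices of any signals
```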

Findings

The authors analyse a real case study in a manufacturing company based on the best-performing machine learning algorithms. The results indicate that the proposed relevance vector machine-based process monitoring charts are excellent quality tools for monitoring defective products in the manufacturing process. A comparative analysis with four machine learning models is used to evaluate the performance of the proposed approach. The relevance vector machine performs slightly better than the support vector machine, artificial neural network and beta models.

Originality/value

This research differs from others by providing approaches for monitoring defective products. Machine learning-based control charts are used to monitor product failures in a smart manufacturing process. In addition, the key contribution of this study is to develop different models for fault detection and to identify any change point in the manufacturing process. Moreover, the authors' research indicates that machine learning models are adequate tools for modelling and monitoring the fraction of non-conforming products in an industrial process.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 3
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 21 August 2007

P. Castagliola, G. Celano and S. Fichera


Abstract

Purpose

The aim of this study is to present the economic‐statistical design of an EWMA control chart for monitoring the process dispersion.

Design/methodology/approach

The optimal economic-statistical design of the S EWMA chart was determined for a wide benchmark of examples organized as a two-level factorial design and was compared with the designs obtained for the S Shewhart chart. Both charts were designed so that the same number of false alarms (i.e. the same in-control Average Run Length) is expected.
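
For orientation, the snippet below shows the bare mechanics of an EWMA chart applied to the subgroup standard deviation. It omits any transformation of the S statistic and the economic-statistical optimization of the design parameters, which are the actual subject of the paper; all constants are arbitrary.

```python
# Bare-bones EWMA chart on the subgroup standard deviation (illustration only;
# no transformation of S and no economic-statistical design optimization).
import numpy as np

rng = np.random.default_rng(5)
lam, L, n = 0.1, 3.0, 5

phase1_s = rng.normal(0.0, 1.0, size=(500, n)).std(axis=1, ddof=1)   # in-control subgroup S values
mu_s, sigma_s = phase1_s.mean(), phase1_s.std(ddof=1)

z = mu_s
for t in range(1, 101):
    scale = 1.0 if t <= 50 else 1.5                 # dispersion increase from sample 51 onwards
    s = rng.normal(0.0, scale, size=n).std(ddof=1)
    z = lam * s + (1 - lam) * z
    half_width = L * sigma_s * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    if abs(z - mu_s) > half_width:
        print(f"dispersion shift signalled at sample {t}")
        break
```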

Findings

The S EWMA allows significant hourly cost savings to be achieved for the entire set of process scenarios with respect to the S Shewhart; a mean percentage cost saving of 6.77 per cent is obtained for processes characterized by a reduction in process dispersion (i.e. processes whose natural variability is reduced through an external technological intervention), whereas up to a 9.78 per cent saving is achieved for processes whose dispersion is increased by the occurrence of an undesired special cause.

Practical implications

The proposed S EWMA chart can be considered an effective tool when statistical process control procedures are to be implemented on a process with the aim of monitoring its dispersion.

Originality/value

In the literature, the economic design of EWMA charts covers only process cost evaluation when the sample mean is monitored; here, the study is extended to the sample standard deviation to investigate whether the EWMA scheme still outperforms the Shewhart chart. An extensive analysis is proposed to evaluate the influence of the process operating parameters on the EWMA chart design variables.

Details

Journal of Quality in Maintenance Engineering, vol. 13 no. 3
Type: Research Article
ISSN: 1355-2511


Article
Publication date: 1 March 1997

O.O. Atienza, B.W. Ang and L.C. Tang



Abstract

Explores the relationships between statistical process control (SPC) and forecasting procedures. While the two procedures are often applied in different contexts, a careful analysis shows that they go through the same stages, culminating in process or forecast monitoring. This apparent similarity between SPC and forecasting enables a general framework to be established for model-based SPC. Discusses some forecasting procedures applicable to SPC and underlines the importance of SPC concepts in forecasting.

Details

International Journal of Quality Science, vol. 2 no. 1
Type: Research Article
ISSN: 1359-8538


Book part
Publication date: 26 October 2017

Matthew Lindsey and Robert Pavur


Abstract

Control charts are designed to be effective in detecting a shift in the distribution of a process. Typically, these charts assume that the data for these processes follow an approximately normal distribution or some known distribution. However, if a data-generating process has a large proportion of zeros, that is, the data is intermittent, then traditional control charts may not adequately monitor these processes. The purpose of this study is to examine proposed control chart methods designed for monitoring a process with intermittent data to determine if they have a sufficiently small percentage of false out-of-control signals. Forecasting techniques for slow-moving/intermittent product demand have been extensively explored as intermittent data is common to operational management applications (Syntetos & Boylan, 2001, 2005, 2011; Willemain, Smart, & Schwarz, 2004). Extensions and modifications of traditional forecasting models have been proposed to model intermittent or slow-moving demand, including the associated trends, correlated demand, seasonality and other characteristics (Altay, Litteral, & Rudisill, 2012). Croston’s (1972) method and its adaptations have been among the principal procedures used in these applications. This paper proposes adapting Croston’s methodology to design control charts, similar to Exponentially Weighted Moving Average (EWMA) control charts, to be effective in monitoring processes with intermittent data. A simulation study is conducted to assess the performance of these proposed control charts by evaluating their Average Run Lengths (ARLs), or equivalently, their percent of false positive signals.
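
As a rough sketch of the kind of adaptation the chapter describes, the code below implements Croston's updating of non-zero demand sizes and inter-demand intervals and places a simple band around the resulting demand-rate estimate. The signalling rule and constants are illustrative and are not the chapter's actual chart design.

```python
# Croston-style demand-rate estimate for intermittent data, with an illustrative
# control band (not the chapter's chart design).
import numpy as np

def croston_rate(demand, alpha=0.1):
    """Return the Croston demand-rate estimate (size / interval) after each period."""
    size, interval, gap, rates = None, None, 1, []
    for d in demand:
        if d > 0:
            if size is None:
                size, interval = float(d), float(gap)     # initialize on first non-zero demand
            else:
                size = alpha * d + (1 - alpha) * size
                interval = alpha * gap + (1 - alpha) * interval
            gap = 1
        else:
            gap += 1
        rates.append(0.0 if size is None else size / interval)
    return np.array(rates)

rng = np.random.default_rng(6)
demand = (rng.random(200) < 0.2) * rng.poisson(5, size=200)   # roughly 80 per cent zero periods
rates = croston_rate(demand)

ref = rates[:100]                                             # in-control reference period
ucl = ref.mean() + 3 * ref.std(ddof=1)
lcl = max(ref.mean() - 3 * ref.std(ddof=1), 0.0)
print(np.where((rates[100:] > ucl) | (rates[100:] < lcl))[0]) # periods that signal
```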

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-78743-069-3

