Search results

1 – 10 of over 2000
Article
Publication date: 9 January 2024

Mahendra Saha, Pratibha Pareek, Harsh Tripathi and Anju Devi

Abstract

Purpose

The first aim is to develop time-truncated median control charts for the Rayleigh distribution (RD) and the generalized RD (GRD), respectively. The second is to evaluate the performance of the proposed attribute control chart in terms of the average run length (ARL), and the third is to include real-life examples illustrating the application of the proposed attribute control chart.

Design/methodology/approach

(1) Select a random sample of size n from each subgroup of the production process and put the items on test for a specified time t, taken as a fixed multiple of the target median lifetime; then count the number of failed items in each subgroup up to time t. (2) Using an np chart, define D, the number of failures, which is a random variable following the binomial distribution; a D = np chart is preferred over a p chart because the number of failures is used rather than the proportion of failures p. When the process is in control, the parameters of the binomial distribution are n and p0, respectively. (3) The process is said to be in control if LCL ≤ D ≤ UCL; otherwise, the process is said to be out of control. The LCL and UCL of the proposed control chart follow from this rule.
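
As a rough illustration of these steps, the sketch below builds the binomial limits for a Rayleigh lifetime truncated at a multiple of the in-control median; the truncation ratio a, subgroup size n and the 3-sigma-style width constant k are assumed values for demonstration, not the paper's design, which tunes the limits to a target ARL.

```python
from math import sqrt, comb

# For a Rayleigh lifetime, F(t) = 1 - exp(-t^2 / (2 sigma^2)) and the median
# is m0 = sigma * sqrt(2 ln 2), so truncating the test at t = a * m0 gives an
# in-control failure probability p0 = 1 - 2**(-a**2) (illustrative derivation).
a = 0.5          # assumed truncation ratio t = a * m0
n = 30           # assumed subgroup size
p0 = 1 - 2 ** (-a ** 2)

# np-chart limits: D = number of failures in a subgroup ~ Binomial(n, p0).
k = 3            # assumed width constant (the paper tunes limits to a target ARL)
mu, sd = n * p0, sqrt(n * p0 * (1 - p0))
LCL, UCL = max(0.0, mu - k * sd), mu + k * sd

# In-control ARL = 1 / P(D falls outside [LCL, UCL] when p = p0).
p_out = sum(comb(n, d) * p0**d * (1 - p0)**(n - d)
            for d in range(n + 1) if d < LCL or d > UCL)
print(f"p0={p0:.4f}  LCL={LCL:.2f}  UCL={UCL:.2f}  ARL0={1/p_out:.1f}")
```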

Findings

From the findings, it is concluded that the GRD has smaller ARL values than the RD for specified parameter values, indicating that the GRD detects out-of-control signals more quickly than the RD.

Research limitations/implications

The developed control chart is applicable when the real-life situation follows the RD or the GRD.

Social implications

Researchers can directly use the presented study to save consumers from accepting bad lots and to encourage producers to make good-quality products, so that society can benefit from their products.

Originality/value

This article deals with time-truncated attribute median control charts for non-normal distributions, namely the RD and the GRD. The structure of the proposed control chart is developed based on the median lifetime of the RD and the GRD, respectively.

Details

International Journal of Quality & Reliability Management, vol. 41 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 17 January 2023

Razieh Seirani, Mohsen Torabian, Mohammad Hassan Behzadi and Asghar Seif

Abstract

Purpose

The purpose of this paper is to present an economic–statistical design (ESD) for the Bayesian X̄ control chart based on the predictive distribution, with two types of prior distributions: informative and noninformative.

Design/methodology/approach

The design used in this study is based on determining the control chart of the predictive distribution and then its ESD. The newly proposed cost model considers the conjugate and Jeffreys prior distributions in calculating the expected total cycle time and the expected cost per cycle; finally, the optimal design parameters and related costs are compared with the fixed ratio sampling (FRS) mode.
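
For the predictive-distribution idea, here is a minimal sketch under simplifying assumptions (a normal quality characteristic with known sigma, a conjugate normal prior and an assumed false-alarm rate); the paper's cost model, shock mechanism and ESD optimization are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Assumed setting: N(mu, sigma^2) characteristic with known sigma and a
# conjugate N(mu0, tau0^2) prior on the process mean (all values illustrative).
sigma, n = 1.0, 5                  # known std dev, subgroup size
mu0, tau0 = 0.0, 0.5               # prior mean and std dev
phase1 = rng.normal(0.0, sigma, size=50)   # in-control reference data

# Posterior of the process mean after the reference sample (precision addition).
m = len(phase1)
post_var = 1.0 / (1.0 / tau0**2 + m / sigma**2)
post_mean = post_var * (mu0 / tau0**2 + phase1.sum() / sigma**2)

# Predictive distribution of a future subgroup mean: normal, with the posterior
# variance added to the sampling variance of X-bar.
pred_sd = np.sqrt(post_var + sigma**2 / n)
alpha = 0.0027                     # assumed false-alarm rate (3-sigma analogue)
LCL, UCL = stats.norm.ppf([alpha / 2, 1 - alpha / 2], post_mean, pred_sd)
print(f"predictive control limits: [{LCL:.3f}, {UCL:.3f}]")
```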

Findings

Numerical results show cost decreases for this Bayesian approach, with both the Jeffreys and the conjugate prior distributions, compared to the FRS mode. This result shows that the Bayesian approach based on predictive density works better than the classical approach. For the Bayesian approach, however, there is no significant difference between the results using the Jeffreys and the conjugate prior distributions. Using sensitivity analysis, the effects of the cost parameters, the shock model parameters and deviation from the mean on the optimal design parameters and related costs are investigated and discussed.

Practical implications

This research adds to the body of knowledge related to quality control of process monitoring systems. This paper may be of particular interest to quality system practitioners for whom the effect of the prior distribution of parameters on the quality characteristic distribution is important.

Originality/value

The economic–statistical design (ESD) of Bayesian control charts based on the predictive distribution is presented for the first time.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 8
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 17 January 2023

Salimeh Sadat Aghili, Mohsen Torabian, Mohammad Hassan Behzadi and Asghar Seif

Abstract

Purpose

The purpose of this paper is to develop a double-objective economic statistical design (ESD) of the X̄ control chart under Weibull failure properties with the Linex asymmetric loss function. The authors express the probability of type II error (β) as the statistical objective and the expected cost as the economic objective.

Design/methodology/approach

The design used in this study is a double-objective economic statistical design of the X̄ control chart with a Weibull shock model, applying Banerjee and Rahim's model for non-uniform and uniform schemes with the Linex asymmetric loss function. It yields the lowest average cost and β in uniform and non-uniform schemes under the Linex loss function, compared with the same schemes without the loss function.
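
The Linex (linear-exponential) loss has the standard form L(Δ) = b(e^{aΔ} − aΔ − 1). Below is a small sketch of its asymmetry, with illustrative shape parameters rather than the paper's values.

```python
import math

def linex_loss(delta: float, a: float = 1.0, b: float = 1.0) -> float:
    """Linex loss b * (exp(a * delta) - a * delta - 1).

    For a > 0, positive errors are penalized roughly exponentially while
    negative errors are penalized roughly linearly; a < 0 reverses this.
    The shape parameters a and b here are illustrative, not the paper's.
    """
    return b * (math.exp(a * delta) - a * delta - 1.0)

# Asymmetry: equal-sized errors of opposite sign incur different losses.
print(linex_loss(+0.5))   # ~0.149
print(linex_loss(-0.5))   # ~0.107
```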

Findings

Numerical results indicate that it is not possible to reduce the type II error and the costs at the same time: reducing the type II error increases the cost, and reducing the cost increases the type II error, and both objectives are very important. Which one takes priority is chosen based on the needs of the industry. These designs define a Pareto-optimal front of solutions that increases the flexibility and adaptability of the X̄ control chart in practice. When non-uniform schemes are used instead of uniform schemes, the average cost per unit time decreases on average, and when the loss function is applied, the average cost per unit time increases on average. This quantity also increases for double-objective schemes with the loss function compared to schemes without it, in both the uniform and non-uniform cases. The reason for this result is that the model underestimated the costs before the loss function was used.

Practical implications

This research adds to the body of knowledge related to flexibility in process quality control. This article may be of interest to quality systems experts in factories where the choice between reducing cost and reducing the statistical (type II) error can affect the production process.

Originality/value

The cost functions for double-objective uniform and non-uniform sampling schemes with the Weibull shock model based on the Linex loss function are presented for the first time.

Details

International Journal of Quality & Reliability Management, vol. 40 no. 8
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 4 July 2023

Karim Atashgar and Mahnaz Boush

Abstract

Purpose

When a process experiences an out-of-control condition, identifying the change point can lead practitioners to an effective root cause analysis. The change point is the time when a special cause(s) manifests itself in the process. In statistical process monitoring, when the chart signals an out-of-control condition, change point analysis is an important step in the root cause analysis of the process. This paper proposes artificial-neural-network-based models to identify the change point of a multistage process with the cascade property, in the case that the process is properly modeled by a simple linear profile.

Design/methodology/approach

In practice, many processes are modeled by a functional relationship rather than by a single random variable or a random vector; this approach to modeling is referred to as a profile in the statistical process control literature. In this paper, two models based on multilayer perceptron (MLP) and convolutional neural network (CNN) approaches are proposed for identifying the change point of the profile of a multistage process.
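
As a point of reference for the task the networks learn, the sketch below runs a classical least-squares change-point scan on simulated profile intercepts; it is a baseline analogue under assumed data, not the authors' MLP/CNN models.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate per-period intercept estimates of a simple linear profile that
# shifts at an unknown change point tau (all values illustrative).
T, tau, shift = 60, 40, 1.0
b0 = rng.normal(0.0, 1.0, size=T)
b0[tau:] += shift

def scan_change_point(x: np.ndarray) -> int:
    """Return the split point minimizing the pooled sum of squared errors."""
    sse = [np.sum((x[:t] - x[:t].mean()) ** 2) +
           np.sum((x[t:] - x[t:].mean()) ** 2)
           for t in range(2, len(x) - 1)]
    return 2 + int(np.argmin(sse))

print("estimated change point:", scan_change_point(b0))  # near tau = 40
```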

Findings

The capabilities of the proposed models are evaluated and compared using several numerical scenarios. The numerical analysis indicates that both proposed models identify the change point effectively in different scenarios. The comparative sensitivity analysis shows that the proposed convolutional network is superior to the MLP network.

Originality/value

To the best of the authors' knowledge, this is the first time that: (1) A model is proposed to identify the change point of the profile of a multistage process. (2) A convolutional neural network is modeled for identifying the change point of an out-of-control condition.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 27 September 2022

Souad El Houssaini, Mohammed-Alamine El Houssaini and Jamal El Kafi

Abstract

Purpose

In vehicular ad hoc networks (VANETs), the transmitted information is broadcast in a free-access environment. VANETs are therefore vulnerable to attacks that can directly perturb network performance and provoke a large fall in capability. The black hole attack is one such attack, in which the attacker node pretends to have the shortest path to the destination node and then drops the packets. This paper aims to present a new method to detect the black hole attack in real time in a VANET.

Design/methodology/approach

This method is based on capability indicators that are widely used in industrial production processes. If the capability indicators are greater than 1.33 and the stability ratio (Sr) is greater than 75%, the network is stable and the vehicles are communicating in an environment without the black hole attack. When the malicious nodes representing black hole attacks are activated one by one, the fall in capability becomes more visible and the network becomes unstable, out of control and unmanaged due to the presence of the attacks. The simulations were conducted using NS-3 for the network simulation and Simulation of Urban Mobility (SUMO) for generating the mobility model.
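
A minimal sketch of the indicator computation, using the standard definitions of Cp, Cpk and Cpm and the abstract's 1.33 threshold; the sample data, the specification limits and how the paper maps network metrics to specs are assumptions for illustration.

```python
import numpy as np

def capability(x: np.ndarray, lsl: float, usl: float, target: float):
    """Standard short-term capability indices Cp, Cpk and Cpm."""
    mu, s = x.mean(), x.std(ddof=1)
    cp = (usl - lsl) / (6 * s)
    cpk = min(usl - mu, mu - lsl) / (3 * s)
    cpm = (usl - lsl) / (6 * np.sqrt(s**2 + (mu - target)**2))
    return cp, cpk, cpm

# Illustrative use of the abstract's decision rule on assumed throughput data.
rng = np.random.default_rng(3)
throughput = rng.normal(5.0, 0.1, size=100)         # assumed Mbit/s samples
cp, cpk, cpm = capability(throughput, 4.5, 5.5, 5.0)
stable = all(c > 1.33 for c in (cp, cpk, cpm))      # Sr check omitted here
print(f"Cp={cp:.2f} Cpk={cpk:.2f} Cpm={cpm:.2f} -> no attack suspected: {stable}")
```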

Findings

The proposed mechanism does not impose significant overhead or extensive modifications to the Institute of Electrical and Electronics Engineers (IEEE) 802.11p standard or to the routing protocols. In addition, it can be implemented at any receiving node, which allows malicious nodes to be identified in real time. The simulation results demonstrated the effectiveness of the proposed scheme in detecting the impact of the attack very early, especially with the use of the short-term capability indicators (Cp, Cpk and Cpm) for each performance metric (throughput and packet loss ratio), which are efficient at quickly detecting small deviations over a very short time. The study also calculated another indicator of network stability, Sr, which allows a final decision on whether the network is under control and the vehicles are communicating in an environment without the black hole attack.

Originality/value

To the best of the authors' knowledge, the method of using capability indicators to detect the black hole attack in VANETs has not been presented previously in the literature.

Details

International Journal of Pervasive Computing and Communications, vol. 19 no. 5
Type: Research Article
ISSN: 1742-7371

Open Access
Article
Publication date: 22 May 2023

Rebecca Gilligan, Rachel Moran and Olivia McDermott

Abstract

Purpose

This study aims to utilise Six Sigma in an Irish-based red meat processor to reduce process variability and improve yields.

Design/methodology/approach

This is a case study within an Irish meat processor where the structured Define, Measure, Analyse, Improve and Control (DMAIC) methodology was utilised along with statistical analysis to highlight areas of the meat boning process to improve.

Findings

The project used Six Sigma to identify and measure areas of process variation. This resulted in eliminating over-trimming of meat cuts, improving process capability, increasing revenue and reducing meat wastage. In addition, key performance indicators, control charts, meat-cutting templates and smart cutting lasers were implemented.

Research limitations/implications

The study is one of the first Six Sigma applications in an Irish meat processor. The wider food and meat processing industries can leverage the learnings to understand, measure and minimise variation to enhance revenue.

Practical implications

Organisations can use this study to understand the benefits of adopting Six Sigma, particularly in the food industry, and how measuring process variation can affect quality.

Originality/value

This is the first practical case study of Six Sigma deployment in an Irish meat processor, and it can be used to benchmark how Six Sigma tools can aid in understanding variation, thus benefiting key performance metrics.

Details

The TQM Journal, vol. 35 no. 9
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 28 June 2022

Nursuhana Alauddin and Shu Yamada

Abstract

Purpose

The availability of daily assessment data in a centralized monitoring system at school provides the opportunity to detect unusual scores soon after the assessment is carried out. This paper introduces a model for the detection of unusual scores of individual students to immediately improve performances that deviate from a normal state.

Design/methodology/approach

A student's ability, a subject's difficulty level, a student's specific ability in a subject and the difficulty level of an assessment in a subject are selected as the factor effects of a linear ANOVA model. Through analysis of variance, a case study is conducted based on 330 assessment scores of primary-grade students retrieved from an international school in Japan.
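
A minimal sketch of such a linear ANOVA model on synthetic data; the factor structure follows the abstract, while the data, the statsmodels formula and the 3-sigma residual rule are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic stand-in for the assessment scores (the paper uses 330 real scores).
df = pd.DataFrame([(s, sub, a, rng.normal(70, 5))
                   for s in range(10) for sub in "ABC" for a in range(4)],
                  columns=["student", "subject", "assessment", "score"])

# Factor effects named in the abstract: student ability, subject difficulty,
# student-specific ability in a subject, assessment difficulty within a subject.
model = smf.ols("score ~ C(student) + C(subject) + C(student):C(subject)"
                " + C(subject):C(assessment)", data=df).fit()

# Flag scores whose residual falls below a 3-sigma lower control limit.
resid_sd = np.sqrt(model.mse_resid)
df["unusual"] = model.resid < -3 * resid_sd
print(df[df["unusual"]])
```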

Findings

A score below the lower control limit is recognized as an unusual score. Such a score can be detected immediately after a student sits for an assessment, helping students take immediate remedies based on daily assessment. This is demonstrated through a case study.

Originality/value

Detecting unusual scores of individual students with a linear model soon after each assessment enables immediate remedy and aligns with the daily management concept. The daily assessment data in a school system enable detection for individual students, subject-wise and assessment-wise, to improve student performance within the same academic year.

Details

The TQM Journal, vol. 35 no. 6
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 25 May 2023

Mohammad Shamsuzzaman, Mohammad Khadem, Salah Haridy, Ahm Shamsuzzoha, Mohammad Abdalla, Marwan Al-Hanini, Hamdan Almheiri and Omar Masadeh

Abstract

Purpose

The purpose of this study is to implement lean six sigma (LSS) methodology to improve the admission process in a higher education institute (HEI).

Design/methodology/approach

In this study, case study research methodology is adopted and implemented through an LSS define-measure-analyze-improve-control (DMAIC) framework.

Findings

The preliminary investigation showed that completing the whole admission process for a new student takes an average of 88 min, which is equivalent to a sigma level of about 0.71 based on the targeted admission cycle time of 60 min. The implementation of the proposed LSS approach increased the sigma level from 0.71 to 2.57, which corresponds to a reduction in the mean admission cycle time of around 55%. This substantial improvement is expected not only to provide an efficient admission process but also to enhance the satisfaction of students and employees and to raise the reputation of the HEI significantly.
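
For readers unfamiliar with sigma levels, the sketch below back-calculates the defect rates that would correspond to the reported 0.71 and 2.57 levels under the common 1.5-sigma-shift convention; whether the study uses exactly this convention is an assumption.

```python
from scipy.stats import norm

def sigma_level(defect_rate: float, shift: float = 1.5) -> float:
    """Sigma level as the normal quantile of the yield plus the 1.5 shift."""
    return norm.ppf(1.0 - defect_rate) + shift

# Treating admissions slower than the 60-min target as defects, defect rates
# of roughly 78.5% and 14.2% reproduce the reported sigma levels (these rates
# are back-calculated here, not reported in the paper).
for p in (0.785, 0.142):
    print(f"defect rate {p:.3f} -> sigma level {sigma_level(p):.2f}")
```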

Research limitations/implications

In this study, the sample size used in the analysis is small. In addition, the effectiveness of the proposed approach is investigated using a discrete event simulation of a single case study, which may limit the generalization of the results. However, this study can provide useful guidance for further research on generalizing the results to wider scopes in terms of different HEI sectors and geographical locations.

Practical implications

This study uses several statistical process control tools and techniques through an LSS DMAIC framework to identify and eliminate the root causes of the long admission cycle time at an HEI. The approach followed and the lessons learned, as documented in the study, can be of great benefit in improving different sectors of HEIs.

Originality/value

This study is one of the few attempts to implement LSS in HEIs to improve the administrative process so that better-quality services can be provided to customers, such as students and guardians. The project is implemented by a group of undergraduate students as a part of their senior design project, which paves the way for involving students in future LSS projects in HEIs. This study is expected to help to improve understanding of how LSS methodology can be implemented in solving quality-related problems in HEIs and to offer valuable insights for both academics and practitioners.

Details

International Journal of Lean Six Sigma, vol. 14 no. 7
Type: Research Article
ISSN: 2040-4166

Article
Publication date: 17 May 2023

Simone Caruso, Manfredi Bruccoleri, Astrid Pietrosi and Antonio Scaccianoce

Abstract

Purpose

The nature and amount of data that public organizations have to monitor to counteract corruption lead to a phenomenon called “KPI overload”, in which the business analyst feels overwhelmed by the amount of information, resulting in the absence of appropriate control. The purpose of this study is to develop a solution based on artificial intelligence technology to avoid data overloading and, at the same time, under-controlling in business process monitoring.

Design/methodology/approach

The authors adopted a design science research approach. They started by observing a specific problem in a real context (a healthcare organization), then conceptualized, designed and implemented a solution to the problem, with the goal of developing knowledge that can be used to design solutions for similar problems. The proposed solution for business process monitoring integrates databases and self-service business intelligence for outlier detection and artificial intelligence for classification analysis.
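
A minimal sketch of the two-stage idea on synthetic KPI data (unsupervised outlier detection followed by classification); the scikit-learn components and all data are stand-ins, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Stand-in KPI matrix (rows = process instances, columns = monitored KPIs).
X = rng.normal(0, 1, size=(500, 20))
X[:10] += 4                                  # a few anomalous instances

# Stage 1 (the self-service BI role): outlier detection surfaces only the
# suspicious instances instead of every KPI at once.
flags = IsolationForest(random_state=0).fit_predict(X)   # -1 marks outliers
suspects = X[flags == -1]

# Stage 2 (the AI role): a classifier triages flagged instances, e.g. fraud /
# policy misalignment vs. benign deviation. Labels here are synthetic.
y = np.arange(len(suspects)) % 2
clf = LogisticRegression().fit(suspects, y)
print(f"{len(suspects)} suspicious instances triaged for analyst review")
```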

Findings

The authors found the solution powerful in solving problems related to KPI overload in process monitoring. In the specific case study, the authors found that the combination of business intelligence and artificial intelligence can contribute significantly to the detection of fraud, corruption and/or policy misalignment in public organizations.

Originality/value

The authors provide a big-data-based solution to the problem of data overload in business process monitoring that does not sacrifice any monitored key performance indicators and also reduces the workload of the business analyst. The authors also developed and implemented this automated solution in a context where data sensitivity and privacy are critical issues.

Open Access
Article
Publication date: 10 January 2023

Anna Trubetskaya, Olivia McDermott and Seamus McGovern

Abstract

Purpose

This article aims to optimise energy use and consumption by integrating Lean Six Sigma methodology with the ISO 50001 energy management system standard in an Irish dairy plant operation.

Design/methodology/approach

This work utilised Lean Six Sigma methodology to identify methods to measure and optimise energy consumption. The authors use a single descriptive case study in an Irish dairy as the methodology to explain how DMAIC was applied to reduce energy consumption.

Findings

The replacement of heavy oil with liquefied natural gas, in combination with the new design of the steam boilers, led to a CO2 footprint reduction of almost 50%.
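
A back-of-envelope check of the direction of this saving, using approximate IPCC-style default emission factors and assumed boiler efficiencies; none of these figures are from the study, and the plant's exact saving depends on its specific fuels and equipment.

```python
# Approximate default emission factors (kg CO2 per GJ of fuel energy) and
# assumed boiler efficiencies -- all illustrative, not the study's data.
EF_HEAVY_OIL = 77.4        # residual fuel oil, approx. IPCC default
EF_NATURAL_GAS = 56.1      # natural gas, approx. IPCC default

heat_demand_gj = 1.0                  # per unit of steam delivered
eta_old, eta_new = 0.80, 0.95         # assumed old vs. redesigned boiler

co2_old = heat_demand_gj / eta_old * EF_HEAVY_OIL
co2_new = heat_demand_gj / eta_new * EF_NATURAL_GAS
print(f"reduction: {1 - co2_new / co2_old:.0%}")   # ~39% under these assumptions
```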

Practical implications

A further longitudinal study would be useful to measure and monitor the energy management system progress and carry out more case studies on LSS integration with energy management systems across the dairy industry.

Originality/value

The novelty of this study is the application of LSS in the dairy sector as an enabler of a more energy-efficient facility, as well as the testing of the DMAIC approach to meet a key objective for ISO 50001 accreditation.

Details

The TQM Journal, vol. 35 no. 9
Type: Research Article
ISSN: 1754-2731
