Search results

1 – 10 of 127
Article
Publication date: 4 July 2023

Karim Atashgar and Mahnaz Boush

Abstract

Purpose

When a process experiences an out-of-control condition, identifying the change point can lead practitioners to an effective root cause analysis. The change point is the time at which a special cause manifests itself in the process. In statistical process monitoring, when the control chart signals an out-of-control condition, change-point analysis is an important step in the root cause analysis of the process. This paper proposes artificial-neural-network-based models to identify the change point of a multistage process with the cascade property, for the case in which the process is properly modeled by a simple linear profile.

Design/methodology/approach

In practice, many processes are better modeled by a functional relationship than by a single random variable or a random vector. This modeling approach is referred to as a profile in the statistical process control literature. In this paper, two models based on the multilayer perceptron (MLP) and convolutional neural network (CNN) approaches are proposed for identifying the change point of the profile of a multistage process.
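
As an illustration only (the authors' architectures and training data are not given in the abstract), a change-point identifier of this kind can be sketched as a small 1D convolutional network that maps a window of monitored profile statistics to a change-point index; every size and parameter below is an assumption:

```python
# Hypothetical sketch, not the authors' model: a 1D CNN that scores each
# time index in a window of monitored profile statistics; the argmax is
# taken as the estimated change point.
import torch
import torch.nn as nn

WINDOW = 50        # profile samples per monitoring window (assumed)
N_FEATURES = 2     # e.g. estimated intercept and slope per sample (assumed)

class ChangePointCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_FEATURES, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.head = nn.Conv1d(32, 1, kernel_size=1)  # one logit per index

    def forward(self, x):                 # x: (batch, N_FEATURES, WINDOW)
        return self.head(self.features(x)).squeeze(1)

model = ChangePointCNN()
x = torch.randn(8, N_FEATURES, WINDOW)    # stand-in data for illustration
print(model(x).argmax(dim=1))             # estimated change point per window
```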

Findings

The capabilities of the proposed models are evaluated and compared using several numerical scenarios. The numerical analysis of the proposed neural networks indicates that both models identify the change point effectively in different scenarios. The comparative sensitivity analysis shows that the proposed convolutional network is superior to the MLP network.

Originality/value

To the best of the authors' knowledge, this is the first time that: (1) A model is proposed to identify the change point of the profile of a multistage process. (2) A convolutional neural network is modeled for identifying the change point of an out-of-control condition.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 8 August 2023

Berihun Bizuneh, Abrham Destaw, Fasika Hailu, Solomon Tsegaye and Bizuayehu Mamo

Abstract

Purpose

A sizing system is fundamental to garment fit. The purpose of this study was to assess the fit of existing police uniforms (shirt, jacket, overcoat and trousers) and to develop a sizing system for the upper- and lower-body uniforms of Amhara policemen in Ethiopia.

Design/methodology/approach

In total, 35 body dimensions of 889 policemen were measured through a manual anthropometric survey following the procedures of ISO 8559:1989, after each subject was interviewed on issues related to garment fit. The anthropometric data were pre-processed, key body dimensions were identified by principal components analysis, and body types were clustered with the agglomerative hierarchical clustering algorithm and verified with an XGBoost classifier in a Python programming environment. The developed size charts were validated statistically using aggregate loss and the accommodation rate.
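
The described pipeline maps naturally onto standard Python libraries. The sketch below mirrors the stated steps (PCA for key dimensions, agglomerative clustering for body types, XGBoost verification) on stand-in data; component counts, cluster counts and all hyperparameters are assumptions, not the study's settings:

```python
# Illustrative sketch of the described pipeline on stand-in data; all
# parameter choices below are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(889, 35))          # 889 subjects x 35 body dimensions

# 1. Key body dimensions via principal components analysis.
pca = PCA(n_components=2)               # component count is an assumption
scores = pca.fit_transform(X)

# 2. Body types via agglomerative hierarchical clustering.
labels = AgglomerativeClustering(n_clusters=4, linkage="ward").fit_predict(scores)

# 3. Verify the clusters with an XGBoost classifier.
X_tr, X_te, y_tr, y_te = train_test_split(scores, labels, test_size=0.3, random_state=0)
clf = XGBClassifier(n_estimators=100).fit(X_tr, y_tr)
print("cluster verification accuracy:", clf.score(X_te, y_te))
```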

Findings

About 44% of the subjects encountered fit problems every time they received new ready-made uniforms. The lengths and side seams of shirts, and the lengths and waist girths of trousers, are the most frequently altered garment sites. Analysis of the anthropometric measurements resulted in 13 and 15 sizes for the upper and lower body, respectively. Moreover, comparison of the developed upper-garment size chart with the existing shirt size chart showed a considerable difference, indicating that inappropriate size charts create fit problems.

Originality/value

The study considers the analysis of fit problems and sizing system development in a less researched country. Moreover, the proposed data mining procedure and its application to size chart development are unique and workable.

Details

Research Journal of Textile and Apparel, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1560-6074

Article
Publication date: 14 July 2023

Hamid Hassani, Azadeh Mohebi, M.J. Ershadi and Ammar Jalalimanesh

Abstract

Purpose

The purpose of this research is to provide a framework in which new data quality dimensions are defined. The new dimensions provide new metrics for assessing lecture video indexing. As lecture video indexing involves various steps, the proposed framework, containing the new dimensions, introduces an integrated approach for evaluating an indexing method or algorithm from beginning to end.

Design/methodology/approach

The emphasis in this study is on the fifth step of the design science research methodology (DSRM), known as evaluation: the methods developed in the field of lecture video indexing, as artifacts, should be evaluated from different aspects. In this research, nine dimensions of data quality (accuracy, value-added, relevancy, completeness, appropriate amount of data, conciseness, consistency, interpretability and accessibility) have been redefined based on previous studies and the nominal group technique (NGT).

Findings

The proposed dimensions are implemented as new metrics to evaluate a newly developed lecture video indexing algorithm, LVTIA, and numerical values have been obtained based on the proposed definitions for each dimension. In addition, the new dimensions are compared with each other in various respects. The comparison shows that each dimension used for assessing lecture video indexing reflects a different weakness or strength of an indexing method or algorithm.

Originality/value

Despite the development of different methods for indexing lecture videos, the issue of data quality and its various dimensions has not been studied. Since low-quality data can affect the process of scientific lecture video indexing, data quality in this process requires special attention.

Details

Library Hi Tech, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 30 May 2023

Mahdi Salehi, Raha Rajaeei, Ehsan Khansalar and Samane Edalati Shakib

Abstract

Purpose

This paper aims to determine whether there is a relationship between intellectual capital and social capital on the one hand and internal control weaknesses on the other, and to assess the nature of that relationship.

Design/methodology/approach

The statistical population consists of 1,309 firm-year observations from 2014 to 2020. The research hypothesis is tested using statistical methods, including multivariate, least-squares and fixed-effects regression.
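
For illustration, a fixed-effects specification of this kind can be sketched with the linearmodels package; the variable names and the linear probability form below are hypothetical, not the paper's exact model:

```python
# Hedged sketch of a fixed-effects panel regression on firm-year data;
# all variable names are hypothetical stand-ins.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

rng = np.random.default_rng(0)
firms, years = range(187), range(2014, 2021)   # 187 x 7 = 1,309 firm-years
idx = pd.MultiIndex.from_product([firms, years], names=["firm", "year"])
df = pd.DataFrame({
    "ic_weakness": rng.integers(0, 2, len(idx)).astype(float),  # weakness dummy
    "intellectual_capital": rng.normal(size=len(idx)),
    "social_capital": rng.normal(size=len(idx)),
}, index=idx)

# Firm (entity) fixed effects absorb time-invariant heterogeneity.
mod = PanelOLS.from_formula(
    "ic_weakness ~ 1 + intellectual_capital + social_capital + EntityEffects",
    data=df,
)
print(mod.fit().summary)
```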

Findings

The results demonstrate a negative and significant relationship between intellectual capital, social capital and internal control weaknesses. The study also finds that increased intellectual and social capital quality improves human resource utilization, control mechanisms, creativity and firm performance. The results also show that enhancing intellectual and social capital will reduce internal control weaknesses in upcoming years.

Originality/value

This paper is a pioneering study of the relationship between intellectual capital, social capital and internal control weaknesses in Iran, examining the variables both separately and through exploratory factor analysis. The paper considers intellectual capital components for theoretical factor analysis, including human capital, structural capital and customer capital. Internal control weakness is assessed based on financial, non-financial and information technology (IT) weaknesses.

Details

Journal of Islamic Accounting and Business Research, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1759-0817

Article
Publication date: 14 February 2023

Muhammed Ashiq Villanthenkodath and Shreya Pal

Abstract

Purpose

This study scrutinizes the impact of economic globalization on ecological footprint while endogenizing economic growth and energy consumption during 1990–2018 in India.

Design/methodology/approach

For the time series analysis, standard unit root tests were employed to unveil the order of integration. Cointegration was then confirmed using autoregressive distributed lag (ARDL) analysis. Further, the study executed a dynamic ARDL simulation model to estimate long-run and short-run effects along with simulation-based predictions.
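
For illustration, the unit-root and ARDL steps can be sketched with statsmodels on stand-in series (the dynamic ARDL simulation step is omitted, and all lag orders are assumptions):

```python
# Hedged sketch of the described time-series steps: an ADF unit-root test
# followed by an ARDL fit; series names and lag orders are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.ardl import ARDL

rng = np.random.default_rng(0)
idx = pd.period_range("1990", "2018", freq="Y")
footprint = pd.Series(rng.normal(size=len(idx)).cumsum(), index=idx)
exog = pd.DataFrame({
    "globalization": rng.normal(size=len(idx)).cumsum(),
    "gdp": rng.normal(size=len(idx)).cumsum(),
    "energy": rng.normal(size=len(idx)).cumsum(),
}, index=idx)

# 1. Integration order via the augmented Dickey-Fuller test.
print("ADF p-value (levels):", adfuller(footprint)[1])

# 2. ARDL estimation (lag orders chosen here only for illustration).
res = ARDL(footprint, lags=1, exog=exog, order=1).fit()
print(res.summary())
```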

Findings

The cointegration analysis confirms the existence of a long-run association among the variables. Further, economic globalization reduces the ecological footprint in the long run. Similarly, energy consumption decreases the ecological footprint. In contrast, economic growth spurs the ecological footprint in India.

Originality/value

The present study makes valuable and original contributions to the literature by applying a multivariate ecological footprint function, assessing the impact of economic globalization on the ecological footprint while accounting for economic growth and energy consumption in India.

Details

Journal of Economic and Administrative Sciences, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1026-4116

Article
Publication date: 28 March 2024

Elisa Gonzalez Santacruz, David Romero, Julieta Noguez and Thorsten Wuest

Abstract

Purpose

This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated Quality 4.0 framework (IQ4.0F) for quality improvement (QI) based on Six Sigma and machine learning (ML) techniques towards ZDM. The IQ4.0F aims to contribute to the advancement of defect prediction approaches in diverse manufacturing processes. Furthermore, the work enables a comprehensive analysis of the process variables influencing product quality, with emphasis on the use of supervised and unsupervised ML techniques in the "Analyze" stage of Six Sigma's DMAIC (Define, Measure, Analyze, Improve and Control) cycle.

Design/methodology/approach

The research methodology employed a systematic literature review (SLR) based on PRISMA guidelines to develop the integrated framework, followed by a real industrial case study set in the automotive industry to fulfill the objectives of verifying and validating the proposed IQ4.0F with primary data.

Findings

This research work demonstrates the value of a “stepwise framework” to facilitate a shift from conventional quality management systems (QMSs) to QMSs 4.0. It uses the IDEF0 modeling methodology and Six Sigma’s DMAIC cycle to structure the steps to be followed to adopt the Quality 4.0 paradigm for QI. It also proves the worth of integrating Six Sigma and ML techniques into the “Analyze” stage of the DMAIC cycle for improving defect prediction in manufacturing processes and supporting problem-solving activities for quality managers.

Originality/value

This research paper introduces a first-of-its-kind Quality 4.0 framework, the IQ4.0F. Each step of the IQ4.0F was verified and validated in an original industrial case study set in the automotive industry. It is the first Quality 4.0 framework, according to the SLR conducted, to utilize the principal component analysis technique as a substitute for "Screening Design" in the Design of Experiments phase, and the K-means clustering technique for multivariable analysis, identifying the process parameters that significantly impact product quality. The proposed IQ4.0F not only empowers decision-makers with the knowledge to launch a Quality 4.0 initiative but also provides quality managers with a systematic problem-solving methodology for quality improvement.
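
A minimal sketch of the two techniques named above, on stand-in process data (the variance threshold, cluster count and data shapes are assumptions, not the case study's values):

```python
# Illustrative sketch: PCA to screen which process variables carry the
# most variance, then K-means to group operating conditions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))        # 500 production runs x 12 process variables

pca = PCA(n_components=0.9)           # keep components explaining 90% of variance
scores = pca.fit_transform(X)
print("retained components:", pca.n_components_)

# Loadings with large magnitude flag the influential process parameters.
print("first-component loadings:", np.round(pca.components_[0], 2))

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scores)
print("cluster sizes:", np.bincount(km.labels_))
```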

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 19 October 2023

Mohamed Saad Bajjou and Anas Chafi

Abstract

Purpose

Lean construction (LC) consists of highly effective techniques; however, its implementation varies considerably from one industry to another. Although numerous lean initiatives exist in the construction industry, research on LC implementation remains underexplored due to the scarcity of validated assessment frameworks. This study provides a first attempt at developing a structural model for successful LC implementation.

Design/methodology/approach

This study developed a lean construction model (LCM) by critically reviewing seven previous LC frameworks from different countries, defining 18 subprinciples grouped into six major principles and formulating testable hypotheses. The questionnaire was pre-tested with 12 construction management experts and revised by 4 specialized academics. A pilot study with 20 construction units enhanced content reliability. Data from 307 Moroccan construction companies were collected to develop a measurement model. SPSS v26 was used for exploratory factor analysis, followed by confirmatory factor analysis using AMOS version 23. Finally, a structural equation model statistically assessed each construct's contribution to the success of LC implementation.
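
The abstract's factor analysis was run in SPSS and AMOS; for illustration only, the exploratory step can be sketched in Python with an assumed six-factor solution on stand-in responses:

```python
# Hedged sketch of the exploratory-factor-analysis step only; the data
# and the six-factor, varimax-rotated solution are assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
responses = rng.normal(size=(307, 18))   # 307 companies x 18 measurement items

fa = FactorAnalysis(n_components=6, rotation="varimax").fit(responses)
loadings = fa.components_.T              # items x factors
print(np.round(loadings, 2))             # high-loading items define each principle
```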

Findings

This work led to the development of an original LCM based on valid and reliable LC constructs, consisting of 18 measurement items grouped into six LC principles: process transparency, people involvement, waste elimination, planning and continuous improvement, client focus, and material/information flow and pull. According to the structural model, LC implementation success is most strongly influenced by planning and continuous improvement (β = 0.930), followed by waste elimination (β = 0.896) and process transparency (β = 0.858). The study demonstrates that all these factors are mutually complementary, highlighting a positive relationship between LC implementation success and the holistic application of all LC principles.

Originality/value

To the best of the authors' knowledge, this study is the first attempt to develop a statistically proven model of LC based on structural equation modelling analysis, which is promising for stimulating construction practitioners and researchers to conduct more empirical studies in different countries and obtain a more accurate reflection of LC implementation. Moreover, the paper proposes recommendations to help policymakers, academics and practitioners anticipate the key success drivers for more successful LC implementation.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 5 January 2024

Vishal Ashok Wankhede, S. Vinodh and Jiju Antony

Abstract

Purpose

To meet changing customer demands, organizations are striving to embrace cutting-edge technologies that facilitate a high level of customization. Industry 4.0 (I4.0) implementation aids in handling the big data that can help generate customized products. Lean Six Sigma (LSS) depends on data analysis to solve complex problems. Hence, the present study aims to empirically examine the key operational characteristics of LSS and I4.0 integration, such as principles, workforce skills, critical success factors, challenges, LSS tools, I4.0 technologies and performance measures.

Design/methodology/approach

To stay competitive in the market and respond quickly to market demands, industries need to pursue digital transformation. I4.0 enables building intelligent factories by creating smart manufacturing systems comprising machines, operators and information and communication technologies through the complete value chain. This study utilizes an online survey of Operational Excellence professionals (Lean/Six Sigma), Managers/Consultants, Managing Directors/Executive Directors, Specialists/Analysts/Engineers, CEO/COO/CIO, SVP/VP/AVP, Industry 4.0 professionals and others working in the fields of I4.0 and LSS. In total, 83 respondents participated in the study.

Findings

Based on the responses received, reliability analysis, exploratory factor analysis and non-response bias analysis were carried out to assess the bias of the responses. Further, the top five operational characteristics were reported for LSS and I4.0 integration.
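
For illustration, the reliability step can be sketched as Cronbach's alpha computed directly from its definition; the item count and the stand-in responses below are assumptions:

```python
# Minimal sketch of scale reliability via Cronbach's alpha on stand-in
# Likert responses (83 respondents as in the study; item count assumed).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(83, 10)).astype(float)
print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
```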

Research limitations/implications

One limitation of the study is the sample size: since I4.0 is a new concept and its integration with LSS is not yet well explored, it was difficult to achieve a large sample size.

Practical implications

Organizations can utilize the study findings to identify the top principles, workforce skills, critical success factors, challenges, LSS tools, I4.0 technologies and performance measures with respect to LSS and I4.0 integration. Moreover, these operational characteristics will help assess an organization's readiness before and after the implementation of this integration.

Originality/value

The authors' original contribution is the empirical investigation of operational characteristics responsible for I4.0 and LSS integration.

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 8 March 2024

Satyajit Mahato and Supriyo Roy

Abstract

Purpose

Managing project completion within the stipulated time is significant to all firms' sustainability, and for software start-up firms it is of utmost importance. For any schedule variation, these firms must spend 25 to 40 percent of the development cost reworking quality defects. Significantly, the existing literature does not address defect-rework opportunities from a quality perspective among Indian IT start-ups. The present study aims to fill this niche by proposing a unique mathematical model of defect rework aligned with the Six Sigma quality approach.

Design/methodology/approach

An optimization model was formulated comprising two objectives: rework "time" and rework "cost." A relevant case study was developed, and for the model solution, we used MATLAB and the elitist Non-dominated Sorting Genetic Algorithm (NSGA-II).
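
The authors solved the model in MATLAB; as an illustrative Python equivalent, a bi-objective problem of this shape can be handed to NSGA-II via the pymoo library. The time and cost functions below are placeholders, not the paper's mathematical model:

```python
# Hedged sketch of a bi-objective rework model solved with NSGA-II via
# pymoo; objectives and variable bounds are placeholders.
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import ElementwiseProblem
from pymoo.optimize import minimize

class ReworkProblem(ElementwiseProblem):
    def __init__(self):
        # x[0]: rework effort allocated, x[1]: team size (both assumed)
        super().__init__(n_var=2, n_obj=2,
                         xl=np.array([0.1, 1.0]), xu=np.array([10.0, 20.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        effort, team = x
        rework_time = effort / team + 0.1 * team      # placeholder objective
        rework_cost = 50.0 * effort + 20.0 * team     # placeholder objective
        out["F"] = [rework_time, rework_cost]

res = minimize(ReworkProblem(), NSGA2(pop_size=50), ("n_gen", 100),
               seed=1, verbose=False)
print("Pareto front (time, cost):")
print(res.F[:5])
```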

Findings

The output of the proposed approach reduced the “time” by 31 percent at a minimum “cost”. The derived “Pareto Optimal” front can be used to estimate the “cost” for a pre-determined rework “time” and vice versa, thus adding value to the existing literature.

Research limitations/implications

This work deployed a decision tree for defect prediction, which is often criticized for overfitting; this is one limitation of the paper. Apart from this, comparing the predicted defect count with other prediction models has not been attempted. NSGA-II was applied to solve the optimization problem; however, the optimal results obtained have yet to be compared with those of other algorithms. Further study is envisaged.

Practical implications

The Pareto front provides an effective visual aid for managers to compare multiple strategies to decide the best possible rework “cost” and “time” for their projects. It is beneficial for cost-sensitive start-ups to estimate the rework “cost” and “time” to negotiate with their customers effectively.

Originality/value

This paper proposes a novel quality management framework under the Six Sigma approach, which integrates optimization of critical metrics. As part of this study, a unique mathematical model of the software defect rework process was developed (combined with the proposed framework) to obtain the optimal solution for the perennial problem of schedule slippage in the rework process of software development.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 9 February 2024

Chao Xia, Bo Zeng and Yingjie Yang

Abstract

Purpose

Traditional multivariable grey prediction models define the background-value coefficients of the dependent and independent variables uniformly, ignoring the differences between their physical properties, which in turn affects the stability and reliability of the model performance.

Design/methodology/approach

A novel multivariable grey prediction model is constructed with different background-value coefficients for the dependent and independent variables, and a one-to-one correspondence between the variables and the background-value coefficients, to improve the smoothing effect of the background-value coefficients on the sequences. Furthermore, the fractional-order accumulating operator is introduced into the new model to weaken the randomness of the raw sequence. The particle swarm optimization (PSO) algorithm is used to optimize the background-value coefficients and the order of the model to improve model performance.
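
For illustration, the fractional-order accumulating generation operator commonly used in such grey models can be sketched directly from its definition (the PSO optimization of the background-value coefficients is omitted):

```python
# Sketch of the fractional-order accumulating generation operator:
# x_r[k] = sum_i Gamma(r + k - i) / (Gamma(k - i + 1) * Gamma(r)) * x[i],
# shown for illustration; r = 1 recovers ordinary accumulation.
import numpy as np
from scipy.special import gamma

def fractional_accumulation(x: np.ndarray, r: float) -> np.ndarray:
    n = len(x)
    xr = np.zeros(n)
    for k in range(n):
        for i in range(k + 1):
            # generalized binomial coefficient via the gamma function
            coef = gamma(r + k - i) / (gamma(k - i + 1) * gamma(r))
            xr[k] += coef * x[i]
    return xr

x0 = np.array([2.1, 2.4, 2.7, 3.1, 3.6])   # stand-in raw sequence
print(fractional_accumulation(x0, r=0.5))
```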

Findings

The new model's structure offers good variability and compatibility, encompassing current mainstream grey prediction models as special cases. Its performance is compared and analyzed in three typical cases, and the results show that the new model outperforms two similar grey prediction models.

Originality/value

This study has positive implications for enriching the methodology of multivariable grey prediction models.

Details

Grey Systems: Theory and Application, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2043-9377
