Search results

1 – 10 of over 7000
Article
Publication date: 28 March 2024

Elisa Gonzalez Santacruz, David Romero, Julieta Noguez and Thorsten Wuest

This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated quality 4.0 framework…

Abstract

Purpose

This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated Quality 4.0 framework (IQ4.0F) for quality improvement (QI) based on Six Sigma and machine learning (ML) techniques towards ZDM. The IQ4.0F aims to contribute to the advancement of defect prediction approaches in diverse manufacturing processes. Furthermore, the work enables a comprehensive analysis of the process variables influencing product quality, with emphasis on the use of supervised and unsupervised ML techniques in the "Analyze" stage of Six Sigma's DMAIC (Define, Measure, Analyze, Improve and Control) cycle.

Design/methodology/approach

The research methodology employed a systematic literature review (SLR) based on PRISMA guidelines to develop the integrated framework, followed by a real industrial case study in the automotive industry to verify and validate the proposed IQ4.0F with primary data.

Findings

This research work demonstrates the value of a “stepwise framework” to facilitate a shift from conventional quality management systems (QMSs) to QMSs 4.0. It uses the IDEF0 modeling methodology and Six Sigma’s DMAIC cycle to structure the steps to be followed to adopt the Quality 4.0 paradigm for QI. It also proves the worth of integrating Six Sigma and ML techniques into the “Analyze” stage of the DMAIC cycle for improving defect prediction in manufacturing processes and supporting problem-solving activities for quality managers.

Originality/value

This research paper introduces a first-of-its-kind Quality 4.0 framework, the IQ4.0F. Each step of the IQ4.0F was verified and validated in an original industrial case study set in the automotive industry. According to the SLR conducted, it is the first Quality 4.0 framework to utilize the principal component analysis technique as a substitute for "Screening Design" in the Design of Experiments phase and the K-means clustering technique for multivariable analysis, identifying the process parameters that significantly impact product quality. The proposed IQ4.0F not only empowers decision-makers with the knowledge to launch a Quality 4.0 initiative but also provides quality managers with a systematic problem-solving methodology for quality improvement.
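
To make the combination of techniques concrete, the sketch below illustrates, in Python with hypothetical process variables and synthetic data (not the authors' dataset or pipeline), how principal component analysis can screen for influential process variables and how K-means can then group operating conditions for multivariable analysis.

```python
# Illustrative sketch only: PCA as a screening step, then K-means clustering.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical process data: rows = production runs, columns = process variables.
rng = np.random.default_rng(0)
X = pd.DataFrame(
    rng.normal(size=(200, 5)),
    columns=["temperature", "pressure", "cycle_time", "feed_rate", "humidity"],
)

# Standardize, then use PCA loadings for screening: variables with large absolute
# loadings on the leading components are retained for further analysis.
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Z)
loadings = pd.DataFrame(pca.components_.T, index=X.columns, columns=["PC1", "PC2"])
print(loadings.abs().max(axis=1).sort_values(ascending=False))  # variable ranking

# K-means on the component scores reveals clusters of operating conditions that
# can then be compared against observed defect rates.
scores = pca.transform(Z)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(pd.Series(labels).value_counts())
```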

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731


Article
Publication date: 11 March 2024

Sudhanshu Joshi, Manu Sharma, Sunil Luthra, Jose Arturo Garza-Reyes and Ramesh Anbanandam

The research aims to develop an assessment framework that evaluates critical success factors (CSFs) for the Quality 4.0 (Q 4.0) transition among Indian firms.

Abstract

Purpose

The research aims to develop an assessment framework that evaluates critical success factors (CSFs) for the Quality 4.0 (Q 4.0) transition among Indian firms.

Design/methodology/approach

The authors use the fuzzy Delphi method to validate the results of a systematic literature review (SLR) that explores critical aspects of the Q 4.0 transition. Further, the fuzzy decision-making trial and evaluation laboratory (DEMATEL) method determines the cause-and-effect links among the CSFs. The findings indicate that developing a Q 4.0 framework is essential for the long-term success of manufacturing companies. By utilizing the power of digital technology, data analytics and automation, manufacturing companies can benefit from the Q 4.0 framework. Product quality, operational effectiveness and overall business performance may all be enhanced by implementing the Q 4.0 transition framework.
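
As an illustration of the DEMATEL step, the minimal Python sketch below computes the total-relation matrix and the prominence and relation indices from a hypothetical direct-influence matrix; the fuzzy variant used by the authors would first aggregate and defuzzify expert judgments before this calculation.

```python
# Minimal crisp DEMATEL core on a hypothetical 4-factor direct-influence matrix.
import numpy as np

A = np.array([
    [0, 3, 2, 1],
    [2, 0, 3, 2],
    [1, 2, 0, 3],
    [1, 1, 2, 0],
], dtype=float)

# Normalize by the largest row/column sum, then compute the total-relation
# matrix T = N (I - N)^{-1}, which accumulates direct and indirect influences.
s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
N = A / s
T = N @ np.linalg.inv(np.eye(len(A)) - N)

D = T.sum(axis=1)      # influence dispatched by each factor
R = T.sum(axis=0)      # influence received by each factor
for i, (p, r) in enumerate(zip(D + R, D - R)):
    # prominence = D + R (overall importance); relation = D - R (cause vs effect)
    print(f"factor {i + 1}: prominence={p:.2f}, relation={r:+.2f}")
```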

Findings

The study highlights significant awareness of Q 4.0 in the Indian manufacturing sector, acquired through various means such as training, experience, learning and research. However, most manufacturing industries in India still follow older quality paradigms. On the other hand, Indian manufacturing industries seem well-equipped to adopt Q 4.0, given practitioners' firm grasp of its concepts and anticipated benefits, including improved customer satisfaction, product refinement, continuous process enhancement, waste reduction and informed decision-making. Adoption hurdles include reliable electricity access, high-speed Internet, infrastructure, a skilled workforce and financial support. The study also introduces a transition framework facilitating the shift from conventional methods to Q 4.0, aligned with the principles of the Fourth Industrial Revolution.

Research limitations/implications

This research exclusively examines the manufacturing sector, neglecting other fields such as medical, service, mining and construction. Additionally, Q 4.0 implementation frameworks receive only limited emphasis within the scope of the study.

Originality/value

This may be the inaugural framework for transitioning to Q 4.0 in India's manufacturing sectors and, conceivably, other developing nations.

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731


Article
Publication date: 23 January 2024

Ranjit Roy Ghatak and Jose Arturo Garza-Reyes

The research explores the shift to Quality 4.0, examining the move towards a data-focussed transformation within organizational frameworks. This transition is characterized by…

Abstract

Purpose

The research explores the shift to Quality 4.0, examining the move towards a data-focussed transformation within organizational frameworks. This transition is characterized by incorporating Industry 4.0 technological innovations into existing quality management frameworks, marking a significant evolution in quality control systems. Despite the evident advantages, practical deployment in the Indian manufacturing sector encounters various obstacles. This research is dedicated to a thorough examination of these impediments. It is structured around a set of pivotal research questions: First, it seeks to identify the key barriers that impede the adoption of Quality 4.0. Second, it aims to elucidate these barriers' interrelations and mutual dependencies. Third, it prioritizes these barriers in terms of their significance to the adoption process. Finally, it contemplates the ramifications of these priorities for the strategic advancement of manufacturing practices and the development of informed policies. By answering these questions, the research provides a detailed understanding of the challenges faced and offers actionable insights for practitioners and policymakers implementing Quality 4.0 in the Indian manufacturing sector.

Design/methodology/approach

Employing Interpretive Structural Modelling (ISM) and Cross-Impact Matrix Multiplication Applied to Classification (MICMAC), the authors probe the interdependencies amongst fourteen identified barriers inhibiting Quality 4.0 adoption. These barriers were categorized according to their driving power and dependence, providing a richer understanding of the dynamic obstacles within the Technology–Organization–Environment (TOE) framework.
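
The Python sketch below illustrates the ISM/MICMAC mechanics on a hypothetical five-barrier example (not the paper's fourteen barriers): a transitive closure yields the reachability matrix, from which driving power and dependence are read off and mapped to MICMAC quadrants.

```python
# Hypothetical ISM/MICMAC example with five barriers.
import numpy as np

# adjacency[i, j] = True means barrier i influences barrier j (diagonal = True).
adjacency = np.array([
    [1, 1, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 0, 1],
], dtype=bool)

# Transitive closure (Warshall's algorithm) gives the final reachability matrix.
reach = adjacency.copy()
n = len(reach)
for k in range(n):
    reach = reach | (reach[:, [k]] & reach[[k], :])

driving_power = reach.sum(axis=1)   # how many barriers each barrier reaches
dependence = reach.sum(axis=0)      # how many barriers reach each barrier

# MICMAC quadrants: split each axis at the midpoint of its possible range.
mid = (n + 1) / 2
for i, (dp, de) in enumerate(zip(driving_power, dependence)):
    quadrant = (
        "driver" if dp >= mid > de else
        "dependent" if de >= mid > dp else
        "linkage" if dp >= mid and de >= mid else
        "autonomous"
    )
    print(f"barrier {i + 1}: driving={dp}, dependence={de}, {quadrant}")
```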

Findings

The study results highlight the lack of Quality 4.0 standards and Big Data Analytics (BDA) tools as fundamental obstacles to integrating Quality 4.0 within the Indian manufacturing sector. Additionally, the study results contravene dominant academic narratives, suggesting that the cumulative impact of organizational barriers is marginal, contrary to theoretical postulations emphasizing their central significance in Quality 4.0 assimilation.

Practical implications

This research provides concrete strategies, such as developing a collaborative platform for sharing best practices in Quality 4.0 standards, which fosters a synergistic relationship between organizations and policymakers. For instance, a joint task force of industry leaders and regulatory bodies could be created and dedicated to formulating and disseminating comprehensive guidelines for Quality 4.0 adoption; such an initiative could lead to industry-wide standards that benefit from the pooled expertise of diverse stakeholders. Additionally, the study underscores the necessity for robust, standardized Big Data Analytics tools specifically designed to meet Quality 4.0 criteria, which can be developed through public-private partnerships. These tools would facilitate the seamless integration of Quality 4.0 processes, offering a direct route to overcoming the barrier of inadequate standards.

Originality/value

This research delineates specific obstacles to Quality 4.0 adoption by applying the TOE framework, detailing how these barriers interact with and influence each other, particularly highlighting the previously overlooked environmental factors. The analysis reveals a critical interdependence between “lack of standards for Quality 4.0” and “lack of standardized BDA tools and solutions,” providing nuanced insights into their conjoined effect on stalling progress in this field. Moreover, the study contributes to the theoretical body of knowledge by mapping out these novel impediments, offering a more comprehensive understanding of the challenges faced in adopting Quality 4.0.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 23 October 2023

Abhijeet Tewary and Vaishali Jadon

This research aims to analyze the literature on Quality 4.0 and pinpoint the essential factors contributing to its success. Additionally, the research aims to develop a framework…

Abstract

Purpose

This research aims to analyze the literature on Quality 4.0 and pinpoint the essential factors contributing to its success. Additionally, the research aims to develop a framework that can be used to create a capable workforce necessary for the successful implementation of Quality 4.0.

Design/methodology/approach

By following a systematic approach, the authors could ensure that their literature review was comprehensive and unbiased. Using a set of pre-determined inclusion and exclusion criteria, the authors screened 90 research articles to obtain the most relevant and reliable information for their study.

Findings

The authors' review identified essential findings, including the evolution of literature in the field of Quality 4.0 and the systematization of previous literature reviews focusing on training and development. The authors also identified several training barriers to implementing Quality 4.0 and proposed a model for building a competent workforce using Kolb's experiential learning model.

Practical implications

The authors' research offers insights into the training barriers that must be considered when building a competent workforce. Using the framework proposed in the authors' research, consultants and managers can better integrate Quality 4.0 into their organizations.

Social implications

The adoption of Quality 4.0 has significant social implications and is essential for advancing sustainability. It can improve efficiency, reduce waste, minimize environmental impacts and better meet the needs and expectations of stakeholders.

Originality/value

The authors' study stands out as one of the earliest reviews of the literature on Quality 4.0 to incorporate the theory-context-method (TCM) framework, allowing it to provide unique insights into future research directions that had not been previously explored.

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731


Article
Publication date: 14 July 2023

Hamid Hassani, Azadeh Mohebi, M.J. Ershadi and Ammar Jalalimanesh

The purpose of this research is to provide a framework in which new data quality dimensions are defined. The new dimensions provide new metrics for the assessment of lecture video…


Abstract

Purpose

The purpose of this research is to provide a framework in which new data quality dimensions are defined. The new dimensions provide new metrics for the assessment of lecture video indexing. As lecture video indexing involves various steps, the proposed framework containing the new dimensions introduces an integrated approach for evaluating an indexing method or algorithm from beginning to end.

Design/methodology/approach

The emphasis in this study is on the fifth step of the design science research methodology (DSRM), known as evaluation. That is, the methods developed in the field of lecture video indexing, as an artifact, should be evaluated from different aspects. In this research, nine dimensions of data quality, including accuracy, value-added, relevancy, completeness, appropriate amount of data, conciseness, consistency, interpretability and accessibility, have been redefined based on previous studies and the nominal group technique (NGT).
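
Purely as a hypothetical illustration of how such dimensions can become metrics (the paper supplies its own definitions), two of them might be operationalized by comparing extracted index terms against a curated reference list.

```python
# Hypothetical operationalization of two dimensions for a single lecture video.
reference_terms = {"gradient descent", "loss function", "learning rate", "backpropagation"}
extracted_terms = {"gradient descent", "loss function", "epoch"}

# Accuracy-style metric: share of extracted index terms that are correct.
accuracy = len(extracted_terms & reference_terms) / len(extracted_terms)

# Completeness-style metric: share of reference terms the indexer recovered.
completeness = len(extracted_terms & reference_terms) / len(reference_terms)

print(f"accuracy={accuracy:.2f}, completeness={completeness:.2f}")
```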

Findings

The proposed dimensions are implemented as new metrics to evaluate a newly developed lecture video indexing algorithm, LVTIA, and numerical values have been obtained based on the proposed definitions for each dimension. In addition, the new dimensions are compared with each other in terms of various aspects. The comparison shows that each dimension used for assessing lecture video indexing is able to reflect a different weakness or strength of an indexing method or algorithm.

Originality/value

Despite the development of different methods for indexing lecture videos, the issue of data quality and its various dimensions has not been studied. Since data of low quality can affect the process of scientific lecture video indexing, the issue of data quality in this process requires special attention.

Details

Library Hi Tech, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0737-8831


Article
Publication date: 8 March 2024

Satyajit Mahato and Supriyo Roy

Managing project completion within the stipulated time is significant to all firms' sustainability. Especially for software start-up firms, it is of utmost importance. For any…

Abstract

Purpose

Managing project completion within the stipulated time is significant to all firms' sustainability, and for software start-up firms it is of utmost importance. For any schedule variation, these firms must spend 25 to 40 percent of the development cost on reworking quality defects. Notably, the existing literature does not address defect rework opportunities from a quality perspective among Indian IT start-ups. The present study aims to fill this niche by proposing a unique mathematical model of defect rework aligned with the Six Sigma quality approach.

Design/methodology/approach

An optimization model was formulated comprising two objectives: rework "time" and rework "cost." A relevant case study was developed, and the model was solved in MATLAB using the elitist Non-dominated Sorting Genetic Algorithm II (NSGA-II).

Findings

The output of the proposed approach reduced the “time” by 31 percent at a minimum “cost”. The derived “Pareto Optimal” front can be used to estimate the “cost” for a pre-determined rework “time” and vice versa, thus adding value to the existing literature.
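
The Pareto-front idea behind these findings can be sketched as follows; the candidate rework plans and their time/cost values below are hypothetical, and NSGA-II is simply a scalable way of searching for this non-dominated set.

```python
# Illustrative two-objective Pareto front for rework time vs. rework cost.
from typing import List, Tuple

def pareto_front(plans: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Return the plans not dominated by any other (both objectives minimized)."""
    front = []
    for p in plans:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in plans)
        if not dominated:
            front.append(p)
    return sorted(front)

# (rework_time_days, rework_cost_units) for hypothetical candidate plans.
candidates = [(30, 120), (25, 150), (40, 90), (35, 100), (28, 160), (45, 85)]
print(pareto_front(candidates))
# A manager reads the front as: to push rework time below a given value,
# expect at least the corresponding rework cost, and vice versa.
```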

Research limitations/implications

This work has deployed a decision tree for defect prediction, a method often criticized for overfitting; this is one of the limitations of this paper. Apart from this, comparing the predicted defect count with other prediction models has not been attempted. NSGA-II has been applied to solve the optimization problem; however, the optimal results obtained have yet to be compared with those of other algorithms. Further study is envisaged.
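
For illustration only, a decision-tree defect classifier of the kind referred to above might look like the sketch below (synthetic data, hypothetical features); limiting tree depth and leaf size, together with cross-validation, are common first defences against the overfitting noted as a limitation.

```python
# Hypothetical decision-tree defect classifier with basic overfitting controls.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))   # e.g. module size, churn, complexity, review effort
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=300) > 0).astype(int)

tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10, random_state=0)
scores = cross_val_score(tree, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```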

Practical implications

The Pareto front provides an effective visual aid for managers to compare multiple strategies to decide the best possible rework “cost” and “time” for their projects. It is beneficial for cost-sensitive start-ups to estimate the rework “cost” and “time” to negotiate with their customers effectively.

Originality/value

This paper proposes a novel quality management framework under the Six Sigma approach, which integrates optimization of critical metrics. As part of this study, a unique mathematical model of the software defect rework process was developed (combined with the proposed framework) to obtain the optimal solution for the perennial problem of schedule slippage in the rework process of software development.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 25 January 2024

Besiki Stvilia and Dong Joon Lee

This study addresses the need for a theory-guided, rich, descriptive account of research data repositories' (RDRs) understanding of data quality and the structures of their data…

Abstract

Purpose

This study addresses the need for a theory-guided, rich, descriptive account of research data repositories' (RDRs) understanding of data quality and the structures of their data quality assurance (DQA) activities. Its findings can help develop operational DQA models and best practice guides and identify opportunities for innovation in the DQA activities.

Design/methodology/approach

The study analyzed 122 data repositories' applications for the Core Trustworthy Data Repositories certification, interview transcripts of 32 curators and repository managers, and data curation-related webpages of their repository websites. The combined dataset represented 146 unique RDRs. The study was guided by a theoretical framework comprising activity theory and an information quality evaluation framework.

Findings

The study provided a theory-based examination of the DQA practices of RDRs, summarized as a conceptual model. The authors identified three DQA activities (evaluation, intervention and communication) and their structures, including activity motivations, roles played, mediating tools, and rules and standards. When defining data quality, study participants went beyond the traditional definition of data quality and referenced seven facets of ethical and effective information systems in addition to data quality. Furthermore, the participants and RDRs referenced 13 dimensions in their DQA models. The study revealed that DQA activities were prioritized by data value, level of quality, available expertise, cost and funding incentives.

Practical implications

The study's findings can inform the design and construction of digital research data curation infrastructure components on university campuses that aim to provide access not just to big data but also to trustworthy data. Communities of practice focused on repositories and archives could consider adding FAIR operationalizations, extensions and metrics focused on data quality. The availability of such metrics and associated measurements can help reusers determine whether they can trust and reuse a particular dataset. The findings of this study can help to develop such data quality assessment metrics and intervention strategies in a sound and systematic way.

Originality/value

To the best of the authors' knowledge, this paper is the first data quality theory guided examination of DQA practices in RDRs.

Details

Journal of Documentation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0022-0418


Article
Publication date: 12 April 2024

Carlos Arturo Vallejo Hoyos and Flavia Braga Chinelato

This research delineates the interdependencies between e-service quality (e-SQ), product quality (PQ) and food biosafety measures (FBM) in shaping consumer satisfaction and…

Abstract

Purpose

This research delineates the interdependencies between e-service quality (e-SQ), product quality (PQ) and food biosafety measures (FBM) in shaping consumer satisfaction and loyalty within the online food delivery services (OFDS) landscape. Anchored by the technology acceptance model (TAM) and the theory of planned behavior (TPB), the study integrates these frameworks to examine how perceived service efficiency, reliability, product appeal and biosafety protocols contribute to overall consumer trust and repurchase intentions.

Design/methodology/approach

Surveys were conducted with several hundred online food delivery app users, aged 20 to 64, in major cities in Colombia, providing data for structural equation modeling analysis.
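
The authors do not specify their SEM tooling; purely as a generic, hypothetical sketch of how such a model could be specified and estimated in Python, the semopy package accepts lavaan-style syntax (the indicator names and data below are invented, not the authors' instrument).

```python
# Hypothetical SEM specification and estimation with semopy.
import numpy as np
import pandas as pd
from semopy import Model

desc = """
esq =~ esq1 + esq2 + esq3
pq =~ pq1 + pq2 + pq3
fbm =~ fbm1 + fbm2 + fbm3
sat =~ sat1 + sat2 + sat3
loy =~ loy1 + loy2 + loy3
sat ~ esq + pq + fbm
loy ~ sat + fbm
"""

# Placeholder data; in practice this would be the Likert-scale survey responses.
rng = np.random.default_rng(0)
cols = [f"{latent}{i}" for latent in ("esq", "pq", "fbm", "sat", "loy") for i in (1, 2, 3)]
survey = pd.DataFrame(rng.normal(size=(400, len(cols))), columns=cols)

model = Model(desc)
model.fit(survey)
print(model.inspect())   # loadings, path estimates and standard errors
```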

Findings

The analysis revealed that reliable, responsive service and appealing food presentation significantly influence consumer perceptions of behind-the-scenes safety protocols during delivery. Strict standards around mitigating contamination risks and verifiable handling at each point further engender trust in the platform and intentions to repurchase among users. The data confirm proper food biosafety as pivotal for customer retention.

Practical implications

Quantitatively confirming biosafety’s rising centrality provides an impetus for platforms to integrate and promote integrity, safety and traceability protection as a competitive differentiator.

Originality/value

The study’s originality lies in its comprehensive exploration of OFDS quality attributes and their direct impact on consumer loyalty. In addition, it offers valuable insights with both academic and practical implications for enhancing service delivery and marketing strategies.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 3 July 2020

Fernando F. Padró, Karen Trimmer, Heejin Chang and Jonathan H. Green

The purpose of this study is to investigate the extent to which TQM has influenced the legal system in Australia, an area seldom investigated in the quality or legal literature.

Abstract

Purpose

The purpose of this study is to investigate the extent to which TQM has influenced the legal system in Australia, an area seldom investigated in the quality or legal literature.

Design/methodology/approach

Documentary and policy analysis of legislation, rules and rulemaking documentation, based on a partial application of historical-policy analysis (HPA). Textual analysis was based on Dean and Bowen's (1994) definition of TQM, Vinni's (2007) review of new public management and Swiss's (1992) "reformed TQM" concepts.

Findings

Australia's Tertiary Education Quality and Standards Agency Act of 2011 and supporting legal documents such as Guidance Notes include language reflective of TQM principles, providing evidence that present-day administrative law schemes include TQM practices and tools to undergird procedures of regulatory expectations (sometimes in the form of standards), monitoring and general operations. Oftentimes, it is in the supporting legal documentation that TQM practices are found and operationalized.

Research limitations/implications

This is a proof-of-concept research study to determine the feasibility of identifying TQM concepts within the existing language of legal statutes and supporting regulatory documentation. As such, this study worked out the preliminary research challenges in performing this type of analysis.

Practical implications

Understanding TQM's impact on legal systems broadens the systems perspective of organizations, which do not always factor in the influence government policy has on organizational behaviours and outlooks. More specifically, understanding TQM's influence offers insight into the regulatory requirements imposed on a sector and the normative aspects of regulatory compliance that impact the operations and strategic planning of organizations.

Social implications

The article provides an example of how legal administrative rulemaking influences the operational and strategic activities organizations undertake to remain viable in their business or industrial sector.

Originality/value

There are few research papers or literature reviews pertaining to the subject of TQM concepts embedded in laws and regulations, and most of those date from the 1980s through the early 2000s.

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731


Article
Publication date: 1 February 2024

Antonio Samagaio, Paulo Morais Francisco and Teresa Felício

This study aims to identify the effect of soft skills as a driver of audit quality and their moderating role in the relationship between stress and the propensity for auditors to…

Abstract

Purpose

This study aims to identify the effect of soft skills as a driver of audit quality and their moderating role in the relationship between stress and the propensity for auditors to engage in reduced audit quality practices (RAQP).

Design/methodology/approach

This study uses a sample of 130 auditors whose data were collected through an electronic questionnaire. The results were derived from the partial least squares structural equation modelling (PLS-SEM) method.
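
The study itself uses PLS-SEM; purely as a simplified illustration of the moderation logic being tested, the sketch below fits an ordinary regression with an interaction term (synthetic data, hypothetical variable names) to show how a soft skill such as resilience can dampen the effect of job stressors on RAQP.

```python
# Hypothetical moderation analysis via an OLS interaction term.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 130
stress = rng.normal(size=n)
resilience = rng.normal(size=n)
# Simulated outcome: stress raises RAQP, resilience weakens that effect.
raqp = (0.6 * stress - 0.3 * resilience - 0.4 * stress * resilience
        + rng.normal(scale=0.5, size=n))
df = pd.DataFrame({"raqp": raqp, "stress": stress, "resilience": resilience})

model = smf.ols("raqp ~ stress * resilience", data=df).fit()
# A negative stress:resilience coefficient indicates the moderating effect.
print(model.summary().tables[1])
```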

Findings

The findings show that the propensity to incur RAQP increases when auditors are under job stressors but decreases when individuals have resilience and time management skills. Moreover, the results suggest that the moderating effect of these two soft skills can effectively reduce the auditors’ propensity to engage in dysfunctional actions and judgments in auditing. Emotional intelligence and self-efficacy skills are shown not to affect RAQP.

Originality/value

This study adds to previous research on auditors' drivers for supplying audit quality by providing evidence of auditor characteristics as a critical input to audit quality. The results emphasize the importance of researchers including in their models the moderating effect of soft skills on the relationship between audit quality and determinants associated with audit firms, clients or the regulatory framework.

Details

Review of Accounting and Finance, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1475-7702

