Search results

1 – 10 of over 8000
Article
Publication date: 25 January 2024

Besiki Stvilia and Dong Joon Lee

Abstract

Purpose

This study addresses the need for a theory-guided, rich, descriptive account of research data repositories' (RDRs) understanding of data quality and the structures of their data quality assurance (DQA) activities. Its findings can help develop operational DQA models and best-practice guides and identify opportunities for innovation in DQA activities.

Design/methodology/approach

The study analyzed 122 data repositories' applications for the Core Trustworthy Data Repositories certification, interview transcripts of 32 curators and repository managers, and data curation-related webpages of their repository websites. The combined dataset represented 146 unique RDRs. The study was guided by a theoretical framework comprising activity theory and an information quality evaluation framework.

Findings

The study provided a theory-based examination of the DQA practices of RDRs, summarized as a conceptual model. The authors identified three DQA activities (evaluation, intervention and communication) and their structures, including activity motivations, roles played, mediating tools, and rules and standards. When defining data quality, study participants went beyond the traditional definition and referenced seven facets of ethical and effective information systems in addition to data quality. Furthermore, the participants and RDRs referenced 13 data quality dimensions in their DQA models. The study revealed that DQA activities were prioritized by data value, level of quality, available expertise, cost and funding incentives.

Practical implications

The study's findings can inform the design and construction of digital research data curation infrastructure components on university campuses that aim to provide access not just to big data but also to trustworthy data. Communities of practice focused on repositories and archives could consider adding FAIR operationalizations, extensions and metrics focused on data quality. The availability of such metrics and associated measurements can help reusers determine whether they can trust and reuse a particular dataset. The findings of this study can help develop such data quality assessment metrics and intervention strategies in a sound and systematic way.

Originality/value

To the best of the authors' knowledge, this paper is the first data quality theory-guided examination of DQA practices in RDRs.

Details

Journal of Documentation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 14 July 2023

Hamid Hassani, Azadeh Mohebi, M.J. Ershadi and Ammar Jalalimanesh

Abstract

Purpose

The purpose of this research is to provide a framework in which new data quality dimensions are defined. The new dimensions provide new metrics for the assessment of lecture video indexing. As lecture video indexing involves various steps, the proposed framework, containing the new dimensions, introduces an integrated approach for evaluating an indexing method or algorithm from beginning to end.

Design/methodology/approach

The emphasis in this study is on the fifth step of the design science research methodology (DSRM), known as evaluation. That is, the methods developed in the field of lecture video indexing, as artifacts, should be evaluated from different aspects. In this research, nine dimensions of data quality (accuracy, value-added, relevancy, completeness, appropriate amount of data, conciseness, consistency, interpretability and accessibility) have been redefined based on previous studies and the nominal group technique (NGT).

Findings

The proposed dimensions are implemented as new metrics to evaluate a newly developed lecture video indexing algorithm, LVTIA, and numerical values have been obtained based on the proposed definitions for each dimension. In addition, the new dimensions are compared with each other in various respects. The comparison shows that each dimension used for assessing lecture video indexing reflects a different weakness or strength of an indexing method or algorithm.

Originality/value

Despite the development of different methods for indexing lecture videos, the issue of data quality and its various dimensions has not been studied. Since low-quality data can affect the process of scientific lecture video indexing, data quality in this process requires special attention.

Details

Library Hi Tech, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 31 October 2023

Yangze Liang and Zhao Xu

Abstract

Purpose

Monitoring of the quality of precast concrete (PC) components is crucial for the success of prefabricated construction projects. Currently, quality monitoring of PC components during the construction phase is predominantly done manually, resulting in low efficiency and hindering the progress of intelligent construction. This paper presents an intelligent inspection method for assessing the appearance quality of PC components, utilizing an enhanced you only look once (YOLO) model and multi-source data. The aim of this research is to achieve automated management of the appearance quality of precast components in the prefabricated construction process through digital means.

Design/methodology/approach

The paper begins by establishing an improved YOLO model and an image dataset for evaluating appearance quality. Through object detection in the images, a preliminary and efficient assessment of the precast components' appearance quality is achieved. Moreover, the detection results are mapped onto the point cloud for high-precision quality inspection. In the case of precast components with quality defects, precise quality inspection is conducted by combining the three-dimensional model data obtained from forward design conversion with the captured point cloud data through registration. Additionally, the paper proposes a framework for an automated inspection platform dedicated to assessing appearance quality in prefabricated buildings, encompassing the platform's hardware network.

Findings

The improved YOLO model achieved a best mean average precision (mAP) of 85.02% on the VOC2007 dataset, surpassing the performance of most similar models. After targeted training, the model exhibits excellent recognition capabilities for the four common appearance quality defects. When the detection results are mapped onto the point cloud, the accuracy of quality inspection based on point cloud data and forward design is within 0.1 mm. The appearance quality inspection platform enables feedback on and optimization of quality issues.
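
Mean average precision for object detection is built on intersection over union (IoU) between predicted and ground-truth boxes. A minimal sketch of the IoU computation, assuming axis-aligned boxes in (x1, y1, x2, y2) format (the box format is an assumption for illustration, not taken from the paper):

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

A detection typically counts as a true positive when its IoU with a ground-truth box exceeds a threshold (0.5 in the classic VOC2007 protocol); mAP then averages the per-class precision over recall levels.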

Originality/value

The proposed method in this study enables high-precision, visualized and automated detection of the appearance quality of PC components. It effectively meets the demand for quality inspection of precast components on construction sites of prefabricated buildings, providing technological support for the development of intelligent construction. The design of the appearance quality inspection platform's logic and framework facilitates the integration of the method, laying the foundation for efficient quality management in the future.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Open Access
Article
Publication date: 6 September 2022

Rose Clancy, Ken Bruton, Dominic T.J. O’Sullivan and Aidan J. Cloonan

Abstract

Purpose

Quality management practitioners have yet to seize the potential of digitalisation. Furthermore, there is a lack of tools, such as frameworks, guiding practitioners in the digital transformation of their organisations. The purpose of this study is to provide a framework to guide quality practitioners in implementing digitalisation in their existing practices.

Design/methodology/approach

A review of the literature assessed how quality management and digitalisation have been integrated. Findings from the literature review highlighted the success of integrating Lean manufacturing with digitalisation. A comprehensive list of Lean Six Sigma tools was then reviewed in terms of their effectiveness and relevance for the hybrid digitisation approach to process improvement (HyDAPI) framework.

Findings

The implementation of the proposed HyDAPI framework in an industrial case study led to increased efficiency, reduced waste, standardised work, mistake proofing and the ability to root-cause non-conforming products.

Research limitations/implications

The activities and tools in the HyDAPI framework are not inclusive of all techniques from Lean Six Sigma.

Practical implications

The HyDAPI framework is a flexible guide for quality practitioners to digitalise key information from manufacturing processes. The framework allows organisations to select the appropriate tools as needed. This flexibility is required because of the varying and complex nature of organisational processes and the challenge of adapting to the continually evolving Industry 4.0 landscape.

Originality/value

This research proposes the HyDAPI framework as a flexible and adaptable approach for quality management practitioners to implement digitalisation. This was developed because of the gap in research regarding the lack of procedures guiding organisations in their digital transition to Industry 4.0.

Details

International Journal of Lean Six Sigma, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2040-4166

Article
Publication date: 23 January 2024

Ranjit Roy Ghatak and Jose Arturo Garza-Reyes

Abstract

Purpose

The research explores the shift to Quality 4.0, examining the move towards a data-focussed transformation within organizational frameworks. This transition is characterized by incorporating Industry 4.0 technological innovations into existing quality management frameworks, signifying a significant evolution in quality control systems. Despite the evident advantages, the practical deployment in the Indian manufacturing sector encounters various obstacles. This research is dedicated to a thorough examination of these impediments. It is structured around a set of pivotal research questions: First, it seeks to identify the key barriers that impede the adoption of Quality 4.0. Second, it aims to elucidate these barriers' interrelations and mutual dependencies. Third, the research prioritizes these barriers in terms of their significance to the adoption process. Finally, it contemplates the ramifications of these priorities for the strategic advancement of manufacturing practices and the development of informed policies. By answering these questions, the research provides a detailed understanding of the challenges faced. It offers actionable insights for practitioners and policymakers implementing Quality 4.0 in the Indian manufacturing sector.

Design/methodology/approach

Employing interpretive structural modelling (ISM) and cross-impact matrix multiplication applied to classification (MICMAC), the authors probe the interdependencies amongst fourteen identified barriers inhibiting Quality 4.0 adoption. These barriers were categorized according to their driving power and dependence, providing a richer understanding of the dynamic obstacles within the Technology–Organization–Environment (TOE) framework.
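
In the MICMAC step, a barrier's driving power is the row sum and its dependence the column sum of the final reachability matrix, and the two place each barrier into one of four quadrants (autonomous, dependent, linkage, independent). A minimal sketch, with a hypothetical 3-barrier matrix rather than the paper's 14-barrier data:

```python
def micmac(reachability):
    """Driving power (row sums) and dependence (column sums) of a 0/1 reachability matrix."""
    n = len(reachability)
    driving = [sum(row) for row in reachability]
    dependence = [sum(reachability[i][j] for i in range(n)) for j in range(n)]
    return driving, dependence

def quadrant(drv, dep, n):
    """Classify one barrier into a MICMAC quadrant, splitting the axes at n/2."""
    mid = n / 2
    if drv > mid and dep > mid:
        return "linkage"       # high driving power, high dependence
    if drv > mid:
        return "independent"   # strong driver, little dependence
    if dep > mid:
        return "dependent"     # driven by other barriers
    return "autonomous"        # weakly connected to the system

# Hypothetical reachability matrix for 3 barriers (1 = barrier i reaches barrier j)
R = [[1, 1, 1],
     [0, 1, 1],
     [0, 0, 1]]
driving, dependence = micmac(R)
```

With this toy matrix, barrier 0 drives the other two (an "independent" driver) while barrier 2 is purely "dependent", which mirrors how ISM levels are usually read off such a matrix.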

Findings

The study results highlight the lack of Quality 4.0 standards and Big Data Analytics (BDA) tools as fundamental obstacles to integrating Quality 4.0 within the Indian manufacturing sector. Additionally, the study results contravene dominant academic narratives, suggesting that the cumulative impact of organizational barriers is marginal, contrary to theoretical postulations emphasizing their central significance in Quality 4.0 assimilation.

Practical implications

This research provides concrete strategies for fostering a synergistic relationship between organizations and policymakers, such as developing a collaborative platform for sharing best practices in Quality 4.0 standards. For instance, a joint task force comprising industry leaders and regulatory bodies could be dedicated to formulating and disseminating comprehensive guidelines for Quality 4.0 adoption; this initiative could lead to industry-wide standards that benefit from the pooled expertise of diverse stakeholders. Additionally, the study underscores the necessity for robust, standardized Big Data Analytics tools specifically designed to meet Quality 4.0 criteria, which can be developed through public-private partnerships. Such tools would facilitate the seamless integration of Quality 4.0 processes, offering a direct route to overcoming the barrier of inadequate standards.

Originality/value

This research delineates specific obstacles to Quality 4.0 adoption by applying the TOE framework, detailing how these barriers interact with and influence each other, particularly highlighting the previously overlooked environmental factors. The analysis reveals a critical interdependence between “lack of standards for Quality 4.0” and “lack of standardized BDA tools and solutions,” providing nuanced insights into their conjoined effect on stalling progress in this field. Moreover, the study contributes to the theoretical body of knowledge by mapping out these novel impediments, offering a more comprehensive understanding of the challenges faced in adopting Quality 4.0.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 11 July 2023

Nehal Elshaboury, Eslam Mohammed Abdelkader and Abobakr Al-Sakkaf

Abstract

Purpose

Continuous advancements in modern human society have a negative impact on air quality. Daily transportation, industrial and residential operations churn up dangerous contaminants in our surroundings. Addressing air pollution is critical for human health and ecosystems, particularly in developing countries such as Egypt. Excessive levels of pollutants have been linked to a variety of circulatory, respiratory and nervous illnesses. To this end, the purpose of this research paper is to forecast air pollution concentrations in Egypt based on time series analysis.

Design/methodology/approach

Deep learning models are leveraged to analyze air quality time series in the 6th of October City, Egypt. In this regard, convolutional neural network (CNN), long short-term memory (LSTM) network and multilayer perceptron (MLP) neural network models are used to forecast the overall concentrations of sulfur dioxide (SO2) and particulate matter 10 µm in diameter (PM10). The models are trained and validated using monthly data available from the Egyptian Environmental Affairs Agency between December 2014 and July 2020. Performance measures such as the coefficient of determination, root mean square error and mean absolute error are used to evaluate the outcomes of the models.
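
The three evaluation measures named here have standard definitions; a minimal sketch of how they could be computed for a forecast (the sample numbers in the test are illustrative, not the study's data):

```python
import math

def forecast_scores(actual, predicted):
    """Coefficient of determination (R^2), RMSE and MAE of a forecast."""
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    mae = sum(abs(e) for e in errors) / n                      # mean absolute error
    rmse = math.sqrt(sum(e * e for e in errors) / n)           # root mean square error
    mean_actual = sum(actual) / n
    ss_res = sum(e * e for e in errors)                        # residual sum of squares
    ss_tot = sum((a - mean_actual) ** 2 for a in actual)       # total sum of squares
    r2 = 1 - ss_res / ss_tot                                   # 1 = perfect fit
    return r2, rmse, mae
```

Comparing candidate models (CNN, LSTM, MLP) then reduces to computing these scores on a held-out validation slice of the monthly series and picking the model with the highest R² and lowest errors.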

Findings

The CNN model exhibits the best performance in forecasting pollutant concentrations 3, 6, 9 and 12 months ahead. Finally, using data from December 2014 to July 2021, the CNN model is used to anticipate pollutant concentrations 12 months ahead. In July 2022, the overall concentrations of SO2 and PM10 are expected to reach 10 and 127 µg/m3, respectively. The developed model could aid decision-makers, practitioners and local authorities in planning and implementing interventions to mitigate the pollutants' negative influences on the population and the environment.

Originality/value

This research introduces the development of an efficient time-series model that can project the future concentrations of particulate and gaseous air pollutants in Egypt. It offers the first application of deep learning models to forecasting air quality in Egypt and examines the performance of machine learning approaches and deep learning techniques in forecasting sulfur dioxide and particulate matter concentrations using standard performance metrics.

Details

Construction Innovation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1471-4175

Article
Publication date: 1 January 2024

Diana Oliveira, Helena Alvelos and Maria J. Rosa

Abstract

Purpose

Quality 4.0 is being presented as the new stage of quality development. However, its underlying concept and rationale are still hard to define. To better understand what different authors and studies advocate Quality 4.0 to be, a systematic literature review was undertaken on the topic. This paper presents the results of that review, providing some avenues for further research on quality management.

Design/methodology/approach

The documents for the systematic literature review were searched on the Scopus database, using the search equation: [TITLE-ABS-KEY ("Quality 4.0") OR TITLE-ABS-KEY ("Quality Management" AND ("Industry 4.0" OR "Fourth Industr*" OR i4.0))]. Documents were filtered by language and by type. Of the 367 documents identified, 146 were submitted to exploratory content analysis.

Findings

The analyzed documents essentially provide theoretical discussions on what Quality 4.0 is or should be. Five categories emerged from the content analysis undertaken: Industry 4.0 and the Rise of a New Approach to Quality; Motivations, Readiness Factors and Barriers to a Quality 4.0 Approach; Digital Quality Management Systems; Combination of Quality Tools and Lean Methodologies; and Quality 4.0 Professionals.

Research limitations/implications

It was hard to find studies reporting how quality is actually being managed in organizations that already operate in the Industry 4.0 paradigm. Answers could not be found to questions regarding the actual practices, methodologies and tools used in Quality 4.0 approaches. However, the research undertaken allowed the identification of different ways of conceptualizing and analyzing Quality 4.0 in the literature, opening up avenues for further research on quality management in the Industry 4.0 era.

Originality/value

This paper offers a broad look at how quality management is changing in response to the affirmation of the Industry 4.0 paradigm.

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 13 December 2023

Indrit Troshani and Nick Rowbottom

Abstract

Purpose

Information infrastructures can enable or constrain how companies pursue their visions of sustainability reporting and help address the urgent need to understand how corporate activity affects sustainability outcomes and how socio-ecological challenges affect corporate activity. The paper examines the relationship between sustainability reporting information infrastructures and sustainability reporting practice.

Design/methodology/approach

The paper mobilises a socio-technical perspective and the conception of infrastructure as a socio-technical arrangement of technical artifacts and social routines to engage with a qualitative dataset comprising interview and documentary evidence on the development and construction of sustainability reporting information.

Findings

The results detail how sustainability reporting information infrastructures are used by companies and depict the difficulties faced in generating reliable sustainability data. The findings illustrate the challenges and measures undertaken by entities to embed automation and integration, and to enhance sustainability data quality. The findings provide insight into how infrastructures constrain and support sustainability reporting practices.

Originality/value

The paper explains how infrastructures shape sustainability reporting practices, and how infrastructures are shaped by regulatory demands and costs. Companies have developed “uneven” infrastructures supporting legislative requirements, whilst infrastructures supporting non-legislative sustainability reporting remain underdeveloped. Consequently, infrastructures supporting specific legislation have developed along unitary pathways and are often poorly integrated with infrastructures supporting other sustainability reporting areas. Infrastructures developed around legislative requirements are not necessarily constrained by financial reporting norms and do not preclude specific sustainability reporting visions. On the contrary, due to regulation, infrastructure supporting disclosures that offer an “inside out” perspective on sustainability reporting is often comparatively well developed.

Details

Accounting, Auditing & Accountability Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0951-3574

Article
Publication date: 24 October 2023

Hasan Tutar, Mehmet Şahin and Teymur Sarkhanov

Abstract

Purpose

The lack of a definite standard for determining the sample size in qualitative research leaves the research process to the initiative of the researcher, a situation that undermines the scientific rigour of the research. The primary purpose of this research is to propose a model by questioning the problem of determining the sample size, which is one of the essential issues in qualitative research. A fuzzy logic model is proposed to determine the sample size in qualitative research.

Design/methodology/approach

Considering the structure of the problem addressed in the present study, the proposed fuzzy logic model will benefit and contribute to both the literature and practical applications. In this context, ten variables, namely the scope of research, data quality, participant genuineness, duration of the interview, number of interviews, homogeneity, information strength, drilling ability, triangulation and research design, are used as inputs. A total of 20 different scenarios were created to demonstrate the applicability of the proposed model and how it works.
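
As a rough illustration of how a fuzzy model can map such inputs to a sample size, here is a deliberately simplified zero-order Sugeno-style sketch using only two of the ten inputs; every membership function, rule and output value is hypothetical and not taken from the paper:

```python
def fuzzy_sample_size(info_strength, homogeneity):
    """Hypothetical two-input fuzzy sketch: inputs on a 0-10 scale,
    output is a suggested number of participants."""
    # Shoulder-shaped membership functions for "low" and "high"
    def low(x):
        return max(0.0, min(1.0, (5 - x) / 5))
    def high(x):
        return max(0.0, min(1.0, (x - 5) / 5))

    # Rule 1: strong information AND homogeneous sample -> few participants needed
    w_small = min(high(info_strength), high(homogeneity))
    # Rule 2: weak information OR heterogeneous sample -> many participants needed
    w_large = max(low(info_strength), low(homogeneity))

    if w_small + w_large == 0:
        return 15  # fallback midpoint when no rule fires
    # Zero-order Sugeno defuzzification: weighted average of singleton outputs
    return (w_small * 8 + w_large * 25) / (w_small + w_large)
```

The paper's model works on the same principle (fuzzify inputs, fire rules, defuzzify to a crisp sample size) but over all ten variables and a full rule base.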

Findings

The authors report the results of each scenario and present the resulting sample size values for qualitative studies in Table 4. The results show that the proposed model yields outcomes consistent with the literature and demonstrate that it is possible to develop a model using the laws of fuzzy logic to determine the sample size in qualitative research.

Originality/value

The model developed in this research can contribute to the literature; in any case, it can be argued that the proposed model offers a much more effective and functional approach to determining the sample size than leaving it to the initiative of the researcher.

Details

Qualitative Research Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1443-9883

Article
Publication date: 1 June 2022

Adeyl Khan, Md. Shamim Talukder, Quazi Tafsirul Islam and A.K.M. Najmul Islam

Abstract

Purpose

As businesses keep investing substantial resources in developing business analytics (BA) capabilities, it remains unclear how the performance improvement transpires, as BA affects performance in many different ways. This paper aims to analyze how BA capabilities affect firms' agility through resources such as information quality and innovative capacity, considering industry dynamism and the resulting impact on firm performance.

Design/methodology/approach

This paper tested the research hypothesis using primary data collected from 192 companies operating in Bangladesh. The data were analyzed using partial least squares-based structural equation modeling.

Findings

The results indicate that BA capabilities improve business resources like information quality and innovative capacity, which, in turn, significantly impact a firm's agility. The paper also found that industry dynamism moderates firms' agility and, ultimately, firms' performance.

Practical implications

This work provides insight into the role of business analytics capabilities in increasing organizational agility and performance under the moderating effect of industry dynamism.

Originality/value

The present research is, to the best of the authors' knowledge, among the first studies to consider a firm's agility in exploring the impact of BA on firm performance in a dynamic environment. While previous researchers discussed resources like information quality and innovative capability, the current research theoretically argues that these resources are leverage points for increasing firm agility in a BA context.

Details

VINE Journal of Information and Knowledge Management Systems, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2059-5891
