Search results

1 – 10 of 815
Article
Publication date: 30 April 2024

Shiqing Wu, Jiahai Wang, Haibin Jiang and Weiye Xue

Abstract

Purpose

The purpose of this study is to explore a new assembly process planning and execution mode to realize rapid response, reduce the labor intensity of assembly workers and improve the assembly efficiency and quality.

Design/methodology/approach

Based on the related concepts of digital twin, this paper studies product assembly planning in the digital space, process execution in the physical space and the interaction between the two. The assembly process plan is simulated and verified in the digital space to generate three-dimensional visual assembly process specification documents; the execution of these documents in the physical space is monitored and fed back to revise the assembly process and improve assembly quality.

Findings

Digital twin technology enhances the quality and efficiency of the assembly process planning and execution system.

Originality/value

It provides a new perspective for assembly process planning and execution. The architecture, connections and data acquisition approaches of the digital twin-driven framework proposed in this paper are of important theoretical value. Moreover, a smart assembly workbench is developed and the specific image classification algorithms are presented in detail, which is of industrial application value.

Details

Robotic Intelligence and Automation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2754-6969

Article
Publication date: 22 February 2024

Ranjeet Kumar Singh

Abstract

Purpose

Although the challenges associated with big data are increasing, the question of the most suitable big data analytics (BDA) platform in libraries is always significant. The purpose of this study is to propose a solution to this problem.

Design/methodology/approach

The current study identifies relevant literature and provides a review of big data adoption in libraries. It also presents a step-by-step guide for the development of a BDA platform using the Apache Hadoop Ecosystem. To test the system, an analysis of library big data using Apache Pig, which is a tool from the Apache Hadoop Ecosystem, was performed. It establishes the effectiveness of Apache Hadoop Ecosystem as a powerful BDA solution in libraries.
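
Pig Latin is Apache Pig's own scripting language; as a rough illustration of the kind of group-and-count query such an analysis runs over library data, here is a hypothetical Python sketch (the record layout and field names are assumptions, not the study's actual dataset):

```python
from collections import Counter

# Hypothetical circulation records: (member_id, book_category) pairs,
# standing in for the library big data the study loads into Apache Pig.
records = [
    ("m1", "science"), ("m2", "fiction"), ("m1", "science"),
    ("m3", "history"), ("m2", "science"),
]

# Equivalent of Pig Latin's GROUP records BY category; COUNT(records):
loans_per_category = Counter(category for _, category in records)
print(loans_per_category.most_common())
# -> [('science', 3), ('fiction', 1), ('history', 1)]
```

In Pig the same aggregation would run as a MapReduce job across the Hadoop Distributed File System, which is what makes it viable at library big data scale.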

Findings

It can be inferred from the literature that libraries and librarians have not taken the possibility of big data services in libraries very seriously. Also, the literature suggests that there is no significant effort made to establish any BDA architecture in libraries. This study establishes the Apache Hadoop Ecosystem as a possible solution for delivering BDA services in libraries.

Research limitations/implications

The present work suggests adopting the idea of providing various big data services in a library by developing a BDA platform: for instance, assisting researchers in understanding big data, cleaning and curating big data with skilled and experienced data managers, and providing the infrastructural support to store, process, manage, analyze and visualize big data.

Practical implications

The study concludes that Apache Hadoop's Hadoop Distributed File System and MapReduce components significantly reduce the complexities of big data storage and processing, respectively, and that Apache Pig, using the Pig Latin scripting language, is very efficient at processing big data and responding to queries with a quick response time.

Originality/value

According to the study, significantly fewer efforts have been made to analyze big data from libraries. Furthermore, acceptance of the Apache Hadoop Ecosystem as a solution to big data problems in libraries is not widely discussed in the literature, although Apache Hadoop is regarded as one of the best frameworks for big data handling.

Details

Digital Library Perspectives, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2059-5816

Article
Publication date: 2 May 2024

Neveen Barakat, Liana Hajeir, Sarah Alattal, Zain Hussein and Mahmoud Awad

Abstract

Purpose

The objective of this paper is to develop a condition-based maintenance (CBM) scheme for pneumatic cylinders. The CBM scheme will detect two common types of air leaking failure modes and identify the leaky/faulty cylinder. The successful implementation of the proposed scheme will reduce energy consumption, scrap and rework, and time to repair.

Design/methodology/approach

Effective implementation of maintenance is important to reduce operation cost, improve productivity and enhance quality performance at the same time. Condition-based monitoring is an effective maintenance scheme where maintenance is triggered based on the condition of the equipment, monitored either in real time or at certain intervals. Pneumatic air systems are commonly used in many industries for packaging, sorting and powering air tools, among others. A common failure mode of pneumatic cylinders is air leaks, which are difficult to detect in complex systems with many connections. The proposed method consists of monitoring the stroke speed profile of the piston inside the pneumatic cylinder using Hall effect sensors. Statistical features are extracted from the speed profiles and used to develop a fault detection machine learning model. The proposed method is demonstrated using a real-life case of tea packaging machines.

Findings

Based on the limited data collected, the ensemble machine learning algorithm achieved 88.4% accuracy. The algorithm can detect failures as soon as they occur based on a majority-vote rule over three machine learning models.
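
As a minimal sketch of the majority-vote rule described above (the three underlying classifiers and their features are not specified here, so the inputs are illustrative):

```python
# Three classifiers each flag a cylinder stroke as faulty (1) or
# healthy (0); the ensemble reports a fault when at least two agree.
# This illustrates the voting rule only, not the authors' exact models.
def majority_vote(predictions):
    return 1 if sum(predictions) >= 2 else 0

assert majority_vote([1, 1, 0]) == 1  # two of three detect a leak
assert majority_vote([0, 1, 0]) == 0  # a single vote is not enough
```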

Practical implications

Early air leak detection will improve quality of packaged tea bags and provide annual savings due to time to repair and energy waste reduction. The average annual estimated savings due to the implementation of the new CBM method is $229,200 with a payback period of less than two years.

Originality/value

To the best of the authors’ knowledge, this paper is the first to propose CBM for pneumatic system air leaks using piston speed. Most, if not all, current detection methods rely on expensive equipment such as infrared or ultrasonic sensors. This paper also contributes to the research gap on the economic justification of using CBM.

Details

Journal of Quality in Maintenance Engineering, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1355-2511

Article
Publication date: 27 March 2024

Temesgen Agazhie and Shalemu Sharew Hailemariam

Abstract

Purpose

This study aims to quantify and prioritize the main causes of lean wastes and to apply reduction methods by employing better waste cause identification methodologies.

Design/methodology/approach

We employed the fuzzy technique for order preference by similarity to ideal solution (FTOPSIS), the fuzzy analytical hierarchy process (FAHP) and failure mode effect analysis (FMEA) to determine the causes of defects. To determine the current defect cause identification procedures, time studies, checklists and process flow charts were employed. The study focuses on the sewing department of a clothing industry in Addis Ababa, Ethiopia.
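
For readers unfamiliar with FMEA, the conventional (crisp, non-fuzzy) risk priority number it builds on can be sketched as follows; the causes and 1–10 ratings here are invented for illustration, and the paper's fuzzy variant replaces these crisp ratings with fuzzy numbers before ranking:

```python
# Conventional FMEA: risk priority number RPN = severity x occurrence
# x detectability, then rank causes by RPN. Ratings are illustrative.
causes = {
    "inadequate operator training": (8, 7, 6),
    "machine misalignment": (6, 5, 4),
    "poor fabric quality": (7, 4, 5),
}
rpn = {cause: s * o * d for cause, (s, o, d) in causes.items()}
ranked = sorted(rpn, key=rpn.get, reverse=True)
print(ranked[0])  # -> inadequate operator training (highest-risk cause)
```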

Findings

These techniques outperform conventional techniques and offer a better solution for challenging decision-making situations. The FMEA criteria of each lean waste, namely severity, occurrence and detectability, were examined. A pairwise comparison revealed that defect has a larger effect than the other lean wastes. Defects were mostly caused by inadequate operator training. To minimize lean waste, prioritizing its causes is crucial.

Research limitations/implications

The research focuses on a single case company, so the results cannot be generalized to the whole industry.

Practical implications

The study used quantitative approaches to quantify and prioritize the causes of lean waste in the garment industry and provides insight for industrialists to focus on the waste causes to improve their quality performance.

Originality/value

The methodology of integrating FMEA with FAHP and FTOPSIS is the new contribution, offering a better solution for the decision variables by considering the severity, occurrence and detectability of the causes of wastes. The data collection approach was based on experts’ focus group discussions rating the main causes of defects, which provides optimal values for defect cause prioritization.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 24 April 2024

Mohsen Jami, Hamidreza Izadbakhsh and Alireza Arshadi Khamseh

Abstract

Purpose

This study aims to minimize the cost and time of blood delivery in the whole blood supply chain network (BSCN) in disaster conditions. In other words, integrating all strategic, tactical and operational decisions of three levels of blood collection, processing and distribution leads to satisfying the demand at the right time.

Design/methodology/approach

This paper proposes an integrated BSCN in disaster conditions that considers four categories of facilities, including temporary blood collection centers, field hospitals, main blood processing centers and medical centers, to optimize demand response time appropriately. The proposed model determines the locations of all permanent and emergency facilities at three levels: blood collection, processing and distribution. Other essential decisions, including multipurpose facilities, emergency transportation, inventory and allocation, are also covered by the model. The LP metric method is applied to solve the proposed bi-objective mathematical model for the BSCN.
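
The LP metric method mentioned above scalarizes the two objectives by measuring the weighted, normalized deviation of a candidate solution from each objective's individual optimum; a minimal sketch, with illustrative weights and p = 1 (the paper's exact weights and normalization may differ), might look like:

```python
def lp_metric(cost, time, cost_star, time_star, w_cost=0.5, w_time=0.5, p=1):
    """L_p distance of a solution from the ideal point (cost*, time*),
    with each deviation normalized by the ideal value. Weights and p
    are illustrative assumptions."""
    d_cost = (cost - cost_star) / cost_star
    d_time = (time - time_star) / time_star
    return (w_cost * d_cost**p + w_time * d_time**p) ** (1 / p)

# A candidate network design is better when its LP-metric value is lower.
print(lp_metric(cost=120.0, time=6.0, cost_star=100.0, time_star=5.0))  # -> 0.2
```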

Findings

The findings show that the model is efficient in reducing the total cost and blood delivery time, which also results in lower carbon emissions from blood transport.

Originality/value

The researchers proposed an integrated BSCN in disaster conditions to minimize the cost and time of blood delivery. They considered multipurpose capabilities for facilities (e.g. field hospitals are responsible for the three purposes of blood collection, processing and distribution), and a new model was presented by locating permanent and emergency facilities at the three levels of blood collection, processing and distribution while accounting for support facilities, emergency transportation and route traffic with its associated pollution.

Details

Journal of Modelling in Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 4 March 2024

Zeyu Xing, Tachia Chin, Jing Huang, Mirko Perano and Valerio Temperini

Abstract

Purpose

The ongoing paradigm shift in the energy sector holds paramount implications for the realization of the sustainable development goals, encompassing critical domains such as resource optimization, environmental stewardship and workforce opportunities. Concurrently, this transformative trajectory within the power sector possesses a dual-edged nature; it may ameliorate certain challenges while accentuating others. In light of the burgeoning research stream on open innovation, this study aims to examine the intricate dynamics of knowledge-based industry-university-research networking, with an overarching objective to elucidate and calibrate the equilibrium of ambidextrous innovation within power systems.

Design/methodology/approach

The authors scrutinize the role of different innovation organizations in three innovation models, ambidextrous, exploitative and exploratory, and apply a multi-objective decision analysis method, entropy-weight TOPSIS. The research was conducted within the sphere of the power industry, and the authors mined data from the widely used PatSnap database.
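
A minimal entropy-weight TOPSIS sketch, assuming a small decision matrix of benefit criteria (the matrix values are illustrative, not data from the PatSnap study):

```python
import numpy as np

# Rows are innovation organizations, columns are benefit criteria
# (e.g. knowledge-search breadth, tie strength). Illustrative values.
X = np.array([[0.8, 0.6, 0.9],
              [0.5, 0.9, 0.4],
              [0.7, 0.7, 0.7]], dtype=float)

P = X / X.sum(axis=0)                              # column proportions
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))  # entropy per criterion
w = (1 - E) / (1 - E).sum()                        # more dispersion -> more weight

V = w * X / np.linalg.norm(X, axis=0)              # weighted normalized matrix
best, worst = V.max(axis=0), V.min(axis=0)
d_best = np.linalg.norm(V - best, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)
closeness = d_worst / (d_best + d_worst)           # higher = closer to ideal
print(closeness.argmax())                          # top-ranked organization
```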

Findings

Results show that the breadth of knowledge search and the strength of an organization’s direct relationships are crucial for ambidextrous innovation, with research institutions having the highest impact. In contrast, for exploitative innovation, depth of knowledge search, the number of R&D patents and the number of innovative products are paramount, with universities playing the most significant role. For exploratory innovation, the depth of knowledge search and the quality of two-mode network relations are vital, with research institutions yielding the best effect. Regional analysis reveals Beijing as the primary hub for ambidextrous and exploratory innovation organizations, while Jiangsu leads for exploitative innovation.

Practical implications

The study offers valuable implications to cope with the dynamic state of ambidextrous innovation performance of the entire power system. In light of the findings, the dynamic state of ambidextrous innovation performance within the power system can be adeptly managed. By emphasizing a balance between exploratory and exploitative strategies, stakeholders are better positioned to respond to evolving challenges and opportunities. Thus, the study offers pivotal guidance to ensure sustained adaptability and growth in the power sector’s innovation landscape.

Originality/value

The primary originality is to extend and refine the theoretical understanding of ambidextrous innovation within power systems. By integrating several theoretical frameworks, including social network theory, knowledge-based theory and resource-based theory, the authors enrich the theoretical landscape of power system ambidextrous innovation. Also, this inclusive examination of two-mode network structures, including the interplay between knowledge and cooperation networks, unveils the intricate interdependencies between these networks and the ambidextrous innovation of power systems. This approach significantly widens the theoretical parameters of innovation network research.

Details

Journal of Knowledge Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1367-3270

Open Access
Article
Publication date: 29 February 2024

Guanchen Liu, Dongdong Xu, Zifu Shen, Hongjie Xu and Liang Ding

Abstract

Purpose

As an advanced manufacturing method, additive manufacturing (AM) technology provides new possibilities for the efficient production and design of parts. However, with the continuous expansion of AM materials applications, subtractive machining has become one of the necessary steps to improve the accuracy and performance of parts. In this paper, the machining of AM materials is discussed in depth, along with the surface integrity problems it causes.

Design/methodology/approach

Firstly, we listed and analyzed the characterization parameters of metal surface integrity and their influence on part performance, and then introduced the application of integrated additive/subtractive machining of metals and the influence of different processing forms on the surface integrity of parts. The surface of the trial-cut material was inspected and analyzed, and surfaces produced by integrated additive/subtractive machining were compared with those produced by purely subtractive machining, from which the corresponding conclusions were drawn.

Findings

In this process, we also found some surface integrity problems, such as tool marks, residual stress and thermal effects, which may have a potential negative impact on the performance of the final parts. In response, other integrated additive/subtractive processing technologies can be tried, several such technologies can be combined, or more efficient AM technology can be explored to improve processing efficiency. Production process optimization measures can also be adopted to reduce the cost of integrated additive/subtractive machining.

Originality/value

With the gradually increasing requirements for part surface quality in production and the deepening implementation of sustainable manufacturing, demand for integrated additive/subtractive machining of metals is likely to continue to grow. By deeply understanding and studying the material removal and surface integrity problems of AM materials, we can better meet the challenges of the manufacturing process and improve the quality and performance of parts. This research is important for advancing manufacturing technology and achieving success in practical applications.

Details

Journal of Intelligent Manufacturing and Special Equipment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2633-6596

Article
Publication date: 2 April 2024

Paulo Alberto Sampaio Santos, Breno Cortez and Michele Tereza Marques Carvalho

Abstract

Purpose

The present study aimed to integrate Geographic Information Systems (GIS) and Building Information Modeling (BIM) in conjunction with multicriteria decision-making (MCDM) to enhance infrastructure investment planning.

Design/methodology/approach

This analysis combines GIS databases with BIM simulations for a novel highway project. Around 150 potential alternatives were simulated and narrowed to the 25 most effective routes; 3 options then underwent in-depth analysis using the PROMETHEE method for decision-making, based on environmental, cost and safety criteria, allowing for comprehensive cross-perspective comparisons.
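
A minimal PROMETHEE II sketch for ranking three shortlisted options (the scores, weights and the usual-criterion preference function are assumptions for illustration, not the study's data):

```python
import numpy as np

# Rows are candidate routes, columns are benefit criteria
# (environmental, cost, safety), all scaled so higher is better.
scores = np.array([[0.7, 0.5, 0.9],   # route A
                   [0.6, 0.8, 0.6],   # route B
                   [0.9, 0.4, 0.7]])  # route C
weights = np.array([0.4, 0.3, 0.3])

n = len(scores)
# Usual criterion: preference is 1 when strictly better, else 0.
pref = (scores[:, None, :] > scores[None, :, :]).astype(float)
pi = (pref * weights).sum(axis=2)                     # preference index pi(a, b)
phi = (pi.sum(axis=1) - pi.T.sum(axis=1)) / (n - 1)   # net outranking flow
print(phi.argmax())  # -> 0 (route A ranks first under these assumptions)
```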

Findings

The comprehensive framework proposed was validated through a case study, demonstrating its adaptability with customizable parameters. It aids decision-making, cost estimation, environmental impact analysis and outcome prediction. Given these critical factors, this study holds the potential to advance assessment and planning techniques for railways, power lines, gas and water infrastructure.

Research limitations/implications

The study acknowledges limitations in GIS data quality, particularly in underdeveloped areas or regions with limited technology access. It also overlooks other pertinent variables, like social, economic, political and cultural issues. Thus, conclusions from these simulations may not entirely represent reality or diverse potential scenarios.

Practical implications

The proposed method automates decision-making, reducing subjectivity, aids in selecting effective alternatives and considers environmental criteria to mitigate negative impacts. Additionally, it minimizes costs and risks while demonstrating adaptability for assessing diverse infrastructures.

Originality/value

By integrating GIS and BIM data to support a MCDM workflow, this study proposes to fill the existing research gap in decision-making prioritization and mitigate subjective biases.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 16 April 2024

Shilong Zhang, Changyong Liu, Kailun Feng, Chunlai Xia, Yuyin Wang and Qinghe Wang

Abstract

Purpose

The swivel construction method is a specially designed process used to build bridges that cross rivers, valleys, railroads and other obstacles. To carry out this construction method safely, real-time monitoring of the bridge rotation process is required to ensure a smooth swivel operation without collisions. However, the traditional means of monitoring using Electronic Total Station tools cannot realize real-time monitoring, and monitoring using motion sensors or GPS is cumbersome to use.

Design/methodology/approach

This study proposes a monitoring method based on a series of computer vision (CV) technologies, which can monitor the rotation angle, velocity and inclination angle of the swivel construction in real time. First, three proposed CV algorithms were developed in a laboratory environment. Experimental tests were carried out on a bridge scale model to select the best-performing algorithm for rotation, velocity and inclination monitoring, respectively, as the final monitoring method. The selected method was then implemented to monitor an actual bridge during its swivel construction to verify its applicability.
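
One plausible ingredient of such a CV pipeline is recovering the rotation angle from the image coordinates of a tracked marker on the rotating span. The sketch below assumes marker detection has already been done and is an illustration, not the paper's exact algorithm:

```python
import math

# Given the pivot point and a marker's image coordinates before and
# during rotation, the rotation angle follows from atan2. Angular
# velocity then follows from successive frames and the frame rate.
def rotation_angle(pivot, marker_start, marker_now):
    a0 = math.atan2(marker_start[1] - pivot[1], marker_start[0] - pivot[0])
    a1 = math.atan2(marker_now[1] - pivot[1], marker_now[0] - pivot[0])
    return math.degrees(a1 - a0)

print(rotation_angle((0, 0), (100, 0), (0, 100)))  # -> 90.0
```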

Findings

In the laboratory study, the monitoring data measured with the selected algorithms were compared with those measured by an Electronic Total Station; the errors in rotation angle, velocity and inclination angle were 0.040%, 0.040% and −0.454%, respectively, thus validating the accuracy of the proposed method. In the pilot application, the method was shown to be feasible in a real construction setting.

Originality/value

The optimal algorithms for monitoring bridge swivel construction were identified in a well-controlled laboratory, and the proposed method was verified on an actual project. The proposed CV method is complementary to Electronic Total Station tools, motion sensors and GPS for safety monitoring of bridge swivel construction, and it offers a possible approach that requires no data-driven model training. Its principal advantages are that it provides real-time monitoring and is easy to deploy in real construction applications.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 28 March 2024

Elisa Gonzalez Santacruz, David Romero, Julieta Noguez and Thorsten Wuest

Abstract

Purpose

This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated Quality 4.0 framework (IQ4.0F) for quality improvement (QI) based on Six Sigma and machine learning (ML) techniques towards ZDM. The IQ4.0F aims to contribute to the advancement of defect prediction approaches in diverse manufacturing processes. Furthermore, the work enables a comprehensive analysis of process variables influencing product quality, with emphasis on the use of supervised and unsupervised ML techniques in the “Analyze” stage of Six Sigma’s DMAIC (Define, Measure, Analyze, Improve and Control) cycle.

Design/methodology/approach

The research methodology employed a systematic literature review (SLR) based on PRISMA guidelines to develop the integrated framework, followed by a real industrial case study set in the automotive industry to fulfill the objectives of verifying and validating the proposed IQ4.0F with primary data.

Findings

This research work demonstrates the value of a “stepwise framework” to facilitate a shift from conventional quality management systems (QMSs) to QMSs 4.0. It uses the IDEF0 modeling methodology and Six Sigma’s DMAIC cycle to structure the steps to be followed to adopt the Quality 4.0 paradigm for QI. It also proves the worth of integrating Six Sigma and ML techniques into the “Analyze” stage of the DMAIC cycle for improving defect prediction in manufacturing processes and supporting problem-solving activities for quality managers.

Originality/value

This research paper introduces a first-of-its-kind Quality 4.0 framework – the IQ4.0F. Each step of the IQ4.0F was verified and validated in an original industrial case study set in the automotive industry. It is the first Quality 4.0 framework, according to the SLR conducted, to utilize the principal component analysis technique as a substitute for “Screening Design” in the Design of Experiments phase and K-means clustering technique for multivariable analysis, identifying process parameters that significantly impact product quality. The proposed IQ4.0F not only empowers decision-makers with the knowledge to launch a Quality 4.0 initiative but also provides quality managers with a systematic problem-solving methodology for quality improvement.
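
The two techniques named above can be sketched on synthetic data. This illustrates PCA-based variable screening and K-means grouping in general, not the study's automotive dataset or exact procedure:

```python
import numpy as np

# Synthetic process data: 60 runs of 5 process variables, with the
# first variable deliberately correlated with the second.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
X[:, 0] = X[:, 1] * 3 + rng.normal(scale=0.1, size=60)

# PCA via SVD on the centered matrix: explained variance per component
# shows how few components capture most process variation (screening).
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()

# Two-cluster K-means (a few fixed iterations, first rows as seeds).
k = 2
centers = Xc[:k].copy()
for _ in range(20):
    labels = np.argmin(((Xc[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([Xc[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
print(explained[0], np.bincount(labels, minlength=k))
```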

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731
