Search results

1 – 10 of 97
Article
Publication date: 27 June 2023

Sandeep Kumar, Vikas Swarnakar, Rakesh Kumar Phanden, Dinesh Khanduja and Ayon Chakraborty

Abstract

Purpose

The purpose of this study is to present a systematic literature review (SLR) on Lean Six Sigma (LSS) by exploring the state of the art on the growth of LSS literature within the manufacturing sector, the critical factors for implementing LSS, and the role of LSS in the manufacturing sector from implementation, sustainability and Industry 4.0 viewpoints, while highlighting the research gaps.

Design/methodology/approach

An SLR of 2,876 published articles extracted from the Scopus, WoS, Emerald Insight, IEEE Xplore, Taylor & Francis, Springer and Inderscience databases was carried out following a systematic review protocol. In total, 154 articles published in different journals over the past 10 years were selected for quantitative and qualitative analysis, which revealed a number of research gaps.

Findings

The findings of the SLR revealed the growth of literature on LSS within the manufacturing sector. The review also highlighted the most cited critical success factors, critical failure factors, performance indicators and associated tools and techniques applied during LSS implementation. The review also examined studies relating LSS to sustainability and to Industry 4.0.

Practical implications

The findings of this SLR can help senior managers, practitioners and researchers to understand the current developments and future requirements to adopt LSS in manufacturing sectors from sustainability and Industry 4.0 viewpoints.

Originality/value

Academic publications on the role of LSS in various research streams are sparse, and to the best of the authors’ knowledge, this paper is one of the first SLRs to explore current developments and future requirements for implementing LSS from sustainability and Industry 4.0 perspectives.

Details

The TQM Journal, vol. 36 no. 7
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 17 September 2024

Søren Skjold Andersen, Mahesh C. Gupta and Diego Augusto de Jesus Pacheco

Abstract

Purpose

Charles Sanders Peirce (1839–1914), recognized as the father of philosophical pragmatism, has been described as a philosopher’s philosopher. Eliyahu Moshe Goldratt (1947–2011), considered the father of the management philosophy theory of constraints (TOC), has been described as being, first and foremost, a philosopher. The TOC body of knowledge is mainly preserved as concrete methodologies used in the management discipline. By examining the foundational elements of synechism and the TOC, the purpose of this study is to investigate the intellectual connections between the arguments and legacies of Goldratt and Peirce. Although this connection is worthy of much further investigation, the research emphasizes the possible implications from a management philosophy perspective.

Design/methodology/approach

Based on a “review with an attitude,” the authors first examined the foundations of Goldratt’s TOC through the lens of Peirce’s synechism. Next, the authors examined how the study of Peirce, combined with a selection of contemporary research in the management and organizational studies domain, could point out a direction toward completing Goldratt’s unfinished intellectual work to establish a unified science of management while addressing some of the current gaps in the TOC body of knowledge.

Findings

Major findings show that synechism’s growth may extend TOC knowledge, improving managerial practice in organizations. Findings on the convergent ideas of both also reveal that Goldratt valued all synechism categories, emphasizing the importance of not overlooking Firstness. Furthermore, the study analyzes the abductive inference demonstrated in the two use cases, introducing an additional metaphor to the management of organizational systems inspired by Peirce’s philosophical concepts. The research concludes that incorporating TOC and synechism principles can enhance management and organizational practices and enrich management philosophy and theories.

Research limitations/implications

This pioneering research opens promising opportunities to draw parallels between Peirce and Goldratt. Interdisciplinary collaboration will enhance the rigor and validity of integrating synechism and TOC. Experts in organizational behavior, systems theory and complexity science can provide valuable insights into this debate, while practitioners and consultants could help identify barriers and opportunities for integrating synechistic principles.

Practical implications

The study proposes a novel abductive approach using Peirce’s cable metaphor as an initial framework to build a unified science of management based on evolutionary stages: TOC, common sense and connectedness.

Originality/value

This research reinforces the argument that contemporary management practices need philosophical thinking. The authors argue that re-evaluating the foundations of management thought enriches the decision-making process in organizations and the understanding of contemporary theories in management and organizational studies.

Details

Journal of Management History, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1751-1348

Article
Publication date: 25 June 2024

Amruta Chandrakant Amune and Himangi Pande

Abstract

Purpose

Security is the major issue that motivates scholars to devise security solutions for wireless sensor networks (WSNs), notwithstanding their advantages such as strong compatibility, flexible communication and low cost. However, a few challenges remain, such as the complexity of choosing the expected cluster, communication overhead, routing selection and the energy level, all of which affect the entire communication. The ultimate aim of the research is to secure data communication in WSN using Prairie Indica Optimization.

Design/methodology/approach

Initially, the network simulator sets up clusters of sensor nodes. The simulator then selects the cluster head and optimizes routing using the proposed Prairie Indica Optimization algorithm to find the most efficient communication paths. Sensor nodes collect data, which is securely transmitted to the base station. By applying Prairie Indica Optimization to WSNs, the approach optimizes key aspects of data communication, including secure routing and encryption, to protect sensitive information from potential threats.
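The abstract does not detail the Prairie Indica Optimization algorithm itself; as a generic, purely illustrative sketch of the cluster-head selection step it describes, here is a fitness-based choice that scores candidate nodes by residual energy and distance to the base station (all node data and weights are hypothetical):

```python
import math

# hypothetical candidate nodes: (node_id, residual_energy_J, x, y); base station at (0, 0)
nodes = [(1, 0.9, 10, 20), (2, 0.4, 5, 5), (3, 0.7, 30, 4)]

def fitness(node, w_energy=0.7, w_dist=0.3):
    """Score a candidate cluster head: more residual energy is better,
    and a shorter distance to the base station is better."""
    _, energy, x, y = node
    dist = math.hypot(x, y)
    return w_energy * energy - w_dist * dist / 100.0

cluster_head = max(nodes, key=fitness)
print(cluster_head[0])  # node 1: its high residual energy outweighs its distance
```

In a metaheuristic such as the one the paper proposes, a fitness of this kind would be evaluated over a population of candidate solutions rather than picked greedily.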

Findings

The Prairie Indica Optimization, as proposed, achieves impressive results for networks comprising 50 nodes, with delay, energy and throughput values of 77.39 ms, 21.68 J and 22.59 bps. In the case of 100-node networks, the achieved values are 80.95 ms, 27.74 J and 22.03 bps, significantly surpassing the performance of current techniques. These outcomes underscore the substantial improvements brought about by the Prairie Indica Optimization in enhancing WSN data communication.

Originality/value

In this research, the Prairie Indica Optimization is designed to enhance the security of data communication within WSN.

Details

International Journal of Intelligent Unmanned Systems, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2049-6427

Open Access
Article
Publication date: 14 May 2024

Yuyu Sun, Yuchen Zhang and Zhiguo Zhao

Abstract

Purpose

Considering the impact of the Free Trade Zone (FTZ) policy on forecasting the port cargo throughput, this paper constructs a fractional grey multivariate forecasting model to improve the prediction accuracy of port cargo throughput and realize the coordinated development of FTZ policymaking and port construction.

Design/methodology/approach

Considering the effects of data randomization, this paper proposes a novel self-adaptive grey multivariate prediction model, namely FDCGM(1,N). First, fractional-order accumulative generation operation (AGO) is introduced, which integrates the policy impact effect. Second, the heuristic grey wolf optimization (GWO) algorithm is used to determine the optimal nonlinear parameters. Finally, the novel model is applied to port scale simulation and forecasting in Tianjin and Fujian, where FTZs are situated, and compared with three other grey models and two machine learning models.
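The fractional-order AGO that underpins models such as FDCGM(1,N) is a standard grey-modeling operation; a minimal sketch (not the authors' implementation), which reduces to the ordinary first-order accumulation when r = 1:

```python
from math import gamma

def frac_ago(x, r):
    """r-order fractional accumulative generation operation (AGO):
    x_r[k] = sum_{i<=k} C(k - i + r - 1, k - i) * x[i], with the
    generalized binomial coefficient written via the gamma function."""
    def coeff(n):
        return gamma(n + r) / (gamma(r) * gamma(n + 1))
    return [sum(coeff(k - i) * x[i] for i in range(k + 1)) for k in range(len(x))]

throughput = [10.0, 12.0, 11.0, 13.0]  # hypothetical cargo series
print(frac_ago(throughput, 1.0))  # ordinary AGO: [10.0, 22.0, 33.0, 46.0]
print(frac_ago(throughput, 0.5))  # fractional order damps the older terms
```

Tuning the order r (here done by hand, in the paper by GWO) controls how strongly older observations accumulate into the transformed series.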

Findings

In the Tianjin and Fujian cases, the new model outperforms the other comparison models, with the lowest mean absolute percentage error (MAPE) values of 6.07% and 4.16% in the simulation phase, and 6.70% and 1.63% in the forecasting phase, respectively. The comparative analysis shows that after the establishment of the FTZs, Tianjin’s port cargo throughput has shown a slow growth trend, while Fujian’s port cargo throughput has exhibited rapid growth. Further, the port cargo throughput of Tianjin and Fujian will maintain a growing trend over the next four years.
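The MAPE figures quoted above follow the usual definition; a minimal sketch with hypothetical series for illustration:

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# hypothetical throughput series, for illustration only
actual = [100.0, 110.0, 120.0]
predicted = [95.0, 112.0, 118.0]
print(round(mape(actual, predicted), 2))  # 2.83
```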

Practical implications

The new multivariable grey model can effectively reduce the impact of data randomness on forecasting. Meanwhile, the FTZ policy’s effect on port development is regionally heterogeneous, and governments can take different measures to improve the development of ports.

Originality/value

Against the background of FTZ policy, the new multivariable model can be used to achieve accurate prediction, which is conducive to determining the direction of port development and planning the port layout.

Details

Marine Economics and Management, vol. 7 no. 1
Type: Research Article
ISSN: 2516-158X

Article
Publication date: 11 October 2023

Radha Subramanyam, Y. Adline Jancy and P. Nagabushanam

Abstract

Purpose

A cross-layer approach in the media access control (MAC) layer can address interference and jamming problems. Hybrid distributed MAC can be used for simultaneous voice and data transmissions in wireless sensor network (WSN) and Internet of Things (IoT) applications. Choosing the correct objective function in Nash equilibrium for game theory will address the fairness index and resource allocation to the nodes. Game theory optimization for distributed MAC may increase network performance. The purpose of this study is to survey the various operations that can be carried out using distributive and adaptive MAC protocols. Hill-climbing distributed MAC does not need a central coordination system, and location-based transmission with neighbor awareness reduces transmission power.

Design/methodology/approach

Distributed MAC in wireless networks is used to address challenges such as extending network lifetime, reducing energy consumption and improving delay performance. In this paper, a survey is made of various cooperative communications in MAC protocols, optimization techniques used to improve MAC performance in various applications and mathematical approaches involved in game theory optimization for MAC protocols.

Findings

Spatial channel reuse improves performance by 3%–29%, and multichannel operation improves throughput by 8% with distributed MAC protocols. Nash equilibrium is found to perform well, focusing on the energy utility of individual players in the network. Fuzzy logic improves channel selection by 17% and secondary-user involvement by 8%. A cross-layer approach in the MAC layer addresses interference and jamming problems. Hybrid distributed MAC can be used for simultaneous voice and data transmissions in WSN and IoT applications. Cross-layer cooperative communication gives energy savings of 27% and reduces hop distance by 4.7%. Choosing the correct objective function in Nash equilibrium for game theory addresses the fairness index and resource allocation to the nodes.
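The role of the objective function in a Nash equilibrium can be made concrete with a toy two-node medium-access game (an illustration, not drawn from the surveyed papers): simultaneous transmissions collide, and a pure-strategy equilibrium is any outcome where neither node gains by unilaterally deviating:

```python
# payoffs[(a, b)] = (node1_payoff, node2_payoff); T = transmit, W = wait
# a collision (T, T) costs both nodes energy; (W, W) wastes the slot
payoffs = {
    ("T", "T"): (-1, -1),
    ("T", "W"): (2, 0),
    ("W", "T"): (0, 2),
    ("W", "W"): (0, 0),
}

def is_nash(a, b):
    """Neither node can improve its payoff by unilaterally switching action."""
    p1, p2 = payoffs[(a, b)]
    best1 = max(payoffs[(x, b)][0] for x in "TW")
    best2 = max(payoffs[(a, y)][1] for y in "TW")
    return p1 == best1 and p2 == best2

equilibria = [(a, b) for a in "TW" for b in "TW" if is_nash(a, b)]
print(equilibria)  # [('T', 'W'), ('W', 'T')]: exactly one node transmits
```

Changing the payoff (objective) function changes which access patterns are stable, which is why its choice governs fairness and resource allocation among nodes.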

Research limitations/implications

Other optimization techniques can be applied to WSNs to analyze their performance.

Practical implications

Game theory optimization for distributed MAC may increase network performance. Optimal cuckoo search improves throughput by 90% and reduces delay by 91%. Stochastic approaches detect 80% of attacks even with 90% malicious nodes.

Social implications

Channel allocations made in a centralized or static manner must be based on traffic demands, whether the traffic is dynamic or fluctuating. Usage of multimedia devices has also increased, which in turn has increased the demand for high throughput. Co-channel interference keeps changing, and mitigations occur, which can be handled by proper resource allocation. Network survival depends on the efficient usage of valid paths in the network, avoiding transmission failures and using time slots effectively.

Originality/value

A literature survey is carried out to find the methods that give better performance.

Details

International Journal of Pervasive Computing and Communications, vol. 20 no. 2
Type: Research Article
ISSN: 1742-7371

Article
Publication date: 24 May 2024

Mingze Yuan, Lin Ma, Ting Qu, Matthias Thürer and George Q. Huang

Abstract

Purpose

Workload contribution calculation approaches in the existing literature overestimate or underestimate indirect workload, which increases workload fluctuation and degrades shop floor throughput performance. This study optimizes a Corrected Aggregate Workload (CAW) approach to control the workload contribution of workstations and Work-In-Process (WIP) levels, thereby improving shop floor throughput performance.

Design/methodology/approach

This study adopts a simulation experiment built in SimPy, with the following experimental factors: (1) two workload contribution methods (the CAW method and the position-corrected aggregate workload [PCAW] method); (2) two release methods (LUMS COR release and immediate release); (3) eleven workload norms for LUMS COR release (from 7 to 15 time units) and an infinite workload norm for immediate release; and (4) two dispatching rules (First Come First Served [FCFS] and Operation Due Date [ODD]). Each scenario is replicated 100 times, and for each replication data are collected for 10,000 time units, with the warm-up period set to 3,000 time units.
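Norm-based order release of the LUMS COR kind can be sketched in a few lines; the example below is illustrative only (plain Python rather than the authors' SimPy model, with hypothetical jobs and norm), dividing each processing time by its routing position in the spirit of corrected aggregate workload:

```python
# workload norm and the load already released to each workstation (all hypothetical)
norm = 9.0
released_load = {"ws1": 6.0, "ws2": 3.0}

def try_release(job):
    """Release a job (a list of (workstation, processing_time) pairs in routing
    order) only if no workstation would exceed the norm; each contribution is
    the processing time divided by its routing position (CAW-style correction)."""
    contrib = [(ws, t / pos) for pos, (ws, t) in enumerate(job, start=1)]
    if any(released_load[ws] + c > norm for ws, c in contrib):
        return False
    for ws, c in contrib:
        released_load[ws] += c
    return True

print(try_release([("ws1", 2.0), ("ws2", 1.0)]))  # True: fits under the norm
print(try_release([("ws1", 2.0)]))                # False: ws1 would reach 10 > 9
```

The PCAW variant studied in the paper refines how this positional correction is computed; the gating logic against the norm is the same.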

Findings

The results of this study confirm that the PCAW calculation method outperforms the CAW method, especially at higher workload norm levels. The PCAW method is considered the better solution in practice due to its excellent performance in terms of percentage tardiness and mean tardiness time. The efficient workload contribution approach, as discussed in this study, has the potential to offset the delivery performance loss that results from throughput performance loss.

Originality/value

This study proposes a novel approach that considers the workstations’ position in the routing of the job and the position of jobs in the CAW method. The results demonstrate that it keeps shop floor throughput times short and feasible. It controls WIP via the workload contribution of workstations, resulting in a lean shop floor. Therefore, workload contribution calculation is of particular significance for high-variety Make-To-Order (MTO) companies.

Details

Industrial Management & Data Systems, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 21 June 2023

Brad C. Meyer, Daniel Bumblauskas, Richard Keegan and Dali Zhang

Abstract

Purpose

This research fills a gap in process science by defining and explaining entropy and the increase of entropy in processes.

Design/methodology/approach

This is a theoretical treatment that begins with a conceptual understanding of entropy in thermodynamics and information theory and extends it to the study of degradation and improvement in a transformation process.

Findings

A transformation process with three inputs (demand volume, throughput and product design) utilizes a system composed of processors, stores, configuration, human actors, stored data and controllers to provide a product. Elements of the system are aligned with the inputs and with each other for the purpose of raising the standard of living. Lack of alignment is entropy. The primary causes of increased entropy are changes in inputs and the disordering of system components. Secondary causes result from changes made to cope with the primary causes. Improvement and innovation reduce entropy by providing better alignments and new ways of aligning resources.
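The information-theoretic root the authors draw on is Shannon entropy; a minimal sketch, in which a fully ordered (deterministic) state distribution has zero entropy and disorder raises it:

```python
from math import log2

def shannon_entropy(probs):
    """H = sum(-p * log2(p)) over nonzero probabilities, in bits."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0, 0.0, 0.0]))           # 0.0: fully ordered
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0: maximal disorder over 4 states
```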

Originality/value

This is the first detailed theoretical treatment of entropy in a process science context.

Details

International Journal of Productivity and Performance Management, vol. 73 no. 5
Type: Research Article
ISSN: 1741-0401

Article
Publication date: 1 April 2024

Mohammad Hani Al-Rifai

Abstract

Purpose

The purpose of this paper is twofold: first, to present a case study on applying lean principles in manufacturing operations to redesign and optimize an electronic device assembly process and its impact on performance; and second, to introduce cardboard prototyping as a Kaizen tool, offering a novel approach to testing and simulating improvement scenarios.

Design/methodology/approach

The study employs value stream mapping, root cause analysis, and brainstorming tools to identify root causes of poor performance, followed by deploying a Kaizen event to redesign and optimize an electronic device assembly process. Using physical models, bottlenecks and opportunities for improvement were identified by the Kaizen approach at the workstations and assembly lines, enabling the testing of various scenarios and ideas. Changes in lead times, throughput, work in process inventory and assembly performance were analyzed and documented.

Findings

Pre- and post-improvement measures are provided to demonstrate the impact of the Kaizen event on the performance of the assembly cell. The study reveals that implementing lean tools and techniques reduced costs and increased throughput by reducing assembly cycle times, manufacturing lead time, space utilization, labor overtime and work-in-process inventory requirements.

Originality/value

This paper adds a new dimension to applying the Kaizen methodology in manufacturing processes by introducing cardboard prototyping, which offers a novel way of testing and simulating different scenarios for improvement. The paper describes the process implementation in detail, including the techniques and data utilized to improve the process.

Details

International Journal of Productivity and Performance Management, vol. 73 no. 4
Type: Research Article
ISSN: 1741-0401

Article
Publication date: 22 August 2024

Mohammad Al-Rifai

Abstract

Purpose

Optimizing manufacturing processes addresses operational challenges and yields significant benefits across the business spectrum. This study aims to comprehensively analyze a manufacturing process through value stream mapping (VSM), aiming to streamline operations, reduce production lead times and minimize work-in-process (WIP) inventory levels. These improvements directly enhance competitiveness, customer satisfaction and overall business success, enabling a swift response to market demands, timely delivery of high-quality products and cost-effectiveness.

Design/methodology/approach

The approach integrates current and future-state VSM concepts with Lean tools across four stages: problem definition, current-state VSM analysis, future-state VSM design and improvement implementation. A team assembled to improve the manufacturing process for electronic devices has successfully implemented this approach.

Findings

Implemented improvements significantly reduced WIP inventory (88.8%, equivalent to $572,171 annually) and production lead time (from 28.26 to 3.21 days), enhancing operational flexibility and competitiveness. Streamlined processes led to a 13% decrease in cycle time and a notable reduction in daily rework (63.6%), amounting to $118,127 annually. Labor reduction (45.5%) yielded annual savings of approximately $594,000, with affected individuals successfully transitioning to other roles, highlighting the effectiveness of lean methodologies without job cuts.
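The reported reductions move together roughly as Little's law (WIP = throughput × lead time) predicts when the output rate is held constant; a quick check on the lead-time figures above:

```python
# figures taken from the abstract above
lead_time_before, lead_time_after = 28.26, 3.21  # days
reported_wip_reduction = 88.8                    # percent

lead_time_reduction = 100 * (lead_time_before - lead_time_after) / lead_time_before
print(round(lead_time_reduction, 1))  # 88.6, in line with the 88.8% WIP reduction
```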

Originality/value

This initiative exemplifies the effective use of VSM and Lean tools in optimizing an electronic device manufacturing operation that produces 43 products across various processes. By leveraging these methodologies, this research offers valuable insights into enhancing production efficiency, resulting in shorter production lead times, reduced cycle times and significant decreases in WIP inventory and rework.

Details

Measuring Business Excellence, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1368-3047

Open Access
Article
Publication date: 29 May 2024

Mohanad Rezeq, Tarik Aouam and Frederik Gailly

Abstract

Purpose

Authorities have set up numerous security checkpoints during times of armed conflict to control the flow of commercial and humanitarian trucks into and out of areas of conflict. These security checkpoints have become highly utilized because of the complex security procedures and increased truck traffic, which significantly slow the delivery of relief aid. This paper aims to improve the process at security checkpoints by redesigning the current process to reduce processing time and relieve congestion at checkpoint entrance gates.

Design/methodology/approach

A decision-support tool (the clearing function distribution model [CFDM]) is used to minimize the effects of security checkpoint congestion on the entire humanitarian supply network using a hybrid simulation-optimization approach. Using business process simulation, the current and reengineered processes are both simulated, and the simulation output is used to estimate the clearing function (capacity as a function of workload). For both the AS-IS and TO-BE models, key performance indicators such as distribution costs, backordering and process cycle time were used to compare the results of the CFDM tool. The Kerem Abu Salem security checkpoint south of Gaza was used as a case study.
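A clearing function expresses expected output as a saturating function of workload; the sketch below uses a common saturating form from the production-planning literature purely as an illustration (the parameters are hypothetical, not the fitted function from the paper):

```python
def clearing_function(workload, capacity=120.0, k=30.0):
    """Expected trucks cleared per day as a saturating function of workload;
    capacity and k are hypothetical and would in practice be estimated
    from simulation output, as the paper describes."""
    return capacity * workload / (k + workload)

for w in (10, 30, 90, 300):
    print(w, round(clearing_function(w), 1))  # output flattens toward capacity
```

The congestion effect is visible in the flattening curve: past a point, releasing more trucks into the checkpoint raises the workload without raising the clearing rate.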

Findings

The comparison results demonstrate that the CFDM tool performs better when the output of the TO-BE clearing function is used.

Originality/value

The efforts will contribute to improving the planning of any humanitarian network experiencing congestion at security checkpoints by minimizing the impact of congestion on the delivery lead time of relief aid to the final destination.

Details

Journal of Humanitarian Logistics and Supply Chain Management, vol. 14 no. 4
Type: Research Article
ISSN: 2042-6747
