Search results

1–10 of over 311,000
Article
Publication date: 12 June 2019

Jennifer A. Espinosa, Donna Davis, James Stock and Lisa Monahan

Abstract

Purpose

The purpose of this paper is to explore the processing of product returns at five case companies using a complex adaptive systems (CAS) logic to identify agent interactions, organization, schema, learning and the emergence of adaptations in the reverse supply chain.

Design/methodology/approach

Using a multiple-case study design, this research applies abductive reasoning to examine data from in-depth, semi-structured interviews and direct researcher observations collected during site visits at case companies.

Findings

Costly or high-risk returns may require agents to specialize and deepen their mental schema. Processing agents need freedom to interact, self-organize and learn from other agents to generate emergent ideas and adapt.

Practical implications

Limiting the depth of individual agent schema allows managers to better allocate labor to processing product returns during peak volume. To boost adaptability, managers need to craft a dynamic environment that encourages agents with diverse schema to interact, anticipate, and self-organize to brainstorm new ideas. Managers need to resist the urge to “control” the dynamic environment that ensues.

Originality/value

This paper builds on existing research that studies the key decision points in the analysis of product returns by exploring how processing-agent behaviors can create adaptability in the reverse supply chain. Additionally, this research follows in the tradition of Choi et al. (2001) and Surana et al. (2005) and proposes the application of CAS to a specific part of the supply chain – the processing of product returns.

Details

The International Journal of Logistics Management, vol. 30 no. 3
Type: Research Article
ISSN: 0957-4093

Article
Publication date: 1 October 1999

Surendra M. Gupta, Yousef A.Y. Al‐Turki and Ronald F. Perry

Abstract

Just‐in‐time (JIT) systems were originally designed for deterministic production environments with constant processing times and smooth, stable demand. Once implemented, however, JIT is fraught with numerous uncertainties, including variations in processing time and demand, planned interruptions such as preventive maintenance, and unplanned interruptions such as equipment failure. These uncertainties lower production throughput, decrease machine utilization, increase order completion time and create greater backlogs and overtime requirements. In this paper, we introduce a newly developed system, which we refer to as the flexible kanban system (FKS), to cope with uncertainties and planned/unplanned interruptions. We demonstrate the superiority of the new system by considering four case examples covering various uncertainties, conducting numerous studies and comparing the overall performance of the FKS with that of the traditional JIT system. In all the cases considered, the performance of the FKS was indeed superior to that of the traditional JIT system.
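The abstract does not spell out the FKS control rule, so the following is only a minimal sketch of the flexing idea: a kanban count that is temporarily increased during backlogs or breakdowns and restored afterwards. The thresholds, step sizes and the `FlexibleKanban` class are hypothetical, not the authors' published policy.

```python
# Illustrative sketch only: the flexing rule below (thresholds, step sizes)
# is hypothetical, not the FKS policy from the paper.

class FlexibleKanban:
    """Flex the kanban count up under backlog pressure, back down when idle."""

    def __init__(self, base_kanbans: int, max_extra: int):
        self.base = base_kanbans      # count a traditional JIT loop would fix
        self.extra = 0                # additional kanbans released by the rule
        self.max_extra = max_extra    # cap on extra kanbans (assumed)

    @property
    def kanbans(self) -> int:
        return self.base + self.extra

    def update(self, backlog: int, machine_down: bool) -> None:
        # Release extra kanbans when demand backs up or the machine stops...
        if (machine_down or backlog > self.kanbans) and self.extra < self.max_extra:
            self.extra += 1
        # ...and withdraw them once the disturbance has passed.
        elif backlog == 0 and self.extra > 0:
            self.extra -= 1


if __name__ == "__main__":
    fks = FlexibleKanban(base_kanbans=5, max_extra=3)
    # Simulated backlog trace spanning a breakdown (periods 1-2) and recovery.
    for t, (backlog, down) in enumerate([(2, False), (6, True), (8, True),
                                         (7, False), (3, False), (0, False)]):
        fks.update(backlog, down)
        print(f"t={t} backlog={backlog} down={down} kanbans={fks.kanbans}")
```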

Details

International Journal of Operations & Production Management, vol. 19 no. 10
Type: Research Article
ISSN: 0144-3577

Article
Publication date: 8 July 2014

Christian Stary

Abstract

Purpose

This paper aims to achieve fully intertwined knowledge and business processing in change processes. It proposes streamlining situated articulation work, value network analyses (VNA) and subject-oriented business process modelling (S-BPM) and execution to provide non-disruptive single- and double-loop learning processes driven by the stakeholders concerned. When knowledge life cycles, such as Firestone and McElroy’s, are implemented, the agility of organizations is significantly constrained, in particular when surviving knowledge claims are to be implemented seamlessly in the business processing environment.

Design/methodology/approach

The contribution is based on a conceptual analysis of knowledge life cycle implementations, learning loop developments and an exploratory case study in health care to demonstrate the effectiveness of the proposed approach. The solution towards non-disruptive knowledge and business processing allows stakeholders to actively participate in single- and double-loop learning processes.

Findings

The introduced approach supports problem and knowledge claim formulation, knowledge claim evaluation and non-disruptive knowledge integration into a business process environment. Based on stakeholder articulation, the steps to follow are: holomapping, exchange analysis, impact analysis, value creation analysis, subject-oriented modelling, business process validation and execution. Seamless support of stakeholders is enabled through the direct mapping of stakeholder and activity descriptions from value network representations to behaviour specifications (process models) on the individual and organizational layer.

Research limitations/implications

Current knowledge life cycle developments and implementations can now be analyzed in a structured way. Elements of the proposed approach could be integrated into disruptive implementations to overcome current limitations of knowledge life cycles. However, further case studies need to be performed to identify barriers to combining VNA and S-BPM, on both the technological and the methodological layer. What works for expert service industries might need to be adapted for production industries, and tools or tool chains might need to be configured accordingly. Finally, the socio-economic impact of the approach needs to be explored.

Practical implications

The presented case study from health care reveals the potential of such a methodological combination, as cycle times can be reduced, in particular due to the execution of role-specific process models in the respective business processing environment. The approach can be considered a fundamental shift from existing change management procedures, which require reworking the entire set of functional process models when addressing business processing. Stakeholder- or role-specific behaviour can now be handled in isolation and in parallel, without affecting the entire organization when modifications occur.

Originality/value

The proposed methodological integration has not been done before. It enables stakeholders to perform single- and double-loop change processes in a seamless way.

Article
Publication date: 19 June 2009

Imam Machdi, Toshiyuki Amagasa and Hiroyuki Kitagawa

Abstract

Purpose

The purpose of this paper is to propose Extensible Markup Language (XML) data partitioning schemes that can cope with static and dynamic allocation for parallel holistic twig joins: grid metadata model for XML (GMX) and streams‐based partitioning method for XML (SPX).

Design/methodology/approach

GMX exploits the relationships between XML documents and query patterns to perform workload‐aware partitioning of XML data. Specifically, the paper constructs a two‐dimensional model with a document dimension and a query dimension in which each object in a dimension is composed from XML metadata related to the dimension. GMX provides a set of XML data partitioning methods that include document clustering, query clustering, document‐based refinement, query‐based refinement, and query‐path refinement, thereby enabling XML data partitioning based on the static information of XML metadata. In contrast, SPX explores the structural relationships of query elements and a range‐containment property of XML streams to generate partitions and allocate them to cluster nodes on‐the‐fly.
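As a hedged illustration of the workload-aware partitioning idea behind GMX (not the paper's algorithm), the sketch below greedily assigns documents to cluster nodes so that estimated query-processing costs stay balanced; the cost model, file names and numbers are all made up.

```python
# Hypothetical illustration of workload-aware partitioning in the spirit of
# GMX: the cost figures and the greedy heuristic are not from the paper.
import heapq

def partition_documents(doc_costs: dict[str, float], n_nodes: int) -> list[list[str]]:
    """Greedily assign each document to the currently least-loaded node."""
    # Min-heap of (accumulated cost, node index); the lightest node stays on top.
    heap = [(0.0, i) for i in range(n_nodes)]
    partitions: list[list[str]] = [[] for _ in range(n_nodes)]
    # Placing expensive documents first tightens the balance of the heuristic.
    for doc, cost in sorted(doc_costs.items(), key=lambda kv: -kv[1]):
        load, node = heapq.heappop(heap)
        partitions[node].append(doc)
        heapq.heappush(heap, (load + cost, node))
    return partitions

if __name__ == "__main__":
    # Estimated per-document query-processing costs (illustrative numbers).
    costs = {"a.xml": 9.0, "b.xml": 7.5, "c.xml": 4.0, "d.xml": 3.5, "e.xml": 1.0}
    for i, part in enumerate(partition_documents(costs, n_nodes=2)):
        print(f"node {i}: {part}")
```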

Findings

GMX provides several salient features: a set of partition granularities that statically balance query processing workloads among cluster nodes; inter‐query parallelism as well as intra‐query parallelism at multiple extents; and better parallel query performance when all estimated queries are executed simultaneously in proportion to their probability of occurrence in the system. SPX offers the following features: minimal computation time to generate partitions; dynamic balancing of skewed workloads on the system; higher intra‐query parallelism; and better parallel query performance.

Research limitations/implications

The current status of the proposed XML data partitioning schemes does not take into account XML data updates, e.g. new XML documents and query pattern changes submitted by users on the system.

Practical implications

Note that the effectiveness of the XML data partitioning schemes relies mainly on the accuracy of the cost model used to estimate query processing costs. The cost model must be adjusted to reflect the characteristics of the system platform used in the implementation.

Originality/value

This paper proposes novel schemes of conducting XML data partitioning to achieve both static and dynamic workload balance.

Details

International Journal of Web Information Systems, vol. 5 no. 2
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 1 September 2001

Erne Houghton and Victor Portougal

Abstract

Presents an analytic framework for processing planning in industries where fixed batch sizes are common. The overall optimum processing plan is shown to be located on an envelope between the optimum JIT plan and the optimum level plan. These concepts provide a framework for understanding the overall optimum plan and lead to an efficient heuristic. The approach is practical and is illustrated by a case study from the food industry, which shows the place of overall optimum planning within the company’s planning system and its implications for company performance.
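A toy numeric illustration of the envelope idea, assuming a simple cost model with holding and rate-change penalties (the weights and the linear blending scheme are hypothetical, not the authors' heuristic): plans between the pure JIT (chase) plan and the pure level plan can be swept and costed to locate a good compromise.

```python
# Toy illustration of the JIT-plan / level-plan envelope; the cost weights
# and blending scheme are hypothetical, not the paper's heuristic.

def plan_cost(plan, demand, hold_cost=1.0, change_cost=2.0):
    """Inventory holding cost plus a penalty for period-to-period rate changes."""
    inv, cost = 0.0, 0.0
    for t, (p, d) in enumerate(zip(plan, demand)):
        inv = max(inv + p - d, 0.0)          # backlogs ignored in this toy model
        cost += hold_cost * inv
        if t > 0:
            cost += change_cost * abs(plan[t] - plan[t - 1])
    return cost

demand = [30, 50, 20, 60, 40]
jit_plan = demand[:]                                     # chase demand exactly
level_plan = [sum(demand) / len(demand)] * len(demand)   # constant rate

# Sweep the envelope between the two extreme plans.
for w in (0.0, 0.25, 0.5, 0.75, 1.0):
    blended = [w * j + (1 - w) * l for j, l in zip(jit_plan, level_plan)]
    print(f"w={w:.2f} cost={plan_cost(blended, demand):.1f}")
```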

Details

International Journal of Operations & Production Management, vol. 21 no. 9
Type: Research Article
ISSN: 0144-3577

Book part
Publication date: 24 October 2019

Myrtede Alfred, Ken Catchpole, Emily Huffer, Kevin Taafe and Larry Fredendall

Abstract

Achieving reliable instrument reprocessing requires finding the right balance among cost, productivity, and safety. However, there have been few attempts to comprehensively examine sterile processing department (SPD) work systems. We considered an SPD as an example of a socio-technical system – where people, tools, technologies, the work environment, and the organization mutually interact – and applied work systems analysis (WSA) to provide a framework for future intervention and improvement.

The study was conducted at two SPD facilities at a 700-bed academic medical center servicing 56 onsite clinics, 31 operating rooms (ORs), and nine ambulatory centers. Process maps, task analyses, abstraction hierarchies, and variance matrices were developed through direct observations of reprocessing work and staff interviews, and were iteratively refined based on feedback from an expert group composed of eight staff from SPD, infection control, performance improvement, quality and safety, and perioperative services. Performance sampling focused on specific observed challenges and on interruptions during case cart preparation, and tray defect data were analyzed from administrative databases.

Across five main sterilization tasks (prepare load, perform double-checks, run sterilizers, place trays in cooling, and test the biological indicator), variance analysis identified 16 failures created by 21 performance shaping factors (PSFs), leading to nine different outcome variations. Case cart preparation involved three main tasks: storing trays, picking cases, and prioritizing trays. Variance analysis for case cart preparation identified 11 different failures, 16 different PSFs, and seven different outcomes. Approximately 1% of cases had a tray with a sterilization or case cart preparation defect and 13.5 interruptions per hour were noted during case cart preparation.

While performance remains highly dependent upon the individual skills of the sterile processing technicians, making the sterilization process less complex and more visible, managing interruptions during case cart preparation, improving communication with the OR, and improving workspace and technology design could enhance performance in instrument reprocessing.

Article
Publication date: 5 October 2012

Samuel Forsman, Niclas Björngrim, Anders Bystedt, Lars Laitila, Peter Bomark and Micael Öhman

Abstract

Purpose

The construction industry has been criticized for not keeping up with other production industries in terms of cost efficiency, innovation, and production methods. The purpose of this paper is to contribute to the knowledge about what hampers efficiency in supplying engineer‐to‐order (ETO) joinery‐products to the construction process. The objective is to identify the main contributors to inefficiency and to define areas for innovation in improving this industry.

Design/methodology/approach

Case studies of the supply chain of a Swedish ETO joinery‐products supplier are carried out, and observations, semi‐structured interviews, and documents from these cases are analysed from an efficiency improvement perspective.

Findings

From a lean thinking and information modelling perspective, longer‐term procurement relations and efficient communication of information are the main areas of innovation for enhancing the efficiency of supplying ETO joinery‐products. It seems to be possible to make improvements in planning and coordination, assembly information, and spatial measuring through information modelling and spatial scanning technology. This is likely to result in an increased level of prefabrication, decreased assembly time, and increased predictability of on‐site work.

Originality/value

The role of supplying ETO joinery‐products is a novel research area in construction. There is a need to develop each segment of the manufacturing industry supplying construction and this paper contributes to the collective knowledge in this area. The focus is on the possibilities for innovation in the ETO joinery‐products industry and on its improved integration in the construction industry value chain in general.

Article
Publication date: 24 September 2021

Nina Rizun, Aleksandra Revina and Vera G. Meister

Abstract

Purpose

This study aims to draw the attention of business process management (BPM) research and practice to the textual data generated in processes and the potential for extracting meaningful insights from it. The authors apply standard natural language processing (NLP) approaches to gain valuable knowledge in the form of the business process (BP) complexity concept proposed in the study. The concept is built on the objective, subjective and meta-knowledge extracted from BP textual data, encompassing semantics, syntax and stylistics. As a result, the authors aim to create awareness of the cognitive, attention and reading efforts that form textual data-based BP complexity. The concept serves as a basis for the development of various decision-support solutions for BP workers.

Design/methodology/approach

The starting point is an investigation of the complexity concept in the BPM literature to develop an understanding of the related complexity research and to put the textual data-based BP complexity in its context. Afterward, utilizing the linguistic foundations and the theory of situation awareness (SA), the concept is empirically developed and evaluated in a real-world application case using qualitative interview-based and quantitative data-based methods.

Findings

In the practical, real-world application, the authors confirmed that BP textual data can be used to predict BP complexity from the semantic, syntactic and stylistic viewpoints. The authors demonstrated the value of this complexity knowledge, which is formed from (1) the professional contextual experience of the BP worker enriched by awareness of the cognitive effort required for BP execution (objective knowledge), (2) business emotions enriched by attention effort (subjective knowledge) and (3) the quality of the text, i.e. the professionalism, expertise and stress level of its author, enriched by reading effort (meta-knowledge). In particular, the BP complexity concept has been applied to an industrial example of Information Technology Infrastructure Library (ITIL) change management (CHM) Information Technology (IT) ticket processing. The authors used IT ticket texts from two samples of 28,157 and 4,625 tickets as the basis for the analysis, and evaluated the concept with the help of manually labeled tickets and a rule-based approach using historical ticket execution data. The results, which have a recommendation character, proved useful in creating awareness of cognitive, attention and reading efforts among ITIL CHM BP workers coordinating IT ticket processing.
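As a rough sketch of what textual complexity features might look like (the study's actual semantic, syntactic and stylistic measures differ; every feature and threshold below is illustrative), one could compute simple proxies for reading effort from a ticket text:

```python
# Hedged sketch: simple textual proxies for the syntactic, stylistic and
# semantic dimensions of ticket complexity. The study's concrete features
# differ; everything below is illustrative.
import re

def ticket_complexity_features(text: str) -> dict[str, float]:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    return {
        # Syntactic proxy: longer sentences tend to demand more reading effort.
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        # Stylistic proxy: lexical diversity as a type-token ratio.
        "type_token_ratio": len(set(words)) / max(len(words), 1),
        # Semantic proxy: share of long (often domain-specific) terms.
        "long_word_share": sum(len(w) > 8 for w in words) / max(len(words), 1),
    }

if __name__ == "__main__":
    ticket = ("Change request: reconfigure the loadbalancer healthcheck for the "
              "authentication cluster. Rollback procedure attached. Urgent!")
    print(ticket_complexity_features(ticket))
```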

Originality/value

While aiming to draw attention to those valuable insights inherent in BP textual data, the authors propose an unconventional approach to BP complexity definition through the lens of textual data. Hereby, the authors address the challenges specified by BPM researchers, i.e. focus on semantics in the development of vocabularies and organization- and sector-specific adaptation of standard NLP techniques.

Details

Business Process Management Journal, vol. 27 no. 7
Type: Research Article
ISSN: 1463-7154

Article
Publication date: 19 July 2022

Yaping Zhao, Xiangtianrui Kong, Xiaoyun Xu and Endong Xu

Abstract

Purpose

Cycle time reduction is important for the order fulfillment process but is often subject to resource constraints. This study considers an unrelated parallel machine environment in which orders with random demands arrive dynamically. Processing speeds are controlled by resource allocation and are subject to diminishing marginal returns. The objective is to minimize the long-run expected order cycle time via order scheduling and resource allocation decisions.

Design/methodology/approach

A stochastic optimization algorithm named CAP is proposed based on particle swarm optimization framework. It takes advantage of derived bound information to improve local search efficiency. Parameter impacts including demand variance, product type number, machine speed and resource coefficient are also analyzed through theoretic studies. The algorithm is evaluated and benchmarked with four well-known algorithms via extensive numerical experiments.
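To make the PSO framework concrete, here is a plain particle swarm over a resource-allocation vector with a quadratic diminishing-returns speed model. It is a sketch under assumed data and parameters, not the authors' CAP algorithm, which additionally exploits derived bound information to speed up local search.

```python
# Plain PSO sketch, not the authors' CAP algorithm: the speed model, the
# workload data and all parameters below are hypothetical.
import random

WORK = [40.0, 25.0, 60.0]        # expected workload per machine (illustrative)
BUDGET = 10.0                    # total resource to allocate across machines

def speed(r: float) -> float:
    # Quadratic diminishing returns: the marginal speed gain shrinks as r grows.
    return 1.0 + 0.8 * r - 0.03 * r * r

def makespan(alloc: list[float]) -> float:
    # Clamp to non-negative values and normalize onto the budget.
    pos = [max(a, 0.0) for a in alloc]
    total = sum(pos) or 1.0
    r = [BUDGET * p / total for p in pos]
    return max(w / speed(ri) for w, ri in zip(WORK, r))

def pso(n_particles=20, iters=200, w=0.7, c1=1.4, c2=1.4):
    dim = len(WORK)
    xs = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=makespan)
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity blends inertia, personal best and global best pulls.
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - x[d])
                            + c2 * r2 * (gbest[d] - x[d]))
                x[d] += vs[i][d]
            if makespan(x) < makespan(pbest[i]):
                pbest[i] = x[:]
        gbest = min(pbest, key=makespan)
    return gbest

best = pso()
print("best makespan:", round(makespan(best), 3))
```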

Findings

First, cycle time can be significantly improved when demand randomness is reduced via better forecasting. Second, achieving processing balance should be of top priority when considering resource allocation. Third, given marginal returns on resource consumption, it is advisable to allocate more resources to resource-sensitive machines.

Originality/value

A novel PSO-based optimization algorithm is proposed to jointly optimize order schedule and resource allocation decisions in a dynamic environment with random demands and stochastic arrivals. A general quadratic resource consumption function is adopted to better capture diminishing marginal returns.

Details

Industrial Management & Data Systems, vol. 122 no. 8
Type: Research Article
ISSN: 0263-5577

Book part
Publication date: 24 August 2011

Morten H. Abrahamsen

Abstract

The study here examines how business actors adapt to changes in networks by analyzing their perceptions or their network pictures. The study is exploratory or iterative in the sense that revisions occur to the research question, method, theory, and context as an integral part of the research process.

Changes within networks receive less research attention, although considerable research exists on explaining business network structures in different research traditions. This study analyzes changes in networks in terms of the industrial network approach. This approach sees networks as connected relationships between actors, where interdependent companies interact based on their sensemaking of their relevant network environment. The study develops a concept of network change as well as an operationalization for comparing perceptions of change, where the study introduces a template model of dottograms to systematically analyze differences in perceptions. The study then applies the model to analyze findings from a case study of Norwegian/Japanese seafood distribution, and the chapter provides a rich description of a complex system facing considerable pressure to change. In-depth personal interviews and cognitive mapping techniques are the main research tools applied, in addition to tracer studies and personal observation.

The dottogram method represents a valuable contribution to case study research, as it enables systematic within-case and across-case analyses. A further theoretical contribution of the study is the suggestion that network change is about actors seeking to change their network position to gain access to resources. The study thereby also implies a close relationship between the concepts of network position and network change that has not been discussed in great detail within the network approach.

Another major contribution of the study is the analysis of the role that network pictures play in actors' efforts to change their network position. The study develops seven propositions in an attempt to describe the role of network pictures in network change. So far, the relevant literature discusses network pictures mainly as a theoretical concept. Finally, the chapter concludes with important implications for management practice.

Details

Interfirm Networks: Theory, Strategy, and Behavior
Type: Book
ISBN: 978-1-78052-024-7
