Search results

1 – 3 of 3
Article
Publication date: 5 December 2023

Steven Alexander Melnyk, Matthias Thürer, Constantin Blome, Tobias Schoenherr and Stefan Gold

Abstract

Purpose

This study focuses on (re-)introducing computer simulation as a part of the research paradigm. Simulation is a widely applied research method in supply chain and operations management. However, leading journals, such as the International Journal of Operations and Production Management, have often been reluctant to accept simulation studies. This study provides guidelines on how to conduct simulation research that advances theory, is relevant, and matters.

Design/methodology/approach

This study pooled the viewpoints of the editorial team of the International Journal of Operations and Production Management and authors of simulation studies. The authors debated their views and outlined why simulation is important and what a compelling simulation should look like.

Findings

Considering uncertainty is increasingly important; interest in dynamic phenomena, such as the transient response(s) to disruptions, is growing; and there is a rising need to consider complementary outcomes, such as sustainability. Many researchers believe these challenges can be tackled with big data and modern analytical tools, but the strength of computer simulation is building, elaborating and testing theory through purposeful experimentation. The authors therefore argue that simulation should play an important role in supply chain and operations management research, but to do so it must evolve beyond simply generating and analyzing data. Four types of simulation research with much promise are outlined: empirically grounded simulation; simulation that establishes causality; simulation that supplements machine learning, artificial intelligence and analytics; and simulation for sensitive environments.
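
As a purely illustrative aside (not from the article), the sketch below shows the kind of purposeful experiment the abstract alludes to: a minimal single-server queue simulation in Python, with invented parameters, comparing the transient response to a disruption under two service rates.

```python
# Hypothetical sketch, not from the article: a minimal discrete-time,
# single-server queue whose server is unavailable during a disruption
# window. All parameters are illustrative.
import random

def simulate(horizon=800, arrival_p=0.5, service_p=0.7,
             disruption=(200, 250), seed=42):
    """Return the queue-length trace over the simulation horizon."""
    rng = random.Random(seed)
    queue_len, trace = 0, []
    for t in range(horizon):
        if rng.random() < arrival_p:              # Bernoulli arrival
            queue_len += 1
        disrupted = disruption[0] <= t < disruption[1]
        if queue_len and not disrupted and rng.random() < service_p:
            queue_len -= 1                        # one service completion
        trace.append(queue_len)
    return trace

# Purposeful experimentation: vary one factor (the service rate) and
# observe the transient response, i.e. backlog peak and recovery time.
for rate in (0.6, 0.8):
    trace = simulate(service_p=rate)
    baseline = sum(trace[:200]) / 200             # pre-disruption mean
    recovery = next((t for t in range(250, len(trace))
                     if trace[t] <= baseline), None)
    print(f"service_p={rate}: peak backlog={max(trace)}, "
          f"recovered at t={recovery}")
```

The point of such a design is that the experimenter controls one factor at a time, which is what lets simulation support causal, theory-building claims rather than mere data generation.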

Originality/value

This study identifies reasons why simulation is important for understanding and responding to today's business and societal challenges; provides guidance on how to design good simulation studies in this context; and links simulation to empirical research and theory, going beyond multimethod studies.

Details

International Journal of Operations & Production Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0144-3577

Article
Publication date: 18 April 2023

Steven Alexander Melnyk, William J. Ritchie, Eric Stark and Angela Heavey

Abstract

Purpose

Dominant quality standards are present in all industries. Implicit in their use is the assumption that, once adopted, there is little or no reason to replace them. However, there is evidence that, under certain circumstances, such standards do get replaced. The reasons for this are not well understood, either as they pertain to the displacement decision or to the selection and adoption of the alternative standard. The purpose of this study is to identify and explore these two issues (displacement and replacement) by drawing on data from the American healthcare system, examined through the lens of legitimacy theory and from a temporal perspective. The resulting findings are used to better understand how the displacement process takes place and to identify directions for interesting and meaningful future research.

Design/methodology/approach

This is an explanatory study that draws on data gathered from quality managers in 89 hospitals that had adopted a new healthcare quality standard (of these, roughly 50% had displaced the dominant quality standard, the Joint Commission, with a different standard, DNV Healthcare).

Findings

The combined literature review and case study data provide insights into the displacement process, which evolves over time. Initially, the process is driven by the need to meet customer demands. However, over time, as organizations try to integrate the standard's guidelines into their operations, gaps in the quality standard emerge. It is these gaps that motivate the displacement of standards. The legitimacy perspective is highly effective at explaining this displacement process. In addition, the study uncovers some critical issues, namely the important role played by individual auditors in the certification process and the importance of fit between the standard and the context in which it is deployed.

Research limitations/implications

The propositions in this case study were derived from interview and survey data from 89 healthcare organizations. It would be interesting to examine similar relationships with other quality standards and industries.

Practical implications

The findings provide new insights into the motivations to decouple from a dominant quality standard. The results offer a cautionary tale for standards that hold a dominant market share: the perceived legitimacy of such standards is not as stable as originally thought.

Originality/value

This study illuminates how fragile the stability of dominant standards can be and emphasizes the linkages between legitimacy concerns and the divestiture of such standards.

Details

International Journal of Operations & Production Management, vol. 43 no. 12
Type: Research Article
ISSN: 0144-3577

Article
Publication date: 28 February 2023

Lakshmi Goel, Dawn Russell, Steven Williamson and Justin Zuopeng Zhang

Abstract

Purpose

While the idea of the resilience of information systems security exists, there is a lack of research that conceptualizes, defines and specifies a way to measure it as a dynamic capability. Drawing on relevant cybersecurity and dynamic capabilities literature, this study aims to define Information Systems Security Resilience (ISSR) as a “dynamic capability of a firm to respond to, and recover from, a security attack” and test it as a new construct.

Design/methodology/approach

The authors employ a multiphase methodology to develop and test the ISSR construct. They first interview senior managers from various organizations to establish the construct's face validity; then develop and analyze a pilot survey for internal validity and reliability; and finally design and deploy a field survey to test and externally validate the construct.
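
As a hypothetical illustration (not taken from the article), the snippet below shows a standard internal-consistency check of the kind a pilot-survey reliability analysis typically includes: Cronbach's alpha computed over a respondents-by-items matrix. The items and responses are invented for the example.

```python
# Hypothetical illustration, not from the article: Cronbach's alpha as an
# internal-consistency check on a pilot-survey scale. Items and responses
# below are invented.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x scale-items matrix of (e.g.) Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of scale items
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Six pilot respondents rating four invented ISSR items on a 1-5 scale.
pilot = np.array([[4, 4, 5, 4],
                  [2, 3, 2, 2],
                  [5, 5, 4, 5],
                  [3, 3, 3, 4],
                  [4, 5, 4, 4],
                  [1, 2, 2, 1]])
# A common rule of thumb treats alpha >= 0.7 as acceptable reliability.
print(f"Cronbach's alpha = {cronbach_alpha(pilot):.2f}")
```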

Findings

The authors conceptualize and define the construct of ISSR as a dynamic capability, develop a scale for its measurement and test it in a pilot and field survey. The construct is valid and the measurement tool works; it demonstrates that resilience is something that is done rather than had. As a capability, ISSR is something organizations need to track and measure, and the tool developed here provides the means to do so.

Originality/value

This research contributes to the information systems and cybersecurity literature and offers valuable insights for organizations to manage their security effectively.

Details

Journal of Enterprise Information Management, vol. 36 no. 4
Type: Research Article
ISSN: 1741-0398
