Steven Alexander Melnyk, Matthias Thürer, Constantin Blome, Tobias Schoenherr and Stefan Gold
Abstract
Purpose
This study focuses on (re-)introducing computer simulation as a part of the research paradigm. Simulation is a widely applied research method in supply chain and operations management. However, leading journals, such as the International Journal of Operations and Production Management, have often been reluctant to accept simulation studies. This study provides guidelines on how to conduct simulation research that advances theory, is relevant, and matters.
Design/methodology/approach
This study pooled the viewpoints of the editorial team of the International Journal of Operations and Production Management and authors of simulation studies. The authors debated their views and outlined why simulation is important and what a compelling simulation should look like.
Findings
There is an increasing importance of considering uncertainty, an increasing interest in dynamic phenomena such as the transient response(s) to disruptions, and an increasing need to consider complementary outcomes such as sustainability, all of which many researchers believe can be tackled with big data and modern analytical tools. But the strength of computer simulation lies in building, elaborating, and testing theory through purposeful experimentation. The authors therefore argue that simulation should play an important role in supply chain and operations management research, but that for this it has to evolve beyond simply generating and analyzing data. Four types of simulation research with much promise are outlined: empirically grounded simulation; simulation that establishes causality; simulation that supplements machine learning, artificial intelligence, and analytics; and simulation for sensitive environments.
Originality/value
This study identifies reasons why simulation is important for understanding and responding to today's business and societal challenges; it provides guidance on how to design good simulation studies in this context; and it links simulation to empirical research and theory, going beyond multimethod studies.
Abstract
In their authoritative literature review, Breen and Jonsson (2005) claim that ‘one of the most significant trends in the study of inequalities in educational attainment in the past decade has been the resurgence of rational-choice models focusing on educational decision making’. The starting point of the present contribution is that these models have largely ignored the explanatory relevance of social interactions. To remedy this shortcoming, this paper introduces a micro-founded formal model of the macro-level structure of educational inequality, which frames educational choices as the result of both subjective ability/benefit evaluations and peer-group pressures. As acknowledged by Durlauf (2002, 2006) and Akerlof (1997), however, while the social psychology and ethnographic literature provides abundant empirical evidence of the explanatory relevance of social interactions, statistical evidence on their causal effect is still flawed by identification and selection bias problems. To assess the relative explanatory contribution of the micro-level and network-based mechanisms hypothesised, the paper opts for agent-based computational simulations. In particular, the technique is used to deduce the macro-level consequences of each mechanism (sequentially introduced) and to test these consequences against French aggregate individual-level survey data. The paper's main result is that ability and subjective perceptions of education benefits, no matter how intensely differentiated across agent groups, are not sufficient on their own to generate the actual stratification of educational choices across educational backgrounds existing in France at the beginning of the twenty-first century. 
By computational counterfactual manipulations, the paper proves that network-based interdependencies among educational choices are instead necessary, and that they contribute, over and above the differentiation of ability and of benefit perceptions, to the genesis of educational stratification by amplifying the segregation of the educational choices that agents make on the basis of purely private ability/benefit calculations.
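The core mechanism described above, private ability/benefit calculations amplified by peer-group interdependence, can be sketched as a minimal agent-based loop. This is an illustrative toy, not the paper's model: the two groups, utility distributions, threshold, and peer weight are all hypothetical parameters.

```python
import random

def simulate(peer_weight, n=500, rounds=8, seed=1):
    """Minimal two-group ABM: an agent chooses further education when its
    private utility, adjusted by the enrollment share in its (fully
    segregated) peer group, exceeds a fixed threshold. All parameter
    values are hypothetical, chosen only to illustrate the mechanism."""
    rng = random.Random(seed)
    # Advantaged group A and disadvantaged group B differ in the
    # distribution of private ability/benefit evaluations.
    util_a = [rng.uniform(0.4, 0.7) for _ in range(n)]
    util_b = [rng.uniform(0.3, 0.6) for _ in range(n)]
    share_a = share_b = 0.5
    for _ in range(rounds):
        # Peer pressure shifts each agent's effective utility toward
        # the current choice of its group.
        share_a = sum(u + peer_weight * (share_a - 0.5) > 0.5 for u in util_a) / n
        share_b = sum(u + peer_weight * (share_b - 0.5) > 0.5 for u in util_b) / n
    return share_a, share_b

a0, b0 = simulate(peer_weight=0.0)   # private calculation only
a1, b1 = simulate(peer_weight=0.6)   # with peer-group pressure
```

With `peer_weight=0` the enrollment gap between the two backgrounds reflects only the differentiated utility distributions; turning on the network feedback amplifies that same gap, which is the amplification-through-segregation effect the counterfactual manipulations isolate.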
Marc Wouters and Susana Morales
Abstract
Purpose
To provide an overview of research published in the management accounting literature on methods for cost management in new product development, such as target costing, life cycle costing, component commonality, and modular design.
Methodology/approach
The structured literature search covered papers about 15 different cost management methods published in 40 journals in the period 1990–2013.
Findings
The search yielded a sample of 113 different papers. Many contained information about more than one method, yielding 149 references to specific methods. The number of references varied strongly by cost management method and by journal. Target costing received by far the most attention in the publications in our sample; modular design, component commonality, and life cycle costing were ranked second and joint third. Most references were published in Management Science; Management Accounting Research; and Accounting, Organizations and Society. The results were strongly influenced by Management Science and Decision Science, because cost management methods with an engineering background (design for manufacturing, component commonality, modular design, and product platforms) were published above average in these two journals, while other topics were published below average there.
Research Limitations/Implications
The scope of this review is accounting research. Future work could review the research on cost management methods in new product development published outside accounting.
Originality/value
The paper centers on methods for cost management, which complements reviews that focused on theoretical constructs of management accounting information and its use.
Chien‐Yi Huang and Hui‐Hua Huang
Abstract
Purpose
The purpose of this paper is to investigate how to reduce the time and cost required to conduct reliability testing. With increasing competition in the electronics industry and reduction in product life cycles, it is essential to diminish the time required for new product development and thus time to market.
Design/methodology/approach
This study conducts an empirical sample test of a wireless card and analyzes fatigue life through finite element modeling (FEM). Simulation results are compared to data collected from temperature cycling tests under conditions of −40°C to 150°C and −40°C to 100°C.
Findings
Assuming that the product lifetimes obtained from the empirical sample test and from the software simulation exhibit a linear relationship, a “scale factor” should exist for any given combination of product structure, process condition, and materials. The scale factors were found to be approximately 0.1 in both temperature cycling scenarios. The effectiveness of various adhesive dispensing patterns on solder joint reliability was also evaluated through software simulation; L-shaped adhesive dispensing was shown to enhance the fatigue life of chip scale package solder joints roughly 100-fold.
Originality/value
The scale factor is used to convert the results from software simulation to empirical sample test for a given set of processing environments and materials. This helps to reduce the time and cost required to conduct reliability testing.
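The scale-factor idea is a simple calibration ratio, and can be illustrated with a toy calculation. The cycle counts below are invented for illustration and are not the paper's data; only the order of magnitude of the factor (about 0.1) comes from the findings above.

```python
def scale_factor(empirical_life, simulated_life):
    """Ratio converting FEM-predicted fatigue life (cycles) into the
    empirically observed life for one fixed product structure, process
    condition, and materials composition."""
    return empirical_life / simulated_life

def predict_empirical_life(simulated_life, k):
    """Once calibrated, the factor converts a new simulation result into
    an expected test outcome without rerunning the physical test."""
    return k * simulated_life

# Hypothetical numbers: a simulation predicting 10,000 cycles versus
# 1,000 cycles observed in temperature cycling gives a factor of 0.1,
# matching the order of magnitude reported above.
k = scale_factor(empirical_life=1_000, simulated_life=10_000)
estimate = predict_empirical_life(simulated_life=25_000, k=k)
```

The point of the calibration is exactly the value claim in the abstract: after one physical test fixes `k`, further design variants can be screened in simulation alone.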
Rodolphe Durand and Zahia Guessoum
Abstract
The aim of this paper is to give empirical evidence of the fundamental mechanisms underlying the resource systemics: time compression diseconomies, asset mass efficiency, and interconnectedness of assets. It assumes that time, resource properties, and interactions are the critical elements leading to the accumulation of idiosyncratic resources, firm performance, and survival. Results from a Cox regression on a simulated dataset confirm the protective effects of time compression diseconomies, asset mass efficiency, and interconnectedness of assets against firm death.
Jeffrey W. Alstete and Nicholas J. Beutell
Abstract
Purpose
This study aims to consider assurance of learning among undergraduate business students enrolled in capstone business strategy courses using the GLO-BUS competitive simulation. Gender, academic major and business core course performance were examined.
Design/methodology/approach
Participants were 595 undergraduate capstone business students from 21 course sections taught over a four-year period. Variables included learning assurance measures, simulation performance, gender, major, business core course grades, capstone course grade and cumulative grade point average. Correlations, linear regression, multiple regression and multivariate analysis of variance (MANOVA) were used to analyze the data.
Findings
Learning assurance report scores were strongly related to simulation performance. Simulation performance was related to capstone course grade, which, in turn, was significantly related to the grade point average (GPA). Core business courses were related to learning assurance and performance indicators. Significant differences for gender and degree major were found for academic performance measures. Women and men did not differ in simulation performance.
Research limitations/implications
Limitations include the use of one simulation (GLO-BUS) and studying students at one university taught by one professor. Assurance of learning measures needs further study as factors in business program evaluation. Future research should analyze post-graduate performance and career achievements in relation to assurance of learning outcomes.
Originality/value
This study conducts empirical analyses of simulation learning that focus entirely on direct measures, including student characteristics (gender, major), learning assurance measures, business core course grades, capstone course grades and student GPAs.
Jaehu Shim, Martin Bliemel and Myeonggil Choi
Abstract
Purpose
The purpose of this paper is to suggest a bibliometric method for designing agent-based models (ABMs) in entrepreneurship research. The application of this method is illustrated with an exemplary agent-based modeling and simulation (ABMS) regarding the early venture growth process. This bibliometric approach invigorates the utilization of ABMS as a viable research methodology in process-oriented entrepreneurship research.
Design/methodology/approach
In the bibliometric method, a domain corpus composed of scholarly articles is established and systematically analyzed through co-word analysis to discern essential concepts (i.e. agents, objects, and contexts) and their interrelations. The usefulness of the bibliometric method is elucidated by constructing an illustrative ABMS.
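At its simplest, co-word analysis counts how often candidate terms co-occur within documents of the corpus; strongly linked pairs then suggest which concepts to wire together in the ABM. A minimal sketch follows, with an invented mini-corpus and vocabulary (not the paper's corpus or procedure in detail):

```python
from collections import Counter
from itertools import combinations

def co_word_counts(documents, vocabulary):
    """Count how often each pair of vocabulary terms co-occurs in a
    document. High-count pairs indicate interrelated concepts (agents,
    objects, contexts) for the model design."""
    counts = Counter()
    for doc in documents:
        present = sorted(t for t in vocabulary if t in doc.lower())
        for pair in combinations(present, 2):
            counts[pair] += 1
    return counts

# Hypothetical mini-corpus standing in for a domain corpus of articles.
corpus = [
    "The entrepreneur acquires resources from investors.",
    "Venture growth depends on resources and opportunity.",
    "The entrepreneur evaluates each opportunity for the venture.",
]
vocab = {"entrepreneur", "venture", "resources", "opportunity"}
links = co_word_counts(corpus, vocab)
```

A real implementation would normalize terms, weight by frequency, and analyze the semantic context of each co-occurrence, but the pairwise counting above is the structural core.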
Findings
The bibliometric method for designing ABMs identifies essential concepts in the entrepreneurship literature and provides contexts in which the concepts are interrelated. The illustrative ABMS based on these concepts and interrelations accurately reproduces the emergence of power-law distributions in venture outcomes, consistent with empirical evidence, implying further merit to the bibliometric procedure.
Practical implications
The proposed method can be used not only to build simple models with essential concepts, but also to build more complex models that take a large number of concepts and their interrelations into consideration.
Originality/value
This study proposes a bibliometric method for designing ABMs. The proposed method extends similar procedures that are limited to thematic or cluster analysis by examining the semantic contexts in which the concepts co-occur. This research suggests that ABMS from bibliographic sources can be built and validated with empirical evidence. Several considerations are provided for the combined utilization of the bibliometric method and ABMS in entrepreneurship.
Shoufeng Cao, Kim Bryceson and Damian Hine
Abstract
Purpose
The aim of this paper is to explore the value of collaborative risk management in a decentralised multi-tier global fresh produce supply chain.
Design/methodology/approach
This study utilised a mixed methods approach. A qualitative field study was conducted to examine the need for collaborative risk management. The simulation experiments with industry datasets were conducted to assess whether risk-sharing contracts work in mitigating joint risks in parts of and across the supply chain.
Findings
The qualitative field study revealed risk propagation and the inefficiency of company-specific risk management strategies in value delivery. The simulation results indicated that risk-sharing contracts can incentivise various actors to absorb interrelated risks for value creation.
Research limitations/implications
The research is limited to risks relevant to supply chain processes in the Australia–China table grape supply chain and does not consider product-related risks or the risk-taking behaviours of supply chain actors.
Practical implications
Collaborative risk management can be deployed to mitigate systematic risks that disrupt global fresh produce supply chains. The results offer evidence-based knowledge to supply chain professionals in understanding the value of collaborative risk assessment and management, and provide insights on how to conduct collaborative risk management effectively.
Originality/value
The results contribute to the supply chain risk management literature by proposing new collaborative forms for effective risk management and for “supply chain to supply chain” strategic competition in multi-tier food supply chains.
David Card, David S. Lee, Zhuan Pei and Andrea Weber
Abstract
A regression kink design (RKD or RK design) can be used to identify causal effects in settings where the regressor of interest is a kinked function of an assignment variable. In this chapter, we apply an RKD approach to study the effect of unemployment benefits on the duration of joblessness in Austria, and discuss implementation issues that may arise in similar settings, including the use of bandwidth selection algorithms and bias-correction procedures. Although recent developments in nonparametric estimation (Calonico, Cattaneo, & Farrell, 2014; Imbens & Kalyanaraman, 2012) are sometimes interpreted by practitioners as pointing to a default estimation procedure, we show that in any given application different procedures may perform better or worse. In particular, Monte Carlo simulations based on data-generating processes that closely resemble the data from our application show that some asymptotically dominant procedures may actually perform worse than “sub-optimal” alternatives in a given empirical application.
Emre Sozer and Wei Shyy
Abstract
Purpose
The purpose of this paper is to develop an empiricism free, first principle‐based model to simulate fluid flow and heat transfer through porous media.
Design/methodology/approach
Conventional approaches to the problem are reviewed. A multi-scale approach that makes use of sample simulations at the individual pore level is employed. The effect of porous structures on the global fluid flow is accounted for via locally volume-averaged governing equations, while the closure terms are obtained by averaging flow characteristics around the pores.
Findings
The performance of the model has been tested for an isothermal flow case. Good agreement with experimental data was achieved. Both the permeability and the Ergun coefficient are shown to be flow-dependent properties, in contrast to the empirical approach, which typically results in constant values of these parameters independent of the flow conditions. Hence, the present multi-scale approach is more versatile and can account for possible changes in flow characteristics.
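For context, the conventional empirical route that this model replaces is a packed-bed correlation such as the Ergun equation, whose coefficients are fitted constants rather than flow-dependent quantities. A sketch of that standard correlation (not the paper's multi-scale model) with illustrative, invented operating values:

```python
def ergun_pressure_gradient(u, d_p, eps, mu, rho):
    """Classical empirical Ergun correlation for the pressure gradient
    (Pa/m) across a packed bed: a viscous (Darcy-like) term plus an
    inertial term, with fixed fitted constants 150 and 1.75.
    u: superficial velocity (m/s), d_p: particle diameter (m),
    eps: porosity, mu: dynamic viscosity (Pa*s), rho: density (kg/m^3)."""
    viscous = 150.0 * mu * (1.0 - eps) ** 2 * u / (eps ** 3 * d_p ** 2)
    inertial = 1.75 * rho * (1.0 - eps) * u ** 2 / (eps ** 3 * d_p)
    return viscous + inertial

# Illustrative values: water through a bed of 1 mm particles, porosity 0.4.
dp_dx = ergun_pressure_gradient(u=0.01, d_p=1e-3, eps=0.4,
                                mu=1e-3, rho=1000.0)
```

The paper's point is precisely that the constants hidden in such correlations can instead be recovered from pore-level simulation, making the closure terms respond to the flow conditions.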
Research limitations/implications
Further validation including non‐isothermal cases is necessary. Current scope of the model is limited to incompressible flows. The methodology can accommodate extension to compressible flows.
Originality/value
This paper proposes a method that eliminates the dependence of the numerical porous media simulations on empirical data. Although the model increases the fidelity of the simulations, it is still computationally affordable due to the use of a multi‐scale methodology.