Search results

1 – 10 of over 3000
Article
Publication date: 11 December 2023

Stephanie Fabri, Lisa A. Pace, Vincent Cassar and Frank Bezzina

Abstract

Purpose

The European Innovation Scoreboard is an important indicator of innovation performance across European Member States. Despite its wide application, the indicator fails to highlight the interlinkages that exist among innovation measures and focuses primarily on the linear relationship between the individual measures and the predicted outcome. This study aims to address this gap by applying a novel technique, fuzzy-set qualitative comparative analysis (fsQCA), to shed light on these interlinkages and highlight the complexity of the determinants underlying innovation performance.

Design/methodology/approach

The authors adopted a configurational approach based on fsQCA that is implemented on innovation performance data from European Member States for the period 2011–2018. The approach is based on non-linearity and allows for the analysis of interlinkages based on equifinality, that is, the model recognises that there are different potential paths of high and low innovation performance. In addition, the approach allows for asymmetric relations, where a low innovation outcome is not the exact inverse of that which leads to high innovation outcome.
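The calibration and sufficiency-testing logic behind fsQCA can be sketched briefly. The Python snippet below is an illustrative sketch, not the authors' implementation: the anchor values and country scores are hypothetical, and a real fsQCA analysis also builds truth tables and minimises configurations.

```python
import math

def calibrate(x, non_mem, crossover, full_mem):
    """Direct calibration: map a raw score onto a fuzzy-set membership in
    [0, 1] using three qualitative anchors and a logistic transform
    (log-odds +3 at the full-membership anchor, -3 at non-membership)."""
    if x >= crossover:
        scale = 3.0 / (full_mem - crossover)
    else:
        scale = 3.0 / (crossover - non_mem)
    return 1.0 / (1.0 + math.exp(-scale * (x - crossover)))

def consistency(cause, outcome):
    """Consistency of 'cause is sufficient for outcome':
    sum(min(x_i, y_i)) / sum(x_i)."""
    num = sum(min(x, y) for x, y in zip(cause, outcome))
    den = sum(cause)
    return num / den if den else 0.0

# Hypothetical country scores on one innovation input (e.g. R&D intensity)
raw = [0.4, 1.1, 2.0, 2.9, 3.5]
members = [calibrate(v, 0.5, 1.75, 3.0) for v in raw]
```

With memberships in hand, each candidate configuration of conditions is scored for consistency against the outcome set; equifinality shows up as several distinct configurations passing the consistency threshold.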

Findings

The results clearly indicate that innovation outcomes are not based on simple linear relations. Thus, to reap the desired effects from investments in innovation inputs, the complex set of indicators on which innovation performance is based should be taken into consideration. The results clearly indicate the elements of equifinality and asymmetric relations. Different paths lead to high innovation performance and low innovation performance.

Originality/value

The method applied to investigate the determinants of innovation performance is the prime original contribution of this study. The study thus contributes to the literature by highlighting the complexity involved in understanding innovation. By recognising and attempting to disentangle this complexity, this study will assist not just academics but also policymakers in designing the measures required to reach this important outcome for a country’s competitive edge.

Details

International Journal of Innovation Science, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1757-2223

Book part
Publication date: 27 December 2016

Arch G. Woodside, Alexandre Schpektor and Richard Xia

Abstract

This chapter describes the complementary benefits of model-building and data analysis using algorithm and statistical modeling methods in the context of unobtrusive marketing field experiments and in transforming findings into isomorphic-management models. Relevant for marketing performance measurement, case-based configural analysis is a relatively new paradigm in crafting and testing theory. Statistical testing of hypotheses to learn the net effects of individual terms in MRA equations is the current dominant logic. Isomorphic modeling might best communicate what executives should decide using the findings from algorithm and statistical models. These propositions are tested here with data from an unobtrusive field experiment in a retailing context that includes two levels of expertise, four price points, and the presence versus absence of a friend (“pal” condition) during the customer–salesperson interactions (n = 240 store customers). The analyses support the conclusion that all three modeling approaches provide useful complementary information substantially beyond the use of any one alone, and that transforming findings from such models into isomorphic-management models is possible.

Article
Publication date: 1 February 2005

Mike Tao Zhang and Ken Goldberg

Abstract

Purpose

The semiconductor manufacturing industry requires highly accurate robot operation with short install/setup downtime.

Design/methodology/approach

We develop a fast, low cost and easy‐to‐operate calibration system for wafer‐handling robots. The system is defined by a fixture and a simple compensation algorithm. Given robot repeatability, end effector uncertainties, and the tolerance requirements of wafer placement points, we derive fixture design and placement specifications based on a statistical tolerance model.
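The compensation step can be illustrated with a minimal sketch. The Python below assumes a planar (x, y) geometry and a constant-offset error model; the fixture coordinates and tolerance values are hypothetical, and the paper's statistical tolerance model is more detailed than this root-sum-square check.

```python
import math

def estimate_offset(nominal, measured):
    """Average the per-point error between nominal fixture reference points
    and the positions the robot actually measured (constant-offset model)."""
    n = len(nominal)
    dx = sum(m[0] - p[0] for p, m in zip(nominal, measured)) / n
    dy = sum(m[1] - p[1] for p, m in zip(nominal, measured)) / n
    return dx, dy

def compensate(target, offset):
    """Shift a commanded wafer placement point so the end effector lands
    on the true target despite the estimated offset."""
    return (target[0] - offset[0], target[1] - offset[1])

def within_tolerance(repeatability, fixture_tol, required):
    """Root-sum-square stack-up: the residual uncertainty after compensation
    (robot repeatability plus fixture placement tolerance) must fit the
    wafer placement specification."""
    return math.hypot(repeatability, fixture_tol) <= required
```

The fixture thus turns calibration into a short touch-off routine plus an arithmetic correction, rather than a lengthy manual teach procedure.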

Findings

By employing fixture‐based calibration, we successfully relax the tolerance requirement of the end effector by a factor of 20.

Originality/value

Semiconductor manufacturing requires fast and easy‐to‐operate calibration systems for wafer‐handling robots. In this paper, we describe a new methodology to solve this problem using fixtures. We develop fixture design criteria and a simple compensation algorithm to satisfy calibration requirements. We also verify our approach with a physical example.

Details

Industrial Robot: An International Journal, vol. 32 no. 1
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 21 January 2022

Maximilien de Zordo-Banliat, Xavier Merle, Gregory Dergham and Paola Cinnella

Abstract

Purpose

The Reynolds-averaged Navier–Stokes (RANS) equations represent the computational workhorse for engineering design, despite their numerous flaws. Improving and quantifying the uncertainties associated with RANS models is particularly critical in view of the analysis and optimization of complex turbomachinery flows.

Design/methodology/approach

First, an efficient strategy is introduced for calibrating turbulence model coefficients from high-fidelity data. The results are highly sensitive to the flow configuration (called a calibration scenario) used to inform the coefficients. Second, the bias introduced by the choice of a specific turbulence model is reduced by constructing a mixture model by means of Bayesian model-scenario averaging (BMSA). The BMSA model makes predictions of flows not included in the calibration scenarios as a probability-weighted average of a set of competing turbulence models, each supplemented with multiple sets of closure coefficients inferred from alternative calibration scenarios.
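The averaging step can be sketched compactly. The snippet below is an illustrative Python sketch of the BMSA idea, not the authors' code: the model and scenario labels are hypothetical, and a full BMSA also propagates posterior variances to produce the uncertainty intervals discussed below.

```python
def bmsa_predict(predictions, model_probs, scenario_probs):
    """Bayesian model-scenario averaging: the prediction for a new flow is
    a probability-weighted sum, over turbulence models m and calibration
    scenarios s, of the prediction made with model m using coefficients
    inferred from scenario s.

    predictions[(m, s)] -> prediction of the (model, scenario) pair
    model_probs[m], scenario_probs[s] -> weights, each summing to 1
    """
    return sum(model_probs[m] * scenario_probs[s] * y
               for (m, s), y in predictions.items())
```

With uniform weights this reduces to a plain ensemble mean; an informative scenario PMF shifts weight toward calibration scenarios expected to transfer well to the new flow.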

Findings

Different choices for the scenario probabilities are assessed for the prediction of the NACA65 V103 cascade at off-design conditions. In all cases, BMSA improves solution accuracy with respect to the baseline turbulence models, and the estimated uncertainty intervals encompass the reference data reasonably well. The BMSA results were found to be only weakly sensitive to the user-defined scenario-weighting criterion, both in terms of the average prediction and of the estimated confidence intervals.

Originality/value

A delicate step in the BMSA is the selection of suitable scenario-weighting criteria, i.e. suitable prior probability mass functions (PMFs) for the calibration scenarios. The role of such PMFs is to assign higher probability to calibration scenarios more likely to provide an accurate estimate of model coefficients for the new flow. In this paper, three mixture models are constructed, based on alternative choices of the scenario probabilities. The authors then compare the capabilities of three different criteria.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 32 no. 4
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 30 July 2021

Bing Zhang, Raiyan Seede, Austin Whitt, David Shoukr, Xueqin Huang, Ibrahim Karaman, Raymundo Arroyave and Alaa Elwany

Abstract

Purpose

There is recent emphasis on designing new materials and alloys specifically for metal additive manufacturing (AM) processes, in contrast to AM of existing alloys that were developed for other traditional manufacturing methods involving considerably different physics. Process optimization to determine processing recipes for newly developed materials is expensive and time-consuming. The purpose of the current work is to use a systematic printability assessment framework developed by the co-authors to determine windows of processing parameters to print defect-free parts from a binary nickel-niobium alloy (NiNb5) using laser powder bed fusion (LPBF) metal AM.

Design/methodology/approach

The printability assessment framework integrates analytical thermal modeling, uncertainty quantification and experimental characterization to determine processing windows for NiNb5 in an accelerated fashion. Test coupons and mechanical test samples were fabricated on a ProX 200 commercial LPBF system. A series of density, microstructure and mechanical property characterization was conducted to validate the proposed framework.
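The analytical screening step can be illustrated with a toy sketch. The Python below uses a Rosenthal-style melt-pool depth scaling with simple lack-of-fusion and keyholing thresholds; the material constants, absorptivity and thresholds are assumed values for illustration, not NiNb5 data or the framework's actual criteria.

```python
import math

# Illustrative material constants (assumed, not from the paper)
RHO_C = 4.5e6       # volumetric heat capacity, J/(m^3 K)
DT_MELT = 1500.0    # liquidus temperature rise above ambient, K
ABSORPTIVITY = 0.4  # assumed laser absorptivity

def melt_depth(power, speed):
    """Rosenthal-style scaling: melt-pool depth (m) grows with absorbed
    laser power and shrinks with scan speed."""
    return math.sqrt(2 * ABSORPTIVITY * power /
                     (math.pi * math.e * RHO_C * DT_MELT * speed))

def printable(power, speed, layer=30e-6, keyhole_ratio=3.0):
    """Keep a (P, v) point in the processing window if the pool remelts the
    full layer (no lack of fusion) but is not so deep that keyholing-type
    porosity is expected."""
    d = melt_depth(power, speed)
    return layer < d <= keyhole_ratio * layer
```

Sweeping such criteria over a power-speed grid yields a candidate processing window cheaply; in the framework this screen is supplemented with statistical calibration against experiments and uncertainty quantification.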

Findings

Near fully dense parts with more than 99% density were successfully printed using the proposed framework. Furthermore, the mechanical properties of the as-printed parts showed low variability, a good tensile strength of up to 662 MPa and a tensile ductility 51% higher than has been reported in the literature.

Originality/value

Although many literature studies investigate process optimization for metal AM, there is a lack of a systematic printability assessment framework to determine manufacturing process parameters for newly designed AM materials in an accelerated fashion. Moreover, the majority of existing process optimization approaches involve either time- and cost-intensive experimental campaigns or require the use of proprietary computational materials codes. Through the use of a readily accessible analytical thermal model coupled with statistical calibration and uncertainty quantification techniques, the proposed framework achieves both efficiency and accessibility to the user. Furthermore, this study demonstrates that following this framework results in printed parts with low degrees of variability in their mechanical properties.

Abstract

Details

Quantitative and Empirical Analysis of Nonlinear Dynamic Macromodels
Type: Book
ISBN: 978-0-44452-122-4

Book part
Publication date: 1 October 2014

Charilaos Mertzanis

Abstract

Standard financial risk management practices proved unable to provide an adequate understanding and a timely warning of the financial crisis. In particular, the theoretical foundations of risk management and the statistical calibration of risk models are called into question. Policy makers and practitioners respond by looking for new analytical approaches and tools to identify and address new sources of financial risk. Financial markets satisfy reasonable criteria of being considered complex adaptive systems, characterized by complex financial instruments and complex interactions among market actors. Policy makers and practitioners need to take both a micro and macro view of financial risk, identify proper transparency requirements on complex instruments, develop dynamic models of information generation that best approximate observed financial outcomes, and identify and address the causes and consequences of systemic risk. Complexity analysis can make a useful contribution. However, the methodological suitability of complexity theory for financial systems and by extension for risk management is still debatable. Alternative models drawn from the natural sciences and evolutionary theory are proposed.

Details

Risk Management Post Financial Crisis: A Period of Monetary Easing
Type: Book
ISBN: 978-1-78441-027-8

Article
Publication date: 26 July 2013

Fasil Ejigu Eregno, Chong‐Yu Xu and Nils‐Otto Kitterød

Abstract

Purpose

Recent advances in hydrological impact studies indicate that assessing the response of specific catchments to climate change scenarios with a single-model approach is questionable. This study was aimed at investigating the impact of climate change on three river basins in China, Ethiopia and Norway using the WASMOD and HBV hydrological models.

Design/methodology/approach

First, hydrological models' parameters were determined using current hydro‐climatic data inputs. Second, the historical time series of climatic data was adjusted according to the climate change scenarios. Third, the hydrological characteristics of the catchments under the adjusted climatic conditions were simulated using the calibrated hydrological models. Finally, comparisons of the model simulations of the current and possible future hydrological characteristics were performed. Responses were evaluated in terms of runoff, actual evapotranspiration and soil moisture change for incremental precipitation and temperature change scenarios.
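The four-step procedure can be sketched with a toy model. The Python below uses a delta-change scenario adjustment and a one-bucket stand-in for the calibrated WASMOD/HBV models; all parameter values are illustrative, and the real conceptual models are substantially richer.

```python
def delta_change(precip, temp, dp_percent, dt_deg):
    """Incremental scenario: scale precipitation by a fixed percentage and
    shift temperature by a fixed offset (delta-change adjustment)."""
    p = [x * (1.0 + dp_percent / 100.0) for x in precip]
    t = [x + dt_deg for x in temp]
    return p, t

def bucket_model(precip, temp, capacity=100.0):
    """One-bucket stand-in for a calibrated conceptual model: storage fills
    with precipitation, loses water to temperature-driven evaporation, and
    spills anything above capacity as runoff."""
    storage, runoff = 50.0, []
    for p, t in zip(precip, temp):
        storage += p
        storage -= min(storage, max(t, 0.0) * 0.2)  # crude ET term
        spill = max(storage - capacity, 0.0)
        storage -= spill
        runoff.append(spill)
    return runoff
```

Comparing the runoff series simulated under the baseline and adjusted forcings, for each calibrated model in turn, is exactly the comparison step described above; disagreement between models under the same scenario is the ambiguity the Findings discuss.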

Findings

From the results obtained, it can be inferred that two equally well-calibrated models gave different hydrological responses to hypothetical climatic scenarios. The authors' findings support the concern that climate change analysis using lumped hydrological models may lead to unreliable conclusions.

Practical implications

Extrapolation of driving forces (temperature and precipitation) beyond the range of parameter calibration yields unreliable responses. It is beyond the scope of this study to reduce this model ambiguity, but reducing uncertainty is a challenge for further research.

Originality/value

The research was conducted on primary time-series data using two existing hydrological models, to test the magnitude of the differences one can expect when using different hydrological models to simulate the hydrological response to climate change in different climate zones.

Details

International Journal of Climate Change Strategies and Management, vol. 5 no. 3
Type: Research Article
ISSN: 1756-8692

Abstract

Details

Recent Developments in Transport Modelling
Type: Book
ISBN: 978-0-08-045119-0

Abstract

Details

Quantitative and Empirical Analysis of Nonlinear Dynamic Macromodels
Type: Book
ISBN: 978-0-44452-122-4
