Search results

1 – 10 of over 129,000
Book part
Publication date: 19 November 2014

Enrique Martínez-García and Mark A. Wynne

Abstract

We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy – characterized by a Taylor-type rule – faces in an interconnected world, with perfectly flexible exchange rates. We then use posterior model probabilities to evaluate the weight of evidence in support of such a model when estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky by means of controlled experiments that employ simulated data. We argue that Bayesian model comparison with posterior odds is sensitive to sample size and the choice of observable variables for estimation. We show that posterior model probabilities strongly penalize overfitting, which can lead us to favor a less parameterized model against the true data-generating process when the two become arbitrarily close to each other. We also illustrate that the spillovers from monetary policy across countries have an added confounding effect.
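
The posterior-odds calculation the abstract refers to is standard Bayesian model comparison; as a rough, hedged illustration (not the authors' code, and with purely invented numbers), posterior model probabilities can be formed from log marginal likelihoods and prior model probabilities as follows:

```python
import numpy as np

# Hypothetical log marginal likelihoods log p(y | M_k) for three candidate models,
# e.g. the open-economy model, a frictionless variant and an autarky variant.
# The values below are invented for illustration only.
log_ml = np.array([-512.3, -514.1, -520.8])
prior = np.array([1 / 3, 1 / 3, 1 / 3])      # equal prior model probabilities

# Posterior model probabilities: p(M_k | y) is proportional to p(y | M_k) p(M_k).
# Work in logs and subtract the maximum for numerical stability.
log_post = log_ml + np.log(prior)
log_post -= log_post.max()
post = np.exp(log_post)
post /= post.sum()

# Posterior odds of model 0 against model 1 (the Bayes factor under equal priors).
print(post, post[0] / post[1])
```

With larger samples the log marginal likelihoods typically drift apart, which is one way the sensitivity to sample size discussed in the abstract shows up.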

Article
Publication date: 6 February 2017

David Jansen van Vuuren

Abstract

Purpose

The purpose of this paper is twofold: first, to suggest a modified sales comparison model that is scalable and adaptable for valuing under conditions of certainty and uncertainty. The model can potentially be applied to residential property, non-residential property and large-item plant and machinery in determining the value, rental or capitalisation rate. The second purpose is to address practitioner and end-user bias, which, if unaddressed, can lead to inconsistent valuation results.

Design/methodology/approach

Literature was reviewed on decision theory, specifically cognitive limitations, heuristics and biases. A qualitative approach is followed in the paper although the output of the proposed model itself is quantitative.

Findings

The paper argues that practitioners and end users alike tend to avoid advanced statistical techniques when valuing under conditions of certainty, while such techniques are not feasible under conditions of uncertainty. In addition, owing to the representative heuristic, practitioners can be over-confident in their ability, skill or knowledge when performing valuations under conditions of certainty. When valuing under conditions of uncertainty, practitioners tend to avoid simple rule models because they consider the process too unique to be standardised. The combined effect is inconsistent valuation results, which can potentially be addressed through an integrated and modified sales comparison model that takes into account varying degrees of certainty and uncertainty.

Practical implications

The proposed modified sales comparison model is an integrated model that can be adopted by practitioners in valuing residential, non-residential and large plant and machinery. It can potentially be used to value under conditions of certainty and uncertainty and improve valuation consistency. End users such as mortgage lenders and investors can benefit from the adoption of this model.

Originality/value

The aim of this paper is to propose an integrated and modified sales comparison model for valuing under conditions of certainty, normal uncertainty and abnormal uncertainty. The integrated model can value based on direct comparison under conditions of certainty and uncertainty, while addressing the avoidance of advanced statistical techniques in practice and the implications of the representative heuristic and the halo effect, as cognitive biases, for valuation consistency.
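
The abstract does not spell out the model's mechanics, so the following is only a generic, hedged sketch of the direct (sales) comparison logic the proposal builds on, not the author's model; every attribute, adjustment rate and weight is invented for the example:

```python
# Generic sales comparison (adjustment grid) sketch -- illustrative only.
# Each comparable sale price is adjusted for differences from the subject
# property, and the adjusted prices are then combined with judgement weights.

subject = {"size_m2": 180, "garages": 2}

comparables = [
    {"price": 1_450_000, "size_m2": 170, "garages": 2, "weight": 0.5},
    {"price": 1_600_000, "size_m2": 195, "garages": 1, "weight": 0.3},
    {"price": 1_380_000, "size_m2": 175, "garages": 2, "weight": 0.2},
]

RATE_PER_M2 = 6_000       # hypothetical adjustment per square metre
RATE_PER_GARAGE = 80_000  # hypothetical adjustment per garage

def adjusted_price(comp):
    adj = comp["price"]
    adj += (subject["size_m2"] - comp["size_m2"]) * RATE_PER_M2
    adj += (subject["garages"] - comp["garages"]) * RATE_PER_GARAGE
    return adj

value = sum(adjusted_price(c) * c["weight"] for c in comparables)
print(round(value))
```

Under uncertainty, adjustment rates and weights like these are arguably where the heuristics and biases the paper discusses enter, which is what a standardised, integrated model is meant to constrain.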

Details

Journal of Property Investment & Finance, vol. 35 no. 1
Type: Research Article
ISSN: 1463-578X


Article
Publication date: 27 February 2009

Mourad Elhadef

Abstract

Purpose

The purpose of this paper is to describe a novel diagnosis approach, using neural networks (NNs), which can be used to identify faulty nodes in distributed and multiprocessor systems.

Design/methodology/approach

A perceptron-based diagnosis algorithm is designed for the asymmetric comparison model. The perceptron is trained during an off-line learning phase and is then used to identify the set of faulty nodes; the approach is validated through extensive simulations.

Findings

This work shows that NNs can be used to implement a more efficient and adaptable approach for diagnosing faulty nodes in distributed systems. Simulation results indicate that the perceptron-based diagnosis is a viable addition to existing diagnosis approaches.

Research limitations/implications

This paper presents a solution for the asymmetric comparison model. A more general approach covering other comparison or invalidation models would require a multilayer neural network.

Practical implications

The extensive simulations conducted clearly showed that the perceptron-based diagnosis algorithm correctly identified all of the millions of fault situations tested. In addition, the perceptron-based diagnosis requires an off-line learning phase, which does not affect diagnosis latency, so a fault set can be identified easily and rapidly. Simulation results showed that only a few milliseconds are required to diagnose a system; hence, one can start talking about "real-time" diagnosis.
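
No implementation details are given in the abstract; as a minimal, hedged sketch of the idea (ours, not the paper's algorithm), a perceptron can be trained off-line on comparison syndromes and then queried with a single dot product per node, which is why diagnosis latency stays low. All of the data, the encoding and the parameters below are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each example is a comparison "syndrome" (a binary vector of
# comparison outcomes) and the label says whether a given node is faulty.
# Both the syndromes and the labelling rule are synthetic.
n_features, n_samples = 12, 500
X = rng.integers(0, 2, size=(n_samples, n_features)).astype(float)
true_w = rng.normal(size=n_features)
y = (X @ true_w > true_w.sum() / 2).astype(int)

# Off-line learning phase: classical perceptron updates.
w, b, lr = np.zeros(n_features), 0.0, 0.1
for _ in range(50):
    for xi, yi in zip(X, y):
        pred = int(xi @ w + b > 0)
        err = yi - pred
        w += lr * err * xi
        b += lr * err

# On-line diagnosis: one dot product per node, hence very low latency.
def diagnose(syndrome):
    return int(syndrome @ w + b > 0)   # 1 = faulty, 0 = fault-free

print(sum(diagnose(xi) == yi for xi, yi in zip(X, y)) / n_samples)
```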

Originality/value

The paper is the first work that uses NNs to solve the system-level diagnosis problem.

Details

Education, Business and Society: Contemporary Middle Eastern Issues, vol. 2 no. 1
Type: Research Article
ISSN: 1753-7983


Book part
Publication date: 30 May 2013

Andreas Schwab and William H. Starbuck

Abstract

This chapter reports on a rapidly growing trend in data analysis – analytic comparisons between baseline models and explanatory models. Baseline models estimate values for the dependent variable in the absence of hypothesized causal effects. Thus, the baseline models discussed in this chapter differ from the baseline models commonly used in sequential regression analyses.

Baseline modelling entails iteration: (1) Researchers develop baseline models to capture key patterns in the empirical data that are independent of the hypothesized effects. (2) They compare these patterns with the patterns implied by their explanatory models. (3) They use the derived insights to improve their explanatory models. (4) They iterate by comparing their improved explanatory models with modified baseline models.

The chapter draws on methodological literature in economics, applied psychology, and the philosophy of science to point out fundamental features of baseline modelling. Examples come from research in international business and management, emerging market economies and developing countries.

Baseline modelling offers substantial advantages for theory development. Although analytic comparisons with baseline models originated in some research fields as early as the 1960s, they have not been widely discussed or applied in international management. Baseline modelling takes a more inductive and iterative approach to modelling and theory development. Because baseline modelling holds substantial potential, international-management scholars should explore its opportunities for advancing scientific progress.
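
As a rough, hedged illustration of the kind of comparison described above (not the authors' procedure), one can score an explanatory model against a baseline that ignores the hypothesized effect and let the gap guide the next modelling round; the data and both models below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: the outcome depends on a firm-specific level plus a
# hypothesized causal effect of x (both data and models are illustrative only).
n = 400
firm_level = rng.normal(0.0, 1.0, size=n)
x = rng.normal(size=n)
y = firm_level + 0.4 * x + rng.normal(scale=0.5, size=n)

def mse(pred):
    return float(np.mean((y - pred) ** 2))

# Baseline model: predicts y from patterns independent of the hypothesized
# effect (here, just the firm-specific level).
baseline_pred = firm_level

# Explanatory model: baseline plus the hypothesized effect of x, with the
# coefficient estimated by least squares on the baseline residuals.
beta = np.polyfit(x, y - firm_level, deg=1)[0]
explanatory_pred = firm_level + beta * x

print("baseline MSE:   ", mse(baseline_pred))
print("explanatory MSE:", mse(explanatory_pred))
```

The size of the improvement over the baseline, rather than the rejection of a no-effect null, is what drives the next iteration of both models.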

Details

Philosophy of Science and Meta-Knowledge in International Business and Management
Type: Book
ISBN: 978-1-78190-713-9

Article
Publication date: 28 August 2009

Kerstin Altmanninger, Martina Seidl and Manuel Wimmer

Abstract

Purpose

The purpose of this paper is to provide a feature-based characterization of version control systems (VCSs) and an overview of the state of the art of versioning systems dedicated to modeling artifacts.

Design/methodology/approach

Based on a literature study of existing approaches, a description of the features of versioning systems is established. Special focus is placed on three-way merging, which is an integral component of optimistic versioning. This characterization is then applied to current model versioning systems, allowing the derivation of challenges in this research area.
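
Three-way merging is discussed above only at the feature level; purely as a hedged sketch of the core rule (take a side's change when only one side changed, flag a conflict when both did), the fragment below reduces model elements to flat key/value pairs, which is an assumption made for the example rather than how model versioning systems actually represent artifacts:

```python
# Minimal three-way merge over flattened model elements -- illustrative only.
# base: common ancestor version; left/right: the two parallel revisions.

def three_way_merge(base, left, right):
    merged, conflicts = {}, []
    for key in set(base) | set(left) | set(right):
        b, l, r = base.get(key), left.get(key), right.get(key)
        if l == r:                # both sides agree (possibly both unchanged)
            merged[key] = l
        elif l == b:              # only the right side changed this element
            merged[key] = r
        elif r == b:              # only the left side changed this element
            merged[key] = l
        else:                     # both sides changed it differently: conflict
            conflicts.append(key)
            merged[key] = (l, r)  # keep both candidates for manual resolution
    return merged, conflicts

base = {"Class.name": "Order", "Class.abstract": False}
left = {"Class.name": "PurchaseOrder", "Class.abstract": False}
right = {"Class.name": "Order", "Class.abstract": True}
print(three_way_merge(base, left, right))
```

Real model versioning is harder than this element-wise view, for instance when changes affect model structure rather than individual values, which is roughly where the challenges discussed in the paper come in.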

Findings

The results of the evaluation show that several challenges need to be addressed in future developments of VCSs and merging tools in order to allow the parallel development of model artifacts.

Practical implications

Making model‐driven engineering (MDE) a success requires supporting the parallel development of model artifacts as is done nowadays for text‐based artifacts. Therefore, model versioning capabilities are a must for leveraging MDE in practice.

Originality/value

The paper gives a comprehensive overview of the collaboration features of VCSs for software engineering artifacts in general, discusses the state of the art of systems for model artifacts and, finally, lists urgent challenges that have to be considered in future model versioning systems for realizing MDE in practice.

Details

International Journal of Web Information Systems, vol. 5 no. 3
Type: Research Article
ISSN: 1744-0084


Book part
Publication date: 3 June 2008

Nathaniel T. Wilcox

Abstract

Choice under risk has a large stochastic (unpredictable) component. This chapter examines five stochastic models for binary discrete choice under risk and how they combine with "structural" theories of choice under risk. Stochastic models are substantive theoretical hypotheses that are frequently testable in and of themselves; they also serve as identifying restrictions for hypothesis tests, estimation and prediction. Econometric comparisons suggest that for the purpose of prediction (as opposed to explanation), choices of stochastic models may be far more consequential than choices of structures such as expected utility or rank-dependent utility.
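
To make the structure/stochastic-model distinction concrete (the specifics below are ours, not the chapter's), one can pair a single structure, expected utility with CRRA, with a single stochastic model, a logistic ("Fechner"-style) choice rule, to get a choice probability for a binary lottery pair; the utility form, the sensitivity parameter and the lotteries are all hypothetical:

```python
import math

def crra_utility(x, r=0.5):
    """Structural part: CRRA utility with risk-aversion parameter r."""
    return x ** (1 - r) / (1 - r) if r != 1 else math.log(x)

def expected_utility(lottery, r=0.5):
    # lottery: list of (probability, outcome) pairs with positive outcomes
    return sum(p * crra_utility(x, r) for p, x in lottery)

def choice_prob(lottery_a, lottery_b, r=0.5, sensitivity=2.0):
    """Stochastic part: logistic choice in the expected-utility difference."""
    diff = expected_utility(lottery_a, r) - expected_utility(lottery_b, r)
    return 1.0 / (1.0 + math.exp(-sensitivity * diff))

# A safe lottery versus a riskier one (hypothetical stimuli).
safe = [(1.0, 30.0)]
risky = [(0.5, 70.0), (0.5, 5.0)]
print(choice_prob(safe, risky))
```

Swapping the logistic rule for a different stochastic model while keeping the CRRA structure fixed (or vice versa) is the kind of comparison whose predictive consequences the chapter examines.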

Details

Risk Aversion in Experiments
Type: Book
ISBN: 978-1-84950-547-5

Article
Publication date: 10 December 2019

Eric Goncalves Da Silva and Philippe Parnaudeau

Abstract

Purpose

The purpose of this paper is to quantify the relative importance of the multiphase model for the simulation of a gas bubble impacted by a normal shock wave in water. Both the free-field case and the collapse near a wall are investigated. Simulations are performed on both two- and three-dimensional configurations. The main phenomena involved in the bubble collapse are illustrated. A focus on the maximum pressure reached during the collapse is proposed.

Design/methodology/approach

Simulations are performed using an inviscid compressible homogeneous solver based on different systems of equations. It consists of solving different mixture or phasic conservation laws together with a transport equation for the gas volume fraction. Three-dimensional configurations are considered, for which an efficient massively parallel strategy was developed. The code is based on a finite volume discretization in which numerical fluxes are computed with a Harten-Lax-van Leer-Contact (HLLC) scheme.
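
The solver itself is not reproduced in the abstract; as a hedged sketch of the kind of HLLC flux evaluation a finite-volume code of this type relies on, here is a single-interface HLLC flux for the 1D Euler equations with an ideal gas and simple Davis wave-speed estimates, which is far simpler than the compressible multiphase systems actually solved in the paper:

```python
import numpy as np

GAMMA = 1.4  # ideal-gas ratio of specific heats (assumption for this sketch)

def primitive(U):
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u * u)
    return rho, u, p

def euler_flux(U):
    rho, u, p = primitive(U)
    return np.array([rho * u, rho * u * u + p, u * (U[2] + p)])

def hllc_flux(UL, UR):
    """HLLC numerical flux for the 1D Euler equations (Toro-style star states)."""
    rhoL, uL, pL = primitive(UL)
    rhoR, uR, pR = primitive(UR)
    aL, aR = np.sqrt(GAMMA * pL / rhoL), np.sqrt(GAMMA * pR / rhoR)

    # Davis wave-speed estimates and the contact (star) wave speed.
    SL, SR = min(uL - aL, uR - aR), max(uL + aL, uR + aR)
    Sstar = (pR - pL + rhoL * uL * (SL - uL) - rhoR * uR * (SR - uR)) / (
        rhoL * (SL - uL) - rhoR * (SR - uR))

    if SL >= 0.0:
        return euler_flux(UL)
    if SR <= 0.0:
        return euler_flux(UR)

    def star_state(U, rho, u, p, S):
        factor = rho * (S - u) / (S - Sstar)
        energy = U[2] / rho + (Sstar - u) * (Sstar + p / (rho * (S - u)))
        return factor * np.array([1.0, Sstar, energy])

    if Sstar >= 0.0:
        return euler_flux(UL) + SL * (star_state(UL, rhoL, uL, pL, SL) - UL)
    return euler_flux(UR) + SR * (star_state(UR, rhoR, uR, pR, SR) - UR)

# Sod-type left/right states: U = [rho, rho*u, E], E = p/(gamma-1) + 0.5*rho*u**2.
UL = np.array([1.0, 0.0, 1.0 / (GAMMA - 1.0)])
UR = np.array([0.125, 0.0, 0.1 / (GAMMA - 1.0)])
print(hllc_flux(UL, UR))
```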

Findings

The comparison of three multiphase models is proposed. It is shown that a simple four-equation model is well-suited to simulate such strong shock-bubble interaction. The three-dimensional collapse near a wall is investigated. It is shown that the intensity of pressure peaks on the wall is drastically increased (more than 200 per cent) in comparison with the cylindrical case.

Research limitations/implications

The study of bubble collapse is a key point to understand the physical mechanism involved in cavitation erosion. The bubble collapse close to the wall has been addressed as the fundamental mechanism producing damage. Its general behavior is characterized by the formation of a water jet that penetrates through the bubble and the generation of a blast wave during the induced collapse. Both the jet and the blast wave are possible damaging mechanisms. However, the high-speed dynamics, the small spatio-temporal scales and the complicated physics involved in these processes make any theoretical and experimental approach a challenge.

Practical implications

Cavitation erosion is a major problem for hydraulic and marine applications. It is a limiting factor in the design of such components.

Originality/value

Such a comparison of multiphase models in the case of a strong shock-induced bubble collapse is clearly original. Usually, models are tested separately, leading to a large dispersion of results. Moreover, simulations of a three-dimensional bubble collapse on such fine grids are scarce in the literature.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 22 no. 8
Type: Research Article
ISSN: 0961-5539


Article
Publication date: 12 May 2020

Shao-Ming Xie and Chun-Yao Huang

Abstract

Purpose

Predicting the inactivity and the repeat transaction frequency of a firm's customer base is critical for customer relationship management. The literature offers two main approaches to such predictions: stochastic modeling, represented by Pareto/NBD, and machine learning, represented by neural network analysis (NNA). As these two approaches have been developed and applied in parallel, this study systematically compares their prediction accuracy and identifies the scenarios in which each model is the more appropriate choice.

Design/methodology/approach

By designing a rolling exploration scheme with moving calibration/holdout combinations of customer data, this research explores the two approaches' relative performance, first on three real-world datasets and then on a wide range of simulated datasets.
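
The exact scheme is the authors'; the hedged sketch below only illustrates the general shape of a rolling calibration/holdout comparison, scoring two stand-in forecasters on each split. The synthetic purchase data and both "models" are invented and are much simpler than Pareto/NBD or a neural network:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic weekly repeat-purchase counts for a customer cohort (illustrative).
weeks = 104
purchases = rng.poisson(lam=0.3, size=(1000, weeks))   # customers x weeks

def model_a(calib):
    """Stand-in model A: predict each customer's calibration-period mean."""
    return calib.mean(axis=1)

def model_b(calib):
    """Stand-in model B: shrink individual means toward the cohort mean."""
    return 0.7 * calib.mean(axis=1) + 0.3 * calib.mean()

def rolling_comparison(data, calib_len=52, holdout_len=26, step=13):
    results = []
    for start in range(0, data.shape[1] - calib_len - holdout_len + 1, step):
        calib = data[:, start:start + calib_len]
        hold = data[:, start + calib_len:start + calib_len + holdout_len]
        actual = hold.mean(axis=1)
        for name, model in (("A", model_a), ("B", model_b)):
            mae = float(np.abs(model(calib) - actual).mean())
            results.append((start, name, round(mae, 4)))
    return results

for row in rolling_comparison(purchases):
    print(row)
```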

Findings

The empirical result indicates that neither approach is dominant and identifies patterns of relative applicability between the two. Such patterns are consistent across the empirical and the simulated datasets.

Originality/value

This study contributes to the literature by bridging two previously parallel analytical approaches to customer base predictions. No prior research has provided a comprehensive comparison of the two approaches' relative performance in customer base predictions as this study does. The patterns identified in the two approaches' relative prediction performance provide practitioners with a clear-cut menu for selecting approaches for customer base predictions. The findings further urge marketing scientists to reevaluate the modeling efforts of the past half-century by assessing what can be replaced by black boxes such as NNA and what cannot.

Details

Asia Pacific Journal of Marketing and Logistics, vol. 33 no. 2
Type: Research Article
ISSN: 1355-5855


Article
Publication date: 4 September 2017

Luca Barbazza, Maurizio Faccio, Fabio Oscari and Giulio Rosati

Abstract

Purpose

This paper aims to analyze different possible assembly systems, including innovative potential configurations such as the fully flexible assembly system, by defining a novel analytical model that focuses on the concept of agility and its impact on overall system performance, and by evaluating economic convenience in terms of the unit direct production cost.

Design/methodology/approach

The authors propose a comparison model derived from Newton's second law, introducing quantitative definitions of agility (acceleration), of the resistance of an assembly system to any change of its operative state (inertia) and of the unit direct production cost (force). Different types of assembly systems (manual, flexible and fully flexible) are analyzed and compared using the proposed model, investigating agility, system inertia and their impact on the unit direct production cost.
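
Read literally, and with symbols of our own choosing that do not appear in the abstract, the analogy amounts to a one-line correspondence:

```latex
% Newton's second law and the assembly-system analogy (our notation, illustrative only):
%   force F          <->  unit direct production cost c
%   mass m           <->  system inertia I (resistance to changing operative state)
%   acceleration a   <->  agility \alpha
F = m\,a \qquad \longleftrightarrow \qquad c = I\,\alpha,
\qquad\text{so that}\qquad \alpha = \frac{c}{I}.
```

On this reading, for a given unit direct production cost a lower-inertia system achieves higher agility, which appears to be the trade-off that the convenience areas in the Findings section capture.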

Findings

The proposed agility definition and the proposed comparison model have been applied to different sets of parameters taken as independent variables, such as the number of components to assemble (product model complexity) and the target throughput of the system. The main findings are a series of convenience areas which, for a given target unit direct production cost (force), identify the most agile system to adopt or, for a given target agility (acceleration), identify the most economical system to adopt, as a function of the independent variables.

Originality/value

The novelty of this work is, first, the analytical definition of agility applied to assembly systems and contextualized by means of the definition of the new comparison model. The comparison between different assembly systems on the basis of agility, and by using different sets of independent variables, is a further element of interest. Finally, the resulting convenience areas represent a desirable tool that could be used to optimally choose the most suitable assembly system according to one or more system parameters.

Details

Assembly Automation, vol. 37 no. 4
Type: Research Article
ISSN: 0144-5154


Book part
Publication date: 1 August 2012

Andreas Schwab and William H. Starbuck

Abstract

Purpose – This chapter reports on a rapidly growing trend in the analysis of data about emerging market (EM) economies – the use of baseline models as comparisons for explanatory models. Baseline models estimate expected values for the dependent variable in the absence of a hypothesized causal effect, but they set higher standards than traditional null-hypothesis tests, which expect no effect.

Design/methodology/approach – Although the use of baseline models in research originated in the 1960s, it has not been widely discussed, or even acknowledged, in the EM literature. We surveyed published EM studies to determine trends in the use of baseline models.

Findings – We categorize and describe the different types of baseline models that scholars have used in EM studies, and draw inferences about the differences between more effective and less effective uses of baseline models.

Value – We believe that comparisons with baseline models offer distinct methodological advantages for the iterative development of better explanatory models and a deeper understanding of empirical phenomena.

Details

West Meets East: Toward Methodological Exchange
Type: Book
ISBN: 978-1-78190-026-0

