Search results

1 – 10 of over 160,000
Book part
Publication date: 23 June 2016

Daniel J. Henderson and Christopher F. Parmeter

Abstract

It is known that model averaging estimators are useful when there is uncertainty regarding which covariates should enter the model. We argue that in applied research there is also uncertainty as to which method one should deploy, prompting model averaging over user-defined choices. Specifically, we propose, and detail, a nonparametric regression estimator averaged over the choice of kernel, bandwidth selection mechanism and local-polynomial order. Simulations and an empirical application are provided to highlight the potential benefits of the method.
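
Illustrative sketch (not taken from the chapter): one way to average a nonparametric regression over user-defined choices is to form a small candidate set spanning kernels, bandwidth rules and local-polynomial orders and then combine the fits with data-driven weights. The inverse cross-validation weights below are an assumption made for illustration; the chapter's own weighting scheme may differ.

```python
# Hedged sketch: averaging nonparametric regression fits over user choices
# (kernel, bandwidth rule, local-polynomial order). The inverse leave-one-out
# CV weights are an illustrative assumption, not necessarily the criterion
# used by Henderson and Parmeter.
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)

kernels = {
    "gaussian": lambda u: np.exp(-0.5 * u**2),
    "epanechnikov": lambda u: np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0),
}
bandwidths = {"rule_of_thumb": 1.06 * x.std() * x.size ** (-0.2), "small": 0.05}

def local_poly_fit(x0, x, y, h, kern, order):
    """Local-polynomial estimate of E[y | x = x0] using kernel weights."""
    sw = np.sqrt(kern((x - x0) / h))
    X = np.vander(x - x0, order + 1, increasing=True)   # columns: 1, (x - x0), ...
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]                                       # intercept = fit at x0

def loo_cv(x, y, h, kern, order):
    """Leave-one-out squared error of one candidate estimator."""
    errs = []
    for i in range(x.size):
        mask = np.arange(x.size) != i
        pred = local_poly_fit(x[i], x[mask], y[mask], h, kern, order)
        errs.append((y[i] - pred) ** 2)
    return np.mean(errs)

# Candidate set = kernel x bandwidth rule x polynomial order (0 = local constant, 1 = local linear)
candidates = [(k, h, p) for k in kernels for h in bandwidths for p in (0, 1)]
cv = np.array([loo_cv(x, y, bandwidths[h], kernels[k], p) for k, h, p in candidates])
weights = (1 / cv) / (1 / cv).sum()          # illustrative inverse-CV weights

grid = np.linspace(0, 1, 50)
fits = np.array([[local_poly_fit(g, x, y, bandwidths[h], kernels[k], p) for g in grid]
                 for k, h, p in candidates])
averaged_fit = weights @ fits                # model-averaged regression curve
```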

Details

Essays in Honor of Aman Ullah
Type: Book
ISBN: 978-1-78560-786-8

Book part
Publication date: 29 February 2008

Todd E. Clark and Michael W. McCracken

Abstract

Small-scale VARs are widely used in macroeconomics for forecasting US output, prices, and interest rates. However, recent work suggests these models may exhibit instabilities. As such, a variety of estimation or forecasting methods might be used to improve their forecast accuracy. These include using different observation windows for estimation, intercept correction, time-varying parameters, break dating, Bayesian shrinkage, model averaging, etc. This paper compares the effectiveness of such methods in real-time forecasting. We use forecasts from univariate time series models, the Survey of Professional Forecasters, and the Federal Reserve Board's Greenbook as benchmarks.
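
Illustrative sketch (synthetic data, not from the paper): two of the devices listed above, an expanding versus a rolling estimation window for a small VAR, plus a simple equal-weight forecast combination, compared by mean squared forecast error.

```python
# Hedged sketch of two of the methods mentioned above: an expanding versus a
# rolling estimation window for a small VAR(1), plus an equal-weight average of
# the two forecasts. Purely illustrative synthetic data; the paper's real-time
# US series, survey benchmarks and fuller menu of methods are not reproduced.
import numpy as np

rng = np.random.default_rng(1)
T, k = 300, 3                                  # sample length, number of variables
A_true = np.array([[0.5, 0.1, 0.0],
                   [0.0, 0.4, 0.1],
                   [0.1, 0.0, 0.3]])
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A_true.T + rng.normal(0, 1, k)

def var1_forecast(sample):
    """One-step VAR(1) forecast from OLS on the given sample (with intercept)."""
    X = np.column_stack([np.ones(sample.shape[0] - 1), sample[:-1]])
    B, *_ = np.linalg.lstsq(X, sample[1:], rcond=None)
    return np.concatenate([[1.0], sample[-1]]) @ B

window, start = 80, 150
errs = {"expanding": [], "rolling": [], "average": []}
for t in range(start, T - 1):
    f_exp = var1_forecast(Y[:t + 1])
    f_rol = var1_forecast(Y[t + 1 - window:t + 1])
    f_avg = 0.5 * (f_exp + f_rol)              # simple forecast combination
    for name, f in zip(errs, (f_exp, f_rol, f_avg)):
        errs[name].append(np.mean((Y[t + 1] - f) ** 2))

for name, e in errs.items():
    print(name, "MSFE:", np.mean(e))
```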

Details

Forecasting in the Presence of Structural Breaks and Model Uncertainty
Type: Book
ISBN: 978-1-84950-540-6

Article
Publication date: 9 November 2012

Redha Benachour, Saïda Latreche, Mohamed El Hadi Latreche and Christian Gontrand

Abstract

Purpose

The present work aims to explain how the nonlinear average model can be used in power electronic integration design as a behavioral model.

Design/methodology/approach

The nonlinear average model is used in power electronic integration design as a behavioral model and is applied to a voltage source inverter based on IGBTs. This model was chosen because it takes into account the nonlinearity of the power semiconductor components and the wiring circuit effects, which can be formalized by the virtual delay concept. In addition, the nonlinear average model cannot distinguish between slow and fast variables, which is an important feature for the model's convergence.
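
Illustrative sketch (not the authors' model): the general idea of an average behavioral model is to replace the switched converter dynamics with duty-cycle-averaged state equations and integrate them with an ordinary ODE solver. The ideal buck converter below is a stand-in; it omits the device nonlinearities and wiring effects (virtual delay) that the paper's nonlinear average model captures.

```python
# Hedged sketch of the general idea of an average (behavioral) model: replace
# the switched dynamics of a converter by its duty-cycle-averaged state-space
# equations and integrate them with an ordinary ODE solver. This is a generic
# ideal buck-converter example, not the authors' nonlinear average model of an
# IGBT voltage-source inverter.
import numpy as np
from scipy.integrate import solve_ivp

Vin, L, C, R = 48.0, 100e-6, 220e-6, 5.0      # input voltage, LC filter, load

def duty(t):
    """Slowly varying duty cycle (the 'slow' control input of the average model)."""
    return 0.3 + 0.2 * (t > 2e-3)

def averaged_buck(t, x):
    iL, vC = x
    d = duty(t)
    diL = (d * Vin - vC) / L                  # averaged inductor voltage balance
    dvC = (iL - vC / R) / C                   # averaged capacitor charge balance
    return [diL, dvC]

sol = solve_ivp(averaged_buck, (0.0, 6e-3), [0.0, 0.0], max_step=1e-6)
print("final averaged output voltage:", sol.y[1, -1])
```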

Findings

The paper gives an extensive theoretical treatment of the construction of the nonlinear average model algorithm and provides detailed explanations of its application to voltage source inverter design. The study demonstrates how the model captures the effect of the nonlinearity of the power semiconductor components' characteristics on the dynamic electrical quantities, and how it predicts the effects due to wiring in the inverter circuit.

Research limitations/implications

More simulations and experimental analysis are still necessary to improve the model's accuracy, by using other static characteristic approaches, and to validate the applicability of the model to different converter topologies.

Practical implications

The paper formulates a simple nonlinear average model algorithm, discussing each step. The model is described in VHDL-AMS. On the one hand, it will assist theoretical and practical research on different topologies of power electronic converters, particularly in power integration systems design such as integrated power electronics modules (IPEMs). On the other hand, it will give designers a more precise behavioral model with a simpler design process.

Originality/value

Using the nonlinear average model as a behavioral model in power electronic integration design is a novel approach. The model significantly reduces computational cost, takes physical effects into account and is easy to implement.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 31 no. 6
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 11 May 2010

Kaiçar Ammous, Elyes Haouas and Slim Abid

Abstract

Purpose

The purpose of this paper is to develop a simulation tool that reduces the cost of long time-range simulation of complex converters running at high frequency.

Design/methodology/approach

To obtain a simplified representation of the converter, the adopted technique uses the averaged representation of the converter cell.

Findings

The paper shows that the averaged representation of the pulse-width modulation (PWM) switch remains applicable in multilevel converters. The main advantage of the proposed averaged model is its simplified representation when only the electrical behaviour is considered.
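
Illustrative sketch (generic, not the paper's multilevel model or its VHDL-AMS description): the averaged PWM-switch idea replaces the switch cell over one switching period by its cycle-averaged relations i_a = d*i_c and v_cp = d*v_ap, where d is the duty ratio.

```python
# Hedged sketch of the averaged pulse-width-modulation (PWM) switch idea: over
# one switching period, the ideal switch cell is replaced by its cycle-averaged
# relations i_a = d * i_c and v_cp = d * v_ap, where d is the duty ratio.
# Generic illustration only; the paper's multilevel converter model is not
# reproduced here.
import numpy as np

d = 0.4                        # duty ratio
v_ap, i_c = 100.0, 5.0         # terminal voltage and common-terminal current (constant over a cycle)

# Switched waveforms over one period (switch closed for a fraction d of the period)
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
on = t < d
i_a_switched = np.where(on, i_c, 0.0)         # active-terminal current
v_cp_switched = np.where(on, v_ap, 0.0)       # common-to-passive voltage

# Averaged (behavioural) model of the same cell
i_a_avg, v_cp_avg = d * i_c, d * v_ap

print(np.mean(i_a_switched), i_a_avg)         # both ~= d * i_c
print(np.mean(v_cp_switched), v_cp_avg)       # both ~= d * v_ap
```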

Research limitations/implications

The analytical algorithm of the averaged model can be implemented in any simulator that supports a description language, enabling the study of electromagnetic compatibility (EMC) and electrothermal phenomena.

Originality/value

This paper presents an averaged model of the multilevel converter that can be implemented in any simulator that supports a description language.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 29 no. 3
Type: Research Article
ISSN: 0332-1649

Book part
Publication date: 5 April 2024

Bruce E. Hansen and Jeffrey S. Racine

Abstract

Classical unit root tests are known to suffer from potentially crippling size distortions, and a range of procedures have been proposed to attenuate this problem, including the use of bootstrap procedures. It is also known that the estimating equation’s functional form can affect the outcome of the test, and various model selection procedures have been proposed to overcome this limitation. In this chapter, the authors adopt a model averaging procedure to deal with model uncertainty at the testing stage. In addition, the authors leverage an automatic model-free dependent bootstrap procedure where the null is imposed by simple differencing (the block length is automatically determined using recent developments for bootstrapping dependent processes). Monte Carlo simulations indicate that this approach exhibits the lowest size distortions among its peers in settings that confound existing approaches, while it has superior power relative to those peers whose size distortions do not preclude their general use. The proposed approach is fully automatic, and there are no nuisance parameters that have to be set by the user, which ought to appeal to practitioners.
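
Illustrative sketch (not the chapter's procedure): an ADF-type statistic averaged over several lag orders, with its null distribution obtained from a moving-block bootstrap of the first-differenced series, since differencing imposes the unit-root null. Equal model weights and a fixed block length are simplifying assumptions; the chapter selects the block length automatically.

```python
# Hedged sketch of the idea described above: average an augmented Dickey-Fuller
# (ADF) style statistic over several candidate lag orders, and obtain its null
# distribution from a moving-block bootstrap of the first-differenced series.
# Equal model weights and a fixed block length are simplifying assumptions; the
# chapter's weighting scheme and automatic block-length selection are not used.
import numpy as np

def adf_tstat(y, p):
    """t-statistic on y_{t-1} in an ADF regression with p lagged differences."""
    dy = np.diff(y)
    rows = []
    for t in range(p, dy.size):
        rows.append(np.r_[1.0, y[t], dy[t - p:t][::-1]])   # const, level, lagged diffs
    X, z = np.array(rows), dy[p:]
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    sigma2 = resid @ resid / (z.size - X.shape[1])
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

def averaged_stat(y, lags=(1, 2, 4)):
    return np.mean([adf_tstat(y, p) for p in lags])         # equal weights (assumption)

def block_bootstrap(d, block, rng):
    """Resample first differences in moving blocks, rebuild an I(1) series."""
    starts = rng.integers(0, d.size - block + 1, size=d.size // block + 1)
    resampled = np.concatenate([d[s:s + block] for s in starts])[:d.size]
    return np.cumsum(resampled)

rng = np.random.default_rng(2)
y = np.cumsum(rng.normal(size=400))            # a unit-root process for illustration
stat = averaged_stat(y)
d = np.diff(y)                                 # differencing imposes the null
null_stats = [averaged_stat(block_bootstrap(d, block=12, rng=rng)) for _ in range(499)]
pvalue = np.mean(np.array(null_stats) <= stat) # left-tailed test
print("averaged statistic:", stat, "bootstrap p-value:", pvalue)
```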

Details

Essays in Honor of Subal Kumbhakar
Type: Book
ISBN: 978-1-83797-874-8

Article
Publication date: 1 June 2004

Jessica Gullbrand

Abstract

Large-eddy simulation (LES) of a turbulent channel flow is performed using different subfilter-scale (SFS) models and test filter functions. The SFS models used are the dynamic Smagorinsky model (DSM) and the dynamic mixed model (DMM). The DMM is a linear combination of the scale-similarity model and the DSM. The test filter functions investigated are the sharp cut-off filter (in spectral space) and a smooth filter that is commutative up to fourth order. The filters are applied either in the homogeneous directions or in all three spatial directions. The governing equations are discretized using a fourth-order energy-conserving finite-difference scheme. The influence of the test filter function and the SFS model on the LES results is investigated, as is the effect of two-dimensional versus three-dimensional test filtering. The study shows that the combination of SFS model and filter function strongly influences the computational results; even the effect on the zeroth-order moment is large.
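
Illustrative sketch (one-dimensional toy field, not the channel-flow LES of the paper): a top-hat test filter and the scale-similarity (Leonard-type) term that the dynamic mixed model combines with a dynamic Smagorinsky part.

```python
# Hedged sketch of one ingredient mentioned above: a top-hat (box) test filter
# and the scale-similarity term L = filter(u*u) - filter(u)*filter(u), which the
# dynamic mixed model combines with a dynamic Smagorinsky contribution. This is
# a one-dimensional toy field, not the channel-flow LES of the paper.
import numpy as np

def tophat_filter(u, width=5):
    """Discrete top-hat test filter via a moving average with periodic wrap."""
    kernel = np.ones(width) / width
    padded = np.concatenate([u[-(width // 2):], u, u[:width // 2]])
    return np.convolve(padded, kernel, mode="valid")

rng = np.random.default_rng(3)
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
u = np.sin(x) + 0.3 * np.sin(8 * x) + 0.05 * rng.normal(size=x.size)   # resolved velocity

u_f = tophat_filter(u)
L = tophat_filter(u * u) - u_f * u_f           # scale-similarity stress (1-D analogue)
print("mean scale-similarity term:", L.mean())
```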

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 14 no. 4
Type: Research Article
ISSN: 0961-5539

Book part
Publication date: 5 April 2024

Ziwen Gao, Steven F. Lehrer, Tian Xie and Xinyu Zhang

Abstract

Motivated by empirical features that characterize cryptocurrency volatility data, the authors develop a forecasting strategy that can account for both model uncertainty and heteroskedasticity of unknown form. The theoretical investigation establishes the asymptotic optimality of the proposed heteroskedastic model averaging heterogeneous autoregressive (H-MAHAR) estimator under mild conditions. The authors additionally examine the convergence rate of the estimated weights of the proposed H-MAHAR estimator. This analysis sheds new light on the asymptotic properties of the least squares model averaging estimator under alternative complicated data generating processes (DGPs). To examine the performance of the H-MAHAR estimator, the authors conduct an out-of-sample forecasting application involving 22 different cryptocurrency assets. The results emphasize the importance of accounting for both model uncertainty and heteroskedasticity in practice.
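
Illustrative sketch (synthetic data, not the authors' estimator): the heterogeneous autoregressive (HAR) building block regresses realized volatility on its daily, weekly and monthly averages; several HAR specifications are then combined with data-driven weights. The jackknife-based weights below are a generic stand-in, not the H-MAHAR weighting scheme derived in the chapter.

```python
# Hedged sketch of the heterogeneous autoregressive (HAR) building block:
# realized volatility is regressed on its daily, weekly and monthly averages,
# and several candidate HAR specifications are combined. The leave-one-out
# (jackknife) weights below are a generic stand-in, not the H-MAHAR criterion.
import numpy as np

rng = np.random.default_rng(4)
rv = np.abs(np.cumsum(rng.normal(0, 0.1, 600))) + 0.5      # synthetic realized-volatility series

def har_design(rv, horizons):
    """Rows: [1, mean of last h obs for each h]; target: next-day RV."""
    start = max(horizons)
    X = [[1.0] + [rv[t - h:t].mean() for h in horizons] for t in range(start, rv.size - 1)]
    return np.array(X), rv[start + 1:]

all_h = (1, 5, 22)                                          # daily / weekly / monthly
candidates = [(1,), (1, 5), (1, 5, 22)]
X_full, y = har_design(rv, all_h)
preds, sse = [], []
for horizons in candidates:
    cols = [0] + [1 + all_h.index(h) for h in horizons]     # keep a common sample
    Xc = X_full[:, cols]
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    fit = Xc @ beta
    preds.append(fit)
    # leave-one-out residuals via the hat matrix (generic jackknife criterion)
    H = Xc @ np.linalg.inv(Xc.T @ Xc) @ Xc.T
    e_loo = (y - fit) / (1 - np.diag(H))
    sse.append(e_loo @ e_loo)

weights = (1 / np.array(sse)) / (1 / np.array(sse)).sum()   # illustrative weights
combined = weights @ np.array(preds)                        # model-averaged fitted RV
print("weights:", np.round(weights, 3))
```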

Book part
Publication date: 5 October 2018

Details

Fuzzy Hybrid Computing in Construction Engineering and Management
Type: Book
ISBN: 978-1-78743-868-2

Article
Publication date: 21 January 2022

Maximilien de Zordo-Banliat, Xavier Merle, Gregory Dergham and Paola Cinnella

Abstract

Purpose

The Reynolds-averaged Navier–Stokes (RANS) equations represent the computational workhorse for engineering design, despite their numerous flaws. Improving and quantifying the uncertainties associated with RANS models is particularly critical in view of the analysis and optimization of complex turbomachinery flows.

Design/methodology/approach

First, an efficient strategy is introduced for calibrating turbulence model coefficients from high-fidelity data. The results are highly sensitive to the flow configuration (called a calibration scenario) used to inform the coefficients. Second, the bias introduced by the choice of a specific turbulence model is reduced by constructing a mixture model by means of Bayesian model-scenario averaging (BMSA). The BMSA model makes predictions of flows not included in the calibration scenarios as a probability-weighted average of a set of competing turbulence models, each supplemented with multiple sets of closure coefficients inferred from alternative calibration scenarios.
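
Illustrative sketch (made-up numbers): the BMSA combination step forms a probability-weighted mixture over turbulence models and calibration scenarios, with the mixture variance following from the law of total variance. The model names, scenarios and all numerical values below are hypothetical, not those of the paper.

```python
# Hedged sketch of the Bayesian model-scenario averaging (BMSA) combination
# step: the predicted quantity is a probability-weighted mixture over models m
# and calibration scenarios s, E[q] = sum_m sum_s p(m) p(s) E[q | m, s], with a
# matching mixture variance. All names and numbers are hypothetical; the
# paper's turbulence models, scenarios and posteriors are not used.
import numpy as np

models = ["model_1", "model_2", "model_3"]               # hypothetical turbulence models
scenarios = ["scenario_A", "scenario_B"]                 # hypothetical calibration scenarios

# Per (model, scenario) predictive mean and variance of some quantity of interest
mean = np.array([[1.00, 1.10],
                 [0.95, 1.05],
                 [1.02, 0.98]])
var = np.array([[0.010, 0.020],
                [0.015, 0.012],
                [0.008, 0.018]])

p_model = np.array([0.3, 0.3, 0.4])                      # model probabilities
p_scenario = np.array([0.5, 0.5])                        # scenario probability mass function

w = np.outer(p_model, p_scenario)                        # joint weights, sum to 1
bmsa_mean = np.sum(w * mean)
# law of total variance: within-component variance + between-component spread
bmsa_var = np.sum(w * var) + np.sum(w * (mean - bmsa_mean) ** 2)
print("BMSA mean:", bmsa_mean, "BMSA std:", np.sqrt(bmsa_var))
```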

Findings

Different choices for the scenario probabilities are assessed for the prediction of the NACA65 V103 cascade at off-design conditions. In all cases, BMSA improves the solution accuracy with respect to the baseline turbulence models, and the estimated uncertainty intervals encompass the reference data reasonably well. The BMSA results were found to be only weakly sensitive to the user-defined scenario-weighting criterion, both in terms of the average prediction and of the estimated confidence intervals.

Originality/value

A delicate step in the BMSA is the selection of suitable scenario-weighting criteria, i.e. suitable prior probability mass functions (PMFs) for the calibration scenarios. The role of such PMFs is to assign higher probability to calibration scenarios more likely to provide an accurate estimate of model coefficients for the new flow. In this paper, three mixture models are constructed, based on alternative choices of the scenario probabilities. The authors then compare the capabilities of three different criteria.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 32 no. 4
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 11 July 2008

Aapo Länsiluoto and Tomas Eklund

Abstract

Purpose

The purpose of this study is to present and compare the results of an evaluation of two self‐organizing map (SOM) models' suitability for financial environment analysis. The models are constructed for analyzing the macro‐ and firm‐level environments in the international pulp and paper industry.
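
Illustrative sketch (random data, not the study's financial indicators): a minimal self-organizing map training loop of the kind such models are built on. Grid units compete for each input, and the best-matching unit and its neighbours are pulled toward it.

```python
# Hedged sketch of a plain self-organizing map (SOM) training loop of the kind
# the models above are built on. Random data stands in for the financial
# indicators used in the study; this is not the authors' macro- or firm-level model.
import numpy as np

rng = np.random.default_rng(5)
data = rng.normal(size=(500, 6))                  # e.g. 6 financial ratios per company-year
grid_h, grid_w, dim = 8, 8, data.shape[1]
weights = rng.normal(size=(grid_h, grid_w, dim))  # codebook vectors
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 3000, 0.5, 3.0
for it in range(n_iter):
    frac = it / n_iter
    lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
    x = data[rng.integers(data.shape[0])]
    # best-matching unit = codebook vector closest to the input
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # Gaussian neighbourhood on the grid around the BMU
    grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
    h = np.exp(-grid_dist2 / (2 * sigma**2))
    weights += lr * h[..., None] * (x - weights)

# Map each observation to its best-matching unit for inspection/visualization
bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=-1)), (grid_h, grid_w))
        for x in data]
print("distinct map units used:", len(set(bmus)))
```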

Design/methodology/approach

The data for the evaluation were collected through a field survey in 13 publicly listed Finnish companies, with a total of 36 respondents. All the respondents were involved in business intelligence or corporate development related tasks.

Findings

The results indicate that the respondents appreciated the capabilities of both SOM models. All of the factors relating to accuracy, content, format, timeliness, and usability for strategic decision making received ratings higher than neutral. Respondents also concluded that the SOM models have additional benefits compared with currently used methods. Finally, the respondents were willing to utilize the SOM as a complement to other tools for analyzing the competitive environment. Some subfactors of the firm-level SOM model received statistically significantly higher average ratings than the macro-level SOM model.

Originality/value

Despite the SOM having been utilized in thousands of different applications, user satisfaction with the SOM and its information products has not previously been widely evaluated, especially not by potential end‐users.

Details

Benchmarking: An International Journal, vol. 15 no. 4
Type: Research Article
ISSN: 1463-5771
