Search results

1 – 10 of over 1000
Article

Yuan Fangyang and Chen Zhongli

Abstract

Purpose

The purpose of this paper is to develop new types of the direct expansion method of moments (DEMM) that use n/3th moments for simulating nanoparticle Brownian coagulation in the free molecule regime. The feasibility of the newly proposed DEMMs with n/3th moments for describing the evolution of the aerosol size distribution is investigated, and some of the models will be applied to further simulation of physical processes.

Design/methodology/approach

The accuracy and efficiency of several methods of moments are compared, including the quadrature method of moments (QMOM), the Taylor-expansion method of moments (TEMOM), the log-normal preserving method of moments proposed by Lee (LMM) and the DEMM derived in this paper. QMOM with 12 quadrature approximation points is taken as the reference against which the other methods are evaluated.

Findings

The newly derived models, namely DEMM(4/3,4) and DEMM(2,6), as well as the previous DEMM(2,4), qualify on the grounds of their high accuracy and efficiency. They are confirmed to be valid alternative models for describing the evolution of the aerosol size distribution in particle dynamical processes involving the n/3th moments.

Originality/value

The n/3th moments, which have clear physical interpretations when n takes the first few integer values, are introduced into the DEMM for the first time for simulating nanoparticle Brownian coagulation in the free molecule regime.
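For orientation, a minimal sketch of the fractional moments in question, assuming the particle volume v is taken as the internal coordinate (the paper's own nondimensionalization and notation may differ):

```latex
% k/3th moments of the particle size distribution n(v,t); for spherical
% particles M_0 is the total number concentration, M_{2/3} is proportional to
% the total surface area and M_1 is the total particle volume.
M_{k/3}(t) = \int_0^{\infty} v^{k/3}\, n(v,t)\, \mathrm{d}v, \qquad k = 0, 1, 2, \ldots
```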

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 25 no. 1
Type: Research Article
ISSN: 0961-5539

Book part

Annette Bergemann, Erik Grönqvist and Soffia Guðbjörnsdóttir

Abstract

We investigate how career disruptions in the form of job loss may affect morbidity for individuals diagnosed with type 2 diabetes (T2D). Combining unique, high-quality longitudinal data from the Swedish National Diabetes Register (NDR) with matched employer–employee data, we focus on individuals diagnosed with T2D who are established on the labor market and who lose their job in a mass layoff. A conditional difference-in-differences evaluation approach gives limited support for job loss having an impact on health behavior, diabetes progression, and cardiovascular risk factors.

Details

Health and Labor Markets
Type: Book
ISBN: 978-1-78973-861-2

Article

R.W. Faff and S. Lau

Abstract

Standard multivariate tests of mean variance efficiency (MVE) have been criticised on the grounds that they require regression residuals to have a multivariate normal distribution. Generally, the existing evidence suggests that the normality assumption is questionable, even for monthly returns. MacKinlay and Richardson (1991) developed a generalised method of moments (GMM) framework which provides tests that are valid under much weaker distributional assumptions. They examined monthly US data formed into size-based portfolios for mean-variance efficiency relative to the Sharpe-Lintner CAPM, and found that inferences regarding mean-variance efficiency can be sensitive to the test considered. In this paper we further investigate their GMM tests using monthly Australian data over the period 1974 to 1994. We extend their analysis to consider an alternative version of their GMM test and to examine a zero-beta version of the CAPM. As in the US case, our results indicate sensitivity of inferences to the tests used. Finally, while the GMM tests generally reject mean-variance efficiency, tests involving the zero-beta CAPM, particularly when a value-weighted market index is used, prove less prone to rejection.
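As a rough illustration of the kind of test involved, and not the authors' exact GMM implementation, the sketch below estimates each portfolio's alpha and beta by OLS (the just-identified GMM estimator for these moment conditions) and forms a Wald statistic for the joint hypothesis that all alphas are zero, using a White-style robust covariance in the spirit of MacKinlay and Richardson (1991); the function name and input arrays are placeholders.

```python
import numpy as np
from scipy import stats

def capm_alpha_wald(excess_returns, market_excess):
    """Wald test that all alphas are zero in r_it = a_i + b_i * r_mt + e_it.

    excess_returns : (T, N) array of portfolio excess returns
    market_excess  : (T,)  array of market excess returns
    """
    T, N = excess_returns.shape
    X = np.column_stack([np.ones(T), market_excess])   # (T, 2) regressors
    XtX_inv = np.linalg.inv(X.T @ X)
    coefs = XtX_inv @ X.T @ excess_returns             # row 0: alphas, row 1: betas
    resid = excess_returns - X @ coefs                 # (T, N) residuals

    # Robust covariance of the stacked estimates (a_1, b_1, a_2, b_2, ...)
    S = np.zeros((2 * N, 2 * N))
    for t in range(T):
        g_t = np.kron(resid[t], X[t])                  # moment contributions at time t
        S += np.outer(g_t, g_t)
    S /= T
    D_inv = np.kron(np.eye(N), XtX_inv * T)            # inverse Jacobian blocks (X'X/T)^{-1}
    V = D_inv @ S @ D_inv / T                          # sandwich covariance of estimates

    alpha = coefs[0]
    idx = np.arange(0, 2 * N, 2)                       # positions of the alphas
    V_alpha = V[np.ix_(idx, idx)]
    wald = float(alpha @ np.linalg.solve(V_alpha, alpha))
    return wald, stats.chi2.sf(wald, df=N)             # statistic and p-value
```

A zero-beta variant would work with raw rather than excess returns and treat the zero-beta rate as an additional parameter, which is the direction the paper's further tests take.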

Details

Pacific Accounting Review, vol. 9 no. 1
Type: Research Article
ISSN: 0114-0582

Book part

Sara Riscado

Abstract

In this chapter we approach the estimation of dynamic stochastic general equilibrium models through a moments-based estimator, the empirical likelihood. We attempt to show that this inference process can be a valid alternative to maximum likelihood, which has been one of the preferred choices in the related literature for estimating these models. The empirical likelihood estimator is characterized by a simple setup and only requires knowledge of the moments of the data generating process of the model. In this context, we exploit the fact that these economies can be formulated as a set of moment conditions to infer their parameters through this technique. For illustrative purposes, we consider a standard real business cycle model with a constant relative risk aversion utility function and indivisible labor, driven by a normal technology shock.
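As a hedged toy example of the empirical likelihood machinery described here, and not the chapter's DSGE application, the sketch below profiles the log empirical likelihood for a scalar parameter defined by the single moment condition E[x − θ] = 0, using the standard dual formulation; the data are simulated placeholders.

```python
import numpy as np
from scipy.optimize import brentq, minimize_scalar

def log_el(theta, x):
    """Log empirical likelihood ratio for the moment condition E[x - theta] = 0.

    Dual form: p_i = 1 / (n * (1 + lam * g_i)) with g_i = x_i - theta, where lam
    solves sum_i g_i / (1 + lam * g_i) = 0; the value is -sum_i log(1 + lam * g_i).
    """
    g = x - theta
    if g.min() >= 0 or g.max() <= 0:
        return -np.inf                              # theta outside the convex hull
    lo = (-1.0 + 1e-10) / g.max()                   # keep all 1 + lam * g_i > 0
    hi = (-1.0 + 1e-10) / g.min()
    lam = brentq(lambda l: np.sum(g / (1.0 + l * g)), lo, hi)
    return -np.sum(np.log1p(lam * g))

# Hypothetical data: the EL point estimate of a mean coincides with the sample
# mean, and -2 * log_el gives likelihood-ratio-style confidence regions.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=200)
res = minimize_scalar(lambda th: -log_el(th, x),
                      bounds=(np.quantile(x, 0.25), np.quantile(x, 0.75)),
                      method="bounded")
print("EL estimate:", res.x, "sample mean:", x.mean())
```

In the DSGE setting, g would instead collect the model's moment conditions as functions of the structural parameters.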

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Book part

Tae-Seok Jang

Abstract

This chapter analyzes the empirical relationship between price-setting/consumption behavior and the sources of persistence in inflation and output. First, a small-scale New-Keynesian model (NKM) is examined using method of moments and maximum likelihood estimators with US data from 1960 to 2007. A formal test is then used to compare the fit of two competing specifications of the New-Keynesian Phillips Curve (NKPC) and the IS equation, that is, backward- and forward-looking behavior. The inclusion of a lagged term in the NKPC and the IS equation improves the fit of the model while offsetting the influence of inherited and extrinsic persistence; intrinsic persistence is shown to play a major role in approximating inflation and output dynamics for the Great Inflation period. However, the null hypothesis cannot be rejected at the 5% level for the Great Moderation period, that is, the NKM with purely forward-looking behavior and its hybrid variant are equivalent. Monte Carlo experiments investigate the validity of the chosen moment conditions and the finite-sample properties of the chosen estimation methods. Finally, the empirical performance of the formal test is discussed in terms of the Akaike and Bayesian information criteria.
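For reference, a minimal sketch of the kind of hybrid specification being compared, in generic notation rather than the chapter's own; the lagged terms carry the intrinsic persistence discussed above:

```latex
% Hybrid NKPC and IS equation; setting \gamma_b = 0 and \chi = 0 recovers the
% purely forward-looking NKM that the formal test compares against.
\pi_t = \gamma_f\, \mathbb{E}_t \pi_{t+1} + \gamma_b\, \pi_{t-1} + \kappa\, y_t + \varepsilon_{\pi,t}
y_t = (1-\chi)\, \mathbb{E}_t y_{t+1} + \chi\, y_{t-1} - \sigma \bigl( i_t - \mathbb{E}_t \pi_{t+1} \bigr) + \varepsilon_{y,t}
```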

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Article

Aldo Artur Belardi, José Roberto Cardoso and Carlos Antonio França Sartori

Abstract

Presents the mathematical basis and some results concerning the application of Haar wavelets as expansion functions in the method of moments for solving electrostatic problems. Two applications involving the evaluation of linear and surface charge densities were carried out: the first on a finite straight wire and the second on a thin square plate. Some optimization techniques were used, and their main computational performance aspects are emphasized. Comparative results are presented for Haar wavelets versus conventional expansion functions.
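As an illustrative sketch of the general idea rather than a reproduction of the paper's formulation: a thin straight wire held at a fixed potential is solved with pulse-basis/point-matching method of moments, the impedance matrix is transformed into an orthonormal Haar basis, thresholded for sparsity, and solved; the segment count, wire dimensions and thresholding rule are assumptions.

```python
import numpy as np

EPS0 = 8.854187817e-12  # vacuum permittivity (F/m)

def haar_matrix(n):
    """Orthonormal Haar transform matrix of size n x n (n a power of 2)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                       # coarse (scaling) rows
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])      # finest-level wavelet rows
    H = np.vstack([top, bottom])
    return H / np.linalg.norm(H, axis=1, keepdims=True)

def wire_charge_density(length=1.0, radius=1e-3, v0=1.0, n=64, threshold=1e-3):
    """Linear charge density (C/m) on a thin wire held at potential v0 (volts)."""
    dz = length / n
    z = (np.arange(n) + 0.5) * dz                      # segment midpoints
    dist = np.abs(z[:, None] - z[None, :])
    Z = np.where(dist > 0,
                 dz / (4 * np.pi * EPS0 * np.maximum(dist, 1e-30)), 0.0)
    # thin-wire self term for a uniformly charged segment of radius `radius`
    np.fill_diagonal(Z, np.log(dz / radius) / (2 * np.pi * EPS0))
    v = np.full(n, v0)

    H = haar_matrix(n)
    Zw, vw = H @ Z @ H.T, H @ v                        # wavelet-domain system
    Zw[np.abs(Zw) < threshold * np.abs(Zw).max()] = 0  # sparsify small entries
    rho = H.T @ np.linalg.solve(Zw, vw)                # back to segment densities
    return z, rho
```

The appeal of the wavelet expansion is that many entries of the transformed matrix become negligible, so sparse storage and solvers can be used; skipping the transform and thresholding recovers the conventional-expansion solution for comparison.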

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 23 no. 3
Type: Research Article
ISSN: 0332-1649

Book part

Jean-Marie Dufour and Pascale Valéry

Abstract

In this paper, we consider the estimation of volatility parameters in the context of a linear regression where the disturbances follow a stochastic volatility (SV) model of order one with Gaussian log-volatility. The linear regression represents the conditional mean of the process and may have a fairly general form, including for example finite-order autoregressions. We provide a computationally simple two-step estimator available in closed form. Under general regularity conditions, we show that this two-step estimator is asymptotically normal. We study its statistical properties by simulation, compare it with alternative generalized method-of-moments (GMM) estimators, and present an application to the S&P composite index.
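As a loose, closed-form illustration of moment-based estimation for an SV(1) model with Gaussian log-volatility, and not the authors' two-step estimator, the sketch below matches the mean and low-order autocovariances of log r_t^2 for r_t = sigma_t * eps_t with standard normal eps_t; in a regression context such as the paper's it would be applied to the regression residuals.

```python
import numpy as np

LOG_CHI2_MEAN = -1.2703628  # E[log eps^2] for standard normal eps (psi(1/2) + log 2)

def sv_moment_estimates(residuals):
    """Moment estimates for log sig2_t = omega + phi * log sig2_{t-1} + s_v * v_t.

    Uses y_t = log r_t^2 = log sig2_t + log eps_t^2, whose autocovariances are
    gamma_k = phi**k * Var(log sig2), so the lag-2/lag-1 ratio identifies phi.
    """
    y = np.log(residuals ** 2 + 1e-12)        # small offset guards against log(0)
    y_c = y - y.mean()
    gamma1 = (y_c[1:] * y_c[:-1]).mean()      # lag-1 autocovariance
    gamma2 = (y_c[2:] * y_c[:-2]).mean()      # lag-2 autocovariance

    phi = gamma2 / gamma1
    var_h = gamma1 / phi                      # Var(log sig2_t)
    s_v = np.sqrt(max(var_h * (1.0 - phi ** 2), 0.0))
    omega = (1.0 - phi) * (y.mean() - LOG_CHI2_MEAN)
    return omega, phi, s_v
```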

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
ISBN: 978-0-76231-274-0

Article

Kun Zhou and Zhu He

Abstract

Purpose

The purpose of this paper is to investigate aerosol evolution in a planar mixing layer from a Lagrangian point of view. A Monte Carlo (MC) method is used to simulate the evolution of aerosol dynamics along particle trajectories, yielding the particle size distributions, which are unavailable in the most commonly used methods of moments.

Design/methodology/approach

Nucleation and growth of dibutyl phthalate (DBP) particles are simulated using the quadrature method of moments in a planar mixing layer, where a fast hot stream carrying DBP vapor mixes with a slow cool stream without vapor. Trajectories of aerosol particles are recorded, and the MC method is then used to simulate the aerosol evolution along these trajectories.
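A very reduced sketch of the Lagrangian idea, and not the paper's MC scheme: track a stochastic particle population along one trajectory, adding particles according to a prescribed nucleation-rate history and growing all of them by a prescribed condensation rate; the rate histories J(t) and G(t) below are hypothetical placeholders.

```python
import numpy as np

def mc_along_trajectory(times, nucleation_rate, growth_rate, v_star=1.0, seed=0):
    """Sample a particle population (the PSD) along one Lagrangian trajectory.

    times           : increasing array of time points along the trajectory
    nucleation_rate : J(t), expected number of new particles per unit time
    growth_rate     : G(t), condensation growth rate dv/dt for every particle
    v_star          : volume assigned to freshly nucleated particles
    """
    rng = np.random.default_rng(seed)
    volumes = np.empty(0)
    for t0, t1 in zip(times[:-1], times[1:]):
        dt = t1 - t0
        volumes = volumes + growth_rate(t0) * dt               # condensation growth
        n_new = rng.poisson(nucleation_rate(t0) * dt)          # nucleation burst
        volumes = np.concatenate([volumes, np.full(n_new, v_star)])
    return volumes

# Hypothetical rate histories: an early nucleation burst, then steady condensation.
times = np.linspace(0.0, 1.0, 2001)
psd = mc_along_trajectory(times,
                          nucleation_rate=lambda t: 5e3 * np.exp(-20.0 * t),
                          growth_rate=lambda t: 0.5)
print(len(psd), "particles, mean volume:", psd.mean())
```

Even this caricature reproduces the qualitative behavior noted in the findings below: sustained condensation pushes the whole population toward larger sizes, while continued nucleation keeps injecting the smallest particles and broadens the distribution.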

Findings

Investigation of aerosol evolution along the trajectories prompts their classification into three groups: first, trajectories away from the active nucleation zone; second, trajectories starting from the active nucleation zone; and third, trajectories crossing over the active nucleation zone. Particle size distributions (PSDs) along selected representative trajectories are investigated. The PSD evolution exhibits interesting behavior due to the combined effects of nucleation and condensation. Condensation growth tends to narrow the PSD and forms a sharp front on the large-particle side. Nucleation is able to broaden the PSD by generating the smallest particles. The duration and strength of nucleation have a significant effect on the shape of the PSD.

Originality/value

To the authors' knowledge, this is the first simulation of aerosol evolution that takes a Lagrangian point of view and uses MC simulation along particle trajectories to provide the particle size distribution.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 24 no. 8
Type: Research Article
ISSN: 0961-5539

Content available
Article

Sakiru Oladele Akinbode, Adewale Oladapo Dipeolu, Tobi Michael Bolarinwa and Oladayo Babaseun Olukowi

Abstract

Purpose

Some progress has been made over time in improving health conditions in Sub-Saharan Africa (SSA). There are, however, contradicting reports on the relationship between health outcomes and economic growth in the region. The paper aims to assess the effect of health outcomes on economic growth in SSA.

Design/methodology/approach

Data for 41 countries from 2000 to 2018 were obtained from the WDI and WGI and analyzed using the system generalized method of moments (sGMM), which is appropriate for the present scenario. AR(1) and AR(2) tests were used to assess the validity of the model, while Sargan and Hansen tests were adopted to examine the validity of the instrumental variables. The robustness of the estimation was confirmed using pooled OLS and fixed-effect regressions.
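For concreteness, a sketch of the kind of dynamic panel specification and instrument set that system GMM typically rests on; the variable names here are generic rather than the paper's exact specification:

```latex
% Dynamic growth equation with country effect \eta_i:
%   y_{it}: GDP per capita (growth), h_{it}: health outcome (life expectancy),
%   x_{it}: other controls (capital formation, labor force, FDI, trade openness, ...).
y_{it} = \alpha\, y_{i,t-1} + \beta\, h_{it} + \gamma' x_{it} + \eta_i + \varepsilon_{it}
% System GMM estimates the first-differenced equation, instrumented by lagged
% levels (y_{i,t-2}, y_{i,t-3}, \ldots), jointly with the level equation,
% instrumented by lagged first differences (\Delta y_{i,t-1}); the AR(1)/AR(2)
% and Sargan/Hansen statistics then check error serial correlation and
% instrument validity.
```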

Findings

Health outcome (proxied by life expectancy), lagged GDP per capita, capital formation, labor force (LF), health expenditure (HE), foreign direct investment (FDI) and trade openness (TOP) significantly affected economic growth, emphasizing the importance of health in the region's growth process. The AR(1) and AR(2) tests for serial correlation and the Sargan/Hansen tests confirmed the validity of the estimated model and of the instrumental variables, respectively. The robustness of the GMM results was established by the pooled OLS and fixed-effect model results.

Social implications

Improvements in national health systems, possibly through the widespread adoption of National Health Insurance, increased government spending on healthcare, and increased beneficial trade and ease of doing business to facilitate investment were recommended to enhance economic growth.

Originality/value

The study uses up-to-date data and an appropriate methodology.

Details

Journal of Economics and Development, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1859-0020
