Search results

1 – 10 of 417
Article
Publication date: 29 September 2022

Fei Wang and Tat Leung Chan

Abstract

Purpose

The purpose of this study is to present a newly proposed and developed sorting algorithm-based merging weighted fraction Monte Carlo (SAMWFMC) method for solving the population balance equation for the weighted fraction coagulation process in aerosol dynamics with high computational accuracy and efficiency.

Design/methodology/approach

In the new SAMWFMC method, the jump Markov process is constructed as in the weighted fraction Monte Carlo (WFMC) method (Jiang and Chan, 2021), with a fraction function. Both adjustable and constant fraction functions are used to validate the computational accuracy and efficiency. A new merging scheme is also proposed to maintain a constant-number and constant-volume scheme.
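The constant-number, constant-volume merging idea can be illustrated with a minimal sketch. The rule below is a generic weighted-average merge, not necessarily the paper's exact scheme:

```python
def merge_particles(p1, p2):
    """Merge two weighted numerical particles (volume, weight) into one.

    The merged particle conserves the total statistical weight w1 + w2 and
    the total weighted volume w1*v1 + w2*v2, so pairing one merge with each
    coagulation event keeps both the numerical particle count and the
    represented aerosol volume constant.
    """
    (v1, w1), (v2, w2) = p1, p2
    w = w1 + w2
    v = (w1 * v1 + w2 * v2) / w   # weighted-average volume
    return (v, w)

merged = merge_particles((1.0, 2.0), (3.0, 1.0))
```

Because the weighted volume `w*v` of the merged particle equals the sum of the two originals, total represented volume is exact under this rule; higher-order moments are only approximated.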

Findings

The new SAMWFMC method is fully validated by comparing with existing analytical solutions for six benchmark test cases. The numerical results obtained from the SAMWFMC method with both adjustable and constant fraction functions show excellent agreement with the analytical solutions and low stochastic errors. Compared with the WFMC method (Jiang and Chan, 2021), the SAMWFMC method can significantly reduce the stochastic error in the total particle number concentration without increasing the stochastic errors in high-order moments of the particle size distribution at only slightly higher computational cost.

Originality/value

The WFMC method (Jiang and Chan, 2021) imposes a stringent restriction on the fraction functions, so that few fraction functions are applicable to it beyond several specifically selected adjustable ones, while its stochastic error in the total particle number concentration remains considerably large. The newly developed SAMWFMC method shows significant improvement and advantage in dealing with the weighted fraction coagulation process in aerosol dynamics, and offers excellent potential for handling various fraction functions with higher computational accuracy and efficiency.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 33 no. 2
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 4 February 2021

Xiao Jiang and Tat Leung Chan

Abstract

Purpose

The purpose of this study is to investigate the aerosol dynamics of the particle coagulation process using a newly developed weighted fraction Monte Carlo (WFMC) method.

Design/methodology/approach

The weighted numerical particles are adopted in a similar manner to the multi-Monte Carlo (MMC) method, with the addition of a new fraction function (α). Probabilistic removal is also introduced to maintain a constant-number scheme.
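The constant-number idea can be sketched as follows. The pair selection assumes a constant (size-independent) kernel, and the weight-update and replacement rules are illustrative placeholders rather than the exact WFMC scheme:

```python
import random

def coagulate(particles, n_events, seed=0):
    """Minimal constant-number Monte Carlo coagulation sketch.

    particles: list of (volume, weight) numerical particles. A constant
    coagulation kernel is assumed, so colliding pairs are drawn uniformly.
    """
    parts = list(particles)
    n = len(parts)
    rng = random.Random(seed)
    for _ in range(n_events):
        i, j = rng.sample(range(n), 2)
        (vi, wi), (vj, wj) = parts[i], parts[j]
        parts[i] = (vi + vj, min(wi, wj))   # merged particle keeps smaller weight
        # keep the numerical particle count constant: the consumed slot is
        # refilled with a copy of a randomly chosen surviving particle
        k = rng.randrange(n)
        while k == j:
            k = rng.randrange(n)
        parts[j] = parts[k]
    return parts

out = coagulate([(1.0, 1.0)] * 100, 50)
```

The replacement step is the "probabilistic removal" device: instead of letting the ensemble shrink after each coagulation event, a survivor is duplicated, and the weights account for the change in represented number concentration.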

Findings

Three typical cases, with a constant kernel, a free-molecular coagulation kernel and different initial distributions for particle coagulation, are simulated and validated. The results show excellent agreement between the Monte Carlo (MC) method and the corresponding analytical solutions or sectional method results. Further numerical results show that the critical stochastic error in the newly proposed WFMC method is significantly reduced compared with the traditional MMC method for higher-order moments, with only a slight increase in computational cost. The particle size distribution is also found to extend into the larger size regime with the WFMC method, a regime that the classical direct simulation MC and MMC methods traditionally resolve insufficiently. The effects of different fraction functions on the weight function are also investigated.

Originality/value

Stochastic error is inevitable in MC simulations of aerosol dynamics. To minimize this critical stochastic error, many algorithms, such as the MMC method, have been proposed. In these methods, however, the weights of the numerical particles are not adjustable. The newly developed algorithm, with adjustable weights for the numerical particles, provides improved stochastic error reduction.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 31 no. 9
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 2 September 2021

Xiao Jiang and Tat Leung Chan

Abstract

Purpose

The purpose of this paper is to study the soot formation and evolution by using this newly developed Lagrangian particle tracking with weighted fraction Monte Carlo (LPT-WFMC) method.

Design/methodology/approach

The weighted soot particles are used in this MC framework and are tracked using a Lagrangian approach. A detailed soot model based on the LPT-WFMC method is used to study soot formation and evolution in ethylene laminar premixed flames.

Findings

The LPT-WFMC method is validated against both experimental results and numerical results of the direct simulation Monte Carlo (DSMC) and multi-Monte Carlo (MMC) methods. Compared with the DSMC and MMC methods, the stochastic error analysis shows that this new LPT-WFMC method can further extend the particle size distributions (PSDs) and improve the accuracy of predicted soot PSDs in the larger particle size regime.

Originality/value

Compared with conventional weighted particle schemes, the weight distributions in the LPT-WFMC method are adjustable by adopting different fraction functions. As a result, the number of numerical soot particles in each size interval is also adjustable. The stochastic error of PSDs in the larger particle size regime can thus be minimized by increasing the number of numerical soot particles in the larger size intervals.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 32 no. 6
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 20 September 2019

Hongmei Liu and Tat Leung Chan

Abstract

Purpose

The purpose of this paper is to study the evolution and growth of aerosol particles in a turbulent planar jet by using the newly developed large eddy simulation (LES)-differentially weighted operator splitting Monte Carlo (DWOSMC) method.

Design/methodology/approach

The DWOSMC method is coupled with LES for the numerical simulation of aerosol dynamics in turbulent flows.

Findings

Firstly, the newly developed and coupled LES-DWOSMC method is verified against results obtained from a direct numerical simulation-sectional method (DNS-SM) for coagulation occurring in a turbulent planar jet, taken from the available literature. Then, the effects of jet temperature and Reynolds number on the evolution of the time-averaged mean particle diameter, normalized particle number concentration and particle size distributions (PSDs) are studied numerically for both coagulation and condensation processes. The jet temperature and Reynolds number are shown to be two important parameters that can be used to control the evolution and pattern of the PSD in an aerosol reactor.

Originality/value

The coupling between the Monte Carlo method and turbulent flow still encounters many technical difficulties. In addition, the relationship between turbulence, particle properties and collision kernels of aerosol dynamics is not yet well understood due to the theoretical limitations and experimental difficulties. In the present study, the developed and coupled LES-DWOSMC method is capable of solving the aerosol dynamics in turbulent flows.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 30 no. 2
Type: Research Article
ISSN: 0961-5539

Book part
Publication date: 21 December 2010

Tong Zeng and R. Carter Hill

Abstract

In this paper we use Monte Carlo sampling experiments to examine the properties of pretest estimators in the random parameters logit (RPL) model. The pretests are for the presence of random parameters. We study the Lagrange multiplier (LM), likelihood ratio (LR), and Wald tests, using conditional logit as the restricted model. The LM test is the fastest of the three procedures to implement, since it uses only the restricted (conditional logit) estimates. However, the LM-based pretest estimator has poor risk properties. The ratio of the LM-based pretest estimator's root mean squared error (RMSE) to that of the random parameters logit model estimator diverges from one as the standard deviation of the parameter distribution increases. The LR and Wald tests exhibit the properties of consistent tests, with power approaching one as the specification error increases, so that the pretest estimator is consistent. We explore the power of these three tests for random parameters by calculating the empirical percentile values, size, and rejection rates of the test statistics. We find that the power of the LR and Wald tests decreases as the mean of the coefficient distribution increases. The LM test has the weakest power for detecting the presence of random coefficients in the RPL model.

Details

Maximum Simulated Likelihood Methods and Applications
Type: Book
ISBN: 978-0-85724-150-4

Book part
Publication date: 1 December 2008

Lijuan Cao, Zhang Jingqing, Lim Kian Guan and Zhonghui Zhao

Abstract

This paper studies the pricing of collateralized debt obligations (CDOs) using Monte Carlo and analytic methods. Both methods are developed within the framework of the reduced form model. A one-factor Gaussian copula is used for treating default correlations among the collateral portfolio. Based on the two methods, the portfolio loss, the expected loss in each CDO tranche, the tranche spread, and the default delta sensitivity are analyzed with respect to parameters such as maturity, default correlation, default intensity (hazard rate), and recovery rate. We provide a careful study of the impact of these different parameters. Our results show that the Monte Carlo method is slow and not robust in the calculation of default delta sensitivity, and that the analytic approach has comparative advantages for pricing CDOs. We also employ empirical data to investigate the implied default correlation and base correlation of the CDO. The implications of extending the analytical approach to incorporate Lévy processes are also discussed.
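The Monte Carlo leg of such a pricer can be sketched as follows, assuming a one-factor Gaussian copula with a flat default probability p to maturity. All parameter values below are illustrative, not the paper's:

```python
import math
import random
from statistics import NormalDist

def expected_tranche_loss(n_names=100, rho=0.3, p=0.02, recovery=0.4,
                          attach=0.03, detach=0.07, n_paths=5000, seed=1):
    """Monte Carlo expected loss of a CDO tranche under a one-factor
    Gaussian copula: name k defaults when
    sqrt(rho)*Z + sqrt(1-rho)*e_k < Phi^{-1}(p), with Z the common factor.
    """
    rng = random.Random(seed)
    threshold = NormalDist().inv_cdf(p)       # default barrier Phi^{-1}(p)
    lgd = (1.0 - recovery) / n_names          # loss per default (unit notional)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)               # common market factor
        loss = sum(lgd for _ in range(n_names)
                   if math.sqrt(rho) * z
                   + math.sqrt(1.0 - rho) * rng.gauss(0.0, 1.0) < threshold)
        # the tranche absorbs portfolio losses between attach and detach
        total += min(max(loss - attach, 0.0), detach - attach)
    return total / n_paths
```

Tranche spreads then follow from discounting the expected loss profile over time; the delta sensitivity noted in the abstract is where this path-wise estimator becomes noisy, since it differences two simulations.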

Details

Econometrics and Risk Management
Type: Book
ISBN: 978-1-84855-196-1

Article
Publication date: 3 January 2017

Shuyuan Liu and Tat L. Chan

Abstract

Purpose

The purpose of this paper is to study the complex aerosol dynamic processes by using this newly developed stochastically weighted operator splitting Monte Carlo (SWOSMC) method.

Design/methodology/approach

The stochastically weighted particle method and the operator splitting method are coupled to formulate the SWOSMC method for the numerical simulation of particle-fluid systems undergoing complex simultaneous processes.

Findings

This SWOSMC method is first validated by comparing its numerical simulation results for constant-rate coagulation and linear-rate condensation with the corresponding analytical solutions. Coagulation and nucleation cases are further studied, and their results show excellent agreement with those of the sectional method. The SWOSMC method also demonstrates high numerical simulation capability when used to deal with simultaneous aerosol dynamic processes, including coagulation, nucleation and condensation.

Originality/value

There is always a conflict, and a trade-off, between computational cost and accuracy in Monte Carlo-based methods for the numerical simulation of aerosol dynamics. The operator splitting method has been widely used in solving complex partial differential equations, while the stochastically weighted particle method has been commonly used in the numerical simulation of aerosol dynamics. However, the integration of these two methods has not been well investigated.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 27 no. 1
Type: Research Article
ISSN: 0961-5539

Book part
Publication date: 19 November 2014

Garland Durham and John Geweke

Abstract

Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.

Article
Publication date: 5 December 2023

Brahim Chebbab, Haroun Ragueb, Walid Ifrah and Dounya Behnous

Abstract

Purpose

This study addresses the reliability of a composite fiber (carbon fibers/epoxy matrix) at the microscopic level, with a specific focus on its behavior under compressive stresses. The primary goal is to investigate the factors that influence the reliability of the composite, specifically the effects of initial fiber deformation and fiber volume fraction.

Design/methodology/approach

The analysis involves a multi-step approach. Initially, micromechanics theory is employed to derive limit state equations that define the stress levels at which the fiber remains within an acceptable range of deformation. To assess the composite's structural reliability, a dedicated code is developed using the Monte Carlo method, incorporating random variables.
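The core of such a reliability code can be sketched as a crude Monte Carlo estimator of the failure probability. The stress-strength example below is hypothetical, not the paper's micromechanical model:

```python
import random

def failure_probability(limit_state, sample, n=50000, seed=0):
    """Crude Monte Carlo estimate of the failure probability P(g(X) <= 0),
    where g is a limit-state function and sample(rng) draws one realisation
    of the random input variables."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) <= 0.0)
    return failures / n

# Hypothetical stress-strength example (not the paper's model):
# resistance R ~ N(30, 3) MPa versus applied stress S ~ N(20, 4) MPa,
# limit state g = R - S; the exact answer is Phi(-2), about 0.0228.
pf = failure_probability(lambda x: x[0] - x[1],
                         lambda rng: (rng.gauss(30.0, 3.0),
                                      rng.gauss(20.0, 4.0)))
```

In the study's setting, the limit-state function would instead come from the micromechanics-derived equations bounding fiber deformation, with initial deformation and volume fraction among the random inputs.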

Findings

Results highlight the significance of initial fiber deformation and volume fraction for the composite's reliability. They indicate that the level of initial fiber deformation plays a crucial role: a fiber with 0.5% initial deformation can endure up to 28% additional stress compared to a fiber with 1% initial deformation. Conversely, a higher fiber volume fraction contributes positively to the composite's reliability: a composite with 60% fiber content and 0.5% initial deformation can support up to 40% additional stress compared to a composite containing 40% fibers with the same deformation.

Originality/value

The study's originality lies in its comprehensive exploration of the factors affecting the reliability of carbon fiber-epoxy matrix composites under compressive stresses. The integration of micromechanics theory and the Monte Carlo method for structural reliability analysis contributes to a thorough understanding of the composite's behavior. The findings shed light on the critical roles played by initial fiber deformation and fiber volume fraction in determining the overall reliability of the composite. Additionally, the study underscores the importance of careful fiber placement during the manufacturing process and emphasizes the role of volume fraction in ensuring the final product's reliability.

Details

International Journal of Structural Integrity, vol. 15 no. 1
Type: Research Article
ISSN: 1757-9864

Article
Publication date: 25 January 2011

A.J. Thomas, J. Chard, E. John, A. Davies and M. Francis

Abstract

Purpose

The purpose of this paper is to propose a bearing replacement strategy which employs the Monte Carlo simulation method. In this contribution, the method is used to estimate the economic impact of selecting a particular bearing change strategy. The simulation demonstrates that it is possible to identify the most cost-effective approach and thus suggests a suitable bearing replacement policy, which in turn allows engineers to develop appropriate maintenance schedules for their company.
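A minimal sketch of how such a simulation can compare policies is given below. The Weibull life model, the cost figures and the age-based rule are illustrative assumptions, not the paper's case-study data:

```python
import random

def cost_per_hour(replace_at, n_sim=20000, seed=0,
                  shape=2.0, scale=8000.0,        # Weibull bearing life (hours)
                  planned_cost=500.0, failure_cost=5000.0):
    """Monte Carlo cost rate of an age-based bearing replacement policy:
    replace at `replace_at` hours, or on failure, whichever comes first."""
    rng = random.Random(seed)
    total_cost = 0.0
    total_hours = 0.0
    for _ in range(n_sim):
        life = rng.weibullvariate(scale, shape)   # one bearing's random life
        if life < replace_at:                     # unplanned failure in service
            total_cost += failure_cost
            total_hours += life
        else:                                     # planned replacement
            total_cost += planned_cost
            total_hours += replace_at
    return total_cost / total_hours

preventive = cost_per_hour(4000.0)     # age-based preventive replacement
run_to_fail = cost_per_hour(1e12)      # effectively run-to-failure
```

Sweeping `replace_at` over a grid and picking the minimum cost rate identifies the most cost-effective replacement interval, which is the kind of comparison the case studies use to select a policy.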

Design/methodology/approach

The paper develops the Monte Carlo method through a case study approach. Three case studies are presented. The first study is detailed and chronicles the design, development and implementation of the Monte Carlo method as a means of defining a bearing replacement strategy within a subject company. The second and third cases outline the application of the Monte Carlo method in two different environments. These applications made it possible to obtain proof of concept and also to further prove the effectiveness of the Monte Carlo simulation approach.

Findings

An effective development of the Monte Carlo approach is proposed and the effectiveness of the method is subsequently evaluated, highlighting the benefits to the host organization and how the approach led to significant improvement in machinery reliability through a bearing replacement strategy.

Practical implications

The design, development and implementation of a bearing replacement strategy provide a simple yet effective approach to achieving significant improvements in system reliability and performance through less downtime and greater cost savings. The paper offers practising maintenance managers and engineers a strategic framework for increasing productive efficiency and output.

Originality/value

The proposed bearing replacement strategy contributes to the existing knowledge base on maintenance systems and disseminates this information in order to provide impetus, guidance and support for the development of companies, in an attempt to move the UK manufacturing sector towards world-class manufacturing performance.

Details

International Journal of Quality & Reliability Management, vol. 28 no. 2
Type: Research Article
ISSN: 0265-671X
