Search results

1–10 of over 48,000
Article
Publication date: 4 September 2018

Muhannad Aldosary, Jinsheng Wang and Chenfeng Li

Abstract

Purpose

This paper aims to provide a comprehensive review of uncertainty quantification methods supported by evidence-based comparison studies. Uncertainties are widely encountered in engineering practice, arising from such diverse sources as heterogeneity of materials, variability in measurement, lack of data and ambiguity in knowledge. Academia and industry have long been researching uncertainty quantification (UQ) methods to quantitatively account for the effects of various input uncertainties on the system response. Despite the rich literature of relevant research, UQ is not an easy subject for novice researchers and practitioners, as many different methods and techniques coexist with inconsistent input/output requirements and analysis schemes.

Design/methodology/approach

This confusing state of affairs significantly hampers the research progress and practical application of UQ methods in engineering. In the context of engineering analysis, research efforts on UQ are mostly concentrated in two largely separate fields: structural reliability analysis (SRA) and the stochastic finite element method (SFEM). This paper provides a state-of-the-art review of SRA and SFEM, covering both technology and application aspects. Moreover, unlike standard survey papers that focus primarily on description and explanation, a thorough and rigorous comparative study is performed to test all UQ methods reviewed in the paper on a common set of representative examples.

Findings

Over 20 uncertainty quantification methods in the fields of structural reliability analysis and stochastic finite element methods are reviewed and rigorously tested on carefully designed numerical examples. They include FORM/SORM, importance sampling, subset simulation, the response surface method, surrogate methods, polynomial chaos expansion, the perturbation method, the stochastic collocation method, etc. The review and comparison tests comment and conclude not only on the accuracy and efficiency of each method but also on its applicability to different types of uncertainty propagation problems.

Originality/value

The research fields of structural reliability analysis and stochastic finite element methods have largely been developed separately, although both tackle uncertainty quantification in engineering problems. For the first time, all major uncertainty quantification methods in both fields are reviewed and rigorously tested on a common set of examples. Critical opinions and concluding remarks are drawn from the rigorous comparative study, providing objective evidence-based information for further research and practical applications.

Details

Engineering Computations, vol. 35 no. 6
Type: Research Article
ISSN: 0264-4401
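The SRA methods listed in the findings all target the same quantity, a failure probability P[g(X) < 0] for a limit-state function g. As a minimal, hypothetical illustration (not one of the paper's test examples), a crude Monte Carlo estimate for a linear limit state with independent standard normal inputs, where the answer is known in closed form for comparison:

```python
import math
import random

# Hypothetical linear limit state: failure when the combined load x1 + x2
# exceeds the capacity 3, with x1, x2 independent standard normals.
random.seed(0)

def g(x1, x2):
    return 3.0 - (x1 + x2)

n = 200_000
failures = sum(1 for _ in range(n)
               if g(random.gauss(0, 1), random.gauss(0, 1)) < 0)
pf_mc = failures / n

# Since x1 + x2 ~ N(0, 2), the exact answer is Phi(-beta), beta = 3/sqrt(2).
beta = 3.0 / math.sqrt(2.0)
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))
print(pf_mc, pf_exact)
```

Methods such as FORM replace the sampling step by locating the most probable failure point; for this linear Gaussian toy case FORM would recover beta = 3/sqrt(2) exactly, which is why such cases make useful sanity checks.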


Article
Publication date: 5 October 2012

I. Doltsinis

Abstract

Purpose

The purpose of this paper is to expose computational methods as applied to engineering systems and evolutionary processes with randomness in external actions and inherent parameters.

Design/methodology/approach

Two approaches are distinguished, both of which rely on solvers from deterministic algorithms. Probabilistic analysis refers to the approximation of the response by a Taylor series expansion about the mean input. Alternatively, stochastic simulation implies random sampling of the input and statistical evaluation of the output.

Findings

Beyond the characterization of random response, methods of reliability assessment are discussed. Concepts of design improvement are presented. Optimization for robustness diminishes the sensitivity of the system to fluctuating parameters.

Practical implications

Deterministic algorithms available for the primary problem are utilized for stochastic analysis by statistical Monte Carlo sampling. The computational effort for the repeated solution of the primary problem depends on the variability of the system and is usually high. Alternatively, the analytic Taylor series expansion requires extension of the primary solver to the computation of derivatives of the response with respect to the random input. The method is restricted to the computation of output mean values and variances/covariances, with the effort determined by the number of random inputs. The results of the two methods are comparable within the domain of applicability.

Originality/value

The present account addresses the main issues related to the presence of randomness in engineering systems and processes. They comprise the analysis of stochastic systems, reliability, design improvement, optimization and robustness against randomness of the data. The analytical Taylor approach is contrasted with statistical Monte Carlo sampling throughout. In both cases, algorithms known from the primary, deterministic problem are the starting point of the stochastic treatment. The reader benefits from a comprehensive presentation of the matter in a concise manner.
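The contrast between the two approaches can be sketched on a one-input toy response; the function and input distribution below are hypothetical, and the derivative needed by the Taylor route is available analytically here:

```python
import random

# Toy response y = f(x) with a single random input x ~ N(mu, sigma^2).
random.seed(1)

def f(x):
    return x * x

mu, sigma = 2.0, 0.1

# First-order Taylor expansion about the mean input:
#   E[y] ~ f(mu),   Var[y] ~ (f'(mu) * sigma)^2
df = 2.0 * mu                      # f'(mu), known analytically for this f
mean_taylor = f(mu)
var_taylor = (df * sigma) ** 2

# Monte Carlo: repeated deterministic evaluation on sampled input.
samples = [f(random.gauss(mu, sigma)) for _ in range(100_000)]
mean_mc = sum(samples) / len(samples)
var_mc = sum((s - mean_mc) ** 2 for s in samples) / (len(samples) - 1)
print(mean_taylor, mean_mc, var_taylor, var_mc)
```

The small discrepancy between the two means (the exact mean is mu^2 + sigma^2 = 4.01 versus the Taylor value 4.0) is the truncation error of the first-order expansion, which is what limits the domain of applicability discussed above.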

Article
Publication date: 16 April 2018

Yan Zhao, L.T. Si and H. Ouyang

Abstract

Purpose

A novel frequency domain approach, which combines the pseudo excitation method modified by the authors and multi-domain Fourier transform (PEM-FT), is proposed for analyzing nonstationary random vibration in this paper.

Design/methodology/approach

For a structure subjected to a nonstationary random excitation, the closed-form solution of evolutionary power spectral density of the response is derived in frequency domain.

Findings

The deterministic process and the random process in an evolutionary spectrum are separated effectively by this method during the analysis of nonstationary random vibration of a linear damped system; only the modulation function of the system needs to be estimated, which brings about a large saving in computational time.

Originality/value

The method is general and highly flexible as it can deal with various damping types and nonstationary random excitations with different modulation functions.

Details

Engineering Computations, vol. 35 no. 2
Type: Research Article
ISSN: 0264-4401
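The core pseudo-excitation idea underlying the method can be sketched for the simpler stationary case. The single-degree-of-freedom system and white-noise input PSD below are hypothetical, and the sketch does not reproduce the paper's multi-domain Fourier transform extension to nonstationary excitation:

```python
import cmath

# Hypothetical SDOF oscillator: m*y'' + c*y' + k*y = x(t).
m, c, k = 1.0, 0.4, 100.0   # mass, damping, stiffness (assumed values)
S_xx = 2.0                  # constant (white-noise) input PSD, an assumption

def response_psd(omega):
    # Pseudo excitation: x~ = sqrt(S_xx) * exp(i*omega*t). The pseudo
    # response is y~ = H(omega) * x~, and the response PSD is
    # S_yy = |y~|^2 = |H(omega)|^2 * S_xx.
    H = 1.0 / complex(k - m * omega**2, c * omega)
    y_tilde = H * cmath.sqrt(S_xx)
    return abs(y_tilde) ** 2

# The response PSD peaks near the natural frequency sqrt(k/m) = 10 rad/s.
print(response_psd(10.0), response_psd(1.0))
```

The appeal of the method is that a random-vibration problem is reduced to a series of deterministic harmonic analyses, one per frequency; the PEM-FT of the paper carries this over to evolutionary spectra.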


Book part
Publication date: 3 June 2008

Nathaniel T. Wilcox

Abstract

Choice under risk has a large stochastic (unpredictable) component. This chapter examines five stochastic models for binary discrete choice under risk and how they combine with “structural” theories of choice under risk. Stochastic models are substantive theoretical hypotheses that are frequently testable in and of themselves, and they also serve as identifying restrictions for hypothesis tests, estimation and prediction. Econometric comparisons suggest that for the purpose of prediction (as opposed to explanation), the choice of stochastic model may be far more consequential than the choice of structure, such as expected utility or rank-dependent utility.

Details

Risk Aversion in Experiments
Type: Book
ISBN: 978-1-84950-547-5
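One member of the family of stochastic models the chapter examines can be made concrete. The sketch below is a hypothetical "strong utility" (logit) rule layered on an expected-utility structure; the utility function, lotteries and noise parameter are illustrative assumptions, not the chapter's data:

```python
import math

def expected_utility(lottery, u):
    # lottery: list of (probability, outcome) pairs
    return sum(p * u(x) for p, x in lottery)

def choice_prob(lottery_a, lottery_b, u, lam):
    # Probability of choosing A: logistic in the EU difference. The noise
    # parameter lam scales the stochastic component (larger lam pushes
    # choices toward 50/50 regardless of structure).
    d = expected_utility(lottery_a, u) - expected_utility(lottery_b, u)
    return 1.0 / (1.0 + math.exp(-d / lam))

u = lambda x: x ** 0.5               # assumed concave utility
safe = [(1.0, 25.0)]                 # $25 for sure
risky = [(0.5, 64.0), (0.5, 0.0)]    # 50/50 chance of $64 or $0
print(choice_prob(safe, risky, u, lam=1.0))
```

Under the assumed utility, EU(safe) = 5 exceeds EU(risky) = 4, so the choice probability is above one half but well short of one; that gap between deterministic preference and probabilistic choice is exactly what the five stochastic models parameterize differently.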

Article
Publication date: 13 November 2018

Xuchun Ren and Sharif Rahman

Abstract

Purpose

This paper aims to present a new method, named the augmented polynomial dimensional decomposition (PDD) method, for robust design optimization (RDO) and reliability-based design optimization (RBDO) subject to mixed design variables comprising both distributional and structural design variables.

Design/methodology/approach

The method involves a new augmented PDD of a high-dimensional stochastic response for statistical moments and reliability analyses; an integration of the augmented PDD, score functions, and finite-difference approximation for calculating the sensitivities of the first two moments and the failure probability with respect to distributional and structural design variables; and standard gradient-based optimization algorithms.

Findings

New closed-form formulae are presented for the design sensitivities of moments that are simultaneously determined along with the moments. A finite-difference approximation integrated with the embedded Monte Carlo simulation of the augmented PDD is put forward for design sensitivities of the failure probability.

Originality/value

In conjunction with the multi-point, single-step design process, the new method provides an efficient means to solve a general stochastic design problem entailing mixed design variables with a large design space. Numerical results, including a three-hole bracket design, indicate that the proposed methods provide accurate and computationally efficient sensitivity estimates and optimal solutions for RDO and RBDO problems.
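The score-function ingredient of the sensitivity analysis described above can be illustrated in miniature. The sketch below is not the authors' augmented PDD; it only shows, for a hypothetical scalar response h and a normal design variable, how the derivative of a moment with respect to a distributional design parameter can be estimated from the same Monte Carlo samples as the moment itself:

```python
import random

# For X ~ N(mu, sigma^2) the score-function identity gives
#   d/d(mu) E[h(X)] = E[ h(X) * (X - mu) / sigma^2 ],
# so no extra model evaluations are needed for the sensitivity.
random.seed(2)

def h(x):
    return x * x    # toy response; E[h] = mu^2 + sigma^2, d/d(mu) = 2*mu

mu, sigma, n = 1.5, 0.5, 400_000
xs = [random.gauss(mu, sigma) for _ in range(n)]
moment = sum(h(x) for x in xs) / n
sens = sum(h(x) * (x - mu) / sigma**2 for x in xs) / n

# Analytic values for this toy case: moment = 2.5, sensitivity = 3.0.
print(moment, sens)
```

Reusing the samples this way is what makes gradient-based RDO/RBDO affordable: the moments and their distributional design sensitivities come out of a single stochastic analysis.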

Article
Publication date: 1 April 1973

W.J. PADGETT and CHRIS P. TSOKOS

Abstract

A stochastic discrete Fredholm system of the form x_n(ω) = h_n(ω) + Σ_{j=1}^{∞} C_{n,j}(ω) f_j(x_j(ω)), n = 1, 2, …, is studied, where ω ∈ Ω, the supporting set of a complete probability measure space (Ω, A, P). A random solution of the discrete system is defined to be a discrete-parameter second-order stochastic process x_n(ω) that satisfies the equation almost surely. The stochastic geometric stability of x_n(ω) is defined, and conditions are given under which the random solution has this property. A finite system which approximates the infinite system is given, and it is shown that under certain conditions the unique random solution of the approximating system converges to the unique random solution of the infinite system.

Details

Kybernetes, vol. 2 no. 4
Type: Research Article
ISSN: 0368-492X
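The truncation result in the last sentence can be illustrated with a small deterministic sketch: fix one realization ω, keep only the first N equations, and solve the resulting finite system by fixed-point iteration. The coefficients and nonlinearity below are hypothetical, chosen so the map is a contraction, mirroring the kind of conditions under which the paper guarantees a unique random solution:

```python
import math

# Truncated system: x_n = h_n + sum_{j<N} C[n][j] * f(x_j), n = 0..N-1.
N = 20
h = [1.0 / (n + 1) for n in range(N)]
C = [[0.5 ** (n + j + 2) for j in range(N)] for n in range(N)]

def f(x):
    return math.sin(x)   # Lipschitz nonlinearity with constant 1

# Fixed-point iteration; the row sums of C are below 1, so the map
# contracts in the sup norm and the iterates converge geometrically.
x = [0.0] * N
for _ in range(100):
    x = [h[n] + sum(C[n][j] * f(x[j]) for j in range(N)) for n in range(N)]

residual = max(abs(x[n] - (h[n] + sum(C[n][j] * f(x[j]) for j in range(N))))
               for n in range(N))
print(x[0], residual)
```

Since the coefficients decay geometrically in n and j, enlarging N perturbs the solution of the first N equations only slightly; that is the deterministic shadow of the paper's convergence result for the random solution.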

Article
Publication date: 17 July 2009

Emmanuel Blanchard, Adrian Sandu and Corina Sandu

Abstract

Purpose

The purpose of this paper is to propose a new computational approach for parameter estimation in the Bayesian framework. A posteriori probability density functions are obtained using the polynomial chaos theory for propagating uncertainties through system dynamics. The new method has the advantage of being able to deal with large parametric uncertainties, non‐Gaussian probability densities and nonlinear dynamics.

Design/methodology/approach

The maximum likelihood estimates are obtained by minimizing a cost function derived from the Bayesian theorem. Direct stochastic collocation is used as a less computationally expensive alternative to the traditional Galerkin approach to propagate the uncertainties through the system in the polynomial chaos framework.

Findings

The new approach is explained and applied to very simple mechanical systems in order to illustrate how the Bayesian cost function can be affected by the noise level in the measurements, by undersampling, by non-identifiability of the system, by non-observability and by excitation signals that are not rich enough. When the system is non-identifiable and a priori knowledge of the parameter uncertainties is available, regularization techniques can still yield the most likely values among the possible combinations of uncertain parameters that result in the same time responses as the ones observed.

Originality/value

The polynomial chaos method has been shown to be considerably more efficient than Monte Carlo in the simulation of systems with a small number of uncertain parameters. This is believed to be the first time the polynomial chaos theory has been applied to Bayesian estimation.

Details

Engineering Computations, vol. 26 no. 5
Type: Research Article
ISSN: 0264-4401
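The two ingredients of the approach, a collocation surrogate of the forward model and a Bayesian (here reduced to least-squares) cost minimized over the uncertain parameter, can be sketched on a toy problem. The exponential-decay forward model, node locations and noise level below are all hypothetical, and plain Lagrange interpolation stands in for the paper's polynomial chaos machinery:

```python
import math
import random

random.seed(3)

def forward(k, t):
    # Hypothetical forward model: y(t) = exp(-k * t).
    return math.exp(-k * t)

times = [0.5, 1.0, 1.5, 2.0]
k_true, noise = 0.8, 0.01
data = [forward(k_true, t) + random.gauss(0, noise) for t in times]

# Collocation: evaluate the forward model only at a few nodes in k, then
# interpolate instead of re-solving the model at every candidate k.
nodes = [0.2, 0.5, 0.8, 1.1, 1.4]
tables = {t: [forward(k, t) for k in nodes] for t in times}

def surrogate(k, t):
    # Lagrange interpolation through the collocation values.
    total = 0.0
    for i, ki in enumerate(nodes):
        w = 1.0
        for j, kj in enumerate(nodes):
            if i != j:
                w *= (k - kj) / (ki - kj)
        total += w * tables[t][i]
    return total

def cost(k):
    # Least-squares misfit, i.e. a Gaussian-noise negative log-likelihood
    # up to constants.
    return sum((surrogate(k, t) - d) ** 2 for t, d in zip(times, data))

k_hat = min((0.2 + 0.001 * i for i in range(1201)), key=cost)
print(k_hat)
```

The point of the surrogate is exactly the efficiency claim in the abstract: the expensive model is solved only at the collocation nodes, and every cost evaluation during the estimation is a cheap polynomial evaluation.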



Details

Review of Marketing Research
Type: Book
ISBN: 978-0-85724-726-1

Article
Publication date: 25 January 2011

Tirthankar Ghosh and Dilip Roy

Abstract

Purpose

The main purpose of this paper is to consider the role of discretization of random variables in analyzing statistical tolerancing, and to propose a new discretizing method along with a study on its usefulness.

Design/methodology/approach

The approach for discretization of a continuous distribution is based on the concept of moment equalization with the original random variable, conditionally given a set of points of realization. For the purpose of demonstration, the normal distribution has been discretized into seven points. Application of the discretization method in approximating the distribution/survival function of a complex system has also been studied. Numerical analysis of two important engineering items has been undertaken, and the closeness between the values of the distribution/survival functions obtained by simulation and by the proposed method has been examined to indicate the advantage of the proposed approach.

Findings

A comparative study with the earlier reported discretizing methods indicates that the proposed method, which is easy to implement, provides better results for most of the cases studied in this work.

Research limitations/implications

Using the proposed approach one can approximate the probability distribution of a complex system with random component values, which cannot be analytically expressed.

Practical implications

This paper is able to provide a new direction in reliability management research, because it can be used for the product design of many important engineering items, such as solid shafts, hollow cylinders, torsion bars and I-beams.

Originality/value

This research gives a new linear method of discretization. It gives better results than the existing discretization methods of Experimental design, Moment equalization and Discrete concentration for reliability (survival probability) determination of a solid shaft and a power resistor.

Details

International Journal of Quality & Reliability Management, vol. 28 no. 2
Type: Research Article
ISSN: 0265-671X
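The moment-equalization idea on which the paper builds can be seen in its simplest form: a three-point discretization of the standard normal (the classical Gauss-Hermite rule) whose points and weights reproduce the first five moments exactly. The paper's own seven-point construction is different and is not reproduced here:

```python
import math

# Three-point Gauss-Hermite discretization of the standard normal:
# points at -sqrt(3), 0, sqrt(3) with weights 1/6, 2/3, 1/6 match the
# normal moments up to order five.
points = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def moment(r):
    return sum(w * x ** r for w, x in zip(weights, points))

# Standard-normal moments for r = 0..4 are 1, 0, 1, 0, 3.
print([round(moment(r), 10) for r in range(5)])

# Typical use: approximate E[g(X)] for a smooth function g of the input.
g = lambda x: math.exp(-0.1 * x * x)
approx = sum(w * g(x) for w, x in zip(weights, points))
print(approx)
```

Replacing the continuous input by a handful of weighted points is what lets the distribution/survival function of a complex system of such inputs be computed by direct enumeration rather than simulation.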


Article
Publication date: 1 March 2004

E. Chiodo, F. Gagliardi and M. Pagano

Abstract

The aim of this paper is to show the connections among uncertainty, information and human knowledge in order to develop methodologies able to support actions for the measurement and control of complex processes, and to propose a new model to represent the human hazard rate. Interest in human reliability analysis (HRA) arose in nuclear applications, where it was observed that 50-70 per cent of reported failures in operating systems were human-induced. Since the mid-1980s, methods and tools of HRA have been transferred, first to military weapons systems and later to aviation design and operations. At present, HRA, which considers human performance and human reliability knowledge, must be an integral element of complex system design and development. In this paper, the system reliability function is derived as a function of technological, information and human components, showing how the human element affects the reliability of the whole system. Based on the consideration that human errors are often the most unexpected, and hence the least protected against, and are subject to many random factors, an analytical model is proposed, based on a conditional Weibull hazard rate with a random scale parameter, for whose characterization the log-normal, gamma and inverse Gaussian distributions are considered. The aim of this model is to take into account the random variability of human performance by introducing a random hazard rate.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 23 no. 1
Type: Research Article
ISSN: 0332-1649
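The modelling idea, a conditional Weibull hazard whose scale parameter is itself random, can be checked numerically for the gamma case, where the marginal survival function has a closed form. The parameter values below are hypothetical:

```python
import math
import random

# Conditional Weibull survival: S(t | lam) = exp(-lam * t**beta), with the
# scale lam ~ Gamma(shape a, rate b). Mixing over lam gives the closed form
# S(t) = (b / (b + t**beta))**a, which Monte Carlo should reproduce.
random.seed(4)
beta, a, b = 1.5, 2.0, 3.0   # Weibull shape; gamma shape and rate (assumed)

def survival_mc(t, n=200_000):
    # random.gammavariate takes (shape, scale); scale = 1/rate.
    return sum(math.exp(-random.gammavariate(a, 1.0 / b) * t ** beta)
               for _ in range(n)) / n

def survival_exact(t):
    return (b / (b + t ** beta)) ** a

t = 1.2
print(survival_mc(t), survival_exact(t))
```

The same mixing construction applies with the log-normal and inverse Gaussian scale distributions the paper considers, but those mixtures generally lack a closed form and are evaluated numerically, as the Monte Carlo branch above does.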

