Search results

1 – 10 of over 3000
Article
Publication date: 15 October 2018

Hesheng Tang, Dawei Li, Lixin Deng and Songtao Xue

Abstract

Purpose

This paper aims to develop a comprehensive uncertainty quantification method using evidence theory for Park–Ang damage index-based performance design in which epistemic uncertainties are considered. Various sources of uncertainty emanating from the database of the cyclic test results of RC members provided by the Pacific Earthquake Engineering Research Center are taken into account.

Design/methodology/approach

In this paper, an uncertainty quantification methodology based on evidence theory is presented for the whole process of performance-based seismic design (PBSD), while considering uncertainty in the Park–Ang damage model. To alleviate the high computational cost of propagating uncertainty, a differential evolution interval optimization strategy is used to efficiently find the propagated belief structure throughout the whole design process.
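
As a rough illustration of the propagation step described above, the sketch below (Python, with a hypothetical response function, focal elements and threshold rather than the paper's Park–Ang model) shows how a belief structure can be pushed through a response by solving interval optimization problems with differential evolution.

```python
# Minimal sketch (not the authors' code): propagating an evidence-theory
# belief structure through a response function by interval optimization.
# The response function, focal elements and threshold are hypothetical.
import numpy as np
from scipy.optimize import differential_evolution

def response(x):
    # Placeholder damage-index-style response; the real model would be a
    # Park-Ang damage index evaluated from structural analysis.
    beta, mu = x
    return beta * mu / (1.0 + beta)

# Focal elements: interval boxes with basic probability assignments (BPA).
focal_elements = [
    {"bounds": [(0.05, 0.15), (2.0, 4.0)], "bpa": 0.6},
    {"bounds": [(0.10, 0.25), (3.0, 6.0)], "bpa": 0.4},
]

threshold = 0.6  # acceptance criterion on the response (assumed)
belief = plausibility = 0.0
for fe in focal_elements:
    # Differential evolution bounds the response over each focal element.
    lo = differential_evolution(response, fe["bounds"], seed=0).fun
    hi = -differential_evolution(lambda x: -response(x), fe["bounds"], seed=0).fun
    if hi <= threshold:          # image entirely inside the safe set -> belief
        belief += fe["bpa"]
    if lo <= threshold:          # image intersects the safe set -> plausibility
        plausibility += fe["bpa"]

print(f"Bel = {belief:.2f}, Pl = {plausibility:.2f}")
```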

Findings

The results of this paper demonstrate that the uncertainty rooted in the Park–Ang damage model has a significant influence on PBSD and its evaluation. It is worth noting that the epistemic uncertainty present in the Park–Ang damage model needs to be considered to avoid underestimating the true uncertainty.

Originality/value

This paper presents an evidence theory-based uncertainty quantification framework for the whole process of PBSD.

Details

Engineering Computations, vol. 35 no. 7
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 11 November 2013

Giovanni Petrone, John Axerio-Cilies, Domenico Quagliarella and Gianluca Iaccarino

Abstract

Purpose

A probabilistic non-dominated sorting genetic algorithm (P-NSGA) for multi-objective optimization under uncertainty is presented. The purpose of this algorithm is to create a tight coupling between the optimization and uncertainty procedures, use all of the possible probabilistic information to drive the optimizer, and leverage high-performance parallel computing.

Design/methodology/approach

This algorithm is a generalization of a classical genetic algorithm for multi-objective optimization (NSGA-II) by Deb et al. The proposed algorithm relies on the use of all possible information in the probabilistic domain summarized by the cumulative distribution functions (CDFs) of the objective functions. Several analytic test functions are used to benchmark this algorithm, but only the results of the Fonseca-Fleming test function are shown. An industrial application is presented to show that P-NSGA can be used for multi-objective shape optimization of a Formula 1 tire brake duct, taking into account the geometrical uncertainties associated with the rotating rubber tire and uncertain inflow conditions.
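
The following sketch is not the authors' P-NSGA implementation; it only illustrates, under assumed objective functions and noise, one probabilistic comparison that such an optimizer could use when ranking designs from Monte Carlo samples of their objectives.

```python
# Minimal sketch (not the authors' implementation): comparing two candidate
# designs under uncertainty from Monte Carlo samples of their objectives,
# as one plausible building block of a probabilistic non-dominated sorting.
import numpy as np

rng = np.random.default_rng(0)

def sample_objectives(design, n=2000):
    # Hypothetical noisy bi-objective evaluation of a design vector.
    x = np.asarray(design)
    noise = rng.normal(scale=0.05, size=(n, 2))
    f1 = np.sum(x**2) + noise[:, 0]
    f2 = np.sum((x - 1.0) ** 2) + noise[:, 1]
    return np.column_stack([f1, f2])

def prob_dominates(samples_a, samples_b):
    # P(A dominates B): all objectives of A no worse, at least one strictly
    # better, estimated by pairing independent samples of the two designs.
    le = np.all(samples_a <= samples_b, axis=1)
    lt = np.any(samples_a < samples_b, axis=1)
    return np.mean(le & lt)

A = sample_objectives([0.2, 0.3])
B = sample_objectives([0.8, 0.9])
print("P(A dominates B) ~", prob_dominates(A, B))
print("P(B dominates A) ~", prob_dominates(B, A))
```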

Findings

The algorithm is shown to be deterministically consistent, i.e. it reduces to the original NSGA-II when the objective functions are deterministic. When the quality of the CDF is increased (either by using more points or a higher-fidelity resolution), the convergence behavior improves. Since all the information regarding uncertainty quantification is preserved, all the different types of Pareto fronts that exist in the probabilistic framework (e.g. the mean-value Pareto front, the mean-value-penalty Pareto front, etc.) are shown to be generated a posteriori. An adaptive sampling approach and parallel computing (in both the uncertainty and optimization algorithms) are shown to yield a several-fold speed-up in selecting optimal solutions under uncertainty.

Originality/value

No existing algorithms use the full probabilistic distribution to guide the optimizer. The method presented herein bases its sorting on real function evaluations, not merely on summary measures (e.g. the mean of the probability distribution) that may not exist.

Details

Engineering Computations, vol. 30 no. 8
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 22 March 2022

Zhanpeng Shen, Chaoping Zang, Xueqian Chen, Shaoquan Hu and Xin-en Liu

Abstract

Purpose

For fast calculation of complex structures in engineering, correlations among input variables are often ignored in uncertainty propagation, even though the effect of ignoring these correlations on the output uncertainty is unclear. This paper aims to quantify the input uncertainty and estimate the correlations among the inputs according to the collected observed data instead of questionable assumptions. Moreover, the small size of the experimental data should also be considered, as it is a common engineering problem.

Design/methodology/approach

In this paper, a novel method combining the p-box with the copula function for both uncertainty quantification and correlation estimation is explored. The copula function is utilized to estimate correlations among uncertain inputs based upon the observed data. The p-box method is employed to quantify the input uncertainty as well as the epistemic uncertainty associated with the limited amount of observed data. A nested Monte Carlo sampling technique is adopted to ensure that the propagation is always feasible. In addition, a Kriging model is built to reduce the computational cost of uncertainty propagation.
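
A minimal sketch of the general idea, assuming normal marginals, a Gaussian copula and a toy limit state (none of which are taken from the paper): an outer bootstrap loop supplies the epistemic variation that forms the p-box, while an inner Monte Carlo loop propagates the correlated inputs.

```python
# Minimal sketch (not the authors' code): a Gaussian copula fitted to observed
# data for input correlation, wrapped in a simple bootstrap p-box and pushed
# through a hypothetical limit-state function with nested Monte Carlo sampling.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical small observed data set for two correlated inputs.
data = rng.multivariate_normal([10.0, 5.0], [[1.0, 0.6], [0.6, 0.8]], size=30)

def limit_state(x):
    # g < 0 means failure (assumed toy limit state).
    return x[:, 0] - 1.5 * x[:, 1]

failure_probs = []
for _ in range(50):                       # outer (epistemic) loop: bootstrap
    boot = data[rng.integers(0, len(data), len(data))]
    mu, sd = boot.mean(axis=0), boot.std(axis=0, ddof=1)
    # Rank-based estimate of the Gaussian copula correlation.
    r, _ = stats.spearmanr(boot[:, 0], boot[:, 1])
    rho = 2.0 * np.sin(np.pi * r / 6.0)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=5000)   # inner loop
    u = stats.norm.cdf(z)                 # copula samples on [0, 1]^2
    x = stats.norm.ppf(u) * sd + mu       # normal marginals assumed
    failure_probs.append(np.mean(limit_state(x) < 0.0))

print("p-box bounds on the failure probability:",
      (min(failure_probs), max(failure_probs)))
```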

Findings

To illustrate the application of this method, an engineering example of structural reliability assessment is performed. The results indicate that whether or not the correlations among input variables are quantified may significantly affect the output uncertainty. Furthermore, this approach offers an additional advantage for risk management owing to the separation of aleatory and epistemic uncertainties.

Originality/value

The proposed method takes advantage of the p-box and the copula function to deal with correlations and a limited amount of observed data, which are two important issues of uncertainty quantification in engineering. Thus, it is practical and able to predict the response uncertainty or system state accurately.

Details

Engineering Computations, vol. 39 no. 6
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 4 September 2018

Muhannad Aldosary, Jinsheng Wang and Chenfeng Li

Abstract

Purpose

This paper aims to provide a comprehensive review of uncertainty quantification methods supported by evidence-based comparison studies. Uncertainties are widely encountered in engineering practice, arising from such diverse sources as heterogeneity of materials, variability in measurement, lack of data and ambiguity in knowledge. Academia and industry have long been searching for uncertainty quantification (UQ) methods to quantitatively account for the effects of various input uncertainties on the system response. Despite the rich literature of relevant research, UQ is not an easy subject for novice researchers/practitioners, as many different methods and techniques coexist with inconsistent input/output requirements and analysis schemes.

Design/methodology/approach

This confusing state of affairs significantly hampers the research progress and practical application of UQ methods in engineering. In the context of engineering analysis, UQ research efforts are mostly concentrated in two largely separate fields: structural reliability analysis (SRA) and the stochastic finite element method (SFEM). This paper provides a state-of-the-art review of SRA and SFEM, covering both technology and application aspects. Moreover, unlike standard survey papers that focus primarily on description and explanation, a thorough and rigorous comparative study is performed to test all UQ methods reviewed in the paper on a common set of representative examples.

Findings

Over 20 uncertainty quantification methods in the fields of structural reliability analysis and stochastic finite element methods are reviewed and rigorously tested on carefully designed numerical examples. They include FORM/SORM, importance sampling, subset simulation, the response surface method, surrogate methods, polynomial chaos expansion, the perturbation method, the stochastic collocation method, etc. The review and comparison tests comment and conclude not only on the accuracy and efficiency of each method but also on their applicability to different types of uncertainty propagation problems.
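
As a small, self-contained taste of the kind of comparison performed in the paper (using an assumed linear limit state and parameter values, not the paper's examples), crude Monte Carlo and a first-order reliability estimate can be checked against each other as follows.

```python
# Minimal sketch (not from the paper): comparing crude Monte Carlo with a
# first-order (FORM-style) estimate for a toy linear limit state g = R - S,
# with R and S independent normal random variables (assumed values below).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
mu_R, sd_R = 200.0, 20.0     # resistance (hypothetical)
mu_S, sd_S = 150.0, 15.0     # load effect (hypothetical)

# Crude Monte Carlo: sample the inputs and count g < 0.
n = 10**6
g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
pf_mc = np.mean(g < 0.0)

# FORM: for a linear limit state with normal inputs the reliability index
# beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2) is exact, and pf = Phi(-beta).
beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)
pf_form = stats.norm.cdf(-beta)

print(f"Monte Carlo pf ~ {pf_mc:.2e}, FORM pf = {pf_form:.2e}")
```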

Originality/value

The research fields of structural reliability analysis and stochastic finite element methods have largely been developed separately, although both tackle uncertainty quantification in engineering problems. For the first time, all major uncertainty quantification methods in both fields are reviewed and rigorously tested on a common set of examples. Critical opinions and concluding remarks are drawn from the rigorous comparative study, providing objective evidence-based information for further research and practical applications.

Details

Engineering Computations, vol. 35 no. 6
Type: Research Article
ISSN: 0264-4401

Book part
Publication date: 12 September 2022

Edward E. Rigdon and Marko Sarstedt

Abstract

The assumption that a set of observed variables is a function of an underlying common factor plus some error has dominated measurement in marketing, and in the social sciences in general, for decades. This view of measurement comes with assumptions which, however, are rarely discussed in research. In this article, we question the legitimacy of several of these assumptions, arguing that (1) the common factor model is rarely correct in the population, (2) the common factor does not correspond to the quantity the researcher intends to measure, and (3) the measurement error does not fully capture the uncertainty associated with measurement. Our discussion calls for a fundamental rethinking of measurement in the social sciences. Adapting an uncertainty-centric approach to measurement, which has become the norm in the physical sciences, offers a means to address the limitations of current measurement practice in marketing.
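
For reference, the common factor model being questioned here is usually written as follows (standard notation, not necessarily the chapter's):

```latex
% Common factor model: each observed indicator x_j loads on a single common
% factor xi plus an indicator-specific error term epsilon_j.
\[
  x_j = \lambda_j \xi + \varepsilon_j ,
  \qquad \operatorname{Cov}(\xi, \varepsilon_j) = 0 ,
  \qquad j = 1, \dots, p .
\]
```

Here x_j denotes an observed indicator, xi the common factor, lambda_j its loading and epsilon_j the indicator-specific error.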

Details

Measurement in Marketing
Type: Book
ISBN: 978-1-80043-631-2

Article
Publication date: 2 August 2021

Elham Mahmoudi, Marcel Stepien and Markus König

Abstract

Purpose

A principal prerequisite for designing and constructing an underground structure is to estimate the subsurface's properties and obtain a realistic picture of the stratigraphy. Obtaining direct measurements of these properties at every location of the built environment is not affordable. Therefore, any evaluation is afflicted with uncertainty, and we need to combine all available measurements, observations and previous knowledge to achieve an informed estimate and quantify the involved uncertainties. This study aims to enhance geotechnical surveys by mapping spatial estimates of the subsoil onto customised data structures and integrating the resulting ground models into digital design environments.

Design/methodology/approach

A ground model consisting of voxels is developed via Revit-Dynamo to represent spatial uncertainties, employing the kriging interpolation method. The local arrangement of new surveys is then evaluated and optimised.
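
The following sketch is independent of the authors' Revit-Dynamo implementation; with hypothetical borehole data and an assumed exponential covariance model, it shows ordinary kriging estimates and kriging variances on a coarse 2D grid, the per-voxel uncertainty measure such a ground model would carry.

```python
# Minimal sketch (not the authors' code): ordinary kriging of a soil property
# on a small 2D voxel grid, returning a kriging variance per cell as the
# spatial uncertainty measure. Borehole coordinates, values and variogram
# parameters below are hypothetical.
import numpy as np

def cov(h, sill=1.0, rng_=30.0):
    # Exponential covariance model C(h) = sill * exp(-h / range).
    return sill * np.exp(-h / rng_)

# Borehole locations (x, y) and measured property values.
pts = np.array([[5.0, 5.0], [40.0, 10.0], [20.0, 35.0], [45.0, 45.0]])
vals = np.array([12.0, 15.5, 13.2, 17.0])

n = len(pts)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
# Ordinary kriging system: data covariances plus the unbiasedness constraint.
A = np.ones((n + 1, n + 1))
A[:n, :n] = cov(d)
A[n, n] = 0.0

xs, ys = np.meshgrid(np.arange(0, 50, 10.0), np.arange(0, 50, 10.0))
grid = np.column_stack([xs.ravel(), ys.ravel()])

est = np.empty(len(grid))
var = np.empty(len(grid))
for i, g in enumerate(grid):
    d0 = np.linalg.norm(pts - g, axis=1)
    b = np.append(cov(d0), 1.0)
    w = np.linalg.solve(A, b)
    est[i] = w[:n] @ vals
    var[i] = cov(0.0) - w @ b        # kriging variance (with Lagrange term)

print("maximum kriging variance on the grid:", round(float(var.max()), 3))
```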

Findings

The visualisation model's computational performance is improved by using an octree structure; the results show that the octree adapts to the structure being modelled more efficiently. The proposed concept can identify the risky locations of geological models for further geological investigation and reveal an optimised experimental design. The modification criteria are defined from both global and local considerations.

Originality/value

It provides a transparent and repeatable approach to constructing a spatial ground model for subsequent experimental or numerical analysis. In the first attempt, the ground model was discretised by a grid of voxels. In general, the required computing time primarily depends on the size of the voxels. This issue is addressed by implementing octree voxels to reduce the computational effort, which applies especially to cases where a higher resolution is required. The investigations using a synthetic soil model showed that the developed methodology fulfilled the kriging method's requirements. The effects of variogram parameters, such as the range and the covariance function, were investigated in parameter studies. Moreover, a synthetic model is used to demonstrate the optimal experimental design concept. Through the implementation, alternative locations for new boreholes are generated and their uncertainties are quantified. The impact of a new borehole on the uncertainty measures is quantified using both local and global approaches. For further research on identifying the risky spots of geological models, this approach should be developed with additional criteria regarding the search neighbourhood and with consideration of barriers and trends in real cases, by employing different interpolation methodologies.

Details

Smart and Sustainable Built Environment, vol. 10 no. 3
Type: Research Article
ISSN: 2046-6099

Article
Publication date: 23 August 2013

Angeles Saavedra, Elena Arce, Jose Luis Miguez and Enrique Granada

Abstract

Purpose

The purpose of this paper is to propose an interpretation of the grey relational grade taking into account its variation range on the basis of the error propagation theory.

Design/methodology/approach

The paper uses error propagation theory to calculate the uncertainty of the grey relational grade, exploring how errors are propagated through the sequential operations of the grey relational analysis.
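
The sketch below is not the paper's analytical error propagation; it swaps in Monte Carlo propagation (with hypothetical alternatives, criteria and measurement errors) merely to show how measurement uncertainty carries through the grey relational coefficients to the grade.

```python
# Minimal sketch (not the paper's formulation): grey relational grades for a
# few alternatives, with measurement error propagated by Monte Carlo to give
# an uncertainty band on each grade. Data and error levels are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
zeta = 0.5                                   # distinguishing coefficient

def grey_relational_grades(X):
    # Larger-the-better normalisation of each criterion (column).
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    ref = Z.max(axis=0)                      # ideal reference sequence
    delta = np.abs(ref - Z)
    coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coef.mean(axis=1)                 # grade = mean coefficient

# Rows: alternatives; columns: measured criteria.
X = np.array([[8.2, 140.0, 0.31],
              [7.9, 155.0, 0.28],
              [8.5, 150.0, 0.35]])
sigma = np.array([0.1, 5.0, 0.02])           # measurement standard errors

grades = np.array([grey_relational_grades(X + rng.normal(0, sigma, X.shape))
                   for _ in range(2000)])
for i, (m, s) in enumerate(zip(grades.mean(axis=0), grades.std(axis=0))):
    print(f"alternative {i}: grade {m:.3f} +/- {s:.3f}")
```

A large overlap between the uncertainty bands of two neighbouring alternatives is exactly the situation the paper warns about: the nominal ranking alone may not be conclusive.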

Findings

Not considering the error associated with the measurement of the experimental data, which is transferred to the grey relational grade, may affect the interpretation of the grey relational rank. Data uncertainty quantification provides information about how well a measurement fits the value of the measured quantity and determines its validity. Therefore, this might lead one to consider that some sequences are less attractive than other, lower-ranked ones.

Practical implications

The combination of the grey and error propagation theories is a tool to choose the most accurate solution in grey relational grade ranks.

Originality/value

This study provides a new approach to interpret grey relational grade classifications.

Details

Grey Systems: Theory and Application, vol. 3 no. 2
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 18 October 2018

Lei Wang, Haijun Xia, Yaowen Yang, Yiru Cai and Zhiping Qiu

Abstract

Purpose

The purpose of this paper is to propose a novel non-probabilistic reliability-based topology optimization (NRBTO) method for continuum structural design under interval uncertainties of load and material parameters based on the technology of 3D printing or additive manufacturing.

Design/methodology/approach

First, the uncertainty quantification analysis is accomplished by interval Taylor extension to determine the bounds of the concerned displacement responses. Based on the interval interference theory, a novel reliability index, named the optimization feature distance, is then introduced to construct non-probabilistic reliability constraints. To circumvent convergence difficulties in solving large-scale variable optimization problems, the gradient-based method of moving asymptotes is also used, in which the sensitivity expressions of the present reliability measurements with respect to the design variables are deduced by a combination of the adjoint vector scheme and interval mathematics.
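
To make the first step above concrete, the sketch below applies a first-order interval Taylor extension to a toy one-spring displacement response under assumed load and stiffness intervals; the interval-interference index used here is a generic one, not the paper's optimization feature distance.

```python
# Minimal sketch (not the authors' NRBTO code): a first-order interval Taylor
# extension bounding a displacement-like response under interval load and
# stiffness, plus a simple interval-interference reliability check. The
# response function, intervals and allowable value are hypothetical.
import numpy as np

def u(p):
    # Toy displacement response u = F / k for a single spring.
    F, k = p
    return F / k

mid = np.array([1000.0, 5.0e4])      # midpoints of the [F, k] intervals
rad = np.array([100.0, 5.0e3])       # interval radii

# Central-difference gradient at the interval midpoint.
grad = np.array([(u(mid + h) - u(mid - h)) / (2.0 * h[i])
                 for i, h in enumerate(np.diag(1e-6 * np.abs(mid)))])

u_mid = u(mid)
u_rad = np.sum(np.abs(grad) * rad)   # first-order radius of the response
u_lo, u_hi = u_mid - u_rad, u_mid + u_rad

u_allow = 0.025                      # allowable displacement (assumed)
# Interval-interference style index: > 1 means the whole response interval
# satisfies the constraint, < -1 means it is entirely violated.
eta = (u_allow - u_mid) / u_rad
print(f"u in [{u_lo:.4f}, {u_hi:.4f}] m, non-probabilistic index eta = {eta:.2f}")
```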

Findings

The main finding of this paper lies in the new non-probabilistic reliability index, i.e. the optimization feature distance, which is defined and further incorporated into continuum topology optimization problems. Besides, a novel concurrent design strategy under consideration of macro-micro integration is presented by using the developed NRBTO methodology.

Originality/value

Uncertainty propagation analysis based on the interval Taylor extension method is conducted. A novel reliability index, the optimization feature distance, is defined. Expressions for the adjoint vectors between the interval bounds of the displacement responses and the relative density are deduced. A new NRBTO method for continuum structures is developed and solved by the MMA algorithm.

Book part
Publication date: 18 October 2019

Edward George, Purushottam Laud, Brent Logan, Robert McCulloch and Rodney Sparapani

Abstract

Bayesian additive regression trees (BART) is a fully Bayesian approach to modeling with ensembles of trees. BART can uncover complex regression functions with high-dimensional regressors in a fairly automatic way and provide Bayesian quantification of the uncertainty through the posterior. However, BART assumes independent and identically distributed (i.i.d.) normal errors. This strong parametric assumption can lead to misleading inference and uncertainty quantification. In this chapter we use the classic Dirichlet process mixture (DPM) mechanism to nonparametrically model the error distribution. A key strength of BART is that default prior settings work reasonably well in a variety of problems. The challenge in extending BART is to choose the parameters of the DPM so that the strengths of the standard BART approach are not lost when the errors are close to normal, while the DPM retains the ability to adapt to non-normal errors.
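
One common way to write such a model (a sketch in standard notation, not necessarily the chapter's exact specification) is:

```latex
% BART prior on the regression function, with a Dirichlet process mixture
% replacing the i.i.d. normal error assumption.
\[
\begin{aligned}
  y_i &= f(x_i) + \epsilon_i, \qquad f \sim \mathrm{BART}, \\
  \epsilon_i \mid \mu_i, \sigma_i &\sim \mathcal{N}(\mu_i, \sigma_i^2), \qquad
  (\mu_i, \sigma_i) \sim G, \qquad G \sim \mathrm{DP}(\alpha, G_0),
\end{aligned}
\]
```

so the error distribution is an infinite mixture of normals that can collapse toward a single normal component when the data support it.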

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B
Type: Book
ISBN: 978-1-83867-419-9

Open Access
Article
Publication date: 21 December 2021

Vahid Badeli, Sascha Ranftl, Gian Marco Melito, Alice Reinbacher-Köstinger, Wolfgang Von Der Linden, Katrin Ellermann and Oszkar Biro

Abstract

Purpose

This paper aims to introduce a non-invasive and convenient method to detect a life-threatening disease called aortic dissection. A Bayesian inference based on an enhanced multi-sensor impedance cardiography (ICG) method has been applied to classify signals from healthy and sick patients.

Design/methodology/approach

A 3D numerical model consisting of simplified organ geometries is used to simulate the electrical impedance changes in the ICG-relevant domain of the human torso. Bayesian probability theory is used to detect an aortic dissection, providing the probabilities for both cases, a dissected and a healthy aorta. Thus, the reliability and the uncertainty of the disease identification are found by this method and may indicate the need for further diagnostic clarification.
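
A minimal sketch of the classification step (with made-up feature distributions standing in for the simulated multi-sensor ICG signals, and an assumed 50/50 prior):

```python
# Minimal sketch (not the authors' model): Bayes' rule applied to a feature
# extracted from an ICG signal, giving posterior probabilities for a dissected
# versus a healthy aorta. The feature distributions are hypothetical stand-ins
# for what the 3D torso simulations would provide.
from scipy import stats

# Hypothetical simulated feature (e.g. an impedance-change amplitude) under
# each hypothesis, summarised by normal likelihoods.
like_healthy = stats.norm(loc=0.150, scale=0.010)
like_dissected = stats.norm(loc=0.130, scale=0.012)
prior = {"healthy": 0.5, "dissected": 0.5}        # uninformative prior

def posterior(feature):
    # Posterior probability of each hypothesis given the measured feature.
    num_h = like_healthy.pdf(feature) * prior["healthy"]
    num_d = like_dissected.pdf(feature) * prior["dissected"]
    evidence = num_h + num_d
    return {"healthy": num_h / evidence, "dissected": num_d / evidence}

measured = 0.136                                  # assumed measured feature
print({k: round(v, 3) for k, v in posterior(measured).items()})
```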

Findings

The Bayesian classification shows that the enhanced multi-sensor ICG is more reliable in detecting aortic dissection than conventional ICG. Bayesian probability theory allows a rigorous quantification of all uncertainties to draw reliable conclusions for the medical treatment of aortic dissection.

Originality/value

This paper presents a non-invasive and reliable method based on a numerical simulation that could be beneficial for the medical management of aortic dissection patients. With this method, clinicians would be able to monitor the patient’s status and make better decisions in the treatment procedure of each patient.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 41 no. 3
Type: Research Article
ISSN: 0332-1649
