Search results

1 – 10 of over 2000
Article
Publication date: 15 October 2018

Hesheng Tang, Dawei Li, Lixin Deng and Songtao Xue

Abstract

Purpose

This paper aims to develop a comprehensive uncertainty quantification method using evidence theory for Park–Ang damage index-based performance design in which epistemic uncertainties are considered. Various sources of uncertainty emanating from the database of the cyclic test results of RC members provided by the Pacific Earthquake Engineering Research Center are taken into account.

Design/methodology/approach

In this paper, an uncertainty quantification methodology based on evidence theory is presented for the whole process of performance-based seismic design (PBSD), while considering uncertainty in the Park–Ang damage model. To alleviate the high computational cost of propagating uncertainty, a differential evolution interval optimization strategy is used to efficiently find the propagated belief structure throughout the whole design process.
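
As a rough illustration of this propagation scheme, the sketch below (not the authors' code; the response function, focal elements and threshold are invented placeholders) bounds a response over each focal element of an input belief structure with SciPy's differential evolution and accumulates belief and plausibility for the event that the response stays below a threshold.

```python
import numpy as np
from scipy.optimize import differential_evolution

def response(x):
    # Placeholder response; a real study would evaluate, e.g., a Park-Ang
    # damage index obtained from a structural analysis.
    return x[0] ** 2 + 0.5 * x[1]

# Focal elements of the input belief structure: (interval box, BPA).
# The boxes and basic probability assignments below are illustrative only.
focal_elements = [
    (np.array([[0.8, 1.2], [0.1, 0.3]]), 0.6),
    (np.array([[0.9, 1.5], [0.2, 0.5]]), 0.4),
]

def propagate(threshold):
    belief = plausibility = 0.0
    for box, bpa in focal_elements:
        bounds = [tuple(row) for row in box]
        y_min = differential_evolution(response, bounds).fun
        y_max = -differential_evolution(lambda x: -response(x), bounds).fun
        if y_max <= threshold:      # whole image lies inside the event
            belief += bpa
        if y_min <= threshold:      # image intersects the event
            plausibility += bpa
    return belief, plausibility

# Belief/plausibility that the response stays below the (illustrative) threshold.
print(propagate(threshold=1.6))
```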

Findings

The investigation results of this paper demonstrate that the uncertainty rooted in the Park–Ang damage model has a significant influence on PBSD and on performance evaluation. It is worth noting that the epistemic uncertainty present in the Park–Ang damage model needs to be considered to avoid underestimating the true uncertainty.

Originality/value

This paper presents an evidence theory-based uncertainty quantification framework for the whole process of PBSD.

Details

Engineering Computations, vol. 35 no. 7
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 11 November 2013

Giovanni Petrone, John Axerio-Cilies, Domenico Quagliarella and Gianluca Iaccarino

Abstract

Purpose

A probabilistic non-dominated sorting genetic algorithm (P-NSGA) for multi-objective optimization under uncertainty is presented. The purpose of this algorithm is to create a tight coupling between the optimization and uncertainty procedures, use all of the possible probabilistic information to drive the optimizer, and leverage high-performance parallel computing.

Design/methodology/approach

This algorithm is a generalization of a classical genetic algorithm for multi-objective optimization (NSGA-II) by Deb et al. The proposed algorithm relies on the use of all possible information in the probabilistic domain summarized by the cumulative distribution functions (CDFs) of the objective functions. Several analytic test functions are used to benchmark this algorithm, but only the results of the Fonseca-Fleming test function are shown. An industrial application is presented to show that P-NSGA can be used for multi-objective shape optimization of a Formula 1 tire brake duct, taking into account the geometrical uncertainties associated with the rotating rubber tire and uncertain inflow conditions.
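
A hedged sketch of the CDF-based reasoning described above is given below (the dominance rule and sample data are assumptions for illustration, not the published P-NSGA implementation): for a minimized objective, design A stochastically dominates design B when the empirical CDF of A lies on or above that of B everywhere.

```python
import numpy as np

def empirical_cdf(samples, grid):
    samples = np.sort(samples)
    return np.searchsorted(samples, grid, side="right") / samples.size

def stochastically_dominates(samples_a, samples_b, n_grid=200):
    # First-order stochastic dominance for minimization: CDF of A is never
    # below that of B and is strictly above it somewhere.
    grid = np.linspace(min(samples_a.min(), samples_b.min()),
                       max(samples_a.max(), samples_b.max()), n_grid)
    cdf_a = empirical_cdf(samples_a, grid)
    cdf_b = empirical_cdf(samples_b, grid)
    return bool(np.all(cdf_a >= cdf_b) and np.any(cdf_a > cdf_b))

# Illustrative objective samples under uncertainty for two candidate designs.
rng = np.random.default_rng(0)
design_a = rng.normal(1.0, 0.1, 1000)   # lower mean objective value
design_b = rng.normal(1.2, 0.1, 1000)
print(stochastically_dominates(design_a, design_b))   # expected: True
```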

Findings

This algorithm is shown to have deterministic consistency (i.e. it reduces to the original NSGA-II) when the objective functions are deterministic. When the quality of the CDF is increased (either by using more points or higher-fidelity resolution), the convergence behavior improves. Since all the information regarding uncertainty quantification is preserved, all the different types of Pareto fronts that exist in the probabilistic framework (e.g. mean value Pareto, mean value penalty Pareto, etc.) are shown to be generated a posteriori. An adaptive sampling approach and parallel computing (in both the uncertainty and optimization algorithms) are shown to yield a severalfold speed-up in selecting optimal solutions under uncertainty.

Originality/value

No existing algorithm uses the full probability distribution to guide the optimizer. The method presented herein bases its sorting on real function evaluations, not merely on summary measures (i.e. the mean of the probability distribution) that potentially do not exist.

Details

Engineering Computations, vol. 30 no. 8
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 4 September 2018

Muhannad Aldosary, Jinsheng Wang and Chenfeng Li

Abstract

Purpose

This paper aims to provide a comprehensive review of uncertainty quantification methods supported by evidence-based comparison studies. Uncertainties are widely encountered in engineering practice, arising from such diverse sources as heterogeneity of materials, variability in measurement, lack of data and ambiguity in knowledge. Academia and industry have long been researching uncertainty quantification (UQ) methods to quantitatively account for the effects of various input uncertainties on the system response. Despite the rich literature of relevant research, UQ is not an easy subject for novice researchers and practitioners, as many different methods and techniques coexist with inconsistent input/output requirements and analysis schemes.

Design/methodology/approach

This confusing state of affairs significantly hampers the research progress and practical application of UQ methods in engineering. In the context of engineering analysis, UQ research efforts are concentrated in two largely separate fields: structural reliability analysis (SRA) and the stochastic finite element method (SFEM). This paper provides a state-of-the-art review of SRA and SFEM, covering both technology and application aspects. Moreover, unlike standard survey papers that focus primarily on description and explanation, a thorough and rigorous comparative study is performed to test all UQ methods reviewed in the paper on a common set of representative examples.

Findings

Over 20 uncertainty quantification methods in the fields of structural reliability analysis and stochastic finite element methods are reviewed and rigorously tested on carefully designed numerical examples. They include FORM/SORM, importance sampling, subset simulation, the response surface method, surrogate methods, polynomial chaos expansion, the perturbation method, the stochastic collocation method, etc. The review and comparison tests comment and conclude not only on the accuracy and efficiency of each method but also on their applicability to different types of uncertainty propagation problems.
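
To make the comparison concrete, the following minimal sketch (illustrative only; the limit state and parameters are not taken from the paper) contrasts two of the reviewed SRA methods on a linear limit state g = R − S with independent normal variables, where the FORM result is exact and crude Monte Carlo should reproduce it.

```python
import numpy as np
from scipy.stats import norm

mu_r, sigma_r = 5.0, 0.5     # resistance R (illustrative)
mu_s, sigma_s = 3.0, 0.8     # load S (illustrative)

# FORM: reliability index beta and Pf = Phi(-beta), exact for this linear Gaussian case.
beta = (mu_r - mu_s) / np.hypot(sigma_r, sigma_s)
pf_form = norm.cdf(-beta)

# Crude Monte Carlo estimate of Pf = P(g < 0) with g = R - S.
rng = np.random.default_rng(1)
n = 1_000_000
g = rng.normal(mu_r, sigma_r, n) - rng.normal(mu_s, sigma_s, n)
pf_mc = np.mean(g < 0.0)

print(f"FORM: {pf_form:.5f}, Monte Carlo: {pf_mc:.5f}")
```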

Originality/value

The research fields of structural reliability analysis and stochastic finite element methods have largely been developed separately, although both tackle uncertainty quantification in engineering problems. For the first time, all major uncertainty quantification methods in both fields are reviewed and rigorously tested on a common set of examples. Critical opinions and concluding remarks are drawn from the rigorous comparative study, providing objective evidence-based information for further research and practical applications.

Details

Engineering Computations, vol. 35 no. 6
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 23 August 2013

Angeles Saavedra, Elena Arce, Jose Luis Miguez and Enrique Granada

Abstract

Purpose

The purpose of this paper is to propose an interpretation of the grey relational grade taking into account its variation range on the basis of the error propagation theory.

Design/methodology/approach

The paper uses error propagation theory to calculate the uncertainty of the grey relational grade, exploring how errors are propagated through the sequential operations of the grey relational analysis.
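
A minimal sketch of that idea is given below (the grey relational formulas are the standard textbook ones and the data are invented, not the paper's case study): the grey relational grade is computed for a normalized sequence, and a first-order error propagation of the per-point measurement errors yields an uncertainty band for the grade.

```python
import numpy as np

ZETA = 0.5   # distinguishing coefficient of grey relational analysis

def grey_relational_grade(reference, comparison):
    # Grade = mean grey relational coefficient of the comparison sequence.
    delta = np.abs(reference - comparison)
    coeff = (delta.min() + ZETA * delta.max()) / (delta + ZETA * delta.max())
    return coeff.mean()

def grade_uncertainty(reference, comparison, sigma, h=1e-6):
    # First-order propagation: var(grade) ~ sum_k (d grade / d x_k)^2 * sigma_k^2.
    var = 0.0
    for k in range(comparison.size):
        x_plus, x_minus = comparison.copy(), comparison.copy()
        x_plus[k] += h
        x_minus[k] -= h
        dgdx = (grey_relational_grade(reference, x_plus) -
                grey_relational_grade(reference, x_minus)) / (2 * h)
        var += (dgdx * sigma[k]) ** 2
    return np.sqrt(var)

reference = np.array([1.0, 1.0, 1.0, 1.0])      # normalized ideal sequence
comparison = np.array([0.9, 0.7, 0.95, 0.8])    # normalized measured sequence
sigma = np.full(4, 0.02)                        # measurement standard errors

grade = grey_relational_grade(reference, comparison)
print(f"grade = {grade:.3f} +/- {grade_uncertainty(reference, comparison, sigma):.3f}")
```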

Findings

Neglecting the error associated with the measurement of the experimental data, which is transferred to the grey relational grade, may affect the interpretation of the grey relational rank. Data uncertainty quantification provides information about how well a measurement fits the value of the measured quantity and determines its validity. As a consequence, some sequences may turn out to be less attractive than other, lower-ranked ones.

Practical implications

Combining grey theory with error propagation theory provides a tool for choosing the most accurate solution among grey relational grade ranks.

Originality/value

This study provides a new approach to interpret grey relational grade classifications.

Details

Grey Systems: Theory and Application, vol. 3 no. 2
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 18 October 2018

Lei Wang, Haijun Xia, Yaowen Yang, Yiru Cai and Zhiping Qiu

Abstract

Purpose

The purpose of this paper is to propose a novel non-probabilistic reliability-based topology optimization (NRBTO) method for continuum structural design under interval uncertainties of load and material parameters based on the technology of 3D printing or additive manufacturing.

Design/methodology/approach

First, the uncertainty quantification analysis is accomplished by the interval Taylor extension to determine the bounds of the concerned displacement responses. Based on interval interference theory, a novel reliability index, named the optimization feature distance, is then introduced to construct non-probabilistic reliability constraints. To circumvent convergence difficulties in solving large-scale variable optimization problems, the gradient-based method of moving asymptotes is also used, in which the sensitivity expressions of the present reliability measurements with respect to the design variables are deduced by a combination of the adjoint vector scheme and interval mathematics.
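
The first step can be pictured with the following sketch (a generic first-order interval Taylor bound on a toy bar model, assumed for illustration rather than taken from the paper): displacement bounds under interval load and material parameters are estimated from the response at the interval midpoints plus gradient terms multiplied by the interval radii.

```python
import numpy as np

def displacement(params):
    # Toy axial bar: u = F L / (E A) with unit length and unit area.
    load, youngs_modulus = params
    return load / youngs_modulus

def interval_taylor_bounds(lower, upper, rel_step=1e-6):
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    center, radius = 0.5 * (lower + upper), 0.5 * (upper - lower)
    u_center = displacement(center)
    spread = 0.0
    for i in range(center.size):
        h = rel_step * abs(center[i])            # relative finite-difference step
        x_plus, x_minus = center.copy(), center.copy()
        x_plus[i] += h
        x_minus[i] -= h
        grad_i = (displacement(x_plus) - displacement(x_minus)) / (2 * h)
        spread += abs(grad_i) * radius[i]
    return u_center - spread, u_center + spread

# Interval load [90, 110] kN and Young's modulus [195, 205] GPa (illustrative).
print(interval_taylor_bounds([90e3, 195e9], [110e3, 205e9]))
```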

Findings

The main findings of this paper lie in the new non-probabilistic reliability index, i.e. the optimization feature distance, which is defined and further incorporated into continuum topology optimization problems. Besides, a novel concurrent design strategy considering macro–micro integration is presented using the developed NRBTO methodology.

Originality/value

Uncertainty propagation analysis based on the interval Taylor extension method is conducted. The novel reliability index of the optimization feature distance is defined. Expressions of the adjoint vectors between the interval bounds of the displacement responses and the relative density are deduced. A new NRBTO method for continuum structures is developed and solved by the MMA algorithm.

Book part
Publication date: 18 October 2019

Edward George, Purushottam Laud, Brent Logan, Robert McCulloch and Rodney Sparapani

Abstract

Bayesian additive regression trees (BART) is a fully Bayesian approach to modeling with ensembles of trees. BART can uncover complex regression functions with high-dimensional regressors in a fairly automatic way and provide Bayesian quantification of the uncertainty through the posterior. However, BART assumes independent and identically distributed (i.i.d.) normal errors. This strong parametric assumption can lead to misleading inference and uncertainty quantification. In this chapter we use the classic Dirichlet process mixture (DPM) mechanism to nonparametrically model the error distribution. A key strength of BART is that default prior settings work reasonably well in a variety of problems. The challenge in extending BART is to choose the parameters of the DPM so that the strengths of the standard BART approach are not lost when the errors are close to normal, while the DPM retains the ability to adapt to non-normal errors.

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B
Type: Book
ISBN: 978-1-83867-419-9

Article
Publication date: 30 August 2019

Gonçalo das Neves Carneiro and Carlos Conceição António

Abstract

Purpose

In the reliability assessment of composite laminate structures with multiple components, the uncertainty space defined around design solutions easily becomes over-dimensioned, and not all of the random variables are relevant. The purpose of this study is to implement the importance analysis theory of Sobol’ to reduce the dimension of the uncertainty space, improving the efficiency toward global convergence of evolutionary-based reliability assessment.

Design/methodology/approach

Sobol’ indices are formulated analytically for implicit structural response functions, following the theory of propagation of moments and without violating the fundamental principles presented by Sobol’. An evolutionary algorithm capable of global convergence in reliability assessment is instrumented with the Sobol’ indices. A threshold parameter is introduced to identify the important variables. A set of optimal designs of a multi-laminate composite structure is evaluated.
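
Under the stated assumptions (independent variables, first-order propagation of moments), a minimal sketch of such analytical Sobol' indices might look as follows; the response function, statistics and threshold are placeholders, and a finite-difference gradient stands in for the paper's adjoint formulation.

```python
import numpy as np

def structural_response(x):
    # Placeholder implicit response; in the paper it comes from a FE analysis.
    return 3.0 * x[0] + 0.2 * x[1] ** 2 + 0.05 * x[2]

def sobol_first_order(mean, std, h=1e-6):
    # Linearized (first-order) Sobol' indices for independent inputs:
    # S_i = (dg/dx_i)^2 * var_i / sum_j (dg/dx_j)^2 * var_j.
    mean, std = np.asarray(mean, float), np.asarray(std, float)
    grad = np.array([
        (structural_response(mean + h * e) - structural_response(mean - h * e)) / (2 * h)
        for e in np.eye(mean.size)
    ])
    contrib = (grad * std) ** 2
    return contrib / contrib.sum()

indices = sobol_first_order(mean=[1.0, 2.0, 5.0], std=[0.1, 0.3, 0.2])
important = indices > 0.05          # threshold parameter used for screening
print(indices, important)
```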

Findings

Importance analysis shows that uncertainty is concentrated in the laminate where the critical stress state is found, although it may also be appreciable at other points of the structure. An accurate and controlled reduction of the uncertainty space significantly improves the convergence rate while maintaining the quality of the reliability assessment.

Practical implications

The theoretical developments assume independent random variables.

Originality/value

Applying Sobol' indices as an analytical dimension reduction technique is a novelty. The proposed formulation requires the adjoint system of equilibrium equations to be solved only once. Although a local estimate of a global measure, this analytical formulation still holds because, in structural design, uncertainty is concentrated around the mean values.

Details

Engineering Computations, vol. 37 no. 1
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 19 October 2018

Ignazio Maria Viola, Vincent Chapin, Nicola Speranza and Marco Evangelos Biancolini

Abstract

Purpose

There is an increasing interest in airfoils that modify their shape to adapt to the flow conditions. As an example of application, the authors search for the optimal 4-digit NACA airfoil that maximizes the lift-over-drag ratio for a constant lift coefficient of 0.6, from Re = 10^4 to 3 × 10^6.

Design/methodology/approach

The authors consider a γ–Re_θt transition model and a k–ω SST turbulence model with a covariance matrix adaptation evolutionary optimization algorithm. The shape is adapted by radial basis function mesh morphing using four parameters (angle of attack, thickness, camber and maximum camber position). The objective of the optimization is to find the airfoil that enables a maximum lift-over-drag ratio for a target lift coefficient of 0.6.
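
A hedged sketch of the outer optimization loop is given below; it assumes the third-party `cma` package for covariance matrix adaptation, and `lift_over_drag` is a hypothetical smooth surrogate standing in for the CFD evaluation of the morphed airfoil at the target lift coefficient.

```python
import numpy as np
import cma   # third-party CMA-ES package, assumed available

# Bounds on the four shape parameters (illustrative values):
# angle of attack [deg], thickness t/c, camber m/c, maximum camber position p/c.
LOWER = np.array([0.0, 0.06, 0.00, 0.2])
UPPER = np.array([8.0, 0.18, 0.06, 0.6])

def lift_over_drag(params):
    alpha, thickness, camber, camber_pos = params
    # Hypothetical smooth surrogate of L/D; a real run would call the flow
    # solver on the RBF-morphed mesh and trim to C_L = 0.6.
    return (60.0 - 2.0 * (alpha - 4.0) ** 2 - 50.0 * (thickness - 0.10) ** 2
            - 80.0 * (camber - 0.03) ** 2 - 10.0 * (camber_pos - 0.4) ** 2)

def objective(z):
    # CMA-ES works on a normalized design vector z in [0, 1]^4 and minimizes.
    params = LOWER + np.asarray(z) * (UPPER - LOWER)
    return -lift_over_drag(params)

es = cma.CMAEvolutionStrategy(4 * [0.5], 0.2, {'bounds': [0.0, 1.0]})
es.optimize(objective)
best = LOWER + es.result.xbest * (UPPER - LOWER)
print(best, -es.result.fbest)
```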

Findings

The computation of the optimal airfoils confirmed the expected increase of the lift-over-drag ratio with Re. However, although the observation of efficient biological fliers suggests that the thickness increases monotonically with Re, the authors find that it is constant except for a 1.5 per cent step increase at Re = 3 × 10^5.

Practical implications

The authors propose and validate an efficient high-fidelity method for the shape optimization of airfoils that can be adopted to define robust and reliable industrial design procedures.

Originality/value

The authors show that the difference in the numerical error between two-dimensional and three-dimensional simulations is negligible, and that the numerical uncertainty of the two-dimensional simulations is sufficiently small to confidently predict the aerodynamic forces across the investigated range of Re.

Article
Publication date: 3 July 2007

K. Durga Rao, H.S. Kushwaha, A.K. Verma and A. Srividya

Abstract

Purpose

The purpose of this paper is to demonstrate the potential of simulation approach for performance evaluation in a complex environment with a case of application from Indian Nuclear Power Plant.

Design/methodology/approach

In this work, a stochastic simulation approach is applied to the availability evaluation of the AC power supply system of an Indian Nuclear Power Plant (INPP). The presently followed test and maintenance policies on diesel generators and circuit breakers are considered so as to model the practical scenario exactly. The system success logic incorporates the functional dependencies and the dynamics in the sequence of operations and maintenance policies. In each iteration (random experiment), uptime and downtime are calculated from the simulated random behaviour of the system according to the system success logic. After a sufficient number of iterations, unavailability and other required reliability measures are estimated from the results.
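
The simulation loop can be sketched as follows (illustrative failure and repair rates, not plant data): each iteration simulates one year of random grid failures backed by a stand-by diesel generator, accumulates downtime whenever both supplies are lost, and unavailability is averaged over iterations.

```python
import numpy as np

rng = np.random.default_rng(42)
MISSION = 8760.0            # hours simulated per iteration (one year)
GRID_FAILURE_RATE = 1e-3    # offsite power failures per hour (illustrative)
GRID_REPAIR_HOURS = 8.0     # mean repair time of offsite power
DG_START_FAIL_PROB = 2e-2   # probability the diesel generator fails to start
DG_RUN_FAIL_RATE = 1e-3     # DG failures per running hour

def simulate_one_year():
    downtime, t = 0.0, 0.0
    while t < MISSION:
        t += rng.exponential(1.0 / GRID_FAILURE_RATE)         # next grid failure
        if t >= MISSION:
            break
        outage = min(rng.exponential(GRID_REPAIR_HOURS), MISSION - t)
        if rng.random() < DG_START_FAIL_PROB:
            downtime += outage                                 # DG failed to start
        else:
            run_failure = rng.exponential(1.0 / DG_RUN_FAIL_RATE)
            if run_failure < outage:
                downtime += outage - run_failure               # DG failed while running
        t += outage
    return downtime

n_iterations = 5000
unavailability = np.mean([simulate_one_year() for _ in range(n_iterations)]) / MISSION
print(f"estimated unavailability: {unavailability:.2e}")
```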

Findings

The subsystems of the AC power supply system of the NPP have multiple states owing to surveillance tests and scheduled maintenance activities. In addition, the operation of a DG involves starting and then running until its mission time, which is a sequential (or conditional) event. Furthermore, the redundancies and dependencies add to the complexity.

Originality/value

This paper emphasizes the importance of realistic reliability modelling of complex operational scenarios with the Monte Carlo simulation approach. A simulation procedure for evaluating the availability/reliability of repairable complex engineering systems with periodically tested stand-by components is presented. The same simulation model finds application in importance-measure calculation, technical specification optimization and uncertainty quantification.

Details

International Journal of Quality & Reliability Management, vol. 24 no. 6
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 6 June 2020

Emille Rocha Bernardino de Almeida Prata, José Benício Paes Chaves, Silvane Guimarães Silva Gomes and Frederico José Vieira Passos

Abstract

Purpose

Quantitative metrics should be used as a risk management option whenever possible. This work proposes a framework for risk quantification and the resulting risk-based design of control charts to monitor quality control points.

Design/methodology/approach

Two quality control models were considered for the risk quantification analysis. Estimated operating characteristic curves, expressing the defect rate (on a ppm basis) as a function of the sample size, process disturbance magnitude and process capability, were devised to evaluate the maximum rate of defective product of the processes. The applicability of the proposed framework to monitoring critical control points in Hazard Analysis and Critical Control Point (HACCP) systems was further evaluated by Monte Carlo simulation.
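
A minimal sketch of the operating-characteristic reasoning (textbook x-bar chart formulas, not the authors' exact models) computes, for a mean shift of delta process standard deviations, the detection probability of a 3-sigma x-bar chart with subgroup size n and the resulting nonconforming rate in ppm for a given capability index.

```python
import numpy as np
from scipy.stats import norm

def detection_probability(delta, n):
    # P(subgroup mean falls beyond the 3-sigma x-bar limits after a mean
    # shift of delta process standard deviations).
    shift = delta * np.sqrt(n)
    return 1.0 - norm.cdf(3.0 - shift) + norm.cdf(-3.0 - shift)

def defect_rate_ppm(delta, cpk):
    # Nonconforming fraction after the shift, for a one-sided specification
    # limit located 3 * Cpk standard deviations from the unshifted mean.
    return norm.cdf(-3.0 * cpk + delta) * 1e6

for n in (3, 5, 10):
    print(n,
          round(detection_probability(delta=1.0, n=n), 3),
          round(defect_rate_ppm(delta=1.0, cpk=1.33), 1))
```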

Findings

Results demonstrate that the proposed monitoring systems can be tuned to achieve an admissible failure risk, conveniently expressed as the number of non-conforming items produced per million products, and that these risks can be properly communicated. This risk-based approach can be used to validate critical control point monitoring procedures in HACCP plans. The expected rates of non-conforming items sent out to clients, estimated through stochastic simulation procedures, agree well with theoretical predictions.

Practical implications

The procedures outlined in this study may be used to establish the statistical validity of monitoring systems that use control charts. The intrinsic risks of these control systems can be assessed and communicated properly in order to demonstrate the effectiveness of quality control procedures to third-party auditors.

Originality/value

This study provides advancements toward practical directives for the implementation of statistical process control in the food industry. The proposed framework allows the assessment and communication of intrinsic failure risks of quality monitoring systems. It may contribute to the establishment of risk-based thinking in the constitution of quality management systems.

Details

International Journal of Quality & Reliability Management, vol. 38 no. 2
Type: Research Article
ISSN: 0265-671X
