Search results
1 – 10 of over 14,000
Jinsheng Wang, Muhannad Aldosary, Song Cen and Chenfeng Li
Abstract
Purpose
Normal transformation is often required in structural reliability analysis to convert non-normal random variables into independent standard normal variables. The existing normal transformation techniques, for example, the Rosenblatt transformation and the Nataf transformation, usually require the joint probability density function (PDF) and/or marginal PDFs of the non-normal random variables. In practical problems, however, the joint PDF and marginal PDFs are often unknown due to a lack of data, while the statistical information is much more easily expressed in terms of statistical moments and correlation coefficients. This study aims to address this issue by presenting an alternative normal transformation method that does not require PDFs of the input random variables.
Design/methodology/approach
The new approach, namely, the Hermite polynomial normal transformation, expresses the normal transformation function in terms of Hermite polynomials and works with both uncorrelated and correlated random variables. Its application in structural reliability analysis using different methods is thoroughly investigated via a number of carefully designed comparison studies.
Findings
Comprehensive comparisons are conducted to examine the performance of the proposed Hermite polynomial normal transformation scheme. The results show that the presented approach has accuracy comparable to previous methods and can be obtained in closed form. Moreover, the new scheme requires only the first four statistical moments and/or the correlation coefficients between random variables, which greatly widens the applicability of normal transformations in practical problems.
Originality/value
This study interprets the classical polynomial normal transformation method in terms of Hermite polynomials, namely, the Hermite polynomial normal transformation, to convert uncorrelated/correlated random variables into standard normal random variables. The new scheme requires only the first four statistical moments to operate, making it particularly suitable for problems constrained by limited data. Besides, the extension to correlated cases is easily achieved with the introduction of the Hermite polynomials. Compared to existing methods, the new scheme is cheap to compute and delivers comparable accuracy.
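As a rough illustration of the idea behind such Hermite-polynomial expansions (the coefficients below are illustrative, not taken from the paper), a minimal sketch shows how the orthogonality of the probabilists' Hermite polynomials under the standard normal weight gives the moments of the transformed variable in closed form:

```python
import numpy as np

# Probabilists' Hermite polynomials He_0..He_3
def he(n, u):
    return [np.ones_like(u), u, u**2 - 1, u**3 - 3*u][n]

# Illustrative coefficients of a third-order expansion
# X = a0*He0(U) + a1*He1(U) + a2*He2(U) + a3*He3(U), with U ~ N(0, 1)
a = [0.0, 1.0, 0.2, 0.05]

# Orthogonality under the standard normal weight, E[He_m He_n] = n! (m = n),
# gives the mean and variance of X in closed form
mean_exact = a[0]
var_exact = a[1]**2 * 1 + a[2]**2 * 2 + a[3]**2 * 6

# Monte Carlo check of the closed-form moments
rng = np.random.default_rng(0)
u = rng.standard_normal(1_000_000)
x = sum(a[n] * he(n, u) for n in range(4))
```

Matching such closed-form moment expressions to the four target moments of a non-normal variable is what lets this family of transformations operate without any PDF.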
Giovanni Falsone and Rossella Laudani
Abstract
Purpose
This paper aims to present an approach for the probabilistic characterization of the response of linear structural systems subjected to random time-dependent non-Gaussian actions.
Design/methodology/approach
Its fundamental property is working directly on the probability density functions of the actions and responses. This avoids passing through the evaluation of the response statistical moments or cumulants, considerably reducing the computational effort.
Findings
The method is efficient in terms of both computational effort and accuracy, above all when the input and output processes are strongly non-Gaussian.
Originality/value
This approach can be considered as a dynamic generalization of the probability transformation method recently used for static applications.
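The probability transformation idea referenced here can be sketched in its simplest static form (the linear map and the Laplace input below are illustrative choices, not from the paper): for a monotone map Y = g(X), the response density is p_Y(y) = p_X(g⁻¹(y)) |dg⁻¹/dy|, obtained without computing any moments.

```python
import numpy as np

# Probability transformation for a monotone map Y = g(X):
#   p_Y(y) = p_X(g^{-1}(y)) * |dg^{-1}/dy|
# Illustrative linear "structural" map y = k*x with a non-Gaussian input.
k = 2.5                                      # hypothetical linear gain
p_x = lambda x: 0.5 * np.exp(-np.abs(x))     # Laplace input PDF
p_y = lambda y: p_x(y / k) / k               # response PDF, no moments needed

# The transformed density still integrates to one
y, dy = np.linspace(-40, 40, 400_001, retstep=True)
mass = np.sum(p_y(y)) * dy
```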
James P. LeSage and R. Kelley Pace
Abstract
For this discussion, assume there are n sample observations of the dependent variable y at unique locations. In spatial samples, each observation is often uniquely associated with a particular location or region, so that observations and regions are equivalent. Spatial dependence arises when an observation at one location, say y_i, depends on "neighboring" observations y_j, y_j ∈ ϒ_i. We use ϒ_i to denote the set of observations "neighboring" observation i, where some metric is used to define the set of observations that are spatially connected to observation i. For general definitions of the sets ϒ_i, i = 1, …, n, typically at least one observation exhibits simultaneous dependence, so that an observation y_j also depends on y_i. That is, the set ϒ_j contains the observation y_i, creating simultaneous dependence among observations. This situation constitutes a difference between time series analysis and spatial analysis. In time series, temporal dependence relations could be such that a "one-period-behind" relation exists, ruling out simultaneous dependence among observations. The time-series one-observation-behind relation could arise if spatial observations were located along a line and each observation depended strictly on the observation located to its left. However, this is not in general true of spatial samples, requiring construction of estimation and inference methods that accommodate the more plausible case of simultaneous dependence among observations.
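One common formalization of this simultaneous dependence (assumed here purely for illustration) is the simultaneous autoregressive form y = ρWy + ε, where row i of the weight matrix W selects the neighbors in ϒ_i; solving for y makes every observation depend on every neighbor at once:

```python
import numpy as np

# Simultaneous spatial dependence in SAR form:
#   y = rho * W y + eps   =>   y = (I - rho W)^{-1} eps
n, rho = 5, 0.4
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1):                 # neighbors on a line
        if 0 <= j < n:
            W[i, j] = 1.0
W /= W.sum(axis=1, keepdims=True)            # row-standardize

rng = np.random.default_rng(1)
eps = rng.standard_normal(n)
y = np.linalg.solve(np.eye(n) - rho * W, eps)
# y_0 depends on y_1 and vice versa: the dependence is simultaneous,
# unlike the one-period-behind structure of a time series.
```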
Yingsai Cao, Sifeng Liu and Zhigeng Fang
Abstract
Purpose
The purpose of this paper is to propose new importance measures for degrading components based on Shapley value, which can provide answers about how important players are to the whole cooperative game and what payoff each player can reasonably expect.
Design/methodology/approach
The proposed importance measure characterizes how a specific degrading component contributes to the degradation of system reliability by using the Shapley value. Degradation models are also introduced to assess the reliability of degrading components. The reliability of a system consisting of independent degrading components is obtained by using structure functions, while the reliability of a system comprising correlated degrading components is evaluated with a multivariate distribution.
Findings
The ranking of degrading components according to the newly developed importance measure depends on the degradation parameters of components, system structure and parameters characterizing the association of components.
Originality/value
Reliability degradation of engineering systems and equipment is often attributed to the degradation of a particular component, or set of components, characterized by degrading features. This paper therefore proposes new importance measures for degrading components based on the Shapley value to reflect the responsibility of each degrading component for the deterioration of system reliability. The results are also able to give timely feedback on the expected contribution of each degrading component to system reliability degradation.
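The Shapley-value mechanics behind such importance measures can be sketched on a toy system (the three-component layout and the characteristic function below are hypothetical, not the paper's degradation model): each component's importance is its average marginal contribution to the system working, taken over all orderings.

```python
from itertools import permutations

# Toy system: component 0 in series with the parallel pair (1, 2).
N = 3

def works(up):                                # structure function phi(x)
    return up[0] and (up[1] or up[2])

# Characteristic function: v(S) = 1 if the system works with exactly
# the components in S up and the rest down
def v(S):
    return 1.0 if works([i in S for i in range(N)]) else 0.0

def shapley(i):
    perms = list(permutations(range(N)))
    total = 0.0
    for p in perms:
        before = set(p[:p.index(i)])          # coalition formed before i
        total += v(before | {i}) - v(before)  # i's marginal contribution
    return total / len(perms)

vals = [shapley(i) for i in range(N)]
# The series component carries most of the responsibility; the Shapley
# values of all components sum to v(all components up) = 1.
```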
Ruirui Shao, Zhigeng Fang, Liangyan Tao, Su Gao and Weiqing You
Abstract
Purpose
During the service period of communication satellite systems, their performance is often degraded due to the depletion mechanism. In this paper, the grey system theory is applied to the multi-state system effectiveness evaluation and the grey Lz-transformation ADC (availability, dependability and capability) effectiveness evaluation model is constructed to address the characteristics of the communication satellite system such as different constituent subsystems, numerous states and the inaccuracy and insufficiency of data.
Design/methodology/approach
The model is based on the ADC effectiveness evaluation method, combined with the Lz-transform, and uses the definite weighted function of the three-parameter interval grey number as a bridge to incorporate the possibility of system performance being greater than the task demand into the effectiveness solution algorithm. MATLAB is used to solve for each state probability and to combine identical performance levels in the Lz-transform; the system effectiveness is then obtained with Python.
Findings
The results show that the G-Lz-ADC model constructed in this paper can accurately evaluate the effectiveness of static/dynamic and certain/uncertain systems, and also has better applicability in evaluating the effectiveness of multi-state complex systems.
Practical implications
The G-Lz-ADC effectiveness evaluation model constructed in this paper can effectively reduce the complexity of traditional effectiveness evaluation models by combining the same performance levels in the Lz-transform and solving the effectiveness of the system with the help of computer programming, providing a new method for the effectiveness evaluation of the complex MSS. At the same time, the weaknesses of the system can be identified, providing a theoretical basis for improving the system’s effectiveness.
Originality/value
A possibility solution method based on the definite weighted function, comparing two three-parameter interval grey numbers, is constructed, which improves on the traditional calculation of the probability from numerical values and the subjective preferences of decision-makers. Meanwhile, the effectiveness evaluation model integrates the basic theories of the three-parameter interval grey number and its definite weighted function, Grey-Markov models, the grey universal generating function (GUGF), the grey multi-state system (GMSS), etc., which is an innovative method for solving the effectiveness of a multi-state instantaneous communication satellite system.
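The Lz-transform bookkeeping that keeps such multi-state models tractable can be sketched without any grey-number machinery (the subsystems and numbers below are illustrative): each subsystem is a map from performance level to state probability, a series connection takes the minimum performance, and identical resulting levels are merged, which is the step that prevents state-space explosion.

```python
from collections import defaultdict

# Combine two multi-state subsystems in series: performance of the pair is
# the minimum of the two, and equal resulting levels are merged.
def combine_series(a, b):
    out = defaultdict(float)
    for ga, pa in a.items():
        for gb, pb in b.items():
            out[min(ga, gb)] += pa * pb
    return dict(out)

sub1 = {0: 0.1, 50: 0.3, 100: 0.6}   # {performance level: state probability}
sub2 = {0: 0.2, 100: 0.8}
system = combine_series(sub1, sub2)

# Availability against a task demand of 50: P(performance >= 50)
avail = sum(p for g, p in system.items() if g >= 50)
```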
Muhannad Aldosary, Jinsheng Wang and Chenfeng Li
Abstract
Purpose
This paper aims to provide a comprehensive review of uncertainty quantification methods supported by evidence-based comparison studies. Uncertainties are widely encountered in engineering practice, arising from such diverse sources as heterogeneity of materials, variability in measurement, lack of data and ambiguity in knowledge. Academia and industry have long researched uncertainty quantification (UQ) methods to quantitatively account for the effects of various input uncertainties on the system response. Despite the rich literature of relevant research, UQ is not an easy subject for novice researchers/practitioners, as many different methods and techniques coexist with inconsistent input/output requirements and analysis schemes.
Design/methodology/approach
This confusing state of affairs significantly hampers the research progress and practical application of UQ methods in engineering. In the context of engineering analysis, the research efforts on UQ are mostly concentrated in two largely separate fields: structural reliability analysis (SRA) and the stochastic finite element method (SFEM). This paper provides a state-of-the-art review of SRA and SFEM, covering both technology and application aspects. Moreover, unlike standard survey papers that focus primarily on description and explanation, a thorough and rigorous comparative study is performed to test all UQ methods reviewed in the paper on a common set of representative examples.
Findings
Over 20 uncertainty quantification methods in the fields of structural reliability analysis and stochastic finite element methods are reviewed and rigorously tested on carefully designed numerical examples. They include FORM/SORM, importance sampling, subset simulation, the response surface method, surrogate methods, polynomial chaos expansion, the perturbation method, the stochastic collocation method, etc. The review and comparison tests comment and conclude not only on the accuracy and efficiency of each method but also on their applicability in different types of uncertainty propagation problems.
Originality/value
The research fields of structural reliability analysis and stochastic finite element methods have largely been developed separately, although both tackle uncertainty quantification in engineering problems. For the first time, all major uncertainty quantification methods in both fields are reviewed and rigorously tested on a common set of examples. Critical opinions and concluding remarks are drawn from the rigorous comparative study, providing objective evidence-based information for further research and practical applications.
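A minimal example of the kind of cross-check such comparison studies rest on (the linear limit state below is an assumed textbook case, not one of the paper's examples): for a linear limit state g(u) = β − u in standard normal space, FORM's estimate Φ(−β) is exact, so crude Monte Carlo should agree with it.

```python
import numpy as np
from math import erf, sqrt

# FORM failure probability for the linear limit state g(u) = beta - u
beta = 2.0
Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
pf_form = Phi(-beta)                         # exact for a linear g

# Crude Monte Carlo on the same limit state: failure when g(u) < 0
rng = np.random.default_rng(2)
u = rng.standard_normal(2_000_000)
pf_mc = float(np.mean(beta - u < 0.0))
```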
Yunfei Zu, Wenliang Fan, Jingyao Zhang, Zhengling Li and Makoto Ohsaki
Abstract
Purpose
Conversion of correlated random variables into independent variables, especially into independent standard normal variables, is a common technique for estimating the statistical moments of a response and evaluating the reliability of a random system, in which calculating the equivalent correlation coefficient is an important component. The purpose of this paper is to investigate an accurate, efficient and easy-to-implement estimation method for the equivalent correlation coefficient of various incomplete probability systems.
Design/methodology/approach
First, an approach based on Mehler's formula for evaluating the equivalent correlation coefficient is introduced; then, by combining it with polynomial normal transformations, this approach is improved to be valid for various incomplete probability systems, and is named the direct method. Next, with convenient linear reference variables for eight frequently used random variables and an approximation of the Rosenblatt transformation introduced, a further improved implementation without an iteration process is developed, named the simplified method. Finally, several examples are investigated to verify the characteristics of the proposed methods.
Findings
The results of the examples in this paper show that both proposed methods are of high accuracy; by comparison, the proposed simplified method is more effective and convenient.
Originality/value
Based on Mehler's formula, two practical implementations for evaluating the equivalent correlation coefficient are proposed, which are accurate, efficient, easy to implement and valid for various incomplete probability systems.
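The role Mehler's formula plays here can be sketched directly (the expansion coefficients below are illustrative): for standard bivariate normals (U, V) with correlation r, E[He_m(U) He_n(V)] = n! rⁿ when m = n and 0 otherwise, so the correlation of two Hermite-polynomial transforms is a simple polynomial in r:

```python
import numpy as np
from math import factorial

# Correlation of X = sum_n a_n He_n(U) and Y = sum_n b_n He_n(V) via
# Mehler's formula: cov(X, Y) = sum_{n>=1} n! a_n b_n r^n
def rho_xy(r, a, b):
    cov = sum(factorial(n) * a[n] * b[n] * r**n for n in range(1, len(a)))
    sx = sum(factorial(n) * a[n]**2 for n in range(1, len(a))) ** 0.5
    sy = sum(factorial(n) * b[n]**2 for n in range(1, len(b))) ** 0.5
    return cov / (sx * sy)

a = [0.0, 1.0, 0.15, 0.05]   # illustrative third-order expansions
b = [0.0, 1.0, -0.10, 0.02]
r = 0.6

# Monte Carlo check of the closed form
rng = np.random.default_rng(3)
u = rng.standard_normal(500_000)
v = r * u + np.sqrt(1 - r**2) * rng.standard_normal(500_000)
He = lambda w: [np.ones_like(w), w, w**2 - 1.0, w**3 - 3.0 * w]
x = sum(a[n] * He(u)[n] for n in range(4))
y = sum(b[n] * He(v)[n] for n in range(4))
rho_mc = float(np.corrcoef(x, y)[0, 1])
```

Inverting this polynomial relation, from a target correlation of (X, Y) back to the underlying Gaussian r, is what "evaluating the equivalent correlation coefficient" amounts to.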
Chuyu Tang, Hao Wang, Genliang Chen and Shaoqiu Xu
Abstract
Purpose
This paper aims to propose a robust method for non-rigid point set registration, using the Gaussian mixture model and accommodating non-rigid transformations. The posterior probabilities of the mixture model are determined through the proposed integrated feature divergence.
Design/methodology/approach
The method involves an alternating two-step framework, comprising correspondence estimation and subsequent transformation updating. For correspondence estimation, integrated feature divergences including both global and local features, are coupled with deterministic annealing to address the non-convexity problem of registration. For transformation updating, the expectation-maximization iteration scheme is introduced to iteratively refine correspondence and transformation estimation until convergence.
Findings
The experiments confirm that the proposed registration approach exhibits remarkable robustness to deformation, noise, outliers and occlusion for both 2D and 3D point clouds. Furthermore, the proposed method outperforms existing analogous algorithms in terms of time complexity. An application to stabilizing and securing intermodal containers loaded on ships is presented. The results demonstrate that the proposed registration framework exhibits excellent adaptability to real-scan point clouds and achieves comparatively superior alignments in a shorter time.
Originality/value
The integrated feature divergence, involving both global and local information of points, is proven to be an effective indicator for measuring the reliability of point correspondences. This inclusion prevents premature convergence, resulting in more robust registration results for our proposed method. Simultaneously, the total operating time is reduced due to a lower number of iterations.
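The alternating two-step scheme described above can be sketched in a deliberately stripped-down form (synthetic data, a translation-only transform, and plain Gaussian posteriors instead of the paper's integrated feature divergence): the E-step computes soft GMM correspondences, the M-step updates the transform, and the shrinking variance plays the role of the deterministic-annealing temperature.

```python
import numpy as np

# Toy GMM registration: recover the translation between two point sets.
rng = np.random.default_rng(4)
X = rng.random((30, 2))                        # model point set
Y = X + np.array([0.5, -0.3])                  # scene = shifted model
t = np.zeros(2)
sigma2 = 1.0
for _ in range(50):
    # E-step: soft correspondences from pairwise squared distances
    D = ((Y[:, None, :] - (X + t)[None, :, :]) ** 2).sum(-1)
    P = np.exp(-D / (2.0 * sigma2))
    P /= P.sum(axis=1, keepdims=True)          # posteriors over matches
    # M-step: posterior-weighted translation update, then anneal sigma2
    diff = Y[:, None, :] - X[None, :, :]
    t = (P[..., None] * diff).sum(axis=(0, 1)) / P.sum()
    sigma2 = max((P * D).sum() / (2.0 * P.sum()), 1e-8)
```

The broad posteriors of the early (high-temperature) iterations are what guard against the premature convergence the abstract mentions; the posteriors sharpen only as the variance anneals.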
Amos Golan and Robin L. Lumsdaine
Abstract
Although in principle prior information can significantly improve inference, incorporating incorrect prior information will bias the estimates of any inferential analysis. This fact deters many scientists from incorporating prior information into their inferential analyses. In the natural sciences, where experiments are more regularly conducted and can be combined with other relevant information, prior information is often used in inferential analysis, despite it sometimes being nontrivial to specify what that information is and how to quantify it. In the social sciences, however, prior information is often hard to come by and very hard to justify or validate. We review a number of ways to construct such information. This information emerges naturally, either from fundamental properties and characteristics of the systems studied or from logical reasoning about the problems being analyzed. Borrowing from concepts and philosophical reasoning used in the natural sciences, and within an info-metrics framework, we discuss three different, yet complementary, approaches for constructing prior information, with an application to the social sciences.
Ivan Jeliazkov and Esther Hee Lee
Abstract
A major stumbling block in multivariate discrete data analysis is the problem of evaluating the outcome probabilities that enter the likelihood function. Calculation of these probabilities involves high-dimensional integration, making simulation methods indispensable in both Bayesian and frequentist estimation and model choice. We review several existing probability estimators and then show that a broader perspective on the simulation problem can be afforded by interpreting the outcome probabilities through Bayes’ theorem, leading to the recognition that estimation can alternatively be handled by methods for marginal likelihood computation based on the output of Markov chain Monte Carlo (MCMC) algorithms. These techniques offer stand-alone approaches to simulated likelihood estimation but can also be integrated with traditional estimators. Building on both branches in the literature, we develop new methods for estimating response probabilities and propose an adaptive sampler for producing high-quality draws from multivariate truncated normal distributions. A simulation study illustrates the practical benefits and costs associated with each approach. The methods are employed to estimate the likelihood function of a correlated random effects panel data model of women's labor force participation.
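The outcome probabilities being simulated here are multivariate normal (orthant) integrals. In the bivariate case there is a closed form, P(Y₁ > 0, Y₂ > 0) = 1/4 + arcsin(ρ)/(2π), which makes a convenient sanity check for any simulated-likelihood estimator; plain Monte Carlo stands in below for the GHK-style simulators the abstract discusses.

```python
import numpy as np

# Bivariate normal orthant probability: simulation vs. closed form
rho = 0.5
rng = np.random.default_rng(5)
z1 = rng.standard_normal(2_000_000)
z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(2_000_000)
p_sim = float(np.mean((z1 > 0) & (z2 > 0)))
p_exact = 0.25 + np.arcsin(rho) / (2.0 * np.pi)   # = 1/3 for rho = 0.5
```

In higher dimensions no such closed form exists, which is exactly why the simulation estimators reviewed in the paper are indispensable.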