Search results

1–10 of over 16,000 results
Article
Publication date: 2 December 2019

Andrei Kervalishvili and Ivar Talvik

This paper aims at the reliability analysis of axially loaded steel columns at elevated temperatures, considering the probabilistic features of fire.

Abstract

Purpose

This paper aims at the reliability analysis of axially loaded steel columns at elevated temperatures, considering the probabilistic features of fire.

Design/methodology/approach

The response function used in the reliability analysis is based on non-linear FEM calculations. The stochastic variability of temperature is integrated into the procedure in the same way as the loading and material-property parameters. Direct Monte Carlo simulations (MCSs) are used for the probabilistic analysis. Computational cost is reduced by a polynomial approximation of the column's response function.
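
The abstract does not give the surrogate polynomial or the input distributions, so as a loose illustration of the general workflow only (direct MCS on a polynomial approximation of the column response), the following minimal Python sketch uses entirely hypothetical distributions and coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sim = 1_000_000

# Hypothetical random inputs (not from the paper): load effect E, yield strength fy
# and member temperature theta, each with an assumed distribution.
E = rng.gumbel(loc=180.0, scale=20.0, size=n_sim)               # kN, load effect
fy = rng.lognormal(mean=np.log(355.0), sigma=0.07, size=n_sim)  # MPa, yield strength
theta = rng.normal(loc=550.0, scale=80.0, size=n_sim)           # degC, member temperature

def buckling_resistance(fy, theta):
    """Polynomial surrogate of the FEM response function (illustrative coefficients only)."""
    # Capacity reduction with temperature, fitted here as a quadratic in theta.
    k = np.clip(1.0 - 1.2e-3 * (theta - 400.0) - 6.0e-7 * (theta - 400.0) ** 2, 0.0, 1.0)
    return 0.8 * fy * k  # kN for a notional cross-section

R = buckling_resistance(fy, theta)
pf = np.mean(R < E)  # failure whenever resistance drops below the load effect
print(f"Estimated failure probability: {pf:.2e}")
```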

Findings

A design method for practical applications in the common Eurocode format is proposed. The proposed method can be used to estimate the failure probability of a steel column in fire conditions. When standard reliability criteria are applied, the steel column buckling capacities in fire obtained with the proposed procedure deviate from the Eurocode results in certain parameter ranges.

Originality/value

The proposed method for design calculations retains the advantages of MCS results, while the end user is spared the tedious volume of calculations because predefined factors are built into the Eurocode-format procedure. The proposed method allows better differentiation of the fire probability in the capacity assessment than existing design methods.

Details

Journal of Structural Fire Engineering, vol. 11 no. 2
Type: Research Article
ISSN: 2040-2317

Keywords

Article
Publication date: 27 May 2021

Z.G. Liang, J.L. Chen, Z.W. Zhang, S.C. Zhao and S.S. Wang

In order to explore how the behind-armor debris generated by an explosively formed projectile (EFP) attack on the top armor of a tank damages the internal parts of the vehicle, a method…

Abstract

Purpose

In order to explore how the behind-armor debris generated by an explosively formed projectile (EFP) attack on the top armor of a tank damages the internal parts of the vehicle, a method for calculating the damage probability based on experiments is proposed.

Design/methodology/approach

The equivalent target structure for the rear-effect damage to the equipment and personnel in the vehicle is determined based on an analysis of the vulnerability of the internal equipment and personnel in the tank. An experimental scheme for obtaining the density distribution of the behind-armor debris is designed, and a model is given for calculating the damage probability of the cavitating antipersonnel debris to the key components of the vehicle over the range of scattering angles and for the different fragments.
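
The abstract does not state the form of the calculation model. A common simplification in vulnerability work (assumed here, not taken from the paper) treats lethal fragment hits on a component's presented area as a Poisson process, which gives a compact damage-probability expression:

```python
import math

def damage_probability(debris_density, presented_area, p_kill_per_fragment):
    """P(at least one lethal hit) under a Poisson-hit assumption
    (an illustrative model, not necessarily the paper's).

    debris_density      -- expected fragments per m^2 at the component's location
    presented_area      -- presented area of the component in m^2
    p_kill_per_fragment -- probability that a single impacting fragment disables the component
    """
    expected_lethal_hits = debris_density * presented_area * p_kill_per_fragment
    return 1.0 - math.exp(-expected_lethal_hits)

# Hypothetical numbers for one component inside the equivalent target
print(damage_probability(debris_density=40.0, presented_area=0.05, p_kill_per_fragment=0.3))
```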

Findings

The examples show that the damage probability calculation model can be used to evaluate the damage inflicted by behind-armor debris on the equipment and personnel in the tank.

Originality/value

An experimental model based on an analysis of the vulnerability of the equipment and personnel is proposed to calculate the damage probability of debris impinging on the equipment and personnel in the vehicle. The results are of great value for evaluating the damage to the equipment and personnel in the tank.

Details

Multidiscipline Modeling in Materials and Structures, vol. 17 no. 4
Type: Research Article
ISSN: 1573-6105

Keywords

Article
Publication date: 14 June 2021

Ruirui Shao, Zhigeng Fang, Liangyan Tao, Su Gao and Weiqing You

During the service period of communication satellite systems, their performance is often degraded due to the depletion mechanism. In this paper, the grey system theory is applied…

Abstract

Purpose

During the service period of communication satellite systems, their performance is often degraded due to the depletion mechanism. In this paper, grey system theory is applied to multi-state system effectiveness evaluation, and the grey Lz-transformation ADC (availability, dependability and capability) effectiveness evaluation model is constructed to address the characteristics of the communication satellite system, such as its diverse constituent subsystems, numerous states and the inaccuracy and insufficiency of data.

Design/methodology/approach

The model is based on the ADC effectiveness evaluation method combined with the Lz-transform and uses the definite weighted function of the three-parameter interval grey number as a bridge to incorporate the possibility that system performance exceeds the task demand into the effectiveness solution algorithm. MATLAB (Matrix Laboratory) is used to solve each state probability and to merge identical performance levels in the Lz-transform; the system effectiveness is then computed in Python.
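
As a much-simplified illustration of the Lz-transform bookkeeping described above, using crisp probabilities only and none of the paper's grey-number machinery, each subsystem can be written as performance-probability pairs, combined through a structure function, merged over identical performance levels, and compared against the task demand:

```python
from collections import defaultdict
from itertools import product

def lz_combine(subsystems, structure):
    """Combine subsystem Lz-transforms {performance: probability} with a structure
    function, merging identical performance levels (crisp illustration only)."""
    combined = defaultdict(float)
    for states in product(*(s.items() for s in subsystems)):
        perfs, probs = zip(*states)
        p = 1.0
        for q in probs:
            p *= q
        combined[structure(perfs)] += p
    return dict(combined)

# Two hypothetical subsystems in series: system capability = min of the two capabilities
sub1 = {100: 0.90, 50: 0.08, 0: 0.02}
sub2 = {100: 0.95, 0: 0.05}
system = lz_combine([sub1, sub2], structure=min)

demand = 50
availability = sum(p for g, p in system.items() if g >= demand)
print(system, availability)
```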

Findings

The results show that the G-Lz-ADC model constructed in this paper can accurately evaluate the effectiveness of static/dynamic and certain/uncertain systems, and it is also well suited to evaluating the effectiveness of multi-state complex systems.

Practical implications

The G-Lz-ADC effectiveness evaluation model constructed in this paper can effectively reduce the complexity of traditional effectiveness evaluation models by combining the same performance levels in the Lz-transform and solving the effectiveness of the system with the help of computer programming, providing a new method for the effectiveness evaluation of the complex MSS. At the same time, the weaknesses of the system can be identified, providing a theoretical basis for improving the system’s effectiveness.

Originality/value

A possibility solution method based on the definite weighted function for comparing two three-parameter interval grey numbers is constructed, which compensates for the traditional calculation of the probability from numerical values and the subjective preferences of decision-makers. Meanwhile, the effectiveness evaluation model integrates the basic theories of the three-parameter interval grey number and its definite weighted function, the Grey-Markov model, the grey universal generating function (GUGF), the grey multi-state system (GMSS), etc., making it an innovative method for solving the effectiveness of a multi-state instantaneous communication satellite system.

Details

Grey Systems: Theory and Application, vol. 12 no. 2
Type: Research Article
ISSN: 2043-9377

Keywords

Article
Publication date: 1 October 2006

T. Yuge, K. Tagami and S. Yanagi

Calculating the exact top event probability of fault trees is an important analysis in quantitative risk assessments. However, it is a difficult problem for trees with complex…


Abstract

Purpose

Calculating the exact top event probability of fault trees is an important analysis in quantitative risk assessments. However, it is a difficult problem for trees with complex structure. Therefore, the paper aims to provide an efficient calculation method for obtaining the exact top event probability of a fault tree with many repeated events when the minimal cut sets of the tree model are given.

Design/methodology/approach

The method is based on the inclusion-exclusion principle. Generally, the inclusion-exclusion method runs into computational difficulties for large-scale fault trees. The computation time has been reduced by enumerating only non-canceling terms.
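
For orientation, the brute-force inclusion-exclusion calculation on minimal cut sets can be sketched as below, assuming independent basic events; the paper's contribution, enumerating only the non-canceling terms, is not reproduced here.

```python
from itertools import combinations

def top_event_probability(min_cut_sets, basic_event_probs):
    """Exact top-event probability by inclusion-exclusion over minimal cut sets,
    assuming independent basic events. Repeated events are handled by taking the
    union of the events in each combination of cut sets before multiplying.
    This brute-force version enumerates all 2**m - 1 terms."""
    m = len(min_cut_sets)
    total = 0.0
    for k in range(1, m + 1):
        sign = 1.0 if k % 2 == 1 else -1.0
        for combo in combinations(min_cut_sets, k):
            events = set().union(*combo)
            p = 1.0
            for e in events:
                p *= basic_event_probs[e]
            total += sign * p
    return total

# Small example with a repeated basic event "a"
cuts = [{"a", "b"}, {"a", "c"}, {"d"}]
probs = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.05}
print(top_event_probability(cuts, probs))
```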

Findings

The method enables the probability to be calculated more quickly than with the conventional method. The effect increases as the number of repeated events increases, that is, as the tree structure becomes more complex. The method can also be applied to obtain lower and upper bounds of the top event probability easily.

Originality/value

The paper expresses the top event probability by using only non‐canceling terms. This is the first application in fault tree analysis.

Details

Journal of Quality in Maintenance Engineering, vol. 12 no. 4
Type: Research Article
ISSN: 1355-2511

Keywords

Article
Publication date: 5 January 2010

Ron Layman, Samy Missoum and Jonathan Vande Geest

The use of stent‐grafts to canalize aortic blood flow for patients with aortic aneurysms is subject to serious failure mechanisms such as a leak between the stent‐graft and the…

Abstract

Purpose

The use of stent‐grafts to canalize aortic blood flow for patients with aortic aneurysms is subject to serious failure mechanisms such as a leak between the stent‐graft and the aorta (Type I endoleak). The purpose of this paper is to describe a novel computational approach to understand the influence of relevant variables on the occurrence of stent‐graft failure and quantify the probability of failure for aneurysm patients.

Design/methodology/approach

A parameterized fluid‐structure interaction finite element model of aortic aneurysm is built based on a multi‐material formulation available in LS‐DYNA. Probabilities of failure are assessed using an explicit construction of limit state functions with support vector machines (SVM) and uniform designs of experiments. The probabilistic approach is applied to two aneurysm geometries to provide a map of probabilities of failure for various design parameter values.
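
A rough sketch of the SVM-based explicit design space decomposition idea is given below, with a toy analytic limit state standing in for the LS-DYNA fluid-structure runs and scikit-learn's SVC used as the classifier; both are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Toy stand-in for the expensive fluid-structure simulation: labels a design point
# as failed (1, e.g. endoleak) or safe (0). The real limit state comes from LS-DYNA runs.
def simulate_failure(x):
    return (x[:, 0] + 0.8 * x[:, 1] ** 2 > 1.2).astype(int)

# Uniform design of experiments over two normalized design parameters in [0, 1]^2
doe = rng.uniform(0.0, 1.0, size=(200, 2))
labels = simulate_failure(doe)

# Explicit design-space decomposition: the SVM boundary approximates the limit state
svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(doe, labels)

# Monte Carlo on the cheap SVM classifier instead of the expensive solver
samples = rng.uniform(0.0, 1.0, size=(100_000, 2))
p_failure = svm.predict(samples).mean()
print(f"Estimated probability of failure: {p_failure:.3f}")
```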

Findings

Parametric studies conducted in the course of this research successfully identified intuitive failure regions in the parameter space, and failure probabilities were calculated using both a simplified and more complex aneurysmal geometry.

Originality/value

This research introduces the use of SVM‐based explicit design space decomposition for probabilistic assessment applied to bioengineering problems. This technique allows one to efficiently calculate probabilities of failure. It is particularly suited for problems where outcomes can only be classified as safe or failed (e.g. leak or no leak). Finally, the proposed fluid‐structure interaction simulation accounts for the initiation of Type I endoleak between the graft and the aneurysm due to simultaneous fluid and solid forces.

Details

Engineering Computations, vol. 27 no. 1
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 21 June 2013

Wichai Chattinnawat

This research aims to investigate the differences in designing the zero acceptance number single sampling plans using the apparent fraction of nonconforming and the binomial…

Abstract

Purpose

This research aims to investigate the differences in designing the zero acceptance number single sampling plans using the apparent fraction of nonconforming and the binomial distribution against the exact convolute compound hypergeometric distribution when both types of inspection errors are present.

Design/methodology/approach

This research presents the derivation and uses a numerical study to compare the calculated probability of acceptance and the minimum sample size when the present design concept based on the binomial distribution is used with the true fraction of nonconforming replaced by the apparent one. In the presence of inspection errors and with a zero acceptance number, the probability of acceptance is alternatively derived and presented as a function of the probability generating function. A numerical method is used to determine the differences in the probability of acceptance, and the computation of the minimum sample sizes is presented along with the numerical results and the comparison.
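
A minimal sketch of the binomial/apparent-fraction side of the comparison follows (the exact convolute compound hypergeometric calculation is not reproduced). It assumes the usual apparent fraction nonconforming p_e = p(1 - e2) + (1 - p)e1; with acceptance number zero, the binomial probability of acceptance collapses to (1 - p_e)^n.

```python
def apparent_fraction_nonconforming(p, e1, e2):
    """Apparent fraction nonconforming for true fraction p, Type I error e1
    (rejecting good items) and Type II error e2 (accepting bad items)."""
    return p * (1.0 - e2) + (1.0 - p) * e1

def prob_acceptance_zero_c(n, p, e1, e2):
    """Binomial probability of acceptance for a c = 0 plan using the apparent fraction."""
    pe = apparent_fraction_nonconforming(p, e1, e2)
    return (1.0 - pe) ** n

# Illustrative values only: even 1% inspection errors sharply reduce the probability of acceptance
print(prob_acceptance_zero_c(n=125, p=0.001, e1=0.0, e2=0.0))    # no inspection errors
print(prob_acceptance_zero_c(n=125, p=0.001, e1=0.01, e2=0.01))  # 1% Type I and Type II errors
```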

Findings

When inspection errors are present, the probability of acceptance decreases sharply, even for 1 percent Type I (rejecting good product) and Type II (accepting bad product) inspection errors. The binomial apparent-nonconforming notion over-estimates the probability of acceptance compared with the exact convolute compound hypergeometric notion under zero acceptance single sampling plans, especially at low fractions of nonconforming, i.e. at six sigma quality levels. The differences in the calculated probabilities of acceptance and the minimum sample sizes decrease as the Type II inspection error increases, for a fixed Type I error and consumer risk.

Originality/value

This research presents an alternative mathematical derivation along with a numerical study to demonstrate the over-estimation of the probability of acceptance and the minimum sample size when the existing methodology for designing zero acceptance number single sampling plans is used. This finding will help improve the sampling design strategy for multistage production systems.

Details

International Journal of Quality & Reliability Management, vol. 30 no. 6
Type: Research Article
ISSN: 0265-671X

Keywords

Article
Publication date: 4 January 2013

Ozan Çakır

The purpose of this paper is to illustrate an alternative number comparison process to the original procedure utilized under the grey extent analysis.

Abstract

Purpose

The purpose of this paper is to illustrate an alternative number comparison process to the original procedure utilized under the grey extent analysis.

Design/methodology/approach

The number comparison process is visualized on simple diagrams constructed on a Cartesian coordinate system. Discernible areas that assist the related probability calculation are identified according to compliance with the desired dominance conditions.
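
Under the stated assumption that the whitenization values are uniformly distributed over the grey-number domains, the quantity being visualized is the fraction of the Cartesian rectangle that satisfies the dominance condition. A small numerical stand-in for that area calculation, with illustrative intervals only, is:

```python
import numpy as np

def possibility_x_geq_y(x_interval, y_interval, n=2000):
    """P(X >= Y) for X, Y uniform on two grey-number domains, computed as the fraction
    of the Cartesian rectangle lying on or below the line y = x (a numerical stand-in
    for the area-based visualization)."""
    xs = np.linspace(*x_interval, n)
    ys = np.linspace(*y_interval, n)
    X, Y = np.meshgrid(xs, ys)
    return np.mean(X >= Y)

# Two hypothetical grey numbers with overlapping domains
print(possibility_x_geq_y((2.0, 6.0), (4.0, 8.0)))  # about 0.125
```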

Findings

By visualization, the author was able to obtain exactly the same probability information as that attainable with the original number comparison process.

Research limitations/implications

The reach of this methodology is constrained only by the limitation of the original method, namely that the whitenization values of interest are uniformly distributed over the domains of the related grey numbers.

Originality/value

Number comparison with visualization is easy to comprehend and practice, and it requires neither probability formulas nor an understanding of probability concepts.

Article
Publication date: 21 December 2023

Libiao Bai, Xuyang Zhao, ShuYun Kang, Yiming Ma and BingBing Zhang

Research and development (R&D) projects are often pursued through a project portfolio (PP). R&D PPs involve many stakeholders, and without proactive management, their interactions…

Abstract

Purpose

Research and development (R&D) projects are often pursued through a project portfolio (PP). R&D PPs involve many stakeholders, and without proactive management, their interactions may lead to conflict risks. These conflict risks change dynamically with different stages of the PP life cycle, increasing the challenge of PP risk management. Existing conflict risk research mainly focuses on source identification but lacks risk assessment work. To better manage the stakeholder conflict risks (SCRs) of R&D PPs, this study employs the dynamic Bayesian network (DBN) to construct its dynamic assessment model.

Design/methodology/approach

This study constructs a DBN model to assess the SCRs in R&D PPs. First, an indicator system of SCRs is constructed from the life cycle perspective. Then, the risk relationships within each R&D PP life cycle stage are identified via interpretative structural modeling (ISM). The prior and conditional probabilities of the risks are obtained by expert judgment and Monte Carlo simulation (MCS). Finally, the crucial SCRs at each stage are identified using propagation analysis, and the corresponding risk responses are proposed.
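
As a toy illustration of how a dynamic Bayesian network propagates a risk's state probabilities across life-cycle stages, the sketch below uses a single two-state conflict-risk node with an invented transition matrix rather than the paper's expert-elicited model.

```python
import numpy as np

# Hypothetical two-state ("low", "high") conflict-risk node tracked over four PP
# life-cycle stages; the transition matrix plays the role of the inter-slice arcs
# of a DBN (values are illustrative, not elicited from experts as in the paper).
prior = np.array([0.8, 0.2])               # P(low), P(high) at stage 1
transition = np.array([[0.85, 0.15],       # P(next state | currently low)
                       [0.30, 0.70]])      # P(next state | currently high)

stage_probs = [prior]
for _ in range(3):                         # stages 2 to 4
    stage_probs.append(stage_probs[-1] @ transition)

for stage, probs in enumerate(stage_probs, start=1):
    print(f"Stage {stage}: P(high conflict risk) = {probs[1]:.3f}")
```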

Findings

The results of the study identify the crucial risks at each stage. Also, for the crucial risks, this study suggests appropriate risk response strategies to help managers better perform risk response activities.

Originality/value

This study dynamically assesses the stakeholder conflict risks in R&D PPs from a life-cycle perspective, extending the stakeholder risk management research. Meanwhile, the crucial risks are identified at each stage accordingly, providing managerial insights for R&D PPs.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Keywords

Article
Publication date: 23 January 2009

Shao Jiye, Xu Minqiang and Wang Rixin

The purpose of this paper is to deal with the faults of an aeroengine rotor system, which involve considerable uncertainty, and to design a structural diagnosis framework for the rotor.


Abstract

Purpose

The purpose of this paper is to deal with the faults of an aeroengine rotor system, which involve considerable uncertainty, and to design a structural diagnosis framework for the rotor.

Design/methodology/approach

A Bayesian network (BN) is especially suited to capturing and reasoning with uncertainty. This paper adopts BN techniques to compute the probability of fault occurrence from system information. The rotor system is analyzed in detail, the common faults and their corresponding fault symptoms are extracted, and the rotor's BN model is then established based on this information. A framework for the fault diagnosis system based on the network model is also developed. Using this model, the conditional probabilities of the faults are computed when observations of the rotor are presented.
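
A minimal sketch of the kind of conditional-probability computation such a BN performs is given below, reduced to a naive-Bayes simplification with invented faults, priors and symptom likelihoods rather than the paper's model.

```python
import numpy as np

# Hypothetical rotor faults and prior probabilities (illustrative values only)
faults = ["unbalance", "misalignment", "rub", "looseness"]
prior = np.array([0.4, 0.3, 0.2, 0.1])

# P(symptom present | fault): one row per fault, one column per symptom, with symptoms
# assumed conditionally independent given the fault (a simplification of a full BN)
symptoms = ["1x vibration", "2x vibration", "high axial vibration"]
likelihood = np.array([[0.90, 0.20, 0.10],
                       [0.40, 0.85, 0.60],
                       [0.70, 0.50, 0.20],
                       [0.60, 0.40, 0.30]])

observed = np.array([1, 1, 0])  # 1x and 2x vibration observed, axial vibration absent

# P(evidence | fault) for independent symptoms, then Bayes' rule
evidence = np.prod(np.where(observed == 1, likelihood, 1.0 - likelihood), axis=1)
posterior = prior * evidence
posterior /= posterior.sum()

for f, p in zip(faults, posterior):
    print(f"P({f} | observations) = {p:.3f}")
```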

Findings

The diagnosis method developed is used to diagnose four actual kinds of rotor faults. The BN model can identify the faults that occurred from the computed probabilities.

Originality/value

The BN-based diagnosis system described in this paper performs satisfactorily and can handle the faults of the rotor.

Details

Aircraft Engineering and Aerospace Technology, vol. 81 no. 1
Type: Research Article
ISSN: 0002-2667

Keywords

Book part
Publication date: 15 January 2010

Jeffrey P. Newman

Mixed logit models can represent heterogeneity across individuals, in both observed and unobserved preferences, but require computationally expensive calculations to compute…

Abstract

Mixed logit models can represent heterogeneity across individuals, in both observed and unobserved preferences, but require computationally expensive calculations to compute probabilities. A few methods for including error covariance heterogeneity in closed-form models have been proposed, and this paper adds to that collection, introducing a new form of the Network GEV model that sub-parameterizes the allocation values for the assignment of alternatives (and sub-nests) to nests. This change allows the incorporation of systematic (non-random) error covariance heterogeneity across individuals while maintaining a closed form for the calculation of choice probabilities. Also explored is a latent class model of nested models, which can similarly express heterogeneity. The heterogeneous models are compared to a similar model with homogeneous covariance in a realistic scenario and are shown to significantly outperform it, with especially large improvements in certain market segments. The results also suggest that the two heterogeneous models introduced herein may be functionally equivalent.
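
For context, the closed-form choice probabilities that GEV-family models preserve can be illustrated with a plain two-level nested logit (not the Network GEV form proposed in the chapter); the utilities, nests and scale parameters below are hypothetical.

```python
import math

def nested_logit_probs(utilities, nests, mu_nests):
    """Closed-form choice probabilities for a two-level nested logit.

    utilities -- {alternative: systematic utility V}
    nests     -- {nest: [alternatives]}
    mu_nests  -- {nest: nest scale parameter in (0, 1]}
    """
    # Nest-level terms: lambda_m * log-sum of exp(V / lambda_m)
    logsums = {nest: mu_nests[nest] * math.log(sum(math.exp(utilities[a] / mu_nests[nest]) for a in alts))
               for nest, alts in nests.items()}
    denom = sum(math.exp(ls) for ls in logsums.values())

    probs = {}
    for nest, alts in nests.items():
        mu = mu_nests[nest]
        p_nest = math.exp(logsums[nest]) / denom
        within_denom = sum(math.exp(utilities[a] / mu) for a in alts)
        for a in alts:
            probs[a] = p_nest * math.exp(utilities[a] / mu) / within_denom
    return probs

# Hypothetical mode-choice example with one degenerate nest and one transit nest
V = {"car": 0.5, "bus": -0.2, "rail": 0.0}
nests = {"auto": ["car"], "transit": ["bus", "rail"]}
print(nested_logit_probs(V, nests, mu_nests={"auto": 1.0, "transit": 0.6}))
```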

Details

Choice Modelling: The State-of-the-art and The State-of-practice
Type: Book
ISBN: 978-1-84950-773-8
