Search results

1 – 10 of 521
Article
Publication date: 4 September 2018

Muhannad Aldosary, Jinsheng Wang and Chenfeng Li

This paper aims to provide a comprehensive review of uncertainty quantification methods supported by evidence-based comparison studies. Uncertainties are widely encountered in…

Abstract

Purpose

This paper aims to provide a comprehensive review of uncertainty quantification methods supported by evidence-based comparison studies. Uncertainties are widely encountered in engineering practice, arising from such diverse sources as heterogeneity of materials, variability in measurement, lack of data and ambiguity in knowledge. Academia and industry have long researched uncertainty quantification (UQ) methods to quantitatively account for the effects of various input uncertainties on the system response. Despite the rich literature of relevant research, UQ is not an easy subject for novice researchers and practitioners, because many different methods and techniques coexist with inconsistent input/output requirements and analysis schemes.

Design/methodology/approach

This confusing state of affairs significantly hampers the research progress and practical application of UQ methods in engineering. In the context of engineering analysis, UQ research efforts are concentrated in two largely separate fields: structural reliability analysis (SRA) and the stochastic finite element method (SFEM). This paper provides a state-of-the-art review of SRA and SFEM, covering both technology and application aspects. Moreover, unlike standard survey papers that focus primarily on description and explanation, a thorough and rigorous comparative study is performed to test all UQ methods reviewed in the paper on a common set of representative examples.

Findings

Over 20 uncertainty quantification methods in the fields of structural reliability analysis and stochastic finite element methods are reviewed and rigorously tested on carefully designed numerical examples. They include FORM/SORM, importance sampling, subset simulation, the response surface method, surrogate methods, polynomial chaos expansion, the perturbation method and the stochastic collocation method, among others. The review and comparison tests comment not only on the accuracy and efficiency of each method but also on its applicability to different types of uncertainty propagation problems.

Originality/value

The research fields of structural reliability analysis and stochastic finite element methods have largely been developed separately, although both tackle uncertainty quantification in engineering problems. For the first time, all major uncertainty quantification methods in both fields are reviewed and rigorously tested on a common set of examples. Critical opinions and concluding remarks are drawn from the rigorous comparative study, providing objective evidence-based information for further research and practical applications.
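The methods this review compares can be illustrated on a toy problem. The sketch below is not from the paper: it uses an invented linear limit state, estimates the failure probability by crude Monte Carlo, and checks it against FORM, which is exact for a linear limit state in standard normal space.

```python
import math
import random

def g(x1, x2):
    # Hypothetical linear limit state: the system fails when g < 0.
    return 3.0 - (x1 + x2)

def mc_failure_probability(n_samples, seed=0):
    # Crude Monte Carlo: count failures over independent standard-normal draws.
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n_samples)
                   if g(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) < 0.0)
    return failures / n_samples

def form_failure_probability():
    # FORM: the reliability index beta is the distance from the origin to the
    # failure plane in standard normal space; exact here because g is linear.
    beta = 3.0 / math.sqrt(2.0)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta)

pf_mc = mc_failure_probability(100_000)
pf_form = form_failure_probability()
```

For non-linear limit states or rare events the two estimates diverge, which is where the paper's comparison of importance sampling, subset simulation and surrogate methods becomes relevant.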

Details

Engineering Computations, vol. 35 no. 6
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 25 July 2019

S. Khodaygan

The purpose of this paper is to present a novel Kriging meta-model assisted method for multi-objective optimal tolerance design of the mechanical assemblies based on the operating…

Abstract

Purpose

The purpose of this paper is to present a novel Kriging meta-model assisted method for multi-objective optimal tolerance design of mechanical assemblies based on the operating conditions under both systematic and random uncertainties.

Design/methodology/approach

In the proposed method, performance, quality loss and manufacturing cost are formulated as the main criteria in terms of systematic and random uncertainties. To investigate the mechanical assembly under operating conditions, the behavior of the assembly is simulated with finite element analysis (FEA). The objective functions in terms of uncertainties at the operating conditions are modeled through Kriging-based metamodeling built on the results of the FEA simulations. The optimal tolerance allocation procedure is then formulated as a multi-objective optimization framework. To solve the optimization problem with multiple conflicting objectives, the multi-objective particle swarm optimization method is used. A Shannon entropy-based TOPSIS is then used to select the best tolerances from the optimal Pareto solutions.
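The final selection step described here, entropy-weighted TOPSIS over a Pareto set, can be sketched generically. The decision matrix below is invented for illustration and does not come from the paper; the columns stand in for the performance, quality loss and manufacturing cost criteria.

```python
import math

def entropy_weights(matrix):
    # Shannon-entropy objective weights: criteria that vary more across
    # the alternatives receive larger weights.
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    raw = []
    for j in range(n):
        col_sum = sum(row[j] for row in matrix)
        p = [row[j] / col_sum for row in matrix]
        entropy = -k * sum(pi * math.log(pi) for pi in p if pi > 0.0)
        raw.append(1.0 - entropy)
    total = sum(raw)
    return [w / total for w in raw]

def topsis_closeness(matrix, weights, benefit):
    # TOPSIS: rank alternatives by relative closeness to the ideal solution.
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    cols = [[v[i][j] for i in range(m)] for j in range(n)]
    ideal = [max(c) if b else min(c) for c, b in zip(cols, benefit)]
    worst = [min(c) if b else max(c) for c, b in zip(cols, benefit)]
    scores = []
    for i in range(m):
        d_plus = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_minus = math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores

# Invented Pareto solutions: columns are (performance, quality loss, cost);
# performance is a benefit criterion, the other two are costs.
pareto = [[0.90, 2.1, 5.0],
          [0.85, 1.8, 4.2],
          [0.80, 1.5, 3.9],
          [0.70, 1.2, 3.1]]
w = entropy_weights(pareto)
scores = topsis_closeness(pareto, w, benefit=[True, False, False])
best = max(range(len(scores)), key=scores.__getitem__)
```

The entropy weighting is purely data-driven, which fits an automated tolerance design loop where no decision-maker preferences are available.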

Findings

The proposed method can be used for optimal tolerance design of mechanical assemblies under operating conditions, including both random and systematic uncertainties. To reach an accurate model of the design function at the operating conditions, Kriging meta-modeling is used. The efficiency of the proposed method is illustrated through a case study, and the method is verified by comparison with a conventional tolerance allocation method. The obtained results show that the proposed method leads to a product with more robust performance and higher quality than the conventional results.

Research limitations/implications

The proposed method is limited to dimensional tolerances of components with normal distributions.

Practical implications

The proposed method is easy to automate for computer-aided tolerance design in industrial applications.

Originality/value

In conventional approaches, tolerances are allocated based on assembly conditions, disregarding the systematic and random uncertainties that arise under operating conditions. As uncertainties can significantly affect the system’s performance at operating conditions, tolerance allocation that excludes these effects may be inefficient. This paper fills this gap in the literature by considering both systematic and random uncertainties for multi-objective optimal tolerance design of mechanical assemblies under operating conditions.

Article
Publication date: 15 June 2010

Emad Samadiani and Yogendra Joshi

The purpose of this paper is to review the available reduced order modeling approaches in the literature for predicting the flow and especially temperature fields inside data…

Abstract

Purpose

The purpose of this paper is to review the reduced order modeling approaches available in the literature for predicting the flow and especially temperature fields inside data centers in terms of the involved design parameters.

Design/methodology/approach

This paper begins with a motivation for flow/thermal modeling needs for designing an energy‐efficient thermal management system in data centers. Recent studies on air velocity and temperature field simulations in data centers through computational fluid dynamics/heat transfer (CFD/HT) are reviewed. Meta‐modeling and reduced order modeling are tools for generating accurate and rapid surrogate models of a complex system. These tools are reviewed, with a focus on low‐dimensional models of turbulent flows. Reduced order modeling techniques based on the identification of turbulent coherent structures, in particular the proper orthogonal decomposition (POD), are explained and reviewed in more detail. Then, the available approaches for rapid thermal modeling of data centers are reviewed. Finally, recent studies on generating POD‐based reduced order thermal models of data centers are reviewed, and representative results are presented and compared for a case study.

Findings

It is concluded that low‐dimensional models are needed to predict the multi-parameter dependent thermal behavior of data centers accurately and rapidly for design and control purposes. POD‐based techniques have shown strong approximation capability for multi-parameter thermal modeling of data centers. It is believed that wavelet‐based techniques, due to their ability to separate coherent from incoherent structures (something that POD cannot do), can be considered new and promising tools for reduced order thermal modeling of complex electronic systems such as data centers.
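The snapshot-POD procedure this review covers can be sketched in a few lines. Everything below is a synthetic stand-in: the "temperature fields" are invented analytic profiles, not data-center measurements, and the sketch only shows how the POD modes, energy criterion and reduced-order reconstruction fit together.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical snapshot matrix: each column is a "temperature field"
# (200 grid points) sampled at one of 30 operating conditions.
grid = np.linspace(0.0, 1.0, 200)
snapshots = np.column_stack([
    np.sin(np.pi * grid) * (1.0 + 0.1 * a) + 0.05 * b * np.cos(3 * np.pi * grid)
    for a, b in rng.normal(size=(30, 2))
])

mean_field = snapshots.mean(axis=1, keepdims=True)
fluctuations = snapshots - mean_field

# POD modes are the left singular vectors of the fluctuation matrix;
# singular values measure the energy captured by each mode.
U, s, _ = np.linalg.svd(fluctuations, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
n_modes = int(np.searchsorted(energy, 0.99) + 1)  # 99% energy criterion

# Reduced-order reconstruction of the first snapshot from n_modes modes.
modes = U[:, :n_modes]
coeffs = modes.T @ fluctuations[:, 0]
reconstruction = mean_field[:, 0] + modes @ coeffs
```

Because the synthetic fields vary in only two spatial patterns, a handful of modes capture virtually all the energy; real data-center fields need more modes, but the dimensionality reduction is the same idea.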

Originality/value

The paper reviews different numerical methods and provides the reader with some insight for reduced order thermal modeling of complex convective systems such as data centers.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 20 no. 5
Type: Research Article
ISSN: 0961-5539

Keywords

Book part
Publication date: 18 January 2022

Chrystalleni Aristidou, Kevin Lee and Kalvinder Shields

A novel approach to modeling exchange rates is presented based on a set of models distinguished by the drivers of the rate and regime duration. The models are combined into a …

Abstract

A novel approach to modeling exchange rates is presented based on a set of models distinguished by the drivers of the rate and regime duration. The models are combined into a “meta model” using model averaging and non-nested hypothesis-testing techniques. The meta model accommodates periods of stability and slowly evolving or abruptly changing regimes involving multiple drivers. Estimated meta models for five exchange rates provide a compelling characterization of their determination over the last 40 years or so, identifying “phases” during which the influences from policy and financial market responses to news succumb to equilibrating macroeconomic pressures and vice versa.

Details

Essays in Honor of M. Hashem Pesaran: Prediction and Macro Modeling
Type: Book
ISBN: 978-1-80262-062-7

Keywords

Article
Publication date: 1 June 1994

Harry Alder

Reviews the advances in creative thinking since De Bono's Lateral Thinking, including work on brain hemispheres, with particular reference to practical techniques that can be…

Abstract

Reviews the advances in creative thinking since De Bono's Lateral Thinking, including work on brain hemispheres, with particular reference to practical techniques that can be applied in business. Outlines a general approach, largely concerned with how the creative right brain is best harnessed, and drawing, inter alia, on current research among top British business leaders. Describes a number of specific techniques, including what are termed Chunking, Sleight of Mouth, Reversals, Metaphors, the Meta Model, and different forms of Visualization. Some of these have been developed within Neuro Linguistic Programming (NLP). A new technique, Chunked Reversals, has been developed by the author. These techniques allow, typically, hundreds of ideas to be generated and applied to strategic or operational business objectives, in a more focused way than traditional brainstorming or lateral thinking.

Details

Management Decision, vol. 32 no. 4
Type: Research Article
ISSN: 0025-1747

Keywords

Article
Publication date: 5 October 2015

Xiaoke Li, Haobo Qiu, Zhenzhong Chen, Liang Gao and Xinyu Shao

Kriging model has been widely adopted to reduce the high computational costs of simulations in Reliability-based design optimization (RBDO). To construct the Kriging model…

Abstract

Purpose

The Kriging model has been widely adopted to reduce the high computational cost of simulations in reliability-based design optimization (RBDO). To construct the Kriging model accurately and efficiently in the region of significance, a local sampling method with variable radius (LSVR) is proposed. The paper aims to discuss these issues.

Design/methodology/approach

In LSVR, the sequential sampling points are mainly selected within the local region around the current design point. The size of the local region is adaptively defined according to the target reliability and the nonlinearity of the probabilistic constraint. Every probabilistic constraint has its own local region instead of all constraints sharing one local region. In the local sampling region, the points located on the constraint boundary and the points with high uncertainty are considered simultaneously.
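Selecting points on the constraint boundary and points with high uncertainty is commonly done with the expected feasibility function (EFF) of Bichon et al. (2008); the sketch below uses that standard form, not necessarily this paper's exact formulation, and the Kriging predictions at the candidate points are synthetic stand-ins.

```python
import math

def norm_pdf(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def norm_cdf(u):
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def eff(mu, sigma, z_bar=0.0):
    # Expected feasibility function (Bichon et al., 2008) for the limit state
    # G(x) = z_bar, given the Kriging prediction G(x) ~ N(mu, sigma^2).
    eps = 2.0 * sigma
    t0 = (z_bar - mu) / sigma
    tm = (z_bar - eps - mu) / sigma
    tp = (z_bar + eps - mu) / sigma
    return ((mu - z_bar) * (2.0 * norm_cdf(t0) - norm_cdf(tm) - norm_cdf(tp))
            - sigma * (2.0 * norm_pdf(t0) - norm_pdf(tm) - norm_pdf(tp))
            + eps * (norm_cdf(tp) - norm_cdf(tm)))

# Synthetic Kriging (mean, std) predictions at three candidate points: the
# point whose prediction straddles the limit state with high uncertainty
# scores highest and would be sampled next.
candidates = {"far": (5.0, 0.5), "near": (0.2, 0.5), "certain": (0.2, 0.01)}
scores = {name: eff(mu, sd) for name, (mu, sd) in candidates.items()}
best = max(scores, key=scores.get)
```

EFF is large only where the predictor is both close to the limit state and uncertain, which is exactly the trade-off the local sampling region is meant to exploit.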

Findings

The computational capability of the proposed method is demonstrated using two mathematical problems, a reducer design and a box girder design of a super heavy machine tool. The comparison results show that the proposed method is very efficient and accurate.

Originality/value

The main contribution of this paper is a new computational criterion for defining the local sampling region in Kriging. Its originality lies in using the expected feasibility function (EFF) criterion together with the shortest distance to the existing sample points, instead of other types of sequential sampling criteria, to address the low-efficiency problem.

Details

Engineering Computations, vol. 32 no. 7
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 30 August 2011

Kwang‐Ki Lee, Kwon‐Hee Lee and Seung‐Ho Han

Approximation techniques were used instead of expensive computing analysis in a traditional parametric design optimization of a complex system. A Kriging meta‐model was utilized…

Abstract

Purpose

Approximation techniques are used in place of expensive computational analysis in traditional parametric design optimization of a complex system. A Kriging meta‐model was utilized, which enabled the fit of approximated design characteristics for a complex system such as turbine blades, which incorporate a large number of design variables and non‐linear behaviors. This paper aims to discuss these issues.

Design/methodology/approach

The authors constructed a Kriging meta‐model with a multi‐level orthogonal array for the design of experiments, which were used to optimize the fatigue life of turbine blades under cyclic rotational loads such as centrifugal force. By combining a seven‐level orthogonal array with the Kriging model, the non‐linear design space of fatigue life was explored and optimized.

Findings

A computer‐generated multi‐level orthogonal array provided a good representation of the non‐linear design space information. The results show that not only was the fatigue life of the leading edge of the blade root significantly improved, but also that the computing analysis was effective.

Originality/value

To maximize the fatigue life of the turbine blade, the three design variables with seven factor levels were optimized via a Kriging meta‐model. As the optimization technique, a desirability function approach was adopted, which converts multiple responses into a single-response problem by maximizing the total desirability.
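The desirability aggregation mentioned above follows a standard pattern (Derringer-Suich). The sketch below is illustrative only: the responses, bounds and targets are invented, not the paper's blade data.

```python
import math

def desirability_larger_is_better(y, low, target, r=1.0):
    # Derringer-Suich one-sided desirability: 0 at or below `low`,
    # 1 at or above `target`, a power ramp in between.
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** r

def total_desirability(responses, specs):
    # Geometric mean converts multiple responses into a single score;
    # any fully undesirable response (d = 0) drives the total to 0.
    ds = [desirability_larger_is_better(y, lo, t)
          for y, (lo, t) in zip(responses, specs)]
    if min(ds) == 0.0:
        return 0.0
    return math.exp(sum(math.log(d) for d in ds) / len(ds))

# Hypothetical responses for two blade designs: (fatigue life in cycles,
# safety margin), with (lower bound, target) specs for each response.
specs = [(1e5, 1e6), (1.0, 2.0)]
design_a = total_desirability([4.0e5, 1.5], specs)
design_b = total_desirability([9.0e5, 1.2], specs)
```

The geometric mean is what makes the single-response formulation sensible: no response can be traded away entirely, since a zero desirability anywhere zeroes the total.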

Details

International Journal of Structural Integrity, vol. 2 no. 3
Type: Research Article
ISSN: 1757-9864

Keywords

Article
Publication date: 12 April 2013

Mina Ranjbarfard, Mohammad Aghdasi, Amir Albadvi and Mohammad Hassanzadeh

The aim of this paper is to develop, test and improve a method that draws upon business process improvement literature and combines it with knowledge management approaches for…

Abstract

Purpose

The aim of this paper is to develop, test and improve a method that draws upon business process improvement literature and combines it with knowledge management approaches for modeling and analyzing knowledge‐intensive business processes.

Design/methodology/approach

By analyzing and integrating the meta models used in previous knowledge-oriented business process research, a preliminary meta model was developed for modeling knowledge‐intensive business processes. Then, an initial version of the Proper Arrangement of Knowledge Management Processes (PAKMP) framework was developed according to knowledge management process approaches. Three rounds of interviews with 137 process members were conducted to test the applicability and completeness of the preliminary meta model and the initial version of the PAKMP framework, and to improve both. In addition, a five-step analysis method based on the application of both the final meta model and the PAKMP framework was developed through a case study. This five-step method was applied in Tehran's Municipality, which served to improve the preliminary meta model and the initial version of the PAKMP framework and endorsed the applicability of the proposed method in the real world.

Findings

This paper contributes to enriching the literature on integrating KM efforts and BPM efforts by presenting a five-step analysis method and testing it in a real case. The method considers both KM and business process management points of view.

Research limitations/implications

The general applicability of the method is limited by the weak generalizability of the single case study.

Originality/value

This paper combines the advantages of business process improvement and knowledge management approaches and suggests a practical method for modeling and analyzing the knowledge management status of knowledge‐intensive business processes. After the analysis, managers should focus on improving the arrangement of KM processes for critical knowledge objects, which improves the performance of knowledge‐intensive business processes by removing KM problems. The paper concludes by suggesting some topics for future research.

Details

Business Process Management Journal, vol. 19 no. 2
Type: Research Article
ISSN: 1463-7154

Keywords

Content available
Article
Publication date: 1 December 2006

Abstract

Details

Kybernetes, vol. 35 no. 10
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 15 March 2018

Cem Savas Aydin, Senim Ozgurler, Mehmet Bulent Durmusoglu and Mesut Ozgurler

This paper aims to present a multi-response robust design (RD) optimization approach for U-shaped assembly cells (ACs) with multi-functional walking-workers by using operational…

Abstract

Purpose

This paper aims to present a multi-response robust design (RD) optimization approach for U-shaped assembly cells (ACs) with multi-functional walking-workers by using operational design (OD) factors in a simulation setting. The proposed methodology incorporated the design factors related to the operation of ACs into an RD framework. Utilization of OD factors provided a practical design approach for ACs addressing system robustness without modifying the cell structure.

Design/methodology/approach

Taguchi’s design philosophy and response surface meta-models have been combined for robust simulation optimization (SO). Multiple performance measures have been considered and concurrently optimized by using a multi-response optimization (MRO) approach. The simulation setting provided flexibility in experimental design selection and facilitated experimentation by avoiding the cost and time constraints of real-world experiments.
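Taguchi-style robustness comparisons of the kind described above typically rank configurations by a signal-to-noise ratio. The sketch below uses the standard smaller-the-better S/N ratio on invented throughput-time replications; it is not the paper's data.

```python
import math

def sn_smaller_is_better(observations):
    # Taguchi smaller-the-better signal-to-noise ratio (dB): it penalizes
    # both a large mean and a large spread, so higher is more robust.
    mean_sq = sum(y * y for y in observations) / len(observations)
    return -10.0 * math.log10(mean_sq)

# Hypothetical throughput-time replications (hours) for two cell
# configurations under the same noise-factor settings: B has a slightly
# worse mean but far less variability, and scores the higher S/N ratio.
config_a = [3.0, 6.0, 9.0]   # mean 6.0, high spread
config_b = [6.1, 6.2, 6.3]   # mean 6.2, low spread
sn_a = sn_smaller_is_better(config_a)
sn_b = sn_smaller_is_better(config_b)
```

This is the trade-off the abstract's findings describe: giving up a little expected performance (optimality) for a large reduction in variability (robustness).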

Findings

The present approach is illustrated through RD of an AC for three performance measures: average throughput time, average WIP inventory and cycle time. Findings are in line with expectations that a significant reduction in performance variability is attainable by trading off optimality for robustness. Reductions in expected performance (optimality) are negligible in comparison with reductions in performance variability (robustness).

Practical implications

ACs designed for robustness are more likely to meet design objectives once they are implemented, preventing changes or roll-backs. Successful implementations serve as examples to shop-floor personnel alleviating issues such as operator/supervisor resistance and scepticism, encouraging participation and facilitating teamwork.

Originality/value

ACs include many activities related to cell operation which can be used for performance optimization. The proposed framework is a realistic design approach using OD factors and considering system stochasticity in terms of noise factors for RD optimization through simulation. To the best of the authors’ knowledge, it is the first time a multi-response RD optimization approach for U-shaped manual ACs with multi-functional walking-workers using factors related to AC operation is proposed.

Details

Assembly Automation, vol. 38 no. 4
Type: Research Article
ISSN: 0144-5154

Keywords
