Search results

21 – 30 of 290
Article
Publication date: 31 July 2019

Enying Li, Zheng Zhou, Hu Wang and Kang Cai

Abstract

Purpose

This study aims to suggest and develop a global sensitivity analysis-assisted multi-level sequential optimization method for the heat transfer problem.

Design/methodology/approach

Compared with other surrogate-assisted optimization methods, the distinctive characteristic of the suggested method is to decompose the original problem into several layers according to the global sensitivity index. The optimization starts with the few most important design variables, using the support vector regression-based efficient global optimization method. As the optimization progresses, the remaining design variables are brought into the optimization one by one or fixed at their set values. Accordingly, in each layer the design space is reduced according to the previous optimization result. To improve the accuracy of the global sensitivity index, a novel global sensitivity analysis method is introduced, based on the variance-based method and incorporating a random sampling high-dimensional model representation.
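The variance-based indices that drive the layered decomposition can be illustrated with a minimal sketch: a pick-freeze Monte Carlo estimator of first-order Sobol indices (function, sample size and helper names are illustrative, not taken from the paper):

```python
import numpy as np

def first_order_sobol(f, d, n=500_000, seed=0):
    """Estimate first-order Sobol indices S_i = V_i / V with the
    pick-freeze Monte Carlo estimator on two independent sample sets."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA = f(A)
    mean, var = yA.mean(), yA.var()
    S = np.empty(d)
    for i in range(d):
        AB = B.copy()
        AB[:, i] = A[:, i]          # variable i frozen, the others resampled
        S[i] = np.mean(yA * (f(AB) - mean)) / var
    return S
```

For an additive function the indices recover each variable's share of the output variance, which is exactly the information a sensitivity-driven layering sorts the design variables by.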

Findings

The advantage of this method lies in its capability to solve complicated problems with a limited number of sample points. Moreover, to enhance the reliability of the optimum, the support vector regression-based efficient global optimization is used to optimize within each layer.

Practical implications

The developed optimization tool is built in MATLAB and can be integrated with commercial software, such as ABAQUS and COMSOL. Lastly, this tool is integrated with COMSOL and applied to the plate-fin heat sink design. Compared with the initial design, the temperature is reduced by over 49°C. Moreover, the relationships among all design variables are also disclosed clearly.

Originality/value

The D-MORPH-HDMR is integrated to obtain the coupling relationships among the design variables efficiently. The suggested method decomposes the problem into multiple layers according to the global sensitivity index (GSI). The SVR-based efficient global optimization (SVR-EGO) is used to optimize each sub-problem because of its modeling robustness.

Details

Engineering Computations, vol. 37 no. 2
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 12 June 2017

Khaoula Chikhaoui, Noureddine Bouhaddi, Najib Kacem, Mohamed Guedri and Mohamed Soula

Abstract

Purpose

The purpose of this paper is to develop robust metamodels that propagate parametric uncertainties in the presence of localized nonlinearities, at reduced cost and without significant loss of accuracy.

Design/methodology/approach

The proposed metamodels combine the generalized polynomial chaos expansion (gPCE) for the uncertainty propagation and reduced order models (ROMs). Because it is based on the computation of deterministic responses, the gPCE requires prohibitive computational time for large-scale finite element models, large numbers of uncertain parameters and in the presence of nonlinearities. To overcome this issue, a first metamodel is created by combining the gPCE and a ROM based on enriching the truncated Ritz basis with static residuals that account for the stochastic and nonlinear effects. Extending this to the Craig–Bampton approach leads to a second metamodel.
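The gPCE machinery the abstract refers to can be illustrated in one dimension: project a model response onto probabilists' Hermite polynomials and read the output statistics off the coefficients (a minimal sketch, not the authors' multi-dimensional implementation):

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as H

def pce_coeffs(f, order, nquad=40):
    """Spectral projection of f(X), X ~ N(0, 1), onto probabilists' Hermite
    polynomials He_k: c_k = E[f(X) He_k(X)] / k!, by Gauss-Hermite quadrature."""
    x, w = H.hermegauss(nquad)            # nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2 * np.pi)            # normalise to the standard normal pdf
    fx = f(x)
    return np.array([np.sum(w * fx * H.hermeval(x, [0.0] * k + [1.0]))
                     / factorial(k) for k in range(order + 1)])

def pce_mean_var(c):
    """Orthogonality gives the statistics for free:
    E[Y] = c_0 and Var[Y] = sum over k >= 1 of c_k^2 * k!."""
    var = sum(c[k] ** 2 * factorial(k) for k in range(1, len(c)))
    return c[0], var
```

For Y = exp(X) the truncated expansion reproduces the lognormal mean and variance to high accuracy; in the metamodels above, the same coefficients would be computed from reduced-order model responses instead of the full finite element model.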

Findings

Implementing the metamodels to approximate the time responses of a frame and a coupled micro-beams structure containing localized nonlinearities and stochastic parameters significantly reduces the computational cost, with acceptable loss of accuracy with respect to the reference Latin hypercube sampling method.

Originality/value

The proposed combination of the gPCE and the ROMs leads to a computationally efficient and accurate tool for robust design in the presence of parametric uncertainties and localized nonlinearities.

Details

Engineering Computations, vol. 34 no. 4
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 21 June 2019

Milad Yousefi and Moslem Yousefi

Abstract

Purpose

The complexity and interdisciplinarity of healthcare industry problems make this industry a center of attention for computer-based simulation studies, which provide a proper tool for interaction between decision-makers and experts. The purpose of this study is to present a metamodel-based simulation optimization for an emergency department (ED), allocating human resources so as to minimize door-to-doctor time subject to the problem constraints, namely capacity and budget.

Design/methodology/approach

To meet the objective of this research, data are first collected from a public hospital ED in Brazil, and an agent-based simulation is designed and constructed. Afterwards, three machine-learning approaches, namely, adaptive neuro-fuzzy inference system (ANFIS), feed-forward neural network (FNN) and recurrent neural network (RNN), are used to build an ensemble metamodel through adaptive boosting. Finally, the results from the metamodel are applied in a discrete imperialist competitive algorithm (ICA) for optimization.
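The ensemble step can be pictured with a much simpler weighting scheme than adaptive boosting: combine the member metamodels with weights inversely proportional to their validation error (an illustrative stand-in, not the AdaBoost update the authors use):

```python
import numpy as np

def ensemble_predict(models, X_val, y_val, X_new):
    """Weighted metamodel ensemble: each member is weighted by the inverse
    of its validation mean-squared error, so accurate members dominate."""
    val = np.array([m(X_val) for m in models])    # (n_models, n_val)
    mse = np.mean((val - y_val) ** 2, axis=1)
    w = 1.0 / (mse + 1e-12)                       # small floor avoids divide-by-zero
    w /= w.sum()
    return w @ np.array([m(X_new) for m in models])
```

A member that fits the validation data exactly receives essentially all the weight, mirroring how boosting concentrates on the strongest learners.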

Findings

Analyzing the results shows that the yellow zone section is a potential bottleneck of the ED. After 100 executions of the algorithm, the results show a reduction of 24.82 per cent in door-to-doctor time, with a success rate of 59 per cent.

Originality/value

This study fulfils an identified need to optimize human resources in an ED with less computational time.

Details

Kybernetes, vol. 49 no. 3
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 1 July 1999

Ifeanyi E. Madu

Abstract

Develops a robust maintenance float policy that considers system design parameter settings which not only satisfy the system performance criteria but are also insensitive to various noise conditions. The experimental design strategy employed in the study uses discrete event simulation. The strategy proposed involves solving a maintenance float policy using both the inner and outer arrays advocated by Genichi Taguchi. Initial system variables and their parameter settings were chosen based on a prior study. These system variables were then classified into design factors and noise factors. An experimental design was developed using Taguchi's orthogonal array, after which a simulation experiment was performed and additional data collected. Based on the results, regression was performed with the significant factors and interactions. From the regression analysis, a robust metamodel was developed. A cost model was also proposed.
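The inner-array side of that strategy can be sketched with the smallest standard orthogonal array, L4(2^3), and a main-effects regression metamodel (illustrative factor coding; the study's actual factors and array are larger):

```python
import numpy as np

# L4(2^3): four runs accommodate three two-level factors, coded -1 / +1.
# Every pair of columns contains each level combination equally often.
L4 = np.array([[-1, -1, -1],
               [-1,  1,  1],
               [ 1, -1,  1],
               [ 1,  1, -1]])

def fit_metamodel(responses):
    """Main-effects regression metamodel y = b0 + b1*A + b2*B + b3*C
    fitted to the four L4 runs by least squares."""
    X = np.column_stack([np.ones(4), L4])
    coef, *_ = np.linalg.lstsq(X, np.asarray(responses, float), rcond=None)
    return coef
```

Because the columns are mutually orthogonal, the main effects are estimated independently of one another, which is what makes such small designs usable for screening.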

Details

International Journal of Quality & Reliability Management, vol. 16 no. 5
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 28 August 2019

Franck Mastrippolito, Stephane Aubert, Frédéric Ducros and Martin Buisson

Abstract

Purpose

This paper aims to improve the radial basis function (RBF) mesh morphing method. During a shape optimization based on computational fluid dynamics (CFD) solvers, the mesh has to be changed. Two possible strategies are re-meshing and morphing. Morphing is advantageous because it preserves the mesh connectivity, but it must be constrained.

Design/methodology/approach

RBF mesh deformation is one of the most robust and accurate morphing methods. Using a greedy algorithm, the computational cost of the method is reduced. To evaluate the morphing performance, a rib shape optimization is performed using the NSGA-II algorithm coupled to kriging metamodels based on CFD. The morphing method is then compared to a re-meshing strategy.
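The RBF morphing step itself reduces to one interpolation system driven by the prescribed boundary displacements; a minimal Gaussian-kernel sketch (without the polynomial term or the paper's greedy point selection):

```python
import numpy as np

def rbf_morph(boundary_pts, boundary_disp, volume_pts, r=1.0):
    """Interpolate prescribed boundary displacements to volume nodes with a
    Gaussian RBF: solve for kernel coefficients at the boundary points,
    then evaluate the interpolant at the volume points."""
    phi = lambda d: np.exp(-(d / r) ** 2)
    D = np.linalg.norm(boundary_pts[:, None] - boundary_pts[None], axis=-1)
    coeffs = np.linalg.solve(phi(D), boundary_disp)   # one RHS per component
    Dv = np.linalg.norm(volume_pts[:, None] - boundary_pts[None], axis=-1)
    return phi(Dv) @ coeffs
```

The dense solve is what makes large point sets expensive, which is why greedy selection of a small subset of control points (accelerated in the paper via the Schur complement) matters.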

Findings

The authors propose a method, based on the Schur complement, to speed up the greedy process. By reusing information from the previous iteration, smaller linear systems are solved and time is saved. The optimization results highlight the interest of using a morphing-based metamodel with regard to the resolution time and the accuracy of the interpolated solutions.

Originality/value

A new method based on the Schur complement is proposed to speed up the greedy algorithm and is successfully applied to a shape optimization.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 30 no. 9
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 29 July 2014

M.Q. Chau, X. Han, C. Jiang, Y.C. Bai, T.N. Tran and V.H. Truong

Abstract

Purpose

The performance measure approach (PMA) is widely adopted for reliability analysis and reliability-based design optimization because of its robustness and efficiency compared to the reliability index approach. However, it has been reported that PMA involves repeated evaluations of probabilistic constraints and is therefore prohibitively expensive for many large-scale applications. To overcome these disadvantages, the purpose of this paper is to propose an efficient PMA-based reliability analysis technique using radial basis functions (RBFs).

Design/methodology/approach

The RBF is adopted to approximate the implicit limit state functions in combination with a Latin hypercube sampling (LHS) strategy. The advanced mean value method is applied to obtain the most probable point (MPP) at the prescribed target reliability, together with the corresponding probabilistic performance measure, to improve analysis accuracy. A sequential framework is proposed that relocates the sampling center to the obtained MPP and reconstructs the RBF until a convergence criterion is satisfied.
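The LHS strategy is simple to state: stratify each dimension into n equal slices and place exactly one sample in every slice. A minimal sketch (illustrative helper, not the authors' implementation):

```python
import numpy as np

def latin_hypercube(n, d, seed=0):
    """Latin hypercube sample of n points in [0, 1]^d: in every dimension,
    each of the n equal strata receives exactly one point."""
    rng = np.random.default_rng(seed)
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n   # one point per stratum
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]                 # decouple the dimensions
    return u
```

Compared with plain Monte Carlo, the stratification guarantees that even a small training set covers every one-dimensional projection of the design space, which is why LHS is the usual choice for seeding surrogate models.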

Findings

The method requires less computation time than PMA based on the actual model, and the computed probabilistic performance measures closely match the reference solution. Five numerical examples are presented to demonstrate the effectiveness of the proposed method.

Originality/value

The main contribution of this paper is a new reliability analysis technique using a reconstructed RBF approximate model. Its originality lies in investigating PMA with metamodel techniques and in using RBF, rather than other types of metamodels, to address the low-efficiency problem.

Details

Engineering Computations, vol. 31 no. 6
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 7 September 2012

Ali Zamani, Ahmad Mirabadi and Felix Schmid

Abstract

Purpose

In this paper, the authors investigated the use of electromagnetic sensors in axle counter applications for train wheel detection. The purpose of this paper is to improve the detection capability of train wheel detectors, by installing them in the optimal orientation and position, using finite element modeling (FEM) in combination with metamodeling techniques. The authors compare three common metamodeling techniques for the special case of wheel detector orientation: response surface methodology; multivariate adaptive regression splines; and kriging.
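Of the three metamodels compared, response surface methodology is the simplest to sketch: fit a second-order polynomial to sampled detector responses by least squares (two illustrative factors; the paper's variables differ):

```python
import numpy as np

def fit_response_surface(X, y):
    """Second-order response surface in two factors:
    y ~ b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones(len(y)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

Once fitted, the cheap polynomial replaces the FEM solver inside the orientation search, which is where the reported savings in computational effort come from.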

Design/methodology/approach

After analyzing the effective parameters of a train wheel detector, an appropriate method for decreasing the system susceptibility to electromagnetic noises is presented.

Findings

The results were validated using a laboratory-based system and field tests carried out on the Iranian railway network. The results of the study suggest that FEM combined with a metamodeling technique can reduce the computational effort and processing time.

Originality/value

In this paper, a combination of FEM and metamodeling approaches is used to optimize the orientation of the railway axle counter coils, making them less susceptible to electromagnetic noise than the initial arrangement used by some signallers.

Open Access
Article
Publication date: 28 August 2021

Slawomir Koziel and Anna Pietrenko-Dabrowska

Abstract

Purpose

A novel framework for expedited antenna optimization with an iterative prediction-correction scheme is proposed. The methodology is comprehensively validated using three real-world antenna structures: narrow-band, dual-band and wideband, optimized under various design scenarios.

Design/methodology/approach

The keystone of the proposed approach is to reuse designs pre-optimized for various sets of performance specifications and to encode them into metamodels that render good initial designs, as well as an initial estimate of the antenna response sensitivities. Subsequent design refinement is realized using an iterative prediction-correction loop accommodating the discrepancies between the actual and target design specifications.
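The prediction-correction loop can be sketched as a Newton-type update: the residual between target and simulated specifications is mapped back to the design variables through the response sensitivities (toy response and Jacobian below; the paper uses full-wave electromagnetic simulations and metamodel-derived initial sensitivities):

```python
import numpy as np

def predict_correct(response, jacobian, x0, target, iters=20):
    """Iterative prediction-correction: each step corrects the design by the
    sensitivity-based image of the residual to the target specifications."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        residual = target - response(x)            # discrepancy to target specs
        x = x + np.linalg.solve(jacobian(x), residual)
    return x
```

When the initial design rendered by the metamodel is already close to the target, only a handful of such corrections (and hence simulations) are needed, which is the source of the reported cost savings.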

Findings

The presented framework is capable of yielding optimized antenna designs at the cost of just a few full-wave electromagnetic simulations. The practical importance of the iterative correction procedure has been corroborated by benchmarking against gradient-only refinement. It has been found that the incorporation of problem-specific knowledge into the optimization framework greatly facilitates parameter adjustment and improves its reliability.

Research limitations/implications

The proposed approach can be a viable tool for antenna optimization whenever a certain number of previously obtained designs are available or the designer finds the initial effort of their gathering justifiable by intended re-use of the procedure. The future work will incorporate response features technology for improving the accuracy of the initial approximation of antenna response sensitivities.

Originality/value

The proposed optimization framework has proved to be a viable tool for cost-efficient and reliable antenna optimization. To the authors' knowledge, this approach goes beyond the capabilities of available methods, especially in its efficient utilization of existing knowledge, thus enabling reliable parameter tuning over broad ranges of both operating conditions and material parameters of the structure of interest.

Details

Engineering Computations, vol. 38 no. 10
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 13 August 2019

Xiaosong Du and Leifur Leifsson

Abstract

Purpose

Model-assisted probability of detection (MAPOD) is an important approach used as part of assessing the reliability of nondestructive testing systems. The purpose of this paper is to apply the polynomial chaos-based Kriging (PCK) metamodeling method to MAPOD for the first time to enable efficient uncertainty propagation, which is currently a major bottleneck when using accurate physics-based models.

Design/methodology/approach

In this paper, the state-of-the-art Kriging, polynomial chaos expansions (PCE) and PCK are applied to "â vs a"-based MAPOD of ultrasonic testing (UT) benchmark problems. In particular, Kriging interpolation matches the observations well, while PCE captures the global trend accurately. The proposed uncertainty propagation (UP) approach for MAPOD using PCK adopts the PCE bases as the trend function of the universal Kriging model, aiming to combine the advantages of both metamodels.
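The PCK construction can be sketched in one dimension: fit a polynomial trend by least squares (standing in for the PCE bases) and krige the residuals with a Gaussian kernel (illustrative hyperparameters, not the authors' tuned model):

```python
import numpy as np

def pck_fit(X, y, degree=2, ell=0.2, nugget=1e-10):
    """Polynomial-chaos Kriging sketch: a polynomial trend as the universal
    Kriging mean, plus Gaussian-kernel interpolation of the trend residuals."""
    F = np.vander(X, degree + 1)                       # trend basis (PCE stand-in)
    beta, *_ = np.linalg.lstsq(F, y, rcond=None)
    K = np.exp(-((X[:, None] - X[None, :]) / ell) ** 2)
    alpha = np.linalg.solve(K + nugget * np.eye(len(X)), y - F @ beta)

    def predict(Xs):
        Fs = np.vander(Xs, degree + 1)
        Ks = np.exp(-((Xs[:, None] - X[None, :]) / ell) ** 2)
        return Fs @ beta + Ks @ alpha                  # trend + kriged residual
    return predict
```

The trend absorbs the global behavior (PCE's strength) while the kernel part interpolates the remaining local structure (Kriging's strength), which is the intuition behind combining the two.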

Findings

To reach a pre-set accuracy threshold, the PCK method requires 50 per cent fewer training points than the PCE method, and around one order of magnitude fewer than Kriging for the test cases considered. The relative differences on the key MAPOD metrics compared with those from the physics-based models are controlled within 1 per cent.

Originality/value

The contributions of this work are the first application of PCK metamodel for MAPOD analysis, the first comparison between PCK with the current state-of-the-art metamodels for MAPOD and new MAPOD results for the UT benchmark cases.

Details

Engineering Computations, vol. 37 no. 1
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 10 August 2010

Witold Pedrycz

Abstract

Purpose

The purpose of this paper is twofold: to show that exploiting fundamental ideas of granular computing can lead to further conceptual developments of granular metastructures, which are inherently associated with computing over large numbers of individual datasets; and to show that such processing leads to representatives of information granules and granular models in the form of metastructures and metamodels.

Design/methodology/approach

The formulation of the concept of granular metastructures is provided and presented along with some essential algorithmic developments and associated optimization strategies. The overall methodological framework is the one of granular computing, especially fuzzy sets and fuzzy sets of higher type. Given the structural facet of optimization, the paper stresses the relevance of the use of evolutionary optimization.

Findings

This paper focused on the underlying concepts; while it elaborated on some development aspects and optimization tools, further refinement and a thorough exploitation of optimization techniques applied to the inherently combinatorial facet of the problem remain to be pursued in detail.

Practical implications

The introduced approach and algorithms could be of interest when solving problems of granular metastructures, in particular those encountered in knowledge‐based systems.

Originality/value

The main aspects of originality concern a formulation of the concept of granular metastructures and their design, based on granular evidence (experimental data) of lower type. A constructive way of forming type‐2 fuzzy sets via the principle of justifiable granularity exhibits a significant level of originality and offers a general way of designing information granules.

Details

Kybernetes, vol. 39 no. 7
Type: Research Article
ISSN: 0368-492X
