Search results

1 – 10 of over 2000
Article
Publication date: 3 November 2014

Huchang Liao, Zeshui Xu and Jiuping Xu

Abstract

Purpose

The purpose of this paper is to develop some weight determining methods for hesitant fuzzy multi-criteria decision making (MCDM) in which the preference information on attributes is collected over different periods.

Design/methodology/approach

Based on the proposed weight determining methods and dynamic hesitant fuzzy aggregation operators, an approach is developed to solve the hesitant fuzzy multi-stage multi-attribute decision-making problem in which all the preference information on attributes over different periods is represented as hesitant fuzzy values.

Findings

In order to determine the weights associated with dynamic hesitant fuzzy operators, the authors propose the improved maximum entropy method and the minimum average deviation method.
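
As an illustration of the general idea only (not the authors' improved method), the hedged sketch below determines period weights by maximizing the Shannon entropy of the weight vector under a prescribed "time degree" constraint, solved with SciPy; the constraint form, the parameter lam and the function name are assumptions made for this example.

```python
# Minimal sketch of maximum-entropy period-weight determination (not the paper's
# "improved" method): maximize -sum(w_t ln w_t) subject to sum(w_t) = 1 and a
# prescribed "time degree" lam (an illustrative assumption).
import numpy as np
from scipy.optimize import minimize

def max_entropy_weights(p, lam=0.3):
    """Weights for p periods with an assumed time degree lam in [0, 1]."""
    t = np.arange(1, p + 1)
    time_degree = (p - t) / (p - 1)           # 1 for the first period, 0 for the last

    def neg_entropy(w):
        w = np.clip(w, 1e-12, None)
        return np.sum(w * np.log(w))          # minimizing this maximizes entropy

    cons = [
        {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},
        {"type": "eq", "fun": lambda w: time_degree @ w - lam},
    ]
    w0 = np.full(p, 1.0 / p)
    res = minimize(neg_entropy, w0, bounds=[(0, 1)] * p, constraints=cons)
    return res.x

print(max_entropy_weights(4, lam=0.3))        # lam < 0.5 shifts weight toward later periods
```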

Research limitations/implications

This paper does not consider the multi-stage multi-criteria group decision-making problem.

Practical implications

An example concerning the evaluation of rangelands is given to illustrate the validity and efficiency of the proposed approach. It should be noted that the proposed approach can also be applied to other multi-stage MCDM problems.

Originality/value

The concept of the hesitant fuzzy variable (HFV) is defined, and some operational laws and properties of HFVs are given. Moreover, to fuse multi-stage hesitant fuzzy information, the aggregation operators of hesitant fuzzy sets are extended to those of HFVs.

Article
Publication date: 8 July 2022

Da Teng, Yun-Wen Feng, Jun-Yu Chen and Cheng Lu

Abstract

Purpose

The purpose of this paper is to briefly summarize and review the theories and methods of complex structures’ dynamic reliability. Complex structures are usually assembled from multiple components and subjected to time-varying aerodynamic, structural, thermal and other physical loads; their reliability analysis is of great significance for ensuring the safe operation of large-scale equipment in fields such as aviation and machinery.

Design/methodology/approach

In this paper, the methods for single-objective dynamic reliability analysis of complex structures are categorized into the Monte Carlo (MC), outcrossing rate, envelope function and extreme value methods. The series-parallel and expansion methods, multi-extremum surrogate models and decomposed-coordinated surrogate models are summarized for the multiobjective dynamic reliability analysis of complex structures.
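
To make the Monte Carlo extreme-value idea concrete, the hedged sketch below estimates a time-variant failure probability for an assumed, illustrative limit-state function g(X, t): the structure fails if g drops below zero anywhere on a discretized time grid, i.e. if the minimum of g over time is negative.

```python
# Minimal Monte Carlo sketch of dynamic (time-variant) reliability analysis:
# P_f = P( min over t of g(X, t) < 0 ), with an assumed illustrative limit state.
import numpy as np

rng = np.random.default_rng(0)

def g(x, t):
    """Hypothetical limit state: degrading resistance minus a fluctuating load effect."""
    r0, s = x[:, 0], x[:, 1]
    return r0 * (1.0 - 0.05 * t) - s * (1.0 + 0.2 * np.sin(2.0 * np.pi * t))

n_samples = 100_000
time_grid = np.linspace(0.0, 5.0, 101)
x = np.column_stack([
    rng.normal(10.0, 1.0, n_samples),   # initial resistance R0
    rng.normal(4.0, 0.8, n_samples),    # load effect S
])

g_over_time = np.array([g(x, t) for t in time_grid])   # shape (n_time, n_samples)
g_extreme = g_over_time.min(axis=0)                    # extreme-value response per sample
pf = np.mean(g_extreme < 0.0)                          # fraction of failed samples
print(f"Estimated failure probability: {pf:.4e}")
```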

Findings

A numerical complex compound function and a turbine blisk are used as examples to illustrate the performance of the single-objective and multiobjective dynamic reliability analysis methods. The future development directions of dynamic reliability analysis of complex structures are then outlined.

Originality/value

The paper provides a useful reference for further theoretical research and engineering application.

Details

International Journal of Structural Integrity, vol. 13 no. 5
Type: Research Article
ISSN: 1757-9864

Article
Publication date: 28 February 2023

Jinsheng Wang, Zhiyang Cao, Guoji Xu, Jian Yang and Ahsan Kareem

Abstract

Purpose

Assessing the failure probability of engineering structures is still a challenging task in the presence of various uncertainties due to the involvement of expensive-to-evaluate computational models. The traditional simulation-based approaches require tremendous computational effort, especially when the failure probability is small. Thus, the use of more efficient surrogate modeling techniques to emulate the true performance function has gained increasing attention and application in recent years. In this paper, an active learning method based on a Kriging model is proposed to estimate the failure probability with high efficiency and accuracy.

Design/methodology/approach

To effectively identify informative samples for the enrichment of the design of experiments, a set of new learning functions is proposed. These learning functions are successfully incorporated into a sampling scheme, where the candidate samples for the enrichment are uniformly distributed in the n-dimensional hypersphere with an iteratively updated radius. To further improve the computational efficiency, a parallelization strategy that enables the proposed algorithm to select multiple sample points in each iteration is presented by introducing the K-means clustering algorithm. Hence, the proposed method is referred to as the adaptive Kriging method based on K-means clustering and sampling in n-Ball (AK-KBn).
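
The batch selection idea can be sketched as follows (a hedged illustration, not the authors' AK-KBn implementation): candidates are drawn uniformly from an n-ball, scored with a learning function (the widely used U function stands in for the paper's new learning functions), and spread out by K-means clustering so that one informative point per cluster is added to the design of experiments; a fitted scikit-learn GaussianProcessRegressor stands in for the Kriging model.

```python
# Hedged sketch of one enrichment step in an AK-KBn-style loop (illustrative only):
# uniform candidates in an n-ball, U-function scoring (a stand-in for the paper's
# learning functions) and K-means clustering to pick several points per iteration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor

def sample_n_ball(n_samples, dim, radius, center, rng):
    """Uniform samples inside an n-dimensional ball of given radius and center."""
    directions = rng.normal(size=(n_samples, dim))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    radii = radius * rng.uniform(size=(n_samples, 1)) ** (1.0 / dim)
    return center + radii * directions

def select_batch(gp: GaussianProcessRegressor, candidates, k):
    """Score candidates with the U function and pick one point per K-means cluster."""
    mu, sigma = gp.predict(candidates, return_std=True)
    u = np.abs(mu) / np.maximum(sigma, 1e-12)       # small U = close to g = 0 and uncertain
    best = candidates[np.argsort(u)[: 10 * k]]      # keep the most informative candidates
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(best)
    return np.array([best[labels == j][0] for j in range(k)])  # best-ranked point per cluster
```

Each selected batch would then be evaluated on the true performance function and the surrogate refit before the next iteration.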

Findings

The performance of AK-KBn is evaluated through several numerical examples. According to the generated results, all the proposed learning functions are capable of guiding the search toward sample points close to the limit state surface (LSS) in the critical region and result in a converged Kriging model that perfectly matches the true one in the regions of interest. The AK-KBn method is demonstrated to be well suited for structural reliability analysis, and very good performance is observed in the investigated examples.

Originality/value

In this study, the statistical information of the Kriging prediction, the relative contribution of the sample points to the failure probability and the distances between the candidate samples and the existing ones are all integrated into the proposed learning functions, which enables effective selection of informative samples for updating the Kriging model. Moreover, the number of required iterations is reduced by introducing the parallel computing strategy, which can dramatically alleviate the computational cost when time-demanding numerical models are involved in the analysis.

Details

Engineering Computations, vol. 40 no. 2
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 31 January 2020

Sandhya Kumari Teku, Koteswara Rao Sanagapallea and Santi Prabha Inty

Abstract

Purpose

Integrating complementary information with high-quality visual perception is essential in infrared and visible image fusion. Contrast-enhanced fusion, required for target detection in military, navigation and surveillance applications where visible images are captured under low-light conditions, is a challenging task. This paper aims to focus on the enhancement of poorly illuminated low-light images through decomposition prior to fusion, to provide high visual quality.

Design/methodology/approach

In this paper, a two-step process is implemented to improve the visual quality. First, the low-light visible image is decomposed into dark and bright image components. The decomposition is accomplished based on the selection of a threshold using Renyi’s entropy maximization, and the decomposed dark and bright images are intensified with the stochastic resonance (SR) model. Second, a texture-information-based weighted average scheme for the low-frequency coefficients and a select-maximum precept for the high-frequency coefficients are used in the discrete wavelet transform (DWT) domain.
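
A hedged sketch of the fusion step is given below, assuming PyWavelets and two registered, pre-enhanced grayscale images as NumPy float arrays; the Renyi-entropy-based decomposition and SR enhancement of the first step are omitted, and a plain average stands in for the texture-information-based weighted average of the low-frequency coefficients.

```python
# Minimal DWT fusion sketch (illustrative): average the low-frequency (approximation)
# coefficients and select the maximum-magnitude high-frequency (detail) coefficients.
import numpy as np
import pywt

def dwt_fuse(img_vis, img_ir, wavelet="db1"):
    cA1, details1 = pywt.dwt2(img_vis, wavelet)
    cA2, details2 = pywt.dwt2(img_ir, wavelet)

    cA_fused = 0.5 * (cA1 + cA2)                        # low frequency: unweighted average
    details_fused = tuple(
        np.where(np.abs(d1) >= np.abs(d2), d1, d2)      # high frequency: select maximum
        for d1, d2 in zip(details1, details2)
    )
    return pywt.idwt2((cA_fused, details_fused), wavelet)
```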

Findings

Simulations in MATLAB were carried out on various test images. The qualitative and quantitative evaluations of the proposed method show improvement in edge-based and information-based metrics compared to several existing fusion techniques.

Originality/value

A high-contrast, edge-preserved and brightness-improved fused image with good visual quality is obtained through the processing steps considered in this work.

Details

World Journal of Engineering, vol. 17 no. 1
Type: Research Article
ISSN: 1708-5284

Article
Publication date: 10 January 2023

Jianhua Zhu, Luxin Wan, Huijuan Zhao, Longzhen Yu and Siyu Xiao

Abstract

Purpose

The purpose of this paper is to provide scientific guidance for the integration of industrialization and informatization (TIOII). In recent years, TIOII has promoted the development of intelligent manufacturing in China. However, many enterprises invest blindly in TIOII, which affects their normal production and operation.

Design/methodology/approach

This study establishes an efficiency evaluation model for TIOII. In this paper, an entropy analytic hierarchy process (AHP) constraint cone and cross-efficiency are added to the traditional data envelopment analysis (DEA) model, and an entropy AHP–cross-efficiency DEA model is proposed. Then, a statistical analysis of the integration efficiency of enterprises in Guangzhou is carried out using cross-sectional data, and the traditional DEA model and the entropy AHP–cross-efficiency DEA model are used to analyze the integration efficiency of these enterprises.
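
For readers unfamiliar with the underlying DEA machinery, the hedged sketch below computes standard input-oriented CCR efficiency scores by solving one linear program per decision-making unit (DMU) with SciPy; the entropy AHP constraint cone and cross-efficiency extensions described above are not modelled, and the data shapes are illustrative assumptions.

```python
# Minimal input-oriented CCR DEA sketch (multiplier form), solved as one linear
# program per DMU. The entropy AHP constraint cone and cross-efficiency used in
# the paper are not included here.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiencies(X, Y):
    """X: (n_dmu, n_inputs) input matrix, Y: (n_dmu, n_outputs) output matrix."""
    n_dmu, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n_dmu):
        c = np.concatenate([-Y[o], np.zeros(m)])          # maximize u . y_o
        A_ub = np.hstack([Y, -X])                         # u . y_j - v . x_j <= 0 for all j
        b_ub = np.zeros(n_dmu)
        A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v . x_o = 1 (normalization)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        scores.append(-res.fun)                           # efficiency of DMU o in (0, 1]
    return np.array(scores)
```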

Findings

The data show that the efficiency of enterprise integration is at a medium level in Guangzhou. The efficiency of enterprise integration has no significant relationship with enterprise size and production type but has a low negative correlation with the development level of enterprise integration. In addition, the improved DEA model can better reflect the real integration efficiency of enterprises and obtain complete ranking results.

Originality/value

By adding the entropy AHP constraint cone and cross-efficiency, the traditional DEA model is improved. The improved DEA model can better reflect the real efficiency of TIOII and obtain complete ranking results.

Details

Chinese Management Studies, vol. 18 no. 1
Type: Research Article
ISSN: 1750-614X

Article
Publication date: 2 August 2021

Modupeola Dada, Patricia Popoola and Ntombi Mathe

Abstract

Purpose

This study aims to review recent advancements in high entropy alloys (HEAs), also called high entropy materials, including high entropy superalloys, which are current potential alternatives to nickel superalloys for gas turbine applications. Laser surface modification techniques for HEAs are discussed, whilst future recommendations and remedies for laser-based manufacturing challenges are outlined.

Design/methodology/approach

Materials used for high-pressure gas turbine engine applications must be able to withstand severe environmentally induced degradation, mechanical and thermal loads and, in general, extreme conditions caused by hot corrosive gases, high-temperature oxidation and stress. Over the years, nickel-based superalloys, with their elevated-temperature rupture and creep resistance, excellent lifetime expectancy, solution strengthening and L12 γ′ precipitates, have been used for turbine engine applications. However, the superalloys’ density, low creep strength, poor thermal conductivity, difficulty in machining and low fatigue resistance demand the innovation of new advanced materials.

Findings

HEAs are among the most frequently investigated advanced materials, attributed to their configurational complexity and to properties reported to exceed those of conventional materials. Thus, owing to their characteristic high entropy effect, several other materials have emerged as potential solutions for functional and structural applications in the aerospace industry. Previous research shows that defects are associated with conventional manufacturing processes of HEAs; therefore, this study investigates new advances in the laser-based manufacturing and surface modification techniques of HEAs.

Research limitations/implications

The AlxCoCrCuFeNi HEA system, particularly the Al0.5CoCrCuFeNi HEA, has been extensively studied, attributed to its mechanical and physical properties exceeding those of pure metals for aerospace turbine engine applications. The advances in the fabrication and surface modification processes of this alloy are outlined to show the latest developments, focusing only on laser-based manufacturing processing due to its many advantages.

Originality/value

It is evident that high entropy materials are a potential innovative alternative to conventional superalloys for turbine engine applications via laser additive manufacturing.

Details

World Journal of Engineering, vol. 20 no. 1
Type: Research Article
ISSN: 1708-5284

Article
Publication date: 3 April 2018

Jing Quan, Bo Zeng and LuYun Wang

Abstract

Purpose

Equally weighted factors and initial data from behavioural sequences are used for calculating the degree of grey incidence in Deng’s grey incidence analysis. However, certain grey information cannot be directly obtained, and the correlation coefficients of each sequence at different times are of different importance to the system. The purpose of this paper is to propose an improved grey incidence model with new grey incidence coefficients and a weighted degree of grey incidence. Some grey information can be obtained more easily by using the grey transformation sequences, and the maximum entropy method is used to calculate the weights of the new grey incidence coefficients, so the new degree of grey incidence can be distinguished more effectively by the proposed model.

Design/methodology/approach

New grey incidence coefficients are defined using transformation sequences of the initial data. To overcome the shortcomings arising from the use of equal weights, the maximum entropy method is proposed for determining the weights of the grey incidence coefficients. The resulting model optimises the classical models and evaluates the influencing factors more effectively. The effectiveness of the model was verified by a numerical example. Furthermore, the model was used for analysing the main influencing factors of the tertiary industry in China.
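
For context, the classical (equally weighted) Deng degree of grey incidence that the proposed model improves upon can be computed as in the hedged sketch below; the transformation sequences and maximum-entropy weights of the new model are not reproduced, and the example sequences are illustrative.

```python
# Minimal sketch of Deng's classical degree of grey incidence with equal weights
# (the baseline that the improved model replaces with transformed sequences and
# maximum-entropy weights).
import numpy as np

def deng_grey_incidence(x0, xi, rho=0.5):
    """x0: reference sequence, xi: comparison sequence (1-D, equal length)."""
    x0 = np.asarray(x0, dtype=float)
    xi = np.asarray(xi, dtype=float)
    x0, xi = x0 / x0[0], xi / xi[0]                          # initial-value normalisation
    delta = np.abs(x0 - xi)                                  # pointwise absolute differences
    d_min, d_max = delta.min(), delta.max()
    coeffs = (d_min + rho * d_max) / (delta + rho * d_max)   # grey incidence coefficients
    return coeffs.mean()                                     # equally weighted degree

print(deng_grey_incidence([10, 12, 15, 18], [5.0, 6.2, 7.4, 9.1]))
```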

Findings

The proposed model optimises the classical models, and the application example shows that urbanisation has the greatest effect on employment in the tertiary sector.

Originality/value

An improved grey incidence model is proposed that improves the grey incidence coefficients and their weights, and has better performance than the classical models. The model was successfully used in the analysis of the influencing factors of the tertiary industry in China. The results indicate that the model can reflect the significance of incidence coefficients at different time points; therefore, their fluctuation can be effectively controlled.

Details

Grey Systems: Theory and Application, vol. 8 no. 2
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 24 September 2021

Xue Deng and Yingxian Lin

Abstract

Purpose

The weighted evaluation function method with normalized objective functions is used to transform the proposed multi-objective model into a single objective one, which reflects the investors' preference for returns, risks and social responsibility by adjusting the weights. Finally, an example is given to illustrate the solution steps of the model and the effectiveness of the algorithm.
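
The weighted evaluation function idea can be sketched as follows; this is a hedged illustration with made-up objective values and preference weights, not the paper's full mean-variance-Yager entropy-social responsibility model: each objective is min-max normalized and the normalized objectives are combined into a single score, with risk flipped so that the combined score is maximized.

```python
# Hedged sketch of a weighted evaluation function: min-max normalize each objective
# and combine them with investor preference weights into one score. The objective
# values per candidate portfolio below are illustrative placeholders.
import numpy as np

# Rows: candidate portfolios; columns: return, risk, Yager entropy, social responsibility.
objectives = np.array([
    [0.12, 0.08, 0.85, 0.60],
    [0.10, 0.05, 0.90, 0.70],
    [0.15, 0.12, 0.70, 0.55],
])
maximize = np.array([True, False, True, True])    # risk is to be minimized
weights = np.array([0.4, 0.3, 0.1, 0.2])          # investor preferences, summing to 1

lo, hi = objectives.min(axis=0), objectives.max(axis=0)
norm = (objectives - lo) / (hi - lo)              # min-max normalization per objective
norm[:, ~maximize] = 1.0 - norm[:, ~maximize]     # flip the minimized objectives

scores = norm @ weights                           # single weighted evaluation value
print("Best candidate portfolio:", int(np.argmax(scores)))
```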

Design/methodology/approach

Based on possibility theory, and assuming that the future returns of each asset are trapezoidal fuzzy numbers, a mean-variance-Yager entropy-social responsibility model is constructed including piecewise linear transaction costs and risk-free assets. The model proposed in this paper includes six constraints: the investment proportion sum, proportion non-negativity, ceiling and floor, pre-assignment, cardinality and round lot constraints. In addition, considering the special round lot constraint, the proposed model is transformed into an integer programming problem.

Findings

The effects of different constraints and transaction costs on the efficient frontier of the portfolio are analyzed, which not only assists investors in making decisions close to their expectations by setting appropriate parameters but also provides constructive suggestions through the overall performance of each asset.

Originality/value

There are two improvements in the improved particle swarm optimization algorithm: one is that the complex constraints are satisfied by using a renewable 0–1 random constraint matrix and random scaling factors instead of fixed ones; the other is that particles with poor fitness are eliminated and new particles satisfying all the constraints are randomly added, so as to achieve the goal of global search as far as possible.

Book part
Publication date: 23 June 2016

Amos Golan and Robin L. Lumsdaine

Abstract

Although in principle prior information can significantly improve inference, incorporating incorrect prior information will bias the estimates of any inferential analysis. This fact deters many scientists from incorporating prior information into their inferential analyses. In the natural sciences, where experiments are conducted more regularly and can be combined with other relevant information, prior information is often used in inferential analysis, despite it sometimes being nontrivial to specify what that information is and how to quantify it. In the social sciences, however, prior information is often hard to come by and very hard to justify or validate. We review a number of ways to construct such information. This information emerges naturally, either from fundamental properties and characteristics of the systems studied or from logical reasoning about the problems being analyzed. Borrowing from concepts and philosophical reasoning used in the natural sciences, and within an info-metrics framework, we discuss three different, yet complementary, approaches for constructing prior information, with an application to the social sciences.

Details

Essays in Honor of Aman Ullah
Type: Book
ISBN: 978-1-78560-786-8

Article
Publication date: 19 May 2020

Praveen Kumar Gopagoni and Mohan Rao S K

Abstract

Purpose

Association rule mining generates patterns and correlations from the database, which requires long scanning times, and the computational cost associated with generating the rules is quite high. Moreover, the candidate rules generated using traditional association rule mining pose a huge challenge in terms of time and space, and the process is lengthy. In order to tackle the issues of the existing methods and to provide privacy-preserving rules, this paper proposes grid-based privacy association rule mining.

Design/methodology/approach

The primary intention of the research is to design and develop a distributed elephant herding optimization (EHO) algorithm for grid-based privacy association rule mining from the database. The proposed rule generation method proceeds in two steps. In the first step, the rules are generated using the apriori algorithm, an effective association rule mining algorithm. In general, the extraction of association rules from the input database is based on support and confidence, which are here replaced with new measures, namely probability-based confidence and holo-entropy; thus, in the proposed model, the extraction of the association rules is based on probability-based confidence and holo-entropy. In the second step, the generated rules are given to the grid-based privacy rule mining, which produces privacy-dependent rules based on a novel optimization algorithm and a grid-based fitness. The novel optimization algorithm is developed by integrating the distributed concept into the EHO algorithm.
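
As a hedged illustration of the first step only, the sketch below mines pairwise rules from a toy transaction list using the classical support and confidence measures (which the paper replaces with probability-based confidence and holo-entropy); the grid-based privacy step driven by EHO is not reproduced.

```python
# Minimal apriori-style sketch for pairwise rules with classical support and
# confidence; the paper's probability-based confidence, holo-entropy and
# grid-based privacy/EHO step are not shown.
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c"}, {"a", "b", "c"}]
min_support, min_confidence = 0.4, 0.7
n = len(transactions)

def support(itemset):
    return sum(itemset <= t for t in transactions) / n       # fraction of containing transactions

items = {i for t in transactions for i in t}
frequent_items = {i for i in items if support({i}) >= min_support}

for x, y in combinations(sorted(frequent_items), 2):
    pair_support = support({x, y})
    if pair_support < min_support:
        continue
    for antecedent, consequent in ((x, y), (y, x)):
        confidence = pair_support / support({antecedent})
        if confidence >= min_confidence:
            print(f"{antecedent} -> {consequent}  support={pair_support:.2f}  confidence={confidence:.2f}")
```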

Findings

The method is experimentally evaluated on databases taken from the Frequent Itemset Mining Dataset Repository, namely the retail, chess, T10I4D100K and T40I10D100K databases, to prove the effectiveness of the distributed grid-based privacy association rule mining. The proposed method outperformed the existing methods by offering a higher degree of privacy and utility; moreover, it is noted that the distributed nature of the association rule mining facilitates parallel processing and generates the privacy rules without much computational burden. The rate of hiding capacity, the rate of information preservation and the rate of false rules generated for the proposed method are found to be 0.4468, 0.4488 and 0.0654, respectively, which is better than the existing rule mining methods.

Originality/value

Data mining is performed in a distributed manner through grids that subdivide the input data, and the rules are framed using apriori-based association mining, a modification of the standard apriori algorithm in which holo-entropy and probability-based confidence replace support and confidence. The mined rules do not assure privacy; hence, grid-based privacy rules are employed, utilizing the adaptive elephant herding optimization (AEHO) for generating the privacy rules. The AEHO inherits the adaptive nature of the standard EHO, which yields the global optimal solution.

Details

Data Technologies and Applications, vol. 54 no. 3
Type: Research Article
ISSN: 2514-9288
