Search results
Jing Ye and Yaoguo Dang
Abstract
Purpose
Evaluation objects are becoming increasingly complicated, and interval grey numbers can express them more accurately. However, the information distribution of interval grey numbers is unbalanced. The purpose of this paper is to introduce the central-point triangular whitenization weight function to handle the clustering of such numbers.
Design/methodology/approach
A new expression of the central-point triangular whitenization weight function is presented for the grey clustering problem based on interval grey numbers. By establishing the integral mean value function on the set of interval grey numbers, the application range of the grey clustering model is extended to interval grey numbers, yielding a grey fixed-weight clustering model based on interval grey numbers.
Findings
The model is verified by a case study, which demonstrates high distinguishability, validity and practicability.
Practical implications
This model can be used in many fields, such as agriculture, economics, geology and medical science, and provides a feasible method for evaluation tasks such as performance evaluation, scheme selection and risk evaluation.
Originality/value
The central-point triangular whitenization weight function is introduced. The method reflects the principle of "making full use of information" in grey system theory, further enriches the system of grey clustering theory and expands the application scope of the grey clustering method.
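The clustering mechanics sketched in the abstract can be illustrated as follows. This is a minimal sketch under assumptions: the centre points, index weights and data values are invented for illustration and are not taken from the paper, and the interval extension simply takes the integral mean of the weight function over the interval grey number.

```python
# Sketch of grey fixed-weight clustering with a centre-point triangular
# whitenization weight function, extended to interval grey numbers via an
# integral mean. All numeric values below are illustrative assumptions.

def triangular_wwf(x, left, centre, right):
    """Centre-point triangular whitenization weight function peaking at `centre`."""
    if x <= left or x >= right:
        return 0.0
    if x <= centre:
        return (x - left) / (centre - left)
    return (right - x) / (right - centre)

def interval_wwf(a, b, left, centre, right, n=1000):
    """Integral mean of the weight function over the interval grey number [a, b]."""
    xs = (a + (b - a) * (i + 0.5) / n for i in range(n))  # midpoint rule
    return sum(triangular_wwf(x, left, centre, right) for x in xs) / n

def cluster_coefficient(values, weights, left, centre, right):
    """Fixed-weight clustering coefficient of one object for one grey class."""
    return sum(w * triangular_wwf(v, left, centre, right)
               for v, w in zip(values, weights))

# One object scored on three indexes against one grey class
values  = [0.4, 0.6, 0.5]    # crisp index values (assumed)
weights = [0.3, 0.3, 0.4]    # fixed index weights summing to 1 (assumed)
sigma   = cluster_coefficient(values, weights, 0.0, 0.5, 1.0)
```

An object is assigned to the grey class for which its clustering coefficient is largest; the interval variant replaces `triangular_wwf` with `interval_wwf` when an index value is an interval grey number.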
Sifeng Liu, Yingjie Yang, Naiming Xie and Jeffrey Forrest
Abstract
Purpose
The purpose of this paper is to summarize the progress in grey system research during 2000-2015, so as to present some important new concepts, models, methods and a new framework of grey system theory.
Design/methodology/approach
The new thinking, new models and new methods of grey system theory, together with their applications, are presented in this paper. They include: the algorithm rules of grey numbers based on the "kernel" and the degree of greyness of grey numbers, the concept of general grey numbers, and the synthesis axiom for the degree of greyness of grey numbers and their operations; the general form of buffer operators among grey sequence operators; the four basic forms of the grey model GM(1,1) (even GM, original difference GM, even difference GM and discrete GM), the sequence type suited to each basic form, and the suitable ranges of the most-used grey forecasting models; the similarity degree, closeness degree and three-dimensional absolute degree of grey incidence among grey incidence analysis models; the grey cluster model based on mixed centre-point and end-point triangular whitenization functions; the multi-attribute intelligent grey target decision model and the two-stage decision model with a grey synthetic measure among grey decision models; grey game models and grey input-output models among grey combined models; and the robust stability of grey stochastic time-delay systems of neutral, distributed-delay and neutral distributed-delay types in grey control. The new framework of grey system theory is given as well.
Findings
The problems that remain for further study are discussed at the end of each section. This paper gives the reader a general picture of the research on, and development trends of, grey system theory.
Practical implications
A lot of successful practical applications of the new models to solve various problems have been found in many different areas of natural science, social science and engineering, including spaceflight, civil aviation, information, metallurgy, machinery, petroleum, chemical industry, electrical power, electronics, light industries, energy resources, transportation, medicine, health, agriculture, forestry, geography, hydrology, seismology, meteorology, environment protection, architecture, behavioral science, management science, law, education, military science, etc. These practical applications have brought forward definite and noticeable social and economic benefits. It demonstrates a wide range of applicability of grey system theory, especially in the situation where the available information is incomplete and the collected data are inaccurate.
Originality/value
The reader is given a general picture of grey systems theory as a new model system and a new framework for studying problems where partial information is known; especially for uncertain systems with few data points and poor information. The problems remaining for further studying are identified at the end of each section.
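Of the GM(1,1) forms surveyed above, the even GM(1,1) is the most widely used. A minimal sketch of fitting and forecasting with it follows; the data series is an assumed near-exponential example, not from the paper.

```python
# Sketch of the even GM(1,1) grey forecasting model: accumulate the series
# (1-AGO), fit x0(k) + a*z1(k) = b by least squares on the background
# values z1, then forecast via the exponential time response. The data
# series below is an illustrative assumption.
from math import exp

def gm11(x0, steps=1):
    """Fit even GM(1,1) to sequence x0 and forecast `steps` values ahead."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                 # 1-AGO
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]    # background values
    m = n - 1
    # Normal equations for theta = [a, b] with rows [-z1(k), 1] and targets x0(k)
    szz = sum(z * z for z in z1)
    sz  = sum(z1)
    szy = sum(z * y for z, y in zip(z1, x0[1:]))
    sy  = sum(x0[1:])
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):   # time response of the accumulated series, 0-based k
        return (x0[0] - b / a) * exp(-a * k) + b / a

    # Inverse AGO recovers fitted/forecast values of the original series
    fitted = [x0[0]] + [x1_hat(k) - x1_hat(k - 1) for k in range(1, n + steps)]
    return a, b, fitted

a, b, fitted = gm11([100.0, 110.0, 121.0, 133.1], steps=1)   # ~10% growth
```

A negative development coefficient `a` indicates a growing series; the last entry of `fitted` is the one-step-ahead forecast.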
Kejia Chen, Ping Chen, Lixi Yang and Lian Jin
Abstract
Purpose
The purpose of this paper is to propose a grey clustering evaluation model based on analytic hierarchy process (AHP) and interval grey number (IGN) to solve the clustering evaluation problem with IGNs.
Design/methodology/approach
First, the centre-point triangular whitenisation weight function with real numbers is built, and then by using interval mean function, the whitenisation weight function is extended to IGNs. The weights of evaluation indexes are determined by AHP. Finally, this model is used to evaluate the flight safety of a Chinese airline. The results indicate that the model is effective and reasonable.
Findings
When the IGN meets certain conditions, the centre-point triangular whitenisation weight function based on IGNs avoids multiple crossings and satisfies normality. This provides a standard and basis for obtaining effective evaluation indexes and determining the grey classes scientifically.
Originality/value
The traditional grey clustering model is extended to the field of IGNs. The model makes full use of all the information in the IGNs, so the evaluation result is more objective and reasonable, which provides support for solving practical problems.
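The AHP step that weights the evaluation indexes can be sketched with the common geometric-mean approximation of the priority vector. The Saaty-scale pairwise comparison matrix below is an illustrative assumption, not the paper's flight-safety data.

```python
# Sketch of the AHP weighting step: approximate the priority vector of a
# reciprocal pairwise comparison matrix by the geometric-mean method.
# The comparison matrix below is an illustrative assumption.
from math import prod

def ahp_weights(pairwise):
    """Approximate AHP priority weights via row geometric means."""
    n = len(pairwise)
    geo = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

# Three hypothetical evaluation indexes compared on the 1-9 Saaty scale
M = [[1.0,   3.0,   5.0],
     [1 / 3, 1.0,   3.0],
     [1 / 5, 1 / 3, 1.0]]
w = ahp_weights(M)   # normalized index weights, most important first here
```

These weights would then play the role of the fixed index weights in the grey clustering coefficient.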
Abstract
Gives introductory remarks about chapter 1 of this group of 31 papers, from the ISEF 1999 Proceedings, on methodologies for field analysis in the electromagnetic community. Observes that computer package implementation theory contributes to clarification. Discusses the areas covered by some of the papers, such as artificial intelligence using fuzzy logic. Includes applications such as permanent magnets and looks at eddy current problems. States that the finite element method is currently the most popular method used for field computation. Closes by pointing out the amalgam of topics.
Fangqi Hong, Pengfei Wei and Michael Beer
Abstract
Purpose
Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and alternative acquisition functions, such as the Posterior Variance Contribution (PVC) function, have been developed for adaptive experimental design of the integration points. However, these sequential design strategies prevent BC from being implemented in a parallel scheme. Therefore, this paper aims to develop a parallelized adaptive BC method to further improve computational efficiency.
Design/methodology/approach
By theoretically examining the multimodal behavior of the PVC function, it is concluded that the multiple local maxima all contribute importantly to the integration accuracy and can be selected as design points, which provides a practical way to parallelize adaptive BC. Inspired by this finding, four multimodal optimization algorithms, including one newly developed in this work, are introduced to find multiple local maxima of the PVC function in a single run and thereby to implement adaptive BC in parallel.
Findings
The superiority of the parallel schemes and the performance of the four multimodal optimization algorithms are then demonstrated and compared with the k-means clustering method by using two numerical benchmarks and two engineering examples.
Originality/value
The multimodal behavior of the acquisition function for BC is comprehensively investigated. All local maxima of the acquisition function contribute to the accuracy of adaptive BC. Parallelization of adaptive BC is realized with four multimodal optimization methods.
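The parallelization idea, namely harvesting every local maximum of the acquisition function in one pass and evaluating them as a batch of design points, can be illustrated on a toy stand-in. The acquisition function and the grid-based peak detection below are assumptions for illustration; the paper's PVC function and its four multimodal optimizers are not reproduced here.

```python
# Sketch of batch selection of design points: find all interior local
# maxima of a multimodal acquisition function in one pass, then use the
# whole batch of peaks as new integration points. The function below is
# a toy stand-in for the PVC function, not the paper's model.
from math import sin, pi

def acquisition(x):
    """Toy multimodal acquisition function on [0, 1] with five peaks."""
    return sin(5 * pi * x) ** 2 * (1.2 - x)

def local_maxima(f, n_grid=2001, tol=1e-12):
    """Grid-based detection of strict interior local maxima of f on [0, 1]."""
    xs = [i / (n_grid - 1) for i in range(n_grid)]
    ys = [f(x) for x in xs]
    return [xs[i] for i in range(1, n_grid - 1)
            if ys[i] > ys[i - 1] + tol and ys[i] > ys[i + 1] + tol]

batch = local_maxima(acquisition)   # all peaks become design points at once
```

In the sequential scheme only the single global maximizer would be added per iteration; here the whole batch can be evaluated in parallel before the model is updated.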
Abstract
Introduces papers from this area of expertise from the ISEF 1999 Proceedings. States that the goal is to identify devices or systems able to provide prescribed performance. Notes that 18 papers from the Symposium are grouped in the area of automated optimal design. Describes the main challenges that condition the future development of computational electromagnetics. Concludes by itemizing the range of applications in this third chapter, from small actuators to the optimization of induction heating systems.
Abstract
Purpose
The purpose of this paper is to develop a multi-criterion group decision-making (MCGDM) method by combining the regret theory and the Choquet integral under 2-tuple linguistic environment and apply the proposed method to deal with the supplier selection problem.
Design/methodology/approach
When making a decision, the decision-maker prefers the alternative(s) favored by the experts, so as to avoid regret. At the same time, the correlative relationships among the criteria can be described by fuzzy measures, and the evaluations over a group of criteria can then be aggregated by means of the Choquet integral. Hence, the authors address MCGDM problems by combining regret theory and the Choquet integral, where the fuzzy measures of the criteria are partly known or completely unknown and the evaluations are expressed as 2-tuples. The vertical and horizontal regret-rejoice functions are defined first. Then a model for determining the missing fuzzy measures is constructed, on which basis an MCGDM method is proposed. The proposed method is applied to a practical decision-making problem to verify its feasibility and effectiveness.
Findings
The vertical and horizontal regret-rejoice functions are defined. The relationships of the fuzzy measures are expressed as sets. A model is built for determining the fuzzy measures, on which basis an MCGDM method is proposed. The results show that the proposed method can solve MCGDM problems in the 2-tuple linguistic context, where the decision-maker avoids regret and the criteria are correlated.
Originality/value
The paper proposes an MCGDM method by combining the regret theory and the Choquet integral, which is suitable for dealing with a variety of decision-making problems.
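The aggregation step can be sketched with the standard discrete Choquet integral over a fuzzy measure. The two-criterion measure below is an illustrative assumption; the paper's model for learning missing measures, and its regret-rejoice functions, are not reproduced here.

```python
# Sketch of the discrete Choquet integral used to aggregate criterion
# evaluations under a fuzzy measure. The measure below is an assumption.

def choquet(values, mu):
    """Choquet integral of `values` with respect to fuzzy measure `mu`.

    `mu` maps frozensets of criterion indices to [0, 1], with
    mu(frozenset()) = 0 and mu(all criteria) = 1.
    """
    order = sorted(range(len(values)), key=lambda i: values[i])
    total, prev = 0.0, 0.0
    for pos, i in enumerate(order):
        coalition = frozenset(order[pos:])   # criteria valued >= values[i]
        total += (values[i] - prev) * mu[coalition]
        prev = values[i]
    return total

# Two criteria; mu({0,1}) > mu({0}) + mu({1}) models complementary criteria
mu = {frozenset():       0.0,
      frozenset({0}):    0.5,
      frozenset({1}):    0.3,
      frozenset({0, 1}): 1.0}
score = choquet([0.6, 0.4], mu)
```

When the measure is additive, the Choquet integral reduces to an ordinary weighted sum; the non-additive measure is what lets it capture correlation among criteria.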
Abstract
Purpose
The purpose of this paper is to present computational methods as applied to engineering systems and evolutionary processes with randomness in external actions and inherent parameters.
Design/methodology/approach
Two approaches are distinguished, both relying on deterministic solvers. Probabilistic analysis refers to approximating the response by a Taylor series expansion about the mean input. Stochastic simulation, alternatively, implies random sampling of the input and statistical evaluation of the output.
Findings
Beyond the characterization of random response, methods of reliability assessment are discussed. Concepts of design improvement are presented. Optimization for robustness diminishes the sensitivity of the system to fluctuating parameters.
Practical implications
Deterministic algorithms available for the primary problem are utilized for stochastic analysis by statistical Monte Carlo sampling. The computational effort for the repeated solution of the primary problem depends on the variability of the system and is usually high. Alternatively, the analytic Taylor series expansion requires extension of the primary solver to the computation of derivatives of the response with respect to the random input. The method is restricted to the computation of output mean values and variances/covariances, with the effort determined by the amount of the random input. The results of the two methods are comparable within the domain of applicability.
Originality/value
The present account addresses the main issues related to the presence of randomness in engineering systems and processes. They comprise the analysis of stochastic systems, reliability, design improvement, optimization and robustness against randomness of the data. The analytical Taylor approach is contrasted to the statistical Monte Carlo sampling throughout. In both cases, algorithms known from the primary, deterministic problem are the starting point of stochastic treatment. The reader benefits from the comprehensive presentation of the matter in a concise manner.
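The contrast between the two approaches can be made concrete on a toy response whose exact moments are known. The response function and input distribution below are assumptions for illustration only.

```python
# Sketch contrasting Monte Carlo sampling with a first-order Taylor
# approximation on the toy response g(x) = x**2 with x ~ Normal(mu, sigma).
# Exact mean is mu**2 + sigma**2, so both estimates can be checked.
import random

def response(x):
    """Stand-in for an expensive deterministic primary solver."""
    return x * x

mu, sigma = 2.0, 0.1

# Monte Carlo: repeated deterministic solves on sampled inputs
random.seed(0)
samples = [response(random.gauss(mu, sigma)) for _ in range(100_000)]
mc_mean = sum(samples) / len(samples)
mc_var  = sum((s - mc_mean) ** 2 for s in samples) / (len(samples) - 1)

# First-order Taylor about the mean input: g(mu) + g'(mu) * (x - mu)
# gives mean ~ g(mu) and variance ~ (g'(mu) * sigma)**2
taylor_mean = response(mu)            # slightly biased: misses sigma**2
taylor_var  = (2 * mu * sigma) ** 2   # uses the derivative g'(mu) = 2*mu
```

The sketch shows the trade-off stated above: Monte Carlo needs many solver runs but converges to the exact moments, while the Taylor approach needs derivatives of the response and captures only means and variances.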
Gordon Wills, Sherril H. Kennedy, John Cheese and Angela Rushton
Abstract
To achieve a full understanding of the role of marketing from plan to profit requires a knowledge of the basic building blocks. This textbook introduces the key concepts in the art or science of marketing to practising managers. Understanding your customers and consumers, together with the 4 Ps (Product, Place, Price and Promotion), provides the basic tools for effective marketing. Deploying your resources and informing your managerial decision making is dealt with in Unit VII, which introduces marketing intelligence, competition, budgeting and organisational issues. The logical conclusion of this effort is achieving sales, and the particular techniques involved are explored in the final section.
Burcu Tunga and Metin Demiralp
Abstract
Purpose
The plain High Dimensional Model Representation (HDMR) method needs Dirac delta type weights to partition a given multivariate data set for modelling an interpolation problem. A Dirac delta type weight assigns a different importance level to each node of this set during the partitioning procedure, which directly affects the performance of HDMR. The purpose of this paper is to develop a new method, using the fluctuation free integration and HDMR methods, to obtain optimized weight factors that identify these importance levels for the multivariate data partitioning and modelling procedure.
Design/methodology/approach
A common problem in multivariate interpolation, where the sought function values are given at the nodes of a rectangular prismatic grid, is to determine an analytical structure for the function under consideration. As the multivariance of an interpolation problem increases, standard numerical methods become inadequate and computer-based applications run into memory limitations. To overcome these multivariance problems, it is better to work with less-variate structures. HDMR methods, which are based on a divide-and-conquer philosophy, can be used for this purpose. This corresponds to multivariate data partitioning in which at most the univariate components of the plain HDMR are taken into consideration. Obtaining these components requires a number of integrals to be evaluated, and the Fluctuation Free Integration method is used to evaluate them. This new form of HDMR, integrated with Fluctuation Free Integration, also allows the Dirac delta type weights in multivariate data partitioning to be discarded, and the weight factors corresponding to the importance level of each node of the given set to be optimized.
Findings
The method developed in this study is applied to six numerical examples with different structures, and very encouraging results are obtained. In addition, the new method is compared with methods that use a Dirac delta type weight function, and the results are given in the numerical implementations section.
Originality/value
The authors' new method allows an optimized weight structure to be determined for the given modelling problem, instead of imposing a particular weight function such as the Dirac delta type weight. This gives the HDMR philosophy the flexibility of weight utilization in multivariate data modelling problems.
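The zeroth- and first-order HDMR components on a weighted rectangular grid can be sketched as follows. The two-dimensional grid, its values and the node weights are illustrative assumptions; the paper's fluctuation-free evaluation of the underlying integrals and its weight optimization are not reproduced here.

```python
# Sketch of zeroth- and first-order HDMR components for data given on a
# 2-D rectangular grid, with general per-node weight factors in place of
# Dirac delta type weights. Grid values and weights are assumptions.

def hdmr_2d(F, wx, wy):
    """Return f0 and the univariate components for F[i][j] = f(x_i, y_j).

    wx and wy are node weight factors on each axis, each summing to 1.
    """
    nx, ny = len(wx), len(wy)
    # Constant term: weighted mean over the whole grid
    f0 = sum(wx[i] * wy[j] * F[i][j] for i in range(nx) for j in range(ny))
    # Univariate components: weighted mean over the other axis, minus f0
    f1_x = [sum(wy[j] * F[i][j] for j in range(ny)) - f0 for i in range(nx)]
    f1_y = [sum(wx[i] * F[i][j] for i in range(nx)) - f0 for j in range(ny)]
    return f0, f1_x, f1_y

# f(x, y) = x + y sampled on a 2x2 grid with uniform weights: an additive
# function, so f0 + f1_x[i] + f1_y[j] reproduces F exactly.
F = [[0.0, 2.0],
     [1.0, 3.0]]
f0, f1_x, f1_y = hdmr_2d(F, [0.5, 0.5], [0.5, 0.5])
```

For an additive function the truncation at univariate components is exact, which is the sense in which "at most univariate components" can suffice; non-uniform weights change the importance assigned to each node, which is the quantity the paper optimizes.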