Search results

1–10 of over 14,000
Article
Publication date: 18 March 2021

Junliang Du, Sifeng Liu and Yong Liu

Abstract

Purpose

The purpose of this paper is to advance a novel grey variable dual precision rough set model for grey concept.

Design/methodology/approach

To obtain the approximation of a grey object, the authors first define the concepts of grey rough membership degree and grey degree of approximation, following the basic logic of variable precision rough sets. On this basis, they propose a grey variable dual precision rough set model. It uses a clear knowledge concept to approximate a grey concept, and the output is also a clear concept.
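
The grey-specific definitions are not reproduced in this abstract, so the following Python fragment is only a minimal sketch of the classical variable precision rough approximation the model builds on: each equivalence class receives a rough membership degree in the target concept and is assigned to the lower or upper approximation by comparison with a precision threshold. The function name, the threshold beta and the toy data are illustrative assumptions, not the authors' notation.

from collections import defaultdict

def vp_rough_approximations(universe, attribute, target, beta=0.8):
    """Classical variable precision rough approximation (illustrative only).

    universe  : iterable of objects
    attribute : function mapping an object to its description (induces
                the equivalence classes)
    target    : set of objects representing the concept to approximate
    beta      : precision threshold, 0.5 < beta <= 1 (assumed value)
    """
    classes = defaultdict(set)
    for x in universe:
        classes[attribute(x)].add(x)          # partition into equivalence classes

    lower, upper = set(), set()
    for cls in classes.values():
        mu = len(cls & target) / len(cls)     # rough membership degree of the class
        if mu >= beta:                        # confidently included
            lower |= cls
        if mu > 1 - beta:                     # not confidently excluded
            upper |= cls
    return lower, upper

# Toy usage: objects 0..9, classes by parity, concept {0, 1, 2, 3, 4, 6, 8}.
lo, up = vp_rough_approximations(range(10), lambda x: x % 2,
                                 {0, 1, 2, 3, 4, 6, 8}, beta=0.8)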

Findings

The results demonstrate that the proposed model may be closer to actual decision-making situations, can improve the rationality and scientific soundness of the approximation and can reduce decision-making risk. It can effectively achieve the whitenization of grey objects. When specific conditions are met, the model degenerates to the traditional variable precision rough fuzzy set model, the variable precision rough set model and the classic Pawlak rough set.

Practical implications

The method presented in the paper can be used to solve multi-criteria decision problems with grey decision objects and to provide decision rules. It can also support knowledge discovery and attribute reduction, and it can effectively achieve the whitenization of grey objects.

Originality/value

The method proposed in this paper implements a rough approximation of grey decision objects and obtains low-risk probabilistic decision rules. It can effectively achieve a certain degree of whitenization of some grey objects.

Details

Grey Systems: Theory and Application, vol. 12 no. 1
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 24 June 2013

Sun Bingzhen and Ma Weimin

Abstract

Purpose

The purpose of this paper is to present a method for measuring the uncertainty of rough fuzzy sets based on a general binary relation.

Design/methodology/approach

Rough sets and fuzzy sets are two different but complementary theories for expressing and handling uncertain information. By combining the two, the rough fuzzy set model and an uncertainty measure based on a general relation are discussed.

Findings

This paper reveals the intrinsic nature of the uncertainty of rough fuzzy sets based on a general relation and presents a new measurement method by introducing Shannon entropy into the generalized approximation space.
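
The paper's exact measure is not given in the abstract; purely as a hedged illustration, the sketch below computes a Shannon-entropy-style uncertainty over the neighbourhoods induced by a general binary relation, which is the kind of construction the findings describe. The function name and the particular weighting are assumptions, not the authors' definition.

import math

def relation_entropy(universe, relation):
    """Shannon-entropy-style uncertainty of a generalized approximation space
    (an assumed form, not necessarily the measure defined in the paper).

    universe : list of objects
    relation : binary predicate r(x, y); each x induces the neighbourhood
               r(x) = {y in universe : r(x, y)}
    """
    n = len(universe)
    entropy = 0.0
    for x in universe:
        neighbourhood = [y for y in universe if relation(x, y)]
        p = len(neighbourhood) / n            # relative size of the neighbourhood
        if p > 0:
            entropy -= (1.0 / n) * math.log2(p)
    return entropy

# Toy usage: a tolerance-like relation on the integers 0..5.
print(relation_entropy(list(range(6)), lambda x, y: abs(x - y) <= 1))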

Originality/value

The paper contributes to the discussion of rough set and fuzzy set research. The conclusions are useful for information processing under uncertainty.

Details

Kybernetes, vol. 42 no. 6
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 9 January 2009

Wei‐Shing Chen

Abstract

Purpose

This paper seeks to present the use of Rough Sets (RS) theory as a processing method to improve the results in customer satisfaction survey applications.

Design/methodology/approach

The research methodology applies an innovative tool to discover knowledge about customer behavior patterns instead of using conventional statistical methods. RS theory was applied to discover the voice of the customer in market research. The collected data contained 422 records, each including 20 condition attributes and two decision attributes. The important attributes that ensure a high quality of classification were generated first; decision rules for classifying high and low overall satisfaction and loyalty categories were then derived.
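
The survey data are not shown here, so the tiny sketch below only illustrates the kind of certain-rule extraction such an RS analysis relies on: condition classes whose records always share one decision value yield a certain rule. The attribute names and the mini table are invented for illustration.

from collections import defaultdict

def certain_rules(table, condition_keys, decision_key):
    """Extract certain decision rules from a decision table (illustrative only)."""
    groups = defaultdict(set)
    for row in table:
        cond = tuple((k, row[k]) for k in condition_keys)
        groups[cond].add(row[decision_key])
    # A rule is certain when its condition class maps to exactly one decision.
    return [(dict(cond), decisions.pop())
            for cond, decisions in groups.items() if len(decisions) == 1]

# Hypothetical mini survey table (attribute names are made up).
survey = [
    {"service": "high", "price": "fair", "satisfied": "yes"},
    {"service": "high", "price": "fair", "satisfied": "yes"},
    {"service": "low",  "price": "fair", "satisfied": "no"},
    {"service": "low",  "price": "high", "satisfied": "no"},
    {"service": "high", "price": "high", "satisfied": "yes"},
]
for cond, decision in certain_rules(survey, ["service", "price"], "satisfied"):
    print(cond, "=>", decision)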

Findings

Three important findings emerged: the product and service attributes that lead to overall satisfaction and loyalty; the percentage of latently dissatisfied customers; and customer decision rules.

Research limitations/implications

The study is limited by the case company and its experience. The rules were presented to the company's sales and marketing managers, who believed that they provided valuable information for creating strategies to increase customer satisfaction and retention.

Originality/value

RS theory provides a mathematical tool to discover patterns hidden in survey data. The paper describes a new attempt at applying an RS-based method to analyze overall customer satisfaction and loyalty behavior through regular satisfaction questionnaire surveys.

Details

Asia Pacific Journal of Marketing and Logistics, vol. 21 no. 1
Type: Research Article
ISSN: 1355-5855

Article
Publication date: 1 July 2003

Yongli Li, Zhilin Li, Yong‐qi Chen, Xiaoxia Li and Yi Lin

Abstract

Practical needs in geographical information systems (GIS) have led to the investigation of formal, sound and computational methods for spatial analysis. Since models based on the topology of R² suffer from the serious problem that they cannot be applied directly to practical computations, we have noticed that models developed on raster spaces can overcome this problem. Because some models based on vector spaces have been used effectively in practical applications, we introduce the idea of using the raster space as our platform to study spatial entities of vector spaces. In this paper, we use raster spaces to study not only morphological changes of spatial entities of vector spaces, but also equality relations and connectedness of such entities. Based on the observation that all these concepts involve relativity, we introduce several new concepts, such as observable equivalence, strong connectedness and weak connectedness. Additionally, we present a possible method of employing raster spaces to study the spatial relations of spatial entities of vector spaces. Since traditional raster spaces cannot be used directly, we first construct a new model, called the pansystems model, for the concept of raster spaces, and then develop a procedure to convert a representation of a spatial entity in a vector space into a representation of that entity in a raster space. Such conversions are called approximation mappings.
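
The pansystems construction itself is not reproduced in the abstract; purely to illustrate what an approximation mapping from a vector representation to a raster space can look like, the sketch below rasterizes a polygon by marking the grid cells whose centres fall inside it. The cell size, grid extent and containment test are illustrative choices, not the authors' model.

def point_in_polygon(px, py, polygon):
    """Ray-casting point-in-polygon test (illustrative)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

def rasterize(polygon, cell=1.0, width=8, height=8):
    """Approximation mapping: a vector polygon -> the set of raster cells
    whose centres lie inside it (a deliberately simple choice)."""
    return {(i, j)
            for i in range(width) for j in range(height)
            if point_in_polygon((i + 0.5) * cell, (j + 0.5) * cell, polygon)}

# Toy triangle given in vector coordinates, approximated on an 8 x 8 raster.
cells = rasterize([(1.0, 1.0), (6.0, 2.0), (3.0, 6.0)])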

Details

Kybernetes, vol. 32 no. 5/6
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 1 December 1997

E. Salajegheh

Abstract

Achieves efficient structural optimization of plate structures subject to multiple frequency constraints. Reduces the computational cost of optimization by approximating the frequencies using the Rayleigh quotient. Uses an optimality criteria method to solve each of the approximate problems. The creation of a high-quality approximation is the key to the efficiency of the method. Moreover, with a large number of design variables, optimality criteria methods are robust approaches. Thus the combination of approximation concepts and optimality criteria methods forms the basis of an efficient tool for the optimum design of plate structures with frequency constraints. Presents examples and compares the results with previous work.
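
As a minimal numerical sketch of the Rayleigh quotient approximation mentioned above (the plate model itself is not given here), the quotient R(v) = (v^T K v) / (v^T M v), evaluated at an approximate mode shape v, estimates the squared natural frequency. The matrices below are arbitrary stand-ins, not a plate discretization.

import numpy as np

def rayleigh_quotient(K, M, v):
    """Estimate the squared natural frequency from stiffness K, mass M and an
    approximate mode shape v via the Rayleigh quotient."""
    v = np.asarray(v, dtype=float)
    return (v @ K @ v) / (v @ M @ v)

# Illustrative 2-DOF system (stand-in matrices, not a plate discretization).
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
M = np.eye(2)
v_guess = np.array([1.0, 1.0])                      # crude mode-shape estimate
print(np.sqrt(rayleigh_quotient(K, M, v_guess)))    # approximate natural frequency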

Details

Engineering Computations, vol. 14 no. 8
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 1 February 2016

Yi-Chung Hu

Abstract

Purpose

The purpose of this paper is to propose the grey tolerance rough set (GTRS) and to construct GTRS-based classifiers.

Design/methodology/approach

The authors use grey relational analysis to implement a relationship-based similarity measure for tolerance rough sets.
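
The abstract does not spell out the measure, so the sketch below only shows the standard grey relational grade commonly used in grey relational analysis, computed between candidate patterns and a reference; the distinguishing coefficient zeta = 0.5 is the conventional default, not necessarily the authors' choice.

import numpy as np

def grey_relational_grades(reference, candidates, zeta=0.5):
    """Standard grey relational grades of candidate patterns with respect to a
    reference pattern (conventional formulation, shown for illustration)."""
    ref = np.asarray(reference, dtype=float)
    cands = np.asarray(candidates, dtype=float)
    deltas = np.abs(cands - ref)                  # |x0(k) - xi(k)| for every i, k
    d_min, d_max = deltas.min(), deltas.max()     # global extrema over all i, k
    coeffs = (d_min + zeta * d_max) / (deltas + zeta * d_max)
    return coeffs.mean(axis=1)                    # one grade per candidate

# Toy usage: the closer a candidate is to the reference, the larger its grade.
ref = [0.2, 0.5, 0.9]
print(grey_relational_grades(ref, [[0.25, 0.52, 0.85],    # similar pattern
                                   [0.90, 0.10, 0.30]]))  # dissimilar pattern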

Findings

The proposed classification method has been tested on several real-world data sets. Its classification performance is comparable to that of other rough-set-based methods.

Originality/value

The authors design a variant of a similarity measure that estimates the relationship between any two patterns, such that the closer the relationship, the greater the similarity.

Details

Kybernetes, vol. 45 no. 2
Type: Research Article
ISSN: 0368-492X

Open Access
Article
Publication date: 3 February 2018

M. Sudha and A. Kumaravel

Abstract

Rough set theory is a simple and promising methodology for extracting and minimizing rules from decision tables. Its key concepts are the core, reducts and the discovery of knowledge in the form of rules. The decision rules describe the decision states and support prediction in new situations. Rough set theory was initially proposed as a useful tool for the analysis of decision states. The approach produces decision rules of two types, namely certain and possible rules, based on approximations. Prediction quality may be strongly affected as the data size grows, yet the application of rough set theory in this direction has not been considered so far. Hence, the main objective of this paper is to study the influence of data size on the number of rules generated by rough set methods. The performance of these methods is presented through metrics such as accuracy and quality of classification. The results obtained show the range of performance and are the first of their kind in the current research trend.
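
The metrics are not defined in the abstract; as a hedged reminder of the standard rough set definitions they most likely denote, the accuracy of a concept is |lower approximation| / |upper approximation| and the quality of classification is the fraction of objects covered by some lower approximation. A minimal sketch under these assumptions:

from collections import defaultdict

def rough_quality(universe, attribute, concepts):
    """Accuracy of each decision concept and overall quality of classification
    under the standard Pawlak definitions (shown for illustration)."""
    classes = defaultdict(set)
    for x in universe:
        classes[attribute(x)].add(x)

    accuracies, covered = {}, set()
    for name, target in concepts.items():
        lower = {x for c in classes.values() if c <= target for x in c}
        upper = {x for c in classes.values() if c & target for x in c}
        accuracies[name] = len(lower) / len(upper) if upper else 1.0
        covered |= lower
    return accuracies, len(covered) / len(universe)

# Toy decision table: objects 0..7, condition classes by x // 2, two decisions.
print(rough_quality(set(range(8)), lambda x: x // 2,
                    {"yes": {0, 1, 2, 5}, "no": {3, 4, 6, 7}}))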

Details

Applied Computing and Informatics, vol. 16 no. 1/2
Type: Research Article
ISSN: 2634-1964

Article
Publication date: 1 September 1999

Th. Ebner, Ch. Magele, B.R. Brandstätter, M. Luschin and P.G. Alotto

Abstract

Global optimization in electrical engineering using stochastic methods usually requires a large amount of CPU time to locate the optimum if the objective function is calculated with either the finite element method (FEM) or the boundary element method (BEM). Two approaches to reducing the number of FEM or BEM calls, one using neural networks and the other using multiquadric functions, have been introduced recently. This paper compares the efficiency of the two methods, applies them to a number of test problems and discusses the results.
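
Neither surrogate is specified here beyond its name, so the sketch below only illustrates the multiquadric idea: interpolate a handful of expensive evaluations with basis functions phi(r) = sqrt(r^2 + c^2) and query the cheap surrogate instead. The shape parameter c, the one-dimensional setting and the stand-in objective are assumptions, not the paper's setup.

import numpy as np

def fit_multiquadric(centers, values, c=1.0):
    """Fit a one-dimensional multiquadric interpolant through the samples."""
    r = np.abs(centers[:, None] - centers[None, :])
    weights = np.linalg.solve(np.sqrt(r**2 + c**2), values)
    return lambda x: np.sqrt((x[:, None] - centers[None, :])**2 + c**2) @ weights

# Stand-in objective: pretends to be an expensive FEM/BEM evaluation.
expensive = lambda x: np.sin(3.0 * x) + 0.5 * x

centers = np.linspace(0.0, 2.0, 7)                    # a few expensive samples
surrogate = fit_multiquadric(centers, expensive(centers))

x = np.linspace(0.0, 2.0, 51)
print(np.max(np.abs(surrogate(x) - expensive(x))))    # surrogate error on a grid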

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 18 no. 3
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 1 November 1999

Scott A. Burns and Keith M. Mueller

Abstract

The analysis of certain structures must be performed with due consideration to non‐linear behavior, such as material and geometric non‐linearities. The existing methods for treating non‐linear structural behavior generally make use of repeated linearization, such as load increment methods. This paper demonstrates that there is an alternative type of linearization that appears to have significant advantages when applied to the analysis of non‐linear structural systems. Briefly stated, this alternative linearization can be thought of as a “monomialization”. This monomial (single‐termed power function) approximation more faithfully models the power function behavior inherent in typical structural systems. Conveniently, it becomes a linear form when transformed into log space. Thus, computational tools based on linear algebra remain useful and effective. Preliminary results indicate that the monomial approximation provides a higher quality approximation to non‐linear phenomena exhibited in structural applications. Consequently, incremental and iterative methods become more effective because larger steps can be taken. The net result is an increase in reliability of the solution process and a significant reduction in computational effort. Two examples are presented to demonstrate the method.
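
The structural formulation is not shown here, so the following is just a minimal sketch of the "monomialization" idea: approximate a response y(x) near a point x0 by a single-term power function c*x^a, which becomes the straight line log y = log c + a log x in log space. The sample response is an arbitrary stand-in, not a structural model.

import numpy as np

def monomial_fit(f, x0, h=1e-4):
    """Local monomial approximation f(x) ~ c * x**a about x0; the exponent a is
    the log-log slope d(log f)/d(log x), so the fit is linear in log space."""
    a = (np.log(f(x0 + h)) - np.log(f(x0 - h))) / (np.log(x0 + h) - np.log(x0 - h))
    c = f(x0) / x0**a
    return lambda x: c * x**a

# Stand-in nonlinear response (not a structural model).
response = lambda x: 4.0 / x**2 + 0.3 * x
approx = monomial_fit(response, x0=2.0)
print(response(2.5), approx(2.5))       # the monomial tracks the response near x0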

Details

Engineering Computations, vol. 16 no. 7
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 1 September 2003

Jean-Louis Coulomb, Avenir Kobetski, Mauricio Caldora Costa, Yves Maréchal and Ulf Jönsson

Abstract

This paper compares three different radial basis function neural networks, as well as the diffuse element method, according to their approximation ability. This is very useful for the optimization of electromagnetic devices. Tests are carried out on several analytical functions and on TEAM workshop problem 25.
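
The networks compared in the paper are not described here, so the fragment below is only a small sketch that contrasts three common radial basis functions (Gaussian, multiquadric and inverse multiquadric) as interpolants of an analytical test function; the shape parameter and the test function are arbitrary choices, not those of the paper or of TEAM problem 25.

import numpy as np

kernels = {                                       # three common radial basis functions
    "gaussian":          lambda r, c=0.3: np.exp(-(r / c)**2),
    "multiquadric":      lambda r, c=0.3: np.sqrt(r**2 + c**2),
    "inv. multiquadric": lambda r, c=0.3: 1.0 / np.sqrt(r**2 + c**2),
}

def rbf_interpolant(phi, centers, values):
    """Interpolate sampled values with the radial basis function phi."""
    weights = np.linalg.solve(phi(np.abs(centers[:, None] - centers[None, :])), values)
    return lambda x: phi(np.abs(x[:, None] - centers[None, :])) @ weights

target = lambda x: np.sin(2.0 * np.pi * x)        # analytical test function
centers = np.linspace(0.0, 1.0, 9)
x_test = np.linspace(0.0, 1.0, 101)

for name, phi in kernels.items():
    approx = rbf_interpolant(phi, centers, target(centers))
    print(name, np.max(np.abs(approx(x_test) - target(x_test))))   # max error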

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 22 no. 3
Type: Research Article
ISSN: 0332-1649
