Rjiba Sadika, Moez Soltani and Saloua Benammou
Abstract
Purpose
The purpose of this paper is to apply Takagi-Sugeno (T-S) fuzzy model techniques to treat and classify textual data sets with and without noise. A comparative study is conducted to select the most accurate T-S algorithm for the textual data sets.
Design/methodology/approach
From a survey about what has been termed the “Tunisian Revolution,” the authors collect a textual data set from a questionnaire targeted at students. Four clustering algorithms are applied: the Gath-Geva (G-G) algorithm, the modified G-G algorithm, the fuzzy c-means algorithm and the kernel fuzzy c-means algorithm. The authors examine the performance of the four clustering algorithms and select the most reliable one to cluster textual data.
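As a rough illustration of such a clustering step, the sketch below applies fuzzy c-means to TF-IDF vectors of a few invented documents. The corpus, parameters and the hand-rolled fuzzy_c_means helper are assumptions for illustration only; they do not reproduce the authors' T-S pipeline or the Gath-Geva variants.

```python
# Sketch: fuzzy c-means on TF-IDF document vectors (illustrative corpus).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)                 # memberships sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        U_new = 1.0 / d ** (2 / (m - 1))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U

docs = ["students discussed the revolution", "economic reform and prices",
        "protest and political change", "inflation affected households"]
X = TfidfVectorizer().fit_transform(docs).toarray()
centers, U = fuzzy_c_means(X, n_clusters=2)
print(U.argmax(axis=1))    # hard cluster assignment per document
```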
Findings
The proposed methodology clusters textual data based on the T-S fuzzy model. On one hand, the results obtained using the T-S models take the form of numerical relationships between selected keywords and the remaining words constituting a text, which allows the authors to interpret the results not only qualitatively but also quantitatively. On the other hand, the proposed method is applied to cluster text in the presence of noise.
Originality/value
The originality comes from the fact that the authors validate economic results based on textual data, even though the texts were not written by experts in linguistics. In addition, the results obtained in this study are simple for analysts to interpret.
Runhai Jiao, Shaolong Liu, Wu Wen and Biying Lin
Abstract
Purpose
The large volume of big data makes traditional clustering algorithms, which are usually designed to process the entire data set at once, impractical. The purpose of this paper is to focus on incremental clustering, which divides data into a series of chunks so that only a small amount of data needs to be clustered at a time. Little research on incremental clustering addresses the problems of optimizing cluster center initialization for each data chunk and selecting multiple passing points for each cluster.
Design/methodology/approach
By optimizing the initial cluster centers, the quality of the clustering results for each data chunk is improved, which in turn enhances the quality of the final clustering results. Moreover, by selecting multiple passing points, more accurate information is passed down to later chunks to improve the final clustering results. A method addressing these two problems is proposed and embedded in an algorithm based on streaming kernel fuzzy c-means (stKFCM).
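A minimal sketch of the general chunk-wise idea follows: cluster each data chunk, then carry the resulting centers (as the next chunk's initialization) and a few representative "passing points" forward. Plain k-means stands in for the kernel fuzzy c-means step, and the data, chunking and passing-point rule are invented for illustration; this is not the stKFCM algorithm itself.

```python
# Sketch: chunk-wise clustering with carried-over centers and passing points.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
stream = rng.normal(size=(10_000, 2))          # stand-in for a large data set
chunks = np.array_split(stream, 10)            # process in 10 chunks

k, passing_points, prev_centers = 3, np.empty((0, 2)), None
for chunk in chunks:
    data = np.vstack([passing_points, chunk])  # pass information downstream
    init = prev_centers if prev_centers is not None else "k-means++"
    km = KMeans(n_clusters=k, init=init,
                n_init=1 if prev_centers is not None else 10).fit(data)
    prev_centers = km.cluster_centers_         # optimized init for next chunk
    # keep a few representatives per cluster as "passing points"
    passing_points = np.vstack([data[km.labels_ == j][:5] for j in range(k)])

print(prev_centers)                            # final cluster centers
```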
Findings
Experimental results show that the proposed algorithm is more accurate and performs better than the original streaming kernel fuzzy c-means (stKFCM) algorithm.
Originality/value
This paper addresses the problem of improving the performance of incremental clustering by optimizing cluster center initialization and selecting multiple passing points. The paper analyzes the performance of the proposed scheme and demonstrates its effectiveness.
M. Punniyamoorthy and P. Sridevi
Abstract
Purpose
Credit risk assessment has gained importance in recent years due to the global financial crisis and credit crunch. Financial institutions therefore seek the support of credit rating agencies to predict the ability of creditors to meet their financial obligations. The purpose of this paper is to construct neural network (NN) and fuzzy support vector machine (FSVM) classifiers to discriminate good creditors from bad ones and to identify the best classifier for credit risk assessment.
Design/methodology/approach
This study uses an artificial neural network, the most popular AI technique for classification and prediction in financial applications, and a newer machine learning classification algorithm, FSVM, to differentiate good creditors from bad ones. Because the membership values assigned to data points influence the classification problem, the paper presents a new FSVM model in which instance memberships are computed using fuzzy c-means with a newly devised membership function. The FSVM model is also tested with different kernels, and the kernel yielding the highest classification accuracy is identified.
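One way to approximate this idea in code, under assumptions, is to derive membership-style weights from distances to the class centroids and pass them to scikit-learn's SVC as sample weights. This is only a hedged stand-in for a fuzzy SVM on a synthetic data set; the paper's exact membership function and FSVM formulation are not reproduced.

```python
# Sketch: fuzzy-weighted SVM via sample_weight (illustrative, not the paper's FSVM).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=6, random_state=0)

# single-pass FCM-style memberships (m = 2) toward the two class centroids
centers = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
U = 1.0 / d ** 2
U /= U.sum(axis=1, keepdims=True)
weights = U[np.arange(len(y)), y]     # membership toward each sample's own class

clf = SVC(kernel="rbf").fit(X, y, sample_weight=weights)
print(clf.score(X, y))
```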
Findings
The paper identifies a suitable AI model by comparing the performance of the NN and FSVM models on a credit risk data set. The work shows that the FSVM model performs better than the back-propagation neural network.
Practical implications
The proposed model can be used by financial institutions to accurately assess the credit risk pattern of customers and make better decisions.
Originality/value
This paper develops a new membership function for data points and proposes an FCM-based FSVM model for more accurate predictions.
Swarnalatha Purushotham and Balakrishna Tripathy
Abstract
Purpose
The purpose of this paper is to provide a way to analyze satellite images using various clustering algorithms and refined bitplane methods, together with other supporting techniques, to demonstrate the superiority of rough intuitionistic fuzzy c-means (RIFCM).
Design/methodology/approach
A comparative study is carried out between RIFCM and related algorithms, assessing their suitability for analyzing satellite images together with supporting techniques that segment the images for further processing in societally relevant applications. Four satellite images were selected, depicting hills, freshwater, a freshwater valley and drought.
Findings
The proposed algorithm, RIFCM with refined bitplane, was found to be superior to the other clustering techniques and supporting methods. The comparison was made using several metrics (Otsu (Max-Min), PSNR and RMSE (40%-60%, Min-Max), histogram analysis (Max-Max), and the DB and D indices (Max-Min)) and showed that RIFCM with refined bitplane yields robust results with efficient performance, lower metric values and reduced time complexity of depth computation for further processing of the satellite images.
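Two of the reported metrics, RMSE and PSNR, can be computed as in the short sketch below; the images are random placeholders, and the Otsu, histogram, DB index and D index measures are not shown.

```python
# Sketch: RMSE and PSNR between an original and a processed image.
import numpy as np

def rmse(a, b):
    return np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    e = rmse(a, b)
    return np.inf if e == 0 else 20 * np.log10(peak / e)

rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(64, 64))
processed = np.clip(original + rng.integers(-5, 6, size=(64, 64)), 0, 255)
print(rmse(original, processed), psnr(original, processed))
```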
Practical implications
The approach achieves better clustering of satellite images such as those of land, hills, freshwater, freshwater valleys and drought.
Originality/value
The proposed system extends the framework to provide a more explicit way of analyzing an image, removing distortions through refined bitplane slicing combined with the proposed rough intuitionistic fuzzy c-means algorithm, thereby demonstrating the superiority of RIFCM.
Zahra Nematzadeh, Roliana Ibrahim, Ali Selamat and Vahdat Nazerian
Abstract
Purpose
The purpose of this study is to enhance data quality and overall accuracy and to improve certainty by reducing the negative impacts of the fuzzy c-means (FCM) algorithm when clustering real-world data and by decreasing the inherent noise in data sets.
Design/methodology/approach
The present study proposes a new, effective model based on FCM, ensemble filtering (ENS) and machine learning algorithms, called the FCM-ENS model. The model is composed of three main parts: noise detection, noise filtering and noise classification.
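A hedged sketch of the general class-noise detection idea: flag samples whose label disagrees with the dominant label of the cluster they fall into. Plain k-means stands in for the FCM step here, and the data set, injected noise and decision rule are illustrative assumptions; the paper's FCM-ENS filter is more elaborate.

```python
# Sketch: clustering-based class-noise detection (k-means stand-in for FCM).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)
flip = rng.choice(len(y), size=15, replace=False)
y_noisy = y.copy()
y_noisy[flip] = (y_noisy[flip] + 1) % 3          # inject class noise

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
suspect = np.zeros(len(y), dtype=bool)
for c in np.unique(labels):
    members = labels == c
    majority = np.bincount(y_noisy[members]).argmax()
    suspect |= members & (y_noisy != majority)   # label disagrees with cluster

print("flagged:", suspect.sum(),
      "true noise among flagged:", np.isin(np.where(suspect)[0], flip).sum())
```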
Findings
The performance of the proposed model was tested by conducting experiments on six data sets from the UCI repository. As the results show, the proposed noise detection model detected class noise very effectively, and performance was enhanced when the identified class-noise instances were removed.
Originality/value
To the best of the authors’ knowledge, no effort has been made to improve the FCM algorithm in relation to class noise detection issues. Thus, the novelty of this research lies in combining the FCM algorithm, as a noise detection technique, with ENS to reduce the negative effect of inherent noise and to increase data quality and accuracy.
Shabia Shabir Khan and S.M.K. Quadri
Abstract
Purpose
When it comes to treating the most complex design issues, approaches based on classical artificial intelligence are inferior to those based on computational intelligence, particularly where vagueness, multi-objectivity and a large number of possible solutions are involved. In practical applications, computational techniques have given the best results, and research in this field continues to grow. The purpose of this paper is to search for a general and effective intelligent tool for predicting patient survival after surgery. The present study involves the construction of such intelligent computational models using different configurations, including data partitioning techniques, which are experimentally evaluated on a realistic medical data set for the prediction of survival in pancreatic cancer patients.
Design/methodology/approach
On the basis of experiments and research performed on data from various fields using different intelligent tools, the authors infer that combining the qualitative aspects of a fuzzy inference system with the quantitative aspects of an artificial neural network can yield an efficient and better model for prediction. The authors constructed three soft-computing-based adaptive neuro-fuzzy inference system (ANFIS) models with different configurations and data partitioning techniques, with the aim of finding capable predictive tools that can deal with nonlinear and complex data. After evaluating the models over three shuffles of data (training set, test set and full set), the performances were compared to find the best design for predicting patient survival after surgery. The models were constructed and implemented in the MATLAB simulator.
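As a rough, non-MATLAB sketch of the FCM-partitioned neuro-fuzzy idea: cluster the inputs to place rule centers, use Gaussian memberships as firing strengths, and fit first-order Takagi-Sugeno consequents by least squares. K-means stands in for FCM, the regression data set is invented, and no ANFIS backpropagation training loop is performed, so this is an illustration rather than the authors' models.

```python
# Sketch: FCM-style partitioning + first-order Takagi-Sugeno consequents.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.normal(size=400)

n_rules = 5
centers = KMeans(n_clusters=n_rules, n_init=10, random_state=0).fit(X).cluster_centers_
sigma = X.std(axis=0).mean()                  # one shared spread for simplicity

def firing(X):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)   # normalized firing strengths

W = firing(X)                                 # (n_samples, n_rules)
Xa = np.hstack([X, np.ones((len(X), 1))])     # inputs + bias for consequents
D = (W[:, :, None] * Xa[:, None, :]).reshape(len(X), -1)
theta, *_ = np.linalg.lstsq(D, y, rcond=None) # least-squares consequent fit
print("train MSE:", np.mean((y - D @ theta) ** 2))
```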
Findings
Applying the hybrid intelligent neuro-fuzzy models with different configurations, the authors found them advantageous in predicting the survival of patients with pancreatic cancer. Experimental results and comparisons between the constructed models show that the ANFIS model with fuzzy c-means (FCM) partitioning provides the best classification accuracy, with the lowest mean square error (MSE) value. Beyond MSE, the other evaluation measures for FCM partitioning also prove better than those of the remaining models. The results therefore demonstrate that the model can be applied in other biomedicine and engineering fields dealing with complex issues related to imprecision and uncertainty.
Originality/value
The originality of the paper includes a framework showing a two-way flow for fuzzy system construction, which the authors use to design the three simulation models with different configurations, including the partitioning methods, for predicting patient survival after surgery. Several experiments were carried out using different shuffles of data to validate the parameters of the models, and their performances were compared using various evaluation measures such as MSE.
Wanjie Hu, Jianjun Dong, Bon-Gang Hwang, Rui Ren and Zhilong Chen
Abstract
Purpose
The underground logistics system (ULS) is recognized as a sustainable means of relieving road-dominated urban logistics infrastructure, with various social and environmental benefits. The purpose of this study is to propose an effective modeling and optimization method for planning a hub-and-spoke ULS network in an urban region.
Design/methodology/approach
Underground freight tunnels and last-mile ground delivery were organized as a hierarchical network. A mixed-integer programming (MIP) model minimizing system cost was developed. A two-phase optimization schema combining a genetic-algorithm-based fuzzy c-means algorithm (GA-FCM), a depth-first-search FCM (DFS-FCM) algorithm and the Dijkstra algorithm (DA) was then designed to optimize the location-allocation of ULS facilities and customer clusters. Finally, a real-world simulation was conducted for validation.
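A toy two-phase illustration under invented data: customer coordinates are clustered to place candidate hubs (k-means standing in for GA-FCM), and shortest underground routes between hubs are then computed with SciPy's Dijkstra implementation. The MIP model, the DFS-FCM step and the real planning constraints are not reproduced.

```python
# Sketch: (1) cluster customers into hubs, (2) shortest tunnel paths via Dijkstra.
import numpy as np
from sklearn.cluster import KMeans
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

rng = np.random.default_rng(0)
customers = rng.uniform(0, 10, size=(200, 2))      # customer locations (km)

# Phase 1: customer clustering -> candidate hub locations
hubs = KMeans(n_clusters=4, n_init=10, random_state=0).fit(customers).cluster_centers_

# Phase 2: assumed tunnel links between hubs, weighted by Euclidean length
links = [(0, 1), (1, 2), (2, 3), (0, 3)]
w = np.zeros((len(hubs), len(hubs)))
for i, j in links:
    w[i, j] = w[j, i] = np.linalg.norm(hubs[i] - hubs[j])
dist = dijkstra(csr_matrix(w), directed=False)     # all-pairs shortest distances

print("hub coordinates:\n", hubs.round(2))
print("shortest tunnel distances:\n", dist.round(2))
```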
Findings
The multistage strategy and hybrid algorithms efficiently yield hub-and-spoke network configurations at the lowest objective cost. GA-FCM performed better than K-means in customer-node clustering, and the combination of DFS-FCM and DA achieved a better network configuration than combining K-means with the minimum spanning tree technique. The results also provide some management insights: (1) greater economies of scale in underground freight movement could reduce the system budget; (2) changes in transportation cost would have no obvious impact on the ULS network layout; and (3) over 90% of the transportation process in the ULS network takes place underground, giving remarkable relief to road freight traffic.
Research limitations/implications
Demand pairs among customers were not considered due to a lack of data, and heterogeneity in facility capacity parameters was omitted.
Originality/value
This study has used an innovative hybrid optimization technique to address the two-phase network planning of urban ULS. The novel design and solution approaches offer insights for urban ULS development and management.
Hiren Mewada, Amit V. Patel, Jitendra Chaudhari, Keyur Mahant and Alpesh Vala
Abstract
Purpose
In clinical analysis, medical image segmentation is an important step in studying anatomical structure. It helps to diagnose and classify abnormalities in the image. Wide variations in image modality and limitations in the instruments' acquisition process make this segmentation challenging. This paper aims to propose a semi-automatic model to tackle these challenges and segment medical images.
Design/methodology/approach
The authors propose a Legendre polynomial-based active contour to segment the region of interest (ROI) from noisy, low-resolution and inhomogeneous medical images using a soft computing and multi-resolution framework. In the first phase, an initial segmentation (i.e. prior clustering) is obtained from the low-resolution medical images using fuzzy c-means (FCM) clustering, and noise is suppressed using a wavelet energy-based multi-resolution approach. In the second phase, the final segmentation is obtained using the Legendre polynomial-based level set approach.
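The two pre-steps of the first phase can be sketched on a synthetic image roughly as follows: wavelet soft-threshold denoising with PyWavelets, then a simple two-class fuzzy intensity clustering as the prior segmentation. The image, wavelet choice and thresholds are assumptions, and the Legendre polynomial level-set refinement of the second phase is not shown.

```python
# Sketch: wavelet denoising + two-class fuzzy intensity clustering as a prior.
import numpy as np
import pywt

rng = np.random.default_rng(0)
img = np.zeros((128, 128))
img[32:96, 32:96] = 1.0                       # bright "tissue" region
img += 0.2 * rng.normal(size=img.shape)       # acquisition noise

# multi-resolution denoising: soft-threshold the detail coefficients
coeffs = pywt.wavedec2(img, "db2", level=2)
coeffs = [coeffs[0]] + [tuple(pywt.threshold(c, 0.1, mode="soft") for c in lvl)
                        for lvl in coeffs[1:]]
denoised = pywt.waverec2(coeffs, "db2")[: img.shape[0], : img.shape[1]]

# simple two-class fuzzy c-means on pixel intensities (m = 2)
x = denoised.ravel()
centers = np.array([x.min(), x.max()])
for _ in range(20):
    d = np.abs(x[:, None] - centers[None, :]) + 1e-10
    U = 1.0 / d ** 2
    U /= U.sum(axis=1, keepdims=True)
    centers = (U ** 2 * x[:, None]).sum(axis=0) / (U ** 2).sum(axis=0)
prior_mask = (U.argmax(axis=1) == centers.argmax()).reshape(img.shape)
print("foreground pixels in prior segmentation:", prior_mask.sum())
```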
Findings
The proposed model is tested on different medical images, such as x-ray images for brain tumor identification, magnetic resonance imaging (MRI), spine images, blood cells and blood vessels. A rigorous analysis of the model is carried out by calculating the improvement against noise, the required processing time and the accuracy of the segmentation. The comparative analysis concludes that the proposed model withstands noise and succeeds in segmenting every tested medical modality, achieving an average accuracy of 99.57%.
Originality/value
The proposed design is an improvement on the Legendre level set (L2S) model. The integration of FCM and the wavelet transform into L2S makes the model insensitive to noise and intensity inhomogeneity, and hence it succeeds in segmenting the ROI from a wide variety of medical images, even those that L2S alone failed to segment.
Henrique Ewbank, José Arnaldo Frutuoso Roveda, Sandra Regina Monteiro Masalskiene Roveda, Admilson Írio Ribeiro, Adriano Bressane, Abdollah Hadi-Vencheh and Peter Wanke
Abstract
Purpose
The purpose of this paper is to analyze demand forecast strategies to support more sustainable management in a pallet supply chain and thus avoid environmental impacts, for example by reducing the consumption of forest resources.
Design/methodology/approach
Since the producer faces several uncertainties regarding its demand logs, a methodology that embeds zero-inflated intelligence is proposed, combining fuzzy time series with clustering techniques in order to deal with an excessive count of zeros.
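A toy sketch of the general approach under invented data: cluster the nonzero demand history into levels (k-means standing in for the clustering step), treat zero demand as its own state, and forecast from first-order state transitions. This is only an illustration of the zero-excess idea, not the paper's fuzzy time series model.

```python
# Sketch: zero-aware demand states from clustering + a first-order forecast.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
demand = rng.poisson(4, size=200) * (rng.random(200) < 0.4)   # many zeros

nonzero = demand[demand > 0].astype(float).reshape(-1, 1)
centers = np.sort(KMeans(n_clusters=3, n_init=10, random_state=0)
                  .fit(nonzero).cluster_centers_.ravel())

def state(v):            # state 0 = zero demand, 1..3 = fuzzy demand levels
    return 0 if v == 0 else 1 + int(np.argmin(np.abs(centers - v)))

states = np.array([state(v) for v in demand])
# first-order rules: average next-step demand observed after each state
next_mean = {s: demand[1:][states[:-1] == s].mean() for s in np.unique(states[:-1])}
forecast = next_mean.get(states[-1], demand.mean())
print("level centers:", centers.round(2), "one-step forecast:", round(float(forecast), 2))
```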
Findings
A comparison with other models from the literature is performed. The strategy that simultaneously considered the excess of zeros and low demands provided the best performance and can thus be considered a promising approach, particularly for sustainable supply chains where resource consumption is significant and demand varies greatly over time.
Originality/value
The findings of the study contribute to the knowledge of managers and policymakers seeking to achieve sustainable supply chain management. The results provide important concepts regarding supply chain sustainability using fuzzy time series and clustering techniques.