
A Novel Image Segmentation Method Based on Improved Intuitionistic Fuzzy C-Means Clustering Algorithm

  • Kong, Jun (Jiangsu Provincial Engineering Laboratory of Pattern Recognition and Computational Intelligence Jiangnan University) ;
  • Hou, Jian (Jiangsu Provincial Engineering Laboratory of Pattern Recognition and Computational Intelligence Jiangnan University) ;
  • Jiang, Min (Jiangsu Provincial Engineering Laboratory of Pattern Recognition and Computational Intelligence Jiangnan University) ;
  • Sun, Jinhua (Jiangsu Provincial Engineering Laboratory of Pattern Recognition and Computational Intelligence Jiangnan University)
  • Received : 2018.07.17
  • Accepted : 2019.01.15
  • Published : 2019.06.30

Abstract

Segmentation plays an important role in the field of image processing and computer vision. The intuitionistic fuzzy C-means (IFCM) clustering algorithm has emerged as an effective technique for image segmentation in recent years. However, the standard fuzzy C-means (FCM) and IFCM algorithms are sensitive to noise and initial cluster centers, and they ignore the spatial relationship of pixels. In view of these shortcomings, an improved algorithm based on IFCM is proposed in this paper. Firstly, we propose a modified non-membership function to generate the intuitionistic fuzzy set and a method of determining initial clustering centers based on grayscale features; together they highlight the effect of uncertainty in the intuitionistic fuzzy set and improve robustness to noise. Secondly, an improved nonlinear kernel function is proposed to map data into kernel space so that the distance between data and the cluster centers is measured more accurately. Thirdly, a local spatial-gray information measure is introduced, which considers membership degree, gray features and spatial position information at the same time. Finally, we propose a new measure of intuitionistic fuzzy entropy that takes into account both the fuzziness and the intuition of the intuitionistic fuzzy set. The experimental results show that, compared with other IFCM-based algorithms, the proposed algorithm has better segmentation and clustering performance.

Keywords

1. Introduction

 Image information is one of the most important kinds of information through which human beings perceive and communicate with the outside world. There are certain regions with specific similar properties in an image, which are called object areas. The vast majority of the information in an image is often contained in the object areas. Image segmentation is a basic and crucial aspect of image analysis and processing in the fields of image, video and computer vision; it is essentially the process of dividing an image into multiple non-overlapping sub-regions based on some characteristics of the pixels. Despite substantial progress in recent years, image segmentation remains a challenging problem due to several factors arising from both the background and the object areas [1-3].

 Clustering methods are considered an efficient technique to deal with the similarity and uncertainty in an image; they group pixels into different clusters according to some criteria and features. Over the past decades, various clustering-based methods such as k-means [4], fuzzy clustering [5], k-medoids [6] and scalable spectral clustering [7] have been proposed. Among these, fuzzy C-means (FCM) is one of the most extensively studied algorithms based on fuzzy theory, which allows an element to belong to multiple classes with varying memberships. The intuitionistic fuzzy C-means (IFCM) algorithm, as a successful extension and variant of FCM, has attracted extensive attention and has been widely used in many fields such as image processing and pattern recognition [8]. However, the standard FCM and IFCM algorithms are sensitive to noise and initial cluster centers, and they ignore the spatial relationship of pixels, leading to imprecise clustering results.

 In this work, we have considered the problems mentioned above, and the main contributions of our work can be summed up as follows. First, we propose a modified non-membership function to generate the intuitionistic fuzzy set, which highlights the effect of uncertainty and makes good use of image information. A method of determining initial clustering centers based on pixel characteristics is also proposed, aiming at suppressing noise and clustering more accurately. Second, considering that some data might be inseparable in low-dimensional space, we propose an improved nonlinear function to map data into a high-dimensional kernel space to measure the distance. Third, to obtain a comprehensive measure of various factors, we introduce the local spatial-gray information, combining membership degree, gray features and spatial position information. Finally, we improve the standard intuitionistic fuzzy entropy in the objective function, which takes into account the fuzziness, intuition and uncertainty of the intuitionistic fuzzy set. The flowchart of our work is shown in Fig. 1. Compared with existing algorithms, the proposed algorithm not only considers membership degree, non-membership degree and spatial information, but it also takes into account the effect of intuition, uncertainty and the kernel space distance measure. To better verify the performance of our segmentation algorithm, we have performed comprehensive evaluations on different images and datasets. Our approach improves on the baseline IFCM in both segmentation and clustering.

 

Fig. 1. Flowchart of our segmentation algorithm

 The remainder of this paper is arranged as follows. Section 2 introduces the correlation theory of fuzzy C-means clustering algorithm and intuitionistic fuzzy set. In Section 3, we minutely introduce our method from five main parts. Comparison experiments and comparative analysis are given in Section 4. Finally, Section 5 summarizes the paper.

2. Related Work

2.1 Current Segmentation Algorithms

 Over the last decades, scholars around the world have proposed a variety of image segmentation algorithms utilizing different schemes [9-12], including global thresholding methods, edge detection methods, region-based methods, clustering-based methods and so on [13]. Generally speaking, most of these representative segmentation algorithms are based on global thresholding, where gray-level thresholding is efficient and easy to calculate. However, they are only suitable for images with significant differences in gray values. Besides, the existence of noise and many other interferences also raises difficulties in segmenting images precisely. Thus image segmentation is one of the most difficult tasks in the field of computer vision, and to this day it remains a persistent research hotspot.

 In recent years, image segmentation algorithms based on FCM have been widely used because they can iterate to obtain the final results adaptively in an unsupervised manner, which works well for noiseless images. However, FCM is sensitive to noise and initial clustering centers, so it may be difficult to obtain satisfactory segmentation by applying FCM directly. On the basis of fuzzy set theory, Atanassov first proposed the concept of the intuitionistic fuzzy set (IFS), which takes into account membership degree, non-membership degree and hesitation at the same time [14]. Over the last decade, many scholars have devoted themselves to the study of intuitionistic fuzzy sets, which have been widely used in various fields with great results. Chaira [15] proposed an intuitionistic fuzzy C-means (IFCM) clustering algorithm and applied it to medical images, with good segmentation performance. On the basis of intuitionistic fuzzy sets, Kaur et al. [16] proposed the RBF kernel intuitionistic fuzzy C-means algorithm (KIFCM), where a kernel metric replaces the original Euclidean norm metric. Ansari et al. [17] proposed new measures of intuitionistic fuzzy divergence and intuitionistic fuzzy entropy and proved their effectiveness in edge detection. An image segmentation algorithm combining intuitionistic fuzzy theory with spatial information (IFCMS) was proposed by Tripathy et al. [18], which is insensitive to noise to some extent and performs well on segmentation. Another representative algorithm (IIFCM) was proposed by Verma et al. [19], which introduces into IFCM a factor considering both local gray-level and spatial information, with great robustness to noise and well-preserved image information. After this, Zhao et al. [8] proposed an IFCM-based multi-objective optimized segmentation algorithm with multiple image spatial information (MOEIFC-MSI), which achieves high segmentation accuracy and is robust to noise.

2.2 Standard FCM Algorithm

 For N initial data \(X=\left\{x_{1}, x_{2}, \ldots, x_{N}\right\}\), if they are expected to be divided into C fuzzy sets \(F=\left(F_{1}, F_{2}, \dots, F_{C}\right)\), then the objective function of the FCM algorithm can be expressed as:

\(J_{F C M}(U, V)=\sum_{i=1}^{N} \sum_{j=1}^{C} \mu_{i j}^{m} d_{i j}^{2}\left(x_{i}, v_{j}\right)\)       (1)

 And the constraint conditions are:

\(\sum_{j=1}^{C} \mu_{i j}=1, \quad \forall i\)       (2)

\(0<\sum_{i=1}^{N} \mu_{i j}<N, \quad \forall j\)       (3)

 Where C denotes the number of clusters, which is a preset parameter; \(\mu_{i j}\) is the membership degree of the data xi to the fuzzy cluster Fj; vj is the clustering center of the fuzzy cluster Fj; dij denotes the Euclidean distance between data xi and clustering center vj; m is a fuzzy constant, usually taken as 2; U is the membership matrix and V is the clustering center matrix.

 The implementation of the FCM algorithm is the process of minimizing the objective function by iteration to seek the optimal cluster centers and membership degrees. According to the constraint conditions Eq. (2) and Eq. (3), the Lagrange function is utilized to obtain the partial derivative with respect to each variable; setting the partial derivatives to 0, the iterative formulas of the fuzzy membership degree and the clustering centers are obtained as follows:

\(\mu_{i j}=\frac{1}{\sum_{k=1}^{C}\left(\frac{d_{i j}\left(x_{i}, v_{j}\right)}{d_{i k}\left(x_{i}, v_{k}\right)}\right)^{\frac{2}{m-1}}} \quad \begin{array}{l} i=1,2, \ldots, N \\ j=1,2, \ldots, C \end{array}\)       (4)

\(v_{j}=\frac{\sum_{i=1}^{N} \mu_{ij}^{m} x_{i}}{\sum_{i=1}^{N} \mu_{i j}^{m}} \quad \begin{array}{l} i=1,2, \ldots, N \\ j=1,2, \ldots, C \end{array}\)       (5)

 The FCM algorithm first initializes the cluster centers and then iterates through Eq. (4) and Eq. (5) until the iteration stops. There are two stop criteria in standard FCM: the maximum number of iterations M and the accuracy of the objective function e. Once the number of iterations reaches the maximum, or the error of the objective function is less than the accuracy, the iteration stops. According to the obtained membership degree matrix, each sample is assigned to the cluster with the maximum membership value. To give a more comprehensive picture of FCM, we present the outline of the standard FCM algorithm as Algorithm 1.

Algorithm 1
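 Since the algorithm outline figure is not reproduced here, the following is a minimal Python sketch of the standard FCM iteration described by Eqs. (1)-(5) and the two stop criteria above; it is illustrative only (the experiments in this paper were implemented in MATLAB), and the random initialization of U is an assumption.

```python
import numpy as np

def fcm(X, C, m=2, max_iter=100, tol=1e-5, seed=0):
    """Standard FCM on data X of shape (N, D); returns membership matrix U and centers V."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], C))
    U /= U.sum(axis=1, keepdims=True)                 # rows of U sum to 1, Eq. (2)
    for _ in range(max_iter):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]      # center update, Eq. (5)
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(-1) + 1e-12   # squared Euclidean distances
        inv = d2 ** (-1.0 / (m - 1))
        U_new = inv / inv.sum(axis=1, keepdims=True)  # membership update, Eq. (4)
        if np.abs(U_new - U).max() < tol:             # stop when the change is below the accuracy e
            return U_new, V
        U = U_new
    return U, V

# Example: cluster the gray values of an image into C = 4 classes.
# labels = fcm(img.reshape(-1, 1).astype(float), C=4)[0].argmax(axis=1)
```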

 

2.3 Intuitionistic Fuzzy Set

 Intuitionistic fuzzy set is a successful extension and development of standard fuzzy set. A fuzzy set can be defined as:

\(A=\left\{x, \mu_{A}(x) | x \in X\right\}\)       (6)

Where \(X=\left\{x_{1}, x_{2}, \ldots, x_{N}\right\}\) is a data set, \(\mu_{A}(x) \rightarrow[0,1]\) denotes the membership degree for data x belonging to fuzzy set A, which breaks through the shackles of the traditional binary logic. Then, the non-membership degree can be expressed as \(v_{A}(x)=1-\mu_{A}(x)\). However, in the objective world, the judgment of a thing often not only relies on ‘either this or that’, but it also includes a lot of ‘uncertainty’ information. Therefore, on the basis of fuzzy set, an intuitionistic fuzzy set can be represented as:

\(B=\left\{x, \mu_{B}(x), v_{B}(x) | x \in X\right\}\)       (7)

 Where \(\mu_{B}(x) \rightarrow[0,1]\) and \(v_{B}(x) \rightarrow[0,1]\) denote the membership and non-membership degrees for data x belonging to the intuitionistic fuzzy set B respectively. Unlike a fuzzy set, the membership and non-membership degrees satisfy the following condition:

\(0 \leq \mu_{B}(x)+v_{B}(x) \leq 1, \forall x \in X\)       (8)

 Hesitation degree for data x is defined as:

\(\pi_{B}(x)=1-\mu_{B}(x)-v_{B}(x)\)       (9)

 It is a measure of the uncertainty of the data. Obviously, \(0 \leq \pi_{B}(x) \leq 1\) for each x ∈ X. An intuitionistic fuzzy set degenerates into an ordinary fuzzy set when \(\pi_{B}(x)=0\). For an intuitionistic fuzzy set B, its membership degree \(\mu_{B}(x)\), non-membership degree \(v_{B}(x)\) and hesitation degree \(\pi_{B}(x)\) represent the degree of ‘support’, ‘opposition’ and ‘neutrality’ of data belonging to B respectively. Therefore, the intuitionistic fuzzy set effectively extends the representation ability of Zadeh’s standard fuzzy set.

 The non-membership function of an intuitionistic fuzzy set is usually generated by Yager’s [20, 21] or Sugeno’s [22] negation function:

\(N_{1}(\mu(x))=\left(1-\mu(x)^{\alpha}\right)^{\frac{1}{\alpha}}, \alpha>0\)       (10)

\(N_{2}(\mu(x))=\frac{1-\mu(x)}{1+\lambda \mu(x)}, \lambda>0\)       (11)

 Then the intuitionistic fuzzy set can be further expressed as \(\{x, \mu(x), N(\mu(x)) | x \in X\}\).
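 As a brief illustration, the two classical negation functions of Eqs. (10) and (11), together with the resulting hesitation degree, can be sketched as follows; the default parameter values are illustrative assumptions only.

```python
import numpy as np

def yager_nonmembership(mu, alpha=2.0):
    """Yager's negation, Eq. (10): N1(mu) = (1 - mu**alpha)**(1/alpha), alpha > 0."""
    return (1.0 - np.asarray(mu, float) ** alpha) ** (1.0 / alpha)

def sugeno_nonmembership(mu, lam=1.0):
    """Sugeno's negation, Eq. (11): N2(mu) = (1 - mu) / (1 + lambda * mu), lambda > 0."""
    mu = np.asarray(mu, float)
    return (1.0 - mu) / (1.0 + lam * mu)

# Hesitation degree of the resulting intuitionistic fuzzy set, Eq. (9):
# pi = 1 - mu - N(mu)
```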

3. Proposed Approach

3.1 The Generation of Intuitionistic Fuzzy Set

 For practical applications, the non-membership function of the intuitionistic fuzzy set plays an important role in the final results. Generally speaking, different problems should use different non-membership functions according to the specific circumstances to obtain good results. In this paper, a new method of generating the non-membership function is proposed to improve on the method in [23]; it not only considers the gray characteristics of pixels, but also enhances the uncertainty of the intuitionistic fuzzy set.

 For a gray image with pixel set \(X=\left\{x_{1}, x_{2}, \dots, x_{N}\right\}\), if we divide the pixels into C categories, its intuitionistic fuzzy set can be expressed as:

 \(A=\left\{\mu_{i j}\left(x_{i}\right), v_{i j}\left(x_{i}\right), \pi_{i j}\left(x_{i}\right) | x_{i} \in X\right\}, 1 \leq i \leq N, 1 \leq j \leq C\)       (12)

 We propose a new method of generating non-membership degree:

\(N\left(\mu\left(x_{i}\right)\right)=\frac{1-\mu_{i j}\left(x_{i}\right)}{1+\mu_{i j}\left(x_{i}\right)}\left(\exp \left(\frac{-\mu_{i j}\left(x_{i}\right)}{\alpha \cdot \sigma}\right)\right)^{\frac{1}{\alpha}}\)       (13)

 Where \(\mu_{i j}\left(x_{i}\right)\) is the membership function related to the gray value; \(\alpha\) is a non-membership constant, which was verified in [18] to give the best performance when \(\alpha=5\); \(\sigma\) is the standard deviation of the membership function \(\mu_{i j}\left(x_{i}\right)\), which is usually between 0.37 and 0.38.

 Compared with the function in [23], the classical Yager’s function [20, 21] and Sugeno’s function [22], the non-membership function proposed in this paper performs noticeably better. On the one hand, the multiplier \(\frac{1-\mu_{i j}\left(x_{i}\right)}{1+\mu_{i j}\left(x_{i}\right)}\) guarantees that the value of the non-membership function lies between 0 and 1, which satisfies the constraints. On the other hand, when the membership degree is close to 0 or 1, the certainty increases sharply, that is, the hesitation degree becomes very small; when the membership degree is close to 0.5, the value of the non-membership is small, that is, the uncertainty increases and the hesitation degree is large. Different from existing non-membership functions, it puts more emphasis on the uncertainty and hesitation degree of the data, which is more suitable for dealing with noise and edges in images. Therefore, the proposed function not only considers the gray features of images, but also enhances the effect of uncertainty in the intuitionistic fuzzy set, leading to better results.

 Then the intuitionistic fuzzy set of image constructed by this method can be expressed as 

\(\left\{\mu_{i j}\left(x_{i}\right), \frac{1-\mu_{i j}\left(x_{i}\right)}{1+\mu_{i j}\left(x_{i}\right)}\left(\exp \left(\frac{-\mu_{i j}\left(x_{i}\right)}{\alpha \cdot \sigma}\right)\right)^{\frac{1}{\alpha}}, 1-\mu_{i j}\left(x_{i}\right)-\frac{1-\mu_{i j}\left(x_{i}\right)}{1+\mu_{i j}\left(x_{i}\right)}\left(\exp \left(\frac{-\mu_{i j}\left(x_{i}\right)}{\alpha \cdot \sigma}\right)\right)^{\frac{1}{\alpha}} | x_{i} \in X\right\}\).
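 For clarity, a minimal sketch of the proposed non-membership function of Eq. (13) and the corresponding hesitation degree is given below; estimating sigma as the standard deviation of the passed membership values is an assumption based on the description above.

```python
import numpy as np

def proposed_nonmembership(mu, alpha=5.0, sigma=None):
    """Non-membership function of Eq. (13); alpha = 5 follows the text, and sigma
    (typically 0.37-0.38) is the standard deviation of the membership values."""
    mu = np.asarray(mu, dtype=float)
    if sigma is None:
        sigma = float(mu.std())
    return (1.0 - mu) / (1.0 + mu) * np.exp(-mu / (alpha * sigma)) ** (1.0 / alpha)

def hesitation(mu, nu):
    """Hesitation degree of Eq. (9): pi = 1 - mu - nu."""
    return 1.0 - np.asarray(mu, float) - np.asarray(nu, float)
```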

3.2 Determination of Initial Clustering Center

 In FCM algorithm, the initial clustering center plays a significant role in the performance of the algorithm and the final clustering results. The clustering results fluctuate with different initializations. Inappropriate initial clustering centers are likely to make the solution of the objective function fall into the local minimum, resulting in erroneous results. In view of this, many scholars have proposed different methods to optimize the initial clustering center, such as K-means algorithm, ant colony algorithm, and the logical model considering the correlation between samples [24-26]. Since the algorithm proposed in this paper is used for image segmentation, a method of determining initial cluster centers based on pixel features of grayscale images is proposed.

 For an image with pixel set \(X=\left\{x_{1}, x_{2}, \ldots, x_{N}\right\}\) and L gray levels \(G=\left\{\xi_{1}, \xi_{2}, \ldots, \xi_{L}\right\}\), suppose the number of categories to be clustered is C. Firstly, the gray histogram of the image is drawn, and the counts \(\left\{n_{1}, n_{2}, \dots, n_{L}\right\}\) of pixels at each gray level \(\xi_{i}(i=1,2, \dots, L)\) are calculated from the histogram. The gray level \(\xi_{\max }\) corresponding to the largest count \(n_{\max }=\max \left\{n_{1}, n_{2}, \ldots, n_{L}\right\}\) is selected as the first initial clustering center. Next, we propose the concept of the standard distance of the pixel set:

\(D_{S}=\frac{1}{\sum_{p=1}^{L}\left(n_{p} \cdot \xi_{p}\right)} \sum_{p=1}^{L} \sum_{q=1}^{L} n_{p} \cdot n_{q} \cdot\left(\xi_{p}-\xi_{q}\right)^{2} \cdot d\left(\xi_{p}, \xi_{q}\right)\)       (14)

 Where \(d\left(\xi_{p}, \xi_{q}\right)\) is the average Euclidean distance between the pixels with gray level \(\xi_{p}\) and the pixels with gray level \(\xi_{q}\). Suppose there are \(n_p\) pixels with gray level \(\xi_{p}\) and \(n_q\) pixels with gray level \(\xi_{q}\); for each of the \(n_p\) pixels, we calculate its Euclidean distance from each of the \(n_q\) pixels. There are \(n_p \cdot n_q\) distances in total, and \(d(\xi_{p}, \xi_{q})\) is their average. We define \(D_s\) as the minimum threshold distance between two cluster centers. Let c denote the number of initial cluster centers that have already been found; when c = 1, we define the set of non-clustering centers as:

\(G_{1}=\left\{\xi_{k} \mid\left|\xi_{k}-\xi_{\max }\right|<D_{s}, k=1,2, \ldots, L\right\}\)       (15)

 It denotes the set of gray levels whose distance from the first initial clustering center is less than the standard distance \(D_s\); that is, these gray levels cannot become clustering centers. Thus we update the gray-level set G by removing the non-clustering center set G1 from it:

\(G=G-G_{1}\)       (16)

 Then we look for the next initial clustering center in the updated gray-level set G. When a clustering center has been found, if c < C, let c = c + 1, and the gray level with the largest pixel count in the updated set is selected as the next initial clustering center. We then define the non-clustering center set as \(G_{c}=\left\{\xi_{k} \mid\left|\xi_{k}-\xi_{\max }\right|<D_{s}, k=1,2, \ldots, L, \xi_{k} \notin G_{r}, r=1,2, \ldots, c-1\right\}\), and so on; the gray-level set G is continually updated until all C initial clustering centers are found.

 The clustering centers should have some representative features, and the distance between different cluster centers should be as large as possible to avoid local minima. The proposed method utilizes the gray levels with the largest pixel counts in the histogram to determine the initial clustering centers, taking into account the gray features of the pixels; the introduction of the standard distance keeps the initial clustering centers as dispersed as possible, avoiding local minima and helping the algorithm converge well.
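 A minimal sketch of this greedy histogram-based selection is shown below. The standard distance Ds of Eq. (14) is taken as a precomputed input, since its exact evaluation requires the average pairwise spatial distance between every pair of gray levels; restricting candidates to gray levels that actually occur in the image is an assumption.

```python
import numpy as np

def initial_centers(img, C, Ds, L=256):
    """Select C initial cluster centers from the gray-level histogram (Section 3.2)."""
    hist = np.bincount(img.ravel().astype(int), minlength=L).astype(float)
    levels = np.arange(L, dtype=float)
    available = hist > 0                    # candidate gray levels present in the image
    centers = []
    for _ in range(C):
        idx = int(np.argmax(np.where(available, hist, -1.0)))   # largest remaining pixel count
        centers.append(levels[idx])
        # discard gray levels closer than Ds to the new center, Eqs. (15)-(16)
        available &= np.abs(levels - levels[idx]) >= Ds
        if not available.any():
            break
    return np.array(centers)
```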

3.3 Kernel Space Distance Metric

 The IFCM algorithm achieves a good clustering effect for linearly separable problems, and the Euclidean distance it adopts is easy to calculate and implement [27]. However, many problems in reality are linearly inseparable, and some non-convex data are difficult to handle [28]. In this case, the Euclidean intuitionistic fuzzy distance alone is not sufficient to measure the distance between the data and the cluster centers.

 In consideration of this problem, we propose an improved distance measure based on kernel space. The original data are mapped into a high-dimensional feature space by a kernel function, which makes the data separable; the distance metric is then calculated in this high-dimensional kernel space.

 Assume that the original data set is \(X=\left\{x_{1}, x_{2}, \ldots, x_{N}\right\} \subseteq R^{O}\), it is mapped to the high-dimensional feature space by a nonlinear mapping \(\phi\), and the new data set \(\phi(X)=\left\{\phi\left(x_{1}\right), \phi\left(x_{2}\right), \ldots, \phi\left(x_{N}\right)\right\} \subseteq R^{T}\) is acquired. Then the distance metric based on kernel space can be expressed as:

\(d_{i j}^{2}\left(x_{i}, v_{j}\right)=\left\|\phi\left(x_{i}\right)-\phi\left(v_{j}\right)\right\|^{2}\)       (17)

 The Gaussian kernel is one of the most widely used kernel functions. However, the tails of the standard Gaussian kernel become infinitesimally small. In the inner product operation of the kernel function, the data at the tails contribute almost nothing, which is equivalent to being cut off. In this regard, we make two improvements: on the one hand, moderate attenuation should be maintained at distant points; on the other hand, fast attenuation should be achieved near the current test point. Thus we construct the following Gaussian-based kernel function to map the data into a high-dimensional kernel space:

\(K\left(x_{i}, v_{j}\right)=\exp \left(\frac{\sigma_{B}^{2}}{\left\|v_{j}-x_{i}\right\|^{2}+\lambda}\right)\)       (18)

 Where \(\sigma_{B}\) denotes the bandwidth of the kernel function and \(\lambda\) is a displacement parameter, which controls the height and attenuation of the function curve. Experiments show the best effect when \(\sigma_{B}\) is 0.3 and \(\lambda\) is between 0.2 and 0.4. Then, by utilizing the kernel property \(\|\phi(a)-\phi(b)\|^{2}=K(a, a)-2 K(a, b)+K(b, b)\) and substituting it into Eq. (17), we obtain the distance measure between a data point and a clustering center: \(d_{i j}^{2}\left(x_{i}, v_{j}\right)=\left\|\phi\left(x_{i}\right)-\phi\left(v_{j}\right)\right\|^{2}=K\left(x_{i}, x_{i}\right)-2 K\left(x_{i}, v_{j}\right)+K\left(v_{j}, v_{j}\right)\).
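 The kernel of Eq. (18) and the induced distance can be sketched as follows; the default parameter values simply follow the ranges reported above.

```python
import numpy as np

def kernel(x, v, sigma_b=0.3, lam=0.3):
    """Improved Gaussian-based kernel of Eq. (18)."""
    x, v = np.asarray(x, float), np.asarray(v, float)
    return np.exp(sigma_b ** 2 / (np.sum((v - x) ** 2) + lam))

def kernel_distance2(x, v, sigma_b=0.3, lam=0.3):
    """Squared kernel-space distance of Eq. (17): K(x,x) - 2K(x,v) + K(v,v)."""
    return (kernel(x, x, sigma_b, lam) - 2.0 * kernel(x, v, sigma_b, lam)
            + kernel(v, v, sigma_b, lam))
```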

3.4 Local Spatial-Gray Information Measure

 The clustering effect of the standard FCM and IFCM algorithms is usually determined only by the membership degree in the objective function, without considering the characteristics of the data or the local spatial information, resulting in low clustering accuracy and susceptibility to noise. In order to improve the robustness of the clustering results, a novel local spatial-gray information measure is introduced into the objective function in this paper, which considers the intuitionistic fuzzy set, the spatial location information and the local gray features at the same time.

 For an image with pixel set \(X=\left\{x_{1}, x_{2}, \dots, x_{N}\right\}\), we define the similarity measure between pixels and cluster centers as:

\(S_{i j}=\exp \left(-\frac{\max \left(\left|a_{i}-a_{j}\right|,\left|b_{i}-b_{j}\right|\right)}{\lambda_{s}}-\frac{\left\|g\left(x_{i}\right)-g\left(v_{j}\right)\right\|^{2}}{\lambda_{g}}\right)\)       (19)

 Where \(\left(a_{i}, b_{i}\right)\) and \(\left(a_{j}, b_{j}\right)\) denote the two-dimensional spatial coordinates of the pixel and the cluster center, g(xi) and g(vj) are the gray values of the pixel and the cluster center, and \(\lambda_{s}\) and \(\lambda_{g}\) are the scale parameters of the local spatial information and the gray information, which control the proportion of the different kinds of information.

 Define the gray difference of cluster center as:

\(\operatorname{diff}_{j}=\frac{\sum_{x_{i} \in N_{r}}\left\|g\left(x_{i}\right)-g\left(v_{j}\right)\right\|}{N u m}\)       (20)

 Where Nr is the neighborhood window centered on the clustering center, g(xi) and g(vj) are the gray values of the pixel and the cluster center respectively, Num is the number of neighborhood pixels. diffj represents the gray feature of the pixels around the cluster center.

 Next, we construct the local spatial-gray scale factor based on the gray difference and the similarity measure:

\(M_{i j}=\frac{S_{i j}}{\operatorname{diff}_{j}+d i s_{i j}+1}\left(1-\mu_{i j}\left(x_{i}\right)\right)^{m}\left\|g\left(x_{i}\right)-g\left(v_{j}\right)\right\|^{2}\)       (21)

 Where Sij denotes the similarity measure, diffj denotes the gray difference, disij denotes the Euclidean distance between the pixel and the clustering center, \(\mu_{i j}\left(x_{i}\right)\) is the membership degree of the pixel, m is a fuzzy constant, g(xi) and g(vj) are the gray values of the pixel and the cluster center. The local spatial-gray scale factor takes into account the degree of membership, the gray features of the pixels and the spatial position characteristics at the same time. In addition, since membership degree is closely related to non-membership and hesitation degree, the measure Mij can consider both certainty and uncertainty of intuitionistic fuzzy set, which effectively improves the clustering accuracy of the algorithm.
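 A minimal sketch of these three quantities is given below; associating a spatial position and a gray value with each cluster center, and excluding the center pixel from the 3*3 window (so that Num = 8), are assumptions consistent with the experimental setting in Section 4.

```python
import numpy as np

def similarity(pos_i, pos_j, g_i, g_j, lam_s=2.0, lam_g=2.0):
    """Similarity measure S_ij of Eq. (19) between a pixel and a cluster center."""
    spatial = max(abs(pos_i[0] - pos_j[0]), abs(pos_i[1] - pos_j[1]))
    return np.exp(-spatial / lam_s - (g_i - g_j) ** 2 / lam_g)

def gray_difference(img, center_pos, center_gray, r=1):
    """diff_j of Eq. (20): mean absolute gray difference in the window around the center."""
    a, b = center_pos
    win = img[max(a - r, 0):a + r + 1, max(b - r, 0):b + r + 1].astype(float)
    vals = np.abs(win - center_gray)
    # exclude the center pixel itself so that a 3x3 window gives Num = 8
    return (vals.sum() - abs(float(img[a, b]) - center_gray)) / (vals.size - 1)

def local_measure(S_ij, diff_j, dis_ij, mu_ij, g_i, g_j, m=2):
    """Local spatial-gray factor M_ij of Eq. (21)."""
    return S_ij / (diff_j + dis_ij + 1.0) * (1.0 - mu_ij) ** m * (g_i - g_j) ** 2
```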

3.5 Improved Intuitionistic Fuzzy Entropy

 In order to maximize the correct points in the cluster and yield a better clustering effect, we then introduce intuitionistic fuzzy entropy in the objective function. The uncertainty of intuitionistic fuzzy set should be reflected in two aspects of fuzziness and intuition, in which fuzziness is determined by the degree of difference between membership and non-membership, and intuition is determined by hesitation degree. The intuitionistic fuzzy entropy proposed by Burillo et al. [29] does not have compatibility with fuzzy entropy and can only measure part of the uncertainty information of intuitionistic fuzzy set. In recent years, scholars have put forward many new methods of intuitionistic fuzzy entropy measurement, but generally speaking, they still have some defects: (1) The intuitionistic fuzzy entropy is determined just by the difference between the degree of membership and non-membership, while the effect of hesitation degree is ignored; (2) When the membership degree is equal to the degree of the non-membership, that is, when the difference is 0, the entropy value is always equal to 1, which is independent of the degree of hesitation [30].

 In response to the above problems, we propose a new formula of intuitionistic fuzzy entropy, which considers the difference between membership and non-membership degree and the effect of hesitation degree at the same time. For the discourse domain \(X=\left\{x_{1}, x_{2}, \ldots, x_{N}\right\}\) and the intuitionistic fuzzy set \(A=\left\{\mu_{i j}\left(x_{i}\right), \omega_{i j}\left(x_{i}\right), \pi_{i j}\left(x_{i}\right) | x_{i} \in X\right\}\), the formula of intuitionistic fuzzy entropy can be written as:

\(I F E(A)=\frac{1}{N} \sum_{i=1}^{N} \sin \left(\frac{\pi}{2} \cdot \frac{1-\left(\mu_{i j}\left(x_{i}\right)-\omega_{i j}\left(x_{i}\right)\right)^{2}+2 \pi_{i j}^{2}\left(x_{i}\right)}{2-\left(\mu_{i j}\left(x_{i}\right)-\omega_{i j}\left(x_{i}\right)\right)^{2}+\pi_{i j}^{2}\left(x_{i}\right)}\right)\)       (22)

 When \(\mu_{i j}\left(x_{i}\right)=1, \omega_{i j}\left(x_{i}\right)=0\) or \(\mu_{i j}\left(x_{i}\right)=0, \omega_{i j}\left(x_{i}\right)=1\), we find \(\pi_{i j}\left(x_{i}\right)=0\) and IFE(A) = 0, and the intuitionistic fuzzy set A degenerates into a fuzzy set; when \(\mu_{i j}\left(x_{i}\right)=\omega_{i j}\left(x_{i}\right)\), \(I F E(A)=\frac{1}{N} \sum_{i=1}^{N} \sin \left(\frac{\pi}{2} \cdot \frac{1+2 \pi_{i j}^{2}\left(x_{i}\right)}{2+\pi_{i j}^{2}\left(x_{i}\right)}\right)\); it is obvious that the greater the value of \(\pi_{i j}\left(x_{i}\right)\), the greater the value of IFE(A).

 This proposed method of measuring intuitionistic fuzzy entropy is more reasonable and embodies the fuzziness and intuition of intuitionistic fuzzy set, which can better determine and utilize the uncertainty of intuitionistic fuzzy set. Applying it to IFCM algorithm can yield a more accurate clustering result.
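 The proposed entropy of Eq. (22) can be sketched as follows; the limiting cases noted above serve as simple sanity checks.

```python
import numpy as np

def ife(mu, nu):
    """Improved intuitionistic fuzzy entropy of Eq. (22), averaged over the data."""
    mu, nu = np.asarray(mu, float), np.asarray(nu, float)
    hes = 1.0 - mu - nu                                   # hesitation degree, Eq. (9)
    num = 1.0 - (mu - nu) ** 2 + 2.0 * hes ** 2
    den = 2.0 - (mu - nu) ** 2 + hes ** 2
    return float(np.mean(np.sin(np.pi / 2.0 * num / den)))

# Sanity checks from the text: ife(1, 0) == 0 and ife(0, 1) == 0;
# when mu == nu, the entropy grows with the hesitation degree.
```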

3.6 Proposed Objective Function

 In our algorithm, the Euclidean intuitionistic fuzzy distance of standard IFCM is replaced by the kernel space distance metric, and the local spatial-gray information measure and the improved intuitionistic fuzzy entropy proposed in this paper are introduced; the novel image segmentation method based on the improved IFCM algorithm is thus obtained, whose objective function is:

\(\left\{\begin{aligned} \min J(U, V, A)=\sum_{i=1}^{N} \sum_{j=1}^{C} \mu_{i j}^{m} d_{i j}^{2}\left(x_{i}, v_{j}\right) &+\sum_{i=1}^{N} \sum_{j=1}^{C} M_{i j}+I F E(A) \\ \text { subject to } \sum_{j=1}^{C} \mu_{i j}=1 \end{aligned}\right.\)       (23)

 Where N is the number of image pixels, C is the number of clusters, \(\mu_{i j}\) is the intuitionistic fuzzy membership function, m is an intuitionistic fuzzy constant, \(d_{i j}^{2}\) is the kernel space distance, \(M_{i j}\) is the local spatial-gray information measure, and IFE(A) is the intuitionistic fuzzy entropy.

 To solve the above minimization problem, we introduce the Lagrange function, then the objective function can be expressed as:

\(E=\sum_{i=1}^{N} \sum_{j=1}^{C} \mu_{i j}^{m} d_{i j}^{2}\left(x_{i}, v_{j}\right)+\sum_{i=1}^{N} \sum_{j=1}^{C} M_{i j}+I F E(A)-\sum_{i=1}^{N} l_{i}\left(\sum_{j=1}^{C} \mu_{i j}-1\right)\)       (24)

 Where \(l_i\) is the Lagrange multiplier. For Eq. (24), the partial derivatives of E with respect to \(\mu_{i j}\), \(v_{j}\) and \(l_i\) are calculated respectively and set to zero, \(\frac{\partial E}{\partial \mu_{i j}}=\frac{\partial E}{\partial v_{j}}=\frac{\partial E}{\partial l_{i}}=0\); then the iterative formulas of the membership function and the cluster centers can be obtained. The iteration stop conditions are the same as FCM's, that is, the maximum number of iterations M and the accuracy of the objective function e.

 An overview of our proposed method is summarized in Algorithm 2.

Algorithm 2

 

4. Experiments

 Different types of images have different pixel characteristics, and there are large differences between them. In order to validate the effectiveness of the proposed algorithm and verify its performance on different images, experiments are carried out on a simple square image, the publicly available MRI brain images [31] and the BSDS500 dataset [32]. Besides the proposed method, five representative fuzzy-based algorithms, namely FCM, IFCM, KIFCM, IFCM-S and IIFCM, are evaluated as well. To obtain a quantitative comparison of segmentation results, we adopt the same evaluation measures as [19], in terms of the similarity index \((\rho)\), false negative ratio \(\left(r_{f n}\right)\) and false positive ratio \(\left(r_{f p}\right)\). The different algorithms are implemented with MATLAB 2015a on a machine equipped with a 3.5 GHz core and 8 GB memory, without any parallel framework.

4.1 Simple Square Image

 To demonstrate the performance on different types of images, we first evaluate the above six algorithms on a square image with a simple structure of size 256*256. The synthetic square image is composed of four classes with different gray values, which are 7, 78, 214 and 251 respectively (as shown in Fig. 2(a)). The classes in the image are separated into small patches of different sizes. For simplicity, we call them C1 (gray value 7), C2 (gray value 78), C3 (gray value 214) and C4 (gray value 251). The ground truth of the square image is shown in Fig. 2(c), which is segmented into four parts corresponding to the four classes. To further evaluate the robustness to noise of the different algorithms, the polluted square image (as shown in Fig. 2(b)) with salt and pepper 1% noise is also processed by the above six algorithms. Our experimental setup is as follows: the intuitionistic fuzzy constant m is 2, the non-membership constant α is 5, the iterative stopping condition is e = 0.00001, the maximum number of iterations is M = 1000, the bandwidth \(\sigma_{B}\) and displacement parameter \(\lambda\) of the kernel function are 0.3 and 0.2 respectively, the local spatial-gray information scale parameters \(\lambda_{s}\) and \(\lambda_{g}\) are both 2, the window size is 3*3 (Num = 8) and the number of clusters C is 4, corresponding to the above classes. The segmentation result of the proposed method is shown in Fig. 2(d), while Fig. 2(e)-(i) show the square image segmented by FCM [5], IFCM [15], KIFCM [16], IFCMS [18] and IIFCM [19] respectively with their optimal parameter settings. Their quantitative comparisons are shown in Table 1, Table 2 and Table 3, and the best two results are shown in red and green.

 It is obvious from the segmentation results that all six algorithms perform well on this simply structured image without much difference. However, from the perspective of robustness, there are some differences. To evaluate the different algorithms further, square images with salt and pepper 5% noise, Poisson noise and Gaussian 1% noise are segmented by the above six algorithms. Their results are measured in terms of \(\rho, r_{f n}\) and \(r_{f p}\), which are shown in Table 1, Table 2 and Table 3 respectively. We can find from Table 1 that the proposed algorithm has the best similarity index \(\rho\) among all six algorithms except for the Poisson noise. As shown in Table 2 and Table 3, the false negative ratio \(r_{f n}\) and false positive ratio \(r_{f p}\) values of the proposed algorithm and IIFCM are significantly better than those of the others under various noise, while the proposed method consistently outperforms IIFCM by a small margin. It can also be noted from Table 1, Table 2 and Table 3 that the different algorithms achieve their best values in terms of \(\rho, r_{f n}\) and \(r_{f p}\) with salt and pepper 1% and Poisson noise. On the whole, our algorithm achieves excellent segmentation results and the best robustness to noise.

 

Fig. 2. Square image and segmented results of different algorithms: (a)square image (b)square image with salt and pepper 1% noise, (c)ground-truth image, (d)proposed algorithm, (e)FCM algorithm, (f)IFCM algorithm, (g)KIFCM algorithm, (h)IFCM-S algorithm, (i)IIFCM algorithm.

Table 1. The similarity index (\(\rho\)) on segmented square images with different noise

 

Table 2. The false negative ratio (\(r_{f n}\)) on segmented square images with different noise

 

Table 3. The false positive ratio (\(r_{f p}\)) on segmented square images with different noise

 

4.2 Simulated Brain Database

 The human brain has a complex structure. The noise and the ambiguity of boundaries between different tissues make it difficult to segment MRI brain images. Three main tissues of the brain should be accurately segmented: cerebrospinal fluid (CSF), gray matter (GM) and white matter (WM) [33]. In this section, tests are performed on MRI brain images to further evaluate the segmentation performance of the different algorithms. Simulated MRI brain images for testing and the corresponding ground-truth images can be acquired from BrainWeb [31], a publicly available Simulated Brain Database (SBD). We explore how the different methods perform on the brain image segmentation task under the same experimental setting as the test on the square image. Note that the number of clusters is C = 4, corresponding to GM, WM, CSF and background respectively.

 Fig. 3(a) shows a simulated MRI brain image of size 217*181, which is a T1-weighted image of slice thickness 1 mm, with 1% noise and 0 intensity non-uniformity (INU). Following the strategy outlined in [34, 35], the image without non-brain tissues (as shown in Fig. 3(b)) is obtained for segmentation. The ground-truth segmented images of CSF, GM and WM (background is not considered) are shown in Fig. 3(c). We then provide the MRI segmentation results of the different algorithms in Fig. 3(d)-(i), corresponding to the proposed algorithm, FCM, IFCM, KIFCM, IFCM-S and IIFCM respectively. The different algorithms are implemented with their best-optimized parameters. As described above, the measurement metrics \(\rho, r_{f n}\) and \(r_{f p}\) are used to evaluate their performance quantitatively for GM and WM, which is shown in Table 4 and Table 5. We can find from the evaluation results that the proposed algorithm achieves the best values in terms of all three measures for both GM and WM, indicating that the segmentation results of the proposed algorithm have the structure most similar to the ground-truth images, with the lowest probability of error.

 

Fig. 3. MRI brain image and segmented results of different algorithms: (a)MRI brain image of slice thickness 1 mm, with 1% noise and INU=0, (b)image without non-brain tissues, (c)ground-truth image, (d)proposed algorithm, (e)FCM algorithm, (f)IFCM algorithm, (g)KIFCM algorithm, (h)IFCM-S algorithm, (i)IIFCM algorithm.

Table 4. Measurement metrics (\(\rho\), \(r_{f n}\), \(r_{f p}\)) for GM segmented images by different algorithms with different noise levels and INU

 

Table 5. Measurement metrics (\(\rho\), \(r_{f n}\), \(r_{f p}\)) for WM segmented images by different algorithms with different noise levels and INU

 

 In order to obtain a more explicit comparative analysis, MRI brain images with different noise levels (0%, 1% and 5%) and intensity non-uniformity (INU=0 and INU=20) are processed by all six algorithms. Observing the results in Table 4 and Table 5, as expected, almost all the evaluation results of our algorithm achieve the best values in terms of the above three measures.

4.3 BSDS500 Dataset

 After carrying out experiments on the Simulated Brain Database, we evaluate the different algorithms on the Berkeley Segmentation Dataset and Benchmark (BSDS500) [32]. BSDS500 is a widely used dataset of natural images for segmentation, which is composed of 200 training, 100 validation and 200 test images, and each image has ground-truth segmentations manually labeled by annotators. To demonstrate the effectiveness of our segmentation algorithm, we show the segmentation results of our approach on several samples of the BSDS500 benchmark with optimal parameter settings, as shown in Fig. 4.

 

Fig. 4. Some segmentation results of our algorithm on BSDS500

 To further evaluate the performance of different algorithms, from the perspective of the essence of clustering, images in BSDS500 are segmented and their results are measured by some clustering evaluation indexes. The experimental setting remains the same as above experiments. Below we first introduce some cluster validity measures.

 The partition coefficient Fc and partition entropy Hc are representative functions for evaluating the performance of fuzzy clustering, with 0 ≤ Fc ≤ 1 and 0 ≤ Hc ≤ 1. The greater the partition coefficient, or the smaller the partition entropy, the higher the clustering accuracy and the better the clustering effect. The Purity index evaluates the clustering performance by calculating the ratio of the number of correctly clustered data to the total data. It is easy to calculate, with a value between 0 and 1: a completely wrong clustering gives 0, and an entirely correct one gives 1. In addition, from the aspects of within-class compactness and between-class separation, we measure the Davies-Bouldin Index (DB), which is calculated by dividing the sum of the average within-class distances of any two classes by the distance between their cluster centers. The smaller the DB, the smaller the intra-class distance, the greater the distance between classes, and the better the clustering effect. The Dunn Validity Index (DVI) is another index measuring inter-class distance and intra-class distance. As the intra-class distance becomes smaller, or the distance between classes becomes larger, the clustering effect is better and the value of DVI increases.
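 Since the paper does not spell out the formulas, the following sketch uses the standard definitions of the partition coefficient, partition entropy and Purity (an assumption); DB and DVI are omitted for brevity.

```python
import numpy as np

def partition_coefficient(U):
    """Fc = (1/N) * sum_ij U_ij^2 (standard definition)."""
    return float(np.mean(np.sum(U ** 2, axis=1)))

def partition_entropy(U, eps=1e-12):
    """Hc = -(1/N) * sum_ij U_ij * log(U_ij) (standard definition)."""
    return float(-np.mean(np.sum(U * np.log(U + eps), axis=1)))

def purity(pred_labels, true_labels):
    """Fraction of pixels whose cluster's majority ground-truth label matches their own."""
    pred = np.asarray(pred_labels)
    true = np.asarray(true_labels).astype(int)
    correct = 0
    for c in np.unique(pred):
        members = true[pred == c]
        correct += np.bincount(members).max()   # size of the majority class in cluster c
    return correct / len(true)
```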

 Without loss of generality, we randomly select 200 images from the BSDS500 dataset. To synthetically measure the effectiveness of the different algorithms, the 200 selected images are processed by all six algorithms. The final results based on the five measuring metrics are obtained by averaging over the 200 images, as shown in Table 6. Running time is also an important criterion for evaluating the performance of algorithms, so we also give the average running time of the different algorithms in Table 6. Besides, to evaluate the robustness to image noise of the different algorithms on BSDS500, following the strategy in [8], we add noise with different noise levels to the 200 selected images and measure the average clustering accuracy, which is given in Table 7.

 It can be seen from the experimental data that although our algorithm runs for a relatively longer time, the results of the different evaluation indexes of the proposed algorithm in Table 6 and Table 7 are clearly better than those of the other algorithms. That is to say, applying the algorithm to image segmentation can more accurately assign a pixel to the corresponding class and obtain a better clustering effect without losing real-time capability. Besides, as we can see from the segmentation results in Fig. 4, our algorithm achieves great segmentation performance on BSDS500 images, especially for large and distinctive objects. The main reasons are: firstly, the method of determining the initial clustering centers takes into account the distribution of pixels and gray-scale features, which gives the algorithm an advantage over other algorithms in the initial stage; secondly, the hesitation degree generated by our modified non-membership function captures more uncertainty and is more applicable to image segmentation; thirdly, the kernel space distance metric makes linearly inseparable data separable in high-dimensional space, so the classification ability is improved effectively; fourthly, the local spatial-gray information measure in the objective function improves accuracy and robustness to noise, since it considers both the gray relationship between pixels and their spatial position relationship; finally, the improved intuitionistic fuzzy entropy further highlights the effect of uncertainty, embodying fuzziness and intuition and making the segmentation results more accurate. Therefore, our algorithm can not only segment images well, but it can also deal effectively with uncertainty such as noise and cluster boundaries. The accompanying drawback is a higher computational and time cost. Even so, achieving a significant performance improvement at a small cost in running time is a rather satisfactory result.

Table 6. The segmentation benchmarks on the BSDS

 

Table 7. The clustering accuracy on the BSDS with different noise level

 

 Next, we compare the clustering and segmentation performance of the different algorithms from the aspect of data and statistics. There are two main evaluation indicators: precision and recall. Since the two indicators are often contradictory, we choose the F1-measure to consider them simultaneously. The F1-measure is a measure of a test's accuracy and can be interpreted as the harmonic mean of precision and recall, represented as \(F1=\frac{2 \cdot P \cdot R}{P+R}\), where P is the precision and R is the recall; the F1 score reaches its best value at 1 and its worst at 0.
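 As a small illustration, the F1-measure can be computed directly from precision and recall:

```python
def f1_score(precision, recall):
    """F1 = 2PR / (P + R), the harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2.0 * precision * recall / (precision + recall)

# Example: f1_score(0.8, 0.6) = 0.96 / 1.4 ≈ 0.686
```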

 Image segmentation can be understood as a multi-classification problem, that is, classifying each pixel into one cluster or another. To obtain a further comprehensive and objective evaluation, we evaluate the different algorithms on the BSDS300 and BSDS500 benchmark datasets respectively, measuring their F1-measures. Table 8 shows the F1-measures of the six image segmentation algorithms when choosing an optimal scale for the entire dataset (ODS), the aggregate F1-measure for the best scale in each image (OIS), and the average precision (AP) on the full recall range (equivalently, the area under the precision-recall curve). The maximum F1-measures of these algorithms are shown in Fig. 5.

 

Fig. 5. Evaluation results of different algorithms on the Berkeley Segmentation Dataset: (a)BSDS300, (b)BSDS500

Table 8. The F1-measure benchmarks (ODS, OIS, AP) on the BSDS

 

 From Table 8 and the curves in Fig. 5 we can see that the algorithm proposed in this paper achieves better F1-measures across the different evaluations and datasets. Moreover, it performs significantly better than the other algorithms across almost the entire operating regime, which means it has the best classification precision and segmentation performance.

5. Conclusion

 In this paper, we propose a novel image segmentation algorithm based on an improved intuitionistic fuzzy set and C-means clustering algorithm. It overcomes the shortcomings of traditional FCM, IFCM and some existing algorithms, combining kernel space distance, local spatial information, local gray information and an improved intuitionistic fuzzy entropy in the objective function. The initial clustering centers are determined based on the gray features and spatial locations of pixels, so that the algorithm performs excellently in handling noise and uncertainty in images. Besides, the intuitionistic fuzzy set and the non-membership degree are generated by an improved method, which highlights the role of uncertainty effectively. We conduct experiments on a simple square image, MRI brain images and the BSDS500 dataset respectively, and different evaluation measures are adopted to evaluate the algorithm quantitatively. In addition, we also compare the proposed algorithm with five existing representative fuzzy-clustering-based segmentation algorithms. Experimental results demonstrate that the image segmentation performance, the pixel clustering effect and the robustness to noise of the proposed algorithm are all significantly better than those of the other algorithms.

 

References

  1. H. F. Sima, P. Guo, "Texture superpixels merging by color-texture histograms for color image segmentation," KSII Transactions on Internet and Information Systems, vol. 8, no. 7, pp. 2400-2419, July, 2014. https://doi.org/10.3837/tiis.2014.07.011
  2. Y. S. Jeong, C. G. Lim, B. S. Jeong, H. J. Choi, "Topic masks for image segmentation," KSII Transactions on Internet and Information Systems, vol. 7, no. 12, pp. 3274-3292, December, 2013. https://doi.org/10.3837/tiis.2013.12.018
  3. H. F. Sima, P. Guo, Y. F. Zou, Z. H. Wang and M. L. Xu, "Bottom-Up merging segmentation for color images with complex areas," IEEE Transactions on Systems Man & Cybernetics Systems, vol. 48, no. 3, pp. 354-365, March, 2018. https://doi.org/10.1109/TSMC.2016.2608831
  4. J. A. Hartigan and M. A. Wong, "Algorithm AS 136: A K-means clustering algorithm," Journal of the Royal Statistical Society, vol. 28, no. 1, pp. 100-108, 1979. https://doi.org/10.2307/2346830
  5. J. C. Bezdek, R. Ehrlich, and W. Full, "FCM: The fuzzy C-means clustering algorithm," Computers & Geosciences, vol. 10, no. 2, pp. 191-203, January, 1984. https://doi.org/10.1016/0098-3004(84)90020-7
  6. H. S. Park and C.H. Jun, "A simple and fast algorithm for K-medoids clustering," Expert Systems with Applications, vol. 36, no. 2, pp. 3336-3341, March, 2009. https://doi.org/10.1016/j.eswa.2008.01.039
  7. F. Tung, A. Wong, and D. A. Clausi, "Enabling scalable spectral clustering for image segmentation," Pattern Recognition, vol. 43, no. 12, pp. 4069-4076, December, 2010. https://doi.org/10.1016/j.patcog.2010.06.015
  8. F. Zhao, H. Q. Liu, J. L. Fan, C. W. Chen, R. Lan and N. Li, "Intuitionistic fuzzy set approach to multi-objective evolutionary clustering with multiple spatial information for image segmentation," Neurocomputing, vol. 312, pp. 296-309, October, 2018. https://doi.org/10.1016/j.neucom.2018.05.116
  9. S. Kim, D. Y. Chang, S. Nowozin, and P. Kohli, "Image segmentation using higher-order correlation clustering," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 36, no. 9, pp. 1761-1774, September, 2014. https://doi.org/10.1109/TPAMI.2014.2303095
  10. S. Dai, S. Zhan, and N. Song, "Adaptive active contour model: A localized mutual information approach for medical image segmentation," KSII Transactions on Internet and Information Systems, vol. 9, no. 5, pp. 1840-1855, May, 2015. https://doi.org/10.3837/tiis.2015.05.016
  11. K. Zhang, L. Zhang, K. M. Lam, and D. Zhang, "A level set approach to image segmentation with intensity inhomogeneity," IEEE Transactions on Cybernetics, vol. 46, no. 2, pp. 546-557, February, 2016. https://doi.org/10.1109/TCYB.2015.2409119
  12. Y. J. Peng and Y. R. Ma, "A level set method to image segmentation based on local direction gradient," KSII Transactions on Internet and Information Systems, vol. 12, no. 4, pp. 1760-1778, April, 2018. https://doi.org/10.3837/tiis.2018.04.020
  13. S. De, S. Bhattacharyya, S. Chakraborty, and P. Dutta, "Image segmentation: A review," Springer International Publishing, pp. 29-40, 2016.
  14. K. T. Atanassov, "Intuitionistic fuzzy sets," Fuzzy Sets & Systems, vol. 20, no. 1, pp. 87-96, August, 1986. https://doi.org/10.1016/S0165-0114(86)80034-3
  15. T. Chaira, "A novel intuitionistic fuzzy C means clustering algorithm and its application to medical images," Applied Soft Computing, vol. 11, no. 2, pp. 1711-1717, March, 2011. https://doi.org/10.1016/j.asoc.2010.05.005
  16. P. Kaur, A. K. Soni, and A. Gosain, "Retracted: A robust kernelized intuitionistic fuzzy c-means clustering algorithm in segmentation of noisy medical images," Pattern Recognition Letters, vol. 34, no. 2, pp. 163-175, January, 2013. https://doi.org/10.1016/j.patrec.2012.09.015
  17. M. D. Ansari, A. R. Mishra, and F. T. Ansari, "New divergence and entropy measures for intuitionistic fuzzy sets on edge detection," International Journal of Fuzzy Systems, vol. 20, no. 2, pp. 474-487, July, 2018. https://doi.org/10.1007/s40815-017-0348-4
  18. B. K. Tripathy, A. Basu, and S. Govel, "Image segmentation using spatial intuitionistic fuzzy C means clustering," in Proc. of the IEEE International Conference on Computational Intelligence and Computing Research, December, 2014.
  19. H. Verma, R. K. Agrawal, and A. Sharan, "An improved intuitionistic fuzzy C-means clustering algorithm incorporating local information for brain image segmentation," Applied Soft Computing, vol. 46, no. C, pp. 543-557, September, 2016. https://doi.org/10.1016/j.asoc.2015.12.022
  20. R. R. Yager, "On the measure of fuzziness and negation Part I: Membership in the unit interval," International Journal of General Systems, vol. 5, no. 4, pp. 221-229, January, 1979. https://doi.org/10.1080/03081077908547452
  21. R. R. Yager, "On the measure of fuzziness and negation. II. Lattices," Information & Control, vol. 44, no. 3, pp. 236-260, March, 1980. https://doi.org/10.1016/S0019-9958(80)90156-4
  22. M. Sugeno, "Fuzzy measures and fuzzy integrals-A survey," Readings in Fuzzy Sets for Intelligent Systems, vol. 6, no. 1, pp. 251-257, 1993. https://doi.org/10.1016/B978-1-4832-1450-4.50027-4
  23. Y. K. Dubey, M. M. Mushrif, and K. Mitra, "Segmentation of brain MR images using rough set based intuitionistic fuzzy clustering," Biocybernetics & Biomedical Engineering, vol. 36, no. 2, pp. 413-426, 2016. https://doi.org/10.1016/j.bbe.2016.01.001
  24. X. Chen, X. S. Wu, W. Wang, and H. Wang, "An improved initial cluster centers selection algorithm for K-means based on features correlative degree," Journal of Sichuan University, vol. 47, no. 1, pp. 13-19, 2015. https://doi.org/10.3969/j.issn.0490-6756.2010.01.003
  25. H. Jiang, G. Zhang, and J. Cai, "An improved ant colony clustering algorithm based on LF algorithm," in Proc. of the IEEE International Conference on E-Business Engineering, October, 2015.
  26. S. S. Khan and A. Ahmad, "Cluster center initialization algorithm for K-means clustering," Expert Systems with Applications, vol. 40, no. 18, pp. 7444-7456, December, 2013. https://doi.org/10.1016/j.eswa.2013.07.002
  27. T. Chaira and A. Panwar, "An Atanassov's intuitionistic fuzzy kernel clustering for medical image segmentation," International Journal of Computational Intelligence Systems, vol. 7, no. 2, pp. 360-370, November, 2014. https://doi.org/10.1080/18756891.2013.865830
  28. Y. M. Kim, K. T. Park, and Y. S. Moon, "Target segmentation in non-homogeneous infrared images using a PCA plane and an adaptive Gaussian kernel," KSII Transactions on Internet and Information Systems, vol. 9, no. 6, pp. 2302-2316, June, 2015. https://doi.org/10.3837/tiis.2015.06.019
  29. P. Burillo and H. Bustince, "Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets," Fuzzy Sets & Systems, vol. 78, no. 3, pp. 305-316, March, 1996. https://doi.org/10.1016/0165-0114(96)84611-2
  30. F. Y. Meng and X. H. Chen, "Entropy and similarity measure for Atannasov's interval-valued intuitionistic fuzzy sets and their application," Fuzzy Optimization & Decision Making, vol. 15, no. 1, pp. 75-101, April, 2015. https://doi.org/10.1007/s10700-015-9215-7
  31. C. A. Cocosco, V. Kollokian, R. K.-S. Kwan, and A. C. Evans, "BrainWeb: Online interface to a 3D MRI simulated brain database," NeuroImage, vol. 5, no. 4, 1997.
  32. P. Arbelaez, M. Maire, C. Fowlkes, and J. Malik, "Contour detection and hierarchical image segmentation," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 5, pp. 898-916, May, 2011. https://doi.org/10.1109/TPAMI.2010.161
  33. Z. Ji, Y. Xia, Q. Sun, Q. Chen, D. Xia, and D. D. Feng, "Fuzzy local Gaussian mixture model for brain MR image segmentation," IEEE Transactions on Information Technology in Biomedicine, vol. 16, no. 3, pp. 339-347, May, 2012. https://doi.org/10.1109/TITB.2012.2185852
  34. Brain Extraction Tool (BET). https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/BET.
  35. M. W. Woolrich, "Bayesian analysis of neuroimaging data in FSL," Neuroimage, vol. 45, no. 1, pp. S173-S186, March, 2009. https://doi.org/10.1016/j.neuroimage.2008.10.055