
Int. J. Appl. Comput. Math (2017) 3:3131–3145
DOI 10.1007/s40819-016-0286-0

ORIGINAL PAPER

A Clustering Algorithm Based on Intuitionistic Fuzzy Relations for Tree Structure Evaluation

Ismat Beg1 · Tabasam Rashid2

Published online: 18 December 2016


© Springer India Pvt. Ltd. 2016

Abstract Intuitionistic fuzzy relations are used to construct hierarchical structures for the
evaluation of vague, complicated humanistic systems. A novel algorithm is presented to develop
partition trees at different levels according to different intuitionistic fuzzy triangular norm
compositions. Examples are given to demonstrate the usefulness of the proposed algorithm.

Keywords Fuzzy cluster analysis · Intuitionistic fuzzy relation · Hierarchical evaluation structure

Introduction

Ordinary fuzzy set theory was introduced by Zadeh [36] to model vagueness in information. A
fuzzy set A in the universe X is a mapping from X to [0, 1]. Bellman et al. [5] and Ruspini
[20] initiated work on clustering with fuzzy sets. Nowadays fuzzy clustering is applied
and studied in many areas (Bezdek [6], Dave [7], Trauwaert et al. [26]). Atanassov [1]
gave the notion of intuitionistic fuzzy sets (IFS), which are an extension of Zadeh's fuzzy sets.
An IFS A can be defined as a mapping from X to L* = {(x1, x2) ∈ [0, 1]^2 | x1 + x2 ≤ 1},
where x1 and x2 are called the membership and non-membership values of x ∈ X, respectively.
The class of all IFSs in X is denoted IF(X). IFSs have proved to be a very suitable tool to
describe uncertain or imprecise information. In [34], the summation of any two elements
x = (x1, x2) and y = (y1, y2) of L* is defined as:

x + y = (x1, x2) + (y1, y2) = ((x1 + y1)/2, |x2 − y2|/2).

Corresponding author: Ismat Beg
ibeg@lahoreschool.edu.pk
Tabasam Rashid
tabasam.rashid@gmail.com
1 Centre for Mathematics and Statistical Sciences, Lahore School of Economics, Lahore, Pakistan
2 School of Science, University of Management and Technology, Lahore 54770, Pakistan


Here, we define the ordering ≤_{L*} on L* as follows:

• If x1 < y1, then (x1, x2) <_{L*} (y1, y2);
• If x1 = y1 and x2 > y2, then (x1, x2) <_{L*} (y1, y2).

(L*, ≤_{L*}) assumes the structure of a complete, bounded, totally ordered set with greatest
element 1_{L*} = (1, 0) and smallest element 0_{L*} = (0, 1). Equality =_{L*} is defined as
(x1, x2) =_{L*} (y1, y2) ⇔ x1 = y1 and x2 = y2. The sup and inf operations on this lattice
are derived from ≤_{L*} as:

sup((x1, x2), (y1, y2)) = (max(x1, y1), min(x2, y2))

and

inf((x1, x2), (y1, y2)) = (min(x1, y1), max(x2, y2)).
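These lattice operations are straightforward to code. The following is a minimal Python sketch (ours, not from the paper) of L* elements as pairs, the strict order <_{L*} stated above, and the componentwise sup and inf; all names are illustrative.

# Minimal sketch of L* = {(x1, x2) in [0,1]^2 : x1 + x2 <= 1}.
# An element is a (membership, non-membership) pair; names are ours.

def in_Lstar(x):
    """Check that x = (x1, x2) is a valid element of L*."""
    x1, x2 = x
    return 0.0 <= x1 <= 1.0 and 0.0 <= x2 <= 1.0 and x1 + x2 <= 1.0

def lt_Lstar(x, y):
    """Strict order <_{L*} as stated above: x1 < y1, or x1 = y1 and x2 > y2."""
    return x[0] < y[0] or (x[0] == y[0] and x[1] > y[1])

def sup_Lstar(x, y):
    """sup((x1,x2),(y1,y2)) = (max(x1,y1), min(x2,y2))."""
    return (max(x[0], y[0]), min(x[1], y[1]))

def inf_Lstar(x, y):
    """inf((x1,x2),(y1,y2)) = (min(x1,y1), max(x2,y2))."""
    return (min(x[0], y[0]), max(x[1], y[1]))

ONE_Lstar, ZERO_Lstar = (1.0, 0.0), (0.0, 1.0)     # greatest and smallest elements

print(lt_Lstar((0.5, 0.4), (0.5, 0.3)))            # True
print(sup_Lstar((0.3, 0.6), (0.5, 0.2)))           # (0.5, 0.2)
print(inf_Lstar((0.3, 0.6), (0.5, 0.2)))           # (0.3, 0.6)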
IFSs have been used in medical diagnosis [8,16,18]. Recently Beg and Rashid [3] developed
the concept of trapezoidal valued intuitionistic fuzzy sets (TVIFS) and used it for complex
decision making problems. Various methods for the evaluation of hierarchical structures are
available in the literature, such as hierarchy consistency analysis (HCA) [10–14] and the analytic
hierarchy process (AHP) [21–23]. Most of the time fuzzy clustering is based on a fuzzy relation
matrix, and fuzzy hierarchical clustering is useful in various real life applications. Max–min
transitive and similarity relations were proposed by Zadeh [37]. Since this max–min similarity
relation has a resolution form in terms of equivalence relations, a corresponding multi-level
hierarchical clustering structure can also be obtained. The principal objects of our study are the
binary operations on [0, 1]. The triangular norm (t-norm) and the triangular conorm (t-conorm)
are popular binary operations on [0, 1]. A t-norm on [0, 1] is any increasing,
commutative, associative [0, 1]^2 → [0, 1] mapping T that satisfies T(1, x) = x for all
x ∈ [0, 1]; a t-conorm on [0, 1] is any increasing, commutative, associative [0, 1]^2 → [0, 1]
mapping S that satisfies S(0, x) = x for all x ∈ [0, 1]. We refer to the book of Klement et
al. [17] for a detailed study of t-norms and t-conorms. To every t-norm T there corresponds
a t-conorm, called the dual t-conorm, defined by:

T*(x, y) = 1 − T(1 − x, 1 − y).

One of the most popular and important t-norms is the minimum operator M, with M(x, y) =
min(x, y); its corresponding dual t-conorm is the maximum operator M*, with
M*(x, y) = max(x, y).
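As a small illustration (ours, not from the paper), the dual construction can be coded directly; the function names below are illustrative.

# Sketch: a t-norm on [0, 1] and its dual t-conorm T*(x, y) = 1 - T(1 - x, 1 - y).

def t_min(x, y):
    """Minimum t-norm M(x, y) = min(x, y)."""
    return min(x, y)

def dual_conorm(T):
    """Return the dual t-conorm of a t-norm T."""
    return lambda x, y: 1.0 - T(1.0 - x, 1.0 - y)

s_max = dual_conorm(t_min)                  # M*(x, y) = max(x, y)
print(t_min(0.7, 0.4), s_max(0.7, 0.4))     # 0.4 0.7

# Another classical pair: the product t-norm and its dual (the probabilistic sum).
t_prod = lambda x, y: x * y
s_prob = dual_conorm(t_prod)                # x + y - x*y
print(round(s_prob(0.7, 0.4), 2))           # 0.82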
An n-step max–min composition of fuzzy relations was given by Tamura et al. in [25]. Yang
and Shih [35] generalized Tamura's n-step composition procedure by using a t-norm instead of min
in the max–min composition, and they obtained a max-T relation after the n-step max-T
composition. Fuzzy c-means clustering is also popular in the literature; there a number of
clusters are constructed on the basis of a suitable distance [4]. Clustering algorithms for
intuitionistic fuzzy sets were developed in [29,33]; these algorithms consider each of the
given IFSs as a unique cluster in the first stage, and then compare each pair of IFSs by
using the weighted Hamming distance or the weighted Euclidean distance. The procedure is
repeated until the desired number of clusters is formed. Some clustering algorithms
under an intuitionistic fuzzy environment were developed in [27,28,31,32,39]. In [39], Zhao
et al. compute the association coefficients of intuitionistic fuzzy information and then construct
clusters by utilizing these values. In [27,28,31,32,39], clusters were formed by converting the
intuitionistic fuzzy information to single values and then applying the algorithm. Here, in
contrast, we propose a clustering algorithm that works on intuitionistic fuzzy values without
any kind of conversion of these values. All these existing techniques are different from our
proposed procedure. Deschrijver et al. developed the concepts of intuitionistic fuzzy t-norm
and intuitionistic fuzzy t-conorm in [9]. An intuitionistic fuzzy t-norm on L* is any increasing,
commutative, associative (L*)^2 → L* mapping T that satisfies T(1_{L*}, x) = x for all
x ∈ L*; an intuitionistic fuzzy t-conorm on L* is any increasing, commutative, associative
(L*)^2 → L* mapping S satisfying S(0_{L*}, x) = x for all x ∈ L*.
The aim of this paper is to propose an algorithm to develop partition trees at different levels
according to different intuitionistic fuzzy triangular norm (t-norm) compositions by using
intuitionistic fuzzy sets. Hierarchical clustering based on fuzzy relations is an active
topic of research in the existing literature, and the hierarchical tree structure is an interesting
approach. What distinguishes this paper from the existing algorithms is that the relation is
built from the intuitionistic fuzzy idea, which has been developed for a long time. To
the best of our knowledge, no one has discussed a hierarchical clustering algorithm
for intuitionistic fuzzy sets based on different intuitionistic fuzzy t-norms that produces several
partition trees. The rest of the paper is organized as follows. In Sect. 2, some basic concepts of
fuzzy sets and intuitionistic fuzzy sets are given. In Sect. 3, we first introduce the concept of sup-T
compositions and construct the transitive closure matrix. Secondly, a clustering algorithm
is proposed to develop partitions into clusters at different levels. An algorithm is also given to
obtain a partition tree structure without using the sup-T composition. To further elaborate
the proposed algorithm, two interesting examples are given. The conclusion of the paper is given
in the last section.

Intuitionistic Fuzzy Relation

The greatest intuitionistic fuzzy t-norm with respect to the ordering ≤_{L*} is inf, while
the smallest intuitionistic fuzzy t-conorm with respect to ≤_{L*} is sup. An intuitionistic
fuzzy t-norm T on L* (resp. t-conorm S) is called t-representable if there exist a t-norm
T and a t-conorm S on [0, 1] (resp. a t-conorm S′ and a t-norm T′ on [0, 1]) such
that, for x = (x1, x2), y = (y1, y2) ∈ L*, T(x, y) = (T(x1, y1), S(x2, y2)) (resp.
S(x, y) = (S′(x1, y1), T′(x2, y2))); T and S (resp. S′ and T′) are called the representants
of the intuitionistic fuzzy t-norm (resp. t-conorm). Clearly, inf and sup are t-representable:

T_M(x, y) = (M(x1, y1), M*(x2, y2)),
S_M(x, y) = (M*(x1, y1), M(x2, y2)).
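A t-representable operator can be assembled from its representants in a couple of lines. The following Python sketch is ours (illustrative names, not the paper's code); T_P is simply one further example of a t-representable choice, built from the product t-norm and its dual.

# Sketch: t-representable intuitionistic fuzzy t-norms/t-conorms on L*,
# built from representants on [0, 1]. Elements of L* are (x1, x2) pairs.

def t_representable(T, S):
    """Intuitionistic t-norm: (x, y) -> (T(x1, y1), S(x2, y2))."""
    return lambda x, y: (T(x[0], y[0]), S(x[1], y[1]))

def s_representable(S, T):
    """Intuitionistic t-conorm: (x, y) -> (S(x1, y1), T(x2, y2))."""
    return lambda x, y: (S(x[0], y[0]), T(x[1], y[1]))

T_M = t_representable(min, max)     # T_M(x, y) = (min(x1, y1), max(x2, y2)) = inf
S_M = s_representable(max, min)     # S_M(x, y) = (max(x1, y1), min(x2, y2)) = sup

# A further t-representable choice: product t-norm with its dual, the probabilistic sum.
T_P = t_representable(lambda a, b: a * b, lambda a, b: a + b - a * b)

print(T_M((0.7, 0.2), (0.5, 0.3)))  # (0.5, 0.3)
print(S_M((0.7, 0.2), (0.5, 0.3)))  # (0.7, 0.2)
print(T_P((0.7, 0.2), (0.5, 0.3)))  # roughly (0.35, 0.44)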

Definition 2.1 An intuitionistic fuzzy relation R from X to Y is an intuitionistic fuzzy subset
of X × Y, i.e., R(x, y) ∈ L* for all (x, y) ∈ X × Y.

An intuitionistic fuzzy relation from X to X is called an intuitionistic fuzzy relation on X.

Definition 2.2 For α ∈ L*, the α-cut R_α is given by:

R_α = {(x, y) | R(x, y) ≥_{L*} α}.

Definition 2.3 An intuitionistic fuzzy relation R on X is called:

1. Reflexive if and only if R(x, x) = 1_{L*} for all x ∈ X;
2. Symmetric if and only if R(x, y) =_{L*} R(y, x) for all x, y ∈ X;
3. Sup-T transitive if and only if R(x, z) ≥_{L*} sup_{y∈X} {T(R(x, y), R(y, z))} for all x, z ∈ X.
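For a finite universe X = {1, ..., n}, these three properties can be checked mechanically. The Python sketch below is ours (illustrative names); it assumes the representable t-norm T_M and reads the inequality of item 3 with the componentwise comparison, which is one possible reading of ≥_{L*} here.

# Sketch: checking Definition 2.3 for a relation matrix R given as a list of
# lists of (membership, non-membership) pairs. Names are illustrative.

def T_M(x, y):                         # representable intuitionistic t-norm (min, max)
    return (min(x[0], y[0]), max(x[1], y[1]))

def geq_cw(x, y):                      # componentwise x >=_{L*} y
    return x[0] >= y[0] and x[1] <= y[1]

def is_reflexive(R):
    return all(R[i][i] == (1.0, 0.0) for i in range(len(R)))

def is_symmetric(R):
    n = len(R)
    return all(R[i][j] == R[j][i] for i in range(n) for j in range(n))

def is_supT_transitive(R, T=T_M):
    """R(x, z) >=_{L*} sup_y T(R(x, y), R(y, z)) for all x, z."""
    n = len(R)
    return all(geq_cw(R[i][j], T(R[i][k], R[k][j]))
               for i in range(n) for j in range(n) for k in range(n))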


Definition 2.4 An intuitionistic fuzzy relation R on X is called an intuitionistic fuzzy proximity
relation if it satisfies reflexivity and symmetry.

Definition 2.5 An intuitionistic fuzzy relation R on X is called a sup-T intuitionistic fuzzy
similarity relation if it satisfies reflexivity, symmetry and sup-T transitivity.

A sup-T composition “◦” of any two intuitionistic fuzzy relations R1 and R2 on X is defined as:

R1 ◦ R2 (x, z) = sup_{y∈X} {T(R1(x, y), R2(y, z))} for all x, z ∈ X,

where R1 ◦ R2 represents the composition of the two intuitionistic fuzzy relations R1 and
R2 on X. A sup-T composition with a different intuitionistic fuzzy t-norm yields different
composition results.

An n-step procedure with a sup-T_M composition indicates that, with a finite n, R^(n) will
become an intuitionistic fuzzy similarity matrix with sup-T_M transitivity. This n-step
procedure can be extended to any sup-T composition.
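A minimal sketch of this composition and of the n-step procedure, in Python (our illustrative code, not the paper's): relation matrices are lists of lists of L* pairs, the t-norm defaults to the representable T_M, and the iteration follows step s1 of the algorithm in the next section, composing with R^(0) until the matrix stops changing.

# Sketch: sup-T composition of intuitionistic fuzzy relation matrices and the
# n-step procedure R(k) = R(k-1) o R(0), iterated until it stabilises.

def T_M(x, y):                                  # representable t-norm (min, max)
    return (min(x[0], y[0]), max(x[1], y[1]))

def sup_Lstar(x, y):                            # lattice sup on L*
    return (max(x[0], y[0]), min(x[1], y[1]))

def compose(R1, R2, T=T_M):
    """(R1 o R2)(x, z) = sup_y T(R1(x, y), R2(y, z))."""
    n = len(R1)
    out = [[(0.0, 1.0)] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            acc = (0.0, 1.0)                    # 0_{L*}
            for k in range(n):
                acc = sup_Lstar(acc, T(R1[i][k], R2[k][j]))
            out[i][j] = acc
    return out

def transitive_closure(R0, T=T_M, max_steps=100):
    """Iterate R(k) = R(k-1) o R(0) until R(k) equals R(k-1).
    Exact equality is safe for T_M, which only selects existing entries."""
    prev, cur = R0, compose(R0, R0, T)
    while cur != prev and max_steps > 0:
        prev, cur = cur, compose(cur, R0, T)
        max_steps -= 1
    return cur

With T = T_M this is one way to obtain the sup-T_M closures used in the examples of the next section.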

Theorem 2.6 Suppose that R^(0) is a proximity-relation matrix. Then, by sup-T compositions,
one has

I <_{L*} R^(0) <_{L*} R^(1) <_{L*} · · · <_{L*} R^(n) =_{L*} R^(n+1) = · · · ,

where R^(n) is a sup-T intuitionistic fuzzy similarity relation. If n is not finite, then
lim_{n→∞} R^(n) = R^(∞), with R^(∞) a sup-T intuitionistic fuzzy similarity relation, i.e.

I <_{L*} R^(0) <_{L*} R^(1) <_{L*} · · · <_{L*} R^(n) <_{L*} R^(n+1) <_{L*} · · · <_{L*} R^(∞).

Proof Since R^(0) is an intuitionistic fuzzy proximity relation matrix, I <_{L*} R^(0).
Let R^(1) = [γ_{ij}^{(1)}] with γ_{ij}^{(1)} = sup_k T(γ_{ik}^{(0)}, γ_{kj}^{(0)}).
Then one has γ_{ij}^{(0)} =_{L*} T(γ_{ij}^{(0)}, 1_{L*}) =_{L*} T(γ_{ij}^{(0)}, γ_{jj}^{(0)}) ≤_{L*} sup_k T(γ_{ik}^{(0)}, γ_{kj}^{(0)}) =_{L*} γ_{ij}^{(1)}. That is, R^(0) ≤_{L*} R^(1).
Suppose that R^(0) does not have sup-T transitivity. Then there exist i and j such that for some l one has

γ_{ij}^{(0)} <_{L*} T(γ_{il}^{(0)}, γ_{lj}^{(0)})
⇒ γ_{ij}^{(0)} <_{L*} sup_k T(γ_{ik}^{(0)}, γ_{kj}^{(0)}) = γ_{ij}^{(1)}
⇒ R^(0) <_{L*} R^(1).

If R^(1) does not have sup-T transitivity either, then similarly one has R^(1) <_{L*} R^(2). If the
sup-T transitivity is not reached after (n − 1) compositions, then

I <_{L*} R^(0) <_{L*} R^(1) <_{L*} · · · <_{L*} R^(n−1) <_{L*} R^(n).

Suppose that R^(n) has sup-T transitivity. Then for all i, j one has

γ_{ij}^{(n)} ≥_{L*} sup_k T(γ_{ik}^{(n)}, γ_{kj}^{(n)}) = γ_{ij}^{(n+1)}

and

T(γ_{ii}^{(n)}, γ_{ij}^{(n)}) =_{L*} T(1_{L*}, γ_{ij}^{(n)}) =_{L*} γ_{ij}^{(n)}.

Then

γ_{ij}^{(n+1)} = sup_k T(γ_{ik}^{(n)}, γ_{kj}^{(n)}) ≥_{L*} T(γ_{ii}^{(n)}, γ_{ij}^{(n)}) = γ_{ij}^{(n)}.

One therefore has γ_{ij}^{(n+1)} =_{L*} γ_{ij}^{(n)}, i.e., R^(n+1) =_{L*} R^(n). Similarly, R^(n+2) =_{L*} R^(n+1). That is,

I <_{L*} R^(0) <_{L*} R^(1) <_{L*} · · · <_{L*} R^(n−1) <_{L*} R^(n) =_{L*} R^(n+1) = · · · .

If there is no finite n such that R^(n) =_{L*} R^(n+1) =_{L*} · · · , then

I <_{L*} R^(0) <_{L*} R^(1) <_{L*} · · · <_{L*} R^(n−1) <_{L*} R^(n) <_{L*} R^(n+1) <_{L*} · · · <_{L*} R*,

where R* is the n × n matrix all of whose entries equal 1_{L*}.

It is known that {R^(k) | k = 0, 1, 2, . . .} is monotone and bounded, so R^(∞) = lim_{n→∞} R^(n)
exists. Next, it is claimed that R^(∞) is a sup-T intuitionistic fuzzy similarity relation.
Recall that

γ_{ij}^{(n)} = sup_k T(γ_{ik}^{(n−1)}, γ_{kj}^{(n−1)}).

Now, a new term is defined as

γ′_{ij}^{(n)} = sup_k T(γ′_{ik}^{(n−1)}, γ_{kj}^{(1)}),   with γ′_{ij}^{(1)} = γ_{ij}^{(1)}.

Although R^(n) and R′^(n) are different, lim_{n→∞} R^(n) = lim_{n→∞} R′^(n) = R^(∞). Expanding the new term,

γ′_{ij}^{(2)} = sup_{k1} T(γ_{i k1}^{(1)}, γ_{k1 j}^{(1)}),
γ′_{ij}^{(3)} = sup_{k2} T(γ′_{i k2}^{(2)}, γ_{k2 j}^{(1)}) =_{L*} sup_{k1, k2} T(T(γ_{i k1}^{(1)}, γ_{k1 k2}^{(1)}), γ_{k2 j}^{(1)}),
...
γ′_{ij}^{(n)} = sup_{k1, ..., k_{n−1}} T(· · · T(T(γ_{i k1}^{(1)}, γ_{k1 k2}^{(1)}), γ_{k2 k3}^{(1)}), · · · , γ_{k_{n−1} j}^{(1)}).

Hence, fixing the middle index k_m = l,

γ′_{ij}^{(m+n)} = sup_{k1, ..., k_{m+n−1}} T(· · · T(T(γ_{i k1}^{(1)}, γ_{k1 k2}^{(1)}), γ_{k2 k3}^{(1)}), · · · , γ_{k_{m+n−1} j}^{(1)})
≥_{L*} sup_{k1, ..., k_{m−1}} sup_{k_{m+1}, ..., k_{m+n−1}} T(· · · T(T(γ_{i k1}^{(1)}, γ_{k1 k2}^{(1)}), γ_{k2 k3}^{(1)}), · · · , γ_{k_{m+n−1} j}^{(1)})
= T( sup_{k1, ..., k_{m−1}} T(· · · T(γ_{i k1}^{(1)}, γ_{k1 k2}^{(1)}), · · · , γ_{k_{m−1} l}^{(1)}),
     sup_{k_{m+1}, ..., k_{m+n−1}} T(· · · T(γ_{l k_{m+1}}^{(1)}, γ_{k_{m+1} k_{m+2}}^{(1)}), · · · , γ_{k_{m+n−1} j}^{(1)}) )
=_{L*} T(γ′_{il}^{(m)}, γ′_{lj}^{(n)}).

Then γ′_{ij}^{(m+n)} ≥_{L*} T(γ′_{il}^{(m)}, γ′_{lj}^{(n)}) for all l. As m → ∞ and n → ∞, one has

γ_{ij}^{(∞)} ≥_{L*} T(γ_{il}^{(∞)}, γ_{lj}^{(∞)}) for all l.

One also has γ_{ii}^{(∞)} =_{L*} 1_{L*} and γ_{ij}^{(∞)} =_{L*} γ_{ji}^{(∞)}. That is, R^(∞) is a sup-T intuitionistic
fuzzy similarity relation. □

The sup-T transitive procedure is given in [30]. Any intuitionistic fuzzy relation matrix R
can be decomposed into a resolution form by the use of α-cuts, where α ∈ L*. For example,
the intuitionistic fuzzy relation R on X × Y can be resolved into:

R = ∪_α α R_α = α_1 R_{α_1} + α_2 R_{α_2} + · · · + α_m R_{α_m},

where (0, 1) ≤_{L*} α_1 ≤_{L*} α_2 ≤_{L*} · · · ≤_{L*} α_m ≤_{L*} (1, 0) and α R_α is the intuitionistic fuzzy
relation on X × Y defined as

α R_α(x, y) = α if (x, y) ∈ R_α, and 0_{L*} otherwise.
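The α-cut and the resolution form can be sketched as follows (our Python, illustrative names). Comparisons here use the total order stated in the Introduction, which is one reading of ≥_{L*}; it is the order in which the level sets of the examples in the next section are listed.

# Sketch: alpha-cut R_alpha, level set, and the scaled relation alpha * R_alpha
# for a relation matrix R stored as lists of (membership, non-membership) pairs.

def geq_Lstar(x, alpha):
    """x >=_{L*} alpha under the total order of the Introduction."""
    return x[0] > alpha[0] or (x[0] == alpha[0] and x[1] <= alpha[1])

def alpha_cut(R, alpha):
    """Crisp alpha-cut: 1 where R(x, y) >=_{L*} alpha, else 0."""
    return [[1 if geq_Lstar(v, alpha) else 0 for v in row] for row in R]

def level_set(R):
    """Distinct entry values of R, sorted increasingly in the L* order."""
    return sorted({v for row in R for v in row}, key=lambda v: (v[0], -v[1]))

def scaled_cut(R, alpha):
    """alpha R_alpha: the value alpha where (x, y) is in R_alpha, else 0_{L*}."""
    return [[alpha if geq_Lstar(v, alpha) else (0.0, 1.0) for v in row] for row in R]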

Clustering Algorithm for Tree Structures

Beg and Rashid [2] proposed an improved clustering algorithm to develop a tree structure
from transitive-closure and non-transitive-closure fuzzy similarity matrices. Li
and He [19] developed an algorithm to cluster alternatives only from an intuitionistic
fuzzy equivalence matrix of the alternatives. Here, we propose a clustering algorithm in which
the following steps s0 and s1 may be skipped if we are interested in developing the clusters
directly from the given intuitionistic fuzzy proximity relation matrix. If we want to see the other
options according to different intuitionistic fuzzy t-norms, then we also compute s0 and s1. The
proposed clustering algorithm is also illustrated by the flow chart in Fig. 1.
 
s0. Let R^(0) = [r_ij^(0)]_{n×n} be a given intuitionistic fuzzy proximity-relation matrix, where
r_ij ∈ L*. Choose an intuitionistic fuzzy t-norm T.
Let k = 0.
s1. Let k = k + 1.
Let R^(k) =_{L*} R^(k−1) ◦ R^(0).
IF R^(k) ≠_{L*} R^(k−1), THEN go to step s1.
ELSE a sup-T intuitionistic fuzzy similarity relation matrix R = [r_ij]_{n×n} = R^(k), the
transitive closure of the given intuitionistic fuzzy proximity relation matrix R^(0), is obtained.
s2. Based on the α-cut decomposition, we have
R = ∪_α α R_α = α_1 R_{α_1} + α_2 R_{α_2} + · · · + α_m R_{α_m}, where 0_{L*} ≤_{L*} α_1 ≤_{L*} α_2 ≤_{L*} · · · ≤_{L*}
α_m ≤_{L*} 1_{L*} and α_1, α_2, . . ., α_m ∈ Λ_R (the level set of R).
Let k = m.
Let P_{α_{k+1}} = {{1}, {2}, · · ·, {n}}.
s3. LOOP for α_k, k = m, m − 1, . . ., 3, 2.
Set r_ij = (0, 1) for all i = j and set r_ij = (0, 1) for all r_ij <_{L*} α_k.
Let I = {1, 2, . . ., n}.
Let C = { } (the empty set) and P_{α_k} = { }.
s4. LOOP for I
s4.1. Choose s and t from I so that
r_st = sup{r_ij | i < j, i, j ∈ I}.
Note that a tie is broken randomly.
IF r_st ≠ 0_{L*}, THEN link s and t into the same cluster C = {s, t} and GO TO s4.2.
ELSE P_{α_k} = P_{α_k} ∪ {all indices in I as separate clusters} and STOP.
s4.2. Choose u from I \ C so that
Σ_{i∈C} r_iu = sup{ Σ_{i∈C} r_ij | j ∈ I \ C with r_ij ≠ 0_{L*} for all i ∈ C }.
IF there is such a u, THEN C = C ∪ C_u, where C_u ∈ P_{α_{k+1}} is such that u ∈ C_u, and
go back to s4.2.
ELSE P_{α_k} = P_{α_k} ∪ {C}.
s4.3. Let I = I \ C, now put C = { } and go back to s4.1.
End of LOOP s4.
PRINT P_{α_k}.
Set k = k − 1.
End of LOOP s3.

Fig. 1 Flow chart of the proposed clustering algorithm
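The following Python sketch is one possible reading of steps s2–s4 (our code, not the paper's implementation). It assumes the total order of the Introduction for the comparisons and for the supremum in s4.1, uses the L* summation x + y = ((x1 + y1)/2, |x2 − y2|/2) from the Introduction for the score in s4.2, and breaks ties deterministically by index instead of randomly; all names are illustrative.

# Sketch of steps s2-s4: a partition of the index set {0, ..., n-1} is built at
# every level of the relation matrix R (proximity or similarity), from the
# largest level downwards. Entries are (membership, non-membership) pairs.

ZERO = (0.0, 1.0)

def lt(x, y):                                  # x <_{L*} y (total order of the Introduction)
    return x[0] < y[0] or (x[0] == y[0] and x[1] > y[1])

def add(x, y):                                 # L* summation from the Introduction
    return ((x[0] + y[0]) / 2.0, abs(x[1] - y[1]) / 2.0)

def levels(R):                                 # level set, increasing in <_{L*}
    return sorted({v for row in R for v in row}, key=lambda v: (v[0], -v[1]))

def partitions(R):
    n = len(R)
    result = {}
    prev_partition = [{i} for i in range(n)]   # P at the previous (higher) level
    for alpha in reversed(levels(R)):
        # s3: threshold a copy of R at this level and blank the diagonal.
        r = [[ZERO if (i == j or lt(R[i][j], alpha)) else R[i][j]
              for j in range(n)] for i in range(n)]
        I, partition = set(range(n)), []
        while I:
            # s4.1: pick the strongest remaining link (ties broken by index).
            pairs = [(i, j) for i in sorted(I) for j in sorted(I)
                     if i < j and r[i][j] != ZERO]
            if not pairs:
                partition += [{i} for i in sorted(I)]      # remaining indices stay single
                break
            s, t = max(pairs, key=lambda p: (r[p[0]][p[1]][0], -r[p[0]][p[1]][1]))
            C = {s, t}
            # s4.2: absorb the best candidate u linked to every member of C,
            # together with u's cluster from the previous level.
            while True:
                cand = [u for u in sorted(I - C) if all(r[i][u] != ZERO for i in C)]
                if not cand:
                    break
                def score(u):
                    vals = [r[i][u] for i in sorted(C)]
                    total = vals[0]
                    for v in vals[1:]:
                        total = add(total, v)
                    return (total[0], -total[1])
                u = max(cand, key=score)
                C |= next(B for B in prev_partition if u in B)
                C &= I                          # stay inside the not-yet-assigned indices
            partition.append(C)
            I -= C                              # s4.3
        result[alpha] = partition
        prev_partition = partition
    return result

The partitions returned per level correspond to the α-cut clusters illustrated in the examples below, up to the tie-breaking choices.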


Example 3.1 We consider five kids {John, Smith, Albert, Emma, Mike} from two closely
related families and evaluate their facial similarity by physical appearance. R^(0) is an
initially given intuitionistic fuzzy relation matrix that represents the degree of similarity of
each pair of kids; this initially given matrix is only an intuitionistic fuzzy proximity-relation
matrix.
R^(0) =
[ (1, 0)      (0.7, 0.2)  (0.3, 0.6)  (0.3, 0.5)  (0.3, 0.6) ]
[ (0.7, 0.2)  (1, 0)      (0.3, 0.6)  (0.4, 0.4)  (0.5, 0.3) ]
[ (0.3, 0.6)  (0.3, 0.6)  (1, 0)      (0.9, 0)    (0.5, 0.2) ]
[ (0.3, 0.5)  (0.4, 0.4)  (0.9, 0)    (1, 0)      (0.3, 0.5) ]
[ (0.5, 0.6)  (0.5, 0.3)  (0.5, 0.2)  (0.3, 0.5)  (1, 0)     ]

Computing the sup-T intuitionistic fuzzy similarity relation matrix is an optional step; we may
skip this composition. Here, however, we solve this example by considering the sup-T_M
composition. By using the T_M t-norm for the composition of R^(0) with itself, we get R.
R =
[ (1, 0)      (0.7, 0.2)  (0.5, 0.5)  (0.5, 0.4)  (0.5, 0.3) ]
[ (0.7, 0.2)  (1, 0)      (0.5, 0.3)  (0.5, 0.4)  (0.5, 0.3) ]
[ (0.5, 0.5)  (0.5, 0.3)  (1, 0)      (0.9, 0)    (0.5, 0.2) ]
[ (0.5, 0.4)  (0.5, 0.4)  (0.9, 0)    (1, 0)      (0.5, 0.2) ]
[ (0.5, 0.3)  (0.5, 0.3)  (0.5, 0.2)  (0.5, 0.2)  (1, 0)     ]

The level values are α1 = (0.5, 0.5), α2 = (0.5, 0.4), α3 = (0.5, 0.3), α4 = (0.5, 0.2), α5 =
(0.7, 0.2), α6 = (0.9, 0) and α7 = (1, 0).
The partition for level α7 :
Rα7 =
[ (1, 0)  0       0       0       0      ]
[ 0       (1, 0)  0       0       0      ]
[ 0       0       (1, 0)  0       0      ]
[ 0       0       0       (1, 0)  0      ]
[ 0       0       0       0       (1, 0) ]

{John}, {Smith}, {Albert}, {Emma}, {Mike}.


The partition for level α6 :
Rα6 =
[ (1, 0)  0       0         0         0      ]
[ 0       (1, 0)  0         0         0      ]
[ 0       0       (1, 0)    (0.9, 0)  0      ]
[ 0       0       (0.9, 0)  (1, 0)    0      ]
[ 0       0       0         0         (1, 0) ]

{Albert, Emma}, {John}, {Smith}, {Mike}.


The partition for level α5 :
Rα5 =
[ (1, 0)      (0.7, 0.2)  0         0         0      ]
[ (0.7, 0.2)  (1, 0)      0         0         0      ]
[ 0           0           (1, 0)    (0.9, 0)  0      ]
[ 0           0           (0.9, 0)  (1, 0)    0      ]
[ 0           0           0         0         (1, 0) ]

{Albert, Emma}, {John, Smith}, {Mike}.


The partition for level α4 :


Rα4 =
[ (1, 0)      (0.7, 0.2)  0           0           0          ]
[ (0.7, 0.2)  (1, 0)      0           0           0          ]
[ 0           0           (1, 0)      (0.9, 0)    (0.5, 0.2) ]
[ 0           0           (0.9, 0)    (1, 0)      (0.5, 0.2) ]
[ 0           0           (0.5, 0.2)  (0.5, 0.2)  (1, 0)     ]
{Albert, Emma, Mike}, {John, Smith}.
The partition for level α3 :
Rα3 =
[ (1, 0)      (0.7, 0.2)  0           0           (0.5, 0.3) ]
[ (0.7, 0.2)  (1, 0)      (0.5, 0.3)  0           (0.5, 0.3) ]
[ 0           (0.5, 0.3)  (1, 0)      (0.9, 0)    (0.5, 0.2) ]
[ 0           0           (0.9, 0)    (1, 0)      (0.5, 0.2) ]
[ (0.5, 0.3)  (0.5, 0.3)  (0.5, 0.2)  (0.5, 0.2)  (1, 0)     ]
{Albert, Emma, Mike}, {John, Smith}.
The partition for level α2 :
Rα2 =
[ (1, 0)      (0.7, 0.2)  0           (0.5, 0.4)  (0.5, 0.3) ]
[ (0.7, 0.2)  (1, 0)      (0.5, 0.3)  (0.5, 0.4)  (0.5, 0.3) ]
[ 0           (0.5, 0.3)  (1, 0)      (0.9, 0)    (0.5, 0.2) ]
[ (0.5, 0.4)  (0.5, 0.4)  (0.9, 0)    (1, 0)      (0.5, 0.2) ]
[ (0.5, 0.3)  (0.5, 0.3)  (0.5, 0.2)  (0.5, 0.2)  (1, 0)     ]
{Albert, Emma, Mike, John, Smith}.
The partition for level α1 :
Rα1 =
[ (1, 0)      (0.7, 0.2)  (0.5, 0.5)  (0.5, 0.4)  (0.5, 0.3) ]
[ (0.7, 0.2)  (1, 0)      (0.5, 0.3)  (0.5, 0.4)  (0.5, 0.3) ]
[ (0.5, 0.5)  (0.5, 0.3)  (1, 0)      (0.9, 0)    (0.5, 0.2) ]
[ (0.5, 0.4)  (0.5, 0.4)  (0.9, 0)    (1, 0)      (0.5, 0.2) ]
[ (0.5, 0.3)  (0.5, 0.3)  (0.5, 0.2)  (0.5, 0.2)  (1, 0)     ]
{Albert, Emma, Mike, John, Smith}.
The graphical structure of this clustering at all levels is given in Fig. 2.

Fig. 2 Sup-T_M composition based clustering at different levels

In this example, it is easy to note that we apply our proposed clustering algorithm directly
on intuitionistic fuzzy values. Even at the last step of our algorithm we use the information in
intuitionistic fuzzy form. In all the existing techniques [27–29,31–33,39], the intuitionistic fuzzy
information is transformed to a single value by using some kind of measure, such as an association
measure, a distance measure, or a similarity or dissimilarity measure.
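As a usage illustration only (assuming the hypothetical transitive_closure and partitions helpers sketched in the previous section, and up to the random tie-breaking of step s4.1), the matrix of this example can be processed as follows:

# Example 3.1 data fed to the earlier sketches (hypothetical helper names).
kids = ["John", "Smith", "Albert", "Emma", "Mike"]
R0 = [
    [(1.0, 0.0), (0.7, 0.2), (0.3, 0.6), (0.3, 0.5), (0.3, 0.6)],
    [(0.7, 0.2), (1.0, 0.0), (0.3, 0.6), (0.4, 0.4), (0.5, 0.3)],
    [(0.3, 0.6), (0.3, 0.6), (1.0, 0.0), (0.9, 0.0), (0.5, 0.2)],
    [(0.3, 0.5), (0.4, 0.4), (0.9, 0.0), (1.0, 0.0), (0.3, 0.5)],
    [(0.5, 0.6), (0.5, 0.3), (0.5, 0.2), (0.3, 0.5), (1.0, 0.0)],
]

R = transitive_closure(R0)                     # sup-T_M closure, as in this example
for alpha, blocks in partitions(R).items():    # from the highest level downwards
    print(alpha, [sorted(kids[i] for i in b) for b in blocks])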

Example 3.2 A performance evaluation matrix R^(0) is an initially given intuitionistic fuzzy
proximity-relation matrix (adapted from [15]). The Taiwan Assessment and Evaluation Association
(TWAEA) was developed in 2000 for the performance evaluation of universities in Taiwan.
TWAEA constructed various hierarchical tree structures as performance evaluation models
for academic departments. For the performance evaluation of academic departments the
following ten criteria were used: Teaching Innovation (TI), Teaching Quality (TQ), Teaching
Material (TM), Journal Paper (JP), Research Grant (RG), Academic Award (AW), Patent
Acquisition (PA), Student Consultation (SC), Professional Service (PS) and University
Service (US).

The committee derived the following intuitionistic fuzzy proximity-relation matrix.

      TI        TQ        TM        JP        RG        AW        PA        SC        PS        US
TI  [ (1,0)     (0.8,0.1) (0.8,0.1) (0.3,0.5) (0.2,0.4) (0.2,0.5) (0.2,0.5) (0.2,0.6) (0.2,0.7) (0.2,0.8) ]
TQ  [ (0.8,0.1) (1.0,0)   (0.8,0.1) (0.4,0.3) (0.3,0.5) (0.4,0.5) (0.3,0.5) (0.5,0.2) (0.1,0.7) (0.3,0.7) ]
TM  [ (0.8,0.1) (0.8,0.1) (1.0,0)   (0.5,0.3) (0.2,0.7) (0.3,0.4) (0.3,0.7) (0.3,0.6) (0.1,0.5) (0.2,0.7) ]
JP  [ (0.3,0.5) (0.4,0.3) (0.5,0.3) (1.0,0)   (0.8,0.1) (0.9,0.1) (0.8,0.1) (0.1,0.7) (0.8,0.1) (0.2,0.5) ]
RG  [ (0.2,0.4) (0.3,0.5) (0.2,0.7) (0.8,0.1) (1.0,0)   (0.6,0.3) (0.8,0.1) (0.1,0.7) (0.5,0.2) (0.2,0.6) ]
AW  [ (0.2,0.5) (0.4,0.5) (0.3,0.4) (0.9,0.1) (0.6,0.3) (1.0,0)   (0.5,0.2) (0.1,0.7) (0.9,0.1) (0.6,0.2) ]
PA  [ (0.2,0.5) (0.3,0.5) (0.3,0.7) (0.8,0.1) (0.8,0.1) (0.5,0.2) (1.0,0)   (0.1,0.7) (0.4,0.3) (0.1,0.5) ]
SC  [ (0.2,0.6) (0.5,0.2) (0.3,0.6) (0.1,0.7) (0.1,0.7) (0.1,0.7) (0.1,0.7) (1.0,0)   (0.2,0.8) (0.5,0.3) ]
PS  [ (0.2,0.7) (0.1,0.7) (0.1,0.5) (0.8,0.1) (0.5,0.2) (0.9,0.1) (0.4,0.3) (0.2,0.8) (1.0,0)   (0.6,0.3) ]
US  [ (0.2,0.8) (0.3,0.7) (0.2,0.7) (0.2,0.5) (0.2,0.6) (0.6,0.2) (0.1,0.5) (0.5,0.3) (0.6,0.3) (1.0,0)   ]

(a) The initially given intuitionistic fuzzy proximity relation matrix is R^(0). Now we apply our
proposed clustering algorithm.
s0. R^(0) is the intuitionistic fuzzy proximity-relation matrix given above; we choose the
intuitionistic fuzzy t-norm T_M and set k = 0.

s1. By the sup-T_M composition of R^(0), we obtain R, which is the sup-T_M intuitionistic fuzzy
similarity relation matrix (Figs. 3, 4).


Fig. 3 Sup-T M composition based clustering at different levels

Fig. 4 Clustering tree for initially given intuitionistic fuzzy proximity relation matrix

R =
[ (1.0,0)   (0.8,0.1) (0.8,0.1) (0.5,0.3) (0.5,0.4) (0.5,0.4) (0.5,0.4) (0.5,0.2) (0.5,0.4) (0.5,0.5) ]
[ (0.8,0.1) (1.0,0)   (0.8,0.1) (0.5,0.3) (0.5,0.3) (0.5,0.3) (0.5,0.3) (0.5,0.2) (0.5,0.3) (0.5,0.3) ]
[ (0.8,0.1) (0.8,0.1) (1.0,0)   (0.5,0.3) (0.5,0.3) (0.5,0.3) (0.5,0.3) (0.5,0.2) (0.5,0.3) (0.5,0.4) ]
[ (0.5,0.3) (0.5,0.3) (0.5,0.3) (1.0,0)   (0.8,0.1) (0.9,0.1) (0.8,0.1) (0.5,0.3) (0.9,0.1) (0.6,0.2) ]
[ (0.5,0.4) (0.5,0.3) (0.5,0.3) (0.8,0.1) (1.0,0)   (0.8,0.1) (0.8,0.1) (0.5,0.5) (0.8,0.1) (0.6,0.3) ]
[ (0.5,0.4) (0.5,0.3) (0.5,0.3) (0.9,0.1) (0.8,0.1) (1.0,0)   (0.8,0.1) (0.5,0.3) (0.9,0.1) (0.6,0.2) ]
[ (0.5,0.4) (0.5,0.3) (0.5,0.3) (0.8,0.1) (0.8,0.1) (0.8,0.1) (1.0,0)   (0.5,0.5) (0.8,0.1) (0.6,0.2) ]
[ (0.5,0.2) (0.5,0.2) (0.5,0.2) (0.5,0.3) (0.5,0.5) (0.5,0.3) (0.5,0.5) (1.0,0)   (0.5,0.3) (0.5,0.3) ]
[ (0.5,0.4) (0.5,0.3) (0.5,0.3) (0.9,0.1) (0.8,0.1) (0.9,0.1) (0.8,0.1) (0.5,0.3) (1.0,0)   (0.6,0.2) ]
[ (0.5,0.5) (0.5,0.3) (0.5,0.4) (0.6,0.2) (0.6,0.3) (0.6,0.2) (0.6,0.2) (0.5,0.3) (0.6,0.2) (1.0,0)   ]

s2.
α1 = (0.5, 0.5), α2 = (0.5, 0.4), α3 = (0.5, 0.3), α4 = (0.5, 0.2), α5 = (0.6, 0.3),
α6 = (0.6, 0.2), α7 = (0.8, 0.1), α8 = (0.9, 0.1).
For α8 = (0.9, 0.1)
{JP,AW,PS},{TI},{TQ},{TM},{RG},{PA},{SC},{US}
For α7 = (0.8, 0.1)
{JP,AW,PS,PA,RG},{TI,TQ,TM},{SC},{US}


For α6 = (0.6, 0.2)
{JP,AW,PS,PA,RG},{TI,TQ,TM},{SC},{US}
For α5 = (0.6, 0.3)
{JP,AW,PS,PA,RG,US},{TI,TQ,TM},{SC}
For α4 = (0.5, 0.2)
{JP,AW,PS,PA,RG,US},{TI,TQ,TM,SC}
For α3 = (0.5, 0.3)
{JP,AW,PS,PA,RG,US,TI,TQ,TM,SC}
For α2 = (0.5, 0.4)
{JP,AW,PS,PA,RG,US,TI,TQ,TM,SC}
We get the partition tree shown in Fig. 3.
(b) Without any sup-T composition on R^(0), we apply our clustering algorithm to construct
the hierarchical structure directly from the initially given intuitionistic fuzzy proximity
relation matrix R^(0) shown above.

The level values are α1 = (0.1, 0.7), α2 = (0.1, 0.5), α3 = (0.2, 0.8), α4 = (0.2, 0.7), α5 = (0.2, 0.6),
α6 = (0.2, 0.5), α7 = (0.2, 0.4), α8 = (0.3, 0.7), α9 = (0.3, 0.6), α10 = (0.3, 0.5),
α11 = (0.3, 0.4), α12 = (0.4, 0.5), α13 = (0.4, 0.3), α14 = (0.5, 0.3), α15 = (0.5, 0.2),
α16 = (0.6, 0.3), α17 = (0.6, 0.2), α18 = (0.8, 0.1), α19 = (0.9, 0.1).
For α19 = (0.9, 0.1)
{JP,AW},{TI},{TQ},{TM},{RG},{PA},{SC},{PS},{US}
For α18 = (0.8, 0.1)
{JP,AW,PS},{TI,TQ,TM},{RG},{PA},{SC},{US}
For α15 = (0.5, 0.2), α16 = (0.6, 0.3), α17 = (0.6, 0.2)
{JP,AW,PS,RG},{TI,TQ,TM},{PA},{SC},{US}
For α14 = (0.5, 0.3)
{JP,AW,PS,RG},{TI,TQ,TM},{SC,US},{PA}
For α11 = (0.3, 0.4), α12 = (0.4, 0.5), α13 = (0.4, 0.3)
{JP,AW,PS,RG,PA},{TI,TQ,TM},{SC,US}
For α3 = (0.2, 0.8), α4 = (0.2, 0.7), α5 = (0.2, 0.6), α6 = (0.2, 0.5), α7 = (0.2, 0.4),
α8 = (0.3, 0.7), α9 = (0.3, 0.6), α10 = (0.3, 0.5)
{JP,AW,PS,RG,PA,TI,TQ,TM},{SC,US}
For α2 = (0.1, 0.5), α1 = (0.1, 0.7)
{JP,AW,PS,RG,PA,TI,TQ,TM,SC,US}
We get the partition tree shown in Fig. 4.
Using the algorithm of this section, we can see that a clustering tree structure is developed for the
initially given intuitionistic fuzzy proximity relation matrix, and a clustering tree is also obtained
by applying the sup-T composition to this given matrix.


Conclusion

Clustering analysis is a fundamental and important tool in statistical data analysis. Considering
that IFSs are a powerful tool to deal with vagueness and uncertainty, there has been some
investigation of clustering techniques for IFSs.

Comparison

Xu et al. [30] developed a clustering algorithm for IFSs which consists of two steps. Firstly,
it employs the derived association coefficients of IFSs to construct an association
matrix, and utilizes a procedure to transform it into an equivalent association matrix. Secondly,
it constructs the α-cut matrix of the equivalent association matrix, and then classifies the IFSs
under the given confidence levels. They also generalized the clustering algorithm to IVIFSs.
They used the max–min composition only for the association matrix, and they considered
α ∈ [0, 1], as in ordinary fuzzy set theory. In this paper, we use the sup-T composition for the
clustering and α ∈ L* for the proper and justified handling of IFS clustering.

Zhao et al. [38] proposed two intuitionistic fuzzy minimum spanning tree clustering algorithms.
They also extended the clustering algorithm to interval-valued intuitionistic fuzzy
sets based on the minimum spanning tree concept. The construction of the matrix in their proposed
algorithm is based on a distance measure. In step 3 of their proposed algorithm they take
γ to be a distance value, so according to their method at γ = 0.6 the IFSs A, B, C, D are in the
same cluster. In our proposed algorithm, by contrast, we consider an intuitionistic fuzzy element
for combining the different elements into the same cluster, and our proposed algorithm is based
on an intuitionistic fuzzy proximity relation.

Xu et al. [33] defined two intuitionistic fuzzy similarity measures and gave the concept of an
intuitionistic fuzzy similarity measure matrix. Then, based on the intuitionistic fuzzy similarity
measure matrix, they proposed an intuitionistic fuzzy spectral clustering algorithm.
Their proposed spectral clustering algorithm cannot be applied to intuitionistic fuzzy relations.
The intuitionistic fuzzy relation is an important and useful concept for the proper handling of
vagueness. Our proposed method is based on intuitionistic fuzzy relations and, moreover, we can
develop several clustering results for the same data by the selection of different intuitionistic
fuzzy t-norms.

Discussion

The hierarchical tree structure is an interesting approach to describe the relationships among all
the alternatives in a complicated humanistic system. This paper presents a clustering algorithm
using intuitionistic fuzzy relation based analysis to establish schemes of intuitionistic
fuzzy similarity based tree structures. Clustering results derived without sup-T compositions
are more appropriate, because during the composition procedure some relationship values
deviate from the actual relationship information; clustering on the intuitionistic fuzzy
proximity relation matrix itself involves no deviation from the actual data. When we are interested
in different possibilities for the clustering tree, we can compose the initially given intuitionistic
fuzzy proximity relation matrix with itself by selecting an intuitionistic fuzzy t-norm.
In future work, the developed algorithm can be applied in many areas, including
information retrieval, equipment evaluation, source location, multi-criteria decision making
and data mining.


Acknowledgements The authors would like to thank the editors and the anonymous reviewers, whose insight-
ful comments and constructive suggestions helped us to significantly improve the quality of this paper.

References
1. Atanassov, K.: Intuitionistic fuzzy sets. Fuzzy Sets Syst. 20, 87–96 (1986)
2. Beg, I., Rashid, T.: An improved clustering algorithm using fuzzy relation for the performance evaluation
of humanistic systems. Int. J. Intell. Syst. 29(12), 1181–1199 (2014)
3. Beg, I., Rashid, T.: Multi-criteria trapezoidal valued intuitionistic fuzzy decision making with Choquet
integral based TOPSIS. OPSEARCH 51(1), 98–129 (2014)
4. Beg, I., Rashid, T.: Fuzzy distance measure and fuzzy clustering algorithm. J. Interdiscip. Math. 18(5),
471–492 (2015)
5. Bellman, R., Kalaba, R.L., Zadeh, L.A.: Abstraction and pattern classification. J. Math. Anal. Appl. 2,
581–585 (1966)
6. Bezdek, J.C.: Pattern Recognition with Fuzzy Objective Function Algorithms. Plenum, New York (1981)
7. Dave, R.N.: Generalized fuzzy c-shells clustering and detection of circular and elliptical boundaries.
Pattern Recogn. 25, 713–721 (1992)
8. De, S.K., Biswas, R., Roy, A.R.: An application of intuitionistic fuzzy sets in medical diagnosis. Fuzzy
Sets Syst. 117, 209–213 (2001)
9. Deschrijver, G., Cornelis, C., Kerre, E.E.: On the representation of intuitionistic fuzzy t-norms and t-
conorms. IEEE Trans. Fuzzy Syst. 12(1), 45–61 (2004)
10. Guh, Y.Y., Lee, E.S., Wang, K.M.: Hierarchies consistency analysis for non-transitive problems. Comput.
Math. Appl. 32, 67–77 (1996)
11. Guh, Y.Y.: Determining weight by combining different hierarchy structure-Hierarchies consistency anal-
ysis. Int. J. Inf. Manag. Sci. 7, 63–80 (1996)
12. Guh, Y.Y.: Introduction to a new weighting method-Hierarchies consistency analysis. Eur. J. Oper. Res.
102, 215–226 (1997)
13. Guh, Y.Y.: Hierarchies weighting by combining multiple evaluation viewpoint. Int. J. Inf. Manag. Sci. 8,
47–61 (1997)
14. Guh, Y.Y., Po, R.W.: Establishing a multiple structures analysis model for AHP. Int. J. Inf. Manag. Sci.
15, 35–51 (2004)
15. Guh, Y.Y., Yang, M.S., Po, R.W., Lee, E.S.: Establishing performance evaluation structures by fuzzy
relation-based cluster analysis. Comput. Math. Appl. 56, 572–582 (2008)
16. Khalaf, M.M.: Medical diagnosis via interval valued intuitionistic fuzzy sets. Ann. Fuzzy Math. Inf. 6(2),
245–249 (2013)
17. Klement, E.P., Mesiar, R., Pap, E.: Triangular Norms. Kluwer, Dordrecht (2000)
18. Li, D.-F.: Multiattribute decision making models and methods using intuitionistic fuzzy sets. J. Comput.
Syst. Sci. 70, 73–85 (2005)
19. Li, B., He, W.: The structures of intuitionistic fuzzy equivalence relations. Inf. Sci. 278, 883–899 (2014)
20. Ruspini, E.H.: A new approach to clustering. Inf. Control 15, 22–32 (1969)
21. Saaty, T.L.: Axiomatic foundation of the analytical hierarchy process. Manag. Sci. 32, 841–855 (1986)
22. Saaty, T.L.: Highlights and critical points in the theory and application of the analytic hierarchy process.
Eur. J. Oper. Res. 74, 426–447 (1994)
23. Saaty, T.L.: Rank from comparisons and from ratings in the analytic hierarchy/network processes. Eur. J.
Oper. Res. 168, 557–570 (2006)
24. Szmidt, E., Kacprzyk, J.: Distances between intuitionistic fuzzy sets. Fuzzy Sets Syst. 114, 505–518
(2000)
25. Tamura, S., Higuchi, S., Tanaka, K.: Pattern classification based on fuzzy relations. IEEE Trans. Syst.
Man Cybern. 1, 61–66 (1971)
26. Trauwaert, E., Kaufman, L., Rousseeuw, P.: Fuzzy clustering algorithms based on the maximum likelihood
principle. Fuzzy Sets Syst. 42, 213–227 (1991)
27. Wang, Z., Xu, Z.S., Liu, S.S., Tang, J.: A netting clustering analysis method under intuitionistic fuzzy
environment. Appl. Soft Comput. 11, 5558–5564 (2011)
28. Wang, Z., Xu, Z.S., Liu, S.S., Yao, Z.Q.: Direct clustering analysis based on intuitionistic fuzzy implica-
tion. Appl. Soft Comput. 23, 1–8 (2014)
29. Xu, Z.S.: Intuitionistic fuzzy hierarchical clustering algorithms. J. Syst. Eng. Electron. 20, 90–97 (2009)
30. Xu, Z.S., Chen, J., Wu, J.J.: Clustering algorithm for intuitionistic fuzzy sets. Inf. Sci. 178(19), 3775–3790
(2008)


31. Xu, Z.S., Tang, J., Liu, S.S.: An orthogonal algorithm for clustering intuitionistic fuzzy information. Inf.
Int. Interdiscip. J. 14, 65–78 (2011)
32. Xu, Z.S., Wu, J.J.: Intuitionistic fuzzy c-means clustering algorithms. J. Syst. Eng. Electron. 21, 580–590
(2010)
33. Xu, D., Xu, Z.S., Liu, S.S., Zhao, H.: A spectral clustering algorithm based on intuitionistic fuzzy infor-
mation. Knowl.-Based Syst. 53, 20–26 (2013)
34. Xu, Z.S., Yager, R.R.: Some geometric aggregation operators based on intuitionistic fuzzy sets. Int. J.
Gen. Syst. 35(4), 417–433 (2006)
35. Yang, M.S., Shih, H.M.: Cluster analysis based on fuzzy relations. Fuzzy Sets Syst. 120, 197–212 (2001)
36. Zadeh, L.A.: Fuzzy sets. Inf. Control 8, 338–356 (1965)
37. Zadeh, L.A.: Similarity relations and fuzzy ordering. Inf. Sci. 3, 177–200 (1971)
38. Zhao, H., Xu, Z., Liu, S., Wang, Z.: Intuitionistic fuzzy MST clustering algorithms. Comput. Ind. Eng.
62, 1130–1140 (2012)
39. Zhao, H., Xu, Z.S., Wang, Z.: Intuitionistic fuzzy clustering algorithm based on Boole matrix and asso-
ciation measure. Int. J. Inf. Technol. Decis. Mak. 12, 95–118 (2013)
