Associations Among Information Granules and Their Optimization in Granulation-Degranulation Mechanism of Granular Computing
International Journal of Fuzzy Logic and Intelligent Systems. 2013. Dec, 13(4): 245-253
Copyright © 2013, Korean Institute of Intelligent Systems
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/) which permits unrestricted noncommercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
  • Received : December 04, 2013
  • Accepted : December 24, 2013
  • Published : December 25, 2013
Witold Pedrycz

Abstract
Knowledge representation realized by information granules is one of the essential facets of granular computing and an area of intensive research. Clustering and fuzzy clustering are general vehicles for forming information granules. The granulation-degranulation paradigm is one of the schemes determining and quantifying the functionality and knowledge representation capabilities of information granules. In this study, we augment this paradigm by forming and optimizing a collection of associations among original and transformed information granules. We discuss several transformation schemes and analyze their properties. A series of numeric experiments is provided, with which we quantify the improvement of the degranulation mechanism offered by the optimized transformation of information granules.
1. Introduction
Information granules, being at the center of granular computing [1, 2], are fundamental concepts with which we perceive, structure, and process knowledge in a human-centric fashion. With the ongoing progress witnessed in granular computing, it has become apparent that a number of central problems still deserve our attention, including the way of constructing information granules and the process of communication with the external world of data when conveying information granules. The first group of tasks directly relates to clustering and fuzzy clustering. Clustering has been a central conceptual and algorithmic vehicle supporting the realization of information granules. We have witnessed a great deal of development in this domain [3], including numerous generalizations of clustering methods yielding the formation of higher-order granular constructs, see [4], and granular fuzzy clusters [1] in general. There has also been a visible role of fuzzy clustering in system modeling [5, 6]. The second problem, communication between the world of numeric data and granular entities, revolves around the fundamental concept of the granulation-degranulation mechanism and the assessment of its results quantified in terms of the so-called degranulation criterion. The quality of information granules is captured by looking at the way numeric data are granulated and afterwards degranulated, and at the extent to which the tandem of these two processing phases distorts the data. The distortions are inevitable (with several exceptions, which deal with one-dimensional data and engage triangular membership functions, see [1, 7]).
The objective of this study is to further improve the granulation-degranulation abilities by introducing and optimizing a transformation of information granules so that the resulting degranulation error is minimized. We formulate the overall task as an optimization problem, introduce several categories of transformation functions (realizing suitable interactions/associations among information granules), and discuss the underlying optimization procedures.
The study is organized as follows. We start with a concise review of information granulation realized with the aid of fuzzy clustering (Section 2). The granulation-degranulation idea along with the formal formulation of the problem is presented in Section 3. Section 4 is devoted to the design of interactions among information granules while in Section 5 we provide a number of illustrative experimental studies.
Throughout this study, we consider information granules formed in the n-dimensional space of real numbers, x ∈ R^n.
2. Information Granulation Through Fuzzy Clustering
Let us briefly recall the underlying concept and ensuing algorithmic aspects of the fuzzy C-means (FCM) algorithm [3 , 8] , which is one of the commonly used techniques of fuzzy clustering and a computational vehicle to build information granules.
Given is a collection of n-dimensional data (patterns) {x_k | k = 1, 2, …, N}, where x_k ∈ R^n. Our objective is to determine its structure, that is, a collection of "c" clusters (information granules). From the algorithmic perspective, the problem is posed as a minimization of the following objective function (performance index) Q
$$Q = \sum_{i=1}^{c} \sum_{k=1}^{N} u_{ik}^{m} \, \| \mathbf{x}_k - \mathbf{v}_i \|^2 \tag{1}$$
where v_1, v_2, …, v_c are the n-dimensional prototypes of the clusters and U = [u_ik] stands for a partition matrix expressing the way the data are allocated to the corresponding clusters; u_ik is the membership degree of the data point x_k in the i-th cluster. The distance between the data x_k and the prototype v_i is denoted by ||.||. The fuzzification coefficient m (assuming values greater than 1) quantifies the impact of the membership grades on the individual clusters and implies a certain geometry of the ensuing information granules. Typically, the value of the fuzzification coefficient is set to 2. The partition matrix satisfies two essential and practically justifiable properties
$$\sum_{i=1}^{c} u_{ik} = 1, \quad k = 1, 2, \dots, N \tag{2}$$
$$0 < \sum_{k=1}^{N} u_{ik} < N, \quad i = 1, 2, \dots, c \tag{3}$$
The minimization of Q is completed with respect to U ∈ U and the prototypes v_i of V = {v_1, v_2, …, v_c} of the clusters, namely
$$\min_{U \in \mathbf{U},\, \mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_c} Q$$
Here U stands for the family of partition matrices, viz., the matrices satisfying the conditions expressed by Eqs. (2) and (3).
From the optimization perspective, there are two individual optimization tasks to be carried out separately to determine the partition matrix and the prototypes. The results are well-documented and thoroughly discussed in the literature, and the method has been intensively experimented with. The optimization process is iterative and involves successive computation of the partition matrix and the prototypes
$$u_{ik} = \frac{1}{\displaystyle\sum_{j=1}^{c} \left( \frac{\| \mathbf{x}_k - \mathbf{v}_i \|}{\| \mathbf{x}_k - \mathbf{v}_j \|} \right)^{2/(m-1)}}$$
i = 1, 2, …, c; k = 1, 2, …, N.
$$v_{st} = \frac{\displaystyle\sum_{k=1}^{N} u_{sk}^{m} \, x_{kt}}{\displaystyle\sum_{k=1}^{N} u_{sk}^{m}}$$
s = 1, 2, …, c; t = 1, 2, …, n (in the above calculation of the prototypes it has been assumed that the distance function is Euclidean). We can look at the partition matrix by considering its individual rows; denoting them by A_1, A_2, …, A_c, we emphasize that fuzzy clustering gives rise to a collection of information granules.
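To make the iterative scheme above concrete, the following sketch (Python with NumPy) alternates the two update formulas. It is a minimal illustration under the assumptions stated in the comments, not the implementation used in the experiments reported later; names such as `fcm`, `n_iter`, and `seed` are ours.

```python
import numpy as np

def fcm(X, c, m=2.0, n_iter=60, seed=0):
    """Minimal fuzzy C-means sketch: alternates the prototype and
    partition-matrix updates of Section 2 (plain Euclidean distance)."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    # random initial partition matrix; each column sums to 1, cf. Eq. (2)
    U = rng.random((c, N))
    U /= U.sum(axis=0, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # prototypes: v_s = sum_k u_sk^m x_k / sum_k u_sk^m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # distances ||x_k - v_i||, guarded against exact zeros
        D = np.maximum(np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2), 1e-12)
        # u_ik = 1 / sum_j (||x_k - v_i|| / ||x_k - v_j||)^(2/(m-1))
        U = 1.0 / ((D[:, None, :] / D[None, :, :]) ** (2.0 / (m - 1.0))).sum(axis=1)
    return U, V

# Example: two well-separated synthetic groups
# U, V = fcm(np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0]), c=2)
```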
3. Granulation and Degranulation Principle
More formally, this knowledge representation can be described as a way of expressing any data point x in terms of the information granules, with the result being a vector in the c-dimensional unit hypercube [0,1]^c,
$$G(\mathbf{x}) = [u_1(\mathbf{x}), u_2(\mathbf{x}), \dots, u_c(\mathbf{x})] \in [0,1]^c$$
Figure 1. The granulation-degranulation mechanism: a general view at the level of processing information granules. FCM, fuzzy C-means.
The degranulation step is about a reconstruction of x on the basis of the family of information granules (clusters). It can be treated as a certain mapping
$$G^{-1}: [0,1]^c \rightarrow \mathbf{R}^n$$
The capability of the information granules to reflect the structure of the original data can be conveniently expressed by comparing how much the result of degranulation, say x̂, differs from the original pattern x, that is,

$$\hat{\mathbf{x}} \neq \mathbf{x}$$
More formally,
$$\hat{\mathbf{x}} = G^{-1}(G(\mathbf{x}))$$
where G and G^{-1} denote the corresponding phases of information granulation and degranulation [9].
The crux of the granulation-degranulation principle is visualized in Figure 1 [1]. Note that the transformations G and G^{-1} operate between the space of data and the abstract space of information granules.
Let us start with the granulation phase. More specifically, x is expressed in the form of the membership grades u_i of x in the individual granules A_i, which form the solution to the following optimization problem
$$\min_{u_1(\mathbf{x}), \dots, u_c(\mathbf{x})} \; \sum_{i=1}^{c} u_i^{m}(\mathbf{x}) \, \| \mathbf{x} - \mathbf{v}_i \|^2$$
subject to the usual constraints imposed on the degrees of membership
$$\sum_{i=1}^{c} u_i(\mathbf{x}) = 1, \qquad u_i(\mathbf{x}) \in [0,1]$$
The solution to the problem (we use here Lagrange multipliers) comes in the form,
$$u_i(\mathbf{x}) = \frac{1}{\displaystyle\sum_{j=1}^{c} \left( \frac{\| \mathbf{x} - \mathbf{v}_i \|}{\| \mathbf{x} - \mathbf{v}_j \|} \right)^{2/(m-1)}}$$
For the degranulation phase, given the membership grades u_i(x) and the prototypes v_i, the vector x̂ is taken as the solution to the minimization problem in which we reconstruct (degranulate) the original x using the prototypes and the corresponding membership grades
$$\min_{\hat{\mathbf{x}}} \; \sum_{i=1}^{c} u_i^{m}(\mathbf{x}) \, \| \hat{\mathbf{x}} - \mathbf{v}_i \|^2$$
Considering the use of the Euclidean distance in the above performance index, the subsequent calculations are straightforward, yielding the result
$$\hat{\mathbf{x}} = \frac{\displaystyle\sum_{i=1}^{c} u_i^{m}(\mathbf{x}) \, \mathbf{v}_i}{\displaystyle\sum_{i=1}^{c} u_i^{m}(\mathbf{x})}$$
It is important to note that the description of x in a more abstract fashion, realized by means of the A_i and followed by the consecutive degranulation, brings about a certain granulation error (which is inevitable given the fact that we move back and forth between different levels of abstraction). While the above formulas pertain to granulation realized by fuzzy sets, the granulation-degranulation error is also present when dealing with sets (intervals). In this case we are faced with a quantization error, which becomes inevitable when working with A/D (granulation) and D/A (degranulation) conversion mechanisms. The problem formulated above is also associated with vector quantization, which has been widely studied in the literature, cf. [10-14].
The quality of the granulation-degranulation scheme is quantified by means of the following performance index [9]
$$V = \sum_{k=1}^{N} \| \hat{\mathbf{x}}_k - \mathbf{x}_k \|^2 \tag{14}$$
This performance index articulates how the reconstruction error relates to the representation capabilities of the information granules.
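As a small illustration of the two phases and of the performance index (14), consider the following sketch (Python/NumPy). It assumes the prototypes come from a fuzzy clustering run such as the one sketched in Section 2 and uses the plain Euclidean distance; the helper names `granulate`, `degranulate`, and `degranulation_error` are ours.

```python
import numpy as np

def granulate(x, V, m=2.0):
    """G: R^n -> [0,1]^c, membership grades of x with respect to prototypes V."""
    d = np.maximum(np.linalg.norm(x - V, axis=1), 1e-12)   # ||x - v_i||
    return 1.0 / ((d[:, None] / d[None, :]) ** (2.0 / (m - 1.0))).sum(axis=1)

def degranulate(u, V, m=2.0):
    """G^{-1}: [0,1]^c -> R^n, reconstruction of x from its grades and the prototypes."""
    um = u ** m
    return (um @ V) / um.sum()

def degranulation_error(X, V, m=2.0):
    """Performance index of Eq. (14): sum of squared reconstruction errors."""
    return sum(np.sum((degranulate(granulate(x, V, m), V, m) - x) ** 2) for x in X)
```

This error is exactly the quantity that the transformation of information granules introduced in the next section attempts to reduce.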
4. Building Interactions Among Information Granules
Denoting the information granules formed through the process of clustering by A_1, A_2, …, A_c, we consider a certain framework of interaction among them, where such interaction gives rise to an enhancement of the degranulation features and subsequently a reduction of the degranulation error. To highlight the essence of the approach, we can refer to Figure 2.
The essence of this enhanced mechanism dwells on a mapping (transformation) F: {A_1, A_2, …, A_c} → {B_1, B_2, …, B_c}, where B_i = F(A_1, A_2, …, A_c). The original membership functions are affected by their interaction with the other information granules. The newly formed granules are developed in a way that furnishes them with better degranulation capabilities (viz., a lower degranulation error).

Figure 2. Interaction-augmented degranulation mechanism; note the processing module resulting in a layer of interaction among the original fuzzy sets. FCM, fuzzy C-means.
In general, the transformation F: [0,1]^c → [0,1]^c can be implemented in numerous ways. Several interesting and practically viable alternatives are summarized in Table 1. The same table includes some comments about the nature of the mapping.
The transformation F is typically endowed with some parameters (say, a matrix of weights), and these provide the flexibility of the transformation. Both the type of the mapping and its parameters (their numeric values) are subject to optimization guided by the degranulation error V. More formally, we can describe the result of this optimization as follows
$$(F_{\mathrm{opt}}, \mathbf{w}_{\mathrm{opt}}) = \arg \min_{F, \mathbf{w}} V \tag{15}$$
where F_opt is the optimal transformation (coming out of a finite family of alternatives), whereas w_opt is the optimized vector of its parameters. The minimization shown in Eq. (15) is realized with regard to the type of the transformation F and its parameters w.
As to the development of interactions, a certain general tendency can be observed. Consider the linear combination of the original information granules shown in Table 1. Let us rewrite it in the following form
$$B_i(\mathbf{x}) = w_{ii} A_i(\mathbf{x}) + \sum_{\substack{j=1 \\ j \neq i}}^{c} w_{ij} A_j(\mathbf{x})$$
We note that B_i is a distorted (transformed) version of A_i in which, in an attempt to minimize the degranulation error, the original information granule A_i is impacted by the membership grades of the remaining A_j's. Depending upon the sign of the weights, this process is of an excitatory (w_ij > 0) or inhibitory (w_ij < 0) nature. In other words, the A_j's associated with negative weights w_ij form an inhibitory link, while an excitatory link is realized by the A_j's coming with positive entries of W. A lack of interaction is observed when the w_ij are close to zero for all i ≠ j, or when W = I, where I is the identity matrix. While a more sophisticated transformation may contribute to a lower degranulation error, the interpretability of the transformation itself could become more difficult because of the more nonlinear and convoluted character of the established interactions among information granules.

Table 1. Selected alternatives of realization of transformation mechanisms
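For the linear variant above, a sketch of the transformation step might look as follows (Python/NumPy). The paper only requires F: [0,1]^c → [0,1]^c, so the clipping of the transformed grades back to the unit interval, as well as the function name `transform_memberships`, are our assumptions.

```python
import numpy as np

def transform_memberships(u, W):
    """Linear association among granules: B(x) = W A(x);
    clipping to [0,1] keeps the result in the unit hypercube (our assumption)."""
    return np.clip(W @ u, 0.0, 1.0)

u = np.array([0.7, 0.2, 0.1])        # membership grades A_1(x), A_2(x), A_3(x)
W = np.array([[1.0, -0.3, 0.1],      # negative off-diagonal entries: inhibitory links
              [0.2,  1.0, 0.0],      # positive off-diagonal entries: excitatory links
              [0.0,  0.4, 1.0]])
print(transform_memberships(u, np.eye(3)))   # W = I: no interaction, grades unchanged
print(transform_memberships(u, W))           # transformed grades B_1(x), B_2(x), B_3(x)
```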
5. Experimental Studies
In all experiments, we consider the linear transformation of information granules (Table 1). Particle swarm optimization (PSO) is used as the optimization vehicle [15].

Figure 3. Two-dimensional synthetic data.

Figure 4. Degranulation error V reported for c = 2, 3, and 4; the upper (grey) line concerns the error obtained when no associations among information granules are considered (W = I).

We consider a generic version of PSO in which the dynamics (velocity and position) of each individual of a swarm of "M" particles are governed by the following expression:

$$v_{ij}(\mathrm{iter}+1) = w \, v_{ij}(\mathrm{iter}) + c_1 r_{1j} \, [p_{ij} - x_{ij}(\mathrm{iter})] + c_2 r_{2j} \, [p_{gj} - x_{ij}(\mathrm{iter})]$$

i = 1, 2, …, M; j = 1, 2, …, r. Here v_i is the velocity of the i-th particle in a search space of dimensionality "r", and r_1 and r_2 are vectors of random numbers drawn from the uniform distribution over the unit interval; p_i is the best position of the i-th particle observed so far, and p_g is the best particle in the entire swarm. The position of the particle is updated by adding the velocity v_i(iter+1) to the current position x_i(iter). The main parameters of this optimization environment are set as follows (these specific numeric values are those commonly encountered in the literature): c_1 = c_2 = 1.49, w = 0.6. The range of admissible values of the weights is set to [-4, 4].
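A generic PSO loop implementing the above update could be sketched as follows (Python/NumPy). The clipping of positions to the admissible weight range and the helper name `pso` are our assumptions; in this study the r = c × c entries of the association matrix W would form a particle and the fitness would be the degranulation error V.

```python
import numpy as np

def pso(fitness, r, M=120, n_gen=100, c1=1.49, c2=1.49, w=0.6, bounds=(-4.0, 4.0), seed=0):
    """Generic particle swarm optimization minimizing `fitness` over an
    r-dimensional search space, following the velocity update quoted above."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(M, r))           # particle positions
    v = np.zeros((M, r))                           # particle velocities
    p_best = x.copy()                              # personal best positions p_i
    p_val = np.array([fitness(xi) for xi in x])    # their fitness values
    g_best = p_best[np.argmin(p_val)].copy()       # global best p_g
    for _ in range(n_gen):
        r1, r2 = rng.random((M, r)), rng.random((M, r))
        v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
        x = np.clip(x + v, lo, hi)                 # keep weights within [-4, 4] (our choice)
        val = np.array([fitness(xi) for xi in x])
        improved = val < p_val
        p_best[improved], p_val[improved] = x[improved], val[improved]
        g_best = p_best[np.argmin(p_val)].copy()
    return g_best, p_val.min()

# Intended use (sketch): optimize the c*c entries of W with the degranulation
# error as fitness, e.g. pso(lambda wvec: error_for(wvec.reshape(c, c)), r=c*c),
# where error_for is a hypothetical wrapper evaluating Eq. (14) for a given W.
```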
The size of the population was set to 120 individuals, whereas the number of generations was equal to 100. With regard to the use of the FCM algorithm, the fuzzification coefficient was set to 2 and the method was run for 60 iterations (at which point no changes to the values of the objective function were reported). The initial point of the optimization was set up randomly by choosing a partition matrix with random entries.

Figure 5. Results of degranulation: (a) no associations involved, and (b) optimized associations.

The distance function used to run the FCM, as well as to compute the degranulation error, Eq. (14), is the weighted Euclidean distance of the form
$$\| \mathbf{x}_k - \mathbf{v}_i \|^2 = \sum_{j=1}^{n} \frac{(x_{kj} - v_{ij})^2}{s_j^2}$$
where s_j stands for the standard deviation of the j-th variable.
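A compact sketch of this weighted distance is given below (Python/NumPy; the function name is ours). The standard deviations are computed once from the data set and then used both when running the clustering and when evaluating the error of Eq. (14).

```python
import numpy as np

def weighted_sq_dist(x, v, s):
    """Weighted Euclidean distance: sum_j (x_j - v_j)^2 / s_j^2,
    where s_j is the standard deviation of the j-th variable."""
    return np.sum(((x - v) / s) ** 2)

X = np.random.default_rng(0).random((100, 3))   # placeholder data set, N x n
s = X.std(axis=0)                               # per-variable standard deviations
print(weighted_sq_dist(X[0], X[1], s))
```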
5.1 Synthetic Data
We start with synthetic two-dimensional data with the intent of gaining better insight into the nature of the optimization process and the produced results. The data set exhibits three well-delineated and compact clusters (groups); see Figure 3.
We formed c = 2, 3, and 4 information granules (fuzzy clusters) and determined the resulting degranulation error for all these situations; see Figure 4.
The most visible improvement is noted for c = 2. To observe the way in which the reduced value of V translates into the character of the reconstructed data, Figure 5 contrasts the results obtained when no associations among information granules are considered with those produced by the optimized associations. It is apparent that in the former case a number of data points are collapsed, whereas the inclusion of the associations helps retain the original data, as is visible in Figure 5(b), especially with regard to the cluster of data positioned close to the origin.
The matrix of the parameters W delivers a detailed insight into the relationships among the information granules.

Figure 6. Fitness function in successive generations: (a) e-coli, c = 10; (b) auto, c = 7; (c) housing, c = 9.
The relationships (associations) between information granules become more visible with the increase of the number of clusters, with both inhibitory and excitatory linkages being present.
5.2 Selected Machine Learning Repository Data
A collection of three data sets coming from the Machine Learning Repository (http://archive.ics.uci.edu/ml/) is considered, namely e-coli, auto, and housing. Snapshots of the progression of the optimization process, in which the fitness function (degranulation error) is reported in successive generations, are presented in Figure 6.
The plots of V treated as a function of the number of clusters are displayed in a series of figures, Figure 7.

Figure 7. V versus "c"; grey lines relate to reconstruction results produced when no associations are involved: (a) e-coli, (b) auto, and (c) housing.
In all experiments, there is a visible improvement provided by invoking associations among information granules. This enhancement is quite steady irrespective of the number of clusters used in the granulation process. We also witness a saturation effect, meaning that the values of V tend to stabilize with the increase of the number of clusters. The plots displayed in Figure 8 deliver a certain look at the values of the weights; the diversity of the inhibitory and excitatory effects delivered by their combination is also apparent. A more global view can be formed by considering the sum of the weights reported in the individual rows of the weight matrix (not counting the entries positioned on the main diagonal); these numbers tell us about the overall impact on the corresponding fuzzy set arising as a result of the transformation. As illustrated in Figure 9, there is a visible diversity of excitatory and inhibitory influences among the information granules.

Figure 8. Entries of the association matrix W: (a) e-coli, (b) auto, and (c) housing.
6. Conclusion
Understanding and quantifying information granules in terms of their representation capabilities is essential to further advancements of granular computing and to pursuing their role in reasoning and modeling schemes. The study covered here underlines a facet of the degranulation mechanism and its improvement by admitting and quantifying linkages existing among information granules. This helps gain a better insight into the nature of, and interaction among, information granules built on a basis of numeric data. In the numeric studies reported in the paper, we stressed the important role of associations among fuzzy sets. While in this study we focused on the formalism of fuzzy sets (and fuzzy clustering), the developed framework is of a general character and as such can be investigated in depth when dealing with other formalisms of information granules (sets, rough sets, shadowed sets, and others).

Figure 9. Levels of interaction reported for the individual fuzzy sets: (a) e-coli, (b) auto, and (c) housing.
Conflict of Interest
No potential conflict of interest relevant to this article was reported.
References
Pedrycz W. 2013 Granular Computing: Analysis and Design of Intelligent Systems Taylor & Francis Boca Raton, FL
Pedrycz W. 2009 “From fuzzy sets to shadowed sets: interpretation and computing,” International Journal of Intelligent Systems 24 (1) 48 - 61    DOI : 10.1002/int.20323
Pedrycz W. 2005 Knowledge-Based Clustering: From Data to Information Granules Wiley Hoboken, NJ
Hwang C. , Rhee F. C. H. 2007 “Uncertain fuzzy clustering: interval type-2 fuzzy approach to c-means,” IEEE Transactions on Fuzzy Systems 15 (1) 107 - 120    DOI : 10.1109/TFUZZ.2006.889763
Kim S. J. , Seo I. Y. 2012 “A clustering approach to wind power prediction based on support vector regression,” International Journal of Fuzzy Logic and Intelligent Systems 12 (2) 108 - 112    DOI : 10.5391/IJFIS.2012.12.2.108
Ye X. Y. , Han M. M. 2013 “A systematic approach to improve fuzzy C-mean method based on genetic algorithm,” International Journal of Fuzzy Logic and Intelligent Systems 13 (3) 178 - 185    DOI : 10.5391/IJFIS.2013.13.3.178
Pedrycz W. 1994 “Why triangular membership functions?,” Fuzzy Sets and Systems 64 (1) 21 - 30    DOI : 10.1016/0165-0114(94)90003-5
Bezdek J. C. 1981 Pattern Recognition With Fuzzy Objective Function Algorithms Plenum Press New York, NY
Pedrycz W. , de Oliveira J. V. 2008 “A development of fuzzy encoding and decoding through fuzzy clustering,” IEEE Transactions on Instrumentation and Measurement 57 (4) 829 - 837    DOI : 10.1109/TIM.2007.913809
Gersho A. , Gray R. M. 1992 Vector Quantization and Signal Compression Kluwer Academic Publishers
Gray R. M. 1984 “Vector quantization,” IEEE ASSP Magazine 1 (2) 4 - 29    DOI : 10.1109/MASSP.1984.1162229
Lendasse A. , Francois D. , Wertz V. , Verleysen M. 2005 “Vector quantization: a weighted version for time-series forecasting,” Future Generation Computer Systems 21 (7) 1056 - 1067    DOI : 10.1016/j.future.2004.03.006
Linde Y. , Buzo A. , Gray R. M. 1980 “An algorithm for vector quantizer design,” IEEE Transactions on Communications 28 (1) 84 - 95    DOI : 10.1109/TCOM.1980.1094577
Yair E. , Zeger K. , Gersho A. 1992 “Competitive learning and soft competition for vector quantizer design,” IEEE Transactions on Signal Processing 40 (2) 294 - 309    DOI : 10.1109/78.124940
Shi Y. , Eberhart R. C. “Empirical study of particle swarm optimization,” Proceedings of the 1999 Congress on Evolutionary Computation Washington, DC July 6-9, 1999 1945 - 1950    DOI : 10.1109/CEC.1999.785511