Pattern Recognition

Volume 27, Issue 5, May 1994, Pages 757-764

Fuzzy Kohonen clustering networks

https://doi.org/10.1016/0031-3203(94)90052-3

Abstract

Kohonen networks are well known for cluster analysis (unsupervised learning). This class of algorithms is a set of heuristic procedures that suffer from several major problems: neither termination nor convergence is guaranteed, no model is optimized by the learning strategy, and the output often depends on the order in which the data are presented. A fuzzy Kohonen clustering network is proposed which integrates the Fuzzy c-Means (FCM) model into the learning rate and updating strategies of the Kohonen network. This yields an optimization problem related to FCM, and the numerical results show improved convergence as well as reduced labeling errors. It is proved that the proposed scheme is equivalent to the c-Means algorithms. The new method can be viewed as a Kohonen type of FCM, but it is “self-organizing”, since the “size” of the update neighborhood and the learning rate in the competitive layer are adjusted automatically during learning. Anderson's IRIS data are used to illustrate this method, and the results are compared with the standard Kohonen approach.
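
The abstract describes the core mechanism: FCM memberships drive both the learning rate and the prototype update, and the fuzzifier is annealed so that the effective update neighborhood shrinks on its own. Below is a minimal batch sketch of that idea in Python/NumPy; the function name `fkcn`, the annealing schedule, and all parameter defaults are our assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def fkcn(X, c, m0=2.0, t_max=50, tol=1e-5, eps=1e-12, seed=0):
    """Sketch of a fuzzy Kohonen clustering network (batch form).

    X   : (n, p) data matrix
    c   : number of clusters
    m0  : initial fuzzifier (> 1); annealed toward 1, which sharpens
          the memberships and so shrinks the effective neighborhood
    """
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    V = X[rng.choice(n, size=c, replace=False)]   # initial prototypes
    dm = (m0 - 1.0) / t_max                       # per-step fuzzifier decrement
    U = np.full((c, n), 1.0 / c)

    for t in range(t_max):
        m_t = m0 - t * dm                         # m_t stays > 1 for t < t_max
        # squared Euclidean distances d[i, k] = ||x_k - v_i||^2
        d = ((X[None, :, :] - V[:, None, :]) ** 2).sum(axis=-1) + eps
        # FCM-style memberships for fuzzifier m_t; dividing by the column
        # minimum before exponentiation avoids overflow as m_t -> 1
        U = (d / d.min(axis=0, keepdims=True)) ** (-1.0 / (m_t - 1.0))
        U /= U.sum(axis=0, keepdims=True)
        # learning rates a_ik = u_ik^{m_t}; weighted-mean prototype update
        A = U ** m_t
        V_new = (A @ X) / A.sum(axis=1, keepdims=True)
        if np.abs(V_new - V).max() < tol:         # early stop on convergence
            return V_new, U
        V = V_new
    return V, U
```

With a fixed fuzzifier (no annealing) each pass is simply a batch FCM step; letting m_t fall toward 1 makes the competition increasingly crisp, which is the “self-organizing” shrinking-neighborhood behavior the abstract refers to.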


Cited by (240)

  • Mining event logs for knowledge discovery based on adaptive efficient fuzzy Kohonen clustering network

    2020, Knowledge-Based Systems
    Citation Excerpt:

    Due to such distinct advantages, FCM has been combined with other concepts to obtain more desirable results for large data in multi-dimensional spaces and noisy environments [17,18]. One of the best-known models is the fuzzy Kohonen clustering network (FKCN) [19], which takes the best of KCN and FCM by integrating FCM into the learning rate and updating strategies of KCN. As a result, the neighborhood size and the learning rate are updated automatically, while the cluster weights are obtained by minimizing the objective function [20] (the standard form of that objective is sketched after this list).

  • Imbalanced credit risk evaluation based on multiple sampling, multiple kernel fuzzy self-organizing map and local accuracy ensemble

    2020, Applied Soft Computing Journal
    Citation Excerpt:

    Jardin used the quantification of temporal patterns to improve SOM, creating an ensemble model that characterizes the financial health of a set of companies, and used an ensemble of incremental-size maps to make forecasts [30]. Fuzzy c-Means (FCM) is a common algorithm for such improvement because its computation process resembles that of SOM [31,32]. Kvalsund and Ripon combined SOM with a fuzzy set model and assessed the impact of SOM used as a discriminant-analysis function in a hybrid intelligent system for multi-factor financial prediction [33].

  • BIM log mining: Exploring design productivity characteristics

    2020, Automation in Construction
    Citation Excerpt:

    Interfacing neural networks with fuzzy clustering by incorporating fuzzy membership values into the learning rate has therefore become a research focus [14]. By merging KCN and FCM, this kind of hybrid clustering method, called the fuzzy Kohonen clustering network (FKCN), inherits the advantages of both KCN and FCM and makes up for the shortcomings of each [45]. In summary, the superiority of FKCN shows in three major ways: (1) it can handle data with ambiguity and uncertainty; (2) it is not very sensitive to initial parameters; and (3) it converges faster, with fewer training cycles.
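
For reference, the objective function mentioned in the excerpts above is the standard Fuzzy c-Means objective; the following is its textbook form (the notation is ours, not copied from the paper):

```latex
% FCM objective: n samples x_k, c prototypes v_i, fuzzifier m > 1
J_m(U, V) = \sum_{k=1}^{n} \sum_{i=1}^{c} (u_{ik})^{m} \, \lVert x_k - v_i \rVert^{2},
\qquad \text{subject to } \sum_{i=1}^{c} u_{ik} = 1 \ \text{for each } k .
```

Annealing the fuzzifier m toward 1 during learning makes the memberships u_{ik} increasingly crisp, which is why FKCN's effective update neighborhood shrinks automatically.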
