Published in: Neural Processing Letters 3/2018

08.02.2018

Centroid Neural Network with Pairwise Constraints for Semi-supervised Learning

Authors: Minh Tran Ngoc, Dong-Chul Park

Abstract

This paper proposes a clustering algorithm for datasets with pairwise constraints based on the Centroid Neural Network (Cent.NN). The proposed algorithm, referred to as the Centroid Neural Network with Pairwise Constraints (Cent.NN-PC), uses Cent.NN as its backbone clustering algorithm and adopts a semi-supervised learning process for the pairwise constraints. A new energy function, derived from that of the original Cent.NN, introduces penalty terms for violated constraints. The weight update procedure of Cent.NN-PC finds prototypes for the given dataset that minimize the quantization error while also minimizing the number of violated constraints. To evaluate its performance, experiments are carried out on six datasets from the UCI database and two bioinformatics datasets from the KEEL repository. The proposed algorithm is compared with the Linear Constrained Vector Quantization Error (LCVQE) algorithm, one of the most commonly used algorithms for clustering with pairwise constraints. In the experiments, five different numbers of pairwise constraints are used to evaluate clustering performance under constraint sets of different sizes. The results show that Cent.NN-PC outperforms LCVQE on most performance criteria, including the total quantization error, the number of violated constraints, and three external measures: classification accuracy, F-score, and NMI. The experiments also show that Cent.NN-PC provides considerably more stable clustering results and faster operation than LCVQE.
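The abstract describes the general idea of combining a quantization-error objective with penalties for violated must-link and cannot-link constraints, but not the exact Cent.NN-PC weight update. The following Python sketch therefore illustrates only that general idea under stated assumptions: the function name constrained_clustering, the penalty weight gamma, and the k-means-style centroid update are illustrative choices, not the Cent.NN-PC update rule itself.

import numpy as np

def constrained_clustering(X, k, must_link, cannot_link, gamma=1.0,
                           n_iters=50, seed=0):
    """Toy semi-supervised clustering with pairwise-constraint penalties.

    Assumed energy: total squared quantization error plus
    gamma * (number of violated must-link / cannot-link constraints).
    """
    rng = np.random.default_rng(seed)
    prototypes = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)

    for _ in range(n_iters):
        # Assignment step: each point pays its quantization error plus a
        # penalty for every constraint it would violate given the current
        # labels of its constrained partners.
        for i, x in enumerate(X):
            costs = np.sum((prototypes - x) ** 2, axis=1)
            for a, b in must_link:
                j = b if a == i else a if b == i else None
                if j is not None:
                    # Penalize clusters that differ from the partner's cluster.
                    costs = costs + gamma * (np.arange(k) != labels[j])
            for a, b in cannot_link:
                j = b if a == i else a if b == i else None
                if j is not None:
                    # Penalize the cluster currently holding the partner.
                    costs = costs + gamma * (np.arange(k) == labels[j])
            labels[i] = int(np.argmin(costs))

        # Update step: move each prototype to the centroid of its cluster
        # (a k-means-style stand-in for the Cent.NN prototype update).
        for c in range(k):
            members = X[labels == c]
            if len(members) > 0:
                prototypes[c] = members.mean(axis=0)

    return prototypes, labels

# Example: two separated Gaussian blobs with a few (hypothetical) constraints.
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
prototypes, labels = constrained_clustering(
    X, k=2, must_link=[(0, 1), (60, 61)], cannot_link=[(0, 60)], gamma=10.0)

In this sketch, increasing gamma makes constraint satisfaction dominate the assignment decision, which mirrors the trade-off the abstract describes between quantization error and the number of violated constraints.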


Metadata
Title
Centroid Neural Network with Pairwise Constraints for Semi-supervised Learning
Authors
Minh Tran Ngoc
Dong-Chul Park
Publication date
08.02.2018
Publisher
Springer US
Published in
Neural Processing Letters / Issue 3/2018
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-018-9794-8
