Published in: Natural Computing 2/2009

01-06-2009

A framework for machine learning based on dynamic physical fields

Authors: Dymitr Ruta, Bogdan Gabrys

Abstract

Despite recent successes and advances in artificial intelligence and machine learning, the field remains continuously challenged and guided by phenomena and processes observed in the natural world. Humans remain unsurpassed in their efficiency at dealing with and learning from uncertain information arriving in a variety of forms, while more and more robust learning and optimisation algorithms build their analytical engines on nature-inspired phenomena. The excellence of neural networks and kernel-based learning methods, and the emergence of particle-, swarm- and social-behaviour-based optimisation methods, are just a few of many indications of a trend towards greater exploitation of nature-inspired models and systems. This work demonstrates how the simple concept of a physical field can be adopted to build a complete framework for supervised and unsupervised learning. Inspiration for artificial learning has been found in the mechanics of physical fields observed on both micro and macro scales. Exploiting the analogies between data and charged particles subjected to gravity, electrostatic and gas-particle fields, a family of new algorithms has been developed and applied to classification, clustering and data condensation, while properties of the field were further used in a unique visualisation of classification and classifier-fusion models. The paper includes extensive pictorial examples and visual interpretations of the presented techniques, along with comparative testing on well-known real and artificial datasets.
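To make the field analogy concrete, here is a minimal sketch of how a gravity-like field could drive classification: each training point is treated as a unit "mass" whose attraction decays with distance, and a query point is assigned to the class exerting the strongest aggregate pull. This is an illustrative assumption, not the authors' actual algorithm; the function name `field_classify` and the inverse-power potential `1/d**p` are choices made for this sketch.

```python
import math
from collections import defaultdict

def field_classify(X_train, y_train, x_query, p=2):
    # Accumulate a gravity-like "potential" per class: each training
    # point contributes 1/d**p, so nearby points dominate the score.
    scores = defaultdict(float)
    for x, label in zip(X_train, y_train):
        d = math.dist(x, x_query)
        scores[label] += 1.0 / max(d, 1e-12) ** p  # guard against d == 0
    # Assign the query to the class with the strongest aggregate field.
    return max(scores, key=scores.get)

# Toy example: two well-separated clusters in the plane.
X = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]
y = [0, 0, 1, 1]
print(field_classify(X, y, (0.2, 0.1)))  # prints 0
print(field_classify(X, y, (5.1, 5.0)))  # prints 1
```

The exponent `p` plays the role of the field's decay law; larger values make the decision increasingly local, approaching nearest-neighbour behaviour.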


Footnotes
1
University of California, Irvine (UCI) Repository of Machine Learning Databases and Domain Theories, available free at: ftp.ics.uci.edu/pub/machine-learning-databases
 
Metadata
Title
A framework for machine learning based on dynamic physical fields
Authors
Dymitr Ruta
Bogdan Gabrys
Publication date
01-06-2009
Publisher
Springer Netherlands
Published in
Natural Computing / Issue 2/2009
Print ISSN: 1567-7818
Electronic ISSN: 1572-9796
DOI
https://doi.org/10.1007/s11047-007-9064-6
