1998 | OriginalPaper | Buchkapitel
A Review of Neural Networks with Direct Learning Based on Linear or Non-Linear Threshold Logics
Author: Daniel M. Dubois
Published in: Computational Intelligence: Soft Computing and Fuzzy-Neuro Integration with Applications
Publisher: Springer Berlin Heidelberg
Included in: Professional Book Archive
This paper reviews the non-linear threshold logic developed jointly by D. Dubois, G. Resconi and A. Raymondi, a significant extension of the neural threshold logic pioneered by McCulloch and Pitts. The output of their formal neuron is given by the Heaviside function applied to a linear weighted sum of the inputs minus a threshold parameter. Not every Boolean table can be represented by such a formal neuron: for example, the exclusive OR and, more generally, the parity problem require hidden neurons. Some years ago, Dubois proposed a non-linear fractal neuron that solves the exclusive OR problem with a single neuron. Dubois and Resconi then introduced the non-linear threshold logic, that is, a Heaviside function applied to a non-linear sum of the inputs, which can represent any Boolean table with only one neuron; in this scheme, Dubois' non-linear neuron model is a fixed Heaviside function. In this framework the supervised learning is direct, meaning that the weights and threshold are computed without recursive algorithms, following the new foundation of threshold logic by Resconi and Raymondi. The paper reviews the main aspects of linear and non-linear threshold logic with direct learning, together with applications to pattern recognition using the software TurboBrain. This constitutes a new tool in the framework of Soft Computing.
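To make the exclusive-OR limitation concrete, the following is a minimal illustrative sketch (not the paper's exact fractal or non-linear formulation, whose details are in the chapter itself): a McCulloch-Pitts-style neuron whose Heaviside argument is extended with a single product term. The particular weights are hand-picked for illustration.

```python
# Heaviside step function, the activation of the McCulloch-Pitts formal neuron.
def heaviside(s):
    return 1 if s >= 0 else 0

# Classical linear threshold neuron: H(w1*x1 + w2*x2 - theta).
# No choice of (w1, w2, theta) makes this compute XOR, because the
# four XOR points are not linearly separable in the (x1, x2) plane.
def linear_neuron(x1, x2, w1, w2, theta):
    return heaviside(w1 * x1 + w2 * x2 - theta)

# Adding a non-linear (product) term to the argument lets a SINGLE
# neuron realise XOR. The weights below are an illustrative choice:
# H(x1 + x2 - 2*x1*x2 - 0.5) = x1 XOR x2 on Boolean inputs.
def nonlinear_neuron(x1, x2):
    return heaviside(x1 + x2 - 2 * x1 * x2 - 0.5)

if __name__ == "__main__":
    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", nonlinear_neuron(x1, x2))
```

Checking the four input pairs shows the non-linear argument takes the value -0.5 on (0,0) and (1,1) and +0.5 on (0,1) and (1,0), so a single Heaviside neuron reproduces the full XOR table, the case that a purely linear argument cannot handle.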