Learning sets of filters using back-propagation☆
Cited by (84)
- Weighting and pruning based ensemble deep random vector functional link network for tabular data classification. Pattern Recognition, 2022. Citation excerpt: "Ranging from vision and video tasks to natural language processing, these deep neural networks have reached state-of-the-art results in multiple domains [1,2]. In conventional neural networks, back-propagation methods are used to train a large number of parameters in these models [3]. Although such a training method makes it possible to optimize the parameters, the time-consuming training process has become a severe problem in recently designed complex neural networks."
- Evolving interpretable neural modularity in free-form multilayer perceptrons through connection costs. Neural Computing and Applications, 2024.
- Predicting the mechanism of pyramidal neurons in synaptic integration by high-frequency electrical stimulation and patch clamp. Proceedings of SPIE - The International Society for Optical Engineering, 2023.
- Appearance-Based Gaze Estimation for ASD Diagnosis. IEEE Transactions on Cybernetics, 2022.
☆ This research was supported by contract N00014-86-K-00167 from the Office of Naval Research and an R. K. Mellon Fellowship to David Plaut.
Copyright © 1987 Published by Elsevier Ltd.