
2020 | OriginalPaper | Chapter

Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data

Authors: Alessandro Salatiello, Martin A. Giese

Published in: Artificial Neural Networks and Machine Learning – ICANN 2020

Publisher: Springer International Publishing


Abstract

Recurrent Neural Networks (RNNs) are popular models of brain function. The typical training strategy is to adjust their input-output behavior so that it matches that of the biological circuit of interest. Even though this strategy ensures that the biological and artificial networks perform the same computational task, it does not guarantee that their internal activity dynamics match. This suggests that the trained RNNs might end up performing the task using a different internal computational mechanism. In this work, we introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics. We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model of motor cortical and muscle activity dynamics. Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons sampled from the biological network. Furthermore, we show that training the RNNs with this method significantly improves their generalization performance. Overall, our results suggest that the proposed method is suitable for building powerful functional RNN models, which automatically capture important computational properties of the biological circuit of interest from sparse neural recordings.
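
The key idea summarized above, namely fitting the network's output and, at the same time, the activity of a sparsely sampled subset of its units, can be illustrated with a minimal sketch. The Python snippet below is not the authors' training procedure; the dimensions, the variable names (n_rec, idx_recorded, lam), and the choice of a simple gradient-based optimizer are assumptions made purely for illustration.

    # Minimal sketch (not the authors' method): train an RNN so that its output
    # matches the target output AND the activity of a small subset of its units
    # matches activity recorded from the reference ("biological") network.
    import torch
    import torch.nn as nn

    # Hypothetical dimensions: trial length, input, recurrent, and output sizes.
    T, n_in, n_rec, n_out = 200, 2, 100, 3
    n_recorded = 10                              # sparse sample of "recorded" units
    idx_recorded = torch.randperm(n_rec)[:n_recorded]

    rnn = nn.RNN(n_in, n_rec, batch_first=True)  # vanilla RNN stand-in
    readout = nn.Linear(n_rec, n_out)
    opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()), lr=1e-3)

    # Placeholder training data; in practice these would be the task inputs, the
    # target outputs, and the recorded activities of the sampled reference units.
    u = torch.randn(1, T, n_in)
    z_target = torch.randn(1, T, n_out)
    r_target = torch.randn(1, T, n_recorded)

    lam = 1.0  # assumed weighting between output loss and internal-dynamics loss
    for step in range(500):
        r, _ = rnn(u)                            # hidden activity, shape (1, T, n_rec)
        z = readout(r)                           # network output, shape (1, T, n_out)
        loss_out = ((z - z_target) ** 2).mean()                     # match the output
        loss_dyn = ((r[..., idx_recorded] - r_target) ** 2).mean()  # match sampled units
        loss = loss_out + lam * loss_dyn
        opt.zero_grad()
        loss.backward()
        opt.step()

In this sketch, the scalar lam trades the output error off against the internal-dynamics error; driving the second term down is what encourages the trained network to adopt internal dynamics resembling those of the reference circuit rather than an arbitrary solution.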


Footnotes
1
These input terms are strong enough to suppress chaotic activity in the network [5].
 
2
Following [5], we included an L2 regularization term for \(\mathbf{J}\).
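As a sketch, such a regularized objective could take the form
\[
\mathcal{L}(\mathbf{J}) = \mathcal{L}_{\mathrm{error}}(\mathbf{J}) + \lambda\,\Vert\mathbf{J}\Vert_F^2,
\]
where \(\Vert\mathbf{J}\Vert_F\) denotes the Frobenius norm of the recurrent weight matrix and \(\lambda > 0\) sets the penalty strength; the specific error term and the value of \(\lambda\) are assumptions of this illustration, not details taken from the paper.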
 
3
To ensure a sufficiently strong effect on the embedder network, the hint signals were scaled by a factor of 5. In addition, the final 110 samples of these signals were replaced with a smooth decay to zero, modeled by spline interpolation, which ensured that the activities return to zero at the end of the movement phase.
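A minimal sketch of how such a smooth decay can be constructed is given below. The factor of 5 and the 110-sample window come from this footnote; the use of a clamped cubic spline (zero slope at the first and last knots) and all variable names are assumptions for illustration.

    # Sketch (assumed implementation): scale a hint signal and replace its final
    # 110 samples with a smooth spline decay to zero.
    import numpy as np
    from scipy.interpolate import CubicSpline

    hint = 5.0 * np.random.randn(500)        # placeholder hint signal, scaled by 5

    n_decay = 110                            # length of the decay window
    start = hint[-n_decay]                   # value at the beginning of the decay
    # Clamped boundary conditions give zero slope at the end knots, so the decay
    # joins the signal smoothly and flattens out at zero.
    knots_t = np.array([0.0, 0.5 * (n_decay - 1), n_decay - 1.0])
    knots_y = np.array([start, 0.5 * start, 0.0])
    spline = CubicSpline(knots_t, knots_y, bc_type='clamped')
    hint[-n_decay:] = spline(np.arange(n_decay))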
 
4
We restricted our analysis to the first five singular vector canonical variables, which, on average, captured \({>}92\%\) of the original data variance.
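A rough sketch of an SVCCA-style comparison between two sets of unit activities is given below. The choice of five canonical variables follows this footnote; the 92% variance criterion in the SVD step, the helper names, and the use of scikit-learn's CCA are illustrative assumptions rather than the authors' exact analysis pipeline.

    # Sketch (assumed implementation) of an SVCCA-style comparison between two
    # sets of unit activities X and Y, each of shape (time, units).
    import numpy as np
    from sklearn.cross_decomposition import CCA

    def svd_reduce(A, var_threshold=0.92):
        # Project mean-centered activity onto the top singular vectors that
        # capture at least `var_threshold` of the variance.
        A = A - A.mean(axis=0)
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        k = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), var_threshold) + 1
        return A @ Vt[:k].T

    X = np.random.randn(1000, 100)           # placeholder activities (time x units)
    Y = np.random.randn(1000, 10)

    Xr, Yr = svd_reduce(X), svd_reduce(Y)
    n_cv = 5                                 # first five canonical variables
    cca = CCA(n_components=min(n_cv, Xr.shape[1], Yr.shape[1]))
    Xc, Yc = cca.fit_transform(Xr, Yr)
    # Correlations of corresponding canonical variables quantify how similar
    # the two populations' dynamics are.
    corrs = [np.corrcoef(Xc[:, i], Yc[:, i])[0, 1] for i in range(Xc.shape[1])]
    print(np.round(corrs, 3))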
 
Literature
1. Chaudhuri, R., Gercek, B., Pandey, B., Peyrache, A., Fiete, I.: The intrinsic attractor manifold and population dynamics of a canonical cognitive circuit across waking and sleep. Nat. Neurosci. 22(9), 1512–1520 (2019)
3. Churchland, M.M., et al.: Stimulus onset quenches neural variability: a widespread cortical phenomenon. Nat. Neurosci. 13(3), 369 (2010)
4. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. The MIT Press, Cambridge (2001)
5. DePasquale, B., Cueva, C.J., Rajan, K., Escola, G.S., Abbott, L.: full-FORCE: a target-based method for training recurrent networks. PLoS ONE 13(2) (2018)
6. Doya, K.: Universality of fully-connected recurrent neural networks. In: Proceedings of 1992 IEEE International Symposium on Circuits and Systems, pp. 2777–2780 (1992)
7. Flash, T., Hochner, B.: Motor primitives in vertebrates and invertebrates. Curr. Opinion Neurobiol. 15(6), 660–666 (2005)
8. Georgopoulos, A.P., Kalaska, J.F., Massey, J.T.: Spatial trajectories and reaction times of aimed movements: effects of practice, uncertainty, and change in target location. J. Neurophysiol. 46(4), 725–743 (1981)
9. Hennequin, G., Vogels, T.P., Gerstner, W.: Optimal control of transient dynamics in balanced networks supports generation of complex movements. Neuron 82(6), 1394–1406 (2014)
10. Kim, C.M., Chow, C.C.: Learning recurrent dynamics in spiking networks. eLife 7, e37124 (2018)
11. Machens, C.K., Romo, R., Brody, C.D.: Flexible control of mutual inhibition: a neural model of two-interval discrimination. Science 307(5712), 1121–1124 (2005)
12. Mante, V., Sussillo, D., Shenoy, K.V., Newsome, W.T.: Context-dependent computation by recurrent dynamics in prefrontal cortex. Nature 503(7474), 78–84 (2013)
13. Matsuoka, K.: Mechanisms of frequency and pattern control in the neural rhythm generators. Biol. Cybern. 56(5–6), 345–353 (1987)
14. Raghu, M., Gilmer, J., Yosinski, J., Sohl-Dickstein, J.: SVCCA: singular vector canonical correlation analysis for deep learning dynamics and interpretability. In: Advances in Neural Information Processing Systems, pp. 6076–6085 (2017)
15. Saxena, S., Cunningham, J.P.: Towards the neural population doctrine. Curr. Opinion Neurobiol. 55, 103–111 (2019)
16. Schoner, G., Kelso, J.: Dynamic pattern generation in behavioral and neural systems. Science 239(4847), 1513–1520 (1988)
17. Shenoy, K.V., Sahani, M., Churchland, M.M.: Cortical control of arm movements: a dynamical systems perspective. Ann. Rev. Neurosci. 36, 337–359 (2013)
18. Stroud, J.P., Porter, M.A., Hennequin, G., Vogels, T.P.: Motor primitives in space and time via targeted gain modulation in cortical networks. Nat. Neurosci. 21(12), 1774–1783 (2018)
19. Sussillo, D.: Neural circuits as computational dynamical systems. Curr. Opinion Neurobiol. 25, 156–163 (2014)
20. Sussillo, D., Barak, O.: Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks. Neural Comput. 25(3), 626–649 (2013)
21. Wang, J., Narain, D., Hosseini, E.A., Jazayeri, M.: Flexible timing by temporal scaling of cortical responses. Nat. Neurosci. 21(1), 102–110 (2018)
22. Williamson, R.C., et al.: Scaling properties of dimensionality reduction for neural populations and network models. PLoS Comput. Biol. 12, e1005141 (2016)
23. Williamson, R.C., Doiron, B., Smith, M.A., Byron, M.Y.: Bridging large-scale neuronal recordings and large-scale network models using dimensionality reduction. Curr. Opinion Neurobiol. 55, 40–47 (2019)
Metadata
Title
Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data
Authors
Alessandro Salatiello
Martin A. Giese
Copyright Year
2020
DOI
https://doi.org/10.1007/978-3-030-61609-0_69
