
2017 | OriginalPaper | Chapter

2. Regression Problems

Author: Daniel Durstewitz

Published in: Advanced Data Analysis in Neuroscience

Publisher: Springer International Publishing


Abstract

Assume we would like to predict variables y from variables x through a function f(x) such that the squared deviations between actual and predicted values are minimized (a so-called squared-error loss function, see Eq. 1.11). The regression function that optimally achieves this is f(x) = E(y|x) (Winer 1971; Bishop 2006; Hastie et al. 2009); that is, the goal in regression is to model the conditional expectation of y (the “outputs” or “responses”) given x (the “predictors” or “regressors”). For instance, we may have recorded in vivo the average firing rates of p neurons on N independent trials i, arranged in a set of row vectors X = {x_1, …, x_i, …, x_N}, and would like to see whether these allow us to predict the movement direction (angle) y_i of the animal on each trial (a “decoding” problem). This is a typical multiple regression problem (where “multiple” indicates that we have more than one predictor). Had we also measured more than one output variable, e.g., several movement parameters such as angle, velocity, and acceleration, which we would like to relate to the firing rates of the p recorded neurons, we would enter the domain of multivariate regression.
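
As a minimal illustration of the decoding setup sketched above, the following Python snippet fits an ordinary least-squares multiple regression of a single response on an N × p matrix of trial-wise firing rates. This is a hedged sketch on synthetic data, not code from the chapter: all variable names and sizes are hypothetical, and the movement angle is treated as an ordinary unbounded response rather than a circular variable, which a proper analysis of angular data would have to address.

import numpy as np

# Minimal sketch (not from the chapter): ordinary least-squares (OLS) fit of a
# multiple regression, i.e. a linear approximation of f(x) = E(y|x) under a
# squared-error loss. All quantities below are synthetic stand-ins for the
# N-trials-by-p-neurons firing-rate matrix and the responses described above.

rng = np.random.default_rng(0)
N, p = 200, 10                                         # trials, recorded neurons (hypothetical sizes)
X = rng.poisson(lam=5.0, size=(N, p)).astype(float)    # surrogate trial-wise firing rates
w_true = rng.normal(size=p)
y = X @ w_true + rng.normal(scale=1.0, size=N)         # surrogate responses (e.g., movement angle)

# Add an intercept column and solve min_w ||y - X1 w||^2 by least squares
X1 = np.column_stack([np.ones(N), X])
w_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)

y_hat = X1 @ w_hat
mse = np.mean((y - y_hat) ** 2)                        # the squared-error loss being minimized
print("estimated weights:", np.round(w_hat[1:], 2))
print("training MSE:", round(mse, 3))

numpy's lstsq returns the minimum-norm least-squares solution; any equivalent OLS routine would serve the same illustrative purpose here.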


Literature
Aarts, E., Verhage, M., Veenvliet, J.V., Dolan, C.V., van der Sluis, S.: A solution to dependency: using multilevel analysis to accommodate nested data. Nat. Neurosci. 17, 491–496 (2014)
Balaguer-Ballester, E., Lapish, C.C., Seamans, J.K., Durstewitz, D.: Attractor dynamics of cortical populations during memory-guided decision-making. PLoS Comput. Biol. 7, e1002057 (2011)
Bishop, C.M.: Pattern Recognition and Machine Learning. Springer, New York (2006)
Brette, R., Gerstner, W.: Adaptive exponential integrate-and-fire model as an effective description of neuronal activity. J. Neurophysiol. 94, 3637–3642 (2005)
Buzsaki, G., Draguhn, A.: Neuronal oscillations in cortical networks. Science 304, 1926–1929 (2004)
Demanuele, C., Kirsch, P., Esslinger, C., Zink, M., Meyer-Lindenberg, A., Durstewitz, D.: Area-specific information processing in prefrontal cortex during a probabilistic inference task: a multivariate fMRI BOLD time series analysis. PLoS One 10, e0135424 (2015b)
Duchi, J., Hazan, E., Singer, Y.: Adaptive subgradient methods for online learning and stochastic optimization. J. Mach. Learn. Res. 12, 2121–2159 (2011)
Duda, R.O., Hart, P.E.: Pattern Classification and Scene Analysis. Wiley, New York (1973)
Fahrmeir, L., Tutz, G.: Multivariate Statistical Modelling Based on Generalized Linear Models. Springer, New York (2010)
Fan, J., Yao, Q.: Nonlinear Time Series: Nonparametric and Parametric Methods. Springer, New York (2003)
Friston, K.J., Harrison, L., Penny, W.: Dynamic causal modelling. Neuroimage 19, 1273–1302 (2003)
Graves, A., Wayne, G., Reynolds, M., Harley, T., Danihelka, I., et al.: Hybrid computing using a neural network with dynamic external memory. Nature 538, 471–476 (2016)
Haase, R.F.: Multivariate General Linear Models. SAGE, Thousand Oaks, CA (2011)
Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning, 2nd edn. Springer, New York (2009)
Hertz, J., Krogh, A.S., Palmer, R.G.: Introduction to the Theory of Neural Computation. Addison-Wesley, Reading, MA (1991)
Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12, 55–67 (1970)
Kim, J., Calhoun, V.D., Shim, E., Lee, J.H.: Deep neural network with weight sparsity control and pre-training extracts hierarchical features and enhances classification performance: evidence from whole-brain resting-state functional connectivity patterns of schizophrenia. NeuroImage 124, 127–146 (2016)
Kohonen, T.: Self-Organization and Associative Memory. Springer, Berlin (1989)
Kriegeskorte, N.: Deep neural networks: a new framework for modeling biological vision and brain information processing. Annu. Rev. Vis. Sci. 1, 417–446 (2015)
Krzanowski, W.J.: Principles of Multivariate Analysis: A User’s Perspective, Rev. edn. Oxford Statistical Science Series. OUP, Oxford (2000)
Lapish, C.C., Balaguer-Ballester, E., Seamans, J.K., Phillips, A.G., Durstewitz, D.: Amphetamine exerts dose-dependent changes in prefrontal cortex attractor dynamics during working memory. J. Neurosci. 35, 10172–10187 (2015)
LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521, 436–444 (2015)
Liu, W., Wang, Z., Liu, X., Zeng, N., Liu, Y., Alsaadi, F.E.: A survey of deep neural network architectures and their applications. Neurocomputing 234, 11–26 (2017)
Mnih, V., Kavukcuoglu, K., Silver, D., Rusu, A.A., Veness, J., Bellemare, M.G., Graves, A., Riedmiller, M., Fidjeland, A.K., Ostrovski, G., Petersen, S., Beattie, C., Sadik, A., Antonoglou, I., King, H., Kumaran, D., Wierstra, D., Legg, S., Hassabis, D.: Human-level control through deep reinforcement learning. Nature 518, 529–533 (2015)
Murayama, Y., Biessmann, F., Meinecke, F.C., Müller, K.R., Augath, M., Oeltermann, A., Logothetis, N.K.: Relationship between neural and hemodynamic signals during spontaneous activity studied with temporal kernel CCA. Magn. Reson. Imaging 28, 1095–1103 (2010)
Naundorf, B., Wolf, F., Volgushev, M.: Unique features of action potential initiation in cortical neurons. Nature 440, 1060–1063 (2006)
Ohiorhenuan, I.E., Mechler, F., Purpura, K.P., Schmid, A.M., Hu, Q., Victor, J.D.: Sparse coding and high-order correlations in fine-scale cortical networks. Nature 466, 617–621 (2010)
Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290, 2323–2326 (2000)
Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning representations by back-propagating errors. Nature 323, 533–536 (1986)
Rumelhart, D.E., McClelland, J.L.: Parallel Distributed Processing. MIT Press, Cambridge, MA (1986)
Schmidhuber, J.: Deep learning in neural networks: an overview. Neural Netw. 61, 85–117 (2015)
Schneidman, E., Berry, M.J., Segev, R., Bialek, W.: Weak pairwise correlations imply strongly correlated network states in a neural population. Nature 440, 1007–1012 (2006)
Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B 58, 267–288 (1996)
Ruder, S.: An overview of gradient descent optimization algorithms. arXiv:1609.04747 (2016)
West, B.T., Welch, K.B., Galecki, A.T.: Linear Mixed Models: A Practical Guide Using Statistical Software. Chapman & Hall, London (2006)
Winer, B.J.: Statistical Principles in Experimental Design. McGraw-Hill, New York (1971)
Yamins, D.L.K., DiCarlo, J.J.: Using goal-driven deep learning models to understand sensory cortex. Nat. Neurosci. 19, 356–365 (2016)
Metadata
Title
Regression Problems
Author
Daniel Durstewitz
Copyright Year
2017
DOI
https://doi.org/10.1007/978-3-319-59976-2_2
