
2017 | Original Paper | Book Chapter

2. Regression Problems

Author: Daniel Durstewitz

Published in: Advanced Data Analysis in Neuroscience

Publisher: Springer International Publishing


Abstract

Assume we would like to predict variables y from variables x through a function f(x) such that the squared deviations between actual and predicted values are minimized (a so-called squared error loss function, see Eq. 1.11). The regression function that optimally achieves this is f(x) = E(y|x) (Winer 1971; Bishop 2006; Hastie et al. 2009); that is, the goal in regression is to model the conditional expectation of y (the "outputs" or "responses") given x (the "predictors" or "regressors"). For instance, we may have recorded in vivo the average firing rates of p neurons on N independent trials i, arranged in a set of row vectors X = {x_1, …, x_i, …, x_N}, and would like to see whether these allow us to predict the movement direction (angle) y_i of the animal on each trial (a "decoding" problem). This is a typical multiple regression problem (where "multiple" indicates that we have more than one predictor). Had we also measured more than one output variable, e.g., several movement parameters such as angle, velocity, and acceleration, which we would like to relate to the firing rates of the p recorded neurons, we would enter the domain of multivariate regression.
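To make the decoding example concrete, here is a minimal sketch in Python/NumPy (not taken from the chapter): firing rates of p neurons on N trials are regressed onto movement angle by ordinary least squares, i.e., the linear special case f(x) = b0 + x·b of the conditional-expectation view above. The simulated rates, noise level, and variable names are illustrative assumptions, and treating the angle as an ordinary linear response ignores its circular nature.

```python
# Minimal sketch of linear decoding by ordinary least squares
# (illustrative assumptions throughout; not the chapter's own code).
import numpy as np

rng = np.random.default_rng(0)

N, p = 200, 12                                        # N trials, p recorded neurons
X = rng.poisson(lam=5.0, size=(N, p)).astype(float)   # trial-wise average firing rates (rows x_i)

# Hypothetical "ground truth": movement angle depends linearly on the rates plus noise.
beta_true = rng.normal(size=p)
y = X @ beta_true + rng.normal(scale=0.5, size=N)     # movement direction y_i per trial

# Ordinary least squares: minimize sum_i (y_i - f(x_i))^2 with f(x) = b0 + x @ b,
# the linear approximation to the regression function E(y|x).
X_design = np.column_stack([np.ones(N), X])           # prepend an intercept column
beta_hat, *_ = np.linalg.lstsq(X_design, y, rcond=None)

y_hat = X_design @ beta_hat
print("mean squared error:", np.mean((y - y_hat) ** 2))
```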

References
Aarts, E., Verhage, M., Veenvliet, J.V., Dolan, C.V., van der Sluis, S.: A solution to dependency: using multilevel analysis to accommodate nested data. Nat. Neurosci. 17, 491–496 (2014)
Balaguer-Ballester, E., Lapish, C.C., Seamans, J.K., Durstewitz, D.: Attractor dynamics of cortical populations during memory-guided decision-making. PLoS Comput. Biol. 7, e1002057 (2011)
Bishop, C.M.: Pattern Recognition and Machine Learning. Springer, New York (2006)
Brette, R., Gerstner, W.: Adaptive exponential integrate-and-fire model as an effective description of neuronal activity. J. Neurophysiol. 94, 3637–3642 (2005)
Buzsaki, G., Draguhn, A.: Neuronal oscillations in cortical networks. Science 304, 1926–1929 (2004)
Demanuele, C., Kirsch, P., Esslinger, C., Zink, M., Meyer-Lindenberg, A., Durstewitz, D.: Area-specific information processing in prefrontal cortex during a probabilistic inference task: a multivariate fMRI BOLD time series analysis. PLoS One 10, e0135424 (2015b)
Duchi, J., Hazan, E., Singer, Y.: Adaptive subgradient methods for online learning and stochastic optimization. J. Mach. Learn. Res. 12, 2121–2159 (2011)
Duda, R.O., Hart, P.E.: Pattern Classification and Scene Analysis. Wiley, New York (1973)
Fahrmeir, L., Tutz, G.: Multivariate Statistical Modelling Based on Generalized Linear Models. Springer, New York (2010)
Fan, J., Yao, Q.: Nonlinear Time Series: Nonparametric and Parametric Methods. Springer, New York (2003)
Friston, K.J., Harrison, L., Penny, W.: Dynamic causal modelling. Neuroimage 19, 1273–1302 (2003)
Graves, A., Wayne, G., Reynolds, M., Harley, T., Danihelka, I., et al.: Hybrid computing using a neural network with dynamic external memory. Nature 538, 471–476 (2016)
Haase, R.F.: Multivariate General Linear Models. SAGE, Thousand Oaks, CA (2011)
Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning, 2nd edn. Springer, New York (2009)
Hertz, J., Krogh, A.S., Palmer, R.G.: Introduction to the Theory of Neural Computation. Addison-Wesley, Reading, MA (1991)
Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12, 55–67 (1970)
Hotelling, H.: Relations between two sets of variates. Biometrika 28, 321–377 (1936)
Kim, J., Calhoun, V.D., Shim, E., Lee, J.H.: Deep neural network with weight sparsity control and pre-training extracts hierarchical features and enhances classification performance: evidence from whole-brain resting-state functional connectivity patterns of schizophrenia. NeuroImage 124, 127–146 (2016)
Kohonen, T.: Self-Organization and Associative Memory. Springer, Berlin (1989)
Kriegeskorte, N.: Deep neural networks: a new framework for modeling biological vision and brain information processing. Annu. Rev. Vis. Sci. 1, 417–446 (2015)
Krzanowski, W.J.: Principles of Multivariate Analysis: A User's Perspective, Rev. edn. Oxford Statistical Science Series. OUP, Oxford (2000)
Lapish, C.C., Balaguer-Ballester, E., Seamans, J.K., Phillips, A.G., Durstewitz, D.: Amphetamine exerts dose-dependent changes in prefrontal cortex attractor dynamics during working memory. J. Neurosci. 35, 10172–10187 (2015)
LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521, 436–444 (2015)
Liu, W., Wang, Z., Liu, X., Zeng, N., Liu, Y., Alsaadi, F.E.: A survey of deep neural network architectures and their applications. Neurocomputing 234, 11–26 (2017)
McDonald, G.C.: Ridge regression. WIREs Comp. Stat. 1, 93–100 (2009)
Mnih, V., Kavukcuoglu, K., Silver, D., Rusu, A.A., Veness, J., Bellemare, M.G., Graves, A., Riedmiller, M., Fidjeland, A.K., Ostrovski, G., Petersen, S., Beattie, C., Sadik, A., Antonoglou, I., King, H., Kumaran, D., Wierstra, D., Legg, S., Hassabis, D.: Human-level control through deep reinforcement learning. Nature 518, 529–533 (2015)
Murayama, Y., Biessmann, F., Meinecke, F.C., Müller, K.R., Augath, M., Oeltermann, A., Logothetis, N.K.: Relationship between neural and hemodynamic signals during spontaneous activity studied with temporal kernel CCA. Magn. Reson. Imaging 28, 1095–1103 (2010)
Naundorf, B., Wolf, F., Volgushev, M.: Unique features of action potential initiation in cortical neurons. Nature 440, 1060–1063 (2006)
Ohiorhenuan, I.E., Mechler, F., Purpura, K.P., Schmid, A.M., Hu, Q., Victor, J.D.: Sparse coding and high-order correlations in fine-scale cortical networks. Nature 466, 617–621 (2010)
Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290, 2323–2326 (2000)
Ruder, S.: An overview of gradient descent optimization algorithms. arXiv:1609.04747 (2016)
Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning representations by back-propagating errors. Nature 323, 533–536 (1986)
Rumelhart, D.E., McClelland, J.E.: Parallel Distributed Processing. MIT Press, Cambridge, MA (1986)
Schmidhuber, J.: Deep learning in neural networks: an overview. Neural Netw. 61, 85–117 (2015)
Schneidman, E., Berry, M.J., Segev, R., Bialek, W.: Weak pairwise correlations imply strongly correlated network states in a neural population. Nature 440, 1007–1012 (2006)
Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B 58, 267–288 (1996)
West, B.T., Welch, K.B., Galecki, A.T.: Linear Mixed Models: A Practical Guide Using Statistical Software. Chapman & Hall, London (2006)
Winer, B.J.: Statistical Principles in Experimental Design. McGraw-Hill, New York (1971)
Yamins, D.L.K., DiCarlo, J.J.: Using goal-driven deep learning models to understand sensory cortex. Nat. Neurosci. 19, 356–365 (2016)
Metadata
Title
Regression Problems
Author
Daniel Durstewitz
Copyright Year
2017
DOI
https://doi.org/10.1007/978-3-319-59976-2_2