
2018 | OriginalPaper | Chapter

Input-Dependably Feature-Map Pruning

Authors : Atalya Waissman, Aharon Bar-Hillel

Published in: Artificial Neural Networks and Machine Learning – ICANN 2018

Publisher: Springer International Publishing


Abstract

Deep neural networks are an accurate tool for solving, among other things, vision tasks. However, the computational cost of these networks is often high, preventing their adoption in many real-time applications, so there is a constant need for computational savings in this research domain. In this paper we suggest trading accuracy for computation using a gated version of Convolutional Neural Networks (CNNs). The gated network selectively activates only a portion of its feature maps, depending on the given example to be classified. The network's 'gates' indicate which feature maps are necessary for the task and which are not. Specifically, entire feature maps are considered for omission, enabling computational savings in a manner compliant with GPU hardware constraints. The network is trained using a combination of back-propagation for the standard weights, minimizing an error-related loss, and reinforcement learning for the gates, minimizing a loss related to the number of feature maps used. We trained and evaluated a gated version of DenseNet on the CIFAR-10 dataset [1]. Our results show that, with only a slight impact on accuracy, a potential acceleration of up to \( 3\times \) can be obtained.
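The chapter page carries no code, but the training scheme the abstract describes can be made concrete. Below is a minimal PyTorch sketch, not the authors' implementation: a small gating head predicts a Bernoulli on/off probability for each output feature map from the pooled input, whole maps are switched off per example, and the gate parameters are trained with REINFORCE [18] while the convolution weights receive ordinary back-propagated gradients from the error loss. All names here (GatedConvBlock, gate_fc, training_step, lambda_cost) and the pooling-based design of the gating head are assumptions for illustration, not taken from the paper.

    # Hypothetical sketch of input-dependent feature-map gating (not the paper's code).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GatedConvBlock(nn.Module):
        def __init__(self, in_ch, out_ch):
            super().__init__()
            self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
            # Gating head (an assumption): global average pool -> linear -> per-map probability.
            self.gate_fc = nn.Linear(in_ch, out_ch)

        def forward(self, x):
            # Per-example gate probabilities from globally pooled input statistics.
            probs = torch.sigmoid(self.gate_fc(x.mean(dim=(2, 3))))  # shape (B, out_ch)
            gates = torch.bernoulli(probs)                           # sampled 0/1 gate per map
            # Log-probability of the sampled gate configuration, needed for REINFORCE.
            log_p = (gates * torch.log(probs + 1e-8) +
                     (1 - gates) * torch.log(1 - probs + 1e-8)).sum(dim=1)
            # Multiply whole feature maps by their gates; a closed map is entirely zero,
            # so at inference it need not be computed at all.
            out = F.relu(self.conv(x)) * gates.detach()[:, :, None, None]
            return out, log_p, gates

    def training_step(block, classifier, x, labels, optimizer, lambda_cost=0.01):
        out, log_p, gates = block(x)
        logits = classifier(out.mean(dim=(2, 3)))
        ce = F.cross_entropy(logits, labels, reduction='none')  # per-example error loss
        # Reward favors low error and few active feature maps; lambda_cost is a
        # hypothetical accuracy/computation trade-off knob.
        reward = -(ce.detach() + lambda_cost * gates.sum(dim=1))
        # Standard weights get the cross-entropy gradient; gate parameters get the
        # REINFORCE gradient of -E[reward * log p(gates)].
        loss = ce.mean() - (reward * log_p).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

At test time the sampled gates can be replaced by thresholded probabilities, and feature maps whose gates are closed can be skipped rather than computed and zeroed out; skipping full maps, instead of individual weights, is what keeps the saving compatible with GPU hardware constraints.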


Literature
1. Krizhevsky, A.: Learning multiple layers of features from tiny images. Technical report, Computer Science Department, University of Toronto, pp. 1–60 (2009)
2. Erhan, D., Bengio, Y., Courville, A., Vincent, P.: Visualizing higher-layer features of a deep network. Bernoulli 1341, 1–13 (2009)
3. Yosinski, J., Clune, J., Nguyen, A., Fuchs, T., Lipson, H.: Understanding neural networks through deep visualization (2015)
4. Rigamonti, R., Sironi, A., Lepetit, V., Fua, P.: Learning separable filters. In: 2013 IEEE Conference on Computer Vision and Pattern Recognition, pp. 2754–2761 (2013)
6. Jaderberg, M., Vedaldi, A., Zisserman, A.: Speeding up convolutional neural networks with low rank expansions. arXiv preprint arXiv:1405.3866, p. 7 (2014)
7. Jin, J., Dundar, A., Culurciello, E.: Flattened convolutional neural network for feedforward acceleration. ICLR Workshop 2014, 1–11 (2015)
9. Vanhoucke, V., Senior, A., Mao, M.: Improving the speed of neural networks on CPUs. In: Proceedings of the Deep Learning and Unsupervised Feature Learning Workshop, NIPS, pp. 1–8 (2011)
10. Mathieu, M., Henaff, M., LeCun, Y.: Fast training of convolutional networks through FFTs. In: International Conference on Learning Representations, pp. 1–9 (2014)
11. Amthor, M., Rodner, E., Denzler, J.: Impatient DNNs - deep neural networks with dynamic time budgets (2016)
12. Shazeer, N., Mirhoseini, A., Maziarz, K., Davis, A., Le, Q., Dean, J.: Outrageously large neural networks: the sparsely-gated mixture-of-experts layer, pp. 1–15 (2017)
13. Mnih, V., Heess, N., Graves, A., Kavukcuoglu, K.: Recurrent models of visual attention. Adv. Neural Inf. Process. Syst. 27, 1–9 (2014)
14. Bengio, E., Bacon, P.-L., Pineau, J., Precup, D.: Conditional computation in neural networks for faster models, pp. 1–9 (2015)
15. He, K., Zhang, X., Ren, S., Sun, J.: Deep residual learning for image recognition. arXiv preprint arXiv:1512.03385 (2015)
16. He, K., Zhang, X., Ren, S., Sun, J.: Identity mappings in deep residual networks (2016)
17. Huang, G., Liu, Z., Weinberger, K.Q., van der Maaten, L.: Densely connected convolutional networks (2016)
18. Williams, R.J.: Simple statistical gradient-following algorithms for connectionist reinforcement learning. Mach. Learn. 8(3), 229–256 (1992)
Metadata
Title
Input-Dependably Feature-Map Pruning
Authors
Atalya Waissman
Aharon Bar-Hillel
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-030-01418-6_69
