Published in: Artificial Intelligence Review 3/2020

13 June 2019

A survey of swarm and evolutionary computing approaches for deep learning

Authors: Ashraf Darwish, Aboul Ella Hassanien, Swagatam Das


Abstract

Deep learning (DL) has become an important machine learning approach that has proven successful in many applications. Currently, DL is one of the best methods for extracting knowledge from large sets of raw data in a (nearly) self-organized manner. The technical design of DL rests on the feed-forward information flow principle of artificial neural networks with multiple layers of hidden neurons, which form deep neural networks (DNNs). DNNs have various architectures and parameters and are often developed for specific applications. However, training a DNN can be time-consuming, depending on the application and the size of the training set (Gong et al. 2015). Moreover, finding the most accurate and efficient architecture for a deep learning system in a reasonable time remains a challenge. Swarm intelligence (SI) and evolutionary computing (EC) techniques represent simulation-driven, non-convex optimization frameworks that make few assumptions about the objective function. These methods are flexible and have proven effective in many applications; therefore, they can be used to improve DL by optimizing the applied learning models. This paper presents a comprehensive survey of the most recent approaches that hybridize SI and EC algorithms with DL, covering the design of DNN architectures and the training of DNNs to improve classification accuracy. The paper reviews the significant roles of SI and EC in optimizing the hyper-parameters and architectures of DL systems in the context of large-scale data analytics. Finally, we identify open problems for further research and aspects of DL that still require improvement, and we provide an extensive bibliography of the pertinent research.
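To make the core idea concrete, the following is a minimal, hypothetical sketch (not taken from the survey and not the authors' method) of how an evolutionary algorithm can search over DNN hyper-parameters: a small genetic loop evolves the hidden-layer width and learning rate of a toy one-hidden-layer network, with validation accuracy as the fitness function. It uses only NumPy; the genome encoding, mutation ranges, population size, and synthetic data are all illustrative assumptions.

# Illustrative sketch only: a toy genetic algorithm that evolves two hyper-parameters
# of a one-hidden-layer network -- hidden width and learning rate -- on a synthetic
# two-class problem. All names, ranges, and settings below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two noisy Gaussian blobs, split into train/validation sets.
X = np.vstack([rng.normal(-1.0, 1.0, (200, 2)), rng.normal(1.0, 1.0, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
X_tr, y_tr, X_va, y_va = X[:300], y[:300], X[300:], y[300:]

def fitness(hidden, lr, epochs=200):
    """Train a tiny MLP with plain gradient descent; fitness = validation accuracy."""
    W1 = rng.normal(0, 0.5, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X_tr @ W1 + b1)
        p = 1.0 / (1.0 + np.exp(-np.clip(h @ W2 + b2, -30, 30)))  # sigmoid output
        g = (p - y_tr[:, None]) / len(y_tr)           # dL/dlogit for cross-entropy
        W2 -= lr * (h.T @ g); b2 -= lr * g.sum(0)
        gh = (g @ W2.T) * (1.0 - h ** 2)              # back-propagate through tanh
        W1 -= lr * (X_tr.T @ gh); b1 -= lr * gh.sum(0)
    pred = (np.tanh(X_va @ W1 + b1) @ W2 + b2).ravel() > 0
    return (pred == y_va).mean()

def random_genome():
    # Genome = [hidden units, learning rate]: the hyper-parameters under evolution.
    return [int(rng.integers(2, 33)), float(10 ** rng.uniform(-3, 0))]

def mutate(genome):
    hidden, lr = genome
    if rng.random() < 0.5:
        hidden = int(np.clip(hidden + rng.integers(-4, 5), 2, 64))
    if rng.random() < 0.5:
        lr = float(np.clip(lr * 10 ** rng.normal(0, 0.3), 1e-4, 1.0))
    return [hidden, lr]

# Simple (mu + lambda)-style loop: keep the best half, refill by mutating survivors.
population = [random_genome() for _ in range(8)]
for gen in range(5):
    scored = sorted(((fitness(h, lr), [h, lr]) for h, lr in population), reverse=True)
    survivors = [g for _, g in scored[:4]]
    children = [mutate(survivors[int(rng.integers(len(survivors)))]) for _ in range(4)]
    population = survivors + children
    print(f"generation {gen}: best validation accuracy {scored[0][0]:.3f} "
          f"with genome {scored[0][1]}")

The approaches surveyed here follow the same pattern at a much larger scale, replacing the toy network with a real DNN and the two-gene genome with encodings of layer counts, filter sizes, connectivity, and other hyper-parameters.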


References
Ackley DH, Hinton GE, Sejnowski TJ (1985) A learning algorithm for Boltzmann machines. Cognit Sci 9(1):147–169
Agapitos A, O’Neill M, Nicolau M, Fagan D, Kattan A, Brabazon A, Curran K (2015) Deep evolution of image representations for handwritten digit recognition. In: 2015 IEEE congress on evolutionary computation (CEC). IEEE, pp 2452–2459
Alejandro M, Lara-Cabrera R, Fuentes-Hurtado F, Naranjo V (2018) EvoDeep: a new evolutionary approach for automatic deep neural networks parametrisation. J Parallel Distrib Comput 117:180–191
Bäck T, Foussette C, Krause P (2013) Contemporary evolution strategies. Springer, Berlin
Badem H, Basturk A, Caliskan A, Yuksel ME (2017) A new efficient training strategy for deep neural networks by hybridization of artificial bee colony and limited-memory BFGS optimization algorithms. Neurocomputing 266:506–526
Bae C, Kang K, Liu G, Chung YY (2016) A novel real time video tracking framework using adaptive discrete swarm optimization. Expert Syst Appl 64:385–399
Bayer J, Wierstra D, Togelius J, Schmidhuber J (2009) Evolving memory cell structures for sequence learning. In: International conference on artificial neural networks (ICANN 2009), Springer LNCS, pp 755–764
Bengio Y, Lamblin P, Popovici D, Larochelle H (2007) Greedy layer-wise training of deep networks. In: Advances in neural information processing systems, pp 153–160
Bengio Y, Courville A, Vincent P (2013) Representation learning: a review and new perspectives. IEEE Trans Pattern Anal Mach Intell 35(8):1798–1828
Biswas A, Chandrakasan AP (2018) Conv-RAM: an energy-efficient SRAM with embedded convolution computation for low-power CNN-based machine learning applications. In: 2018 IEEE international solid-state circuits conference (ISSCC), San Francisco, CA, pp 488–490
Bonyadi MR, Michalewicz Z (2017) Particle swarm optimization for single objective continuous space problems: a review. Evolut Comput 25:1–54
Carreira-Perpinan MA, Hinton GE (2005) On contrastive divergence learning. In: 10th international workshop on artificial intelligence and statistics (AISTATS 2005), pp 59–66
Chandra R (2015) Competition and collaboration in cooperative coevolution of Elman recurrent neural networks for time-series prediction. IEEE Trans Neural Netw Learn Syst 26(12):3123–3136
Chen XW, Lin X (2014) Big data deep learning: challenges and perspectives. IEEE Access 2:514–525
Chen S, Liu G, Wu C, Jiang Z, Chen J (2016) Image classification with stacked restricted Boltzmann machines and evolutionary function array classification voter. In: 2016 IEEE congress on evolutionary computation (CEC). IEEE, pp 4599–4606
Chen J, Zeng GQ, Zhou W, Du W, Lu KD (2018) Wind speed forecasting using nonlinear-learning ensemble of deep learning time series prediction and extremal optimization. Energy Convers Manag 165:681–695
Cheung B, Sable C (2011) Hybrid evolution of convolutional networks. In: 2011 10th international conference on machine learning and applications workshops. IEEE, pp 293–297
Corne DW, Reynolds A, Bonabeau E (2012) Swarm intelligence. In: Rozenberg G, Bäck T, Kok JN (eds) Handbook of natural computing. Springer, Berlin, pp 1599–1622
Das S (2013) Evaluating the evolutionary algorithms—classical perspectives and recent trends, in computational intelligence. In: Ishibuchi H (ed) Encyclopedia of life support systems (EOLSS), developed under the auspices of UNESCO. Eolss Publishers, Oxford, UK. http://www.eolss.net
Das S, Mullick SS, Suganthan PN (2016) Recent advances in differential evolution—an updated survey. Swarm Evolut Comput 27:1–30
Das S, Datta S, Chaudhuri BB (2018) Handling data irregularities in classification: foundations, trends, and future challenges. Pattern Recognit 81:674–693
David RW (2012) Software review: the ECJ toolkit. Genet Progr Evolvable Mach 13(1):65–67
David OE, Greental I (2014) Genetic algorithms for evolving deep neural networks. In: Proceedings of the companion publication of the 2014 annual conference on genetic and evolutionary computation. ACM, pp 1451–1452
David RC, Precup RE, Petriu EM, Purcaru C, Preitl S (2012) PSO and GSA algorithms for fuzzy controller tuning with reduced process small time constant sensitivity. In: 2012 16th international conference on system theory, control and computing (ICSTCC). IEEE, pp 1–6
Deepa SN, Baranilingesan I (2017) Optimized deep learning neural network predictive controller for continuous stirred tank reactor. Comput Electr Eng 000:1–16
Del Ser J, Osaba E, Molina D, Yang X-S, Salcedo-Sanz S, Camacho D, Das S, Suganthan PN, Coello Coello CC, Herrera F (2019) Bio-inspired computation: where we stand and what’s next. Swarm Evolut Comput 48:220–250
Desell T (2017) Large scale evolution of convolutional neural networks using volunteer computing. In: Proceedings of the genetic and evolutionary computation conference companion. ACM, pp 127–128
Desell T, Clachar S, Higgins J, Wild B (2015) Evolving deep recurrent neural networks using ant colony optimization. In: European conference on evolutionary computation in combinatorial optimization. Springer, Cham, pp 86–98
Duchi J, Hazan E, Singer Y (2011) Adaptive subgradient methods for online learning and stochastic optimization. J Mach Learn Res 12:2121–2159
Dufourq E, Bassett BA (2017) EDEN: evolutionary deep networks for efficient machine learning. In: Pattern recognition association of South Africa and robotics and mechatronics (PRASA-RobMech). IEEE, pp 110–115
Durillo JJ, Nebro AJ (2011) jMetal: a Java framework for multi-objective optimization. Adv Eng Softw 42(10):760–771
Eiben AE, Smit SK (2011) Parameter tuning for configuring and analyzing evolutionary algorithms. Swarm Evolut Comput 1(1):19–31
Elman JL (1990) Finding structure in time. Cognit Sci 14(2):179–211
ElSaid A, Wild B, Jamiy FE, Higgins J, Desell T (2017) Optimizing LSTM RNNs using ACO to predict turbine engine vibration. In: Proceedings of the genetic and evolutionary computation conference companion. ACM, pp 21–22
ElSaid A, Jamiy FE, Higgins J, Wild B, Desell T (2018) Using ant colony optimization to optimize long short-term memory recurrent neural networks. In: Proceedings of the genetic and evolutionary computation conference. ACM, pp 13–20
Erol OK, Eksin I (2006) A new optimization method: big bang–big crunch. Adv Eng Softw 37(2):106–111
Fielding B, Zhang L (2018) Evolving image classification architectures with enhanced particle swarm optimisation. IEEE Access 6:68560–68575
Fogel DB (1995) Phenotypes, genotypes, and operators in evolutionary computation. In: IEEE international conference on evolutionary computation, 1995, vol 1. IEEE, p 193
Fujino S, Mori N, Matsumoto K (2017) Deep convolutional networks for human sketches by means of the evolutionary deep learning. In: 2017 joint 17th world congress of international fuzzy systems association and 9th international conference on soft computing and intelligent systems (IFSA-SCIS). IEEE, pp 1–5
Fukushima K (1980) Neocognitron: a self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biol Cybern 36:193–202
Galloway GS, Catterson VM, Fay T, Robb A, Love C (2016) Diagnosis of tidal turbine vibration data through deep neural networks. In: Third European conference of the prognostics and health management society, pp 172–180
Gascón-Moreno J, Salcedo-Sanz S, Saavedra-Moreno B, Carro-Calvo L, Portilla-Figueras A (2013) An evolutionary-based hyper-heuristic approach for optimal construction of group method of data handling networks. Inf Sci 247:94–108
Gauci J, Stanley K (2007) Generating large-scale neural networks through discovering geometric regularities. In: Proceedings of the 9th annual conference on genetic and evolutionary computation. ACM, pp 997–1004
Gauriau R, Cuingnet R, Lesage D, Bloch I (2015) Multi-organ localization with cascaded global-to-local regression and shape prior. Med Image Anal 23(1):70–83
Geng W (2018) Cognitive deep neural networks prediction method for software fault tendency module based on bound particle swarm optimization. Cognit Syst Res 52:12–20
Girshick R, Donahue J, Darrell T, Malik J (2014) Rich feature hierarchies for accurate object detection and semantic segmentation. CVPR 2014:580–587
Glorot X, Bordes A, Bengio Y (2011) Deep sparse rectifier networks. In: AISTATS, vol 15, pp 315–323
Gomes L (2014) Machine-learning maestro Michael Jordan on the delusions of big data and other huge engineering efforts. IEEE Spectrum, Oct 20
Gong M, Liu J, Li H, Cai Q, Su L (2015) A multiobjective sparse feature learning model for deep neural networks. IEEE Trans Neural Netw Learn Syst 26(12):3263–3277
Goodfellow I, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, Courville A, Bengio Y (2014) Generative adversarial networks. arXiv:1406.2661
Goodfellow I, Bengio Y, Courville A (2015) Modern practical deep networks. In: Goodfellow I, Bengio Y, Courville A (eds) Deep learning. MIT Press, Cambridge, pp 162–481
Greff K, Srivastava RK, Koutník J, Steunebrink BR, Schmidhuber J (2017) LSTM: a search space odyssey. IEEE Trans Neural Netw Learn Syst 28(10):2222–2232
Griewank A (2000) Principles and techniques of algorithmic differentiation: evaluating derivatives. SIAM, Philadelphia
Guo S, Yang Z (2018) Multi-channel-ResNet: an integration framework towards skin lesion analysis. Inform Med Unlocked 12:67–74
Han S, Pool J, Tran J, Dally W (2015) Learning both weights and connections for efficient neural network. In: Advances in neural information processing systems, pp 1135–1143
Hardt M, Recht B, Singer Y (2015) Train faster, generalize better: stability of stochastic gradient descent. arXiv preprint arXiv:1509.01240
Hatamlou A (2013) Black hole: a new heuristic optimization approach for data clustering. Inf Sci 222:175–184
He K, Zhang X, Ren S, Sun J (2016) Deep residual learning for image recognition. In: 2016 IEEE conference on computer vision and pattern recognition (CVPR), Las Vegas, NV, pp 770–778
Hinton GE, Srivastava N, Krizhevsky A, Sutskever I, Salakhutdinov RR (2012a) Improving neural networks by preventing co-adaptation of feature detectors. arXiv preprint arXiv:1207.0580
Hinton G, Deng L, Yu D, Dahl GE, Mohamed AR, Jaitly N et al (2012b) Deep neural networks for acoustic modeling in speech recognition: the shared views of four research groups. IEEE Signal Process Mag 29(6):82–97
Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8):1735–1780
Holker G, dos Santos MV (2010) Toward an estimation of distribution algorithm for the evolution of artificial neural networks. In: Proceedings of the third C* conference on computer science and software engineering. ACM, pp 17–22
Horng MH (2017) Fine-tuning parameters of deep belief networks using artificial bee colony algorithm. In: 2017 2nd international conference on artificial intelligence: techniques and applications (AITA 2017). DEStech Transactions on Computer Science and Engineering
Huang G, Liu Z, Maaten LVD, Weinberger KQ (2017) Densely connected convolutional networks. In: 2017 IEEE conference on computer vision and pattern recognition (CVPR), Honolulu, HI, pp 2261–2269
Hubel DH, Wiesel TN (1959) Receptive fields of single neurones in the cat’s striate cortex. J Physiol 148(3):574–591
Hubel DH, Wiesel TN (1968) Receptive fields and functional architecture of monkey striate cortex. J Physiol 195(1):215–243
Ioffe S, Szegedy C (2015) Batch normalization: accelerating deep network training by reducing internal covariate shift. arXiv preprint arXiv:1502.03167
Jiang S, Ji Z, Shen Y (2014) A novel hybrid particle swarm optimization and gravitational search algorithm for solving economic emission load dispatch problems with various practical constraints. Int J Electr Power Energy Syst 55:628–644
Jiang S, Chin KS, Wang L, Qu G, Tsui KL (2017) Modified genetic algorithm-based feature selection combined with pre-trained deep neural network for demand forecasting in outpatient department. Expert Syst Appl 82:216–230
Junbo T, Weining L, Juneng A, Xueqian W (2015) Fault diagnosis method study in roller bearing based on wavelet transform and stacked auto-encoder. In: The 27th Chinese control and decision conference (2015 CCDC). IEEE, pp 4608–4613
Justesen N, Risi S (2017) Continual online evolutionary planning for in-game build order adaptation in StarCraft. In: Proceedings of the genetic and evolutionary computation conference. ACM, pp 187–194
Kang K, Bae C, Yeung HWF, Chung YY (2018) A hybrid gravitational search algorithm with swarm intelligence and deep convolutional feature for object tracking optimization. Appl Soft Comput 66:319–329
Kenny A, Li X (2017) A study on pre-training deep neural networks using particle swarm optimisation. In: Asia-Pacific conference on simulated evolution and learning. Springer, Cham, pp 361–372
Khalifa MH, Ammar M, Ouarda W, Alimi AM (2017) Particle swarm optimization for deep learning of convolution neural network. In: 2017 Sudan conference on computer science and information technology (SCCSIT). IEEE, pp 1–5
Kim JK, Han YS, Lee JS (2017) Particle swarm optimization–deep belief network–based rare class prediction model for highly class imbalance problem. Concurr Comput Pract Exp 29:e4128
Koza JR, Rice JP (1991) Genetic generation of both the weights and architecture for a neural network. In: IJCNN-91-Seattle international joint conference on neural networks, vol 2. IEEE, pp 397–404
Kriegman S, Cheney N, Corucci F, Bongard JC (2017) A minimal developmental model can increase evolvability in soft robots. In: Proceedings of the genetic and evolutionary computation conference. ACM, pp 131–138
Krizhevsky A, Sutskever I, Hinton GE (2012) ImageNet classification with deep convolutional neural networks. In: Advances in neural information processing systems, pp 1097–1105
Kuremoto T, Kimura S, Kobayashi K, Obayashi M (2014) Time series forecasting using a deep belief network with restricted Boltzmann machines. Neurocomputing 137:47–56
Lamos-Sweeney J, Gaborski R (2012) Deep learning using genetic algorithms. Master’s thesis, Thomas Golisano College of Computing and Information Sciences
Lander S, Shang Y (2015) EvoAE—a new evolutionary method for training autoencoders for deep learning networks. In: 2015 IEEE 39th annual computer software and applications conference (COMPSAC), vol 2. IEEE, pp 790–795
LeCun Y, Boser BE, Denker JS, Henderson D, Howard RE, Hubbard WE, Jackel LD (1990) Handwritten digit recognition with a back-propagation network. In: Advances in neural information processing systems, pp 396–404
LeCun Y, Bottou L, Bengio Y, Haffner P (1998) Gradient-based learning applied to document recognition. Proc IEEE 86(11):2278–2324
Lee H, Pham P, Largman Y, Ng AY (2009) Unsupervised feature learning for audio classification using convolutional deep belief networks. In: Advances in neural information processing systems, pp 1096–1104
Leke C, Ndjiongue AR, Twala B, Marwala T (2017) A deep learning-cuckoo search method for missing data estimation in high-dimensional datasets. In: International conference in swarm intelligence. Springer, Cham, pp 561–572
Leung FHF, Lam HK, Ling SH, Tam PKS (2003) Tuning of the structure and parameters of a neural network using an improved genetic algorithm. IEEE Trans Neural Netw 14(1):79–88
Liang J, Meyerson E, Miikkulainen R (2018) Evolutionary architecture search for deep multitask networks. In: GECCO ’18: genetic and evolutionary computation conference, July 15–19, Kyoto, Japan. ACM, New York, NY, USA
Lieto A, Radicioni DP, Cruciani M (eds) Proceedings of the second international workshop on artificial intelligence and cognition, pp 164–171
Liu Q, Wang Z, He X, Zhou DH (2015a) Event-based H∞ consensus control of multiagent systems with relative output feedback: the finite-horizon case. IEEE Trans Autom Control 60(9):2553–2558
Liu X, Gao J, He X, Deng L, Duh K, Wang YY (2015b) Representation learning using multi-task deep neural networks for semantic classification and information retrieval. In: Proceedings of NAACL, pp 912–921
Liu S, Hou Z, Yin C (2016) Data-driven modeling for UGI gasification processes via an enhanced genetic BP neural network with link switches. IEEE Trans Neural Netw Learn Syst 27(12):2718–2729
Liu Q, Wang Z, He X, Ghinea G, Alsaadi FE (2017) A resilient approach to distributed filter design for time-varying systems under stochastic nonlinearities and sensor degradation. IEEE Trans Signal Process 65(5):1300–1309
Liu H, Simonyan K, Vinyals O, Fernando C, Kavukcuoglu K (2018a) Hierarchical representations for efficient architecture search. In: Sixth international conference on learning representations (ICLR 2018), Canada
Liu J, Gong M, Miao Q, Wang X, Li H (2018b) Structure learning for deep neural networks based on multiobjective optimization. IEEE Trans Neural Netw Learn Syst 29(6):2450–2463
López-Ibáñez M, Stützle T, Dorigo M (2018) Ant colony optimization: a component-wise overview. In: Handbook of heuristics, pp 371–407
Lopez-Rincon A, Tonda A, Elati M, Schwander O, Piwowarski B, Gallinari P (2018) Evolutionary optimization of convolutional neural networks for cancer miRNA biomarkers classification. Appl Soft Comput 65:91–100
Lorenzo PR, Nalepa J (2018) Memetic evolution of deep neural networks. In: Proceedings of the genetic and evolutionary computation conference. ACM, pp 505–512
Lorenzo PR, Nalepa J, Kawulok M, Ramos LS, Pastor JR (2017) Particle swarm optimization for hyper-parameter selection in deep neural networks. In: Proceedings of the genetic and evolutionary computation conference. ACM, pp 481–488
Lu C, Wang ZY, Qin WL, Ma J (2017) Fault diagnosis of rotary machinery components using a stacked denoising autoencoder-based health state identification. Signal Process 130:377–388
Ma L, Wang Z, Lam HK (2017a) Event-triggered mean-square consensus control for time-varying stochastic multi-agent system with sensor saturations. IEEE Trans Autom Control 62(7):3524–3531
Ma L, Wang Z, Lam HK (2017b) Mean-square H∞ consensus control for a class of nonlinear time-varying stochastic multiagent systems: the finite-horizon case. IEEE Trans Syst Man Cybern Syst 47(7):1050–1060
Mandischer M (2002) A comparison of evolution strategies and backpropagation for neural network training. Neurocomputing 42(1–4):87–117
Mandt S, Hoffman M, Blei D (2016) A variational analysis of stochastic gradient algorithms. In: International conference on machine learning, pp 354–363
Maravall D, de Lope J (2009) Hybridizing evolutionary computation and reinforcement learning for the design of almost universal controllers for autonomous robots. Neurocomputing 72(4–6):887–894
Martin A, Lara-Cabrera R, Fuentes-Hurtado F, Naranjo V, Camacho D (2018) EvoDeep: a new evolutionary approach for automatic deep neural networks parametrisation. J Parallel Distrib Comput 117:180–191
Miikkulainen R (2017) Neuroevolution. In: Encyclopedia of machine learning and data mining, pp 899–904
Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67
Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61
Mukhopadhyay A, Maulik U, Bandyopadhyay S (2015) A survey of multiobjective evolutionary clustering. ACM Comput Surv 47(4):61:1–61:46
Nair V, Hinton GE (2010) Rectified linear units improve restricted Boltzmann machines. In: Proceedings of the 27th international conference on machine learning (ICML-10), pp 807–814
Neri F, Cotta C (2012) Memetic algorithms and memetic computing optimization: a literature review. Swarm Evolut Comput 2:1–14
Neyshabur B, Salakhutdinov RR, Srebro N (2015) Path-SGD: path-normalized optimization in deep neural networks. In: Advances in neural information processing systems, pp 2422–2430
Papa JP, Scheirer W, Cox DD (2016) Fine-tuning deep belief networks using harmony search. Appl Soft Comput 46:875–885
Parker A, Nitschke G (2017) Autonomous intersection driving with neuro-evolution. In: Proceedings of the genetic and evolutionary computation conference companion. ACM, pp 133–134
Passino KM (2002) Biomimicry of bacterial foraging for distributed optimization and control. IEEE Control Syst 22(3):52–67
Passos LA, Rodrigues DR, Papa JP (2018) Fine tuning deep Boltzmann machines through meta-heuristic approaches. In: 2018 IEEE 12th international symposium on applied computational intelligence and informatics (SACI). IEEE, pp 000419–000424
Pawełczyk K, Kawulok M, Nalepa J (2018) Genetically-trained deep neural networks. In: Proceedings of the genetic and evolutionary computation conference companion. ACM, pp 63–64
Peña-Reyes CA, Sipper M (2000) Evolutionary computation in medicine: an overview. Artif Intell Med 19(1):1–23
Peng L, Liu S, Liu R, Wang L (2018) Effective long short-term memory with differential evolution algorithm for electricity price prediction. Energy 162:1301–1314
Piotrowski AP (2014) Differential evolution algorithms applied to neural network training suffer from stagnation. Appl Soft Comput 21:382–406
Rajasekhar A, Lynn N, Das S, Suganthan PN (2017) Computing with the collective intelligence of honey bees—a survey. Swarm Evolut Comput 32:25–48
Rao RV, Savsani VJ, Vakharia DP (2011) Teaching–learning-based optimization: a novel method for constrained mechanical design optimization problems. Comput Aided Des 43(3):303–315
Rashedi E, Nezamabadi-Pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci 179(13):2232–2248
Rawal A, Miikkulainen R (2016) Evolving deep LSTM-based memory networks using an information maximization objective. In: Friedrich T (ed) Proceedings of the genetic and evolutionary computation conference 2016 (GECCO ’16). ACM, New York, NY, USA, pp 501–508
Real E, Moore S, Selle A, Saxena S, Suematsu YL, Tan J, Le QV, Kurakin A (2017) Large-scale evolution of image classifiers. ICML 2017:2902–2911
Real E, Aggarwal A, Huang Y, Le QV (2018) Regularized evolution for image classifier architecture search. arXiv preprint arXiv:1802.01548
Reddy KK, Sarkar S, Venugopalan V, Giering M (2016) Anomaly detection and fault disambiguation in large flight data: a multi-modal deep auto-encoder approach. In: Annual conference of the prognostics and health management society, Denver, Colorado, pp 1–8
Risi S, Stanley KO (2012) A unified approach to evolving plasticity and neural geometry. In: International joint conference on neural networks. IEEE, pp 1–8
Rosa G, Papa J, Marana A, Scheirer W, Cox D (2015) Fine-tuning convolutional neural networks using harmony search. In: Iberoamerican congress on pattern recognition. Springer, Cham, pp 683–690
Rosa G, Papa J, Costa K, Passos L, Pereira C, Yang XS (2016) Learning parameters in deep belief networks through firefly algorithm. In: IAPR workshop on artificial neural networks in pattern recognition. Springer, Cham, pp 138–149
Salakhutdinov R, Hinton GE (2009) Deep Boltzmann machines. In: AISTATS, vol 1, p 3
Salakhutdinov R, Larochelle H (2010) Efficient learning of deep Boltzmann machines. In: Proceedings of the thirteenth international conference on artificial intelligence and statistics, pp 693–700
Salimans T, Ho J, Chen X, Sidor S, Sutskever I (2017) Evolution strategies as a scalable alternative to reinforcement learning. arXiv:1703.03864
Sánchez D, Melin P, Castillo O (2017) A grey wolf optimizer for modular granular neural networks for human recognition. Comput Intell Neurosci 2017:1–26
Sarikaya R, Hinton GE, Deoras A (2014) Application of deep belief networks for natural language understanding. IEEE/ACM Trans Audio Speech Lang Process 22(4):778–784
Schmidhuber J (2015) Deep learning in neural networks: an overview. Neural Netw 61:85–117
Shafiee M, Wong A (2016) Evolutionary synthesis of deep neural networks via synaptic cluster-driven genetic encoding. In: NIPS workshop on efficient methods for deep neural networks, thirtieth conference on neural information processing systems, Barcelona, Spain, Dec 5–10, 2016
Shenfield A, Rostami S (2017) Multi-objective evolution of artificial neural networks in multi-class medical diagnosis problems with class imbalance. In: 2017 IEEE conference on computational intelligence in bioinformatics and computational biology (CIBCB). IEEE, pp 1–8
Shi Y (2011) An optimization algorithm based on brainstorming process. Int J Swarm Intell Res 2(4):35–62
Shinozaki T, Watanabe S (2015) Structure discovery of deep neural network based on evolutionary algorithms. In: 2015 IEEE international conference on acoustics, speech, and signal processing (ICASSP 2015). IEEE, pp 4979–4983. https://doi.org/10.1109/icassp.2015.7178918
Silver D, Huang A, Maddison CJ, Guez A, Sifre L, Van Den Driessche G (2016) Mastering the game of Go with deep neural networks and tree search. Nature 529(7587):484–489
Simon D (2013) Evolutionary optimization algorithms. Wiley, New York
Simonyan K, Zisserman A (2015) Very deep convolutional networks for large-scale image recognition. In: ICLR
Singh P, Dwivedi P (2018) Integration of new evolutionary approach with artificial neural network for solving short term load forecast problem. Appl Energy 217:537–549
Song J, Niu Y (2016) Resilient finite-time stabilization of fuzzy stochastic systems with randomly occurring uncertainties and randomly occurring gain fluctuations. Neurocomputing 171:444–451
Song YS, Hu J, Chen D, Ji D, Liu F (2016) Recursive approach to networked fault estimation with packet dropouts and randomly occurring uncertainties. Neurocomputing 214:340–349
Srivastava N, Hinton G, Krizhevsky A, Sutskever I, Salakhutdinov R (2014a) Dropout: a simple way to prevent neural networks from overfitting. J Mach Learn Res 15(1):1929–1958
Srivastava N, Hinton G, Krizhevsky A, Sutskever I, Salakhutdinov R (2014b) Dropout: a simple way to prevent neural networks from overfitting. J Mach Learn Res 15(1):1929–1958
Stanley KO, Miikkulainen R (2002) Evolving neural networks through augmenting topologies. Evolut Comput 10(2):99–127
Stanley KO, Clune J, Lehman J, Miikkulainen R (2019) Designing neural networks through neuroevolution. Nat Mach Intell 1:24–35
Suganthan PN (2018) On non-iterative learning algorithms with closed-form solution. Appl Soft Comput 70:1078–1082
Sun Y, Yen GG, Yi Z (2018b) Evolving unsupervised deep neural networks for learning meaningful representations. IEEE Trans Evolut Comput 23:89–103
Szegedy C, Liu W, Jia Y, Sermanet P, Reed S, Anguelov D, Erhan D, Vanhoucke V, Rabinovich A (2015) Going deeper with convolutions. In: 2015 IEEE conference on computer vision and pattern recognition (CVPR), Boston, MA, pp 1–9
Zurück zum Zitat Takase T, Oyama S, Kurihara M (2018) Effective neural network training with adaptive learning rate based on training loss. Neural Netw 101:68–78CrossRef Takase T, Oyama S, Kurihara M (2018) Effective neural network training with adaptive learning rate based on training loss. Neural Netw 101:68–78CrossRef
Zurück zum Zitat Tan Y, Zhu Y (2010) Fireworks algorithm for optimization. In: International conference in swarm intelligence. Springer, Berlin, pp 355–364 Tan Y, Zhu Y (2010) Fireworks algorithm for optimization. In: International conference in swarm intelligence. Springer, Berlin, pp 355–364
Zurück zum Zitat Tan SC, Watada J, Ibrahim Z, Khalid M (2015) Evolutionary fuzzy ARTMAP neural networks for classification of semiconductor defects. IEEE Trans Neural Netw Learn Syst 26(5):933–950MathSciNetCrossRef Tan SC, Watada J, Ibrahim Z, Khalid M (2015) Evolutionary fuzzy ARTMAP neural networks for classification of semiconductor defects. IEEE Trans Neural Netw Learn Syst 26(5):933–950MathSciNetCrossRef
Zurück zum Zitat Team TTD, Al-Rfou R, Alain G, Almahairi A, Angermueller C, Bahdanau D et al (2016) Theano: a python framework for fast computation of mathematical expressions. arXiv preprint arXiv:1605.02688 Team TTD, Al-Rfou R, Alain G, Almahairi A, Angermueller C, Bahdanau D et al (2016) Theano: a python framework for fast computation of mathematical expressions. arXiv preprint arXiv:​1605.​02688
Zurück zum Zitat Thirukovalluru R, Dixit S, Sevakula RK, Verma NK, Salour A (2016) Generating feature sets for fault diagnosis using denoising stacked auto-encoder. In: 2016 IEEE international conference on prognostics and health management (ICPHM). IEEE, pp 1–7 Thirukovalluru R, Dixit S, Sevakula RK, Verma NK, Salour A (2016) Generating feature sets for fault diagnosis using denoising stacked auto-encoder. In: 2016 IEEE international conference on prognostics and health management (ICPHM). IEEE, pp 1–7
Zurück zum Zitat Tieleman T, Hinton GE (2012) Lecture 6.5—rmsprop, COURSERA: neural networks for machine learning Tieleman T, Hinton GE (2012) Lecture 6.5—rmsprop, COURSERA: neural networks for machine learning
Zurück zum Zitat Tirumala SS (2014) Implementation of evolutionary algorithms for deep architectures. CEUR workshop proceedings Tirumala SS (2014) Implementation of evolutionary algorithms for deep architectures. CEUR workshop proceedings
Zurück zum Zitat Tomoumi T, Satoshi O, Masahito K (2018) Effective neural network training with adaptive learning rate based on training loss. Neural Netw 101:68–78CrossRef Tomoumi T, Satoshi O, Masahito K (2018) Effective neural network training with adaptive learning rate based on training loss. Neural Netw 101:68–78CrossRef
Zurück zum Zitat Trivedi A, Srinivasan D, Sanyal K, Ghosh A (2017) A survey of multiobjective evolutionary algorithms based on decomposition. IEEE Trans Evolut Comput 21(3):440–462 Trivedi A, Srinivasan D, Sanyal K, Ghosh A (2017) A survey of multiobjective evolutionary algorithms based on decomposition. IEEE Trans Evolut Comput 21(3):440–462
Zurück zum Zitat Vincent P, Larochelle H, Lajoie I, Bengio Y, Manzagol PA (2010) Stacked denoising autoencoders: learning useful representations in a deep network with a local denoising criterion. J Mach Learn Res 11:3371–3408MathSciNetMATH Vincent P, Larochelle H, Lajoie I, Bengio Y, Manzagol PA (2010) Stacked denoising autoencoders: learning useful representations in a deep network with a local denoising criterion. J Mach Learn Res 11:3371–3408MathSciNetMATH
Zurück zum Zitat Wan L, Zeiler M, Zhang S, Le Cun Y, Fergus R (2013) Regularization of neural networks using dropconnect. In: International conference on machine learning, pp 1058–1066 Wan L, Zeiler M, Zhang S, Le Cun Y, Fergus R (2013) Regularization of neural networks using dropconnect. In: International conference on machine learning, pp 1058–1066
Zurück zum Zitat Wang B, Merrick KE, Abbass HA (2017) Co-operative coevolutionary neural networks for mining functional association rules. IEEE Trans Neural Netw Learn Syst 28(6):1331–1344CrossRef Wang B, Merrick KE, Abbass HA (2017) Co-operative coevolutionary neural networks for mining functional association rules. IEEE Trans Neural Netw Learn Syst 28(6):1331–1344CrossRef
Zurück zum Zitat Wang B, Sun Y, Xue B, Zhang M (2018a) A hybrid differential evolution approach to designing deep convolutional neural networks for image classification. In: The Australasian joint conference on artificial intelligence (AI 2018). Springer, pp 237–250 Wang B, Sun Y, Xue B, Zhang M (2018a) A hybrid differential evolution approach to designing deep convolutional neural networks for image classification. In: The Australasian joint conference on artificial intelligence (AI 2018). Springer, pp 237–250
Zurück zum Zitat Wang B, Sun Y, Xue B, Zhang M (2018b) Evolving deep convolutional neural networks by variable-length particle swarm optimization for image classification. arXiv preprint arXiv:1803.06492 Wang B, Sun Y, Xue B, Zhang M (2018b) Evolving deep convolutional neural networks by variable-length particle swarm optimization for image classification. arXiv preprint arXiv:​1803.​06492
Zurück zum Zitat Wang R, Clune J, Stanley KO (2018c) VINE: an open source interactive data visualization tool for neuroevolution. In: GECCO ‘18 companion: genetic and evolutionary computation conference companion, July 15–19, Kyoto, Japan. ACM, New York, NY, USA Wang R, Clune J, Stanley KO (2018c) VINE: an open source interactive data visualization tool for neuroevolution. In: GECCO ‘18 companion: genetic and evolutionary computation conference companion, July 15–19, Kyoto, Japan. ACM, New York, NY, USA
Zurück zum Zitat Wiatowski T, Bölcskei H (2018) A mathematical theory of deep convolutional neural networks for feature extraction. In: IEEE transactions on information theory, vol 64(3), pp 1845–1866 Wiatowski T, Bölcskei H (2018) A mathematical theory of deep convolutional neural networks for feature extraction. In: IEEE transactions on information theory, vol 64(3), pp 1845–1866
Zurück zum Zitat Wu ZY, Rahaman A (2017) Optimized deep learning framework for water distribution data-driven modeling. In: XVIII international conference on water distribution systems analysis, WDSA2016, Procedia Engineering, vol 186, pp 261–268 Wu ZY, Rahaman A (2017) Optimized deep learning framework for water distribution data-driven modeling. In: XVIII international conference on water distribution systems analysis, WDSA2016, Procedia Engineering, vol 186, pp 261–268
Zurück zum Zitat Xie L, Yuille A (2017) Genetic CNN. In: 2017 IEEE international conference on computer vision (ICCV), Venice, pp 1388–1397 Xie L, Yuille A (2017) Genetic CNN. In: 2017 IEEE international conference on computer vision (ICCV), Venice, pp 1388–1397
Zurück zum Zitat Yang XS (2010) Nature-inspired metaheuristic algorithms, 2nd edn. Luniver Press, Frome Yang XS (2010) Nature-inspired metaheuristic algorithms, 2nd edn. Luniver Press, Frome
Zurück zum Zitat Yang H, Wang Z, Shu H, Alsaadi FE, Hayat T (2016) Almost sure H∞ sliding mode control for nonlinear stochastic systems with Markovian switching and time-delays. Neurocomputing 175(Part A):392–400CrossRef Yang H, Wang Z, Shu H, Alsaadi FE, Hayat T (2016) Almost sure H∞ sliding mode control for nonlinear stochastic systems with Markovian switching and time-delays. Neurocomputing 175(Part A):392–400CrossRef
Zurück zum Zitat Yao X (1999) Evolving artificial neural networks. Proc IEEE 87(9):1423–1447CrossRef Yao X (1999) Evolving artificial neural networks. Proc IEEE 87(9):1423–1447CrossRef
Zurück zum Zitat Yao X, Liu Y (1997) A new evolutionary system for evolving artificial neural networks. IEEE Trans Neural Netw Learn Syst 8(3):694–713CrossRef Yao X, Liu Y (1997) A new evolutionary system for evolving artificial neural networks. IEEE Trans Neural Netw Learn Syst 8(3):694–713CrossRef
Zurück zum Zitat Ye F (2017) Particle swarm optimization-based automatic parameter selection for deep neural networks and its applications in large-scale and high-dimensional data. PLoS ONE 12(12):e0188746CrossRef Ye F (2017) Particle swarm optimization-based automatic parameter selection for deep neural networks and its applications in large-scale and high-dimensional data. PLoS ONE 12(12):e0188746CrossRef
Zurück zum Zitat Yuan Y, Sun F, Liu H, Yang H (2014a) Low-frequency robust control for singularly perturbed system. IET Control Theory Appl 9(2):203–210MathSciNetCrossRef Yuan Y, Sun F, Liu H, Yang H (2014a) Low-frequency robust control for singularly perturbed system. IET Control Theory Appl 9(2):203–210MathSciNetCrossRef
Zurück zum Zitat Yuan Z, Lu Y, Wang Z, Xue Y (2014b) Droid-sec: deep learning in android malware detection. In: ACM SIGCOMM computer communication review, vol 44(4). ACM., pp 371–372 Yuan Z, Lu Y, Wang Z, Xue Y (2014b) Droid-sec: deep learning in android malware detection. In: ACM SIGCOMM computer communication review, vol 44(4). ACM., pp 371–372
Zurück zum Zitat Yuan Z, Lu Y, Xue Y (2016) Droiddetector: android malware characterization and detection using deep learning. Tsinghua Sci Technol 21(1):114–123CrossRef Yuan Z, Lu Y, Xue Y (2016) Droiddetector: android malware characterization and detection using deep learning. Tsinghua Sci Technol 21(1):114–123CrossRef
Zurück zum Zitat Zhang C, Lim P, Qin AK, Tan KC (2017a) Multiobjective deep belief networks ensemble for remaining useful life estimation in prognostics. IEEE Trans Neural Netw Learn Syst 28(10):2306–2318CrossRef Zhang C, Lim P, Qin AK, Tan KC (2017a) Multiobjective deep belief networks ensemble for remaining useful life estimation in prognostics. IEEE Trans Neural Netw Learn Syst 28(10):2306–2318CrossRef
Zurück zum Zitat Zhong Z, Yan J, Liu C-L (2018) Practical network blocks design with q-learning. In; Proceedings of the IEEE conference on computer vision and pattern recognition (CVPR 2018), pp 2423–2432 Zhong Z, Yan J, Liu C-L (2018) Practical network blocks design with q-learning. In; Proceedings of the IEEE conference on computer vision and pattern recognition (CVPR 2018), pp 2423–2432
Zurück zum Zitat Zhou C, Paffenroth RC (2017) Anomaly detection with robust deep autoencoders. In: Proceedings of the 23rd ACM SIGKDD international conference on knowledge discovery and data mining. ACM, pp 665–674 Zhou C, Paffenroth RC (2017) Anomaly detection with robust deep autoencoders. In: Proceedings of the 23rd ACM SIGKDD international conference on knowledge discovery and data mining. ACM, pp 665–674
Zurück zum Zitat Zhou S, Chen Q, Wang X (2010) Discriminative deep belief networks for image classification. In 2010 17th IEEE international conference on image processing (ICIP). IEEE, pp 1561–1564 Zhou S, Chen Q, Wang X (2010) Discriminative deep belief networks for image classification. In 2010 17th IEEE international conference on image processing (ICIP). IEEE, pp 1561–1564
Zurück zum Zitat Zhou A, Qu BY, Li H, Zhao SZ, Suganthan PN, Zhang Q (2011) Multiobjective evolutionary algorithms: a survey of the state of the art. Swarm Evolut Comput 1(1):32–49CrossRef Zhou A, Qu BY, Li H, Zhao SZ, Suganthan PN, Zhang Q (2011) Multiobjective evolutionary algorithms: a survey of the state of the art. Swarm Evolut Comput 1(1):32–49CrossRef
Zurück zum Zitat Zhu G, Lizotte D, Hoey J (2014) Scalable approximate policies for Markov decision process models of hospital elective admissions. Artif Intell Med 61(1):21–34CrossRef Zhu G, Lizotte D, Hoey J (2014) Scalable approximate policies for Markov decision process models of hospital elective admissions. Artif Intell Med 61(1):21–34CrossRef
Zurück zum Zitat Zoph B, Vasudevan V, Shlens J, Le QV (2017) Learning transferable architectures for scalable image recognition. arXiv preprint arXiv:1707.07012 Zoph B, Vasudevan V, Shlens J, Le QV (2017) Learning transferable architectures for scalable image recognition. arXiv preprint arXiv:​1707.​07012
Metadata
Title
A survey of swarm and evolutionary computing approaches for deep learning
Authors
Ashraf Darwish
Aboul Ella Hassanien
Swagatam Das
Publication date
13.06.2019
Publisher
Springer Netherlands
Published in
Artificial Intelligence Review / Issue 3/2020
Print ISSN: 0269-2821
Electronic ISSN: 1573-7462
DOI
https://doi.org/10.1007/s10462-019-09719-2
