
Publication date: 01.09.2018

Adaptation of General Concepts of Software Testing to Neural Networks

Authors: Yu. L. Karpov, L. E. Karpov, Yu. G. Smetanin

Published in: Programming and Computer Software | Issue 5/2018


Abstract

The problem of testing and debugging learning neural network systems is discussed. The differences between these systems and program implementations of algorithms are noted from the standpoint of testing. Requirements for the testing systems are identified. Specific features of various neural network models are analyzed with respect to the choice of the testing technique and the determination of the parameters to be tested. Ways to eliminate the noted drawbacks of the systems under study are discussed. The discussion is illustrated by an example.


Footnotes
1
A function \(a(x)\) is called a sigmoid function if it is continuously differentiable, monotonically increasing, and bounded both from below and from above.
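
The footnote does not fix a particular sigmoid; as a hedged illustration (the function names and the use of NumPy are assumptions of this sketch, not taken from the paper), the logistic function \(1/(1 + e^{-z})\) satisfies all three stated conditions:

    import numpy as np

    def logistic(z):
        # Logistic sigmoid 1 / (1 + exp(-z)): continuously differentiable,
        # monotonically increasing, bounded below by 0 and above by 1.
        return 1.0 / (1.0 + np.exp(-z))

    def logistic_prime(z):
        # Its derivative, conveniently expressed through the function value itself.
        s = logistic(z)
        return s * (1.0 - s)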
 
2
In binary networks, neuron values are sometimes defined to be (1, 0). In this case, the activation function is the Heaviside function \(\Theta(z) = \begin{cases} 1, & z \geqslant 0, \\ 0, & z < 0. \end{cases}\) The transition from these values to the standard ones (1, –1) is trivial: \(y = 2x - 1\).
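
A minimal sketch of both conventions (Python and the helper names are illustrative assumptions, not taken from the article): the Heaviside activation yields values in (1, 0), and \(y = 2x - 1\) converts them to the standard (1, –1):

    import numpy as np

    def heaviside(z):
        # Heaviside step activation: 1 for z >= 0, 0 for z < 0.
        return np.where(z >= 0, 1, 0)

    def to_standard(x):
        # Transition from the (1, 0) convention to the standard (1, -1): y = 2x - 1.
        return 2 * x - 1

    z = np.array([-0.5, 0.0, 1.3])
    print(heaviside(z))               # [0 1 1]
    print(to_standard(heaviside(z)))  # [-1  1  1]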
 
Metadata
Title
Adaptation of General Concepts of Software Testing to Neural Networks
Authors
Yu. L. Karpov
L. E. Karpov
Yu. G. Smetanin
Publication date
01.09.2018
Publisher
Pleiades Publishing
Published in
Programming and Computer Software / Issue 5/2018
Print ISSN: 0361-7688
Electronic ISSN: 1608-3261
DOI
https://doi.org/10.1134/S0361768818050031
