
2019 | Original Paper | Book Chapter

7. Hopfield Networks, Simulated Annealing, and Chaotic Neural Networks

Authors: Ke-Lin Du, M. N. S. Swamy

Published in: Neural Networks and Statistical Learning

Publisher: Springer London


Abstract

The Hopfield model is the most popular dynamic neural network model. Simulated annealing, inspired by annealing in metallurgy, is a metaheuristic for approximating the global optimum in a large search space. The annealing concept is widely used in the training of recurrent neural networks. Chaotic neural networks are recurrent neural networks endowed with chaotic dynamics. The cellular neural network generalizes the Hopfield network to a two- or higher-dimensional array of cells. This chapter is dedicated to these topics, which are widely used for solving combinatorial optimization problems.
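
To make the annealing idea concrete, the sketch below applies the Metropolis acceptance rule with a geometric cooling schedule to a toy Hopfield-style quadratic energy over bipolar states. It is a minimal illustration only: the weight matrix, cooling rate, and step count are assumptions chosen for the example, not taken from the chapter.

    # Minimal simulated annealing sketch on a toy Hopfield-style energy
    # (illustrative assumptions throughout; not the chapter's own code).
    import math
    import random

    def energy(state):
        """Toy quadratic energy E(s) = -1/2 s^T W s on bipolar states."""
        W = [[0, 1, -1, 1],
             [1, 0, 1, -1],
             [-1, 1, 0, 1],
             [1, -1, 1, 0]]  # hypothetical symmetric weights, zero diagonal
        n = len(state)
        return -0.5 * sum(W[i][j] * state[i] * state[j]
                          for i in range(n) for j in range(n))

    def simulated_annealing(n=4, T0=5.0, alpha=0.95, steps=2000, seed=0):
        rng = random.Random(seed)
        state = [rng.choice([-1, 1]) for _ in range(n)]  # random bipolar start
        best, best_e = state[:], energy(state)
        T = T0
        for _ in range(steps):
            i = rng.randrange(n)          # propose flipping one unit
            cand = state[:]
            cand[i] = -cand[i]
            dE = energy(cand) - energy(state)
            # Metropolis rule: accept downhill moves always,
            # uphill moves with probability exp(-dE / T).
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                state = cand
            if energy(state) < best_e:
                best, best_e = state[:], energy(state)
            T *= alpha                    # geometric cooling schedule
        return best, best_e

    if __name__ == "__main__":
        s, e = simulated_annealing()
        print("best state:", s, "energy:", e)

At high temperature the walk accepts many uphill moves and explores broadly; as T decays, it behaves increasingly like greedy energy descent, which is the same trade-off exploited when annealing is combined with recurrent networks for combinatorial optimization.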


Metadata
Title
Hopfield Networks, Simulated Annealing, and Chaotic Neural Networks
Authors
Ke-Lin Du
M. N. S. Swamy
Copyright Year
2019
Publisher
Springer London
DOI
https://doi.org/10.1007/978-1-4471-7452-3_7