Exponential stability of Markovian jumping Cohen–Grossberg neural networks with mixed mode-dependent time-delays
Introduction
The past few decades have witnessed rapid developments in neural networks, which have been successfully applied in a variety of areas such as system identification and control, pattern recognition, game playing and decision making, sequence recognition, data mining, quantum chemistry, medical diagnosis and visualization [1], [2]. Accordingly, a great number of results have been published concerning various classes of neural networks, such as biological neural networks, bidirectional associative neural networks, artificial neural networks, Hopfield neural networks, and Cohen–Grossberg neural networks [3], [4]. It is worth mentioning that the Cohen–Grossberg neural network has received special research interest because it encompasses several other neural networks as special cases [5], [6], [7], [8], [9]. Consequently, a rich body of results concerning various stability criteria has been reported for Cohen–Grossberg-type neural networks. For example, in [10], [11], the exponential stability problem has been discussed for delayed Cohen–Grossberg neural networks. The global exponential stability problem has been discussed in [12] for inertial Cohen–Grossberg-type neural networks with time delays. In [13], the exponential stability and the almost sure exponential stability have been studied for a class of fuzzy stochastic Cohen–Grossberg neural networks with time-invariant delays. In addition, the input-to-state stability problem has been addressed in [14] for a class of stochastic impulsive Cohen–Grossberg neural networks with mixed delays.
As is well known, time-delays are frequently encountered in neural processing and signal transmission [15], [16], [17], [18]. Therefore, considerable research effort has been devoted to the dynamical behaviour analysis of neural networks subject to discrete time-delays. For example, the mean-square exponential stability problem has been studied in [11], [19] for fuzzy Cohen–Grossberg neural networks with time delays, and some sufficient conditions have been given based on the linear matrix inequality technique. In parallel to discrete time-delays, another class of time-delays, namely distributed time-delays, has begun to receive research attention because a multitude of parallel pathways with a variety of axon sizes and lengths commonly exists in a neural network, so the network often exhibits distributed time-delays [20], [21]. In [20], the global asymptotic stability problem has been studied for a class of stochastic Cohen–Grossberg neural networks with both discrete and distributed time-delays. Recently, the existence and global exponential stability of periodic solutions have been investigated in [21] for a general class of fuzzy Cohen–Grossberg bidirectional associative memory neural networks with variable coefficients and mixed time-delays.
On another research frontier, random mode switches may occur in neural networks due to random failures, repairs of network components, and/or sudden environment switching [22]. Such a phenomenon can be efficiently characterized by a Markovian chain, and hence Markovian jumping neural networks have attracted a great deal of research interest in the past ten years [23], [24], [25], [26]. For instance, in [23], [24], new delay-dependent robust exponential stability conditions have been given for Markovian jump stochastic neural networks with the simultaneous presence of parameter uncertainties and time-delays. In [26], the stability and synchronization problems have been investigated for Markovian switching neural networks with stochastic perturbation, where both the coupling information and the impulsive delay have been fully reflected in the developed synchronization criteria. Very recently, in [22], the asymptotic synchronization problem has been investigated for an array of identical neutral-type recurrent neural networks subject to Markovian jumping parameters, mode-dependent discrete time-delays and mode-dependent unbounded distributed time-delays, where a semi-definite programming approach has been developed to establish sufficient criteria guaranteeing the asymptotic synchronization of the coupled recurrent neural networks in the mean square sense. However, the exponential stability problem for Markovian jumping Cohen–Grossberg neural networks with mixed mode-dependent time-delays has not yet been fully addressed, which constitutes the main motivation of the current research.
The main contribution of this paper is to establish a novel framework for the exponential stability problem of a class of Cohen–Grossberg neural networks with the simultaneous presence of Markovian jumping parameters and mixed time-delays. By using the Lyapunov stability theory and the stochastic analysis technique, the stability problem of the addressed Cohen–Grossberg neural networks is converted into the feasibility problem of a set of linear matrix inequalities. Specifically, by constructing a new Lyapunov–Krasovskii functional, sufficient criteria are given under which the considered Cohen–Grossberg neural networks are exponentially stable in the mean square sense provided that a set of linear matrix inequalities is feasible. It is worth mentioning that the proposed results are delay-dependent and can be easily checked by using the Matlab LMI toolbox. Finally, a numerical example is utilized to show the feasibility and applicability of the proposed exponential stability criterion.
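The paper's LMI conditions involve its specific Lyapunov–Krasovskii functional and are not reproduced in this snippet. As a minimal illustration of the general idea of certifying exponential stability through a matrix-inequality feasibility test, the classical Lyapunov inequality AᵀP + PA < 0, P > 0 for a delay-free linear system can be checked with SciPy (the matrix A below is hypothetical and not taken from the paper's example):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable system matrix (not from the paper's example).
A = np.array([[-2.0, 0.5],
              [ 0.3, -1.5]])

# Solve A^T P + P A = -Q with Q = I.  If the solution P is positive
# definite, the Lyapunov LMI is feasible and dx/dt = A x is
# exponentially stable.
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)

eigs = np.linalg.eigvalsh(P)
print("P positive definite:", bool(np.all(eigs > 0)))
```

For the full delay-dependent, mode-dependent conditions of the paper, a semi-definite programming solver (e.g. the Matlab LMI toolbox mentioned above) plays the role of this feasibility check.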
The rest of this paper is organized as follows. In Section 2, the problem under consideration is briefly introduced. In Section 3, a new Lyapunov–Krasovskii functional is constructed, and some stability criteria are given in terms of a set of linear matrix inequalities that can be easily solved by using the semi-definite program method. A simulation example is provided in Section 4 to show the feasibility and effectiveness of the presented results. This paper is concluded in Section 5.
Notations: The notations used throughout the paper are standard. ℝⁿ denotes the n-dimensional Euclidean space. For a matrix X, Xᵀ represents its transpose. The notation X > 0 (X ≥ 0) represents that the matrix X is symmetric and positive definite (positive semi-definite). ‖·‖ stands for the Euclidean norm of a vector. diag{·} denotes a block-diagonal matrix. 𝔼{x} is the mathematical expectation of x and 𝔼{x|y} is the mathematical expectation of x conditional on y. I and 0 represent the identity matrix and the zero matrix with appropriate dimensions, respectively. In symmetric block matrices or long matrix expressions, we use a star “⁎” to represent a term that is induced by symmetry. Matrices, if their dimensions are not explicitly stated, are assumed to be compatible for algebraic operations.
Problem formulation
Let r(t), t ≥ 0, be a right-continuous Markovian chain on a complete probability space taking values in a finite state space S = {1, 2, …, N} with generator Γ = (γᵢⱼ)_{N×N} given by

P{r(t + Δ) = j | r(t) = i} = γᵢⱼΔ + o(Δ) if i ≠ j, and 1 + γᵢᵢΔ + o(Δ) if i = j.

Here Δ > 0, lim_{Δ→0} o(Δ)/Δ = 0, and γᵢⱼ ≥ 0 is the transition rate from mode i to mode j if i ≠ j, while γᵢᵢ = −Σ_{j≠i} γᵢⱼ.
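Such a chain can be sampled directly from its generator: in mode i the chain stays for an exponentially distributed holding time with rate −γᵢᵢ and then jumps to mode j with probability γᵢⱼ/(−γᵢᵢ). A minimal NumPy sketch (the generator G below is hypothetical, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-mode generator: rows sum to zero, off-diagonal
# entries are the transition rates gamma_ij.
G = np.array([[-0.5,  0.3,  0.2],
              [ 0.4, -0.7,  0.3],
              [ 0.2,  0.2, -0.4]])

def simulate_chain(G, i0, T):
    """Sample a path of r(t) on [0, T] as a list of (jump time, mode)."""
    t, i, path = 0.0, i0, [(0.0, i0)]
    while True:
        rate = -G[i, i]
        t += rng.exponential(1.0 / rate)   # exponential holding time in mode i
        if t >= T:
            return path
        probs = np.maximum(G[i], 0.0)      # zero out the diagonal entry
        probs /= probs.sum()               # jump distribution over j != i
        i = rng.choice(len(G), p=probs)
        path.append((t, i))

path = simulate_chain(G, 0, 10.0)
print(path[:3])
```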
In this paper, we consider the following n-neuron Cohen–Grossberg neural networks with mixed mode-dependent time-delays:
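The model equations (1), (2) are not reproduced in this snippet. For orientation, a representative mode-dependent Cohen–Grossberg form with discrete and distributed delays (a sketch only; the paper's exact notation, numbering and coefficient structure may differ) reads:

```latex
\dot{x}_i(t) = -\,a_i\big(x_i(t)\big)\Big[\, b_i\big(x_i(t)\big)
  - \sum_{j=1}^{n} c_{ij}\big(r(t)\big)\, f_j\big(x_j(t)\big)
  - \sum_{j=1}^{n} d_{ij}\big(r(t)\big)\, g_j\big(x_j(t-\tau_{r(t)}(t))\big)
  - \sum_{j=1}^{n} e_{ij}\big(r(t)\big) \int_{t-\delta_{r(t)}(t)}^{t} h_j\big(x_j(s)\big)\,\mathrm{d}s \,\Big],
  \qquad i = 1,\dots,n,
```

where the aᵢ(·) are amplification functions, the bᵢ(·) are behaved functions, fⱼ, gⱼ, hⱼ are activation functions, and τ_{r(t)}(t), δ_{r(t)}(t) denote the mode-dependent discrete and distributed time-delays, respectively.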
Main results and proof
In this section, the stability analysis problem is studied for the Markovian jumping Cohen–Grossberg neural networks (1), (2) with mixed mode-dependent time-delays.
To begin with, we introduce the following lemma, which will be used in the proof of the main results.

Lemma 1 (Gu [27]): Let M be a positive semi-definite matrix, let scalars a and b satisfy a < b, and let ω : [a, b] → ℝⁿ be a vector function. If the integrations concerned are well defined, then the following inequality holds: (∫ₐᵇ ω(s) ds)ᵀ M (∫ₐᵇ ω(s) ds) ≤ (b − a) ∫ₐᵇ ωᵀ(s) M ω(s) ds.
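This Jensen-type integral inequality can be checked numerically on a grid; the matrix M and the function ω(s) below are hypothetical choices for illustration:

```python
import numpy as np

# Numerical check of the Jensen-type integral inequality
#   (int_a^b w)^T M (int_a^b w)  <=  (b - a) * int_a^b w^T M w
# on a discretized grid, for a hypothetical M >= 0 and w(s).
a, b, N = 0.0, 2.0, 20000
s = np.linspace(a, b, N)
ds = s[1] - s[0]

M = np.array([[2.0, 1.0],
              [1.0, 1.0]])                  # positive definite
w = np.vstack([np.sin(s), np.cos(2 * s)])   # w(s) in R^2, one column per s

Iw = w.sum(axis=1) * ds                     # Riemann sum for int_a^b w(s) ds
lhs = Iw @ M @ Iw
rhs = (b - a) * np.einsum('is,ij,js->', w, M, w) * ds

print(lhs <= rhs)
```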
A numerical example
In this section, we present a simulation example to demonstrate the effectiveness of the theoretical results proposed in this paper.
Consider a three-neuron three-mode Cohen–Grossberg neural network (1), (2) with the following parameters:
Conclusions
In this paper, the exponential stability problem has been investigated for a class of Cohen–Grossberg neural networks subject to Markovian jumping parameters and mixed mode-dependent time-delays. Based on the Lyapunov stability theory, sufficient conditions have been given to guarantee the exponential stability of the addressed neural networks in the mean square sense. The proposed stability criteria can be readily verified by using standard numerical software. Finally, a simulation example has been provided to demonstrate the effectiveness of the proposed results.
Acknowledgements
This project was funded by the Deanship of Scientific Research (DSR), King Abdulaziz University, under Grant no. 9-130-36-HiCi. The authors, therefore, acknowledge with thanks DSR technical and financial support.
References (40)
- Robust stability criteria for Takagi–Sugeno fuzzy Cohen–Grossberg neural networks of neutral type, Neurocomputing (2014)
- Adaptive synchronization of Cohen–Grossberg neural networks with unknown parameters and mixed time-varying delays, Commun. Nonlinear Sci. Numer. Simul. (2012)
- Multistability analysis for a general class of delayed Cohen–Grossberg neural networks, Inf. Sci. (2012)
- Mean square exponential stability of stochastic fuzzy delayed Cohen–Grossberg neural networks with expectations in the coefficients, Neurocomputing (2015)
- Stability analysis of inertial Cohen–Grossberg-type neural networks with time delays, Neurocomputing (2013)
- Exponential and almost sure exponential stability of stochastic fuzzy delayed Cohen–Grossberg neural networks, Fuzzy Sets Syst. (2012)
- Global asymptotic stability of nonautonomous Cohen–Grossberg neural network models with infinite delays, Appl. Math. Comput. (2015)
- Lagrange exponential stability for fuzzy Cohen–Grossberg neural networks with time-varying delays, Fuzzy Sets Syst. (2015)
- Envelope-constrained filtering with fading measurements and randomly occurring nonlinearities: the finite horizon case, Automatica (2015)
- Finite-horizon reliable control with randomly occurring uncertainties and nonlinearities subject to output quantization, Automatica (2015)
- Finite-horizon estimation of randomly occurring faults for a class of nonlinear time-varying systems, Automatica
- Neural network and support vector machine predictive control of tert-amyl methyl ether reactive distillation column, Syst. Sci. Control Eng.: An Open Access J.
- Artificial neural network-based induction motor fault classifier using continuous wavelet transform, Syst. Sci. Control Eng.: An Open Access J.
- Global stability in Lagrange sense for BAM-type Cohen–Grossberg neural networks with time-varying delays, Syst. Sci. Control Eng.: An Open Access J.
- Adaptive regulation synchronization for a class of delayed Cohen–Grossberg neural networks, Nonlinear Dyn.
- Adaptive synchronization of different Cohen–Grossberg chaotic neural networks with unknown parameters and time-varying delays, Nonlinear Dyn.
- Adaptive lag synchronization of chaotic Cohen–Grossberg neural networks with discrete delays, Chaos
- Adaptive exponential synchronization of delayed Cohen–Grossberg neural networks with discontinuous activations, Int. J. Mach. Learn. Cybern.
- Exponential input-to-state stability of stochastic Cohen–Grossberg neural networks with mixed delays, Nonlinear Dyn.
- Quantized control for nonlinear stochastic time-delay systems with missing measurements, IEEE Trans. Autom. Control
Yurong Liu was born in China in 1964. He received his B.Sc. degree in Mathematics from Suzhou University, Suzhou, China, in 1986, the M.Sc. degree in Applied Mathematics from Nanjing University of Science and Technology, Nanjing, China, in 1989, and the Ph.D. degree in Applied Mathematics from Suzhou University, Suzhou, China, in 2001.
Dr. Liu is currently a professor with the Department of Mathematics at Yangzhou University, China. He also serves as an Associate Editor of Neurocomputing. So far, he has published more than 50 papers in refereed international journals. His current interests include stochastic control, neural networks, complex networks, nonlinear dynamics, time-delay systems, multi-agent systems, and chaotic dynamics.
Weibo Liu received his B.Sc. degree in electrical engineering from University of Liverpool, Liverpool, UK, in 2015. He is currently a Ph.D. student at the Department of Computer Science, Brunel University London, London, UK. His research interests include big data analysis and deep learning.
Mustafa Ali Obaid received his Ph.D. degree from the University of Kansas, Manhattan, Kansas, USA. He is presently working as a Professor of Mathematics in King Abdulaziz University, Jeddah, Saudi Arabia. His research interests include applied algebra and differential equations.
Ibrahim Atiatallah Abbas completed his Ph.D. in Biomathematics in 2004. He has been working as an Associate Professor of Mathematics in the Faculty of Science and Arts-Khulais, King Abdulaziz University, Saudi Arabia, since 2011. His research interests include the finite element method, biological tissues, fluid mechanics, porous media, and fractional calculus. He has published several papers in journals of international repute.