Published in: Journal of Inequalities and Applications 1/2011

Open Access | Research | 01.12.2011

Further analysis on stability of delayed neural networks via inequality technique

Author: Jiayu Wang

Abstract

In this paper, further analysis on the stability of delayed neural networks is presented via an impulsive delay differential inequality recently obtained by Li. Based on this inequality, some new sufficient conditions ensuring the global exponential stability of impulsive delay neural networks are derived, and estimates of the exponential convergence rates are also obtained. The conditions are less conservative and restrictive than those established in earlier references. In addition, some numerical examples are given to show the effectiveness of the obtained results.

1. Introduction and preliminaries

In recent years, extensive research has been done on neural networks such as Hopfield neural networks, Cohen-Grossberg neural networks, cellular neural networks, and bidirectional associative memory neural networks, because of their potential applications in pattern recognition, image processing, associative memory, and so on; see [1–28]. Recently, a new type of neural network has emerged: the impulsive neural network, which displays a combination of characteristics of both continuous-time and discrete-time systems and provides an appropriate description of the phenomena of abrupt qualitative dynamical changes in essentially continuous-time systems; see [4, 9, 13–22]. The stability of impulsive delay neural networks has become an important topic of theoretical study and has been investigated by many researchers via different approaches; see [9, 13–16, 20–22] and the references cited therein. For example, Liu et al. [14] obtained some sufficient conditions on global exponential stability, by utilizing the impulsive delay differential inequality given by Yue et al. [18], for impulsive high-order Hopfield neural networks with time-varying delays of the form
$$\begin{aligned} C_i\frac{du_i(t)}{dt} &= -\frac{u_i(t)}{R_i} + \sum_{j=1}^{n}T_{ij}\,g_j(u_j(t-\tau_j(t))) + \sum_{j=1}^{n}\sum_{l=1}^{n}T_{ijl}\,g_j(u_j(t-\tau_j(t)))\,g_l(u_l(t-\tau_l(t))) + I_i,\quad t\ne t_k,\ t\ge t_0,\\ \Delta u_i\big|_{t=t_k} &= d_iu_i(t_k^-) + \sum_{j=1}^{n}W_{ij}\,h_j(u_j(t_k^--\tau_j(t_k^-))) + \sum_{j=1}^{n}\sum_{l=1}^{n}W_{ijl}\,h_j(u_j(t_k^--\tau_j(t_k^-)))\,h_l(u_l(t_k^--\tau_l(t_k^-))),\quad i\in\Lambda,\ k\in\mathbb{Z}_+,\\ u_i(s) &= \phi_i(s),\quad s\in[t_0-\tau,t_0]. \end{aligned}$$
(1.1)
In [20, 21], Xu and Yang investigated the global exponential stability of impulsive delay neural networks by establishing a delay differential inequality with impulsive initial conditions; the results extend and improve the recent works [23, 24]. More recently, Yang et al. [22] investigated global exponential stability via a Lyapunov function and the Halanay inequality for impulsive extended BAM-type Cohen-Grossberg neural networks with delays and variable coefficients of the form
$$\begin{aligned} x_i'(t) &= -a_i(x_i(t))\Bigl[b_i(x_i(t)) - \sum_{j=1}^{m}p_{ji}f_j(y_j(t))\,u_j - \sum_{j=1}^{m}r_{ji}f_j(y_j(t-\tau_{ji}))\,v_j + c_i\Bigr],\quad i=1,\dots,n,\\ y_j'(t) &= -a_j(y_j(t))\Bigl[b_j(y_j(t)) - \sum_{i=1}^{n}q_{ij}g_i(x_i(t))\,w_i - \sum_{i=1}^{n}s_{ij}g_i(x_i(t-\sigma_{ij}))\,e_i + d_j\Bigr],\quad j=1,\dots,m,\\ x_i(s) &= \phi_i(s),\qquad y_j(s) = \psi_j(s),\quad s\in[t_0-\tau,t_0], \end{aligned}$$
(1.2)
where
$$u_j = 1 + \sum_{k=1}^{\infty}\alpha_{jk}\delta(t-t_k),\qquad v_j = 1 + \sum_{k=1}^{\infty}\beta_{jk}\delta(t-t_k),\qquad w_i = 1 + \sum_{k=1}^{\infty}\gamma_{ik}\delta(t-t_k),\qquad e_i = 1 + \sum_{k=1}^{\infty}\lambda_{ik}\delta(t-t_k).$$
Although various stability conditions for impulsive delay neural networks have been proposed in [9, 14, 15, 18–22], they are conservative to some extent, and there is still room for further improvement.
Recently, Li [25] established the following new impulsive delay differential inequality:
Lemma 1.1. Let α, β, r and τ denote nonnegative constants, and suppose the function $f \in PC(\mathbb{R}, \mathbb{R}_+)$ satisfies the scalar impulsive differential inequality
$$\begin{aligned} D^+f(t) &\le -\alpha f(t) + \beta\sup_{t-\tau\le s\le t}f(s) + r\int_{0}^{\sigma}k(s)f(t-s)\,ds,\quad t\ne t_k,\ t\ge t_0,\\ f(t_k) &\le a_kf(t_k^-) + b_k\sup_{t_k-\tau\le s< t_k}f(s),\quad k\in\mathbb{Z}_+, \end{aligned}$$
where $0 < \sigma \le +\infty$, $a_k, b_k \in \mathbb{R}_+$, and $k(\cdot) \in PC([0, \sigma], \mathbb{R}_+)$ satisfies $\int_{0}^{\sigma}k(s)e^{\eta_0 s}\,ds < \infty$ for some constant $\eta_0 > 0$ in the case $\sigma = +\infty$; moreover, when $\sigma = +\infty$, the interval $[t-\sigma, t]$ is understood to be replaced by $(-\infty, t]$.
Assume that
(i) $\alpha > \beta + r\int_{0}^{\sigma}k(s)\,ds$.
(ii) There exist constants M > 0, η > 0 such that
$$\prod_{k=1}^{n}\max\bigl\{1,\; a_k + b_ke^{\lambda\tau}\bigr\} \le Me^{\eta(t_n-t_0)},\quad n\in\mathbb{Z}_+,$$
where λ ∈ (0, η0) satisfies
$$\lambda < \alpha - \beta e^{\lambda\tau} - r\int_{0}^{\sigma}k(s)e^{\lambda s}\,ds.$$
Then,
$$f(t) \le M\sup_{t_0-\max\{\sigma,\tau\}\le s\le t_0}f(s)\,e^{-(\lambda-\eta)(t-t_0)},\quad t\ge t_0.$$
In particular, it includes the following special case:
Lemma 1.2. Let α, β and τ denote nonnegative constants, $a_k, b_k \in \mathbb{R}_+$, and suppose the function $f \in PC(\mathbb{R}, \mathbb{R}_+)$ satisfies
$$\begin{aligned} D^+f(t) &\le -\alpha f(t) + \beta\sup_{t-\tau\le s\le t}f(s),\quad t\ne t_k,\ t\ge t_0,\\ f(t_k) &\le a_kf(t_k^-) + b_k\sup_{t_k-\tau\le s< t_k}f(s),\quad k\in\mathbb{Z}_+. \end{aligned}$$
(1.3)
Assume that
(i) α > β ≥ 0.
(ii) There exist constants M > 0, η > 0 such that
$$\prod_{k=1}^{n}\max\bigl\{1,\; a_k + b_ke^{\lambda\tau}\bigr\} \le Me^{\eta(t_n-t_0)},\quad n\in\mathbb{Z}_+,$$
where λ > 0 satisfies
$$\lambda < \alpha - \beta e^{\lambda\tau}.$$
Then
$$f(t) \le M\sup_{t_0-\tau\le s\le t_0}f(s)\,e^{-(\lambda-\eta)(t-t_0)},\quad t\ge t_0.$$
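Both lemmas ultimately require exhibiting a λ > 0 with $\lambda < \alpha - \beta e^{\lambda\tau}$. Since $g(\lambda) = \alpha - \beta e^{\lambda\tau} - \lambda$ is continuous, strictly decreasing, and satisfies $g(0) = \alpha - \beta > 0$, the admissible exponents form an interval $(0, \lambda^*)$, and $\lambda^*$ is easy to locate by bisection. A minimal sketch in Python (the function name `max_decay_rate` and the sample constants are ours, chosen in the spirit of the two-neuron example in Section 4):

```python
import math

def max_decay_rate(alpha: float, beta: float, tau: float, tol: float = 1e-10) -> float:
    """Largest lam with lam = alpha - beta*exp(lam*tau); every lam below the
    returned value satisfies the strict inequality required by Lemma 1.2."""
    if not alpha > beta >= 0:
        raise ValueError("Lemma 1.2 requires alpha > beta >= 0")
    g = lambda lam: alpha - beta * math.exp(lam * tau) - lam
    lo, hi = 0.0, alpha            # g(0) = alpha - beta > 0 and g(alpha) <= 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return lo

# hypothetical constants: alpha = 4, beta = 0.5, tau = 0.25
print(max_decay_rate(4.0, 0.5, 0.25))   # about 2.95
```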
The purpose of this paper is to improve the results in [9, 14, 15, 18–22] via Lemma 1.2 above, which is a special case of the inequality in [25]. We will derive some new sufficient conditions ensuring the global exponential stability of the equilibrium point of the impulsive delay Hopfield neural networks (1.1) and of the BAM-type Cohen-Grossberg neural networks (1.2). The main advantages of the obtained exponential stability conditions include:
(I)
In [9, 14, 15, 18, 22], all of the results require that the impulse time sequence $\{t_k\}$ satisfy $\inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > \tau\delta$, $\delta > 1$. This restriction is not required in our results.
 
(II)
Even for the case $\inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > \tau$, our results can still be applied to cases not covered in [19, 20].
 
In addition, some illustrative examples are given to demonstrate the effectiveness of the obtained results.

2. Global exponential stability analysis for HNNs

In this section, we will give some new sufficient conditions for the global exponential stability of the equilibrium point of the neural network (1.1). The conditions are less restrictive and conservative than those given in [14].
System (1.1) may be rewritten in the following matrix form:
$$\begin{aligned} C\frac{dx(t)}{dt} &= -R^{-1}x(t) + (T + \Gamma^{T}TH)\,f(x(t-\tau(t))),\quad t\ne t_k,\ t\ge t_0,\\ \Delta x(t_k) &= Dx(t_k^-) + (W + \Lambda^{T}\Xi)\,\varphi(x(t_k^--\tau(t_k^-))),\quad k\in\mathbb{Z}_+,\\ x(s) &= \varphi(s),\quad s\in[t_0-\tau,t_0], \end{aligned}$$
(2.1)
Remark 2.1. For detailed information about (2.1), one may see [14].
Theorem 2.1. Assume that conditions (i) and (ii) of Theorem 1 in [14] hold, and
(iii)
there exists a constant η > 0 such that
$$\rho \triangleq \max\bigl\{1,\; a + b\exp\{\lambda\tau\}\bigr\} < \exp\{\lambda\eta\},$$
 
where $\eta = \inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > 0$,
$$a = 2\,\frac{\lambda_{\max}(P)}{\lambda_{\min}(P)}\,\|I + D\|^2,\qquad b = 2\,\frac{\lambda_{\max}(P)}{\lambda_{\min}(P)}\,\max_{1\le i\le n}\{L_i^2\}\,\|W + \Lambda^{T}\Xi\|^2,$$
and λ > 0 satisfies
$$\lambda \le a - b\exp\{\lambda\tau\}.$$
Then the equilibrium point of system (1.1) is globally exponentially stable with the approximate exponential convergence rate $\lambda - \frac{\ln\rho}{\eta}$.
Remark 2.2. For the proof of Theorem 2.1, we need only mention a few points, since the rest is the same as in the proof of Theorem 1 in [14]. First, one may similarly define $V(t) = x^T(t)Px(t)$, and it can be deduced that
$$D^+V(t)\big|_{(2.1)} \le -aV(t) + b\sup_{s\in[t-\tau,t]}V(s),\qquad V(t_k) \le aV(t_k^-) + b\sup_{s\in[t_k-\tau,t_k)}V(s).$$
Then, using Lemma 1.2 of this paper in place of Lemma 1 in [14], Theorem 2.1 can be obtained.
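To see where the convergence rate comes from, note that $\eta = \inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\}$ gives $t_n - t_0 \ge n\eta$, and $\rho \ge 1$ by definition. A short supplementary derivation (ours, under the constant impulse bounds of condition (iii)) then reads:
$$\prod_{k=1}^{n}\max\bigl\{1,\; a + b\exp\{\lambda\tau\}\bigr\} = \rho^{\,n} \le \rho^{\,(t_n-t_0)/\eta} = \exp\Bigl\{\frac{\ln\rho}{\eta}\,(t_n-t_0)\Bigr\},\qquad n\in\mathbb{Z}_+,$$
so condition (ii) of Lemma 1.2 holds with $M = 1$ and with η there replaced by $\frac{\ln\rho}{\eta}$, yielding the decay exponent $\lambda - \frac{\ln\rho}{\eta}$, which is positive precisely when condition (iii) holds.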
Similarly, we can obtain another stability criterion corresponding to Theorem 2 in [14]:
Theorem 2.2. Assume that condition (i) of Theorem 2 in [14] holds, and
(iii)
there exists a constant η > 0 such that
$$\rho \triangleq \max\bigl\{1,\; a^* + b^*\exp\{\lambda\tau\}\bigr\} < \exp\{\lambda\eta\},$$
 
where $\eta = \inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > 0$,
$$a^* = \max_{1\le i\le n}\{|1 + d_i|\},\qquad b^* = \max_{1\le j\le n}\sum_{i=1}^{n}\Bigl(|W_{ij}| + \sum_{l=1}^{n}\bigl(|W_{ijl}| + |W_{ilj}|\bigr)N_l\Bigr)L_j,$$
and λ > 0 satisfies
$$\lambda \le a - b\exp\{\lambda\tau\}.$$
Then the equilibrium point of system (1.1) is globally exponentially stable with the approximate exponential convergence rate $\lambda - \frac{\ln\rho}{\eta}$.
Remark 2.3. In [14], under the assumption that $\inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > \tau\delta$, $\delta > 1$, Liu et al. obtained some theorems on the exponential stability of (1.1). Note that in our Theorems 2.1 and 2.2, we only require $\inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > 0$. Thus, our results improve the previous findings.
Example 2.1. Consider the three-neuron Hopfield neural network (1.1) with $g_1(u_1) = \tanh(0.63u_1)$, $g_2(u_2) = \tanh(0.78u_2)$, $g_3(u_3) = \tanh(0.46u_3)$, $h_1(u_1) = \tanh(0.09u_1)$, $h_2(u_2) = \tanh(0.02u_2)$, $h_3(u_3) = \tanh(0.17u_3)$, $C = \operatorname{diag}(C_1, C_2, C_3) = \operatorname{diag}(0.89, 0.88, 0.53)$, $R = \operatorname{diag}(R_1, R_2, R_3) = \operatorname{diag}(0.16, 0.12, 0.03)$, $D = \operatorname{diag}(d_1, d_2, d_3) = \operatorname{diag}(-0.95, -0.84, -0.99)$, $0 \le \tau_i(t) \le 0.5$, $i = 1, 2, 3$, and
$$T=(T_{ij})_{3\times3}=\begin{bmatrix}0.19&0.35&1.29\\0.31&0.61&0.25\\0.07&0.37&0.44\end{bmatrix},\qquad T_1=(T_{1ij})_{3\times3}=\begin{bmatrix}0.05&0.14&0.28\\0.06&0.05&0.11\\0.24&0.06&0.09\end{bmatrix},$$
$$T_2=(T_{2ij})_{3\times3}=\begin{bmatrix}0.29&0.10&0.35\\0.23&0.14&0.25\\0.05&0.22&0.01\end{bmatrix},\qquad T_3=(T_{3ij})_{3\times3}=\begin{bmatrix}0.23&0.07&0.03\\0.09&0.02&0.19\\0.16&0.01&0.06\end{bmatrix},$$
$$W=(W_{ij})_{3\times3}=\begin{bmatrix}0.04&0.05&0.16\\0.19&0.17&0.02\\0.03&0.13&0.04\end{bmatrix},\qquad W_1=(W_{1ij})_{3\times3}=\begin{bmatrix}0.01&0.01&0.03\\0.08&0.09&0.07\\0.08&0.01&0.01\end{bmatrix},$$
$$W_2=(W_{2ij})_{3\times3}=\begin{bmatrix}0.06&0&0.04\\0.04&0.07&0.07\\0.02&0.06&0.05\end{bmatrix},\qquad W_3=(W_{3ij})_{3\times3}=\begin{bmatrix}0.04&0.04&0.01\\0.02&0.05&0.05\\0.02&0.03&0.02\end{bmatrix}.$$
(2.2)
In this example, similarly to [14], one may choose $P = \operatorname{diag}(0.9, 0.7, 0.8)$, $\varepsilon_1 = 1$, $\varepsilon_2 = 2$ such that $\Omega < 0$ in Theorem 2.1, and then $a = 10.2628 > 2.3814 = b$. Also, we can compute that $\rho = 1$. Thus, by Theorem 2.1, the equilibrium point of (2.2) is globally exponentially stable with the approximate convergence rate λ whenever $\inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > 0$, where λ > 0 satisfies the inequality $\lambda \le 10.2628 - 2.3814e^{0.5\lambda}$.
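The admissible decay rate in Example 2.1 can be located numerically. A minimal sketch, assuming the constants $a = 10.2628$, $b = 2.3814$ and $\tau = 0.5$ quoted above (the variable names are ours):

```python
import math

a, b, tau = 10.2628, 2.3814, 0.5
# bisection for the largest lam with lam = a - b*exp(lam*tau)
lo, hi = 0.0, a
while hi - lo > 1e-10:
    mid = 0.5 * (lo + hi)
    if a - b * math.exp(mid * tau) - mid > 0:
        lo = mid
    else:
        hi = mid
print(lo)   # about 2.39, so any lambda in (0, 2.39] is admissible
# since rho = 1, condition (iii) reads 1 < exp(lambda*eta),
# which holds for every eta = inf(t_k - t_{k-1}) > 0
```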
Remark 2.4. In [14], Liu et al. obtained that the equilibrium point of (2.2) is globally exponentially stable provided that $\inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > 0.505$, which is more restrictive and conservative than our result. Therefore, the result in this paper is applicable under weaker conditions.

3. Global exponential stability analysis for BAM type CGNNs

In this section, we will reconsider the global exponential stability of impulsive BAM type Cohen-Grossberg neural networks (1.2).
Theorem 3.1. Assume that (H1)–(H3) and conditions (i), (ii) of Theorem 2 in [22] hold; moreover, suppose that
(iii)
there exists a constant η > 0 such that
$$M \triangleq \max\Bigl\{1,\; \frac{\bar a}{\underline a}\,r_k^{-1} + \frac{\bar a}{\underline a}\,R_k\exp\{\lambda\tau\}\Bigr\} < \exp\{\lambda\eta\},$$
 
where $\eta = \inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > 0$,
$$r_k = \min_{1\le i\le n,\,1\le j\le m}\Bigl\{1 - \bar a\sum_{j=1}^{m}\bigl|q_{ij}\gamma_{ik}\bigr|L_i^{g},\; 1 - \bar a\sum_{i=1}^{n}\bigl|p_{ji}\alpha_{jk}\bigr|L_j^{f}\Bigr\} > 0,$$
$$R_k = \bar a\,r_k^{-1}\max_{1\le i\le n,\,1\le j\le m}\Bigl\{\sum_{j=1}^{m}\bigl|s_{ij}\lambda_{ik}\bigr|L_i^{g},\; \sum_{i=1}^{n}\bigl|r_{ji}\beta_{jk}\bigr|L_j^{f}\Bigr\},$$
and λ > 0 satisfies
$$\lambda \le k_1 - k_2\exp\{\lambda\tau\}.$$
Then the equilibrium point of system (1.2) is globally exponentially stable with the approximate exponential convergence rate $\lambda - \frac{\ln M}{\eta}$.
Proof. Consider the Lyapunov function
$$V(t) = \sum_{i=1}^{n}\int_{0}^{z_i(t)}\frac{\operatorname{sgn} s}{a_i(s)}\,ds + \sum_{j=1}^{m}\int_{0}^{\tilde z_j(t)}\frac{\operatorname{sgn} s}{a_j(s)}\,ds.$$
Then, similarly to the proof of Theorem 2 in [22], we arrive at
$$D^+V(t)\big|_{(1.2)} \le -k_1V(t) + k_2\sup_{s\in[t-\tau,t]}V(s),\qquad V(t_k) \le \frac{\bar a}{\underline a}\,r_k^{-1}V(t_k^-) + \frac{\bar a}{\underline a}\,R_k\sup_{s\in[t_k-\tau,t_k)}V(s).$$
Then by Lemma 1.2, the result holds. □
Remark 3.1. In [22], Yang et al. obtained a sufficient condition for the global asymptotic stability of (1.2) under the assumption that $\inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} > \tau\delta$, $\delta > 1$, while ours does not impose this restriction.
Example 3.1. Consider the following extended BAM neural networks:
$$\begin{aligned} x'(t) &= -(3+\cos x(t))\Bigl[x(t) - \tfrac{1}{10}\cos t\,\sin y(t)\,u_k - \tfrac{1}{100}\sin t\,\sin y(t-18)\,v_k - \tfrac{\pi}{2}\Bigr],\\ y'(t) &= -(1+\sin y(t))\Bigl[y(t) - \tfrac{1}{10}\sin t\,\cos x(t)\,w_k - \tfrac{1}{100}\cos t\,\cos x(t-16)\,e_k - \pi\Bigr], \end{aligned}$$
(3.1)
where $u_k = w_k = v_k = e_k = 1 + (-1)^k\delta(t - t_k)$, the impulse times $t_k$ satisfy $0 \le t_0 < t_1 < \cdots < t_k < \cdots$, $\lim_{k\to+\infty}t_k = +\infty$, and $\inf_{k\in\mathbb{Z}_+}\{t_k - t_{k-1}\} = 16$. Let τ = 18.
By simple calculation, we obtain $k_1 = \frac{9}{10}$, $k_2 = \frac{1}{25}$, $r_k = \frac{3}{5}$, $R_k = \frac{1}{15}$, and
$$M \triangleq \max\Bigl\{1,\; \frac{\bar a}{\underline a}\,r_k^{-1} + \frac{\bar a}{\underline a}\,R_k\exp\{\lambda\tau\}\Bigr\} = \frac{20}{3} + \frac{4}{15}\exp\{18\lambda\},$$
where λ > 0 satisfies the inequality $\lambda \le \frac{9}{10} - \frac{1}{25}\exp\{18\lambda\}$. We may choose λ = 0.16; then $M \approx 11.511 < 12.932 = \exp\{16\lambda\}$. By Theorem 3.1, the equilibrium point $(\frac{\pi}{2}, \pi)$ of (3.1) is globally exponentially stable with the approximate convergence rate 0.007.
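These figures are easy to check numerically. A minimal sketch (our computation; it reproduces the quoted values up to rounding, giving $M \approx 11.42$ against the quoted 11.511):

```python
import math

k1, k2, tau, eta = 9/10, 1/25, 18.0, 16.0   # constants of Example 3.1
lam = 0.16
assert lam <= k1 - k2 * math.exp(lam * tau)   # 0.16 <= 0.9 - 0.04*e^2.88 ~ 0.187
M = 20/3 + (4/15) * math.exp(lam * tau)       # ~ 11.42 (the paper quotes 11.511)
assert M < math.exp(lam * eta)                # 11.42 < e^2.56 ~ 12.94
print(lam - math.log(M) / eta)                # ~ 0.008; the paper's M gives ~ 0.007
```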
Remark 3.2. It can easily be verified that conditions (iv) and (v) of Theorem 2 in [22] are violated in the above example. Thus, our results improve those in [22].

4. A new inequality

In this section, we shall give a new inequality that is different from Lemma 1.2 and can be applied to cases not covered in [19, 20].
Theorem 4.1. Let $f \in PC(\mathbb{R}, \mathbb{R}_+)$ satisfy the impulsive differential inequality (1.3), and suppose that
(i)
$\alpha > \beta\max_{k\in\mathbb{Z}_+}\bigl\{\frac{1}{a_k + b_k},\,1\bigr\}$;
 
(ii)
$t_k - t_{k-1} > \tau$, and there exist constants $M > 0$, $\gamma \ge 0$ such that
$$\prod_{s=1}^{k}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr) \le M\exp\{\gamma(t_k - t_0)\},\quad k\in\mathbb{Z}_+,$$
 
where λ > 0 satisfies
$$\lambda \le \alpha - \beta\max_{k\in\mathbb{Z}_+}\Bigl\{\frac{1}{a_k + b_k\exp\{\lambda\tau\}},\,1\Bigr\}\exp\{\lambda\tau\}.$$
(4.1)
Then,
$$f(t) \le M\bar f(t_0)\exp\{-(\lambda-\gamma)(t-t_0)\},\quad t\ge t_0,$$
where $\bar f(t_0) = \sup_{s\in[t_0-\tau,t_0]}f(s)$.
Proof. Condition (i) implies that there exists a sufficiently small λ > 0 such that inequality (4.1) holds.
Next, we show
$$f(t) \le \bar f(t_0)\prod_{s=0}^{k}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t-t_0)\},\quad t\in[t_k, t_{k+1}),$$
where $a_0 = 1$, $b_0 = 0$.
It is clear that $f(t) \le \bar f(t_0)$ for $t \in [t_0 - \tau, t_0]$ by the definition of $\bar f$.
Take k = 0; we shall show that, for $t \in [t_0, t_1)$,
$$f(t) \le \bar f(t_0)\exp\{-\lambda(t-t_0)\}.$$
(4.2)
Suppose, on the contrary, that there exists some $t \in [t_0, t_1)$ such that $f(t) > \bar f(t_0)\exp\{-\lambda(t-t_0)\}$.
Let
$$\bar t = \inf\bigl\{t\in[t_0,t_1) : f(t) > W_0(t)\bigr\},\qquad W_0(t) = \bar f(t_0)\exp\{-\lambda(t-t_0)\};$$
then $\bar t \in [t_0, t_1)$ and
(1)
$f(\bar t\,) = W_0(\bar t\,)$;
 
(2)
$f(t) \le W_0(t)$, $t \in [t_0, \bar t\,]$;
 
(3)
$D^+f(\bar t\,) > W_0'(\bar t\,)$.
 
Since $\bar f(t) = \sup_{s\in[t-\tau,t]}f(s)$ and $\bar t \in [t_0, t_1)$, we get
$$\bar f(\bar t\,) \le \bar f(t_0)\exp\{-\lambda(\bar t-\tau-t_0)\}.$$
Hence, we have
$$\begin{aligned} D^+f(\bar t\,) &\le -\alpha f(\bar t\,) + \beta\bar f(\bar t\,) \le -\alpha f(\bar t\,) + \beta\bar f(t_0)\exp\{-\lambda(\bar t-\tau-t_0)\}\\ &= -\alpha W_0(\bar t\,) + \beta W_0(\bar t-\tau) \le -\alpha W_0(\bar t\,) + \beta\max_{k\in\mathbb{Z}_+}\Bigl\{\frac{1}{a_k+b_k\exp\{\lambda\tau\}},\,1\Bigr\}W_0(\bar t-\tau). \end{aligned}$$
Thus, by the definitions of λ and W0, we have
$$\begin{aligned} W_0'(\bar t\,) &= -\lambda\bar f(t_0)\exp\{-\lambda(\bar t-t_0)\}\\ &\ge \Bigl(\beta\max_{k\in\mathbb{Z}_+}\Bigl\{\frac{1}{a_k+b_k\exp\{\lambda\tau\}},\,1\Bigr\}\exp\{\lambda\tau\} - \alpha\Bigr)\bar f(t_0)\exp\{-\lambda(\bar t-t_0)\}\\ &= -\alpha W_0(\bar t\,) + \beta\max_{k\in\mathbb{Z}_+}\Bigl\{\frac{1}{a_k+b_k\exp\{\lambda\tau\}},\,1\Bigr\}\bar f(t_0)\exp\{-\lambda(\bar t-\tau-t_0)\}\\ &= -\alpha W_0(\bar t\,) + \beta\max_{k\in\mathbb{Z}_+}\Bigl\{\frac{1}{a_k+b_k\exp\{\lambda\tau\}},\,1\Bigr\}W_0(\bar t-\tau) \ge D^+f(\bar t\,), \end{aligned}$$
which contradicts (3). So (4.2) holds for all $t \in [t_0, t_1)$.
Now, we assume that, for $t \in [t_{m-1}, t_m)$, $m \in \mathbb{Z}_+$,
$$f(t) \le \bar f(t_0)\prod_{s=0}^{m-1}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t-t_0)\}.$$
(4.3)
We shall show that, for $t \in [t_m, t_{m+1})$, $m \in \mathbb{Z}_+$,
$$f(t) \le \bar f(t_0)\prod_{s=0}^{m}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t-t_0)\}.$$
(4.4)
By (4.3) and the fact that $t_m - t_{m-1} > \tau$, we know
$$\bar f(t_m^-) \le \bar f(t_0)\prod_{s=0}^{m-1}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t_m-\tau-t_0)\}.$$
Hence,
$$\begin{aligned} f(t_m) &\le a_mf(t_m^-) + b_m\bar f(t_m^-)\\ &\le a_m\bar f(t_0)\prod_{s=0}^{m-1}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t_m-t_0)\} + b_m\bar f(t_0)\prod_{s=0}^{m-1}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t_m-\tau-t_0)\}\\ &= \bigl(a_m+b_m\exp\{\lambda\tau\}\bigr)\bar f(t_0)\prod_{s=0}^{m-1}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t_m-t_0)\}\\ &= \bar f(t_0)\prod_{s=0}^{m}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t_m-t_0)\}. \end{aligned}$$
(4.5)
If (4.4) is not true, then there exists some $t \in [t_m, t_{m+1})$ such that
$$f(t) > \bar f(t_0)\prod_{s=0}^{m}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t-t_0)\}.$$
By (4.5), we define
$$t^* = \inf\bigl\{t\in[t_m,t_{m+1}) : f(t) > W_m(t)\bigr\},$$
where
$$W_m(t) = \bar f(t_0)\prod_{s=0}^{m}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t-t_0)\};$$
then $t^* \in [t_m, t_{m+1})$ and
(4)
$f(t^*) = W_m(t^*)$;
 
(5)
$f(t) \le W_m(t)$, $t \in [t_m, t^*]$;
 
(6)
$D^+f(t^*) > W_m'(t^*)$.
 
Since $\bar f(t) = \sup_{s\in[t-\tau,t]}f(s)$ and $t^* \in [t_m, t_{m+1})$, we get
$$\bar f(t^*) \le W_m(t^*-\tau)\max\Bigl\{1,\ \frac{1}{a_m+b_m\exp\{\lambda\tau\}}\Bigr\}.$$
In fact, when $t^* - \tau \ge t_m$, from (5) we have
$$\bar f(t^*) \le \bar f(t_0)\prod_{s=0}^{m}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t^*-\tau-t_0)\} = W_m(t^*-\tau) \le W_m(t^*-\tau)\max\Bigl\{1,\ \frac{1}{a_m+b_m\exp\{\lambda\tau\}}\Bigr\}.$$
When $t^* - \tau < t_m$, noting that $t_k - t_{k-1} > \tau$, we have
$$\begin{aligned} \bar f(t^*) &\le \max\Bigl\{\bar f(t_0)\prod_{s=0}^{m}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t_m-t_0)\},\ \bar f(t_0)\prod_{s=0}^{m-1}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t^*-\tau-t_0)\}\Bigr\}\\ &\le \max\Bigl\{\bar f(t_0)\prod_{s=0}^{m}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t^*-\tau-t_0)\},\ \bar f(t_0)\prod_{s=0}^{m-1}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t^*-\tau-t_0)\}\Bigr\}\\ &\le \bar f(t_0)\prod_{s=0}^{m}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t^*-\tau-t_0)\}\max\Bigl\{1,\ \frac{1}{a_m+b_m\exp\{\lambda\tau\}}\Bigr\}\\ &= W_m(t^*-\tau)\max\Bigl\{1,\ \frac{1}{a_m+b_m\exp\{\lambda\tau\}}\Bigr\}. \end{aligned}$$
This, together with (4), leads to
$$D^+f(t^*) \le -\alpha f(t^*) + \beta\bar f(t^*) \le -\alpha W_m(t^*) + \beta W_m(t^*-\tau)\max\Bigl\{1,\ \frac{1}{a_m+b_m\exp\{\lambda\tau\}}\Bigr\}.$$
Hence, we obtain
$$\begin{aligned} W_m'(t^*) &= -\lambda\bar f(t_0)\prod_{s=0}^{m}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t^*-t_0)\}\\ &\ge \Bigl(\beta\max_{k\in\mathbb{Z}_+}\Bigl\{\frac{1}{a_k+b_k\exp\{\lambda\tau\}},\,1\Bigr\}\exp\{\lambda\tau\} - \alpha\Bigr)\bar f(t_0)\prod_{s=0}^{m}\bigl(a_s+b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t^*-t_0)\}\\ &\ge -\alpha W_m(t^*) + \beta\max\Bigl\{1,\ \frac{1}{a_m+b_m\exp\{\lambda\tau\}}\Bigr\}W_m(t^*-\tau) \ge D^+f(t^*), \end{aligned}$$
which contradicts (6). Hence, (4.4) holds for all $t \in [t_m, t_{m+1})$, $m \in \mathbb{Z}_+$. Thus, by induction, we get, for $t \in [t_k, t_{k+1})$,
$$f(t) \le \bar f(t_0)\prod_{s=0}^{k}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr)\exp\{-\lambda(t-t_0)\},\quad k\in\mathbb{Z}_+.$$
By condition (ii), we have
$$f(t) \le M\bar f(t_0)\exp\{-(\lambda-\gamma)(t-t_0)\},\quad t\ge t_0,$$
where λ satisfies (4.1). The proof of Theorem 4.1 is therefore completed. □
Remark 4.1. If there exists a constant M > 0 such that $\prod_{s=1}^{k}\bigl(a_s + b_s\exp\{\lambda\tau\}\bigr) \le M$ holds for all $k \in \mathbb{Z}_+$, then we can choose γ = 0 in Theorem 4.1.
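Remark 4.1 covers, for instance, impulse sequences in which expansive and contractive jumps alternate, so that the partial products stay bounded. A toy check with hypothetical gains $a_{2n-1} = e^2$, $a_{2n} = e^{-2}$, $b_k = 0$ (all values are our illustration, not from the paper):

```python
import math

lam, tau = 0.1, 0.2               # hypothetical exponent and delay
prod, M = 1.0, 1.0
for k in range(1, 41):
    a_k = math.e**2 if k % 2 == 1 else math.e**-2   # alternating impulse gains
    b_k = 0.0                                       # no delayed impulse term
    prod *= a_k + b_k * math.exp(lam * tau)
    M = max(M, prod)
print(M)   # partial products oscillate between e^2 and 1, so M = e^2 and gamma = 0 works
```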
If we let $b_k = 0$, $k \in \mathbb{Z}_+$, in Theorem 4.1, then we obtain the following result.
Corollary 4.1. Suppose that
(iii)
$\alpha > \beta\max_{k\in\mathbb{Z}_+}\bigl\{\frac{1}{a_k},\,1\bigr\}$;
 
(iv)
$t_k - t_{k-1} > \tau$, and there exist constants $M > 0$, $\gamma \ge 0$ such that
$$\prod_{s=1}^{k}a_s \le M\exp\{\gamma(t_k - t_0)\},\quad k\in\mathbb{Z}_+,$$
 
where λ > 0 satisfies
$$\lambda \le \alpha - \beta\max_{k\in\mathbb{Z}_+}\Bigl\{\frac{1}{a_k},\,1\Bigr\}\exp\{\lambda\tau\}.$$
Then,
$$f(t) \le M\bar f(t_0)\exp\{-(\lambda-\gamma)(t-t_0)\},\quad t\ge t_0.$$
In the following, the superiority of the present approach over [19, 20] will be demonstrated by an example. The main tool used for studying the neural networks in [19, 20] is the following lemma:
Lemma 4.1. Suppose that α > β ≥ 0 and f(t) satisfies the scalar impulsive differential inequality
$$D^+f(t) \le -\alpha f(t) + \beta\bar f(t),\quad t\ne t_k,\qquad f(t_k) \le a_kf(t_k^-),\quad k\in\mathbb{Z}_+,$$
where
$$f(t) \ge 0,\qquad \bar f(t) = \sup_{s\in[t-\tau,t]}f(s),\qquad \bar f(t^-) = \sup_{s\in[t-\tau,t)}f(s),$$
and f(t) is continuous except at each t k , k ∈ ℤ+, where it has jump discontinuities. The sequence {t k } satisfies 0 ≤ t0 < t1 < ⋯ < t k < ⋯, limk→ +∞t k = +∞.
Then,
$$f(t) \le \bar f(t_0)\prod_{t_0\le t_k\le t}\max\{1, a_k\}\exp\{-\lambda(t-t_0)\},\quad k\in\mathbb{Z}_+,$$
(4.6)
where
$$\lambda \le \alpha - \beta\exp\{\lambda\tau\}.$$
Consider a particular network of two neurons as follows:
$$\begin{aligned} x'(t) &= -4.6x(t) + 0.6\sin x(t) - 0.5\sin y(t-\tau_1),\quad t\ne t_k,\\ y'(t) &= -5y(t) + 0.4\cos y(t) - 0.4\cos x(t-\tau_2),\quad t\ne t_k,\\ x(t_k) &= \beta_kx(t_k^-),\qquad y(t_k) = \gamma_ky(t_k^-),\quad k\in\mathbb{Z}_+, \end{aligned}$$
(4.7)
where t k - tk-1= 0.25, t0 = 0, k ∈ ℤ+, τ i ∈ (0, 0.25), i = 1, 2 and
$$\beta_k = \begin{cases} 6, & k = 2n-1,\\ e^{-2}, & k = 2n, \end{cases}\qquad \gamma_k = \begin{cases} e^{2}, & k = 2n-1,\\ \frac{1}{9}, & k = 2n, \end{cases}\qquad n\in\mathbb{Z}_+.$$
Let τ = max {τ1, τ2}, then τ ∈ (0, 0.25).
Choose V(t) = |x(t)| + |y(t)|; then
$$\begin{aligned} D^+V(t)\big|_{(4.7)} &\le -4.6|x(t)| + 0.6|\sin x(t)| + 0.5|\sin y(t-\tau_1)| - 5|y(t)| + 0.4\bigl|\cos y(t) - \cos x(t-\tau_2)\bigr|\\ &\le -4|x(t)| + 0.5|y(t-\tau_1)| - 4.6|y(t)| + 0.4|x(t-\tau_2)|\\ &\le -4\bigl[|x(t)| + |y(t)|\bigr] + 0.5\bigl[|y(t-\tau_1)| + |x(t-\tau_2)|\bigr]\\ &\le -4V(t) + 0.5\tilde V(t), \end{aligned}$$
where $\tilde V(t) = \sup_{t-\tau\le s\le t}V(s)$.
Moreover,
$$V(t_k) = |x(t_k)| + |y(t_k)| \le \max\{\beta_k, \gamma_k\}\bigl[|x(t_k^-)| + |y(t_k^-)|\bigr] = \max\{\beta_k, \gamma_k\}V(t_k^-),$$
where
$$\max\{\beta_k, \gamma_k\} = \begin{cases} e^{2}, & k = 2n-1,\\ e^{-2}, & k = 2n, \end{cases}\qquad n\in\mathbb{Z}_+.$$
Choosing $M = e^2$ and γ = 0 in Corollary 4.1 (the partial products $\prod_{s=1}^{k}\max\{\beta_s, \gamma_s\}$ alternate between $e^2$ and 1), we get
$$|x(t)| + |y(t)| = V(t) \le e^{2}\tilde V(t_0)\exp\{-\lambda(t-t_0)\},$$
(4.8)
where λ > 0 satisfies $\lambda \le 4 - 0.5e^{2}\exp\{\lambda\tau\}$. Hence, the equilibrium point (0, 0) of (4.7) is globally exponentially stable with the approximate convergence rate λ.
On the other hand, we point out that the estimate (4.6) is not feasible here.
In fact, by using inequality (4.6), we get, for $t \in [t_k, t_{k+1})$,
$$|x(t)| + |y(t)| = V(t) \le \tilde V(t_0)\,(e^{2})^{\frac{k+1}{2}}\exp\{-\lambda(t-t_0)\},$$
and the right-hand side of this estimate satisfies
$$\tilde V(t_0)\,(e^{2})^{\frac{k+1}{2}}\exp\{-\lambda(t-t_0)\} \ge \tilde V(t_0)\,e^{k+1}e^{-\frac{\lambda(k+1)}{4}} = \tilde V(t_0)\bigl(e^{1-\frac{\lambda}{4}}\bigr)^{k+1} \to +\infty\quad\text{as } t\to+\infty,$$
since λ > 0 only satisfies $\lambda \le 4 - 0.5\exp\{\lambda\tau\}$, so that λ < 4 and $1 - \frac{\lambda}{4} > 0$. This makes it impossible to obtain an estimate of the form (4.8) from (4.6). Therefore, our method is, to some degree, less conservative than that in [19, 20].
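The contrast between (4.8) and (4.6) can also be made concrete numerically. A sketch taking τ = 0.2 (our choice within (0, 0.25); any value there behaves the same), where `lam_new` solves $\lambda = 4 - 0.5e^2e^{\lambda\tau}$ for Corollary 4.1 and `lam_old` solves $\lambda = 4 - 0.5e^{\lambda\tau}$ for Lemma 4.1:

```python
import math

def solve_lambda(alpha, beta, tau, tol=1e-12):
    # largest lam with lam = alpha - beta*exp(lam*tau), by bisection
    lo, hi = 0.0, alpha
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if alpha - beta * math.exp(mid * tau) - mid > 0:
            lo = mid
        else:
            hi = mid
    return lo

tau = 0.2
lam_new = solve_lambda(4.0, 0.5 * math.e**2, tau)   # Corollary 4.1: ~ 0.17
lam_old = solve_lambda(4.0, 0.5, tau)               # Lemma 4.1:     ~ 3.08

for k in (4, 40, 400):            # evaluate both bounds at t = t_k = k/4, per unit of V~(t0)
    bound_new = math.e**2 * math.exp(-lam_new * k / 4)          # from (4.8)
    bound_old = math.e**(k + 1) * math.exp(-lam_old * k / 4)    # from (4.6)
    print(k, bound_new, bound_old)
# bound_new decays to 0, while bound_old grows like (e^{1 - lam_old/4})^k -> infinity
```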
Open Access. This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Competing interests

The author declares that he has no competing interests.

Authors' contributions

The study and the manuscript of this paper were completed by Jiayu Wang independently.
References
1. Newman M: The structure and function of complex networks. SIAM Rev 2003, 45: 167–256.
2. Gopalsamy K, He X: Stability in asymmetric Hopfield nets with transmission delays. Phys D 1994, 76: 344–358.
3. Cao J, Chen A, Huang X: Almost periodic attractor of delayed neural networks with variable coefficients. Phys Lett A 2005, 340: 104–120.
4. Gui Z, Ge W: Existence and uniqueness of periodic solutions of nonautonomous cellular neural networks with impulses. Phys Lett A 2006, 354: 84–94.
5. Lou X, Cui B: Novel global stability criteria for high-order Hopfield-type neural networks with time-varying delays. J Math Anal Appl 2007, 330: 144–158.
6. Zhang Q, Wei X, Xu J: Delay-dependent global stability condition for delayed Hopfield neural networks. Nonlinear Anal 2007, 8: 997–1002.
7. Liu B: Almost periodic solutions for Hopfield neural networks with continuously distributed delays. Math Comput Simul 2007, 73: 327–335.
8. Liu B, Huang L: Existence and exponential stability of almost periodic solutions for Hopfield neural networks with delays. Neurocomputing 2005, 68: 196–207.
9. Chen Z, Ruan J: Global stability analysis of impulsive Cohen-Grossberg neural networks with delay. Phys Lett A 2005, 345: 101–111.
10. Arik S, Tavsanoglu V: On the global asymptotic stability of delayed cellular neural networks. IEEE Trans Circuits Syst I 2000, 47(4): 571–574.
11. Arik S, Tavsanoglu V: Global asymptotic stability analysis of bidirectional associative memory neural networks with constant time delays. Neurocomputing 2005, 68: 161–176.
12. Feng C, Plamondon R: Stability analysis of bidirectional associative memory networks with time delays. IEEE Trans Neural Netw 2003, 14: 1560–1565.
13. Zhang Y, Sun J: Stability of impulsive neural networks with time delays. Phys Lett A 2005, 348: 44–50.
14. Liu X, Teo K, Xu B: Exponential stability of impulsive high-order Hopfield-type neural networks with time-varying delays. IEEE Trans Neural Netw 2005, 16: 1329–1339.
15. Qiu J: Exponential stability of impulsive neural networks with time-varying delays and reaction-diffusion terms. Neurocomputing 2007, 70: 1102–1108.
16. Yang Z, Xu D: Global exponential stability of Hopfield neural networks with variable delays and impulsive effects. Appl Math Mech 2006, 27(11): 1517–1522.
17. Akca H, et al: Continuous-time additive Hopfield-type neural networks with impulses. J Math Anal Appl 2004, 290: 436–451.
18. Yue D, Xu S, Liu Y: Differential inequality with delay and impulse and its applications to design robust control. Control Theory Appl 1999, 16(4): 519–524.
19. Zhou J, Xiang L, Liu Z: Synchronization in complex delayed dynamical networks with impulsive effects. Physica A 2007, 384: 684–692.
20. Yang Z, Xu D: Stability analysis of delay neural networks with impulsive effects. IEEE Trans Circuits Syst I 2005, 52(1): 517–521.
21. Xu D, Yang Z: Impulsive delay differential inequality and stability of neural networks. J Math Anal Appl 2005, 305: 107–120.
22. Yang F, Zhang C, Wu D: Global stability analysis of impulsive BAM type Cohen-Grossberg neural networks with delays. Appl Math Comput 2007, 186: 932–940.
23. Cao J, Wang J: Global asymptotic stability of a general class of recurrent neural networks with time-varying delays. IEEE Trans Circuits Syst I 2003, 50: 34–44.
24. Guo S, Huang L: Stability analysis of a delayed Hopfield neural network. Phys Rev E 2003, 67: 119–122.
25. Li X: Existence and global exponential stability of periodic solution for impulsive Cohen-Grossberg-type BAM neural networks with continuously distributed delays. Appl Math Comput 2009, 215: 292–307.
26. Pan L, Cao J: Oscillations of even order linear impulsive delay differential equations. Differ Equ Appl 2010, 2: 163–176.
27. Ho DWC, Liang J, Lam J: Global exponential stability of impulsive high-order BAM neural networks with time-varying delays. Neural Netw 2006, 19: 1581–1590.
28. Lu J, Ho DWC, Cao J: A unified synchronization criterion for impulsive dynamical networks. Automatica 2010, 46: 1215–1221.