Stability criteria for BAM neural networks with leakage delays and probabilistic time-varying delays

https://doi.org/10.1016/j.amc.2013.03.070

Abstract

This paper is concerned with stability criteria for bidirectional associative memory (BAM) neural networks with leakage time delay and probabilistic time-varying delays. By introducing a stochastic variable with a Bernoulli distribution, the information on the probabilistic time-varying delay is transformed into a deterministic time-varying delay with stochastic parameters. Based on a Lyapunov–Krasovskii functional and a stochastic analysis approach, delay-probability-distribution-dependent sufficient conditions are derived which guarantee that the considered BAM neural networks are globally asymptotically stable in the mean square. The criteria are formulated in terms of a set of linear matrix inequalities (LMIs), which can be checked efficiently by standard numerical packages. Finally, a numerical example and its simulations are given to demonstrate the usefulness and effectiveness of the proposed results.

Introduction

Over the past few decades, the dynamical behavior of neural networks has been studied extensively in science and technology areas such as signal processing, parallel computing, and optimization problems [1], [2]. This has attracted many researchers, including mathematicians, physicists, computer scientists, and biologists. In this regard, Hopfield [3] modeled continuous-time dynamical neural networks containing n dynamic neural units (DNUs) through the implementation of an analog RC (resistance-capacitance) network circuit. Further, Marcus and Westervelt [4] introduced a time delay into the above model. This contributed to increased attention on the stability analysis of various kinds of neural network models such as Hopfield neural networks, cellular neural networks, Cohen–Grossberg neural networks, and BAM neural networks [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19]. It is well known that the BAM is a type of recurrent neural network introduced by Kosko in 1988 [20], who generalized the single auto-associative Hebbian correlator to a two-layer pattern-matched hetero-associative circuit. Recently, BAM neural networks have attracted the attention of many researchers [21], [22], [23], [24], [25], [26], [27], [28], [29] because of their wide application in pattern recognition, automatic control, associative memory, and image processing.

Time delays, which cause poor performance and instability in dynamic systems, are commonly encountered in various physical, engineering, and neural systems. Many existing works investigate neural networks with deterministic time delays, while more recent literature is concerned with neural networks with stochastic time delays. For example, some values of the delay may be very large while the probability of their occurrence is very small; in such cases, considering only the variation range of the time delay does not lead to less conservative results. Thus, the stability analysis of dynamic neural networks with random time delays deserves much attention and has been studied in recent years [30], [31], [32], [33]. The authors in [30] addressed the problem of delay-distribution-dependent state estimation for discrete-time stochastic neural networks with random delay. Similarly, the delay-distribution-dependent stability of stochastic discrete-time neural networks with randomly mixed time-varying delays was studied in [31].
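To make this modeling idea concrete, the following minimal Python sketch samples a delay that falls in a small interval [0, τ1] with probability α0 and in (τ1, τ2] otherwise, together with the Bernoulli indicator α(t) whose expectation equals α0; this is the device that lets a probabilistic delay be rewritten as deterministic delays with a stochastic parameter. The delay bounds and probability below are illustrative assumptions, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative assumptions (not from the paper): delay bounds and probability.
    tau1, tau2 = 0.3, 1.0   # "small" delays lie in [0, tau1], "large" ones in (tau1, tau2]
    alpha0 = 0.8            # Pr{ delay falls in [0, tau1] }

    def sample_delay():
        """Sample one realisation of the probabilistic delay tau(t) and the
        Bernoulli indicator alpha(t) with E[alpha(t)] = alpha0."""
        alpha_t = rng.random() < alpha0          # Bernoulli(alpha0) indicator
        if alpha_t:
            tau_t = rng.uniform(0.0, tau1)       # delay in the small interval
        else:
            tau_t = rng.uniform(tau1, tau2)      # delay in the large interval
        return float(alpha_t), tau_t

    # Empirically, the mean of alpha(t) approaches alpha0, which is what allows
    # the probabilistic delay to be treated as a deterministic delay with a
    # stochastic (Bernoulli) parameter.
    samples = [sample_delay() for _ in range(10000)]
    print("empirical alpha0:", np.mean([a for a, _ in samples]))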

On the other hand, a typical time delay known as leakage (or “forgetting”) delay may exist in the negative feedback terms of a neural network system, and it has a great impact on the dynamic behavior of delayed neural networks. Therefore, the leakage delay in dynamical neural networks is considered an important topic in stability analysis [37], [38], [39], [40], [41], [42], [43]. In this perspective, Gopalsamy [35] investigated the stability of BAM neural networks with a constant delay in the leakage term. Further, Peng [36] addressed BAM neural networks with continuously distributed delays in the leakage terms and derived conditions for the existence and global attractivity of periodic solutions via the Lyapunov functional approach. Following this work, the authors of [37], [38] studied the stability of BAM neural networks with fuzzy and impulsive effects. Likewise, other researchers have investigated the stability of neural networks and nonlinear systems using the LMI technique, Lyapunov–Krasovskii functionals, and free matrix inequalities in [39], [40], [41], [42], [43]. To the best of our knowledge, the stability of BAM neural networks with time delays in the leakage term and probabilistic time-varying delays has not been investigated until now.

Motivated by the above discussion, the main objective of this paper is to propose stability criteria for BAM neural networks with leakage delay and probabilistic time-varying delays by using a combination of a Lyapunov–Krasovskii functional with triple integral terms, stochastic stability theory, Jensen’s inequality, and free-weighting matrices. All the criteria are expressed in terms of LMIs. Finally, a numerical example is given to show the effectiveness and significance of the proposed criteria.
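For orientation, the integral form of Jensen’s inequality used in such derivations (a standard result, stated here from general knowledge rather than quoted from the paper) is, for any matrix $R = R^{T} > 0$, scalars $a < b$, and an integrable vector function $\omega : [a,b] \to \mathbb{R}^{n}$,
\[
(b-a)\int_{a}^{b} \omega^{T}(s)\,R\,\omega(s)\,ds \;\ge\; \Bigl(\int_{a}^{b}\omega(s)\,ds\Bigr)^{T} R \,\Bigl(\int_{a}^{b}\omega(s)\,ds\Bigr).
\]
Bounding the integral terms of a Lyapunov–Krasovskii functional in this way is a standard route to conditions that can be cast as LMIs.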

Notations: $\mathbb{R}^{n}$ and $\mathbb{R}^{n\times n}$ denote the $n$-dimensional Euclidean space and the set of all $n\times n$ real matrices, respectively. The superscript $T$ denotes matrix transposition, and the notation $X \ge Y$ (respectively, $X > Y$), where $X$ and $Y$ are symmetric matrices, means that $X - Y$ is positive semi-definite (respectively, positive definite). $\|\cdot\|$ is the Euclidean norm in $\mathbb{R}^{n}$ and $\Lambda = \{1, 2, \ldots, n\}$. $\Pr\{\alpha\}$ means the occurrence probability of the event $\alpha$. $\mathbb{E}\{x\}$ and $\mathbb{E}\{x\,|\,y\}$, respectively, mean the expectation of the stochastic variable $x$ and the expectation of $x$ conditional on the stochastic variable $y$. $\mathrm{diag}\{\cdots\}$ stands for a block diagonal matrix. The notation $*$ always denotes the symmetric block in a symmetric matrix. $\lambda_{\min}(\cdot)$ and $\lambda_{\max}(\cdot)$ denote the minimum and maximum eigenvalues of a given matrix.

Section snippets

Problem description and preliminaries

The delayed BAM neural networks can be described as follows:
\[
\begin{aligned}
\dot{u}_i(t) &= -a_i u_i(t-\rho_1) + \sum_{j=1}^{m} b_{ij}^{1}\,\tilde{f}_j(v_j(t)) + \sum_{j=1}^{m} b_{ij}^{2}\,\tilde{f}_j(v_j(t-\tau(t))) + I_i, \quad i=1,\ldots,n,\\
\dot{v}_j(t) &= -c_j v_j(t-\rho_2) + \sum_{i=1}^{n} d_{ij}^{1}\,\tilde{g}_i(u_i(t)) + \sum_{i=1}^{n} d_{ij}^{2}\,\tilde{g}_i(u_i(t-\sigma(t))) + J_j, \quad j=1,\ldots,m,
\end{aligned}
\]
or, rewritten in the following vector–matrix form,
\[
\begin{aligned}
\dot{u}(t) &= -A u(t-\rho_1) + B_1 \tilde{f}(v(t)) + B_2 \tilde{f}(v(t-\tau(t))) + I,\\
\dot{v}(t) &= -C v(t-\rho_2) + D_1 \tilde{g}(u(t)) + D_2 \tilde{g}(u(t-\sigma(t))) + J,
\end{aligned}
\]
where $u(t) = [u_1(t), u_2(t), \ldots, u_n(t)]^{T} \in \mathbb{R}^{n}$ and $v(t) = [v_1(t), v_2(t), \ldots, v_m(t)]^{T} \in \mathbb{R}^{m}$ are the neuron state vectors, and $A = \mathrm{diag}\{a_1, \ldots, a_n\} > 0$ and $C = \mathrm{diag}\{c_1, \ldots, c_m\} > 0$ are diagonal matrices.
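As a quick sanity check on the two equivalent descriptions, the short Python sketch below evaluates the componentwise right-hand side of the first equation and the corresponding vector–matrix form and confirms they coincide; the dimensions and parameter values are arbitrary placeholders, not the paper’s.

    import numpy as np

    # Check that the componentwise model and its vector-matrix form agree.
    # All numerical values here are arbitrary placeholders, not from the paper.
    rng = np.random.default_rng(1)
    n, m = 3, 2
    A  = np.diag(rng.uniform(1, 2, n));  C  = np.diag(rng.uniform(1, 2, m))
    B1, B2 = rng.standard_normal((n, m)), rng.standard_normal((n, m))
    I_ext = rng.standard_normal(n)
    f = np.tanh

    # Delayed and current state samples (placeholders for u(t-rho1), v(t), v(t-tau(t))).
    u_rho1 = rng.standard_normal(n)
    v_now, v_tau = rng.standard_normal(m), rng.standard_normal(m)

    # Componentwise right-hand side for du_i/dt, i = 1,...,n.
    du_comp = np.array([-A[i, i] * u_rho1[i]
                        + sum(B1[i, j] * f(v_now[j]) for j in range(m))
                        + sum(B2[i, j] * f(v_tau[j]) for j in range(m))
                        + I_ext[i] for i in range(n)])

    # Vector-matrix right-hand side.
    du_vec = -A @ u_rho1 + B1 @ f(v_now) + B2 @ f(v_tau) + I_ext
    print("forms agree:", np.allclose(du_comp, du_vec))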

Main results

Using a simple transformation, model (11) has an equivalent form as follows:
\[
\begin{aligned}
\frac{d}{dt}\Bigl[x(t) - A\!\int_{t-\rho_1}^{t} x(s)\,ds\Bigr]
&= -A x(t) + B_1 f(y(t)) + \alpha_0 B_2 f(y(t-\tau_1(t))) + (1-\alpha_0) B_2 f(y(t-\tau_2(t)))\\
&\quad + (\alpha(t)-\alpha_0)\bigl[B_2 f(y(t-\tau_1(t))) - B_2 f(y(t-\tau_2(t)))\bigr],\\
\frac{d}{dt}\Bigl[y(t) - C\!\int_{t-\rho_2}^{t} y(s)\,ds\Bigr]
&= -C y(t) + D_1 g(x(t)) + \beta_0 D_2 g(x(t-\sigma_1(t))) + (1-\beta_0) D_2 g(x(t-\sigma_2(t)))\\
&\quad + (\beta(t)-\beta_0)\bigl[D_2 g(x(t-\sigma_1(t))) - D_2 g(x(t-\sigma_2(t)))\bigr].
\end{aligned}
\]
For convenience of representation, we introduce the following notations:
\[
\begin{aligned}
L_1 &= \mathrm{diag}\{l_1^{-} l_1^{+},\, l_2^{-} l_2^{+},\, \ldots,\, l_m^{-} l_m^{+}\},\qquad
L_2 = \mathrm{diag}\Bigl\{\tfrac{l_1^{-} + l_1^{+}}{2},\, \tfrac{l_2^{-} + l_2^{+}}{2},\, \ldots,\, \tfrac{l_m^{-} + l_m^{+}}{2}\Bigr\},\\
L^{-} &= \mathrm{diag}\{l_1^{-}, l_2^{-}, \ldots, l_m^{-}\},\qquad
L^{+} = \mathrm{diag}\{l_1^{+}, l_2^{+}, \ldots, l_m^{+}\}.
\end{aligned}
\]
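The LMI conditions themselves are not reproduced in this excerpt, but the kind of numerical check they require can be sketched with a much simpler Lyapunov-inequality stand-in (assuming numpy and scipy are available; this is not the paper’s delay-probability-distribution-dependent criterion): find $P = P^{T} > 0$ with $A^{T}P + PA < 0$ for a stable test matrix and verify both matrix inequalities via eigenvalues.

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    # Stand-in for an LMI feasibility check (NOT the paper's LMI): for a stable
    # test matrix A_test, find P = P^T > 0 with A_test^T P + P A_test < 0 and
    # verify both matrix inequalities numerically via eigenvalues.
    A_test = np.array([[-2.0, 0.5],
                       [0.1, -1.5]])
    Q = np.eye(2)

    # Solve A_test^T P + P A_test = -Q for P.
    P = solve_continuous_lyapunov(A_test.T, -Q)

    lyap = A_test.T @ P + P @ A_test
    print("P > 0 :", np.linalg.eigvalsh(P).min() > 0)
    print("A^T P + P A < 0 :", np.linalg.eigvalsh(lyap).max() < 0)

In practice, the LMIs derived in the main theorems would be handed to a semidefinite-programming or LMI toolbox in the same spirit, with the decision matrices as variables.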

Numerical example

In this section, a numerical example is provided along with simulation results to illustrate the potential benefits and effectiveness of the developed method for BAM neural networks.

Consider a third-order delayed BAM neural network (11) or (12) with the following parameters:
\[
A = \begin{bmatrix} 2.8 & 0 & 0\\ 0 & 3.1 & 0\\ 0 & 0 & 3.2 \end{bmatrix},\quad
B_1 = \begin{bmatrix} 0.9 & 0 & -1.2\\ -0.7 & 0 & 1\\ 0 & 0.2 & 1.3 \end{bmatrix},\quad
B_2 = \begin{bmatrix} 1 & 0 & 0.2\\ -1.2 & 0 & 0.4\\ 0.5 & -0.2 & 0 \end{bmatrix},
\]
\[
C = \begin{bmatrix} 2.6 & 0 & 0\\ 0 & 2.5 & 0\\ 0 & 0 & 2.6 \end{bmatrix},\quad
D_1 = \begin{bmatrix} 0.4 & -0.8 & 0\\ -0.5 & 0.4 & 0.8\\ 1 & 0 & 0.4 \end{bmatrix},\quad
D_2 = \begin{bmatrix} 0.3 & 0 & 0\\ 0 & 0.3 & 0\\ 0 & 0 & 0.3 \end{bmatrix},
\]
and the activation functions are taken as $f(y(t)) = \tanh(y(t))$, $g(x(t)) = \tanh(x(t))$. Further,
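Using the matrices above, a rough forward simulation of this example can be sketched as follows; the delay values, step size, and initial history are arbitrary assumptions because they are not included in this excerpt, and the probabilistic delays are held constant for simplicity.

    import numpy as np

    # Parameters of the third-order example; delays, step size, and history are assumptions.
    A  = np.diag([2.8, 3.1, 3.2])
    B1 = np.array([[0.9, 0.0, -1.2], [-0.7, 0.0, 1.0], [0.0, 0.2, 1.3]])
    B2 = np.array([[1.0, 0.0, 0.2], [-1.2, 0.0, 0.4], [0.5, -0.2, 0.0]])
    C  = np.diag([2.6, 2.5, 2.6])
    D1 = np.array([[0.4, -0.8, 0.0], [-0.5, 0.4, 0.8], [1.0, 0.0, 0.4]])
    D2 = np.diag([0.3, 0.3, 0.3])

    dt, T = 0.001, 20.0
    rho1 = rho2 = 0.1            # assumed leakage delays
    tau = sigma = 0.5            # assumed (constant) transmission delays
    d = int(round(max(rho1, rho2, tau, sigma) / dt))
    N = int(round(T / dt)) + d

    x = np.tile(np.array([0.6, -0.4, 0.3]), (N, 1))   # assumed constant initial history
    y = np.tile(np.array([-0.5, 0.2, -0.1]), (N, 1))

    # Fixed-step Euler integration with delay look-back into the state history.
    for k in range(d, N - 1):
        dx = (-A @ x[k - int(round(rho1 / dt))] + B1 @ np.tanh(y[k])
              + B2 @ np.tanh(y[k - int(round(tau / dt))]))
        dy = (-C @ y[k - int(round(rho2 / dt))] + D1 @ np.tanh(x[k])
              + D2 @ np.tanh(x[k - int(round(sigma / dt))]))
        x[k + 1] = x[k] + dt * dx
        y[k + 1] = y[k] + dt * dy

    print("state norms at T:", np.linalg.norm(x[-1]), np.linalg.norm(y[-1]))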

Conclusions

In this paper, we have dealt with stability criteria for BAM neural networks with time delays in the leakage term and probabilistic time-varying delays. By using a model transformation, an appropriate Lyapunov–Krasovskii functional, and some inequality techniques, several delay-dependent stability criteria for BAM neural networks have been derived. Finally, a numerical example has been given to show the effectiveness and superiority of the proposed results. Especially, the sensitivity on system

References (48)

  • B. Liu et al., Delay-range-dependent stability for fuzzy BAM neural networks with time-varying delays, Phys. Lett. A (2009)
  • H. Bao et al., Delay-distribution-dependent state estimation for discrete-time stochastic neural networks with random delay, Neural Netw. (2011)
  • Y. Zhang et al., Robust delay-distribution-dependent stability of discrete-time stochastic neural networks with time-varying delay, Neurocomputing (2009)
  • Y. Tang et al., Delay-distribution-dependent stability of stochastic discrete-time neural networks with randomly mixed time-varying delays, Neurocomputing (2009)
  • K. Gopalsamy, Leakage delays in BAM, J. Math. Anal. Appl. (2007)
  • S. Peng, Global attractive periodic solutions of BAM neural networks with continuously distributed delays in the leakage terms, Nonlinear Anal. Real World Appl. (2010)
  • P. Balasubramaniam et al., Global asymptotic stability of BAM fuzzy cellular neural networks with time delay in the leakage term, discrete and unbounded distributed delays, Math. Comput. Model. (2011)
  • C. Li et al., On the stability of nonlinear systems with leakage delay, J. Franklin Inst. (2009)
  • X. Li et al., Existence, uniqueness and stability analysis of recurrent neural networks with time delay in the leakage term under impulsive perturbations, Nonlinear Anal. Real World Appl. (2010)
  • Y. Liu et al., Global exponential stability of generalized recurrent neural networks with discrete and distributed delays, Neural Netw. (2006)
  • Z. Wang et al., Robust stability analysis of generalized neural networks with discrete and distributed time delays, Chaos Solitons Fractals (2006)
  • P.G. Park et al., Reciprocally convex approach to stability of systems with time-varying delays, Automatica (2011)
  • S. Haykin, Neural Networks: A Comprehensive Foundation (1998)
  • A. Cichocki et al., Neural Networks for Optimization and Signal Processing (1993)

This work was supported by the 2012 Yeungnam University Research Grant.
