自动化学报 (Acta Automatica Sinica), 2017, Vol. 43, Issue 8: 1465-1469

Globally Exponential Stability of Memristive Neural Networks With Time-varying Delays and Synchronous Switching
Yinlu Jiang, Chuandong Li
College of Electronic and Information Engineering, Southwest University, Chongqing 400715, China
Abstract: In this paper, we formulate and investigate a class of memristive neural networks with time-varying delays and synchronous switching. Conditions are derived that ensure the existence of an equilibrium point and the global exponential stability of the state trajectories of the memristive neural network. The analysis employs results from the theory of Lyapunov functions. Moreover, the proposed stability conditions are straightforward and convenient to check, and they reflect the impact of the time-varying delays on stability. Simulation results demonstrate the effectiveness of the theoretical results.
Key words: Lyapunov function; memristive neural networks; synchronous switching; time-varying delays
1 Introduction

The "memristor", an abbreviation for memory resistor, is the fourth basic circuit element along with resistors, capacitors and inductors which were studied by Chua [1] in 1971. Chua et al. [2] showed that the value of the memristor, called memristance, is the function of electric charge $q$ given as $M(q) = \frac{d\varphi}{dq}$, where $\varphi$ represents the magnetic flux. On May 1, 2008, the Hewlett-Packard (HP) research team proudly announced that they had built a prototype of the memristor with an official publication in $Nature$ [3].

As is well known, memristive neural networks are a prerequisite in many applications such as signal processing, pattern recognition, systems control and intelligent circuits [4]-[11]. Because of their special nonlinear structure and important applications, the theoretical analysis of memristive neural networks is an important research issue, and memristive switching neural networks form one of its branches. A switching system is a hybrid system consisting of several subsystems and a switching rule; neural networks with switching behavior are called switching neural networks. Because of the special nature of the memristor, many researchers have been interested in investigating memristive switching neural networks [12]-[15].

It is known that time delay affects the dynamic behaviors of neural networks enormously, and that delay-dependent stability criteria are less conservative than delay-independent ones; therefore many important results have been established for time-delay systems [16]-[18]. According to the features of the memristor and its current characteristics, there exists a fixed switching time between the memristor's two states; thus, the memristor is itself a switching system. Motivated by the above discussion, the main aim of this paper is to establish a memristive neural network with time-varying delays and to apply external switching signals that are synchronous with the fixed switching time between the memristor's two states. The resulting model is both a time-delay system and a switching system, so we call it a memristive neural network with time-varying delays and synchronous switching. To the best of our knowledge, this system has not been studied before; we establish stability criteria for it in this paper. The method of this paper can be extended to study other memristive neural networks.

According to the features of the memristor and its current characteristics, we adopt the following simple mathematical model of the memristance [19]:

 \begin{align} M(u(t))=\begin{cases} M',& { \dot{u}(t)>0 } \\ M",& { \dot{u}(t)<0 } \\ \lim\limits_{\tau\rightarrow t-} M(u(\tau)),& { \dot{u}(t)=0 } \end{cases} \end{align} (1)

where $u$ is the voltage applied to the memristor, and $\dot{u}(t)$ is the derivative of $u$ with respect to time $t$. $M'$ and $M"$, with $M'\leq M"$, are the memristances when $\dot{u}(t)>0$ and $\dot{u}(t)<0$, respectively. When $\dot{u}(t)=0$, $\lim_{\tau\rightarrow t^-} M(u(\tau))$ means that the memristance keeps its current value.
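The piecewise model (1) can be sketched directly in code; a minimal Python sketch, with hypothetical values standing in for $M'$ and $M"$:

```python
# A minimal sketch of the piecewise memristance model (1).  M_ON and
# M_OFF are hypothetical values standing in for M' and M'' (M' <= M'').
M_ON, M_OFF = 1.0, 2.0

def memristance(du_dt, held):
    """M(u(t)) as a function of du/dt; 'held' is the value kept from
    the most recent switching (the limit from the left in (1))."""
    if du_dt > 0:
        return M_ON    # M'  when the applied voltage is increasing
    if du_dt < 0:
        return M_OFF   # M'' when the applied voltage is decreasing
    return held        # the memristance keeps its current value

# Stepping through a sampled sequence of voltage slopes, carrying the
# held value; after the last step the device holds M_OFF.
m = M_ON
for slope in (1.0, 0.0, -2.0, 0.0):
    m = memristance(slope, m)
```

The held value makes the device's state history explicit, which is exactly the memory effect that distinguishes a memristor from an ordinary resistor.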

The rest of this paper is organized as follows. In Section 2, we put forward the memristive neural network model with time-varying delays and synchronous switching and give a brief description of it. In Section 3, we analyze the system via a Lyapunov function. In Section 4, an example is given to illustrate our results. Finally, Section 5 concludes the paper.

2 Model Description

We consider the following memristive neural networks with time-varying delays and synchronous switching model:

 \begin{align} \dot{z}_{i}(t)=& -d_{i\sigma(t)}z_{i}(t)+\sum^n_{j=1}a_{ij}(z_i-z_j)_{\sigma(t)}\tilde{f}_{j}(z_{j}(t))\notag\\ & +\sum^n_{j=1}b_{ij}(z_i-z_j)_{\sigma(t)}\tilde{g}_{j}(z_{j}(t-\tau_{ij}(t)))+S_{i\sigma(t)}\notag\\ &\qquad \quad\qquad\qquad\qquad\qquad \quad i=1, 2, \ldots, n \end{align} (2)

where $z_{i}(t)$ is the state variable of the $i$th neuron, $\gamma=\sigma(t)$ is the switching signal, which takes values in the finite set $\Sigma=\{1, 2, \ldots, N\}$, $d_{i\gamma}$ is the $i$th neuron's self-feedback connection weight under switching signal $\gamma$, $a_{ij}(z_i-z_j)_{\gamma}$ and $b_{ij}(z_i-z_j)_{\gamma}$ are, respectively, the memristive connection weights and those associated with time delays under switching signal $\gamma$, $S_{i\gamma}$ is the external constant input under switching signal $\gamma$, and $\tau_{ij}(t)$ are the time-varying delays of the system. $n$ denotes the number of neurons in the network. $\tilde{f}_{i}(\cdot)$ and $\tilde{g}_{i}(\cdot)$ are the $i$th activation functions and those associated with time delays, respectively.

System (2) can be rewritten in the following vector form

 \begin{align} \dot{z}(t)=&\ P(z)\notag\\ = &-D_{\sigma(t)}z(t)+ A(z)_{\sigma(t)}\tilde{f}(z(t))\notag\\ &+ B(z)_{\sigma(t)}\tilde{g}(z(t-\tau(t)))+S_{\sigma(t)}. \end{align} (3)

By applying the theories of set-valued maps and differential inclusions, the memristive neural network (3) has the same solution set as the following differential inclusion equation [19]-[21]:

 \begin{align} \dot{z}(t)\in&\ {{\rm co}\{P(z)\}}\notag\\ =& -D_{\sigma(t)}z(t)+ A_{\sigma(t)}\tilde{f}(z(t))\notag\\ &+ B_{\sigma(t)}\tilde{g}(z(t-\tau(t)))+S_{\sigma(t)} \end{align} (4)

where $D_{\gamma}={\rm diag}{(d_{1\gamma}, d_{2\gamma}, \ldots, d_{n\gamma})}$, $\gamma\in\{1, 2, \ldots, N\}$. If $\sigma(t)=\gamma$ at some time $t$, we say that the $\gamma$th subnetwork $(D_{\gamma}$, $A_{\gamma}$, $B_{\gamma})$ is activated at time $t$. Assume that the function $\sigma(t)$ is right-continuous, that is, $\sigma(t)=\sigma(t^+)$. The time $t$ is called a switching time if $\sigma(t)\neq\sigma(t^-)$. And

 \begin{align*} &A_{\sigma(t)}=(\xi^{j}_{i\sigma(t)}(t)a'_{ij}+(1-\xi^{j}_{i\sigma(t)}(t))a"_{ij})_{n\times n}\\ &B_{\sigma(t)}=(\xi^{j}_{i\sigma(t)}(t)b'_{ij}+(1-\xi^{j}_{i\sigma(t)}(t))b"_{ij})_{n\times n} \end{align*}

$\xi^{j}_{i\sigma(t)}(t)$ are arbitrary quantities satisfying $0\leq\xi^{j}_{i\sigma(t)}(t)$ $\leq$ $1$ and $\xi^{j}_{i\sigma(t)}(t)+\xi^{i}_{j\sigma(t)}(t)=1$. $\tilde{f}(z(t))=[\tilde{f}_{1}(z_{1}(t))$, $\tilde{f}_{2}(z_{2}(t))$, $\ldots, \tilde{f}_{n}(z_{n}(t))]^T$ and $\tilde{g}(z(t))=[\tilde{g}_{1}(z_{1}(t)), \tilde{g}_{2}(z_{2}(t))$, $\ldots$, $\tilde{g}_{n}(z_{n}(t))]^T$. $\tau(t)$ is the time delay with $\tilde{\tau}$ $\leq$ $\tau(t)\leq\bar{\tau}$. $S_{\gamma}=(S_{1\gamma}, S_{2\gamma}, \ldots, S_{n\gamma})^T$ is the external input. Assuming $P(z)$ is locally bounded, by Lemma 2 in [19] the existence of a solution of (4) is ensured.
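The entries of $A_{\sigma(t)}$ and $B_{\sigma(t)}$ defined above are convex combinations of the two memristive states; a minimal numeric sketch (the state values $a'_{ij}=-1.5$, $a''_{ij}=0.5$ are hypothetical):

```python
# Sketch of the convexified connection weights appearing in A_sigma and
# B_sigma: each entry is xi*w' + (1 - xi)*w'' for some xi in [0, 1].
def effective_weight(w_prime, w_dprime, xi):
    """Convex combination of the two memristive states."""
    assert 0.0 <= xi <= 1.0
    return xi * w_prime + (1.0 - xi) * w_dprime

a_prime, a_dprime = -1.5, 0.5   # hypothetical a'_ij, a''_ij
for xi in (0.0, 0.25, 0.5, 1.0):
    a = effective_weight(a_prime, a_dprime, xi)
    # every admissible weight lies between the two memristive states
    assert min(a_prime, a_dprime) <= a <= max(a_prime, a_dprime)
```

This is why the entrywise bounds $|a_{ij}|_{\max}$ and $|b_{ij}|_{\max}$ used later in Theorem 1 are well defined: every admissible weight is trapped between the two states.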

The differential inclusion equation (4) means that there exist sets ${\rm diag}\{d_{1\gamma}, d_{2\gamma}, \ldots, d_{n\gamma}\}$, ${\rm diag}\{S_{1\gamma}, S_{2\gamma}, \ldots, S_{n\gamma}\}$ and $\{\xi^{i}_{j\sigma(t)}(t) \}$ such that

 \begin{align} \dot{z}_{i}(t)=&-d_{i}z_{i}(t)+\sum^n_{j=1}[\xi_{ij}(t)a'_{ij}+(1-\xi_{ij}(t))a"_{ij}]\tilde{f}_{j}(z_{j}(t))\notag\\ &+\sum^n_{j=1}[\xi_{ij}(t)b'_{ij}+(1-\xi_{ij}(t))b"_{ij}]\notag\\ &\times\tilde{g}_{j}(z_{j}(t-\tau_{ij}(t)))+S_{i}, \quad i=1, 2, \ldots, n. \end{align} (5)

Moreover, we assume that the initial conditions of the system (4) are of the form

 \begin{align} z_i(t)=\phi_i(t), \quad t\in[-\tau, 0], \quad \tau=\max\limits_{1\leq i, j\leq n}\sup\limits_{t\geq0}\tau_{ij}(t)\nonumber \end{align}

where $\phi_i(\cdot)$ denote real-valued continuous functions defined on $[-\tau, 0]$.

Suppose that $z^*=(z^*_1, z^*_2, \ldots, z^*_n)^T$ is an equilibrium point of system (2). Let $x(t)=z(t)-z^*$, then system (4) can be rewritten as follows

 \begin{align} \dot{x}(t)=-D_{\sigma(t)}x(t)+A_{\sigma(t)}f(x(t))+B_{\sigma(t)}g(x(t-\tau(t))) \end{align} (6)

where

 \begin{align*} &f(x(t))=(f_1(x_1(t)), f_2(x_2(t)), \ldots, f_n(x_n(t)))^T\\ &f_i(x_i(t))=\tilde{f}_i(x_i(t)+z^*_i)-\tilde{f}_i(z^*_i)\\ &g(x(t-\tau(t)))=(g_1(x_1(t-\tau(t))), \\ &\qquad\quad g_2(x_2(t-\tau(t))), \ldots, g_n(x_n(t-\tau(t))))^T\\ &g_i(x_i(t-\tau(t)))=\tilde{g}_i(x_i(t-\tau(t))+z^*_i)-\tilde{g}_i(z^*_i).\end{align*}

Transform (6) into the following form

 \begin{align} \dot{x}_{i}(t)= &-d_{i}x_{i}(t)+\sum^n_{j=1}[\xi_{ij}(t)a'_{ij}+(1-\xi_{ij}(t))a"_{ij}]f_{j}(x_{j}(t))\notag\\ &+\sum^n_{j=1}[\xi_{ij}(t)b'_{ij}+(1-\xi_{ij}(t))b"_{ij}]g_{j}(x_{j}(t-\tau_{ij}(t)))\notag\\ &\qquad \qquad\qquad \qquad\qquad \qquad\quad i=1, 2, \ldots, n \end{align} (7)

where $f_{j}(x_{j}(t))=\tilde{f}_{j}(x_{j}(t)+z^*_j)-\tilde{f}_{j}(z^*_{j})$ and $g_{j}(x_{j}(t))=\tilde{g}_{j}(x_{j}(t)+z^*_j)-\tilde{g}_{j}(z^*_{j})$.

The initial conditions of system (4) will be transformed into the following form

 $x_i(s)=\phi_i(s)-z^{*}_i=\varphi_i(s), \ \ \ s\in[-\tau, 0], \ \ \tau=\max\limits_{1\leq i, j\leq n}\sup\limits_{t\geq0}\tau_{ij}(t).$

The following assumptions and definition are made on system (2) throughout this paper.

Assumption 1: There exist positive constants $F_i$, $G_i$, $i$ $=$ $1$, $2$, $\ldots, n$, with $f_i(0)=0$ and $g_i(0)=0$, such that, for all $x\neq y$

 \begin{align} 0\leq\frac{f_i(x)-f_i(y)}{x-y}\leq F_i, \ \ 0\leq\frac{g_i(x)-g_i(y)}{x-y}\leq G_i.\nonumber \end{align}
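Assumption 1 is a sector condition on difference quotients, and it can be checked numerically for a concrete activation; the sketch below uses the hypothetical activation $f(x)=0.5\tanh(x)$, for which the bound $F=0.5$ works:

```python
import math

# Numerical check of the sector condition in Assumption 1 for a
# hypothetical activation f(x) = 0.5*tanh(x): all difference quotients
# should lie in [0, 0.5], so F = 0.5 is a valid sector bound.
def sector_range(f, lo=-5.0, hi=5.0, steps=2000):
    """Min and max difference quotient of f between adjacent grid points."""
    xs = [lo + (hi - lo) * k / steps for k in range(steps + 1)]
    q = [(f(xs[k + 1]) - f(xs[k])) / (xs[k + 1] - xs[k])
         for k in range(steps)]
    return min(q), max(q)

low, high = sector_range(lambda x: 0.5 * math.tanh(x))
assert 0.0 <= low    # f is nondecreasing
assert high <= 0.5   # sector bound F = 0.5
```

A grid check of adjacent difference quotients is of course only a numerical plausibility test, not a proof, but it catches an activation whose slope exceeds the claimed bound.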

Assumption 2: $\tau_{ij}:[0, +\infty)\rightarrow[0, +\infty)$ is continuously differentiable, and $0\leq\tau_{ij}\leq{\bar{\tau}}$, $\tau'_{ij}\leq R<1.$

Assumption 2 ensures that $t-\tau_{ij}(t)$ has a differentiable inverse function, denoted by $\varphi_{ij}(t)$, with $\inf_{t>0}\{\varphi'_{ij}(t)\}>0$.
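For a concrete delay such as the one used later in Section 4, $\tau(t)=0.2-0.05\sin^2(t)$, the bound $R$ of Assumption 2 and the resulting bound on $\varphi'_{ij}$ can be checked numerically; a sketch:

```python
import math

# Numerical check of Assumption 2 for the delay used in Section 4:
# tau(t) = 0.2 - 0.05*sin(t)**2, so tau'(t) = -0.05*sin(2t) and the
# bound R = sup tau' = 0.05 < 1.
def tau(t):
    return 0.2 - 0.05 * math.sin(t) ** 2

def tau_dot(t, h=1e-6):
    return (tau(t + h) - tau(t - h)) / (2.0 * h)  # central difference

# tau' is 2*pi-periodic, so a grid over one period suffices.
R = max(tau_dot(2 * math.pi * k / 10000) for k in range(10001))
assert R < 1.0                    # Assumption 2 holds
assert abs(R - 0.05) < 1e-4       # sup tau' = 0.05
phi_dot_bound = 1.0 / (1.0 - R)   # since phi'_ij(t) = 1/(1 - tau'_ij)
assert phi_dot_bound < 1.06
```

The bound $\varphi'_{ij}\leq 1/(1-R)$ is exactly the quantity that enters condition (8) of Theorem 1.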

Definition 1 [22]: Let us consider the set-valued map $\phi(x)$ defined as:

 \begin{align} \phi(x)=\bigcap\limits_{\delta>0} \bigcap\limits_{\mu(N)=0} K[y(t, B_{\delta}(x))\setminus N]\nonumber \end{align}

where $K[U]$ represents the closure of the convex hull of set $U$, i.e., $K[U]=\overline{\rm co}(U)$, $\mu(U)$ denotes the Lebesgue measure of set $U$, and $N$ is an arbitrary set of measure zero. When $y(t, x)$ is locally bounded, there exists a set $N^t_0\subset \mathbb{R}^n$ with $\mu(N^t_0)$ $=$ $0$, such that $\phi(x)={\rm co}\{\nu:$ there exists a sequence $\{x_i\}$ that satisfies $x_i\notin N^t_0\bigcup N$ and $\nu=\lim y(t, x_i)\}$ for any $t$ $\geq$ $0$, $N\subset \mathbb{R}^n$ and $\mu(N)=0$.

3 Analysis of Memristive Synchronous Switching Neural Networks

Theorem 1: Assume that there exist positive constants $\lambda_i$ $(i=1, 2, \ldots, n)$, constants $\mu$, $\nu\in [0, 1]$, and a positive constant $\varepsilon$ such that the following condition holds:

 \begin{align} \theta=& \max\limits_{1\leq i\leq n}\sup\limits_{t\geq0}\{-2(d_i-\varepsilon)\lambda_i\notag\\ &+\lambda_i\sum^n_{j=1}[F^{2\mu}_j|a_{ij}|_{\max}+G^{2\nu}_je^{2\varepsilon\bar{\tau}}|b_{ij}|_{\max}]\notag\\ &+\sum^n_{j=1}\lambda_j[F^{2(1-\mu)}_j|a_{ij}|_{\max}\notag\\ &+G^{2(1-\nu)}_j|b_{ij}|_{\max}\varphi'_{ij}(t)]\}<0 \end{align} (8)

then the trivial solution of (7) is globally exponentially stable; that is, system (2) is globally exponentially stable.

Proof: We can choose the following nonnegative Lyapunov function candidate for system (7)

 \begin{align} V(t)=&\ \sum^n_{i=1}\lambda_i\{x^2_i(t)e^{2\varepsilon t}\notag\\ &+\sum^n_{j=1}G^{2(1-\nu)}_j\int^{\varphi_{ij}(t)}_t\left|\xi_{ij}(s)b'_{ij}+(1-\xi_{ij}(s))b"_{ij}\right|\notag\\ &\times x^2_j(s-\tau_{ij}(s))e^{2\varepsilon(s-\tau_{ij}(s))}ds\} \end{align} (9)

where $t>0$, and then compute the upper right Dini derivative along the trajectories of system (7)

 \begin{align} D^{+}V(t)=&\ \sum^n_{i=1}\lambda_i\{2x_i(t)[-d_{i}x_{i}(t)\notag\\ &+\sum^n_{j=1}(\xi_{ij}(t)a'_{ij} +(1-\xi_{ij}(t))a"_{ij})f_{j}(x_{j}(t))\notag\\ &+\sum^n_{j=1}(\xi_{ij}(t)b'_{ij}+ (1-\xi_{ij}(t))b"_{ij})\notag\\ &\times g_{j}(x_{j}(t-\tau_{ij}(t)))]e^{2\varepsilon t}+2\varepsilon x^2_i(t)e^{2\varepsilon t}\notag\\ & +\sum^n_{j=1}G^{2(1-\nu)}_j|\xi_{ij}(t)b'_{ij} +(1-\xi_{ij}(t))b"_{ij}|\notag\\ &\times x^2_j(t)\varphi'_{ij}(t)e^{2\varepsilon t} -\sum^n_{j=1}G^{2(1-\nu)}_j|\xi_{ij}(t)b'_{ij} \notag\\ &+(1-\xi_{ij}(t))b"_{ij}|\, x^2_j(t-\tau_{ij}(t))e^{2\varepsilon (t-\tau_{ij}(t))}\}\notag\\ \leq &\ \sum^n_{i=1}\lambda_ie^{2\varepsilon t}\{-2(d_i- \varepsilon)x^2_i(t) \notag\\ &+\sum^n_{j=1}|a_{ij}|_{\max}\, 2(F^{\mu}_j|x_i(t)|)(F^{1-\mu}_j|x_j(t)|)\notag\\ &+\sum^n_{j=1}|b_{ij}|_{\max}\, 2(e^{\varepsilon\tau_{ij}(t)}G^{\nu}_j|x_i(t)|) \notag\\ &\times(e^{-\varepsilon\tau_{ij}(t)}G^{1-\nu}_j|x_j (t-\tau_{ij}(t))|)\notag\\ &+\sum^n_{j=1}G^{2(1-\nu)}_j |b_{ij}|_{\max} x^2_j(t)\varphi'_{ij}(t)\notag\\ &-\sum^n_{j=1}G^{2(1-\nu)}_j|b_{ij}|_{\max}e^{-2\varepsilon\tau_{ij}(t)}x^2_j (t-\tau_{ij}(t))\}\notag\\ \leq &\ \sum^n_{i=1}\lambda_ie^{2\varepsilon t}\{-2(d_i- \varepsilon)x^2_i(t)\notag\\ &+\sum^n_{j=1}|a_{ij}|_{\max}F^{2\mu}_jx^2_i(t) +\sum^n_{j=1}|a_{ij}|_{\max}F^{2(1-\mu)}_jx^2_j(t) \notag\\ &+\sum^n_{j=1}|b_{ij}|_{\max}e^{2\varepsilon\tau_{ij}(t)} G^{2\nu}_jx^2_i(t) \nonumber\\ &+\sum^n_{j=1}|b_{ij}|_{\max}e^{-2\varepsilon\tau_{ij}(t)} G^{2(1-\nu)}_jx^2_j(t-\tau_{ij}(t))\notag\\ &+\sum^n_{j=1}|b_{ij}|_{\max} G^{2(1-\nu)}_jx^2_j(t) \varphi'_{ij}(t)\notag\\ &-\sum^n_{j=1}G^{2(1-\nu)}_j|b_{ij}|_{\max}e^{-2\varepsilon \tau_{ij}(t)}x^2_j (t-\tau_{ij}(t))\}\notag\\ \leq&\ \sum^n_{i=1}e^{2\varepsilon t}\{-2(d_i-\varepsilon)\lambda_i\notag\\ &+\lambda_i\sum^n_{j=1}[F^{2\mu}_j|a_{ij}|_{\max}+G^ {2\nu}_je^{2\varepsilon\bar{\tau}}|b_{ij}|_{\max}]\notag\\ &+\sum^n_{j=1}\lambda_j[F^{2(1-\mu)}_j|a_{ij}|_{\max}+G^{2(1-\nu)}_j|b_{ij}|_{\max}\varphi'_{ij}(t)]\}x^2_i(t)\notag\\ \leq&\ \theta\sum^n_{i=1}x^2_i(t)e^{2\varepsilon t}. \end{align} (10)

Therefore, since $\theta<0$ by (8), we have $D^{+}V(t)\leq0$ and hence

 \begin{align} V(t)\leq V(0), ~~~t\geq0. \end{align} (11)

After a series of calculations according to (9), we can obtain

 \begin{align} V(0)=&\ \sum^n_{i=1}\lambda_ix^2_i(0)+\sum^n_{i=1}\sum^n_{j=1}\lambda_iG^{2(1-\nu)}_{j}\notag\\ &\times\int^{\varphi_{ij}(0)}_0 |\xi_{ij}(s)b'_{ij}+(1-\xi_{ij}(s))b"_{ij}|\notag\\ &\times x^2_j(s-\tau_{ij}(s))e^{2\varepsilon(s-\tau_{ij}(s))}ds \end{align} (12)
 \begin{align} \leq&\ \max\limits_{1\leq i\leq n}\{\lambda_i\}\left(1+nK(1-R)^{-1}\right)\notag\\ &\times \max\limits_{1\leq j\leq n}\{G^{2(1-\nu)}_{j}\}\|\phi\|^2_2 \end{align} (13)

where $K=\max_{1\leq i\leq n}\sup_{-\tau_{ij}(0)\leq t\leq 0}\sum^n_{j=1}\int^0_{-\tau_{ij}(0)}|b_{ij}|_{\max}ds$, using the change of variable $\xi=s-\tau_{ij}(s)=\varphi^{-1}_{ij}(s)$.

Based on Lemma 1 in [4], we get $\sum^n_{i=1}\int^\infty_0x^2_i(t)e^{2\varepsilon t}dt<\infty$. From (9), we obtain

 \begin{align} \min\limits_{1\leq i\leq n}\{\lambda_i\}e^{2\varepsilon t}\sum^n_{i=1}x^2_i(t)\leq \sum^n_{i=1}x^2_i(t)\lambda_ie^{2\varepsilon t}\leq V(t). \end{align} (14)

Combining (11), (13) and (14), we can get

 \begin{align} \|x(t)\|_2\equiv\left(\sum^n_{i=1}x^2_i(t)\right)^ \frac{1}{2}\leq\chi_\theta\|\phi\|_2e^{-\varepsilon t} \end{align} (15)

where

 \begin{align} \chi_\theta=&\left(\frac{\max\limits_{1\leq i\leq n}\{\lambda_i\}}{\min\limits_{1\leq i\leq n}\{\lambda_i\}} \left(1+nK(1-R)^{-1}\right)\max\limits_{1\leq j\leq n}\left\{G^{2(1-\nu)}_{j}\right\}\right)^\frac{1}{2}. \end{align} (16)

Therefore, the equilibrium point $z^*$ of the memristive neural network (2) is globally exponentially stable. This completes the proof of Theorem 1.

The proposed criteria cover not only the case of binary-valued memristive connection weights, as in previous results, but also the case where the memristive connection weights change continuously with time. Moreover, to the best of our knowledge, stability criteria for memristive neural networks with time-varying delays and synchronous switching are established here for the first time.

Remark 1: In many other papers, the authors considered only the case of binary-valued memristive connection weights. In the present paper, we relax this limitation and allow the memristive connection weights to change continuously with time, with the polarity of the voltage applied to the memristor also taken into account.

4 An Illustrative Example

Example 1: The vector form of the memristive system (2) is as follows

 \begin{align*} \dot{z}(t)=&-D_{\sigma(t)}z(t)+ A(z)_{\sigma(t)}\tilde{f}(z(t))\\ &+ B(z)_{\sigma(t)}\tilde{g}(z(t-\tau(t)))+S_{\sigma(t)}.\nonumber \end{align*}

Consider the switching system with $\Sigma=\{1, 2\}$, $D_1={\rm diag}(1, 1)$, $D_2={\rm diag}(5, 5)$, external inputs $S_1=S_2=$ $(0, 0)^T$, and time-varying delay $\tau(t)=0.2-0.05\sin^2(t)$.

 $A\left( z \right) = \left( {\begin{array}{*{20}{c}} {0.4}&{{a_{12}}}\\ {{a_{21}}}&{0.2} \end{array}} \right),\;\;\;B\left( z \right) = \left( {\begin{array}{*{20}{c}} { - 0.2}&{{b_{12}}}\\ {{b_{21}}}&{ - 0.15} \end{array}} \right)$

in which

 \begin{align*} &a_{12}= \begin{cases} -1.5,&{\dot{z}_1>\dot{z}_2} \\ 0.5, & {\dot{z}_1<\dot{z}_2} \end{cases}, &a_{21}= \begin{cases} 0.25, & {\dot{z}_1>\dot{z}_2 } \\ -1, & {\dot{z}_1<\dot{z}_2} \end{cases}\ \ \\ &b_{12}= \begin{cases} 0.6, & {\dot{z}_1>\dot{z}_2} \\ -0.5, & {\dot{z}_1<\dot{z}_2} \end{cases}, &b_{21}= \begin{cases} 0.7, & {\dot{z}_1>\dot{z}_2} \\ -0.15, & {\dot{z}_1<\dot{z}_2} \end{cases} \end{align*}

and $\tilde{f}_i(x)=\tilde{g}_i(x)=({e^x-e^{-x}})/({e^x+e^{-x}})=\tanh(x)$, $i=1, 2$, which implies that $F_i=G_i=1$. We can obtain that

 $|A|_{\max}=\left( \begin{array}{cc} 0.4&1.5 \\ 1&0.2 \\ \end{array} \right), \quad |B|_{\max}=\left( \begin{array}{cc} 0.2&0.6 \\ 0.7&0.15 \\ \end{array} \right).$
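These bound matrices collect, entry by entry, the larger magnitude over the memristor's two states; a quick numerical check:

```python
# The bound matrices |A|_max and |B|_max in Theorem 1 take, entry by
# entry, the larger magnitude of the memristor's two states.
def entrywise_max(w_prime, w_dprime):
    return [[max(abs(a), abs(b)) for a, b in zip(r1, r2)]
            for r1, r2 in zip(w_prime, w_dprime)]

# The two states of A(z) and B(z) from the example; the diagonal
# entries do not switch.
A_prime  = [[0.4, -1.5], [0.25, 0.2]]    # entries when z1' > z2'
A_dprime = [[0.4,  0.5], [-1.0, 0.2]]    # entries when z1' < z2'
B_prime  = [[-0.2,  0.6], [ 0.70, -0.15]]
B_dprime = [[-0.2, -0.5], [-0.15, -0.15]]

assert entrywise_max(A_prime, A_dprime) == [[0.4, 1.5], [1.0, 0.2]]
assert entrywise_max(B_prime, B_dprime) == [[0.2, 0.6], [0.7, 0.15]]
```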

Choosing $\lambda_1=\lambda_2=1$ and $\varepsilon=1$, it is easy to verify that the conditions of Theorem 1 are satisfied, and system (2) is therefore globally exponentially stable with convergence rate $\varepsilon$. Time response curves for the memristive neural network with time-varying delays and synchronous switching are shown in Fig. 1.
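Condition (8) can also be evaluated numerically. The sketch below does so for the subsystem with $D_2={\rm diag}(5, 5)$ only, under assumed constants $\lambda_i=1$, $\varepsilon=1$, $\mu=\nu=1/2$, $\bar{\tau}=0.2$, $\varphi'_{ij}\leq 1/(1-0.05)$, and the sector bound $F_j=G_j=1$ for the tanh activation (since $\sup_x \tanh'(x)=1$); these choices are illustrative, not the paper's own verification:

```python
import math

# Numerical evaluation of theta in condition (8) for the subsystem with
# D_2 = diag(5, 5).  Assumed constants (a sketch): lambda_i = 1,
# eps = 1, mu = nu = 1/2, tau_bar = 0.2, phi'_ij <= 1/(1 - 0.05),
# and the tanh sector bound F_j = G_j = 1.
A_max = [[0.4, 1.5], [1.0, 0.2]]
B_max = [[0.2, 0.6], [0.7, 0.15]]
d = [5.0, 5.0]
lam = [1.0, 1.0]
eps, tau_bar, mu, nu = 1.0, 0.2, 0.5, 0.5
F = G = [1.0, 1.0]
phi_dot = 1.0 / (1.0 - 0.05)

def theta():
    terms = []
    for i in range(2):
        t = -2.0 * (d[i] - eps) * lam[i]
        for j in range(2):
            t += lam[i] * (F[j] ** (2 * mu) * A_max[i][j]
                           + G[j] ** (2 * nu)
                           * math.exp(2 * eps * tau_bar) * B_max[i][j])
            t += lam[j] * (F[j] ** (2 * (1 - mu)) * A_max[i][j]
                           + G[j] ** (2 * (1 - nu)) * B_max[i][j] * phi_dot)
        terms.append(t)
    return max(terms)

assert theta() < 0.0   # condition (8) holds for this subsystem
```

With these assumed constants, theta evaluates to a strictly negative value, consistent with the exponential stability claimed in the example.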

 Figure 1 Time response curves for memristive neural network with time-varying delays and synchronous switching.
5 Conclusions

In this paper, we have studied the global exponential stability of memristive neural networks with time-varying delays and synchronous switching. Simple and easily checked criteria for exponential stability have been obtained, and a numerical example has been presented to illustrate the effectiveness of the proposed theory. The method of this paper may be extended to study other memristive neural networks.

References
[1] L. O. Chua, "Memristor-the missing circuit element," IEEE Trans. Circuit Theory, vol. 18, no. 5, pp. 507-519, Sep. 1971.
[2] L. O. Chua and S. M. Kang, "Memristive devices and systems," Proc. IEEE, vol. 64, no. 2, pp. 209-223, Feb. 1976.
[3] D. B. Strukov, G. S. Snider, D. R. Stewart, and R. S. Williams, "The missing memristor found," Nature, vol. 453, no. 7191, pp. 80-83, May 2008.
[4] X. F. Liao and K. W. Wong, "Global exponential stability for a class of retarded functional differential equations with applications in neural networks," J. Math. Anal. Appl., vol. 293, no. 1, pp. 125-148, May 2004.
[5] C. J. Li, C. D. Li, and T. W. Huang, "Exponential stability of impulsive high-order Hopfield-type neural networks with delays and reaction-diffusion," Int. J. Comput. Math., vol. 88, no. 15, pp. 3150-3162, May 2011.
[6] C. D. Li, C. J. Li, and C. Liu, "Destabilizing effects of impulse in delayed BAM neural networks," Mod. Phys. Lett. B, vol. 23, no. 29, pp. 3503-3513, Nov. 2009.
[7] X. F. Liao and J. B. Yu, "Robust stability for interval Hopfield neural networks with time delay," IEEE Trans. Neural Netw., vol. 9, no. 5, pp. 1042-1045, Sep. 1998.
[8] C. D. Li, X. F. Liao, and K. W. Wong, "Delay-dependent and delay-independent stability criteria for cellular neural networks with delays," Int. J. Bifurcation Chaos, vol. 16, no. 11, pp. 3323-3340, Nov. 2006.
[9] X. He, C. D. Li, and Y. L. Shu, "Bogdanov-Takens bifurcation in a single inertial neuron model with delay," Neurocomputing, vol. 89, pp. 193-201, Jul. 2012.
[10] S. K. Duan, X. F. Hu, L. D. Wang, and C. D. Li, "Analog memristive memory with applications in audio signal processing," Sci. China Inf. Sci., vol. 57, no. 4, pp. 1-15, Apr. 2014.
[11] X. Wang, C. D. Li, T. W. Huang, and S. K. Duan, "Global exponential stability of a class of memristive neural networks with time-varying delays," Neural Comput. Appl., vol. 24, no. 7-8, pp. 1707-1715, Jun. 2014.
[12] A. V. Avizienis, H. O. Sillin, C. Martin-Olmos, H. H. Shieh, M. Aono, A. Z. Stieg, and J. K. Gimzewski, "Neuromorphic atomic switch networks," PLoS One, vol. 7, no. 8, Article ID e42772, Aug. 2012.
[13] A. L. Wu, Y. Shen, Z. G. Zeng, and J. Zhang, "Analysis of a memristor-based switching network," in Proc. 2011 Int. Conf. Information Science and Technology (ICIST), Nanjing, China, 2011, pp. 1043-1046.
[14] F. Z. Wang, N. Helian, S. Wu, X. Yang, Y. K. Guo, G. Lim, and M. M. Rashid, "Delayed switching applied to memristor neural networks," J. Appl. Phys., vol. 111, no. 7, Article ID 07E317, Feb. 2012.
[15] L. Chua, "Resistance switching memories are memristors," Appl. Phys. A, vol. 102, no. 4, pp. 765-783, Mar. 2011.
[16] Q. Han, C. D. Li, and J. J. Huang, "Estimation on error bound of lag synchronization of chaotic systems with time delay and parameter mismatch," J. Vib. Control, vol. 16, no. 11, pp. 1701-1711, May 2010.
[17] C. Liu, C. D. Li, and S. K. Duan, "Stabilization of oscillating neural networks with time-delay by intermittent control," Int. J. Control Autom. Syst., vol. 9, no. 6, pp. 1074-1079, Dec. 2011.
[18] S. C. Wu, C. D. Li, X. F. Liao, and S. K. Duan, "Exponential stability of impulsive discrete systems with time delay and applications in stochastic neural networks: a Razumikhin approach," Neurocomputing, vol. 82, pp. 29-36, Apr. 2012.
[19] J. Hu and J. Wang, "Global uniform asymptotic stability of memristor-based recurrent neural networks with time delays," in Proc. 2010 Int. Joint Conf. Neural Networks (IJCNN), Barcelona, Spain, 2010, pp. 1-8.
[20] G. D. Zhang, Y. Shen, and J. W. Sun, "Global exponential stability of a class of memristor-based recurrent neural networks with time-varying delays," Neurocomputing, vol. 97, pp. 149-154, Nov. 2012.
[21] L. Zhang and Z. Yi, "Selectable and unselectable sets of neurons in recurrent neural networks with saturated piecewise linear transfer function," IEEE Trans. Neural Netw., vol. 22, no. 7, pp. 1021-1031, Jul. 2011.
[22] M. Forti and P. Nistri, "Global convergence of neural networks with discontinuous neuron activations," IEEE Trans. Circuits Syst. I: Fundam. Theory Appl., vol. 50, no. 11, pp. 1421-1435, Nov. 2003.