Abstract: Using the inequality $a\prod_{k=1}^{m} b_k^{q_k} \le \frac{1}{r}\sum_{k=1}^{m} q_k b_k^{r} + \frac{1}{r}a^{r}$ (where $a \ge 0$, $b_k \ge 0$, $q_k > 0$, $\sum_{k=1}^{m} q_k = r-1$, $r > 1$) together with a newly constructed Lyapunov functional, this paper studies the global exponential stability of bidirectional associative memory (BAM) neural networks with delays. The boundedness assumption on the transfer functions made in related work is removed, and weaker, delay-independent stability criteria are obtained, which broaden the applicability of the model and are useful in the analysis and design of such networks. Finally, numerical simulations further confirm the correctness of the results, and the convergence rate of the BAM neural network is analyzed.
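For a concrete sense of how the stated inequality is typically applied, the following is a minimal worked special case (my own illustration, not taken from the paper), assuming the single-factor setting $m = 1$:

```latex
% Special case m = 1 (so q_1 = r - 1): the inequality reduces to Young's inequality
\[
  a\,b^{\,r-1} \;\le\; \frac{r-1}{r}\, b^{r} + \frac{1}{r}\, a^{r},
  \qquad a \ge 0,\; b \ge 0,\; r > 1,
\]
% and taking r = 2 gives the familiar cross-term bound
\[
  a\,b \;\le\; \tfrac{1}{2}\, a^{2} + \tfrac{1}{2}\, b^{2},
\]
% which is how such inequalities are commonly used to dominate the coupling
% terms that arise when differentiating a Lyapunov functional along the
% trajectories of a delayed BAM network.
```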
Funding: National Natural Science Foundation of China, Innovation Research Foundation of Nankai University, Technology Key Project
Abstract: This paper analyzes the noise sensitivity of bidirectional associative memory (BAM) and shows that the anti-noise capability of BAM is related not only to the minimum absolute value of the net inputs (MAV), as some researchers have found, but also to the variance of the weights associated with the synaptic connections; in fact, it is determined by the quotient of these two factors. On this basis, a novel learning algorithm, small-variance learning for BAM (SVBAM), is proposed, which decreases the variance of the weights in the synaptic matrix. Simulation experiments show that the algorithm decreases the variance of the weights efficiently, and therefore the noise immunity of BAM is improved. At the same time, the algorithm still guarantees perfect recall of all training pattern pairs.
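As a rough illustration of the quantities discussed in this abstract, the sketch below is a toy example, not the authors' SVBAM implementation: it builds a BAM weight matrix with the standard outer-product (Hebbian) rule on made-up bipolar pattern pairs and then reports the MAV, the weight variance, and their quotient. The pattern data and the exact normalization of the quotient are assumptions for illustration only.

```python
import numpy as np

# Toy bipolar (+1/-1) pattern pairs; shapes and values are illustrative only.
X = np.array([[ 1, -1,  1, -1,  1, -1],
              [ 1,  1, -1, -1,  1,  1],
              [-1,  1,  1, -1, -1,  1]])   # three patterns in layer X
Y = np.array([[ 1, -1,  1, -1],
              [-1,  1,  1, -1],
              [ 1,  1, -1, -1]])           # associated patterns in layer Y

# Standard outer-product (Hebbian) BAM learning: W = sum_p x_p^T y_p.
W = X.T @ Y                                # shape (n_x, n_y)

# Net inputs into layer Y when the stored X patterns are presented.
net = X @ W                                # shape (n_patterns, n_y)

# Minimum absolute value of the net inputs (MAV) over all stored pairs.
mav = np.min(np.abs(net))

# Variance of the synaptic weights.
weight_var = np.var(W)

# The abstract argues that noise immunity is governed by the quotient of
# these two factors; the exact form used in the paper may differ.
print("MAV:", mav)
print("weight variance:", weight_var)
print("MAV / weight variance:", mav / weight_var)
```

Under the abstract's argument, a learning rule such as SVBAM that shrinks the weight variance while preserving the MAV (and perfect recall) would increase this quotient and hence the network's noise immunity.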