Abstract: Some sufficient conditions for the global exponential stability, and lower bounds on the rate of exponential convergence, of cellular neural networks with delay (DCNNs) are obtained by means of a method based on a delay differential inequality. The method, which does not make use of any Lyapunov functional, is simple and effective for the stability analysis of neural networks with delay. Several previously established results are shown to be special cases of the results presented in this paper.
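To illustrate the kind of behavior the abstract describes, the sketch below simulates a scalar delayed cellular neural network under a sufficient condition of the type discussed (self-feedback gain dominating the delayed-feedback gain). The scalar reduction and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_dcnn(x0_history, a=2.0, b=0.5, tau=1.0, I=0.3,
                  dt=0.01, T=10.0):
    """Euler integration of a scalar delayed cellular neural network
        x'(t) = -a*x(t) + b*f(x(t - tau)) + I,
    with the standard CNN piecewise-linear activation f.
    Parameters are illustrative; a > |b| plays the role of a
    sufficient stability condition here."""
    f = lambda u: 0.5 * (abs(u + 1.0) - abs(u - 1.0))   # CNN activation
    n_delay = int(round(tau / dt))
    n_steps = int(round(T / dt))
    x = np.empty(n_delay + n_steps + 1)
    x[:n_delay + 1] = x0_history      # constant initial history on [-tau, 0]
    for k in range(n_delay, n_delay + n_steps):
        x[k + 1] = x[k] + dt * (-a * x[k] + b * f(x[k - n_delay]) + I)
    return x[n_delay:]

# Two different initial histories: since a > |b|, both trajectories
# converge exponentially to the same equilibrium (here x* = 0.2).
xa = simulate_dcnn(1.5)
xb = simulate_dcnn(-1.0)
print(abs(xa[-1] - xb[-1]))   # near zero: trajectories have merged
```

The gap between the two trajectories shrinks roughly like e^{-λt} for some λ > 0 determined by a, b, and τ, which is the qualitative content of an exponential-stability bound of this type.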
Funding: Projects (60835005, 90820302) supported by the National Natural Science Foundation of China; Project (2007CB311001) supported by the National Basic Research Program of China.
Abstract: Global exponential stability (which implies convergence and uniqueness) of the classical iterative algorithm is established using heat-equation and energy-integral methods, after embedding the discrete iteration into a continuous flow. The stability condition depends explicitly on the smoothness of the image sequence, the size of the image domain, the value of the regularization parameter, and the discretization step; in particular, as the discretization step approaches zero, stability holds unconditionally. The analysis also clarifies the relations among the iterative algorithm, the original variational formulation, and the PDE system. The regularity of the solution and of natural images is briefly surveyed and discussed. Experimental results validate the theoretical claims on both convergence and exponential stability.
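The continuous embedding described above can be sketched with a minimal analogue: an explicit 1-D heat-flow iteration with a data-fidelity term, whose step size must satisfy the usual explicit-scheme stability bound (dt ≤ h²/2 here). The scheme, parameters, and test signal are illustrative assumptions and need not match the paper's exact algorithm.

```python
import numpy as np

def denoise_iterate(u0, g, lam=1.0, h=1.0, dt=0.2, n_iter=500):
    """Explicit iteration u_{k+1} = u_k + dt*(u_xx + lam*(g - u_k)):
    a discrete heat flow with a data-fidelity term, i.e. the discrete
    iteration viewed as a time-discretized continuous flow.
    Homogeneous Neumann boundary conditions; stability of the explicit
    scheme requires dt <= h**2 / 2 (plus a mild condition from lam)."""
    u = u0.copy()
    for _ in range(n_iter):
        up = np.pad(u, 1, mode='edge')                 # Neumann BC
        lap = (up[2:] - 2.0 * u + up[:-2]) / h**2      # 1-D Laplacian
        u = u + dt * (lap + lam * (g - u))
    return u

# Exponential stability of the iteration: two runs from very different
# initializations contract toward the same fixed point.
rng = np.random.default_rng(0)
g = np.sin(np.linspace(0, np.pi, 64)) + 0.1 * rng.standard_normal(64)
ua = denoise_iterate(np.zeros(64), g)
ub = denoise_iterate(np.ones(64), g)
print(np.max(np.abs(ua - ub)))   # near zero: iteration is a contraction
```

With these parameters the difference between the two runs is multiplied by at most a factor strictly below one per step, so it decays geometrically in the iteration count, which is the discrete counterpart of the exponential stability established in the abstract.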