  • Research Article
  • Open access

Extended LaSalle's Invariance Principle for Full-Range Cellular Neural Networks

Abstract

In several relevant real-time signal processing applications, a cellular neural network (CNN) is required to be convergent, that is, each solution should tend toward some equilibrium point. The paper develops a Lyapunov method, based on a generalized version of LaSalle's invariance principle, for studying convergence and stability of the differential inclusions that model the dynamics of the full-range (FR) model of CNNs. The applicability of the method is demonstrated by obtaining a rigorous proof of convergence for symmetric FR-CNNs. The proof, which is a direct consequence of the fact that a symmetric FR-CNN admits a strict Lyapunov function, is much simpler than the corresponding proof of convergence for symmetric standard CNNs.
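
To make the setting concrete, the following is a minimal sketch, in notation commonly used in the CNN literature (which may differ from the paper's own), of the FR-CNN dynamics as a differential inclusion and of the LaSalle-type argument the abstract refers to; the symbols A, I, V, M, and E below are illustrative placeholders, not the paper's exact definitions.

% FR-CNN dynamics as a differential inclusion: the state of each cell is
% hard-limited to [-1, 1], modeled here by the normal cone N_{[-1,1]^n}(x) to
% the hypercube at the point x (inputs other than the constant bias I are
% omitted for brevity).
\[
  \dot{x}(t) \;\in\; -x(t) + A\,x(t) + I - \mathcal{N}_{[-1,1]^n}\bigl(x(t)\bigr),
  \qquad x(t) \in [-1,1]^n .
\]

% LaSalle-type argument: solutions are automatically bounded, since the state
% is confined to the hypercube.  If V is nonincreasing along every solution of
% the inclusion, each solution approaches the largest invariant set M contained
% in the set where the derivative of V along solutions vanishes.  If V is
% strict, i.e. strictly decreasing along any solution that is not an
% equilibrium, then M is contained in the equilibrium set E, and every solution
% tends toward E:
\[
  \frac{d}{dt}\,V\bigl(x(t)\bigr) \le 0 \ \text{ for all } t \ge 0
  \quad\Longrightarrow\quad
  \operatorname{dist}\bigl(x(t), M\bigr) \to 0,
  \qquad M \subseteq E \ \text{ when } V \text{ is strict.}
\]

Because the right-hand side of the inclusion is set-valued and solutions need not be smooth, the classical LaSalle principle does not apply directly; this is the gap addressed by the extended invariance principle developed in the paper.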

Publisher note

To access the full article, please see PDF.

Author information

Corresponding author

Correspondence to Mauro Di Marco.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution License 2.0 (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Di Marco, M., Forti, M., Grazzini, M. et al. Extended LaSalle's Invariance Principle for Full-Range Cellular Neural Networks. EURASIP J. Adv. Signal Process. 2009, 730968 (2009). https://doi.org/10.1155/2009/730968



  • DOI: https://doi.org/10.1155/2009/730968
