Unconstrained learning of networked nonlinear systems via free parametrization of stable interconnected operators

23 Nov 2023 · Leonardo Massai, Danilo Saccani, Luca Furieri, Giancarlo Ferrari-Trecate

This paper characterizes a new parametrization of nonlinear networked incrementally $L_2$-bounded operators in discrete time. The distinctive novelty is that our parametrization is \emph{free} -- that is, a sparse large-scale operator with bounded incremental $L_2$ gain is obtained for any choice of the real values of our parameters. This property allows one to search for optimal parameters via unconstrained gradient descent, enabling direct applications in large-scale optimal control and system identification. Further, we can embed prior knowledge about the interconnection topology and stability properties of the system directly into the large-scale distributed operator we design. Our approach is highly general in that it can seamlessly encapsulate and interconnect state-of-the-art Neural Network (NN) parametrizations of stable dynamical systems. To demonstrate the effectiveness of this approach, we provide a simulation example showcasing the identification of a networked nonlinear system. The results underscore the superiority of our free parametrizations over standard NN-based identification methods in which prior knowledge of the system topology and local stability properties is not enforced.
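The paper's actual construction is not reproduced here; as a rough, illustrative sketch of what a "free" parametrization means, the snippet below (assuming PyTorch, and using a simple spectral-norm rescaling rather than the paper's method) builds a linear operator whose $L_2$ gain stays below a prescribed bound $\gamma$ for any real values of the underlying parameters, so it can be trained with plain unconstrained gradient descent. The class name `FreeBoundedLinear` and all hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn


class FreeBoundedLinear(nn.Module):
    """Linear map y = W_hat @ x whose L2 gain is below `gamma` for ANY real
    values of the free parameter W (illustrative sketch, not the paper's
    parametrization)."""

    def __init__(self, dim: int, gamma: float = 1.0):
        super().__init__()
        self.W = nn.Parameter(torch.randn(dim, dim))  # unconstrained parameter
        self.gamma = gamma

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Rescale so that ||W_hat||_2 < gamma, hence the map is gamma-Lipschitz
        # (incrementally L2-bounded) no matter what values W takes.
        spec_norm = torch.linalg.matrix_norm(self.W, ord=2)
        W_hat = self.gamma * self.W / (1.0 + spec_norm)
        return x @ W_hat.T


# Stability holds by construction, so plain (unconstrained) gradient descent
# can be used to fit the operator to data.
model = FreeBoundedLinear(dim=4, gamma=0.9)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(32, 4), torch.randn(32, 4)
for _ in range(100):
    opt.zero_grad()
    loss = ((model(x) - y) ** 2).mean()
    loss.backward()
    opt.step()
```

Because the gain bound is enforced by how the weight is rebuilt from the free parameter rather than by a constraint on the optimizer, no projection or barrier terms are needed during training; the paper extends this idea to sparse interconnections of such operators over a prescribed network topology.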
