Open Access
Dimension reduction in recurrent networks by canonicalization
Author(s) -
Lyudmila Grigoryeva,
Juan-Pablo Ortega
Publication year - 2021
Publication title -
Journal of Geometric Mechanics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.511
H-Index - 17
eISSN - 1941-4897
pISSN - 1941-4889
DOI - 10.3934/jgm.2021028
Subject(s) - mathematics, hilbert space, reduction (mathematics), gramian matrix, uniqueness, realization (probability), controllability, forgetting, dimension (graph theory), computer science, pure mathematics, mathematical analysis, eigenvalues and eigenvectors, linguistics, physics, geometry, statistics, philosophy, quantum mechanics
Many recurrent neural network machine learning paradigms can be formulated using state-space representations. The classical notion of canonical state-space realization is adapted in this paper to accommodate semi-infinite inputs so that it can be used as a dimension reduction tool in the recurrent networks setup. The so-called input forgetting property is identified as the key hypothesis that guarantees the existence and uniqueness (up to system isomorphisms) of canonical realizations for causal and time-invariant input/output systems with semi-infinite inputs. Additionally, the notion of optimal reduction coming from the theory of symmetric Hamiltonian systems is implemented in our setup to construct canonical realizations out of input forgetting but not necessarily canonical ones. These two procedures are studied in detail in the framework of linear fading memory input/output systems. Finally, the notion of implicit reduction using reproducing kernel Hilbert spaces (RKHS) is introduced, which makes it possible, for systems with linear readouts, to achieve dimension reduction without the need to actually compute the reduced spaces introduced in the first part of the paper.
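To give a concrete feel for Gramian-based reduction of linear fading memory state-space systems, the sketch below applies standard balanced truncation to a stable discrete-time linear system. This is an illustrative, well-known technique in the same spirit as the canonicalization discussed in the abstract, not the paper's construction; the function names `balanced_truncation` and `psd_sqrt`, the tolerance `rtol`, and the example matrices are our own choices.

```python
# Illustrative sketch: Gramian-based reduction of a stable discrete-time
# linear state-space system x_t = A x_{t-1} + B u_t, y_t = C x_t.
# Standard balanced truncation, NOT the paper's canonicalization procedure.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def psd_sqrt(M):
    """Symmetric square root of a positive semi-definite matrix."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

def balanced_truncation(A, B, C, rtol=1e-6):
    """Reduce (A, B, C) toward a minimal realization by balanced truncation.

    Assumes A is Schur-stable (spectral radius < 1), the linear analogue
    of the fading memory / input forgetting hypothesis.
    """
    # Gramians: P = sum_k A^k B B^T (A^T)^k, Q = sum_k (A^T)^k C^T C A^k.
    P = solve_discrete_lyapunov(A, B @ B.T)      # controllability Gramian
    Q = solve_discrete_lyapunov(A.T, C.T @ C)    # observability Gramian
    Lc, Lo = psd_sqrt(P), psd_sqrt(Q)
    U, s, Vt = np.linalg.svd(Lo @ Lc)            # s = Hankel singular values
    r = int(np.sum(s > rtol * s[0]))             # retained state dimension
    Sr = np.diag(1.0 / np.sqrt(s[:r]))
    T = Lc @ Vt[:r].T @ Sr                       # reduction map, n x r
    W = Lo @ U[:, :r] @ Sr                       # left inverse: W.T @ T = I_r
    return W.T @ A @ T, W.T @ B, C @ T, s

# Example: a 3-dimensional system whose third state is nearly unreachable
# and hence redundant for the input/output behaviour.
A = np.diag([0.9, 0.5, 0.7])
B = np.array([[1.0], [1.0], [1e-7]])
C = np.array([[1.0, 1.0, 1.0]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C)
print("Hankel singular values:", hsv)        # one value is orders smaller
print("reduced state dimension:", Ar.shape[0])  # 2
```

The reduced triple (Ar, Br, Cr) realizes essentially the same input/output map with a smaller state space, which is the basic dimension-reduction idea that the paper extends to nonlinear recurrent systems with semi-infinite inputs.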
