Dimension reduction in recurrent networks by canonicalization
Communications in Analysis and Mechanics (IF 0.8) Pub Date: 2021-11-10, DOI: 10.3934/jgm.2021028
Lyudmila Grigoryeva, Juan-Pablo Ortega

Many recurrent neural network machine learning paradigms can be formulated using state-space representations. The classical notion of canonical state-space realization is adapted in this paper to accommodate semi-infinite inputs so that it can be used as a dimension reduction tool in the recurrent networks setup. The so-called input forgetting property is identified as the key hypothesis that guarantees the existence and uniqueness (up to system isomorphisms) of canonical realizations for causal and time-invariant input/output systems with semi-infinite inputs. Additionally, the notion of optimal reduction coming from the theory of symmetric Hamiltonian systems is implemented in our setup to construct canonical realizations out of input-forgetting but not necessarily canonical ones. These two procedures are studied in detail in the framework of linear fading memory input/output systems. Finally, the notion of implicit reduction using reproducing kernel Hilbert spaces (RKHS) is introduced, which allows, for systems with linear readouts, dimension reduction to be achieved without the need to explicitly compute the reduced spaces introduced in the first part of the paper.
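The input forgetting property at the core of the paper can be illustrated numerically. The following is a minimal sketch, not the authors' construction: it assumes a linear state map with spectral norm below one (a standard sufficient condition for fading memory) and a linear readout, and checks that two semi-infinite inputs agreeing on the recent past drive the system to nearly identical outputs.

```python
import numpy as np

# Hypothetical state-space system x_t = A x_{t-1} + B u_t, y_t = w @ x_t,
# driven by a semi-infinite input truncated to T steps.
rng = np.random.default_rng(0)
n, T = 50, 200                       # state dimension, truncation length

A = rng.standard_normal((n, n))
A *= 0.9 / np.linalg.norm(A, 2)      # spectral norm < 1 => contractive state map
B = rng.standard_normal(n)
w = rng.standard_normal(n)           # linear readout h(x) = w @ x

def run(u):
    """Iterate x_t = A x_{t-1} + B u_t from x_0 = 0; return the final state."""
    x = np.zeros(n)
    for ut in u:
        x = A @ x + B * ut
    return x

u = rng.standard_normal(T)
u_perturbed = u.copy()
u_perturbed[:T // 2] += rng.standard_normal(T // 2)   # change only the remote past

# Input forgetting: inputs that agree on the recent past yield
# nearly coinciding states, hence nearly coinciding readouts.
print(abs(w @ run(u) - w @ run(u_perturbed)))
```

Because the state map is a contraction, the influence of an input T/2 steps in the past is damped by roughly 0.9^100 (about 3e-5), which is the order of magnitude of the printed difference.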

Updated: 2021-12-06