On size-independent sample complexity of ReLU networks
Information Processing Letters (IF 0.5), Pub Date: 2024-02-07, DOI: 10.1016/j.ipl.2024.106482
Mark Sellke

We study the sample complexity of learning ReLU neural networks from the point of view of generalization. Given norm constraints on the weight matrices, a common approach is to estimate the Rademacher complexity of the associated function class. Previous work obtained a bound independent of the network size (scaling with a product of Frobenius norms), except for a factor of the square-root depth. We give a refinement which often has no explicit depth-dependence at all.
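To illustrate the shape of these bounds, here is a schematic sketch with notation introduced for this purpose (not taken from the paper): $L$ is the depth, $W_1,\dots,W_L$ the weight matrices, $B$ a bound on the input norm, and $n$ the sample size. Up to constants and logarithmic factors, the earlier size-independent bound on the Rademacher complexity reads

$$\mathcal{R}_n(\mathcal{F}) \;\lesssim\; \frac{B \,\sqrt{L}\, \prod_{i=1}^{L} \|W_i\|_F}{\sqrt{n}},$$

and the refinement described in the abstract removes the explicit $\sqrt{L}$ factor in many regimes, leaving only the product of Frobenius norms.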

Updated: 2024-02-07