Two-layer networks with the $$\text{ReLU}^k$$ activation function: Barron spaces and derivative approximation
Numerische Mathematik (IF 2.1), Pub Date: 2023-11-23, DOI: 10.1007/s00211-023-01384-6
Yuanyuan Li, Shuai Lu, Peter Mathé, Sergei V. Pereverzev

We investigate the use of two-layer networks with the rectified power unit, known as the \(\text{ReLU}^k\) activation function, for function and derivative approximation. By extending and calibrating the corresponding Barron space, we show that two-layer networks with the \(\text{ReLU}^k\) activation function are well suited to simultaneously approximate an unknown function and its derivatives. When the measurements are noisy, we propose a Tikhonov-type regularization method and provide error bounds when the regularization parameter is chosen appropriately. Several numerical examples demonstrate the efficiency of the proposed approach.
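For concreteness, the rectified power unit and the two-layer networks discussed in the abstract take the following standard form (the notation \(a_i, w_i, b_i\) for the outer coefficients and inner weights and biases is ours, chosen to match the abstract's terminology):

$$\text{ReLU}^k(x) = \max(0, x)^k, \qquad f_n(x) = \sum_{i=1}^{n} a_i\, \text{ReLU}^k(w_i \cdot x + b_i).$$

Since \(\text{ReLU}^k\) is \((k-1)\)-times continuously differentiable, the derivatives of \(f_n\) are available in closed form, which is what makes simultaneous function and derivative approximation feasible. Below is a minimal sketch of a Tikhonov-type regularized fit in this spirit: random inner weights, ridge regression on the outer coefficients. It is an illustration only, not the authors' algorithm; the target function, the hidden-layer width n, the power k, and the regularization parameter lam are all placeholder choices.

import numpy as np

def relu_k(z, k):
    # Rectified power unit: max(0, z)^k
    return np.maximum(z, 0.0) ** k

rng = np.random.default_rng(0)
k, n = 3, 200                      # power and hidden-layer width (placeholders)

# Noisy samples of an unknown 1-D target function (placeholder example)
x = np.linspace(-1.0, 1.0, 100)
y = np.sin(np.pi * x) + 0.05 * rng.standard_normal(x.size)

# Random inner weights and biases; only the outer coefficients are fitted
w = rng.standard_normal(n)
b = rng.uniform(-1.0, 1.0, n)
Z = np.outer(x, w) + b             # pre-activations, shape (100, n)
Phi = relu_k(Z, k)                 # design matrix of ReLU^k features

# Tikhonov-type (ridge) regularized least squares for the outer layer
lam = 1e-3
a = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n), Phi.T @ y)

# ReLU^k is (k-1)-times continuously differentiable, so the fitted
# network's derivative is also available in closed form
dPhi = k * relu_k(Z, k - 1) * w    # d/dx of each feature, broadcast over columns
f_fit, df_fit = Phi @ a, dPhi @ a  # fitted function and its derivative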



Updated: 2023-11-26