A method to improve the computational performance of nonlinear all-optical diffractive deep neural network model
International Journal of Optomechatronics (IF 5.5) Pub Date: 2023-06-28, DOI: 10.1080/15599612.2023.2209624
Yichen Sun 1,2, Mingli Dong 2, Mingxin Yu 2, Lidan Lu 2, Shengjun Liang 2, Jiabin Xia 3, Lianqing Zhu 2

Abstract

To further improve the computational performance of the diffractive deep neural network (D2NN) model, we use the ReLU function to limit the phase parameters, which effectively mitigates the vanishing-gradient problem that occurs in the model. We add various commonly used nonlinear activation functions to the hidden layers of the model and establish the ReLU phase-limit nonlinear diffractive deep neural network (ReLU phase-limit N-D2NN) model. We evaluate the model by comparing the performance of the different nonlinear activation functions, using the confusion matrix and classification accuracy as evaluation metrics. The numerical simulation results show that the model achieves better classification performance on both the MNIST and Fashion-MNIST datasets. In particular, the highest accuracy is obtained by the ReLU phase-limit N-D2NN model whose hidden layers use PReLU: 98.38% on MNIST and 90.14% on Fashion-MNIST. This paper provides a theoretical basis for applying nonlinear D2NN systems to natural scenes.
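The method combines two ingredients: a ReLU constraint on the learnable phase parameters of each diffractive layer, and a conventional nonlinear activation function (with PReLU performing best) inserted in the hidden layers. Below is a minimal PyTorch sketch of what one such layer could look like. The angular-spectrum propagation step, the way the nonlinearity is applied to the complex field, and all names and parameter choices are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of one ReLU phase-limit nonlinear diffractive layer.
# Assumptions (not from the paper): PyTorch, angular-spectrum propagation,
# and the nonlinearity applied separately to the real/imaginary field parts.
import torch
import torch.nn as nn


def angular_spectrum(field: torch.Tensor, wavelength: float, dx: float, z: float) -> torch.Tensor:
    """Free-space propagation of a complex field over distance z (placeholder model)."""
    fx = torch.fft.fftfreq(field.shape[-1], d=dx)
    fy = torch.fft.fftfreq(field.shape[-2], d=dx)
    FY, FX = torch.meshgrid(fy, fx, indexing="ij")
    arg = torch.clamp(1.0 / wavelength**2 - FX**2 - FY**2, min=0.0)  # drop evanescent waves
    H = torch.exp(2j * torch.pi * z * torch.sqrt(arg))               # transfer function
    return torch.fft.ifft2(torch.fft.fft2(field) * H)


class ReLUPhaseLimitLayer(nn.Module):
    """One diffractive layer: ReLU-limited phase mask + propagation + nonlinearity."""

    def __init__(self, size: int, wavelength: float, dx: float, z: float):
        super().__init__()
        self.phase_raw = nn.Parameter(torch.rand(size, size))  # unconstrained phase parameters
        self.nonlinearity = nn.PReLU()  # one of the activations compared in the paper
        self.wavelength, self.dx, self.z = wavelength, dx, z

    def forward(self, field: torch.Tensor) -> torch.Tensor:
        phase = torch.relu(self.phase_raw)        # the ReLU phase limit
        field = field * torch.exp(1j * phase)     # phase-only modulation
        field = angular_spectrum(field, self.wavelength, self.dx, self.z)
        # Apply the nonlinearity to real and imaginary parts (an assumption;
        # the paper does not specify this detail in the abstract).
        return torch.complex(self.nonlinearity(field.real),
                             self.nonlinearity(field.imag))
```

Stacking several of these layers and reading out the output intensity over ten detector regions would yield the kind of all-optical classifier evaluated on MNIST and Fashion-MNIST.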




Updated: 2023-07-02