A hierarchical feature-logit-based knowledge distillation scheme for internal defect detection of magnetic tiles
Advanced Engineering Informatics (IF 8.8), Pub Date: 2024-04-08, DOI: 10.1016/j.aei.2024.102526
Luofeng Xie, Xuexiang Cen, Houhong Lu, Guofu Yin, Ming Yin

Magnetic tiles are key components of many electrical and mechanical systems in modern industry, and detecting their internal defects is essential for maintaining system performance and ensuring operational safety. Recently, deep learning has emerged as a leading approach in pattern recognition owing to its strong ability to extract latent information. In practical scenarios, there is growing demand for embedding deep learning algorithms in edge devices to enable real-time decision-making and reduce data communication costs. However, a powerful yet highly complex deep learning model is impractical to deploy on edge devices with limited memory and computational power. To overcome this issue, we propose a novel knowledge distillation method, termed hierarchical feature-logit-based knowledge distillation, to compress deep neural networks for internal defect detection of magnetic tiles. Specifically, it comprises one-to-all feature matching for distilling disparate feature knowledge, logit separation for distilling relevant and irrelevant logit knowledge, and a parameter value prediction network for seamlessly fusing feature and logit knowledge distillation. In addition, a hierarchical distillation mechanism is designed to address the capacity gap between the teacher and the student. Extensive experimental results demonstrate the effectiveness of the proposed model. The code is available at .
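
To make the components named in the abstract concrete, the following PyTorch sketch shows one plausible form of the loss terms: a one-to-all feature matching loss, a logit loss separated into relevant (target-class) and irrelevant (non-target) parts, and a small network that predicts fusion weights for the two terms. The function names, tensor shapes, temperature, and the exact matching and separation formulas are illustrative assumptions, not the authors' implementation; the hierarchical distillation mechanism for bridging the teacher-student capacity gap is not shown.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def one_to_all_feature_loss(student_feat, teacher_feats, projections):
    # Match a single student feature map against every teacher feature map
    # (an assumed reading of "one-to-all feature matching").
    loss = student_feat.new_zeros(())
    for t_feat, proj in zip(teacher_feats, projections):
        s = proj(student_feat)                           # align channel dimension
        s = F.adaptive_avg_pool2d(s, t_feat.shape[-2:])  # align spatial resolution
        loss = loss + F.mse_loss(s, t_feat.detach())
    return loss / len(teacher_feats)


def separated_logit_loss(student_logits, teacher_logits, labels, T=4.0):
    # Split the softened predictions into a "relevant" (target-class) part and an
    # "irrelevant" (non-target) part and distill them separately; the exact
    # formulation here is an assumption in the spirit of decoupled logit distillation.
    num_classes = student_logits.size(1)
    gt_mask = F.one_hot(labels, num_classes).float()

    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)

    # Relevant knowledge: binary target vs. non-target probability mass.
    s_bin = torch.stack([(p_s * gt_mask).sum(1), (p_s * (1 - gt_mask)).sum(1)], dim=1)
    t_bin = torch.stack([(p_t * gt_mask).sum(1), (p_t * (1 - gt_mask)).sum(1)], dim=1)
    relevant = F.kl_div(s_bin.clamp_min(1e-8).log(), t_bin, reduction="batchmean")

    # Irrelevant knowledge: distributions renormalized over non-target classes
    # (the large negative offset suppresses the target class before softmax).
    s_non = F.log_softmax(student_logits / T - 1000.0 * gt_mask, dim=1)
    t_non = F.softmax(teacher_logits / T - 1000.0 * gt_mask, dim=1)
    irrelevant = F.kl_div(s_non, t_non, reduction="batchmean")

    return (relevant + irrelevant) * (T ** 2)


class FusionWeightPredictor(nn.Module):
    # A tiny stand-in for the parameter value prediction network: it maps the
    # current (detached) loss magnitudes to normalized fusion weights.
    def __init__(self, n_losses=2, hidden=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_losses, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_losses))

    def forward(self, loss_values):
        return torch.softmax(self.net(loss_values), dim=-1)


if __name__ == "__main__":
    # Dummy tensors standing in for a small defect-classification batch.
    labels = torch.randint(0, 4, (8,))
    s_logits, t_logits = torch.randn(8, 4), torch.randn(8, 4)
    s_feat = torch.randn(8, 32, 16, 16)
    t_feats = [torch.randn(8, 64, 16, 16), torch.randn(8, 128, 8, 8)]
    projections = nn.ModuleList([nn.Conv2d(32, 64, 1), nn.Conv2d(32, 128, 1)])
    fusion = FusionWeightPredictor()

    feat_loss = one_to_all_feature_loss(s_feat, t_feats, projections)
    logit_loss = separated_logit_loss(s_logits, t_logits, labels)
    w = fusion(torch.stack([feat_loss.detach(), logit_loss.detach()]))
    total = w[0] * feat_loss + w[1] * logit_loss + F.cross_entropy(s_logits, labels)
    print(float(total))
```

In this sketch the predicted weights are driven by detached loss values, so the fusion network learns only how to balance the two distillation terms; whether the paper conditions the prediction network on losses, features, or other statistics is not specified in the abstract.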
