PNSP: Overcoming catastrophic forgetting using Primary Null Space Projection in continual learning
Pattern Recognition Letters (IF 5.1) Pub Date: 2024-02-15, DOI: 10.1016/j.patrec.2024.02.009
DaiLiang Zhou, YongHong Song

Continual Learning (CL) plays a crucial role in enhancing learning performance on both new and previous tasks in continuous data streams, thus contributing to the advancement of cognitive computing. However, CL faces a fundamental challenge known as the stability-plasticity dilemma. In this research, we present an innovative and effective CL algorithm called Primary Null Space Projection (PNSP) to strike a balance between network plasticity and stability. PNSP consists of three main components. First, it leverages the NSP-LRA algorithm to project the gradients of network parameters onto a meticulously designed null space of previous tasks. NSP-LRA harnesses high-dimensional geometric information extracted from the feature covariance matrix through a low-rank approximation algorithm to obtain the null-space basis dynamically. This process constructs an innovative null space and ensures that the orthonormal bases are continuously updated to accommodate changes in the input data. Second, we propose a Consistency-guided Task-specific Feature Learning (CTFL) mechanism to tackle catastrophic forgetting and facilitate continual learning. CTFL achieves this by aligning feature vectors and maintaining consistent feature-learning directions, thereby preventing the loss of previously acquired knowledge. Lastly, we introduce Label Guided Self-Distillation (LGSD), a technique that uses true labels to guide the distillation process and incorporates a dynamic temperature mechanism to enhance performance. To evaluate the effectiveness of our proposed method, we conduct experiments on the CIFAR100 and TinyImageNet datasets. The results demonstrate significant improvements over state-of-the-art methods. We have made the implementation code of our approach available for reference.
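The abstract does not give implementation details of NSP-LRA; as a rough illustration of the general idea, projecting gradients onto the approximate null space of a feature covariance matrix can be sketched as follows. All function names, the energy threshold, and the eigendecomposition-based rank selection are hypothetical choices for this sketch, not the paper's actual algorithm.

```python
import numpy as np

def null_space_projector(features, energy_threshold=0.99):
    """Build a projector onto the approximate null space of the
    feature covariance of previous tasks.

    features: (n_samples, d) array of activations from previous tasks.
    energy_threshold: fraction of spectral energy treated as the
    "primary" subspace; directions beyond it form the null space.
    """
    n = features.shape[0]
    cov = features.T @ features / n          # (d, d) feature covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]        # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Low-rank approximation: keep the leading directions that explain
    # `energy_threshold` of the total energy; the rest span the null space.
    cum = np.cumsum(eigvals) / np.sum(eigvals)
    k = int(np.searchsorted(cum, energy_threshold)) + 1
    U_null = eigvecs[:, k:]                  # approximate null-space basis
    return U_null @ U_null.T                 # (d, d) orthogonal projector

def project_gradient(grad, projector):
    """Project a gradient (rows in feature space) so that the update
    minimally disturbs features of previous tasks."""
    return grad @ projector
```

Because the projector is idempotent and annihilates the dominant feature directions, updates restricted to it change previous-task activations only within the discarded (low-energy) subspace.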

Updated: 2024-02-15