Explaining deep multi-class time series classifiers
Knowledge and Information Systems ( IF 2.7 ) Pub Date : 2024-03-04 , DOI: 10.1007/s10115-024-02073-y
Ramesh Doddaiah , Prathyush S. Parvatharaju , Elke Rundensteiner , Thomas Hartvigsen

Explainability helps users trust deep learning solutions for time series classification. However, existing explainability methods for multi-class time series classifiers focus on one class at a time, ignoring relationships between the classes. Instead, when a classifier is choosing between many classes, an effective explanation must show what sets the chosen class apart from the rest. We formalize this notion, studying the open problem of class-specific explainability for deep time series classifiers, a challenging and impactful problem setting. We design a novel explainability method, DEMUX, which learns saliency maps for explaining deep multi-class time series classifiers by adaptively ensuring that its explanation spotlights the regions in an input time series that the model uses specifically for its predicted class. DEMUX adopts a gradient-based approach composed of three interdependent modules that combine to generate consistent, class-specific saliency maps that remain faithful to the classifier's behavior yet are easily understood by end users. We demonstrate that DEMUX outperforms nine state-of-the-art alternatives on seven popular datasets when explaining two types of deep time series classifiers. We analyze runtime performance, show the impacts of hyperparameter selection, and present a detailed study of perturbation methods for time series. Further, through a case study, we demonstrate that DEMUX's explanations indeed highlight what separates the predicted class from the others in the eyes of the classifier.
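The abstract does not detail DEMUX's three modules, but the core idea it describes, a saliency map that is *class-specific* rather than per-class in isolation, can be illustrated with a minimal sketch. The snippet below is not DEMUX; it uses a toy linear softmax classifier (so the gradient is available in closed form) and computes a contrastive saliency over a time series: the gradient of the margin between the predicted class's logit and its strongest competitor's logit with respect to the input. All names (`W`, `b`, `logits`) are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
T, C = 50, 4  # time steps, number of classes

# Toy linear softmax classifier over a univariate series (stand-in for a deep model).
W = rng.normal(size=(C, T))
b = rng.normal(size=C)

def logits(x):
    """Class scores for input series x of shape (T,)."""
    return W @ x + b

x = rng.normal(size=T)            # an input time series
z = logits(x)
pred = int(np.argmax(z))          # predicted class
runner_up = int(np.argsort(z)[-2])  # strongest competing class

# Class-specific (contrastive) saliency: gradient of the margin
# z_pred - z_runner_up with respect to the input. For this linear model
# that gradient is simply the difference of the two weight rows.
saliency = W[pred] - W[runner_up]
saliency = np.abs(saliency) / np.abs(saliency).max()  # normalize to [0, 1]
```

Time steps with saliency near 1 are those that most separate the predicted class from its nearest rival, which is the distinction the paper draws against single-class saliency (which would use the gradient of `z[pred]` alone and ignore the other classes). For a deep classifier, the closed-form gradient would be replaced by automatic differentiation.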




Updated: 2024-03-05