SPACE: Senti-Prompt As Classifying Embedding for sentiment analysis
Pattern Recognition Letters (IF 5.1), Pub Date: 2024-02-28, DOI: 10.1016/j.patrec.2024.02.022
Jinyoung Kim, Youngjoong Ko

In natural language processing, sentiment analysis is commonly addressed with a pre-training and fine-tuning paradigm in which a pre-trained language model is combined with a classifier. Recently, numerous studies have applied prompts not only to downstream generation tasks but also to classification tasks. However, to fully exploit the advantages of prompts and to incorporate the context-dependent meaning of class representations into them, the prompts need to be learned so that they resemble the sentiment representations of each class. To achieve this, we introduce a novel method for learning soft prompts during fine-tuning. In this method, the class prompts are initialized with sentiment-related embeddings and trained through a denoising task that replaces them with masked tokens, in the manner of the conventional masked language model (MLM) objective. Furthermore, a novel attention pattern is designed to tune the attention between class prompts effectively. Experiments on four common datasets demonstrate that our approach outperforms state-of-the-art models, achieving superior performance on sentiment analysis.
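A rough illustration of the idea described in the abstract could look like the sketch below. It is an assumption-laden reconstruction, not the authors' implementation: the backbone ("bert-base-uncased"), the seed words used to initialize the class prompts, the masking scheme, and the denoising loss are all illustrative choices, and the paper's custom attention pattern between class prompts is omitted.

```python
# Minimal sketch (not the authors' released code) of the idea in the abstract:
# soft class prompts are initialized from sentiment-related word embeddings,
# prepended to the input, and trained with an MLM-style denoising step in which
# prompt slots may be replaced by [MASK] and must be reconstructed.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
mlm = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
emb = mlm.get_input_embeddings()                            # shared word-embedding table

# Soft class prompts, one per sentiment class, initialized from seed words (assumption).
seed_words = ["negative", "positive"]
seed_ids = tokenizer.convert_tokens_to_ids(seed_words)
class_prompts = nn.Parameter(emb.weight[seed_ids].clone())  # (C, H), trainable

batch = tokenizer(["the movie was wonderful"], return_tensors="pt")
tok_embeds = emb(batch["input_ids"])                        # (B, L, H)
B, C = tok_embeds.size(0), class_prompts.size(0)

# Denoising: randomly corrupt prompt slots with the [MASK] embedding.
prompts = class_prompts.unsqueeze(0).expand(B, -1, -1)      # (B, C, H)
mask_embed = emb.weight[tokenizer.mask_token_id]
masked = torch.rand(B, C, 1) < 0.5                          # which slots to corrupt
prompts = torch.where(masked, mask_embed.expand_as(prompts), prompts)

# Prepend the class prompts to the token embeddings and run the masked LM.
inputs_embeds = torch.cat([prompts, tok_embeds], dim=1)
attn = torch.cat(
    [torch.ones(B, C, dtype=batch["attention_mask"].dtype), batch["attention_mask"]],
    dim=1,
)
logits = mlm(inputs_embeds=inputs_embeds, attention_mask=attn).logits[:, :C, :]

# Reconstruct the sentiment seed words at the prompt positions. For simplicity the
# loss covers every prompt slot; the paper's exact masking and loss are not given
# in the abstract, so this objective is only an illustration.
targets = torch.tensor(seed_ids).unsqueeze(0).expand(B, -1)  # (B, C)
denoise_loss = nn.functional.cross_entropy(
    logits.reshape(-1, logits.size(-1)), targets.reshape(-1)
)
```

In this sketch the denoising loss and the trainable prompt embeddings share the MLM head, so the prompts drift toward contextualized sentiment representations while remaining tied to their seed words; how the original method balances this against the classification objective is not specified in the abstract.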

Updated: 2024-02-28