Adaptive boosting for ordinal target variables using neural networks
Statistical Analysis and Data Mining (IF 1.3), Pub Date: 2023-01-26, DOI: 10.1002/sam.11613
Insung Um, Geonseok Lee, Kichun Lee

Boosting has proven its effectiveness, mainly in classification problems, by increasing the diversity of base classifiers. In practice, target variables in classification are often derived from numerical variables and therefore carry ordinal information. However, existing boosting algorithms for classification cannot exploit such ordinal target variables, resulting in suboptimal solutions. In this paper, we propose a novel ordinal-encoding adaptive boosting (AdaBoost) algorithm that uses a multi-dimensional encoding scheme for ordinal target variables. Extending the original binary-class AdaBoost, the proposed algorithm is equipped with a multi-class exponential loss function. We show that it achieves the Bayes classifier and admits a forward stagewise additive modeling formulation. We demonstrate the performance of the proposed algorithm using a neural network as the base learner. Our experiments show that it outperforms existing boosting algorithms on various ordinal datasets.
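The abstract describes a multi-class exponential-loss extension of AdaBoost with neural-network base learners, but it does not spell out the paper's multi-dimensional ordinal encoding. The sketch below is therefore only a minimal illustration of the SAMME-style boosting loop such an algorithm builds on, under stated assumptions: ordinal labels are integer-coded 0, 1, ..., K-1, and scikit-learn's MLPClassifier stands in as the neural-network base learner, with weighted resampling used in place of per-sample weights (which MLPClassifier.fit does not accept). Function names such as fit_ordinal_adaboost are hypothetical, not from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def fit_ordinal_adaboost(X, y, n_rounds=10, seed=0):
    """SAMME-style multi-class AdaBoost with a small MLP base learner (sketch).

    y is assumed to hold integer-coded ordinal labels 0, 1, ..., K-1.
    """
    rng = np.random.default_rng(seed)
    X, y = np.asarray(X), np.asarray(y)
    n, K = len(y), len(np.unique(y))
    w = np.full(n, 1.0 / n)                       # instance weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        # MLPClassifier.fit has no sample_weight, so emulate it by weighted resampling.
        idx = rng.choice(n, size=n, replace=True, p=w)
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                            random_state=seed).fit(X[idx], y[idx])
        pred = clf.predict(X)
        miss = (pred != y).astype(float)
        err = np.clip(np.dot(w, miss), 1e-10, 1 - 1e-10)
        alpha = np.log((1 - err) / err) + np.log(K - 1)   # SAMME learner weight
        if alpha <= 0:                            # weaker than random guessing: stop
            break
        w *= np.exp(alpha * miss)                 # up-weight misclassified points
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    return learners, alphas, K

def predict_ordinal_adaboost(learners, alphas, X, K):
    """Weighted plurality vote over the boosted MLPs."""
    X = np.asarray(X)
    votes = np.zeros((len(X), K))
    for clf, a in zip(learners, alphas):
        votes[np.arange(len(X)), clf.predict(X)] += a
    return votes.argmax(axis=1)
```

This sketch treats the classes as plain categories; the paper's contribution, a multi-dimensional encoding with a multi-class exponential loss, would replace the plurality vote with a decision rule that accounts for how far a predicted rank is from the true one.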
