Information criteria for structured parameter selection in high-dimensional tree and graph models
Digital Signal Processing (IF 2.9), Pub Date: 2024-02-27, DOI: 10.1016/j.dsp.2024.104437
Maarten Jansen

Parameter selection in high-dimensional models is typically fine-tuned in a way that keeps the (relative) number of false positives under control. This is because otherwise the few true positives may be dominated by the many possible false positives. This happens, for instance, when the selection follows from a naive optimisation of an information criterion, such as AIC or Mallows's C_p. It can be argued that the overestimation of the model comes from the optimisation process itself changing the statistics of the selected variables, in such a way that the information criterion no longer reflects the true divergence between the selected model and the data generating process. With the lasso, the overestimation can also be linked to the shrinkage estimator, which makes the selection too tolerant of false positives. For these reasons, this paper works on refined information criteria, carefully balancing false positives and false negatives, for use with estimators without shrinkage. In particular, the paper develops corrected Mallows's C_p criteria for structured selection in trees and graphical models.

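To make the overestimation concrete, the following is a minimal numerical sketch in Python (with illustrative values for n, k and sigma; it is not the corrected criterion developed in the paper). In a sequence model y_i = beta_i + eps_i with known noise level sigma, naive minimisation of Mallows's C_p over all coordinate subsets reduces to keeping coordinate i whenever y_i^2 > 2*sigma^2, because including i lowers the residual sum of squares by y_i^2 at a penalty cost of 2*sigma^2. Among the true zeros, roughly 15.7% pass that threshold, so with thousands of candidates the few true positives are dominated by false positives in absolute terms, even though the relative false positive rate looks small.

# Minimal sketch (not the paper's method): naive C_p selection in a
# known-variance sequence model, illustrating the overestimation.
import numpy as np

rng = np.random.default_rng(0)
n, k, sigma = 10_000, 20, 1.0            # illustrative sizes: n coefficients, k true non-zeros
beta = np.zeros(n)
beta[:k] = 5.0                            # a few large true effects
y = beta + sigma * rng.standard_normal(n)

# Naive C_p in the orthogonal/sequence model: keep coordinate i iff y_i^2 > 2*sigma^2.
selected = y**2 > 2 * sigma**2

true_pos = int(np.sum(selected[:k]))
false_pos = int(np.sum(selected[k:]))
print(f"true positives:  {true_pos} / {k}")
# For a true zero, P(|y_i| > sqrt(2)*sigma) = P(|Z| > sqrt(2)) ~ 0.157
print(f"false positives: {false_pos} (expected about {0.157 * (n - k):.0f})")

As the abstract argues, a refined criterion must account for the fact that the minimisation was taken over many candidate subsets, which biases the criterion of the selected model downward.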