Research Spotlights
SIAM Review (IF 10.2), Pub Date: 2024-02-08, DOI: 10.1137/24n975839
Stefan M. Wild

SIAM Review, Volume 66, Issue 1, Page 89-89, February 2024.
As modeling, simulation, and data-driven capabilities continue to advance and be adopted for an ever-expanding set of applications and downstream tasks, there has been an increased need to quantify the uncertainty in the resulting predictions. In "Easy Uncertainty Quantification (EasyUQ): Generating Predictive Distributions from Single-Valued Model Output," authors Eva-Maria Walz, Alexander Henzi, Johanna Ziegel, and Tilmann Gneiting provide a methodology for moving beyond deterministic scalar-valued predictions to obtain predictive statistical distributions for these predictions. The approach relies on training data of model output-observation pairs of scalars, and hence does not require access to higher-dimensional inputs or latent variables. The authors use numerical weather prediction as a particular example, where one can obtain repeated forecasts, and corresponding observations, of temperatures at a specific location. Given a predicted temperature, the EasyUQ approach provides a nonparametric distribution of temperatures around this value. EasyUQ uses the training data to effectively minimize an empirical score subject to a stochastic monotonicity constraint, which ensures that the predictive distributions become stochastically larger as the model output grows. In doing so, the approach inherits the theoretical optimality and consistency properties enjoyed by so-called isotonic distributional regression methods. The authors emphasize that the basic version of EasyUQ does not require elaborate hyperparameter tuning. They also introduce a more sophisticated version that relies on kernel smoothing to yield predictive probability densities while preserving key properties of the basic version. The paper demonstrates how EasyUQ compares with the standard technique of applying a Gaussian error distribution to a deterministic forecast, as well as how EasyUQ can be used to obtain uncertainty estimates for artificial neural network outputs. The approach will be of particular interest in settings where inputs or other latent variables are unreliable or unavailable, since it offers a straightforward yet statistically principled and computationally efficient way of working only with outputs and observations.
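To make the idea concrete, the sketch below illustrates isotonic distributional regression on output-observation pairs, the principle underlying basic EasyUQ: for each threshold z, the conditional probability P(Y <= z | model output x) is estimated by an antitonic (decreasing) isotonic regression of the indicators 1{y_i <= z} on the x_i, which enforces the stochastic monotonicity constraint. This is an illustrative from-scratch approximation using scikit-learn's IsotonicRegression, not the authors' reference implementation; the function names fit_easyuq and predict_cdf and the toy temperature data are hypothetical.

import numpy as np
from sklearn.isotonic import IsotonicRegression

def fit_easyuq(x_train, y_train):
    # Hypothetical helper: for each threshold z (unique observed y), fit a
    # decreasing isotonic regression of the indicators 1{y_i <= z} on the
    # model outputs x_i, so larger outputs yield stochastically larger
    # predictive distributions.
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    thresholds = np.unique(y_train)
    cdf_models = []
    for z in thresholds:
        indicator = (y_train <= z).astype(float)
        iso = IsotonicRegression(increasing=False, y_min=0.0, y_max=1.0,
                                 out_of_bounds="clip")
        iso.fit(x_train, indicator)
        cdf_models.append(iso)
    return thresholds, cdf_models

def predict_cdf(x_new, thresholds, cdf_models):
    # Predictive CDF for a single new model output x_new, evaluated at the
    # training thresholds: a nonparametric step-function distribution.
    probs = np.array([m.predict([x_new])[0] for m in cdf_models])
    # Each threshold was fit separately; a cumulative max guards against
    # tiny numerical violations of monotonicity in z.
    return np.maximum.accumulate(probs)

# Toy usage with simulated data: deterministic temperature forecasts x and
# observations y with additive error.
rng = np.random.default_rng(0)
x = rng.uniform(-5, 30, size=200)
y = x + rng.normal(0, 2, size=200)
z, models = fit_easyuq(x, y)
cdf = predict_cdf(10.0, z, models)   # predictive CDF given a forecast of 10.0
print(cdf[:5])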


Updated: 2024-02-08