A FULLY BAYESIAN GRADIENT-FREE SUPERVISED DIMENSION REDUCTION METHOD USING GAUSSIAN PROCESSES
International Journal for Uncertainty Quantification (IF 1.7), Pub Date: 2022-01-01, DOI: 10.1615/int.j.uncertaintyquantification.2021035621
Raphaël Gautier, Piyush Pandita, Sayan Ghosh, Dimitri Mavris

Modern-day engineering problems are ubiquitously characterized by sophisticated computer codes that map parameters or inputs to an underlying physical process. In other situations, experimental setups are used to model the physical process in a laboratory, ensuring high precision while being costly in materials and logistics. In both scenarios, only a limited amount of data can be generated by querying the expensive information source at a finite number of inputs or designs. This problem is compounded further in the presence of a high-dimensional input space. State-of-the-art parameter space dimension reduction methods, such as active subspace, aim to identify a subspace of the original input space that is sufficient to explain the output response. These methods are restricted by their reliance on gradient evaluations or copious data, making them inadequate for expensive problems without direct access to gradients. The proposed methodology is gradient-free and fully Bayesian, as it quantifies uncertainty in both the low-dimensional subspace and the surrogate model parameters. This enables a full quantification of epistemic uncertainty and robustness to limited data availability. It is validated on multiple datasets from engineering and science and compared against two other state-of-the-art methods on four aspects: (a) recovery of the active subspace, (b) deterministic prediction accuracy, (c) probabilistic prediction accuracy, and (d) training time. The comparison shows that the proposed method improves active subspace recovery and predictive accuracy, in both the deterministic and probabilistic sense, when only a few model observations are available for training, at the cost of increased training time.
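As a rough illustration of the general idea described in the abstract (not the authors' algorithm), the sketch below jointly fits a low-dimensional projection W and a Gaussian-process surrogate on the projected inputs z = XW, choosing W by maximizing the GP marginal likelihood so that no gradients of the expensive model are needed. It is a minimal, assumption-laden example: it uses scikit-learn, a toy test function, a point estimate of W rather than the paper's fully Bayesian posterior, and invented names such as `d_active` and `fit_projected_gp`.

```python
# Minimal sketch (NOT the paper's method): gradient-free supervised dimension
# reduction by maximizing the marginal likelihood of a GP fit on projected
# inputs z = X @ W. All names and the toy model below are illustrative only.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
D, d_active, n = 10, 1, 60                       # ambient dim, subspace dim, samples

# Toy "expensive model": the response varies only along one hidden direction.
w_true = rng.normal(size=(D, 1))
w_true /= np.linalg.norm(w_true)
X = rng.uniform(-1.0, 1.0, size=(n, D))
y = np.sin(3.0 * (X @ w_true)).ravel() + 0.01 * rng.normal(size=n)

def orthonormalize(flat_w):
    """Map an unconstrained vector to a D x d_active orthonormal basis via QR."""
    Q, _ = np.linalg.qr(flat_w.reshape(D, d_active))
    return Q

def fit_projected_gp(W):
    """Fit a GP on the projected inputs; return the GP and its log marginal likelihood."""
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(X @ W, y)
    return gp, gp.log_marginal_likelihood(gp.kernel_.theta)

def objective(flat_w):
    # Maximize the marginal likelihood over the subspace; no simulator gradients used.
    _, lml = fit_projected_gp(orthonormalize(flat_w))
    return -lml

res = minimize(objective, rng.normal(size=D * d_active),
               method="Nelder-Mead", options={"maxiter": 300})
W_hat = orthonormalize(res.x)
gp, _ = fit_projected_gp(W_hat)

# Subspace recovery: |cosine| near 1 means the hidden direction was found.
print("subspace alignment:", abs(float(W_hat[:, 0] @ w_true[:, 0])))
print("GP fit (R^2 on training data):", gp.score(X @ W_hat, y))
```

In contrast to this point-estimate sketch, the abstract indicates that the proposed method places the subspace and the surrogate hyperparameters in a fully Bayesian treatment, so the recovered subspace comes with quantified epistemic uncertainty rather than a single optimized W.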

Updated: 2022-02-03