BERT4ST: Fine-tuning pre-trained large language model for wind power forecasting
Energy Conversion and Management (IF 10.4) Pub Date: 2024-03-27, DOI: 10.1016/j.enconman.2024.118331
Zefeng Lai, Tangjie Wu, Xihong Fei, Qiang Ling

Accurate forecasting of wind power generation is essential for ensuring power safety, scheduling diverse energy sources, and improving energy utilization. However, the elusive nature of wind, which is influenced by various meteorological and geographical factors, greatly complicates the wind power forecasting task. To improve the forecasting accuracy of wind power (WP), we propose a BERT-based model for spatio-temporal forecasting (BERT4ST), the first approach to fine-tune a large language model for the spatio-temporal modeling of WP. To handle the inherent characteristics of WP, BERT4ST exploits the individual spatial and temporal dependencies of patches and redesigns a set of spatial and temporal encodings. By carefully analyzing the connection between bidirectional attention networks and WP spatio-temporal data, BERT4ST employs a pre-trained BERT encoder as the backbone network to learn the individual spatial and temporal dependencies of patches of WP data. Additionally, BERT4ST fine-tunes the pre-trained backbone in a multi-stage manner: it first aligns the language model with the spatio-temporal data and then fine-tunes for the downstream task while maintaining the stability of the backbone network. Experimental results demonstrate that BERT4ST achieves favorable performance compared with several state-of-the-art methods.
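
The page provides no code. As a rough, hypothetical illustration of the pipeline the abstract describes (patching the WP series, adding redesigned spatial and temporal encodings, encoding the tokens with a pre-trained bidirectional BERT backbone, and fine-tuning in stages), a minimal PyTorch sketch follows. All shapes, layer sizes, the patch count, the pooling step, and the freeze/unfreeze schedule are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn
from transformers import BertModel

class BERT4STSketch(nn.Module):
    # Hypothetical reconstruction from the abstract, not the released model.
    def __init__(self, n_turbines, patch_len=16, n_patches=8, horizon=24,
                 d_model=768):  # d_model must match the BERT hidden size
        super().__init__()
        self.patch_len, self.n_patches = patch_len, n_patches
        # Project each patch of raw wind-power values into BERT's hidden size.
        self.patch_embed = nn.Linear(patch_len, d_model)
        # "Redesigned" encodings: a learnable vector per turbine (spatial)
        # and per patch position (temporal).
        self.spatial_enc = nn.Embedding(n_turbines, d_model)
        self.temporal_enc = nn.Embedding(n_patches, d_model)
        # Pre-trained bidirectional-attention backbone.
        self.backbone = BertModel.from_pretrained("bert-base-uncased")
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x):
        # x: (batch, n_turbines, n_patches * patch_len), past WP readings.
        # Assumes n_turbines * n_patches <= 512, BERT's position limit.
        b, s, _ = x.shape
        patches = x.reshape(b, s, self.n_patches, self.patch_len)
        h = self.patch_embed(patches)                       # (b, s, p, d)
        h = h + self.spatial_enc.weight[None, :, None, :]   # per-turbine
        h = h + self.temporal_enc.weight[None, None, :, :]  # per-position
        h = h.reshape(b, s * self.n_patches, -1)            # token sequence
        out = self.backbone(inputs_embeds=h).last_hidden_state
        # Pool each turbine's patch tokens and predict its future horizon.
        out = out.reshape(b, s, self.n_patches, -1).mean(dim=2)
        return self.head(out)                               # (b, s, horizon)

def set_stage(model, stage):
    # Multi-stage fine-tuning as sketched in the abstract: stage 1 trains
    # only the new embeddings and head to align the language model with the
    # spatio-temporal data; stage 2 unfreezes the backbone for the
    # downstream forecasting task.
    for p in model.backbone.parameters():
        p.requires_grad = (stage == 2)

Keeping the backbone frozen in the first stage is one plausible reading of "maintaining the stability of the backbone network"; the paper may instead use a gentler scheme such as layer-wise learning rates or partial unfreezing.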

Updated: 2024-03-27