Paper Title

Spatio-Temporal Wind Speed Forecasting using Graph Networks and Novel Transformer Architectures

Paper Authors

Bentsen, Lars Ødegaard, Warakagoda, Narada Dilip, Stenbro, Roy, Engelstad, Paal

Paper Abstract

This study focuses on multi-step spatio-temporal wind speed forecasting for the Norwegian continental shelf. The study aims to leverage spatial dependencies through the relative physical locations of different measurement stations to improve local wind forecasts. Our multi-step forecasting models produce either 10-minute, 1-hour or 4-hour forecasts, with 10-minute resolution, meaning that the models produce more informative time series for predicted future trends. A graph neural network (GNN) architecture was used to extract spatial dependencies, with different update functions to learn temporal correlations. These update functions were implemented using different neural network architectures. One such architecture, the Transformer, has become increasingly popular for sequence modelling in recent years. Various alterations have been proposed to better facilitate time series forecasting, of which this study focused on the Informer, LogSparse Transformer and Autoformer. This is the first time the LogSparse Transformer and Autoformer have been applied to wind forecasting, and the first time any of these or the Informer have been formulated in a spatio-temporal setting for wind forecasting. By comparing against spatio-temporal Long Short-Term Memory (LSTM) and Multi-Layer Perceptron (MLP) models, the study showed that the models using the altered Transformer architectures as update functions in GNNs were able to outperform them. Furthermore, we propose the Fast Fourier Transformer (FFTransformer), which is a novel Transformer architecture based on signal decomposition and consists of two separate streams that analyse the trend and periodic components separately. The FFTransformer and Autoformer were found to achieve superior results for the 10-minute and 1-hour ahead forecasts, with the FFTransformer significantly outperforming all other models for the 4-hour ahead forecasts.
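The abstract describes the FFTransformer as splitting the input signal into trend and periodic streams via signal decomposition. As a rough illustration of that idea (not the paper's implementation — the cutoff `n_trend`, the function name, and the use of a plain low-/high-frequency FFT split are all assumptions for this sketch), one can decompose a series by keeping the lowest-frequency Fourier coefficients as the trend and treating the residual as the periodic component:

```python
import numpy as np

def fft_decompose(x, n_trend=3):
    """Split a 1-D signal into trend and periodic components.

    Keeps the n_trend lowest-frequency FFT coefficients (including the
    DC term) as the 'trend' stream; the residual forms the 'periodic'
    stream. n_trend is an illustrative cutoff, not a value from the paper.
    """
    coeffs = np.fft.rfft(x)
    trend_coeffs = np.zeros_like(coeffs)
    trend_coeffs[:n_trend] = coeffs[:n_trend]  # low frequencies -> trend
    trend = np.fft.irfft(trend_coeffs, n=len(x))
    periodic = x - trend                       # residual -> periodic stream
    return trend, periodic

# Toy example: slow ramp plus oscillation, a stand-in for wind-speed data
t = np.linspace(0, 4 * np.pi, 256)
signal = 0.5 * t + np.sin(5 * t)
trend, periodic = fft_decompose(signal)
```

In the actual architecture, each stream would then be processed by its own attention-based branch; this sketch only shows the decomposition step, which is lossless by construction (`trend + periodic` reconstructs the input exactly).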
