A transformer neural network for predicting near-surface temperature

Emy Alerskans*, Joachim Nyborg, Morten Birk, Eigil Kaas

*Corresponding author for this work

Publication: Contribution to journal › Journal article › Research › peer review

12 Citations (Scopus)
36 Downloads (Pure)

Abstract

A new method based on the Transformer model is proposed for post-processing of numerical weather prediction (NWP) forecasts of 2 m air temperature. The Transformer is a machine learning (ML) model based on self-attention, which extracts information about which inputs are most important for the prediction. It is trained using time series input from NWP variables and crowd-sourced 2 m air temperature observations from more than 1000 private weather stations (PWSs). The performance of the new post-processing model is evaluated using both observational data from PWSs and completely independent observations from the Danish Meteorological Institute (DMI) network of surface synoptic observations (SYNOP) stations. The performance of the Transformer model is compared against the raw NWP forecast, as well as against two benchmark post-processing models: a linear regression (LR) model and a neural network (NN). The results evaluated against PWS observations show an improvement in the 2 m temperature forecasts with respect to both bias and standard deviation (STD) for all three post-processing models, with the Transformer model showing the largest improvement. The raw NWP forecast, LR, NN and Transformer model have a bias and STD of 0.34 and 1.96 °C, 0.03 and 1.63 °C, 0.10 and 1.53 °C, and 0.02 and 1.13 °C, respectively. The corresponding results using DMI SYNOP stations also show improved forecasts, with the Transformer model performing better than both the raw NWP forecast and the two benchmark models. However, the performance is found to depend on distance to the coast and on cold temperatures.
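To illustrate the kind of model the abstract describes, the sketch below shows a self-attention encoder that maps a time series of NWP predictor variables to corrected 2 m temperatures, together with the bias and error STD used as evaluation metrics. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the layer sizes, sequence length, feature count and class/function names are assumptions for the example only.

```python
# Minimal sketch (assumed architecture, not the paper's code) of a
# Transformer-based post-processing model for 2 m temperature forecasts.
import torch
import torch.nn as nn


class TemperaturePostProcessor(nn.Module):
    def __init__(self, n_features: int = 16, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2, seq_len: int = 48):
        super().__init__()
        # Project raw NWP predictors (e.g. temperature, wind, cloud cover;
        # the exact feature set is an assumption) into the model dimension.
        self.input_proj = nn.Linear(n_features, d_model)
        # Learned positional embedding over forecast lead times.
        self.pos_embed = nn.Parameter(torch.zeros(1, seq_len, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # One corrected 2 m temperature value per time step.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) time series of NWP variables.
        h = self.input_proj(x) + self.pos_embed[:, : x.size(1)]
        h = self.encoder(h)              # self-attention over lead times
        return self.head(h).squeeze(-1)  # (batch, seq_len) temperatures


def bias_and_std(pred: torch.Tensor, obs: torch.Tensor):
    """Bias and standard deviation of the forecast error, the two
    metrics reported against PWS and DMI SYNOP observations."""
    err = pred - obs
    return err.mean().item(), err.std().item()
```

In practice such a model would be trained with a regression loss (e.g. mean squared error) against the crowd-sourced PWS observations, and the bias/STD pair computed on held-out PWS and SYNOP stations would correspond to the numbers quoted in the abstract.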

Original language: English
Article number: 2098
Journal: Meteorological Applications
Volume: 29
Issue number: 5
Number of pages: 24
ISSN: 1350-4827
DOI
Status: Published - Sep. 2022
