New research from the University of Washington is refining AI weather models using deep learning for more accurate predictions and longer-term forecasts. The study, published in Geophysical Research Letters, shows how adjusting initial atmospheric data enables advanced AI models to extend current forecast limits. As extreme weather becomes more severe and frequent due to climate change, giving governments, businesses, the public, and emergency responders more time to prepare for natural disasters such as floods, heatwaves, or hurricanes could help reduce loss of life and property.
“If a perfect weather model is given slightly imperfect initial conditions, the error compounds over time and results in an inaccurate forecast,” said lead author Trent Vonich, a PhD candidate at the University of Washington. “This is especially true when modeling a chaotic system such as the Earth’s atmosphere. There has been great focus recently on making better models, while somewhat ignoring the fact that a perfect model is only half the problem. Machine learning models help us address this because they are fully differentiable end-to-end, allowing us to capture nonlinear interactions between inputs and outputs—something legacy techniques cannot do.”
While state-of-the-art AI weather forecasting systems, such as Google’s GraphCast and Huawei’s Pangu-Weather, reliably predict upcoming weather up to 10 days ahead, they’re limited by the accuracy of the initial atmospheric data they are given.
These models were trained on the massive ERA5 reanalysis dataset, which contains petabytes of information. The dataset captures hourly temperature, wind speed, humidity, air pressure, precipitation, and cloud cover across a global grid at 37 pressure levels. It includes historical weather conditions dating back to 1979, as well as near-real-time data.
The researchers focused on the June 2021 Pacific Northwest heatwave, refining the initial atmospheric variables leading up to the event to improve forecast accuracy for this extreme case. They did so with nonlinear optimization implemented in the GPU-accelerated JAX framework.
According to Vonich, performing 100 updates to the initial conditions takes only 20 minutes on an NVIDIA A100 Tensor Core GPU.
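The study's actual optimization runs against a full deep-learning weather model, but the core idea — differentiate a forecast loss with respect to the initial conditions and descend that gradient — can be sketched in a few lines of JAX. Everything below (the tiny tanh "dynamics", the state size, the learning rate, the synthetic verification data) is an illustrative stand-in, not the study's setup:

```python
import jax
import jax.numpy as jnp

n = 8  # toy state size; real models carry millions of grid-point variables
W = jax.random.normal(jax.random.PRNGKey(0), (n, n)) / jnp.sqrt(n)  # fixed toy dynamics

def forecast(x0, steps=10):
    """Roll the toy differentiable 'model' forward from initial state x0."""
    x = x0
    for _ in range(steps):
        x = jnp.tanh(x @ W)
    return x

def loss(x0, verification):
    """Mean-squared forecast error against a verifying state."""
    return jnp.mean((forecast(x0) - verification) ** 2)

# Synthetic "truth": the trajectory launched from the true initial state.
x_true = jax.random.normal(jax.random.PRNGKey(1), (n,))
verification = forecast(x_true)

# A slightly imperfect analysis of the initial state, as in the quote above.
x0 = x_true + 0.1 * jax.random.normal(jax.random.PRNGKey(2), (n,))
initial_loss = float(loss(x0, verification))

# Because the model is differentiable end-to-end, jax.grad gives
# d(loss)/d(x0) directly, and simple gradient descent can refine x0.
grad_fn = jax.jit(jax.grad(loss))
for _ in range(100):  # 100 updates, mirroring the count quoted above
    x0 = x0 - 0.05 * grad_fn(x0, verification)

final_loss = float(loss(x0, verification))
print(initial_loss, final_loss)
```

The same pattern scales to a real model: swap the toy `forecast` for a differentiable weather model and the synthetic `verification` for observed analyses, and the gradient loop becomes the initial-condition refinement the article describes.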
The researchers tested their framework’s accuracy using atmospheric data captured during the 2021 Pacific Northwest heatwave, which was excluded from the original training dataset. The optimized data reduced 10-day forecast errors by 90%, successfully predicting the intensity and timing of the heatwave. It also more than doubled the prediction window, improving on unoptimized forecasts at lead times of up to 23 days.
“This research may show that more accurate weather observations and measurements may be just as important as developing better models,” Vonich said. “If this technique can be used to identify systematic biases in the initial conditions, it could have an immediate impact on improving operational forecasts. Plus, more lead time enables greater preparation for communities. Aviation, shipping, and countless other industries rely on accurate weather forecasts, too. Improvements can translate to an economic benefit for them as well.”
Read the full news story on Eos.
Catch up on the study Predictability Limit of the 2021 Pacific Northwest Heatwave From Deep-Learning Sensitivity Analysis.