It’s not so long ago that forecasting the weather was still a largely manual process, relying on crude instrumentation, hand-annotated maps, and reams of numerical data spread across the forecaster’s table. Despite this mammoth effort, the results were hit and miss at best, with a forecast 12 hours ahead the practical limit.
Now, however, weather forecasting has entered the supercomputer era, with many of the world’s biggest compute resources devoted to weather prediction globally. Forecasting has moved from a finger in the air to high-performance computing, and HPC lends itself particularly well to the complex partial differential equations that describe the physics of the atmosphere. Looking forward, achieving even greater accuracy will demand ever larger compute resources.
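To make the PDE connection concrete, here is a deliberately minimal sketch, not anything resembling an operational model: the 1-D linear advection equation (a quantity drifting with a constant wind) stepped forward with a first-order upwind finite-difference scheme, the simplest member of the family of numerical methods that weather models scale up across thousands of CPUs. The function name, grid, and values are all illustrative assumptions.

```python
# Sketch only: 1-D linear advection, u_t + c * u_x = 0, on a periodic
# domain, advanced with a first-order upwind scheme. Real atmospheric
# models solve coupled 3-D equations for wind, pressure, temperature
# and moisture on global grids -- this just shows the numerical idea.

def advect(u, c, dx, dt, steps):
    """Advance field u by `steps` upwind time steps (periodic boundaries)."""
    n = len(u)
    courant = c * dt / dx  # must stay <= 1 for stability (CFL condition)
    for _ in range(steps):
        u = [u[i] - courant * (u[i] - u[(i - 1) % n]) for i in range(n)]
    return u

# A blob of "temperature anomaly" carried along by a steady wind:
u0 = [1.0 if 4 <= i <= 7 else 0.0 for i in range(40)]
u1 = advect(u0, c=1.0, dx=1.0, dt=1.0, steps=10)  # blob shifts 10 cells right
```

Each time step only needs a cell and its neighbour, which is exactly why such schemes parallelise so well: the grid can be split across many processors that exchange only thin boundary strips.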
The US National Oceanic and Atmospheric Administration (NOAA) relies on massively parallel supercomputers with tens of thousands of CPUs. These systems run the operational models that deliver thousands of weather forecasts, storm warnings, navigational products, and scientific datasets to public and private stakeholders.
HPC has enabled NOAA’s global model resolution to increase from 375 square kilometres in 1980 to 13 square kilometres in 2015, and scientists anticipate even higher resolution as exascale computing becomes available, enabling weather prediction at neighbourhood level and further ahead. Indeed, quantum computing, when it arrives, may enable forecasting at the level of micro-meteorological events, such as the formation of individual clouds and wind eddies, potentially predicting the weather in our backyards and on our rooftops.
Whilst forecasting has been accelerated by big data and parallel computing, AI and machine learning have gained less traction, because the historical data needed to exploit AI’s predictive potential is scarce: even the first weather satellites were launched relatively recently. AI is further challenged by the nonlinear nature of weather, where what happened yesterday is not necessarily an indication of what will happen tomorrow, making historical data less informative than in other machine-learning applications.
According to NOAA, AI could deliver value in the post-processing and calibration of model output: using machine learning to improve quality control of observational data, re-analysis and re-forecasting, and creating forecast products that blend multiple operational models. Another area where AI can contribute is climate science: researchers studying global warming face a deluge of data, with more to come, and AI results generally improve as the volume of data available for analysis grows.
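The post-processing idea can be sketched very simply. The example below is an assumption-laden toy, not NOAA’s method: it fits an ordinary least-squares correction that maps raw model forecasts onto past observations (the classic “model output statistics” approach), using made-up numbers in which the raw model runs consistently two degrees too warm.

```python
# Toy sketch of statistical post-processing: learn a linear correction
# mapping raw model forecasts to observed temperatures. The data are
# synthetic; operational calibration uses years of paired forecasts and
# observations, and often far richer machine-learning models.

def fit_linear(x, y):
    """Ordinary least squares for y ~ a * x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Hypothetical history: the raw model is biased ~2 degrees warm.
raw_forecasts = [10.0, 12.0, 15.0, 18.0, 20.0]
observations  = [ 8.0, 10.0, 13.0, 16.0, 18.0]

a, b = fit_linear(raw_forecasts, observations)
corrected = [a * f + b for f in raw_forecasts]  # bias largely removed
```

Because the correction is learned from a model’s own past errors rather than from raw weather history, it sidesteps some of the data-scarcity problem described above: a few seasons of paired forecasts and observations can already yield a useful calibration.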
In the future we won’t have to peer at a highly generalized weather map on the local TV forecast and wonder whether the weather symbol nearest us actually applies. Instead we’ll have granular insight into the 5-day weather (and beyond) for our specific neighbourhood, perfect for planning that summer barbecue or mowing the lawn!