Today’s sophisticated models can make reasonably decent weather forecasts up to 10 days in advance. A newly developed computing technique promises to push the limits of weather forecasts even further ahead.
If the system fulfils its initial promise, we could be looking at another four or five days of weather forecasts that are actually useful – that is, that have a reasonable likelihood of sticking somewhere close to reality.
The secret to this new extra-long-range weather forecast lies in the recording of initial observations – the state of the weather as it is right now, ScienceAlert reported.
By reducing the uncertainty in real-time readings, researchers have been able to get supercomputers to produce forecasts that stay accurate for up to two weeks.
“Reducing the current-day initial-condition uncertainty by an order of magnitude extends the deterministic forecast lead times of day-to-day weather by up to five days,” explains the team behind the study.
“Achieving this additional predictability limit can have enormous socioeconomic benefits but requires coordinated efforts by the entire community to design better numerical weather models, to improve observations, and to make better use of observations with advanced data assimilation and computing techniques.”
Every weather forecast is based on conditions as they are right now. The further you go into the future, the harder it becomes to predict which way the weather will go, because of the countless variables involved.
These initial conditions are compiled from satellites, balloons, and weather stations, but they’re not perfect – and so forecast models tend to run multiple scenarios based on different variations of the initial conditions, then get a consensus from all the results.
The researchers behind the new study changed this approach: they assumed the initial conditions were near-perfect, shrinking their uncertainty by a factor of 10 for midlatitude weather (the systems that affect most of the world’s population).
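The core idea – that a smaller error in the starting conditions buys a longer useful forecast – can be illustrated with a toy chaotic system. The sketch below uses the classic Lorenz-63 equations (a simplified model of atmospheric convection, not the supercomputer model the study used) and compares how long two forecasts stay close to the "truth" when their initial error differs by an order of magnitude. The function names, step sizes, and error threshold are all illustrative choices, not anything from the study.

```python
import math

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz-63 system, a standard
    # toy model of chaotic atmospheric convection.
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def divergence_time(eps, steps=5000, threshold=1.0, dt=0.01):
    # Integrate a "truth" trajectory and a forecast whose initial
    # state is off by eps, and return the (model) time until they
    # drift apart by more than `threshold` -- a crude stand-in for
    # the useful forecast lead time.
    truth = (1.0, 1.0, 1.05)
    forecast = (1.0 + eps, 1.0, 1.05)
    for i in range(steps):
        truth = lorenz_step(*truth, dt=dt)
        forecast = lorenz_step(*forecast, dt=dt)
        err = math.sqrt(sum((a - b) ** 2 for a, b in zip(truth, forecast)))
        if err > threshold:
            return i * dt
    return steps * dt

# An initial error ten times smaller keeps the forecast usable for longer,
# echoing the study's order-of-magnitude argument.
print(divergence_time(1e-3), divergence_time(1e-4))
```

Because errors in a chaotic system grow roughly exponentially, cutting the initial error tenfold adds a fixed chunk of lead time rather than multiplying it – which is why the study's order-of-magnitude improvement translates into "only" a few extra days, not weeks.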
By testing the new technique with historical weather events from 2016, the team found that a reasonable level of accuracy was maintained up until the two-week mark, at which point the models bore no real resemblance to what actually happened in 2016.
Plenty more work is required on this intriguing new approach, but that’s a longer-range weather forecast than we’ve ever had before, and a decent jump from the 10-day limit that meteorologists currently work with.
While certain sources may give you weather forecasts that stretch beyond two weeks, meteorologists will tell you that these are too speculative to be of any use.
That 14-day mark also matches up with previous research into atmospheric models stretching back half a century – that research also suggested a two-week limit before predictions became virtually useless, but it’s never before been put to the test in the supercomputers of the 21st century.
Computing power and forecasting models are improving all the time, of course, and some experiments have already shown we might break the two-week barrier. For now, though, 14 days looks as far into the future as we can peer, weather-wise.
“Two weeks is about right,” lead researcher Fuqing Zhang, from Pennsylvania State University, told Paul Voosen at Science. “It’s as close to the ultimate limit as we can demonstrate.”