Regardless of its origin, the statement is as true now as it was at the turn of the 20th century. Nobody can do much to control the weather. The atmosphere remains the most unpredictable element in our daily lives.

What can we do about it? At the very least, we must try to predict it. After all, our very lives depend on having ample supplies of foodstuffs, which themselves depend on the heavens remaining roughly cooperative. Just as important, we must feed those predictions in real time into every activity that depends on the weather, so that we aren't blindsided by meteorological circumstances over which we have little control.

Ancient civilizations relied on shamans to make sense of the ever-shifting winds, rains and clouds. For greater confidence and precision in our predictions, the modern era has turned to the atmospheric sciences. Most notable among these are meteorology (for near-term and real-time weather forecasting) and climatology (for longer term trends related to global warming, creeping desertification and so on).

Atmospheric scientists have been relying on predictive analytics tools for many decades. As this recent article notes, "supercomputing has played a major role in enabling predictive models since the 1950s and remains at the cornerstone of today's weather and climate modeling." It's also the heart of real-time weather tracking, upon which every modern society depends.

What we now call "big data" has been at the heart of predictive and real-time weather analytics from the start. Throughout all eras, meteorological models have greedily devoured every high-performance computing (HPC) resource thrown their way, including storage, memory, CPU and bandwidth. The fine-grained parallelizability of most atmospheric models has driven the development of ever more distributed HPC architectures. And the availability of new observational data sources, especially satellites and radar, has fueled continued improvements in predictive speed, accuracy and granularity.
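That fine-grained parallelism can be illustrated with a toy sketch: a one-dimensional heat-diffusion update whose grid is split into tiles, each advanced concurrently. This is only a schematic illustration, with threads standing in for distributed HPC nodes, and the stencil weights and function names invented for the example; real atmospheric models use vastly more elaborate numerics and inter-node communication.

```python
from concurrent.futures import ThreadPoolExecutor

def step_tile(grid, start, end):
    """Advance one tile of the grid by a single diffusion step.

    Each cell's new value depends only on the *old* values of its
    neighbors, so tiles can be computed independently -- the
    fine-grained parallelism that lets atmospheric models scale
    across distributed HPC architectures.
    """
    new = []
    for i in range(start, end):
        left = grid[i - 1] if i > 0 else grid[i]
        right = grid[i + 1] if i < len(grid) - 1 else grid[i]
        new.append(0.25 * left + 0.5 * grid[i] + 0.25 * right)
    return new

def diffuse_step(grid, n_tiles=4):
    """One parallel time step: partition the grid, update tiles concurrently."""
    size = len(grid)
    bounds = [(t * size // n_tiles, (t + 1) * size // n_tiles)
              for t in range(n_tiles)]
    with ThreadPoolExecutor(max_workers=n_tiles) as pool:
        tiles = pool.map(lambda b: step_tile(grid, *b), bounds)
        return [v for tile in tiles for v in tile]
```

Because every cell reads only the previous time step, the tiles never contend with one another, which is precisely the property that makes such models so amenable to distributed hardware.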

Where prediction granularity is concerned, the precision of timely weather forecasts depends on several factors. One of those factors is the network of observation sources that supply fresh data to predictive models. As noted in the cited article, the Internet of Things (IoT) promises to substantially deepen this granularity. Advances in weather forecasting will "leverag[e] new sources of observation, such as sensors placed on automobiles." The article presents a vision of ubiquitous urban weather observation networks where "thousands of sensors...provid[e] real-time meteorological information."

Another factor in predictive granularity is the geospatial resolution of the predictive models. This depends, in turn, on both the granularity of the source data and the extent to which HPC parallelism enables real-time targeting of forecasts down to, ideally, a street-by-street level.

This sort of hyperlocal targeting may not be as far-fetched as it sounds. The article reports on recent supercomputer simulations of Hurricane Sandy (post-hoc, obviously) in which researchers pinpointed impacts "to a 500-meter resolution—the equivalent of a few city blocks." If they can deliver the same accuracy on a predictive, pre-hoc basis for future storms, that would be astounding. But it's obviously far too early to pop the champagne corks (or to alert future impacted households to run for their lives).

It will be interesting to see how these trends converge: deployment of ubiquitous IoT weather sensors and improvement of big-data-driven, hyperlocal, geospatial weather targeting. Imagine a scenario in which every smart car, smart building and smartphone takes continuous environmental readings and forwards them to a weather forecasting cloud service in real time. Not only would this enable extraordinarily high-resolution measurement of current weather conditions everywhere, but it would also facilitate precisely targeted forecasts of fast-moving conditions that may impact very narrow geographies for very brief periods. Such hyperlocal conditions may include fogs, flash floods, wind shears and tornadoes.
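One plausible way such a forecasting cloud service might fuse crowd-sourced readings is to snap each observation to a geospatial grid cell and keep a running per-cell average, yielding a hyperlocal "nowcast" input layer. The sketch below is purely illustrative: the roughly 500-meter cell size echoes the resolution cited earlier, and all names (`NowcastGrid`, `ingest`, and so on) are invented for this example rather than drawn from any real service.

```python
from collections import defaultdict

CELL_DEG = 0.005  # ~500 m of latitude per grid cell (illustrative choice)

def cell_for(lat, lon):
    """Snap a reading's coordinates to a hyperlocal grid cell."""
    return (int(lat // CELL_DEG), int(lon // CELL_DEG))

class NowcastGrid:
    """Running per-cell averages of crowd-sourced weather observations."""

    def __init__(self):
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def ingest(self, lat, lon, temp_c):
        """Fold one sensor reading (e.g., from a smart car) into its cell."""
        cell = cell_for(lat, lon)
        self._sums[cell] += temp_c
        self._counts[cell] += 1

    def current(self, lat, lon):
        """Best-estimate current temperature for a location, if any data."""
        cell = cell_for(lat, lon)
        if self._counts[cell] == 0:
            return None
        return self._sums[cell] / self._counts[cell]
```

A real system would add timestamps, decay stale readings and account for sensor quality, but the core idea is the same: millions of tiny observations, binned finely enough to resolve a single city block.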

It might even be possible to use IoT and HPC-powered big data to control some hyperlocal atmospheric events. For example, air pollution and heat islands are caused in part by human activities (driving and urbanizing, for example). If sensor-laden smart vehicles were able to dynamically adjust their operation in accordance with local conditions (if they could recommend that drivers operate them during off-peak hours when city pollution levels subside), they might make a positive impact on air cleanliness. Likewise, sensor-rich smart buildings might dynamically adjust their external reflectance to moderate the contribution of incident sunlight on the local heat-island effect.
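As a toy illustration of the smart-building idea, here is a hypothetical control rule that raises a facade's reflectance as local temperature and sunlight climb. Every threshold, parameter and name here is an invented assumption for the sketch, not an engineering specification or a real building-automation API.

```python
def target_reflectance(outdoor_temp_c, solar_wm2, base=0.3, max_reflect=0.8):
    """Choose a facade reflectance setting from local sensor readings.

    Toy rule: reflect more incident sunlight as conditions get hotter
    and brighter, reducing the building's contribution to the local
    heat-island effect. All thresholds are illustrative.
    """
    if outdoor_temp_c < 25.0 or solar_wm2 < 200.0:
        return base  # mild conditions: keep the default facade setting
    # Scale linearly with temperature above 25 C, capped at max_reflect.
    excess = min((outdoor_temp_c - 25.0) / 15.0, 1.0)
    return base + (max_reflect - base) * excess
```

The interesting part is not the arithmetic but the feedback loop: the same hyperlocal sensing that improves forecasts could also close the loop and act on conditions, rather than merely report them.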

At the very least, we can always do something about the weather indoors. Smartphone sensors might alert you or me the moment we enter a "sick building" that's likely to aggravate our allergies.

In those cases, we might act on this real-time intelligence by walking right back outside, where, hopefully, the air is cleaner.
