Come the day of the storm, the media reported that an accurate forecast was now all but impossible. The system tracked about 60 miles off earlier predictions, so Saturday afternoon was gusty but brought little drama. The next day’s headlines spoke of “the storm that wasn’t.”

The weather forecast was, in some ways, correct: a disastrous event really was possible. But the inexactness of the prediction was lost in the echoes of social chatter. Uncertainty was always present in the forecasts, yet often only as an afterthought to talk of more dramatic outcomes, and the dramatic outcomes were all the public heard.

Business Intelligence has mostly sought to remove ambiguity and uncertainty, favoring clarity over complexity. The best-known example is the attempt to deliver a single version of the truth, a phrase that implies, in its very wording, the existence of alternatives to be suppressed. Certainty, it seems, is again rewarded.

Of course, we don’t believe that we will ever behold a Gross Margin in January of exactly $1,501,360, but we lack the tools, and even the vocabulary, to bring more realistic, more ambiguous outcomes into action. At best, everyone involved understands this as a game of self-deception. Nevertheless, behind the comfortable illusion lurks the danger of the fallacy of false precision. We may place undue trust in these forecasts because they look accurate, forgetting our own part in the trick.
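One way to avoid reporting a single falsely precise figure is to present a range instead. As a minimal sketch (the revenue and cost distributions below are invented for illustration, not taken from any real forecast), a simple Monte Carlo simulation can turn uncertain inputs into an interval for gross margin rather than one exact dollar amount:

```python
import random
import statistics

random.seed(42)

# Hypothetical drivers of gross margin, each modeled as a distribution
# rather than a point estimate (all parameters are assumptions):
N = 10_000
margins = []
for _ in range(N):
    revenue = random.gauss(5_000_000, 400_000)  # assumed mean and spread
    cost_ratio = random.uniform(0.65, 0.75)     # assumed cost-of-sales range
    margins.append(revenue * (1 - cost_ratio))

# Report a 10th–90th percentile range instead of one precise number.
margins.sort()
p10 = margins[int(0.10 * N)]
p90 = margins[int(0.90 * N)]
print(f"Gross margin, January: likely between ${p10:,.0f} and ${p90:,.0f}")
print(f"Median estimate: ${statistics.median(margins):,.0f}")
```

The output communicates roughly the same central figure as the point forecast, but with an honest band around it, which is exactly the vocabulary of uncertainty the text argues we lack.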

There are indeed black arts in machine learning, and those of us who work in the field should know them. But mostly, we need to join Cliff Mass and his fellow meteorologists in getting better at communicating uncertainty in our work.