No one anticipated the scale of the disaster wrought by San Diego County wildfires. After Katrina, shouldn’t the technology to sound the alarm have been in place?

Two years ago, I wrote a column, "Katrina's total system disruption," that highlighted the complete anarchy in the wake of that storm and argued for smarter predictive modeling to better understand not just the likelihood of extreme environmental events but also the human response to them. The issue at the time, you may recall, was that a couple hundred thousand people failed to evacuate New Orleans safely, with disastrous consequences.

Today I sit watching as a million-plus people stream out of San Diego County, fleeing a totally foreseeable chain of extreme firestorms, fueled by the annual Santa Ana winds plus abnormally high temperatures.

What's changed in the intervening two years? Most people in San Diego have cars, which should help avert a Katrina-scale human disaster. And we can watch the destruction in real time on YouTube and Google Maps. But were we any better prepared than we were for Katrina? Had anybody modeled the potential for extreme firestorms in Southern California? And if anybody had, did anyone pay attention?

Did insurers adjust their rates to dissuade people from building in the highest-risk areas? Did governments invest in the right infrastructure to mitigate the threat and damage?

Some organizations did have plans in place, such as Pepperdine University's IT department, which snapped into action to secure backup systems and tapes as the fires came within 100 feet of its datacenter.

But "every man for himself" planning and predictive modeling is no longer enough. We need a nationwide early-warning data-modeling platform that can be shared across industries and leveraged by the private and public sectors, as well as the military.

Ironically, today's Wall Street Journal has an interview entitled "How Business Intelligence Has Come of Age," which notes that analytics has gotten pretty advanced in industries such as hospitality (pricing and customer loyalty), health care (predictive diagnoses), and financial services (except the subprime mess – but ignore that for the moment).

Why can't we put some of this IT firepower to work to fortify us against major disruptive events that threaten and affect millions of people, not to mention the economy and national security? The Journal article had one hint: There must be a desire from the top, the CEO, to make it happen.

Several months ago, I sat with a group of CIOs who concluded that there was no way to model or predict the many potential disruptions facing businesses (pandemics, terrorism, climate change, cyberthreats, financial turmoil, and so on). They therefore decided that building generic capabilities (such as responsiveness, communication, and decision making) was the best protection.

OK, but I think we can do a lot better with the things we do know: that fire is a recurring threat in Southern California, and that if global temperatures are rising, fires could get uglier. We should devote some shared resources to modeling those scenarios.

Anybody with me on this? What would such a national modeling resource look like? I'd appreciate your ideas – let's get the ball rolling.
