An article in The New York Times reminds us once again that without a carefully crafted and highly disciplined governance architecture in place, misaligned personal interests between individuals and organizations across cultural ecosystems can lead to catastrophic decisions and outcomes. The article, written by Martin Fackler, is titled “Nuclear Disaster in Japan Was Avoidable, Critics Contend.”

While not unexpected by those who study crises, this being yet another case in which brave individuals raised red flags only to be shouted down by the crowd, the article does provide instructive granularity that should guide senior executives, directors, and policy makers in planning organizational models and enterprise systems. In a rare statement for a leading publication, Fackler reports that insiders within “Japan’s tightly knit nuclear industry” attributed the Fukushima plant meltdown to a “culture of collusion” between “powerful regulators and compliant academic experts.” A very similar dynamic is found in other preventable crises, from the broad systemic financial crisis to narrow product defect cases.

One of the individuals who warned regulators of just such an event was Professor Kunihiko Shimazaki, a seismologist on the committee created specifically to manage risk associated with Japan’s offshore earthquakes. Shimazaki’s conservative warnings were not only ignored; his comments were removed from the final report “pending further research.” Shimazaki is reported to believe that “fault lay not in outright corruption, but rather complicity among like-minded insiders who prospered for decades by scratching one another’s backs.” This closely mirrors events in the U.S., where multi-organizational cultures evolved slowly over time to become among the highest systemic risks to life, property, and economy.

In another commonly found result, the plant operator Tepco failed to act on multiple internal warnings from its own engineers, who calculated that a tsunami could reach up to 50 feet in height. This critical information was withheld from regulators for three years, finally reported just four days before the 9.0 earthquake struck, causing a 45-foot tsunami and the meltdown of three reactors at Fukushima.

Three questions for consideration

1) Given that the root cause of the Fukushima meltdown was not the accurately predicted earthquake or tsunami, but rather dysfunctional organizational governance, are leaders not then compelled by moral imperative to seek out and implement organizational systems specifically designed to prevent such crises in the future?

2) Given that the most credible witness, a member of that very community who predicted the event, cites peer pressure and social dynamics within the academic culture and its relationships with regulators and industry as the cause, would not prudence demand that responsible decision makers consider solutions external to the afflicted cultures?

3) With the not-invented-here syndrome near the core of every major crisis in recent history, crises that have seriously degraded economic capacity, can anyone afford not to?

Steps that must be taken to prevent the next Fukushima

1) Do not return to the same poisoned well for solutions that caused or enabled the crisis

The not-invented-here syndrome, combined with a bias for institutional solutions, perpetuates the myth that humans are incapable of anything but repeating the same errors.

This phenomenon is evident in the ongoing financial crisis, which suffers from similar cultural dynamics among academics, regulators, and industry.

Researchers have only recently begun to understand the problems associated with deep expertise in isolated disciplines and cultural dynamics. ‘Expertisis’ is a serious problem within disciplines, tending to blind researchers to transdisciplinary patterns and discovery and severely limiting the range of solutions considered.

Systemic crises overlap too many disciplines for the academic model to execute functional solutions, as evidenced by the committee in this case, which sidelined its own seismologist’s warnings for further study, a classic enabler of systemic crises.

2) Understand that in the current digital era and through the foreseeable future, organizational governance challenges are also data governance challenges, which therefore require the execution of data governance solutions

Traditional organizational governance is rapidly breaking down with the rise of the neural network economy, yet governance solutions are comparatively slow to be adopted.

Many organizational leaders, policy makers, risk managers, and public safety engineers are not functionally literate in state-of-the-art technology, such as semantic, predictive, and human-alignment methodologies.

Functional enterprise architecture with the capacity to prevent the next Fukushima-like event, regardless of location, industry, or sector, will require a holistic design embodying a philosophy that proactively considers all variables that have enabled previous events.

Any functional architecture for this task cannot be constrained by the not-invented-here syndrome, defense of guilds, proprietary standards, protection of business models, national pride, institutional pride, branding, culture, or any other factor.

Until this year, extending advanced analytics to the entire human workforce was considered futuristic (see the 1/10/2012 Forrester Research report, Future of BI), in part due to scaling limitations in high-performance computing. While always evolving, the design has existed for a decade.

Automated data generated by sensors should be carefully crafted and combined in modeling with human and financial data for predictive applications in risk management, planning, regulatory oversight, and operations.
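As a minimal sketch of what such fusion of sensor, human, and financial data could look like in practice, the following combines the three streams into one composite risk score with escalation tiers. All names, weights, and thresholds here are illustrative assumptions, not part of any actual Kyield design; a real system would learn or govern these parameters rather than hard-code them.

```python
from dataclasses import dataclass

@dataclass
class RiskInputs:
    sensor_anomaly: float      # 0-1, e.g. normalized seismic sensor deviation
    expert_concern: float      # 0-1, structured rating distilled from human reports
    financial_exposure: float  # 0-1, normalized financial exposure of the asset

def composite_risk(inputs: RiskInputs,
                   weights=(0.5, 0.3, 0.2)) -> float:
    """Weighted fusion of the three governance data streams into one score."""
    w_s, w_e, w_f = weights
    score = (w_s * inputs.sensor_anomaly
             + w_e * inputs.expert_concern
             + w_f * inputs.financial_exposure)
    return round(score, 3)

def risk_tier(score: float) -> str:
    # Escalation thresholds are illustrative; a governance body would set them.
    if score >= 0.7:
        return "escalate-to-board"
    if score >= 0.4:
        return "regulator-review"
    return "monitor"

# A strong sensor anomaly plus a credible expert warning escalates
# even when financial exposure appears modest.
inputs = RiskInputs(sensor_anomaly=0.9, expert_concern=0.8, financial_exposure=0.3)
score = composite_risk(inputs)          # 0.5*0.9 + 0.3*0.8 + 0.2*0.3 = 0.75
print(score, risk_tier(score))
```

The design point is that no single stream, sensor, human, or financial, is allowed to suppress the others, which is precisely the failure mode in the Tepco case, where engineers’ tsunami calculations never reached regulators.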

Near real-time reporting is now possible, so governance structures and enterprise architectural design should reflect that functionality.
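A hypothetical sketch of what near real-time reporting could mean operationally: a sliding-window monitor that flags readings jumping well above their recent baseline, the kind of functionality a governance structure could attach mandatory reporting duties to. The window size and threshold are assumptions for illustration only.

```python
from collections import deque

class StreamMonitor:
    """Flags sensor readings that spike above the recent rolling baseline."""

    def __init__(self, window: int = 5, threshold: float = 2.0):
        self.readings = deque(maxlen=window)  # rolling baseline window
        self.threshold = threshold            # spike multiplier vs. baseline mean

    def ingest(self, value: float) -> bool:
        """Return True when the new reading should trigger a report."""
        alert = False
        if len(self.readings) == self.readings.maxlen:
            mean = sum(self.readings) / len(self.readings)
            # Flag readings well above the recent baseline.
            alert = value > mean * self.threshold
        self.readings.append(value)
        return alert

monitor = StreamMonitor(window=3, threshold=2.0)
alerts = [monitor.ingest(v) for v in [1.0, 1.1, 0.9, 1.0, 5.0]]
print(alerts)  # only the final 5.0 spike triggers an alert
```

The governance implication is that once detection is automated at this granularity, delayed disclosure of the kind Tepco practiced becomes a deliberate act rather than a reporting lag.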

Conclusion

While obviously not informed by a first-person audit and review, if the reports and quotes from witnesses surrounding the Fukushima crisis are accurate, and they are generally consistent with dozens of other human-caused crises, we can conclude the following:

The dysfunctional socio-economic relationships in this case produced an extremely toxic cultural dynamic across academia, regulators, and industry, all of whom shared a tacit intent to protect the nuclear industry. Their collective actions, however, resulted in an outcome that idled the entire nuclear industry in Japan, with potentially very serious long-term implications for the national economy.

Whether psychological, social, technical, economic, or some combination thereof, it would seem that no justification for failing to deploy the most advanced crisis prevention systems can be left standing. Indeed, we all share a moral imperative that demands we rise above our biases, personal and institutional conflicts, and defensive nature to explore and embrace the most appropriate solutions, regardless of origin, institutional labeling, media branding, or any other factor. Some crises are indeed too severe not to prevent.
