"When considering the development or deployment of beneficial technologies with uncertain, but potentially significant, negative results, any decision should be made with a strong bias towards the ability to step back and reverse the decision should harmful outcomes become more likely. The determination of possible harmful results must be grounded in science but recognize the potential for people to use the technology in unintended ways, must include a consideration of benefits lost by choosing not to move forward with the technology, and must address the possibility of serious problems coming from the interaction of the new technology with existing systems and conditions. This consideration of reversibility should not cease upon the initial decision to go forward to hold back, but should be revisited as additional relevant information emerges."

One obvious candidate for reversibility analysis, says Cascio, is biotechnology. Cascio finds merit in both the Precautionary and Proactionary stances, and proposes a third way.

"GMOs should be engineered in a way to make it possible to remove them from the environment if unexpected or low-probability problems emerge," writes Cascio, "Issues of human consumption of GMOs would be handled on a case-by-case basis, with a bias towards holding off on products that demonstrate a possibility of serious or irreversible problems."

But even Cascio admits that there are two major issues with the Reversibility Principle: is "reversibility" even possible, and can we predict the various possible outcomes, both good and bad?

Ultimately, says Cascio, the Reversibility Principle should be a heuristic, "a prism through which we look at the world and make our decisions." We may not always choose the path with the simplest way back, says Cascio, and it may not always be the right choice, "but it would encourage us to consider the issue for all of our options." Reversibility would force people to think in terms of more than immediate gratification, and to consider how a choice connects to other choices we and the people around us have made and will make. "In the end," writes Cascio, "it may even be a good first-order approximation of wisdom."

While laudable, and even potentially practical, there's a certain idealism to Cascio's Reversibility Principle that I question.

First, Cascio assumes that there are rational decision-makers at play who will willingly pull back on those projects that are proving to be harmful. Much of the world today is de facto corporatist, and corporations have proven to be insane. Yes, human civilization narrowly dodged a bullet on ozone depletion, but it doesn't appear even remotely close to dealing with the global warming catastrophe. It may be naive to believe that enough co-operation can happen globally to stem the tide of burgeoning but harmful technological trends--particularly if those trends are proving profitable.

Second, controlling the development of technologies and how they are used will be difficult, if not impossible. Technological contraband will result in the creation of basement labs and the rise of black markets. Where there's demand, there's a way.

And finally, while the Reversibility Principle might work for the environment and biotechnology, it most certainly will not work for the military. There is no precedent yet in human history of the pursuit of a weapons technology being abandoned due to its potential risks. It is the nature of the military to be in a perpetual search for the most sophisticated technologies.

Worse, once a military force gains possession of a weapon, it will never relinquish it. Global nuclear disarmament is a pipe dream. As the U2 album cover sarcastically asks, "How to Dismantle an Atomic Bomb?" -- the answer is you can't. Some things just can't be un-invented. Because of the devastating potential for biotech, cybernetics, robotics, nano-weapons and AI on the battlefield, you can bet that these technologies will be developed. And like many things that are developed by the military, the technologies will eventually trickle down to society.

Today, during the Information Age, the risk of proliferation has heightened dramatically. The world is dealing with this right now as Iran threatens to become nuclear capable. And with non-state actors increasingly threatening to acquire dangerous weapons, societies are becoming more police-like in their approach to surveillance and control. Our social and legal infrastructure is being moulded by technological and geopolitical pressures -- something that is clearly beyond reversibility.

Hopefully Cascio is right, and the Reversibility Principle can be applied to such realms as biotechnology and the environment. Change management is clearly an important issue, one that might even help us avoid preventable disasters. But pulling back on the reins during this time of globalization, powerful corporations, and accelerating change will be a truly difficult task indeed.

1 comment:

Nuclear disarmament is not a pipe dream; there is a basic assumption you make that leads to this mistake.

The assumption is that there is no difference in reversibility.

Wrong, because a difference exists.

Nuclear weapons cannot be disinvented, true.

Nuclear weapons can be banned and dismantled, as chemical and biological weapons were. True!

And then? And then someone may use, or threaten to use, nuclear weapons in the future (a terrorist, most likely), but the difference is between:

27,000 nuclear warheads + nuclear terrorist threat

and

0 nuclear warheads + nuclear terrorist threat.

If you consider that the nuclear terrorist threat is proportional to the spread of nuclear weapons + fissile materials (plutonium and highly enriched uranium), then you can see that the next priority is the safety of fissile materials in hospitals, labs, military labs, nuclear power plants, and nuclear waste disposal.

Take a look at www.bang-europe.org or www.abolition2000.org for further information.

George Dvorsky

A Canadian futurist, science writer, and ethicist, George Dvorsky has written and spoken extensively about the impacts of cutting-edge science and technology--particularly as they pertain to the improvement of human performance and experience. He is a contributing editor at io9, the Chairman of the Board at the Institute for Ethics and Emerging Technologies, and the program director for the Rights of Non-Human Persons program.