
Tuesday, 15 October 2013

Guest post from ALBERT-LÁSZLÓ BARABÁSI

The recent revelation that the
National Security Agency collects the personal data of United States citizens,
allies and enemies alike has broken the traditional model governing the bond
between science and society.

Most breakthrough technologies
have dual uses. Think of atomic energy and the nuclear bomb or genetic engineering
and biological weapons. This tension never goes away. Our only hope of overcoming it would be to stop all research.

But that is unrealistic. Instead,
the model we scientists follow is simple: We need to be transparent about the
potential use and misuse of our trade. We publish our results, making them
accessible to everyone. And when we do see the potential for abuse, we speak
up, urging society to reach a consensus on how to keep the good but outlaw the
bad.

As the NSA secretly developed its
unparalleled surveillance program, relying on a mixture of tools rooted in
computer and social sciences, this model failed. Scientists whose work fueled
these advances failed to forcefully articulate the collateral dangers their
tools pose. And a political leadership, intoxicated by the power of these
tools, failed to keep their use within the strict limits of the Constitution.

It’s easy to see why this
happened. After all, the benefits of Big Data and the science behind it are
hard to overlook. Beyond the many digital applications that make our lives easier today, data science holds promise for emergency response and for stopping the next virus from turning into a deadly pandemic. It also holds the key to our personal health, since our activity patterns and disease history are more predictive of our future diseases than our genes are.

For researchers involved in basic
science, like myself, Big Data is the Holy Grail: It promises to unearth the
mathematical laws that govern society at large. Motivated by this challenge, my
lab has spent much of the past decade studying the activity patterns of
millions of mobile phone users, relying on call patterns provided by mobile phone companies. This data was identical to what the NSA muscled away from
providers, except that ours was anonymized, processed to help research without
harming the participants. In a series of research papers published in the
journals Science and Nature, my team confirmed the promise of Big Data by
quantifying the predictability of our daily patterns, the threat digital
viruses pose to mobile phones and even the reaction people have when a bomb
goes off beside them.

We also learned that when it comes to our behavior, we cannot simply sort people onto two scales, one for good and the other for bad. Rather, our activity patterns are remarkably diverse: For any
act labeled “unusual” or “anomalous,” such as calling people at odd hours or
visiting sensitive locations outside our predictable daily routine, we will
find millions of individuals who do just that as part of their normal routine.
Hence identifying terrorist intent is more difficult than finding a needle in a
haystack — it’s more like spotting a particular blade of hay.

Let’s face it: Powered by the
right type of Big Data, data mining is a weapon. It can be just as harmful,
with long-term toxicity, as an atomic bomb. It poisons trust, straining
everything from human relations to political alliances and free trade. It may
target combatants, but it cannot succeed without sifting through billions of
data points scraped from innocent civilians. And when it is a weapon, it should
be treated like a weapon.

To repair the damage already
done, we researchers, with a keen understanding of the promise and the limits
of our trade, must work for a world that uses science in an ethical manner. We
can look at the three pillars of nuclear nonproliferation as a model for going
forward.

The good news is that the first
pillar, the act of nonproliferation itself, is less pertinent in this context:
Many of the technologies behind the NSA's spying are already in the public domain,
a legacy of the openness of the scientific enterprise. Yet the other two
pillars, disarmament and peaceful use, are just as important here as they were
for nuclear disarmament. We must inspect and limit the use of this new science
for military purposes and, to restore trust, we must promote the peaceful use
of these technologies.

We can achieve this only in
alliance with society at large, together amending universal human rights
with the right to data ownership and the right of safe passage.

Data ownership states that the
data pertaining to my activity, like my browsing pattern, shopping habits or
reading history, belongs to me, and only I control its use. Safe passage is the
expectation that the information I choose to transfer will reach its intended
beneficiaries without being tapped by countless electronic ears along the way.
The NSA, by indiscriminately tapping all communication pipelines, has degraded
both principles.

Science can counteract spying
overreach by developing tools and technologies that, by design, lock in these
principles. A good example of such a design is the Internet itself, built to be
an open system to which anyone could connect without vetting by a central
authority. It took decades for governments around the world to learn to censor
its openness.

This summer, while visiting my
hometown in Transylvania, I had the opportunity to talk with a neighbor who
spent years as a political prisoner. Once freed, he knew that for decades to come everything he uttered was being listened to and recorded. He received
transcripts of his own communications after the fall of communism. They spanned
seven volumes. It was toxic and dehumanizing, a way of life that America has
repeatedly denounced and fought against.

So why are we beginning to spread
communism 2.0 around the world, a quarter-century after the Iron Curtain’s
collapse? This is effectively what NSA surveillance has become. If we scientists stay silent, we all risk becoming digitally enslaved.

Posted with permission.

Albert-László Barabási is a
physicist and network scientist at Northeastern University and Harvard Medical
School, and the author of “Bursts: The Hidden Patterns Behind Everything We
Do.”




The activities leading to these results have received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 284709, project 'FuturICT', a Coordination and Support Action in the Information and Communication Technologies activity area.