Beyond Data Ownership to Information Sharing

The question of who owns our data on the Internet is a challenging problem. It can also be a red herring, distracting us from building the next generation of online services.

=20

The term “ownership” simply brings too much baggage from the physical world, suggesting a win-lose, us-versus-them mentality that retards the development of rich, powerful services based on shared information.

Unfortunately, sometimes the arguments behind these efforts are based on who owns – or who should own – the data. This is not just an intellectual debate or political rallying call; it often undermines our common efforts to build a better system.

Consider this:

Privacy as secrecy is dead
Data sharing is data copying
Transaction data has dual ownership
Yours, mine, & ours: Reality is complicated
Taking back ownership is confrontational

Privacy as secrecy is dead

First, the data is pretty much already out there. The issue isn’t “How do we keep data from bad people?” It’s “How do we keep people from doing bad things with data?” DRM, crypto, and related technologies as the sole means to prevent data leakage and data abuse are failures. Sooner or later, the bad guys break the system and get the data. Sure, there are smart things we can do to protect ourselves. Just like we wear seatbelts and lock our front doors, we should also use SSL and multi-factor authentication, but we can’t count on technology to keep our secrets. We need solutions that work even when the secret is out.

In fact, privacy isn’t about information we keep secret. It is about information we have revealed to someone else with an expectation of discretion, e.g., when we tell our doctor about our sexual activities. It’s no longer a secret from the doctor, but because it is private, we have rules that keep the information from being used inappropriately. Most of the time, with most doctors, it works. Those few who break those rules are dealt with through legal means, both civil and criminal, as well as social approbation. So, because we inherently need to release data to different parties at different times, we can’t control it through secrecy alone. Instead, we need to build a framework for preventing abuse when others do have access to sensitive information. As in the case with our doctor, we want our service providers to have the data they need to provide the highest quality services.

Data sharing is data copying

Second, in the world of atoms, there can only be one of a thing, which is the reverse of the world of bits. With atoms, even if there are copies, each copy is itself a singular thing. Selling, transferring, or stealing a thing precludes the original owner from continuing to use it.

This isn’t true for information, which can easily be sold, transferred, and stolen without disturbing the original version. In fact, the entire Internet is basically a copy machine, copying IP packets from router to router as we “send” images, web pages, and emails from user to user and machine to machine; each time, a new copy is created whether or not the originating copy is deleted. To think of bits as if they were ownable property leads to attempted solutions like DRM that try to technologically prevent access to the information within the data, which is only good until the first hacker cracks the code and distributes it themselves. Instead, if we build social and legal controls on use, we can give information more freely, but under terms set by each individual when they share that information. Enforced by social and legal rather than purely technological means, this makes the most of the low marginal cost of distributing online, while retaining control for contributors.

Transaction data has dual ownership

Third, much interesting data is actually mutually owned… which means the other guy can already do whatever the heck they want with it. Consider web attention data, the stream of digital crumbs representing the websites we’ve visited and any interactions at each: all our purchases, all our blog posts, all our searches. Everything. Some folks argue that we own that data and therefore have the right to control the use of it. But so too do the owners of the websites we’ve been visiting. We don’t own our HTTP log entries at Amazon. Amazon does. In fact, in every instance where two parties interact, where we engage in some transaction with someone else, both parties are co-creating that information. As such, both parties own it. So, if we tie the issue of control to ownership, then we’ve already lost the battle, because every service provider has solid claims to ownership over the information stored in their log files, just as we, as individuals, own the browsing history stored on our hard drives by Firefox, Internet Explorer, and Chrome.

In the movie Fast Times at Ridgemont High, in a confrontation with Mr. Hand (http://slice.seriouseats.com/archives/2010/01/video-jeff-spicoli-classroom-pizza-delivery-in-fast-times-at-ridgemont-high.html), Spicoli argues, “If I’m here and you’re here, doesn’t that make it our time?” Just like the time shared between Spicoli and Mr. Hand, the information created by visiting a website is co-created and co-owned by both the visitor and the website. Every single interaction between two endpoints on the web generates at least two owners of the underlying data.

This is not a minor issue. The courts have already ruled that if an email is stored for any period of time on a server, the owner of that server has a right to read the email. So, when “my” email is out there at Gmail or AOL or on our company’s servers, know that it is also, legally, factually, and functionally, already their data.

Yours, mine, & ours: Reality is complicated

Fourth, when two parties come together for any reason, each brings their own data to the exchange. We need a framework that can handle that. Iain Henderson breaks down this complexity in a blog post about your data, my data, and our data, talking about an individual doing business with a vendor, for example, someone buying a car.

“My data” means data that I, as an individual, have that is related to the transaction. It could include the kind of car I’m looking for, my budget, and estimates of my spouse’s requirements to approve of a new purchase.

“Your data” means data that the car dealer knows, including the actual cost of the vehicle, the number of units in inventory, the pace of sales, and current buzz from other dealers.

“Our data” means information that both parties have in common. That could be Shared Information, explicitly given by one party to the other in the course of the deal, such as a Social Security number so the dealer can run a credit check. It could be Mutual Information, generated by the very act of the transaction, such as the final sale price of the vehicle. Or it could be Overlapping Information, which each party happens to know independently, such as the Manufacturer’s Suggested Retail Price (MSRP) of a vehicle (which we found online before heading to the dealership).

The ownership of “your” and “my” data is usually clear. However, ownership of the different types of “our” data is a challenge at best. To complicate matters further, every instance of “my data” is somebody else’s “your data”. In every case, there is this mutually reciprocal relationship between us and them. In the VRM case, we usually think of the individual as owning “my data” and the vendor as owning “your data”, but for the vendor, the reverse is true: to them, their data is “my data” and the individual’s data is “your data”. Similar dynamics occur when the other party is an individual. I bring my data, you bring your data, and together we’ll engage with “our” data. We need an approach that respects and applies to everyone’s data: yours, mine, theirs, everybody’s.

In these complex Venn diagrams of ownership, who controls the data matters more than who owns it. We’ve already lost the crudest form of control – secrecy – and we are going to continue to lose more as we opt in to seductive new services based on divulging more and more information: our purchase history, browsing activity, and real-world location data. But we still need to control how all this data is used, to protect our own interests while still enjoying the benefits of the great big copy machine that is the Internet.

Taking back ownership is confrontational

Fifth, we don’t need to pick a fight to change the game. There is a lot of data out there that many of us believe we should have control over. I agree. A lot of people argue that we should have the right to exclude other people’s use because we own the data, because it’s ours in some legal, moral, or ethical framework. The problem is, those other people already have it, and they also believe that they are legitimate owners. In fact, many of them paid for that data, buying it from data aggregators who compile all sorts of things about people, from both public and private sources. This entire ecosystem of customer data is a multi-billion dollar business, and every single player “owns” the data they are working with. So if we focus our energy on claiming ownership over that same data in order to take control, we are framing the conversation as a fight, a fight against a powerful, well-heeled, well-funded, entrenched bunch of opponents.

Most of these “opponents” are the very people we are trying to win over to our way of thinking. These are the vendors we want to embrace a new way to do business. These are the technologists we want to transform their proven, value-generating CRM systems to work with our data on our terms, instead of their data on their terms. Arguing over ownership puts these potential allies on the defensive, when what we really want is their collaboration.


From Ownership to Authority, Rights, and Responsibilities

Rather than building a regime based on data ownership, I believe we would be better served by building one based on authority, rights, and responsibilities. That is, based on Information Sharing.

Who has the authority to control access and use of particular information?

What rights does a party have in using and distributing a piece of information?

What responsibilities does an information user have to others with respect to that information?

Let’s stop arguing about who owns what and start figuring out how we can share information in ways that allow everyone to win.

When we collect all of our information into a single conceptual repository, and then share access to it with service providers on our own terms, we create a high-quality, highly relevant, curated personal datastore. This allows us to bootstrap a control regime over all of our data in a way that creates new value for us and for our service providers. Now, instead of iTunes Genius or a Last.FM scrobbler (http://build.last.fm/category/Scrobblers) only having access to our media use with their service, they can provide recommendations based on all the information stored in our personal audio datastore. We get better recommendations and they get better data to drive their services. This personal datastore is entirely under the authority of the user, sharing information with service providers according to specific rights and responsibilities.
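To make the shape of that arrangement concrete, here is a minimal sketch of a datastore that mediates every read against user-set grants. Everything here – the class names, the permission and obligation strings – is a hypothetical illustration, not part of any Kantara specification:

```python
from dataclasses import dataclass, field

@dataclass
class SharingGrant:
    """Terms set by the individual for one service provider (illustrative)."""
    grantee: str          # service provider receiving access
    attributes: set       # which datastore attributes may be read
    permitted_uses: set   # e.g. {"recommendations"}
    obligations: set      # e.g. {"no-resale", "delete-on-revoke"}

@dataclass
class PersonalDatastore:
    data: dict
    grants: list = field(default_factory=list)

    def grant(self, g: SharingGrant) -> None:
        self.grants.append(g)

    def read(self, grantee: str, attribute: str, use: str):
        # Every read is mediated by the user's terms: authority stays
        # with the individual even though the bits get copied.
        for g in self.grants:
            if (g.grantee == grantee and attribute in g.attributes
                    and use in g.permitted_uses):
                return self.data[attribute]
        raise PermissionError(f"{grantee} may not use {attribute} for {use}")
```

Under this sketch, a music service granted access to our listening history for recommendations would be refused the very same data for ad targeting; control attaches to use, not to possession.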

The Information Sharing approach neatly sidesteps the complexities involved in the privacy and data portability issues of the information already known by service providers. These remain serious issues, worth addressing. Resolving them will require long-term investment in the legal, regulatory, moral, and political systems that govern our society. Fortunately, sharing the information in our personal datastore can begin almost immediately once we have working specifications.

This controlled sharing of information will dramatically increase our comfort level when revealing our intentions and interests. We would have control over the use – and would be able to prevent abuse – of that information, while making it easy for service providers to improve our lives in countless ways.

At the Information Sharing Work Group at the Kantara Initiative, Iain Henderson and I are leading a conversation to create a framework for sharing information with service providers, online and off. We are coordinating with folks involved in privacy and data portability, and we distinguish our effort by focusing on new information: information created for the purpose of sharing with others to enable a better service experience. Our goal is to create the technical and legal framework for Information Sharing that both protects the individual and enables new services built on previously unshared and unsharable information. In short, we are setting aside the questions of data ownership and focusing on the means for individuals to control that magical, digital pixie dust we sprinkle across every website we visit.

Because the fact is, we want to share information. We want Google to know what we are searching for. We want Orbitz to know where we want to fly. We want Cars.com to know the kind of car we are looking for.

We just don’t want that information to be abused. We don’t want to be spammed, telemarketed, and adverblasted to death. We don’t want companies stockpiling vast data warehouses of personal information outside of our control. We don’t want to be exploited by corporations leveraging asymmetric power to force us to divulge and relinquish control over our addresses, dates of birth, and the names of our friends and family.

What we want is to share our information, on our terms. We want to protect our interests and enable service providers to do truly amazing things for us and on our behalf. This is the promise of the digital age: fabulous new services, under the guidance and control of each of us, individually.

And that is precisely what the Information Sharing Work Group at Kantara is enabling.

The work is a continuation of several years of collaboration with Doc Searls and others at ProjectVRM. We’re building on the principles and conversations of Vendor Relationship Management and User Driven Services (http://blog.joeandrieu.com/2009/04/26/introducing-user-driven-services/) to create an industry standard for a legal and technical solution to individually driven Information Sharing.

Our work group, like all Kantara work groups, is open to all contributors – and non-contributing participants – at no cost. I invite everyone interested in helping create a user-driven world to join us.


It should be an exciting future.

This article was first published independently by Joe Andrieu as “Beyond Data Ownership to Information Sharing” on January 21, 2010. It was submitted to the Kantara Information Sharing Work Group for consideration and further development on March 1, 2010.