Tuesday, May 31, 2011

Federal health officials call it the Wall of Shame. It’s a government Web page that lists nearly 300 hospitals, doctors and insurance companies that have reported significant breaches of medical privacy in the last couple of years.

Such lapses, frightening to consumers, could impede the Obama administration’s effort to shift the nation to electronic health care records.

“People need to be assured that their health records are secure and private,” Kathleen Sebelius, secretary of health and human services, said in an interview by phone. “I feel equally strongly that conversion to electronic health records may be one of the most transformative issues in the delivery of health care, lowering medical errors, reducing costs and helping to improve the quality of outcomes.”

So the administration is making new efforts to enforce existing rules about medical privacy and security. But some health care experts wonder if the current rules are enough or whether stronger laws are needed, for example making it a crime for someone to use information obtained improperly.

“The consequences of breaches matter,” conceded Dr. Farzad Mostashari, a former New York public hospitals official who recently became the Obama administration’s national coordinator for health information technology. “People say they are afraid that if their private information becomes known, they may not be able to get health insurance.”

In the last two years, personal medical records of at least 7.8 million people have been improperly exposed, according to the government data. One particularly egregious case involved information about 1.7 million patients, staff members, contractors and suppliers of Bronx hospitals and clinics operated by the Health and Hospitals Corporation, the New York public health agency. Their electronic files were stolen from an unlocked van belonging to a record management company.

The affected patients got the disquieting news that their medical and personal information, like Social Security numbers, had been compromised when their health care providers notified them under federal rules.

Showing just how lax security can be, the inspector general of the Department of Health and Human Services said two weeks ago that the agency had found dozens of vulnerabilities in systems to protect records of patients at seven large hospitals in New York, California, Illinois, Texas, Massachusetts, Georgia and Missouri. Auditors cited such problems as personal information that was not encrypted and was stored on computers that could easily be accessed by unauthorized users.

Auditing teams are now inspecting eight more hospitals, said Lori Pilcher, an assistant inspector general at Health and Human Services. The hospitals are not being identified to avoid alerting hackers, she said.

Another big breach was reported in March on the official Web site by Health Net, a California-based insurance company, which notified 1.9 million health plan members that records with their personal information were missing. Health Net said I.B.M., which was managing its information system, told the insurer that the records could not be found.

“The health care industry is not as vigilant as they should be about protecting private information in a patient’s medical records,” said Representative Joe L. Barton, a Texas Republican who is co-chairman of the Bipartisan Privacy Caucus in the House.

Mr. Barton knows from personal experience. His own records after a heart attack, along with several thousand others from a research project at the National Institutes of Health, were “on a disk in a laptop in somebody’s trunk that disappeared,” he recalled. “I was stunned.”

The Obama administration has levied a series of stiff penalties for egregious violations of patient rights under the most commonly cited law, the Health Insurance Portability and Accountability Act, or HIPAA, of 1996. Health information is supposed to stay private under those rules, but research has shown that it is not that difficult to connect names and addresses to nominally anonymous data with Internet searches and computerized matchups.

The Office of Civil Rights at Health and Human Services, which took over enforcement of the law, imposed a $1 million fine on Massachusetts General Hospital in March after a hospital employee left paper records of 192 patients on a Boston subway train. The hospital agreed in a settlement, without admitting wrongdoing, to report twice a year on its efforts to tighten patient protections.

Earlier this year, the civil rights office fined a Maryland health plan, Cignet Health, $4.3 million, saying that it had denied patients the right to see their own records in violation of HIPAA provisions. It was the first civil penalty levied under the HIPAA law. “We have ramped up our enforcement,” said Georgina C. Verdugo, director of the civil rights office.

But Dr. David Brailer, a Bush appointee as the first national coordinator of health information technology, is skeptical about whether such efforts will curb security breaches. “We can’t just lock health care data away — because of its role in lifesaving treatment,” Dr. Brailer said. He said that even with the best technology it would be hard to make health systems secure. “It’s a huge challenge. Break-ins and hacks are unfortunately going to be part of the landscape,” he said.

One protection, he suggested, would be laws to make it illegal for an insurer or employer to discriminate against a person based on information about health conditions like H.I.V./AIDS, cancer and mental health problems.

As a model, he pointed to the antidiscrimination law to prevent the misuse of genetic information that was passed with bipartisan support in the Bush administration. He also said he believed the laws should say “patients own the data, period, and decide what happens to it. The patient should be able to say to Hospital X: ‘Send my data to Hospital Y because I’m changing hospitals.’”

Today, the information belongs to whoever possesses it, under ideas inherited from 17th-century English common law, he said. “If it gets into your database, essentially you own it,” he added, “and you can pass it on.”

“Today HIPAA makes no sense,” Dr. Brailer added. “The law didn’t anticipate a world where your data passes through many, many hands.”

Wes Rishel, a longtime health care analyst for Gartner, the technology consulting firm, and an adviser to the national coordinator’s office, has a similar view. “Your ability to control access to your information is a horse that is already out of the stable,” he said. “What is really needed is legislation that controls use of it.”

On that score, researchers at Carnegie Mellon University have shown that at least 30 people and organizations have access to the health data of a typical person with private insurance through an employer. They range from pharmacies and drug companies to an employer’s wellness programs and a spouse’s self-insured employer.

“Only you, your doctor and hundreds of others know,” said Latanya Sweeney, a health privacy expert at Harvard and Carnegie Mellon who is also an adviser to the office headed by Dr. Mostashari.

Since HIPAA was enacted there has been “an explosion in data sharing,” Ms. Sweeney said. “And after electronic records are widely adopted, there will be another big explosion.”

Thursday, May 26, 2011

My fellow Hogan Lovells Privacy and Information Management practice leader, Marcy Wilder, and I are delegates to the eG8 Forum in Paris, where later today I will be a speaker at the session on privacy. The gathering has provided a remarkable opportunity for the sharing of ideas and perspectives on the future of the Internet.

Here are my prepared remarks for the privacy session at the eG8 Forum:

As the only privacy lawyer on today's panel, I appreciate the opportunity to share my perspectives. As we all know, data is the raw material of our Information Age. But the scale and scope of data collection and use are accelerating in ways previously unimaginable. The Internet, mobile devices, and new forms of networked sensors are combining to produce more and more data that can be collected, analyzed, shared and stored.

Thus, according to a new McKinsey study we heard about yesterday here at the eG8 Forum, we are entering the era of “big data,” the label for the vast and increasing amounts of digital information being produced every day. The potential of big data, according to McKinsey, is more efficient and competitive businesses, a stronger world economy and better-served consumers, including better health care services. The experts at McKinsey are concerned, however, that before the end of the decade there will not be enough trained personnel to analyze all of the data. They also note the issue of personal privacy, an issue underlying the growing concern about the amount of data being collected about our lives and used by businesses, often without our knowledge or consent.

While not a focus of the McKinsey study on big data, the world leaders gathering soon in Deauville, France for the annual G8 Summit will be considering the issue of privacy as they address the agenda item on how best to advance the Internet. Presumably, they understand – as a US Commerce Department report recently noted – that if privacy concerns increase, trust in the Internet will decrease, creating an economic drag on the Internet’s potential. The G8 leaders will be informed by our work.
And I hope our discussion of Internet privacy will not divide on geographic lines, with representatives from the EU, which has an omnibus privacy law, expressing disdain for the American targeted approach to privacy protection, and those with a US orientation complaining about over-regulation of privacy. If that is how the discussion evolves, that will be too bad, for there is greater need than ever for global strategies to protect privacy, and countries on both sides of the Atlantic have much to learn from each other.

To be sure, the regional approaches to privacy protection differ even as we share a commitment to the OECD’s Fair Information Practice Principles. In the EU, the Data Protection Directive, implemented through national legislation, is an across-the-board regulation of personal data that places strict limits on the collection, use and retention of personal information. The US, by contrast, has chosen to legislate at the federal level with respect to sensitive data such as health, financial and children’s data, and to target enforcement on privacy violations through the regulatory powers of the Federal Trade Commission and state attorneys general. A number of states have stepped in, too, to regulate the collection, use and security of personal data. Nearly all of the states have data security breach notification laws to inform people when their personal data is at risk.

Privacy self-regulation by businesses and industry groups also is an American tradition, as more and more companies recognize that violations of privacy tarnish brands and alienate consumers. As the privacy think tank I founded and co-chair, the Future of Privacy Forum, has noted, the recent initiative by industry to empower consumers to stop online tracking of their web activities by advertisers is an example of a self-regulatory effort to protect privacy.
While the American approach to privacy may be untidy, in contrast to an omnibus law, a recent Berkeley study concluded that the combination of laws and increased attention by business to the importance of privacy has led to a notably more privacy-protective environment than existed in the 1990s. And there is recognition in the US that more has to be done to protect privacy. A report from the Federal Trade Commission will be finalized soon on new approaches to privacy protection, and legislators on Capitol Hill are focusing on privacy as never before.

Still, the EU takes the position that the US lacks “adequate protection” for the personal data of EU citizens and thus bans the cross-border transfer of such data to the US unless special legal undertakings are made by US businesses to receive the data. In the US, with our First Amendment traditions, we have trouble understanding the justification for certain EU legal actions in the name of privacy, such as "super injunctions" preventing "tweets" naming litigants in civil actions, or enforcement of the so-called “right to be forgotten” against a search engine merely for linking to an unflattering article about someone on the Web. Nor do we understand how a Google executive can be convicted criminally for a random posting by a YouTube user that was said to violate personal privacy.

Despite these differences, there is an emerging consensus on both sides of the Atlantic that people are entitled to greater privacy protections. There is much that can be done cooperatively to advance such protections, like cooperation in cross-border enforcement against multi-national privacy violators, and the adoption of “Privacy by Design” as a standard to be followed by businesses at every stage in the development of new technologies. In the era of big data, privacy is too important to be overshadowed by claims of legal framework superiority.
The eG8 and G8 are good places to sound the chord of cooperation in the advancement of personal privacy. I am pleased to be part of the discussion.

Tuesday, May 24, 2011

During the past few months, more than 250 million Americans received the frightening news that their personal information, routinely collected by the many retailers where they shop, had been stolen by hackers.

Sixty-one million Americans who own a smartphone were told that their travels and movements are being tracked by the companies that service their smartphones and shared with app providers without restriction on how the information is used. (We don’t want to imply that Apple or Google phones steal information; they don’t, but the apps on the phones collect and use it without sufficient protections or information for consumers.) And 77 million Americans learned that personal information stored in their online gaming systems was lifted by hackers.

Almost every American is vulnerable to the loss, theft or unanticipated use of their information, because in this digital age we routinely turn over personal information to online retailers, social networks and other services in growing numbers.

Americans are concerned, and rightly so. Is the requirement that you provide such information and cede control of it simply the price of doing business in today’s digital economy? It shouldn’t be. That is why we introduced a Commercial Privacy Bill of Rights — to put Americans back in control of their personal information.

Last year, Internet users sent 107 trillion emails, Facebook hosted 600 million users, Twitter hosted 155 million tweets per day, and Americans across the country shared personal data when checking into hotels, shopping for groceries and refilling their cars. In many ways, all this information sharing is good for consumers. When companies collect data and use it with high ethical standards and the full knowledge and participation of their customers, they can generate immense economic activity, innovate and tailor the services they deliver to the clients they serve.

But today the data collectors are setting the rules. Companies can harvest our personal information and keep it for as long as they like. They can use it and sell it without asking permission. You shouldn’t have to be a computer genius to figure out how to opt out of a company’s information sharing policy. In short, these companies, from mobile phone operators to hotels to websites, can do almost whatever they want with our personal information, and we have no legal right to stop them.

That’s why we introduced the Commercial Privacy Bill of Rights: to keep our private data safe by laying down fair information practices for anyone collecting it. Our legislation will ensure that businesses collecting personal information secure that information, tell people why their data is being collected and allow people to have a say in whether they want their information used. If these companies turn around and transfer this information, any agreements they have made to secure the privacy of their consumers’ information would travel along with it. And if people ask a company to stop using their personal information, they will finally have the legal power to make that demand.

We also recognize that it’s important to allow for experimentation and flexibility in the implementation of privacy practices. The Commercial Privacy Bill of Rights does that by establishing voluntary safe-harbor programs to allow companies to design their own privacy programs for complying with the law. They could implement protections however they wanted as long as they still achieved privacy protections on par with the standards set out in the law.

The business community is already responding to the concerns of consumers and regulators by recognizing that the time has come to establish these types of consumer-privacy protections. Industries are negotiating among themselves to establish uniform data collection and use practices. Three of the major Internet browser services have already created tools allowing their users to express their preferences regarding their personal information. Many companies are now making massive investments in privacy protection for their own customers — including employing chief privacy officers to ensure that they earn, retain and respect the trust of consumers.

These companies see that it doesn’t just make good business sense to protect customers’ private information. They know it’s the right thing to do, and we want to take that good work and make it common practice for everyone.

Monday, May 23, 2011

If you’re obsessive about your health, and you have $100 to spare, the Fitbit is a portable tracking device you can wear on your wrist that logs, in real time, how many calories you’ve burned, how far you’ve walked, how many steps you’ve taken, and how many hours you’ve slept. It generates colorful graphs that chart your lifestyle and lets you measure yourself against other users. Essentially, the Fitbit is a machine that turns your physical life into a precise, analyzable stream of data.

If this sounds appealing — if you’re the kind of person who finds something seductive about the idea of leaving a thick plume of data in your wake as you go about your daily business — you’ll be glad to know that it’s happening to you regardless of whether you own a fancy pedometer. Even if this thought terrifies you, there’s not much you can do: As most of us know by now, we’re all leaving a trail of data behind us, generating 0s and 1s in someone’s ledger every time we look something up online, make a phone call, go to the doctor, pay our taxes, or buy groceries.

Taken together, the information that millions of us are generating about ourselves amounts to a data set of unimaginable size and growing complexity: a vast, swirling cloud of information about all of us and none of us at once, covering everything from the kind of car we drive to the movies we’ve rented on Netflix to the prescription drugs we take.

Who owns the data in that cloud has been the subject of ferocious debate. It’s not all stored in one place, of course — our lives are tracked and documented by a diffuse assortment of entities that includes private companies like Google and Visa, as well as governmental agencies like the IRS, the Department of Education, and the Census Bureau. Up to now, the public conversation on this kind of data has taken the form of an argument about privacy rights, with legal scholars, computer scientists, and others arguing for tighter restrictions on how our data is used by companies and the government, and consumer advocates instructing us on how to prevent our information from being collected and misused.

But a small group of thinkers is suggesting an entirely new way of understanding our relationship with the data we generate. Instead of arguing about ownership and the right to privacy, they say, we should be imagining data as a public resource: a bountiful trove of information about our society which, if properly managed and cared for, can help us set better policy, more effectively run our institutions, promote public health, and generally give us a more accurate understanding of who we are. This growing pool of data should be public and anonymous, they say — and each of us should feel a civic responsibility to contribute to it.

In a paper forthcoming in the Harvard Journal of Law & Technology, Brooklyn Law School professor Jane Yakowitz introduces the concept of a “data commons” — a sort of public garden where everyone brings their data to be anonymized and made available to researchers working in the public interest. In the paper, she argues that the societal benefits of a thriving data commons outweigh the potential risks from the crooks and hackers who might use it for harm.

Yakowitz’s paper has found support among a wider movement of thinkers who believe that, while protecting people’s privacy is certainly important, it should not be our only priority when it comes to managing information. This position might be a hard sell at a time when consumers are increasingly worried about mass data leaks and identity theft, but Yakowitz and others argue that we shouldn’t let fear of such inevitable accidents cloud our ability to see just how necessary data collection is to our progress as a society.

“There are patterns and trends that none of us can discern by looking at our own individual experiences,” Yakowitz said. “But if we pooled our information, then these patterns can emerge very quickly and irrefutably. So, we should want that sort of knowledge to be made publicly available.”

The idea of sharing one’s personal information with researchers and policy makers for the good of society has a long history in the United States, dating back to the early years of the national census in the 1790s. Back then, a failure to comply with the census was considered a serious abdication of one’s duty to the state. According to Douglas Sylvester, a law professor at Arizona State University, that attitude was grounded in a fundamental belief that in order to run a fair democracy, the country’s leaders needed a detailed knowledge of the people they were governing. Anyone who stood in the way of that was publicly shamed.

“During the early years of the census, your name and your economic information were posted — literally posted, on a sheet of paper — in the public square, for anyone to come and see,” said Sylvester, who has written extensively on the history of data-collection and privacy in America. “The idea was that if your name did not appear, your peers would know that you had not cooperated. Providing this information was a civic obligation.”

Of course, census workers still speak of responsible citizenship and good government when they knock on your door and implore you to fill out their forms, and technically, not doing so is still illegal. But the idea that we owe it to our fellow men to share our information with the public is long gone — and the fact that we think of it as “our” information provides a hint as to what has changed. At some point, privacy experts say, Americans started thinking of their personal data as a form of property, something that could be stolen from them if they didn’t vigilantly protect it.

It’s hard to pinpoint exactly when this transformation began, but its roots lie in the dramatic expansion of administrative data-collection that began around the turn of the last century. A more urban and industrialized nation with more public programs meant that more information was being submitted to government agencies, and eventually, people started getting possessive. Then, during the 1960s and ’70s, according to Sylvester, the Watergate scandal and advancements in computing power made people even more nervous about government monitoring, and the notion that one’s personal information required protection from hostile outside forces became deeply ingrained in the nation’s psyche.

“Property rules are where people end up going when something is new and uncertain,” said Yakowitz. “When we aren’t sure what to do with something new, there are always a lot of stakeholders who claim a property interest in it. And I think that’s sort of what happened with data.”

Yakowitz came face-to-face with this attitude, and realized how severely it might impede scholarship, as a researcher at UCLA four years ago, when she was working on a study on affirmative action and student performance. Trying to obtain the data sets she needed for her work proved to be an immensely frustrating experience, the 31-year-old said: Some of the schools that kept the records she was after were uncooperative, and in one case, individual graduates who had heard about her research objected to having their information included in her analysis despite the fact that it had been scrubbed of anything that personally identified them.

Yakowitz was disturbed by the fact that her research could be thwarted just because a few people didn’t want “their” data being used in ways they hadn’t anticipated or agreed to. The experience had a galvanizing effect on Yakowitz, causing her to think more pointedly about how Americans understood their relationship to data, and how their attitudes might be at odds with the public interest. Her concept of a “data commons” came out of that thought process. The underlying goal is to revive the idea that sharing our information — this time, without our names attached — should be seen as a civic duty, like a tax that each of us pays in exchange for enjoying the benefits of what researchers learn from it.

Yakowitz began giving presentations on the data commons in February — she visited Google earlier this month to discuss the idea — and although it won’t officially be published until the fall, her paper has already begun attracting attention among people who care about data and privacy law. In it, she reviews the literature on so-called re-identification techniques — the ways that hackers and criminals might cross-reference big, anonymous data sets to figure out information about specific individuals. Yakowitz concludes that these risks have been overblown, and don’t outweigh the social benefits of having lots of anonymized data publicly available. The importance currently placed on privacy in our culture, she says, is the result of a “moral panic” that ultimately hurts us.

She joins a small chorus of voices from law, technology, and government — united under the banner of a movement known as open data — who are already arguing that the benefits of opening up government records and generally disseminating as much data as possible outweigh the costs.

“If you look at the kinds of concerns that we have as a society, they involve questions about health and our economy, and these are all issues which, if they’re to be addressed from an empirical point of view, require actual data on individuals and organizations,” said George T. Duncan, a professor emeritus at Carnegie Mellon University’s Heinz College, who has written about the tension between privacy and the social benefits of data. “Privacy advocates are so locked into their own ideological viewpoint...that they fail to appreciate the value of the data.”

The potential value of data has arguably never been greater, for the simple reason that there’s never been as much of it collected as there is today. According to a report published this month by the consulting firm McKinsey & Co., 15 out of 17 sectors of the American economy have more data stored per company than the entire Library of Congress. One example of data being leveraged for the public good in a way that would have been unthinkable a short time ago is Google Flu Trends, a tool that helps users track the spread of flu by telling them where, and how often, people are typing in flu-related search terms. The Global Viral Forecasting Initiative, based in San Francisco, uses large data sets provided by cellphone and credit card companies to detect and predict epidemics around the world. In Boston recently, a group of researchers commissioned by the governor to study local greenhouse emissions obtained data from the Registry for Motor Vehicles — which keeps inspection records on every car in the city — to find out how much Bostonians were driving.

But advocates of the open data movement see these applications as just a hint of its potential: The more access researchers have to the vast amount of data that is being generated every day, the more accurate and wide-ranging the insights they’ll be able to produce about how to organize our cities, educate our children, fight crime, and stay healthy.

Marc Rodwin, a professor at Suffolk University Law School, has argued for a system in which patient records collected by hospitals and insurance companies — which are currently considered private property, and are routinely purchased in aggregate by pharmaceutical companies — are managed by a central authority and made available, in anonymized form, to researchers. “You can find out about dangerous drugs, you can find out about trends, you can compare effectiveness of different therapies, and the like,” he said. “But if you don’t have that database, you can’t do it.”

Even as such ideas ripen in some corners of the academy and government, proponents of open data are the first to admit that the culture as a whole seems to be heading in the opposite direction. More and more, people are bristling as they realize that everything they do online — including the e-mails they send their friends and the words they search for on Google — is being tracked and turned into data for the benefit of advertisers. And they are made understandably nervous by large-scale data breaches like the one reported last week in Massachusetts, which resulted in as many as 210,000 residents having their financial information exposed to computer hackers. In light of such perceived threats, it’s no wonder the words of privacy advocates are resonating.

Yakowitz and the open data advocates acknowledge that these are reasonable fears, but point out that they won’t be solved by locking down data further. The most damaging breaches, they argue, happen when thieves hack into private sources like credit card processors that are supposedly secure. When we respond by imposing tighter controls on the dissemination of anonymized data, we’re just ensuring that it can’t be used where it might do the most public good.

“The same groups that get really concerned about privacy issues are also the groups that call for more efficiently targeted government resources,” said Holly St. Clair, the director of data services at the Metropolitan Area Planning Council in Boston, where she works on procuring governmental data sets for research purposes. “The only way to do that is with more information — with better information, with more timely information.”

The problem with this vision of the future, according to some privacy experts, is not that large amounts of data don’t come with obvious public benefits. It’s that Yakowitz’s argument presumes a level of anonymization that not only doesn’t exist, but never will. Given enough outside information to draw on, they say, bad actors will always be able to cross-reference data sets with each other, figure out who’s who, and harm individuals who never explicitly agreed to be included in the first place.

In one famous case back in 1997, Carnegie Mellon professor of computer science Latanya Sweeney was able to match publicly available voter rolls to a set of supposedly anonymized medical data, and successfully identify former Massachusetts Governor William F. Weld.

According to Sweeney, currently a visiting professor at Harvard and an affiliate of the Berkman Center for Internet and Society, 87 percent of the US population can be identified by name in this way, based only on birthday, ZIP code, and gender. Sweeney called Yakowitz’s paper on the data commons “irresponsible” for dismissing the risk of re-identification.
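The linkage attack Sweeney describes amounts to a join between an “anonymized” data set and a public one on shared quasi-identifiers. A minimal sketch, using invented records purely for illustration:

```python
# Sketch of a linkage ("re-identification") attack: joining an
# anonymized data set to a named public one on the quasi-identifiers
# they share. All records below are invented for illustration.

# "Anonymized" medical data: names removed, quasi-identifiers kept.
medical = [
    {"zip": "02138", "birthdate": "1945-07-31", "sex": "M", "diagnosis": "hypertension"},
    {"zip": "02139", "birthdate": "1962-03-14", "sex": "F", "diagnosis": "asthma"},
]

# Public voter roll: names attached to the same quasi-identifiers.
voters = [
    {"name": "W. Weld", "zip": "02138", "birthdate": "1945-07-31", "sex": "M"},
    {"name": "J. Doe", "zip": "02139", "birthdate": "1962-03-14", "sex": "F"},
]

def reidentify(medical, voters):
    """Match anonymized records to named ones on (zip, birthdate, sex)."""
    index = {(v["zip"], v["birthdate"], v["sex"]): v["name"] for v in voters}
    matches = []
    for rec in medical:
        key = (rec["zip"], rec["birthdate"], rec["sex"])
        if key in index:  # a unique triple pins the record to one person
            matches.append((index[key], rec["diagnosis"]))
    return matches

print(reidentify(medical, voters))
# → [('W. Weld', 'hypertension'), ('J. Doe', 'asthma')]
```

The attack needs no hacking at all: whenever a (ZIP, birthdate, sex) triple is unique in both data sets, as Sweeney estimates it is for 87 percent of Americans, the join hands back a name.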

There are other practical obstacles as well: Data, in today’s economy, is extremely valuable. Even if data sets could be made truly anonymous, Sweeney asks, why should we expect the huge private collectors of data — companies like Google and Facebook, whose business depends on their ability to maintain an exclusive trove of data on their customers — to share what they have for the public good? As data-gathering becomes bigger and bigger business, it might become more valuable to society — but it also becomes an asset that companies will fight harder to protect.

As far as Yakowitz is concerned, that’s all the more reason to try to bring about a shift in the way our culture views data. To that end, she proposes granting legal immunity to any entity that releases data into the commons, protecting them from privacy litigation under the condition that they follow a set of strictly enforced standards for anonymization. She also hopes that framing data as a public resource — something that belongs, collectively, to all of us who generate it — will give the public some leverage over big private companies to make their information public.

“Right now I feel like the public gets the rawest deal, because a lot of data is collected, and it’s shared with any company that the private data-collector cares to share it with. But there’s no guarantee that they’ll share it with researchers who are working in the public interest,” Yakowitz said. “Maybe I don’t go far enough — maybe we should force these companies to share with researchers. But that’s for another day, I guess.”

Friday, May 20, 2011

Despite the recent revelations, and subsequent Congressional hearings, about the use (and misuse) of personal data by companies doing business on the Internet, those companies aren’t about to stop collecting such data and trying to use it to improve their results.

And why should they, when the more data companies use, the better their chances of selling you more products and services, at better returns? According to Sandy Pentland, a professor at MIT’s Media Lab, the best chance people may have of controlling their data online is a modern version of “if you can’t beat them, join them.”

Mr. Pentland has been working with private companies, consumer advocates and regulators around the world to find an approach to data collection that satisfies all those groups. The simplest and most logical approach, he said, would be one that lets consumers manage their own data and receive compensation in exchange for making it available to firms that want to market to them.

“Your data becomes a new asset class, you have more control over the information, and it becomes your most lucrative asset,” he said during a panel discussion at the MIT Sloan CIO Symposium in Cambridge Wednesday.

Another panelist, Erik Brynjolfsson, an MIT Sloan professor of digital business, predicted that society will suffer a “catastrophe” involving the misuse of personal data, but said this doesn’t mean we should stop “experimenting” with the Internet.

After the session, he told Digits, “We don’t know if data ownership is going to solve the problem.” But nevertheless, he said, as a society, “we have to maintain an experimental, open mind.”

He noted that when cars were first introduced, they caused tens of thousands of highway deaths per year until we learned best practices, like stop lights, and created technology, like disc brakes, that made driving safer. But we didn’t stop using cars in the meantime.

And the potential benefits of using this data are just too good for private companies to pass up, he said. Brynjolfsson led a study of 330 companies and found that those that said they rely on data rather than intuition to guide business decisions had between 4% and 6% higher productivity, and 5% higher revenues, than those that didn’t. “It’s statistically significant,” he said.

Tuesday, May 17, 2011

Privacy regulators are embracing privacy by design as never before. This is the idea that “building in” privacy throughout the design and development of products and services achieves better results than “bolting it on” as an afterthought. In the US, a recent FTC staff report makes privacy by design one of the three main components of a new privacy framework.

According to the FTC, firms should adopt privacy by design by incorporating substantive protections into their development practices and implementing comprehensive data management procedures; the latter may also require a privacy impact assessment (PIA) where appropriate. In contrast, European privacy officials view privacy by design as also requiring the broad adoption of Privacy Enhancing Technologies (PETs), especially PETs that shield or reduce identification or minimize the collection of personal data.

Despite the enthusiasm of privacy regulators, privacy by design and PETs have yet to achieve widespread acceptance in the marketplace. One reason is that Internet firms derive much of their profit from the collection and use of personal data and may be unwilling to build in privacy if it disrupts profitable activities or new business ventures. Nor does the available evidence support the view that privacy by design pays for itself (except perhaps for a small group of firms who must protect privacy to maintain highly valued brands and avoid reputational damage). At the same time, the regulatory implications of privacy by design remain murky at best, not only for adopters but also for free riders.

This Article seeks to clarify the meaning of privacy by design and thereby suggest how privacy regulators might develop appropriate incentives to offset the certain economic costs and uncertain privacy benefits of this new approach. It begins by developing a taxonomy of PETs, classifying them as substitutes or complements depending on how they interact with data protection or privacy laws. Substitute PETs aim for zero disclosure of PII, whereas complementary PETs give users greater control over their personal data. Next, it explores the meanings of privacy by design in the specific context of the FTC’s emerging concept of “comprehensive information privacy programs.” It also examines the activities of a few industry leaders, who rely on engineering approaches and related tools to implement privacy principles throughout the product development and data management lifecycles.

Building on this analysis and using targeted advertising as its primary illustration, the Article then suggests how regulators might achieve better success in promoting the adoption of privacy by design by 1) identifying best practices in privacy design and development, including prohibited practices, required practices, and recommended practices; and 2) situating best practices within an innovative regulatory framework that a) promotes experimentation with new technologies and engineering practices; b) encourages regulatory agreements through stakeholder representation, face-to-face negotiations, and consensus-based decision making; and c) supports flexible, incentive driven safe harbor mechanisms as defined by (newly enacted) privacy legislation.

Friday, May 13, 2011

On May 10 executives from Google (GOOG) and Apple (AAPL) participated in the time-tested Washington ritual of a congressional grilling. Alarmed by revelations that smartphones store data on users' locations, legislators demanded details on the companies' privacy policies. "Consumers have a fundamental right to know what data is being collected about them," said Minnesota Democrat Al Franken, who called the hearing. "They have a right to decide whether they want to share that information, with whom they want to share it, and when."

Alan Davidson, Google's director of public policy, and Apple's vice-president for software technology, Bud Tribble, defended their employers' handling of user-location information and said the companies do not track individual customers.

Lawmakers are trying to determine what new rules are needed in the era of 24/7 connectivity. "The flash point is the mobile device," says Jeff Chester, executive director of the Washington-based Center for Digital Democracy. "The ability to combine one's behavior with one's location is about to create a political firestorm."

A series of high-profile data-security breaches has heightened concern about privacy. Sony (SNE) shut down its PlayStation Network last month after its online entertainment and game systems were hacked, compromising some 100 million personal accounts. JPMorgan Chase (JPM), Best Buy (BBY), and Target (TGT), along with some 17 other companies, disclosed last month that customers’ e-mail addresses were exposed after cyber thieves hacked into databases at Alliance Data Systems’ (ADS) Epsilon Data Management. “The recent spate of security breaches is off the charts,” says Marc Rotenberg, executive director of the Electronic Privacy Information Center (EPIC).

A consensus has formed in Washington that the patchwork of federal and state privacy laws has not kept pace with the development of the Internet. The U.S. lags Europe, where broad safeguards of personal digital information have been in place since 1995.

One proposal by Senator Jay Rockefeller (D-W. Va.) would create a "do not track" mechanism, similar to the "do not call" list that freed U.S. households from the tyranny of telemarketers. Under the Rockefeller bill, consumers could elect whether to have their browsing data collected.
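As browsers were beginning to implement it, “do not track” took the form of a one-bit HTTP request header (DNT) that sites are asked to honor. A minimal server-side check might look like the following; the helper name is illustrative, not from any proposal:

```python
# Illustrative check for the "Do Not Track" (DNT) request header, the
# browser-side mechanism for expressing the preference that proposals
# like the Rockefeller bill would make binding on data collectors.

def should_track(headers):
    """Return False when the client has asked not to be tracked.

    `headers` is any mapping of HTTP request headers. "DNT: 1" means
    the user opted out of tracking, "DNT: 0" means the user consented,
    and an absent header means no preference was expressed.
    """
    return headers.get("DNT") != "1"

print(should_track({"DNT": "1"}))  # → False: honor the opt-out
print(should_track({}))            # → True: no preference expressed
```

The header only expresses a preference; absent legislation, whether a site actually stops collecting browsing data is up to the site.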

Such proposals may indicate that the era of the freewheeling Web is drawing to a close. "The original policy was to treat the Internet like a hothouse flower that had to be protected," says Cameron F. Kerry, general counsel to the Commerce Dept. The agency is now calling for a Consumer Bill of Rights that would establish baseline privacy practices and bring the U.S. more in line with Europe. At present, European companies can't send personal data to countries that lack equivalent levels of protection.

At the Federal Trade Commission, Chairman Jon Leibowitz has made privacy a priority. The agency issued a report in December calling for a "do not track" mechanism. The report also pressed companies to make their data policies easy for consumers to understand. "Privacy policies don't translate well to smartphones," says David Vladeck, head of the FTC's Bureau of Consumer Protection. "They are already hard to read on the Internet, but if you are in the car in the middle of Nebraska Avenue, it can be even harder."

In the wake of the FTC report, Microsoft (MSFT), Mozilla, and Google all incorporated tracking-protection features into the latest versions of their browsers. The Digital Advertising Alliance, a coalition of trade associations, in October introduced an opt-out button that allows consumers to indicate they don't want their online behavior collected. Google in March also unveiled a tool that lets users block unwanted websites.

Although some technology companies have taken action in hopes of forestalling additional regulation, it's unlikely that the industry on its own can agree on adequate policies that balance privacy and profits. Spending on online advertising is projected to almost double to $44 billion in 2016, from $26 billion last year, according to Alex Feldman, manager of global forecasting at MagnaGlobal (IPG), a media researcher. Mobile advertising revenue is projected to grow more than fourfold, to $1.8 billion by 2016, while the value of online video advertising could nearly triple, to $3.7 billion, over the same period, according to Feldman.

In several instances, the FTC has acted to curb what it considers unfair and deceptive practices. The agency reached a settlement with Google in March related to privacy breaches associated with the introduction of its Buzz social networking service last year. The 20-year agreement, hailed as a landmark by industry watchers, bars Google from misrepresenting how it handles information, obliges the company to protect consumer data in new products, and requires periodic government reviews. Also in March, the agency forged a similar settlement with Twitter after hackers obtained control of the Internet messaging service.

As a whole, a new federal privacy law should come down to a simple proposition, says EPIC's Rotenberg: "If you can't protect it, you shouldn't collect it."

The bottom line: A spate of high-profile data security breaches may finally compel Washington to draft a comprehensive privacy policy.

With Eric Engleman, Adam Satariano, and Stephanie Bodoni. Forden is a reporter for Bloomberg News.

In comments to the Federal Trade Commission, EPIC recommended that the FTC require Google to adopt and implement comprehensive Fair Information Practices, as part of the Privacy Program.

EPIC also recommended encryption for Google's cloud-based services, new safeguards for reader privacy, limitations on data collection, and warrant requirements for data disclosures to government officials.

EPIC said that similar privacy safeguards should be established for other Internet companies.

The FTC investigation and settlement arises from a complaint filed by EPIC with the Commission in February 2010.

BRUSSELS — As pressure grows for technology companies like Apple and Google to adjust how their phones and devices gather data, Europe seems to be where the new rules are being determined.

Last year, Google generated a storm of controversy in Germany when it had to acknowledge it had been recording information from unsecured wireless networks while compiling its Street View mapping service.

Then, last week, regulators in France, Germany and Italy said they would examine whether Apple’s iPhone and iPad violated privacy rules by tracking the location of users. Also, reports emerged last month that the Dutch police had obtained information from TomTom, a maker of popular satellite navigation devices, while setting up speed traps, prompting concerns by users and an apology from TomTom.

The companies all said there was nothing sinister about their activities, though Apple said it would issue a software update limiting the time that location data is kept to seven days. None of the information, the companies said, is particularly sensitive from the point of view of personal privacy, and in many cases it would help them deliver better services.

To address concerns about data protection, Viviane Reding, the European justice commissioner, said in a speech Tuesday that she would propose extending unionwide rules about breaches of privacy to online banking, video games, shopping and social media.

The rules require phone companies and Internet service providers to inform customers of any data breach “without undue delay.” “European citizens care deeply about protecting their privacy and data protection rights,” Ms. Reding said in a separate statement. “Any company operating in the E.U. market or any online product that is targeted at E.U. consumers should comply with E.U. rules.”

Ms. Reding made her remarks shortly after Sony apologized for a data theft involving 77 million account holders of the PlayStation Network, and a week after Apple said it would change the software that logs the location of users of its iPhone and iPad tablet computer. “Seven days is too late,” Ms. Reding said Tuesday, referring to how long it took Sony to inform account holders. Regarding Apple, she said she understood how the discovery that the iPhone collected location data had eroded “the trust of our citizens.”

Abraham L. Newman, an assistant professor at Georgetown University and a specialist in European privacy issues, said Europe’s spotlight on privacy could offer companies like Apple and Google the chance to reorganize the way they handled policies worldwide, using European standards in their corporate strategy. Alternatively, he said, the companies could develop policies to ensure that data gathered in Europe was sufficiently “quarantined” to comply with rules, but limit changes in the rest of the world.

“Apple is entering a political dynamic in Europe which is similar to Google’s experience,” Mr. Newman said. “Authorities in Europe have decided that consumers better not be duped in a world of unlimited location data where companies know literally every step you take.” What particularly distinguishes Europe is the strong role played by so-called national data protection authorities in keeping tabs on privacy issues, he said.

In the United States, there is no single agency dedicated to privacy, and while the Federal Trade Commission and the Federal Communications Commission can deal with violations of privacy, those agencies are mainly focused on enforcing fair business practices.

But Ms. Reding said the differences between Europe and the United States should not overshadow signs of convergence, like the work by the Obama administration and Congress to pass a privacy bill of rights that would stop companies from collecting or sharing personal information without an Internet user’s consent.

“Until recently, there was a common belief that the E.U. and U.S. have different approaches on privacy and that it would be difficult to work together,” Ms. Reding said. “This can no longer be argued in such simple terms.”