Monday, February 28, 2011

As the surreptitious tracking of Internet users becomes more aggressive and widespread, tiny start-ups and technology giants alike are pushing a new product: privacy.

Companies including Microsoft Corp., McAfee Inc.—and even some online-tracking companies themselves—are rolling out new ways to protect users from having their movements monitored online. Some are going further and starting to pay people a commission every time their personal details are used by marketing companies.

"Data is a new form of currency," says Shane Green, chief executive of a Washington start-up, Personal Inc.1 , which has raised $7.6 million for a business that aims to help people profit from providing their personal information to advertisers.

The Wall Street Journal's year-long What They Know investigation into online tracking has exposed a fast-growing network of hundreds of companies that collect highly personal details about Internet users—their online activities, political views, health worries, shopping habits, financial situations and even, in some cases, their real names—to feed the $26 billion U.S. online-advertising industry.

In the first nine months of last year, spending on Internet advertising rose nearly 14%, while the overall ad industry grew only about 6%, according to data from PricewaterhouseCoopers LLP and WPP PLC's Kantar Media.

Testing the new privacy marketplace are people like Giles Sequeira, a London real-estate developer who recently began selling his own personal data. "I'm not paranoid about privacy," he says. But as he learned more, he says, he became concerned about how his data was getting used.

People "have no idea where it is going to end up," he says. So in December, Mr. Sequeira became one of the first customers of London start-up Allow Ltd., which offers to sell people's personal information on their behalf and give them 70% of the sale. Mr. Sequeira has already received one payment of £5.56 ($8.95) for letting Allow tell a credit-card company he is shopping for new plastic.

"I wouldn't give my car to a stranger" for free, Mr. Sequeira says, "So why do I do that with my personal data?"

As people are becoming more aware of the value of their data, some are seeking to protect it, and sometimes sell it. In January at the World Economic Forum in Davos, Switzerland, executives and academics gathered to discuss how to turn personal data into an "asset class" by giving people the right to manage and sell it on their own behalf.

"We are trying to shift the focus from purely privacy to what we call property rights," says Michele Luzi, a director at consulting firm Bain & Co. who led the Davos discussion.Allow, the company that paid Mr. Sequeira, is just one of nearly a dozen start-ups hoping to profit from the nascent privacy market. Several promise to pay people a commission on the sale of their data. Others offer free products to block online tracking, in the hopes of later selling users other services—such as disposable phone numbers or email addresses that make personal tracking tougher. Still others sell paid services, such as removing people's names from marketing databases.

"Entrepreneurs smell opportunity," says Satya Patel, venture capitalist at Battery Ventures, which led a group of investors that poured $8 million in June into a start-up called SafetyWeb3 , which helps parents monitor their children's activities on social-networking sites and is rolling out a new privacy-protection service for adults,myID.com4.

For the lightly regulated tracking industry, a big test of the new privacy marketplace is whether it will quiet the growing chorus of critics calling for tougher government oversight. Lawmakers this month introduced two separate privacy bills in Congress, and in December the Obama administration called for an online-privacy "bill of rights." The Federal Trade Commission is pushing for a do-not-track system inspired by the do-not-call registry that blocks phone calls from telemarketers.

The industry is hustling on several fronts to respond to regulatory concerns. Last week, Microsoft endorsed a do-not-track system. Microsoft also plans to add a powerful anti-tracking tool to the next version of its Web-browsing software, Internet Explorer 9. That's a reversal: Microsoft's earlier decision to remove a similar privacy feature from Explorer was the subject of a Journal article last year.

The online-ad industry itself is also rolling out new privacy services in hopes of heading off regulation. Most let users opt out of seeing targeted ads, though they generally don't prevent tracking.
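The distinction drawn here—opting out of targeted ads versus opting out of tracking itself—can be made concrete with a minimal sketch. This is purely illustrative (not any specific ad network's code, and the function and cookie names are invented for the example): the server logs the visit either way; the opt-out cookie only suppresses the targeting step.

```python
# Illustrative sketch of an industry-style "opt out": the visit is
# still recorded in the server's log; only the behavioral targeting
# is skipped when the opt-out cookie is present.

def handle_request(cookies, page, visit_log):
    """Serve an ad for `page`, honoring an opt-out cookie if set."""
    visit_log.append(page)  # data collection happens regardless
    if cookies.get("opt_out") == "1":
        return "generic ad"  # opted out: no behavioral targeting
    # otherwise, target based on accumulated browsing history
    return "ad targeted by history: " + ", ".join(visit_log)
```

The point of the sketch is that opting out changes what the user sees, not what the company learns.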

The privacy market has been tested before, during the dot-com boom around 2000, when online tracking was just being born. A flurry of online-privacy-related start-ups sprang up, but only a few survived, owing to limited consumer appetite.

As recently as 2008, privacy was so hard to sell that entrepreneur Rob Shavell says he avoided even using the word when he pitched investors on his start-up, Abine Inc., which blocks online tracking. Today, he says, Abine uses the word "privacy" again, and has received more than 30 unsolicited approaches from investors in the past six months.


In June, another company, TRUSTe, raised $12 million from venture capitalists to expand its privacy services. At the same time, Reputation.com Inc. raised $15 million and tripled its investments in new privacy initiatives including a service that removes people's names from online databases and a tool to let people encrypt their Facebook posts.

"It's just night and day out there," says Abine's Mr. Shavell.

Online advertising companies—many of which use online tracking to target ads—are also jumping into the privacy-protection business. AOL, one of the largest online trackers, recently ramped up promotion of privacy services that it sells.

And in December, enCircle Media, an ad agency that works with tracking companies, invested in the creation of a privacy start-up, IntelliProtect. Last month IntelliProtect launched an $8.95-a-month privacy service that will, among other things, prevent people from seeing some online ads based on tracking data.

In its marketing material, IntelliProtect doesn't disclose its affiliation with the ad company, enCircle Media, that invested in it. When contacted by the Journal, IntelliProtect said it would never give or sell customer data to other entities, including its parent companies.

A cofounder of Allow, Justin Basini, also traces his roots to the ad industry. Mr. Basini came up with the idea for his new business when working as head of brand marketing for Capital One Europe. He says he was amazed at the "huge amounts" of data the credit-card companies had amassed about individuals.

But the data didn't produce great results, he says. The response rate to Capital One's targeted mailings was 1-in-100, he says—vastly better than untargeted mailings, but still "massively inefficient," Mr. Basini says. "So I thought, 'Why not try to incentivize the customer to become part of the process?'"

People feel targeted ads online are "spooky," he says, because people aren't aware of how much personal data is being traded. His proposed solution: Ask people permission before showing them ads targeted at their personal interests, and base the ads only on information people agree to provide.

In 2009, Mr. Basini left Capital One and teamed up with cofounder Howard Huntley, a technologist. He raised £440,000 ($708,400) from family, friends and a few investors, and launched Allow in December. The company has attracted 4,000 customers, he says.

Mr. Basini says his strategy is to first make individuals' data scarce, so it can become more valuable when he sells it later. To do that, Allow removes its customers from the top 12 marketing databases in the U.K., which Mr. Basini says account for 90% of the market. Allow also lists its customers in the official U.K. registries for people who don't want to receive telemarketing or postal solicitations.

Currently, Allow operates only in the U.K., which (unlike the U.S.) has a law that requires companies to honor individuals' requests to be removed from marketing databases.

Then, Mr. Basini asks his customers to create a profile that can contain their name, address, employment, number of kids, hobbies and shopping intent—in other words, lists of things they're thinking about buying. Customers can choose to grant certain marketers permission to send them offers, in return for a 70% cut of the price marketers pay to reach them. Allow says it has finalized a deal with one marketer and has five more deals it hopes to close soon. Mr. Basini says Allow tries to prevent people from "gaming" the system by watching for people who state an intention to buy lots of things, but don't follow through.

Because Allow's data comes from people who have explicitly stated their interest in being contacted about specific products, it can command a higher price than data gathered by stealthier online-tracking technologies. For instance, online-tracking companies routinely sell pieces of information about people's Web-browsing habits for less than a penny per person. By comparison, Allow says it sells access to Mr. Sequeira for £5 to £10 per marketer.

Mr. Sequeira, the London real-estate executive, says that after he filled out an "intention" to get a new credit card, he received a £15.56 credit in his Allow account: a £10 signing fee plus a £5.56 payment from the sale of his data to a credit-card marketer. So far, he says, he hasn't received a card offer from the company.

"I don't think it's going to make a life-changing amount of money," says Mr. Sequeira. But, he says he enjoyed the little windfall enough that he is now letting Allow offer his data to other advertisers. "I can see this becoming somewhat addictive."

Sunday, February 27, 2011

Facebook's response to the FTC urges optimism and governmental restraint.

Facebook just released a 26-page retort to the Federal Trade Commission's preliminary report on privacy regulation--a report that social media firms see as an ominous approaching storm of chaotic bureaucracy. In summary, Facebook fears that government meddling could both stifle its ability to profit and smother the industry's progress on as-yet-unknown technological advancements.

Facebook's response mirrors the structure of the FTC's report. First, it reminds the FTC how much social media has done for the government itself, the advancement of democracy, and the growing cottage industry of social software:

On government: "In government, leaders use social media services to promote transparency, as evidenced by the nearly 140,000 followers of the White House Press Secretary's Twitter feed and the fact that more than 70 federal agencies have Facebook pages."

On democracy: "Advocates of democracy used Twitter to make their voices heard following the contested 2009 Iranian election, and Oscar Morales in Colombia famously employed Facebook to organize massive street demonstrations against the FARC terrorist group in 2008. Most recently, people in Tunisia and Egypt used social media to spread up-to-the-minute news, share videos of local events with the broader population, and mobilize online communities of thousands (and sometimes millions) behind a common cause."

On business: "Finally, the social web is a crucial engine for economic growth and job creation. Hundreds of thousands of application developers have built businesses on Facebook Platform. To take just one example, games developer Zynga, creator of the popular Farmville game, has more than 1,300 employees and has been valued at about $5.8 billion."

Second, it pleaded for the FTC to be optimistic about how ostensibly intrusive technologies end up benefiting the public:

Caller ID: "Telephone companies originally collected and exchanged subscribers' telephone numbers solely for the purpose of completing telephone calls. But telephone companies later realized that they could use this information to display the calling party's telephone number and name to the call recipient, allowing the recipient to identify the caller in advance. Today, caller ID is an accepted and valued part of telephone communication, and few subscribers choose to block outgoing caller ID even though it is easy to do so."

Facebook News Feed: "In 2006 Facebook launched a new feature called News Feed on every person's homepage. The product updated a personalized list of news stories throughout the day so users would know what their friends were posting. Before News Feed, people had to visit their friends' profiles to see what their friends were up to. Despite initial user skepticism when the product was first launched, News Feed is now--as any user would attest--an integral part of the Facebook experience."

Google Flu Trends: "When the founders of Google began collaborating on a search engine research project in 1996, they probably did not envision that search queries about topics would one day become an early detection system for flu outbreaks. Today, Google Flu Trends can estimate flu activity one to two weeks more quickly than traditional surveillance systems involving virologic and clinical data, and may help public health officials and health professionals better respond to seasonal epidemics."

Finally, Facebook urged the FTC to be sensitive to the business implications of its decisions: “For Facebook--like most other online service providers--getting this balance right is a matter of survival,” the report notes.

It continued, "Ultimately, the FTC's enforcement activities in the area of privacy must be guided by the realization that aggressive enforcement and imprecise standards can lead to more legalistic disclosures--and, as described above, chill economic growth--as companies seek to manage regulatory risk by over-disclosing, reserving broad rights, and under-innovating. To avoid these unintended consequences, the FTC should err on the side of clarifying its policies rather than taking aggressive enforcement action against practices that previously were not clearly prohibited."

Both the FTC and Facebook have been light on data and experimental evidence--and both are obscuring a yet-unrevealed future value (or detriment) associated with all of this sharing. Then again, forecasting problems in a very turbulent future is nearly impossible. Ultimately, both documents read as though the fight will come down to a philosophical debate. And billions of dollars.

Full Facebook response at: http://www.fastcompany.com/1731121/facebook-ftc-response

"Privacy and security is a significant challenge for every health care organization and a concern for every U.S. citizen. The move toward an entirely automated health care system featuring electronic and personal health records, clinical data warehousing, and increased transparency means more data is at risk and suggests an urgent review of industry privacy and security safeguards.

The potential liability for data breaches is significant and increasing. Stakeholders must act now to prevent compromising sensitive patient data, preserve brand value, and avoid substantial financial penalties for violations.

This Issue Brief from the Deloitte Center for Health Solutions (DCHS):

Provides an update about current and emergent privacy and security challenges in health care;

Examines notable hot spots where current policies, rules, and regulations are a focus of industry risk;

Reviews the state of preparedness for privacy and security risk throughout the industry;

Suggests an approach to assessing an organization's current preparedness."

Thursday, February 17, 2011

Personal data – digital data created by and about people – represents a new economic “asset class”, touching all aspects of society. The abundance of personal data represents untapped opportunities for economic growth and social benefit; however, the barriers restricting personal data’s movement and protection need to be resolved. Granting individuals greater control over their data is necessary to create a balanced personal data ecosystem.

The report addresses the interrelated and complex cultural, business, technology and policy trends shaping the personal data ecosystem by presenting a user-centric set of recommendations for individuals, private enterprise and policy-makers. In particular, the report suggests five areas for collective action:

1) Innovate around user-centricity and trust. The personal data ecosystem will be built on the trust and control individuals have in sharing their data. Continued testing and promoting of trust frameworks that explore innovative approaches for identity assurance at Internet scale are needed.

2) Define global principles for using and sharing personal data. Given the lack of globally accepted policies governing the use and exchange of personal data, an international community should articulate core principles of a user-centric personal data ecosystem.

3) Strengthen the dialogue between regulators and the private sector. Technologists should closely align with regulators to establish processes that enable stakeholders to formulate and update a standardized set of rules to create a basic legal infrastructure.

4) Focus on interoperability and open standards. Stakeholders should identify best practices and engage with standards bodies, advocacy groups, think tanks and various consortia on the user-centric approaches required to scale the value of personal data.

5) Continually share knowledge. To stay current, stakeholders should actively share insights and lessons learned on their relevant activities (both successes and failures). The ecosystem promises tremendous value created when individuals share information about who they are and what they know. This principle should also apply to practitioners within the development community.

Launched in 2010, the Rethinking Personal Data project is a multi-year project intended to bring together private companies, public sector representatives, end-user privacy and rights groups, academics and topic experts to deepen the collective understanding of how a principled, collaborative and balanced personal data ecosystem can evolve.

As the Internet in general and social networking in particular are used as a point of reference for gathering and sharing health information, a study that examined 10 diabetes-focused social networking sites has found that the quality of clinical information, as well as privacy policies, significantly varied across these sites.

The study, "Social but safe? Quality and safety of diabetes-related online social networks," was conducted by researchers in the Children's Hospital Boston informatics program who performed an in-depth evaluation of the sites and found that only 50% presented content consistent with diabetes science and clinical practice.

The research, published in late January in the Journal of the American Medical Informatics Association, also revealed that sites lacked scientific accuracy and other safeguards such as personal health information privacy protection, effective internal and external review processes, and appropriate advertising.

For example, misinformation about a diabetes cure was found on four moderated sites. Additionally, of the nine sites with advertising, transparency was missing on five, and ads for unfounded cures were present on three. Technological safety was poor, with almost no use of procedures for secure data storage and transmission.

The study found that only three sites support member controls over personal information. Additionally, privacy policies were difficult to read and only three sites (30%) demonstrated better practice, wrote the study's authors.

Elissa R. Weitzman, lead author of the study and assistant professor at Harvard Medical School, told InformationWeek that she was surprised at the high use of online health-related social networking among people with diabetes, and noted that the healthcare community and key stakeholders at these sites should implement policies to protect member privacy and align site content with medical science and clinical practice.

"Exchanging information on these sites has the potential to accelerate what we know about this disease and to rapidly disseminate vital information and support. However, the spread of information throughout online communities poses a safety concern for patients," Weitzman observed. "I'm surprised that the clinical healthcare system seems to be lagging behind patients and consumers in engaging with this medium and finding ways to support them, synergistically -- without trying to replace or control them."

"I think a sustainable standard for how these communities operate with respect to privacy, security, and honesty will come about because the communities themselves and their users will adopt and enforce norms of transparency and protection," Weitzman predicted. "One way this could happen is for stakeholders of these sites to develop a system of 'peer review' around these issues to support better or best practices."

The study evaluated diabetes Web sites that appeared prominently in Google searches and allowed members to create personal profiles and interact with each other. The study examined four key factors:

-- agreement of content with diabetes science and clinical practice standards,
-- practices for auditing content and supporting transparency,
-- accessibility and readability of privacy policies, and
-- the degree of control members had over the sharing of personal data.

The average number of members per Web site was 6,707. Activity ranged widely among the sites, from more than 100 new posts per day to fewer than 5 new posts per day.

Other findings were that the majority of sites did not include a "disclaimer" encouraging patients to discuss their care regimen with a healthcare provider. Several sites did not post essential diabetes information, such as the definition of "A1c" -- a biomarker commonly used by diabetics to assess blood glucose levels.

In addition to recommending improvements in these areas, the authors saw a need for increased moderation, for the credentials of moderators to be more visible, and for periodic external review. Further, potential conflicts of interest -- such as ties to the pharmaceutical industry -- needed to be more clearly disclosed, and privacy policies easier to understand.

Weitzman is an assistant professor in the laboratory of Kenneth Mandl, who also co-authored the study. Last year the two developed an application for the social networking website TuDiabetes that allows users to submit their A1c levels to be displayed in a worldwide map, as part of an effort to encourage diabetes management and inform public health efforts and research.

Researchers said they chose to study diabetes-related networks because they were among the earliest to emerge and remain among the most active. The research team in the Children's Hospital informatics program will further study how these sites are used -- how people choose to interact with them and how specifically they share their medical information.

Weitzman also said the Web is a notoriously difficult sphere to regulate with respect to issues of privacy, information security, and honesty in advertising, but said she is hopeful that these sites will improve.

When the plaintiff made a purchase by credit card at the defendant retailer, a cashier requested her ZIP code and she provided it, believing that it was necessary to complete the transaction. The plaintiff alleged that the store then used her name and ZIP code to locate her home address, which it added to a marketing database.

The Court of Appeals affirmed the trial court’s dismissal of the claim, holding that a ZIP code, without more, does not constitute personal identification information under the Credit Card Act. The California Supreme Court reversed and remanded.

The Court first looked to statutory construction in its analysis of whether ZIP codes constitute personal identification information. The Credit Card Act defines personal identification information as “information concerning the cardholder, other than information set forth on the credit card, and including, but not limited to, the cardholder’s address and telephone number.”

The Court found that the word “address” in the statute should be construed as encompassing not only a complete address, but also its components. Furthermore, the Court rejected the lower court’s conclusion that a ZIP code is not personal identification information because it pertains to a group, rather than a specific individual. The Court found that ZIP codes are like addresses or telephone numbers in that such information is “unnecessary to the sales transaction” and “alone or together with other data such as a cardholder’s name or credit card number, can be used for the retailer’s business purposes.”

The Court noted that this interpretation is also consistent with the Credit Card Act’s provision which allows businesses to require the cardholder to provide a form of identification, such as a driver’s license, “provided that none of the information contained thereon is written or recorded.”

In addition to examining the statute's provisions, the Court reviewed the legislative history of the Credit Card Act. The Court found that the California Legislature "intended to provide robust consumer protections by prohibiting retailers from soliciting and recording information about the cardholder that is unnecessary to the credit card transaction." A primary issue motivating the creation of the statute was how retailers acquired additional personal information, unnecessary to the transaction, to build mailing and telephone lists for their in-house marketing or to sell to others. Later amendments to the statute prohibited businesses from recording information contained in consumers' provided identification; the purpose was to prevent retailers from matching this information with the consumer's credit card number.

The Court rejected the defendant’s argument that its construction of the Credit Card Act violates due process, and found that a broad interpretation of the Credit Card Act did not render the statute unconstitutionally vague because the law includes adequate notice of prohibited conduct.

Patients have long had limited rights under the privacy provisions of the Health Insurance Portability and Accountability Act of 1996 to demand that providers and other “covered entities” provide them with an accounting of disclosures of their personally identifiable medical information.

The American Recovery and Reinvestment Act of 2009, however, expanded patients' privacy rights by closing a HIPAA exemption under which covered entities that use an electronic health-record system were not required to audit and account for disclosures for treatment, payment and a broad, catch-all category known as other "healthcare operations." The new rule before the OMB provides language to implement that change. Patients can demand an accounting of disclosures going back three years from the date the demand is made. The accounting requirement also applies to business associates of covered entities.

In a May 3, 2010, request for public comment on the disclosure rules (PDF), the Office for Civil Rights at HHS noted that the new rule would require covered entities that acquired an EHR after Jan. 1, 2009, to comply with the new accounting requirement by Jan. 1, 2011, unless the OCR extends the deadline, which it may do, but only to no later than 2013. -Joseph Conn

Thursday, February 10, 2011

As the FTC gathers comments on its proposed privacy rules, including a “Do Not Track” proposal, FTC Commissioner Julie Brill told a crowd of privacy researchers and policy wonks gathered at UC Berkeley that her agency was willing to go to Congress if online advertisers and analytics companies don’t clean up their act.

While Do Not Track has become a buzz phrase that has been getting a lot of attention, there’s more that’s needed beyond implementing a good no-tracking option, Brill said. First, companies need to start considering “privacy by design.” That means that companies building new products need to think about privacy from the get-go, not just “retrofitting” privacy features once there’s a problem. Online companies also need to think about collecting less information about their users and holding it for a shorter period of time, Brill added. That’s a suggestion that puts the FTC in direct conflict with the data-retention policies desired by the Department of Justice and law-enforcement agencies.

Second, privacy choices need to be simplified for consumers. Privacy policies are too cluttered and confusing, and tend to be full of information that’s barely relevant to the consumer. For example, an online shopper already knows that his address will be shared with FedEx or another shipper when he buys something.

Privacy policies need to address the collection of the data itself, not just how the data is used. For example, plenty of companies, such as ad networks, are holding large amounts of consumer data and could stop using it for behavioral advertising if consumers opt out. But they might be less willing to not collect the info at all. That’s because they can still sell or share that data with others.

Finally, data practices need to be transparent. Not only should consumers know what kind of data companies are collecting about them, but the FTC is actually proposing that consumers should get access to that data, Brill said.

While the commission originally called for an approach that involved a persistent “header” alerting websites to the data-collection preferences of users who visit those sites—exactly the mechanism that Mozilla just unveiled in its new Firefox browser—the FTC is open to considering other strategies, she said.

Brill also addressed a question she’s been getting frequently: what does she think about industry response to the FTC privacy report so far? Her answer: It’s nice to be getting some reaction at all. The commission called for industry to self-regulate back in February of 2009, she noted. “Industry has been kind of slow to deal with this issue… We’ve been very pleased that since we released our report two months ago, we seemed to have caught industry’s attention now.”

If the self-regulation proposals coming in aren’t sufficient to protect consumers, “we will ask Congress to take up the issue,” Brill concluded.

Monday, February 7, 2011

Jay Cline, Computerworld
February 3, 2011

Who are the best people and firms at providing privacy advice? It's a question I've been asking since 2006, before privacy was cool. Since then, a plethora of new privacy rules and penalties and a tsunami of new technologies and risks have placed privacy among the top handful of corporate concerns. Doing privacy wrong now takes a bigger bite off the bottom line than it did when I first started asking this question. So have the answers changed?

Not when the question is which type of outside privacy practice you prefer. Lawyers are still the top choices, with law firms grabbing six of the top 10 spots in the survey. And for the fourth consecutive time, Hunton & Williams garnered the most votes. This may be a case of success breeding more success: Hunton attracted more than twice as many votes as its nearest challenger.

What does this say about the corporate privacy agenda? Two things, I think: Regulatory compliance is still the first step to take for many companies, and the firms that were the best at assisting with this first step five years ago are still the go-to destinations for in-house privacy officers.

Other firms gaining ground

Even though law firms took six of the top 10 places, that was down from the last survey, in 2008, when they accounted for eight spots. Indeed, consulting firms now account for half of the top 12.

The stronger showing of consultancies may reflect the emerging consensus in the privacy profession that doing privacy right is bigger than regulatory compliance. Particularly for industries such as healthcare and technology, which involve an intensive use of personal information, creating privacy-friendly products and services involves meeting customer and social expectations. "Organizations need to 'do' privacy better, faster and cheaper," noted Brian Tretick, managing director for Athena Privacy, a new boutique firm. "That means more formal, repeatable processes, automation and active monitoring."

The survey also showed that firms may be looking for services beyond traditional advice from experts. New entrants to the list of top vote-getters include service providers, a certification firm and a professional association. Among them:

• San Francisco-based Truste is the provider of the popular Web-privacy seal and a number of other privacy-verification products and services.
• Portland, Ore.-based ID Experts and Austin-based Debix provide data-breach response services.
• Toronto-based Nymity provides an information portal for privacy content.
• Seattle-based MediaPro offers computer-based training for privacy and security.

Friday, February 4, 2011

Rep. Jackie Speier (D-Calif.) plans to introduce an online privacy bill next week directing the Federal Trade Commission (FTC) to begin a "do not track" program for online advertisers, a Speier aide told The Hill.

The program would enable consumers to "opt out" of tracking by online advertisers. The aide said the bill is narrowly tailored to address tracking issues only, rather than the broader question of online privacy. It provides a floor, rather than a ceiling, for privacy law, so it does not pre-empt additional legislation in the future.

Speier's office worked with a host of pro-privacy groups on the bill, including Consumer Watchdog, the Consumer Federation of America, Consumers Union and the Electronic Frontier Foundation, among others.

Rep. Bobby Rush (D-Ill.) is also planning to reintroduce his privacy bill next week. His bill does not include a "do not track" mechanism; however, it provides a safe harbor for marketers who participate in such a federal program if one is created. Speier's bill does not include a safe harbor.

The FTC released its own privacy report last year, throwing its weight behind a "do not track" system. David Vladeck, the FTC consumer protection director, told Congress in a December hearing that "do not track" legislation could help protect consumers, since many are unaware they are being tracked. It might also simplify individuals' efforts to keep their online data private.

Why is it that two sprawling yet similar Western cultures -- those on both sides of the Atlantic -- respond so differently to Internet privacy?

A quarter-century after coming to the United States, Franz Werro still thinks like a European. The 54-year-old Georgetown law professor, born and raised in Switzerland, is troubled when ads in French automatically pop up on his American laptop. The computer assumes that's what he wants. We live naked on the Internet, Werro knows, in a brave new world where our data lives forever. Google your name, and you'll stumble onto drunken photos from college, a misguided quote given to a reporter five years ago, court records, ancient 1 a.m. blog comments, that outdated Friendster profile ... the list goes on, a river of data creating a profile of who you are for anyone searching online: friend, merchant, or potential future employer. Werro's American students rarely mind.

But America is not Europe, and despite our no-secrets age of WikiLeaks, Europe wants to enshrine a special form of privacy into law. Individuals should, according to many in Europe, possess what they call a "right to be forgotten" on the Internet.


How would this even be possible? This developing right, authorities in several European countries suggest, would allow individuals to control and sometimes eliminate their data trails, and to ask Google to remove select search results (a newspaper article, say, that once painted them in a bad light). A look at recent news events guarantees that this right will only become more relevant in 2011.

On January 19, Google refused Spain's request that the ubiquitous, California-based search engine remove 90 links. Many of the links Spain wanted to remove included newspaper articles and information from public record, often painting the plaintiffs in a bad light. Google called Spain's request "disappointing" in its official statement and emphasized that as a search engine, it should not be responsible for curating Internet content. Removing links would be expensive, Google argued in court, and violate the "objectivity" of the Internet search. Last November, the European Union announced data protection goals for 2011, which include "clarifying the so-called 'right to be forgotten', i.e. the right of individuals to have their data no longer processed and deleted when they are no longer needed for legitimate purposes" (PDF).

The EU explicitly said that users should have this right. It has already been heavily discussed and praised in countries such as France, whose President Sarkozy said last year: "Regulating the Internet to correct the excesses and abuses that come from the total absence of rules is a moral imperative!" France's presidency of the coming G8 summit also promises more dialogue, as Sarkozy hopes to discuss the right on an international stage.

These European concerns rarely come up in the United States. People may worry about Facebook's privacy settings, but few would suggest an individual has a right to remove an offending Gawker post from Google's index. After all, who decides? A person might want an embarrassing photo removed from record, but what if the photo features not only that person but four others? The question of censorship is inevitable. The closest manifestation on this side of the Atlantic is likely a paper from the ACLU lobbying for a "right to delete" (PDF). Why, then, have our two sprawling yet similar Western cultures responded so differently to Internet privacy?

In Europe, the idea that privacy should overrule free expression is nothing new. Professor Franz Werro keenly highlights the historical difference in a 2009 academic paper and points to a 1983 case in Switzerland. Swiss TV had planned to air a documentary about a criminal from the 1930s. Swiss law, however, forbade the airing of the program -- the European court "held that the documentary would unjustifiably violate plaintiff's privacy right to keep his feelings as a son from being trampled." Yale law professor James Whitman sees the differing concepts of privacy as a battle between liberty and dignity (here, the PDF of his 2004 journal article).

Transatlantic clashes over privacy in recent years have included the use of Google's Street View in Germany, Switzerland, the Czech Republic, and elsewhere. German criminals sued Wikipedia in 2009 to have their names scrubbed. A little less than a year ago, an Italian court convicted Google executives for allowing a user to post an offensive video. The fact that many Internet companies such as Facebook and Google are located in the United States (where, as Werro says, there is a "fetishization" of the First Amendment right of free speech) creates deeper problems in the courtroom, as it did in Google's recent refusal in Madrid.

American companies favor American law where possible, no matter what country they operate in. In Europe, the courts balance the right to a free press against rights of privacy, of personality, and of dignity, protected in Article 8 of the European Convention on Human Rights. In America, the implicit right to privacy has always fallen flat when running up against the Supreme Court's fidelity to the First Amendment.

A right to be forgotten raises practical concerns as well as theoretical ones. "It's almost absurd to say we have the right to disappear from public domain," said Martin Abrams, a policy director with the leading global privacy think tank housed at law firm Hunton & Williams. "We're really talking about the right not to be observed in the first place.... We've been focused on symptoms rather than the underlying issues."


Abrams is far from enthusiastic about Europe's proposed right to be forgotten -- he'd rather people focus on what he considers the real issues of Internet accountability and the increasingly popular notion of "data stewardship" among corporations. Data will inevitably be out there, Abrams believes, and what matters now is a dialogue about how to retire certain data. There is great value, he emphasizes, in using Internet data to model the future and permit innovation -- he brings up positive examples of this, such as Google-supported HealthMap, which tracks infectious diseases around the globe by synthesizing public data. You can't go west and not be known anymore, Abrams believes, but we can move beyond a "rhetoric hump" and reach a more realistic and practical level of dialogue on data responsibility.

"How do you create a space where we're free to analyze the data but not free to abuse the data?" Abrams considers. "We've been asking the wrong questions."

And why is Europe asking questions about the right? Because, Abrams said, Europe is used to legally processing all its data, whereas America grants far more permissive rights to observe behavior and its data, which, when extended to the Internet, affect how companies observe and model our activity. The Europeans resist this digital observation without consent. But the European model runs strongly against American traditions of free press and expression. Up until now, the fight for the right to be forgotten has remained largely within the province of Europe. That can't last forever, though, especially given how many global Internet titans remain based in the U.S.

"The Americans run their show," Werro said, "but can they impose their rights on the rest of the planet?" Europeans are, Werro continues, equally sensitive to the use of personal images and especially the "merchantability" of personal data by corporations. A European sensibility would not, he added, easily accept the invasion of privacy that occurs so frequently in American media. He brings up Fox News, which to keep coverage of the Eliot Spitzer scandal alive, chased after the prostitute-in-question's grandfather at 9 p.m. on a Saturday.

Yet on both continents the discussion of Internet privacy is evolving. In December, the U.S. Department of Commerce recommended establishing a Privacy Policy Office, its potential role "acting as both a convener of diverse stakeholders and a center of Administration commercial data privacy policy expertise" to address what it calls "a continuum of risks to personal privacy" (PDF). Another U.S. goal is to establish "global privacy interoperability" to reduce the friction and costs American companies have been incurring as they face the "omnibus privacy laws" adopted in the European Union. In late January, both Google and Mozilla presented people with an option to opt out of being tracked online for advertising purposes.

These basic privacy concerns are universal, but the right to be forgotten -- and the potential precedent its adoption could set -- takes the concern over privacy many steps further. As in Madrid this January, the European sensibility is colliding in powerful ways directly with U.S.-based, transnational corporations bred on American values of both expression and profit. The fight is hardly over.

"I wonder at times," Werro said, "if this conception of privacy in Europe could be wiped out."

Tuesday, February 1, 2011

Online privacy is drawing increasing attention from policy makers, the press, and the public due to rapid changes in social networking, online targeted advertising, and location-based services for smart phones.

Last month, the Department of Commerce asked for comment on its new green paper, entitled “Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework.” One important proposal in the green paper was to create a Privacy Policy Office in the Department of Commerce.

I have submitted comments explaining “Why the Federal Government Should Have a Privacy Policy Office.” The chief criticism of the proposal is that the new office would weaken privacy protection. In one vivid turn of phrase, Jeff Chester of the Center for Digital Democracy said: “Having the Commerce Department play a role in protecting privacy will enable the data collection foxes to run the consumer privacy henhouse.” Chester and other privacy advocates essentially argue that having the Commerce Department play a role in privacy policy will dilute the effectiveness of the Federal Trade Commission’s privacy efforts.

I disagree, and reach three conclusions, which I explain below. My comments also consider whether the new office should be placed in the Department of Commerce, as the green paper recommends, or else in the Executive Office of the President, where I served as chief counselor for privacy under President Clinton. I conclude that the important thing is to ensure an ongoing privacy policy capability in the executive branch, while a good case can be made for housing it either in the Commerce Department or the Executive Office of the President.

Why the Federal Government Should Have a Privacy Policy Office

These comments support the creation of a Privacy Policy Office in the executive branch, as called for in the Department of Commerce green paper, “Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework.”

The chief criticism of this proposal is that the office would weaken privacy protection. In one vivid turn of phrase, Jeff Chester of the Center for Digital Democracy said: “Having the Commerce Department play a role in protecting privacy will enable the data collection foxes to run the consumer privacy henhouse.” Mr. Chester and other privacy advocates essentially argue that having the Commerce Department play a role in privacy policy will dilute the effectiveness of the Federal Trade Commission’s privacy efforts.

I disagree. My comments support three conclusions:

1. The office would provide important benefits to complement what the FTC does. As part of the executive branch, the office would make distinctive contributions to building privacy policy into the development and implementation of U.S. government positions for domestic and international policy. Relatedly, the office would be able to draw on the perspectives and expertise of other federal agencies far more effectively than can an independent agency such as the FTC.

2. The likely outcome with an office would be better protection of privacy than would occur without the office.

3. The likely outcome with an office would be better achievement of other policy goals than would occur without the office.

These comments also consider whether the office should be placed in the Department of Commerce, as the green paper recommends, or else in the Executive Office of the President, which housed the office of the chief counselor for privacy under President Clinton. I conclude that the important thing is to ensure an ongoing privacy policy capability in the executive branch, while a good case can be made for housing it either in the Commerce Department or the Executive Office of the President.

Background on privacy and the Department of Commerce

Much as is occurring this year, the FTC and the Commerce Department played complementary roles in the mid- to late 1990s in developing privacy policy. At the Federal Trade Commission, privacy initiatives were pushed by Chairman Robert Pitofsky, Commissioners Mozelle Thompson and Christine Varney, and Director of the Consumer Protection Bureau Jodie Bernstein (along with her dedicated staff, led by David Medine). At the Commerce Department, Barbara Wellbery and Becky Burr played important roles, as did Administrator of the National Telecommunications and Information Administration Larry Irving, General Counsel Andy Pincus, Undersecretary for the International Trade Administration David Aaron, and Secretary William Daley. The history of the FTC’s involvement in this period has been well discussed in work by Kenneth Bamberger and Deirdre Mulligan.

The vital work in that period of the Department of Commerce has been less fully discussed.[1] In 1997, Secretary Daley personally hosted a major conference and report on “Privacy and Self-Regulation in the Information Age.” That conference engaged many of the persons, and developed many of the concepts, that shaped U.S. privacy policy in the following years.[2] The department then led the complex and ongoing negotiations with the European Union about how to reconcile the E.U. Data Protection Directive and U.S. law, culminating in the Safe Harbor agreement in 2000, which is still in effect today. For the Safe Harbor and in numerous other privacy issues, the department, including its International Trade Administration, brought expertise to bear on topics such as e-commerce, international trade, and how privacy fits into broader business practices.

In the summer of 1998, Vice President Al Gore announced that a privacy policy position would be created in the U.S. Office of Management and Budget. As discussed further below, I entered the role of chief counselor for privacy in early 1999, and worked closely with the Department of Commerce, the FTC, and other agencies until early 2001. Under President George W. Bush, the Commerce Department administered the Safe Harbor program, but did not play as visible a policy role on privacy.

Under President Obama, Secretary Gary Locke created the Internet Policy Task Force, which has published the green paper that is the subject of these comments, entitled “Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework.” The green paper states:

Recommendation #4: Using existing resources, the Commerce Department should establish a Privacy Policy Office (PPO) to serve as a center of commercial data privacy policy expertise. The proposed PPO would have the authority to convene multi-stakeholder discussions of commercial data privacy implementation models, best practices, codes of conduct, and other areas that would benefit from bringing stakeholders together; and it would work in concert with the Executive Office of the President as the Administration’s lead on international outreach for commercial data privacy policy. The PPO would be a peer of other Administration offices and components that have data privacy responsibilities; but, because the PPO would focus solely on commercial data privacy, its functions would not overlap with existing Administration offices. Nor would the PPO have any enforcement authority.

For reasons set forth below, I generally support this recommendation, but with greater emphasis on certain functions the office can play, especially as an ongoing source of institutional expertise on privacy and in order to facilitate the interagency clearance of privacy-related issues.

A complementary role for a privacy office in Commerce: The importance of clearance and international privacy issues

To assess the potential usefulness of the PPO, it helps to first understand some important roles played by the Federal Trade Commission in privacy protection:

1. Enforcement. The FTC has the power to bring enforcement actions against “unfair and deceptive trade practices,” and has negotiated consent decrees on privacy with both large and small companies.

2. Rulemaking. In specific areas, such as children’s online privacy and anti-spam measures, the FTC has explicit authority to issue rules under the Administrative Procedure Act. More broadly, the FTC could write rules under the more burdensome procedures created by the Magnuson-Moss Act, but it has not chosen to do so on privacy.

3. Convener. The FTC has brought together stakeholders in a variety of ways to discuss emerging online privacy issues, and in some instances has catalyzed industry self-regulatory codes of conduct.

4. Institutional expertise. Leading members of today’s FTC efforts were also active during the privacy debates of the 1990s. The continuity of FTC staff has contributed to the commission’s institutional expertise on privacy issues.

5. Bully pulpit. Top FTC officials and staff direct the attention of companies toward emerging privacy issues.

The Commerce Department has at least two distinctive roles that complement this list of FTC privacy functions: clearance and the ability to speak internationally for the administration.

The role of “clearance” is particularly important yet often little understood. In a 2000 document prepared for publication in the Stanford Law Review but not actually published, I went into some detail on the subject. To ensure a unified administration position on congressional testimony, executive orders, and many other documents, drafts are circulated among the various agencies and components of the Executive Office of the President. Once comments are received, discussions are sometimes needed to resolve differences of opinion, with appeal to more senior officials if differences are not resolved at lower levels. In addition to these structured clearance procedures, agency experts on an issue such as privacy often get engaged earlier in the policy planning process, in a variety of working groups and less-formal methods of sharing expertise and views.

In my experience, an independent agency, such as the FTC, has a sharply limited ability to participate in the Administration’s clearance process. On some occasions, a draft document may be shared with the FTC, often early in a policy process, for whatever input the commission may wish to offer. The decision making, however, is done by persons in the executive branch, notably the Executive Office of the President and cabinet agencies such as the Department of Commerce. There are important and long-standing reasons for this separation between independent and executive agencies—the separation avoids the appearance of political pressure on independent agencies. Separation is especially important for enforcement decisions—the FTC has true independence on what enforcement actions it brings, but the corollary is that the FTC is not “inside” the administration when it comes to creating administration policy. A variety of rules exist to limit the interaction of independent agencies and the executive branch; new White House officials, for instance, are briefed by counsel to exercise great caution in their interaction with independent agencies.

As an example of the constructive role in clearance played by the Department of Commerce, consider testimony in 2010 on the controversial question of whether and how to amend the Electronic Communications Privacy Act of 1986. ECPA is an important law for law enforcement—it sets forth the standards by which police and prosecutors can get access to emails and other electronic communications. ECPA, though, is also an important law about corporations and personal privacy. For corporations, ECPA sets the rules for what sorts of access to corporate databases should be permitted, under what circumstances and at what cost. For individuals whose records may be seen by law enforcement, ECPA creates the rules of the road for privacy protection, especially in our modern world when many records are stored in the “cloud” and thus at least potentially accessible to law enforcement.

ECPA thus provides one example of how multiple, compelling values can come into play in clearing the administration’s testimony to Congress. On September 22, 2010, both James Baker of the Department of Justice and Cameron Kerry of the Commerce Department testified before the Senate Judiciary Committee. Under the clearance rules, the testimony of both witnesses had to be shared in advance with the other, and the administration had to develop a common position. In my experience, sharing a draft document with an agency with a sharply different perspective is often extremely valuable—assumptions held in the initial agency get challenged, overstatements are modified, and the number of mistakes is reduced. Although I have no direct knowledge of the clearance process in this instance,[3] I think it quite possible that the presence of the Department of Commerce in the process helped create a more nuanced and privacy-protective administration position.

The ability of an independent agency such as the FTC to have a similar role in clearance is sharply limited. Based on my own experience, and on background discussions with people at the FTC, the FTC is not staffed well enough or situated close enough to the “inside” to engage on the day-to-day clearance of documents on the many law enforcement issues affecting commerce and privacy, including ECPA, the Communications Assistance for Law Enforcement Act, rules about encryption controls, and so forth.

From my time as chief counselor for privacy, I know that the number of privacy issues addressed by federal agencies is far greater than most people who have worked on privacy primarily with the FTC realize. I offer a list here as an illustration of the sorts of privacy issues that can arise in each of the cabinet departments. For many of the agency activities, there are important implications for commerce, providing a natural role for the Department of Commerce on commercial privacy issues. For others, the link to commerce is less direct, but broad-based experience with privacy issues at the Department of Commerce will facilitate development of a sound administration position on privacy:

Along with clearance, another role for the executive branch is to develop and announce the administration position in international settings. The green paper discusses the office’s role in international privacy activities, but it is worth explaining a bit how this would complement any international activities by the FTC.

The FTC plays at least three roles on international privacy issues. First, the FTC is the designated enforcement agency for complaints under the U.S.-E.U. Safe Harbor. Second, the FTC’s overall privacy expertise and convening functions inform international discussions about privacy issues, and there has been international cooperation on enforcement actions. Third, last year the FTC for the first time received full member status in the closed session of data protection authorities at the International Conference of Data Protection and Privacy Commissioners. Executive branch officials continue to attend the closed session, as they have since 1999, but with “observer” status.

These important FTC international activities, however, do not replace the need for the executive branch to have policy capability about privacy. For instance, privacy and e-commerce issues arise in a wide range of bilateral and multilateral trade negotiations—because transborder data flows are such an important part of modern commerce, data-related issues can arise as one piece of many larger trade negotiations, which often involve the International Trade Administration of the Department of Commerce. Some multilateral fora persistently address privacy issues, such as the Asia-Pacific Economic Cooperation forum and the Organisation for Economic Co-operation and Development. The U.S. delegations for these activities are led by the executive branch, with representation from the Commerce and State Departments.

More generally, the clearance process applies to developing and implementing the position of the United States in international negotiations. The FTC as an independent agency would have no basis for making representations, for instance, about what any executive branch agency would accept, including for law enforcement, homeland security, and non-privacy commercial issues. There is thus a sound basis for the green paper’s recommendation that the office “would work in concert with the Executive Office of the President as the Administration’s lead on international outreach for commercial data privacy policy.”

Whether privacy policy should be centered in the Commerce Department or the Executive Office of the President

I believe there is an extremely strong case in favor of developing an ongoing privacy policy capability in the executive branch. Privacy policy requires familiarity with a complex set of legal, technological, market, and consumer considerations. Good government thus calls for creating an institutional memory and a group of civil servants experienced in privacy policy. This privacy policy capability goes well beyond the need for federal agencies to comply with the Privacy Act and implement good practices for the personal information they hold.

Where to locate this privacy policy capability is less clear. In a 1998 book, Robert Litan and I discussed the question in detail, and concluded that a privacy policy office should be created in the Department of Commerce.[4] From 1999 until early 2001, by contrast, I served in the role of chief counselor for privacy in the U.S. Office of Management and Budget, and I have written reasons for supporting that approach as well.

The chief advantages and disadvantages are mirror images of each other. Placing the office in the Commerce Department allows for substantially greater staffing, increasing the chance that institutional expertise will accumulate through the ups and downs of public attention to privacy protection. The Commerce Department, however, will be only one of the various agencies that may have views on a particular privacy issue, increasing the risk that privacy will lose out in clearance. On the other hand, placing the policy leadership in OMB or elsewhere in the Executive Office of the President likely improves the possibility of effective coordination of privacy policy across the various agencies. Staffing, however, is always tight at the White House. The chief counselor for privacy, at most, had two full-time staff and one detailee from the Commerce Department.

One model worth considering is the position that Howard Schmidt now fills as cybersecurity coordinator. Mr. Schmidt is part of the national security staff, and also coordinates with the National Economic Council. My understanding is that a significant amount of support for the cybersecurity coordinator is provided by various agencies rather than directly by staff of the Executive Office of the President. A hybrid approach of this sort might achieve more effective privacy policy coordination while also retaining ongoing staffing.

This sort of role might also usefully integrate with the Privacy and Civil Liberties Oversight Board, for which President Obama recently nominated James Dempsey and Elizabeth Collins Cook. That board, to be effective, should have professional staff to carry out its task of working on privacy and civil liberties issues that affect anti-terrorist activities. As shown by the example of the Electronic Communications Privacy Act, anti-terrorist and law enforcement activities often have intricate interconnections with the commercial actors that own and operate most of the infrastructure for processing personal information. It quite possibly makes sense to permit dual tasking of personnel assigned to the board to work on privacy issues that concern commercial privacy. If this were done, an Executive Office of the President role for a privacy coordinator could be supported both by commercial privacy experts and persons assigned to the oversight board.

In short, various institutional choices might succeed for institutionalizing privacy policy in the executive branch. The privacy policy capability was diminished prior to 2009, and it is a good sign that the Department of Commerce green paper is reinvigorating the debate about how best to protect privacy while achieving other important goals.

Conclusion

The comments here show important tasks for a Privacy Policy Office in the executive branch, which would complement the FTC's ongoing privacy activities. Notably, such an office would improve interagency clearance, and it would be important in developing and stating the position of the United States government in international settings. Based on my own discussions with people at the FTC, the FTC does not have the budget or institutional structure to participate in all of the issues touching on commercial privacy throughout the federal government.

Because these functions complement the existing activities of the FTC, the general effect of such an office would be to improve privacy policy expertise and capabilities, contrary to the concerns expressed by some privacy advocates that such an office would undermine privacy protections. In addition to the advantages described above, executive branch participation in development of industry codes of conduct permits expert input from a range of federal agencies and also brings those agencies up to speed on evolving technology. Another advantage is that an executive branch privacy capability can lend force to privacy legislative or other initiatives—when both the FTC and the administration work together on an issue, the combined effect is likely to be greater than when an independent agency such as the FTC acts alone. Because the administration is likely to be asked to provide its views on important legislation in any event, the existence of an ongoing privacy office in the executive branch will lead to better-informed privacy policy decisions by the administration.

The existence of such an office would also provide a more effective structure for the administration to weigh privacy concerns against other competing policy goals and values. The hope, which I believe is supported by experience, is that participation by privacy experts in executive branch decisions increases the likelihood of win-win situations, in which both privacy goals and other goals are better achieved.

In short, the Department of Commerce deserves praise for advancing the idea of an ongoing Privacy Policy Office as part of its green paper.

Peter Swire is the C. William O’Neill Professor of Law at the Moritz College of Law of the Ohio State University, and a Senior Fellow at the Center for American Progress. From 1999 through early 2001 he served as Chief Counselor for Privacy in the U.S. Office of Management and Budget. From 2009 through August 2010 he served as Special Assistant to the President for Economic Policy, including on privacy and related technology issues.

Endnotes

[1] One reason may be the untimely death in 2003 of Barbara Wellbery, who worked tirelessly to address the issues of U.S. and E.U. relations in connection with the European Union Data Protection Directive and was instrumental to creation of the Safe Harbor privacy program that is now administered by the Department of Commerce.

[2] The conference invitation pushed me to write “Markets, Self-regulation, and Government Enforcement in the Protection of Personal Information,” my first article specifically on privacy issues.

[3] I served in the National Economic Council until August 2010, before the September 2010 testimony described in the text.

[4] Peter P. Swire and Robert E. Litan, None of Your Business: World Data Flows, Electronic Commerce, and the European Privacy Directive (Brookings, 1998), at 179-188.