Background:

The World Privacy Forum filed comments last week for the FTC Privacy Roundtables, the first of which will be held December 7, 2009. The WPF comments urged the FTC to consider the Fair Credit Reporting Act as a key privacy model to apply to additional areas and to use the full version of Fair Information Practices, and discussed how a rights-based framework is the key to advancing consumers’ interests. The comments discussed list brokers at length, explaining how even the most informationally cautious consumer will land on numerous marketing lists and databases. The WPF comments noted that not all marketing lists are used to target ads to consumers; some lists and databases are used to deny consumers goods and services. The comments contain a detailed section on privacy frameworks, a section on direct marketing, and an appendix with supporting information.


Comments of the World Privacy Forum

Re: Privacy Roundtables – Comment, Project No. P095416

Dear Commissioners,

Thank you for planning the upcoming Exploring Privacy Roundtable Series, to be held from December 2009 through 2010. The World Privacy Forum supports the Commission’s goal of determining how consumers may use and benefit from modern technologies while retaining robust privacy protections. It is an important balance to achieve, and one that does not currently exist.

Our comments are focused on two areas: existing legal requirements and consumer expectations of privacy.

I. The Effectiveness of Existing Legal Requirements and Self-Regulatory Regimes

In its request for comments, the Commission asked: “Do the existing legal requirements and self-regulatory regimes in the United States today adequately protect consumer privacy interests?” We will discuss the Fair Credit Reporting Act and how it can serve as a model for new regulations, Fair Information Practices, and joint regulation.

A. The FCRA Model for New Regulations

Perhaps the most successful (though not perfect) longstanding privacy law is the Fair Credit Reporting Act (FCRA). Congress passed the FCRA after years of persistence by a Senator who understood (1) the essential importance of credit reports in the lives of consumers and in the operation of the economy, and (2) the lack of any rights or due process for consumers in the credit reporting system.

We now see history repeating itself with other forms of detailed and unregulated reporting about consumers. The difference now is that the emphasis has shifted from the credit reporting system to other areas. We have already seen an explosion of non-credit, unregulated consumer reporting. Anyone can buy a dossier with personal information about almost any individual over the Internet for a few dollars. Often based on public record information, these dossiers provide basic identity, location, and history information about individuals. These dossiers are just a precursor of what is to come in the online and offline world. One driver of these activities is online advertising. We note that the stakes are not limited to advertising, nor to online; however, online advertising bears some further discussion in this context.

The online advertising industry’s behavioral targeting activities monitor various aspects of computer usage, collecting information from largely unsuspecting consumers. The goal is to serve more ads with somewhat improved efficiency. However, what is really occurring is an expansion of consumer monitoring using new technology that lowers the cost of data collection.

In effect, behavioral targeting is closing the circle on consumer monitoring. It began with credit reporting, which managed to overcome the costs of data collection in a pre-computerized world because of the significant economic incentives. The development of later styles of consumer non-credit profiling activities, with lower value, was possible only because the costs of data collection were reduced by advances in technology.

The final step, also supported by low-cost technologies, is what we are calling “universal consumer monitoring.” We are aware that the phrase “universal consumer monitoring” can sound like an overreach at first blush. We are using the phrase as a descriptor of monitoring that captures online and offline information in a pervasive manner. Even the most information-conscious, privacy-sensitive consumer cannot escape being profiled, no matter how careful his or her information habits. In Section II of these comments, we discuss this in more detail and with concrete examples. For now, speaking in broader terms to set a frame for discussion of the FCRA, it is fair to say that there is no limit to the amount of personal information that the advertising industry wants.

Currently, the advertising industry wants to track and micro-target individuals as they use the Internet. The narrow goal of more focused advertising will be served by recording every website visited, every page reviewed, every ad served, every link clicked, and every interest expressed. If tracking to this extent has not yet been proposed, it is only a matter of time until it will be. The result of this sort of pervasive tracking, if it is allowed to occur, will be the creation of the most detailed profiles yet on individuals, with plenty of crossover to offline data sources.

Other types of offline consumer monitoring, such as RFID, video surveillance, face recognition, cell phone tracking, and traffic monitoring are also dropping in cost. In the service of better, more efficient advertising, future consumer profiles and databases will include multiple sources of information in addition to “online” information; this can include geo-location information, products that a consumer touched in a supermarket or retail store, retail items purchased in-person, and various business transactions such as activating a credit card. Commercial companies have no incentive to discard data, and the costs of storage may be less than the costs of deletion.

Databases of consumer identities and transactions attract secondary users, and this is especially the case as the database compilers seek to find new sources of revenue. Secondary users will include government law enforcement at all levels, employers, insurance companies, schools, public health authorities, litigants, landlords, parents, stalkers, and others. The information – like credit reports – will be used to make basic decisions about the ability of individuals to travel, participate in the economy, find opportunities, find places to live, and purchase goods and services, and to make judgments about the importance, worthiness, and interests of individuals. The information will also be used to predict consumer behavior. This will all happen without the knowledge or participation of consumers. Secondary use of unregulated, non-credit consumer information is already commonplace without any consumer awareness, with the government being perhaps the largest customer.

It is within bounds to predict that individuals may be held accountable in the future for every click they have made on a webpage. If offline tracking increases using new technologies, this can extend to most places an individual goes, and most things an individual does in the course of daily life. In Section II of these comments we give current examples of consumers being profiled based on online and offline behavior, and having predictions made about them based on their demographics and their behavior. Unless consumers take precautions – and often even if they do – the tiniest details of their online and offline lives will be relentlessly collected, compiled, sold, and exploited in ways that exceed anything that has already happened.

Consumers are already being denied goods and services due to database profiles stored about them. But politicians and government workers may be particularly vulnerable to the reputational aspects of increased consumer profiling. Imagine what a confirmation hearing for a Supreme Court Justice might be like in a few years, when the record of the nominee’s “lifetime Web activities” or complete Web search history or Experian Consumer Database File is demanded by the Senate. Will there be a day when a casual or accidental click may prevent anyone from fulfilling his or her personal ambitions? Will there be a day when a consumer database or combination of consumer transactional databases are used to create a predictive modeling score on a Supreme Court nominee?

If the government proposed to compile these sorts of files on citizens, there would be an outcry from people on all sides of the political spectrum. Instead, the advertising and direct marketing industry is developing this system without any public attention to the consequences. The information compiled for advertising has no privacy protections today, and the industry is in complete charge of the activity. The information collected can be sold to government users and to anyone else because there are no rules that say otherwise.

We do have one prediction about how the dossiers and databases and the information they contain will not be used: they will not be used for any activity that is regulated under the FCRA. No dossier compiler will want to fall under the provisions of the FCRA that provide individuals with access, correction, and other rights that dossier compilers find inconvenient. Therein lies the key to the solution.

What is needed is an expansion of the FCRA and its principles to cover all consumer dossiers, databases, and files available for use by anyone who might affect a consumer’s rights, benefits, privileges, or opportunities in government, commercial space, or on the Internet.

Some dossier activities should be banned altogether.

Other dossier activities should have strict time limits, much stricter than the seven years allowed under the FCRA.

The compilation of other information should be allowed only with the affirmative, time-limited consent of the data subject.

Individuals should have the right to stop dossier activity and to force the absolute, permanent, and immediate expungement of all dossiers.

Dossiers associated with an individual should be banned for anyone under the age of 16, and all dossiers on individuals should be expunged when they reach the age of majority.

Consumers should have a right to see and change their dossiers at no cost.

The legislation needed to implement these ideas will not be easy. However, what is most important is that we recognize the stakes in the current limited public debate about online behavioral ad targeting. Those who are targeting consumers have no natural limit to the information that they want to collect and exploit. They want to know everything that it is possible to know about consumers so that they can serve better ads, target campaigns, sell more products, categorize consumers into optimal groups for either pitching services or identifying risky consumers, and in some cases, denying opportunities or services.

The issue that a democratic society must debate is whether the prize here – slightly more efficient advertising – is worth the cost and the consequences. Mild-mannered limitations on behavioral targeting that some are considering at present will not be enough to head off the problems that loom. Consumers need substantive control over their data. We need to look further down the road and build appropriate protections.

The stakes here are far greater than Internet advertising or the current model for Internet services. We need to remember what was happening with credit reports before the FCRA. In a similar manner, online and other forms of digital tracking will record the tiniest details, and these details will be used to control, shape, and affect consumer activities in subtle and not-so-subtle ways. This is what happened with credit reports, which have found other uses in spite of regulation. The importance of non-credit consumer profiles in our lives will exceed the importance of credit reports if the non-credit profiles are unrestricted. We need to develop regulatory protections that will place limits on these activities before these practices become cheaper and even more entrenched in business practices.

B. Privacy Standards

We have a very good set of international privacy standards that were created originally in the United States, that have been blessed in U.S. and foreign legislation, and that are perfectly adaptable for present purposes. Those standards are Fair Information Practices (FIPs). For a short history of FIPs, see Robert Gellman, Fair Information Practices: A Basic History. [1]

The version of FIPs from the Organisation for Economic Co-operation and Development represents the gold standard of privacy principles.[2] The eight principles set out by the OECD are:

Collection Limitation Principle: There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.

Data Quality Principle: Personal data should be relevant to the purposes for which they are to be used and, to the extent necessary for those purposes, should be accurate, complete, and kept up-to-date.

Purpose Specification Principle: The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.

Use Limitation Principle: Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with [the Purpose Specification Principle] except: a) with the consent of the data subject; or b) by the authority of law.

Security Safeguards Principle: Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.

Openness Principle: There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.

Individual Participation Principle: An individual should have the right: a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him; b) to have communicated to him, data relating to him within a reasonable time; at a charge, if any, that is not excessive; in a reasonable manner; and in a form that is readily intelligible to him; c) to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended.

Accountability Principle: A data controller should be accountable for complying with measures which give effect to the principles stated above.

In 2000, the Commission issued its own incomplete version of FIPs. [3] That statement of FIPs appears to have been abandoned, and it should not be revived. We see no reason for the Commission to deviate from the FIPs principles in general use around the world.

To be sure, the OECD version of FIPs principles may not be perfect. There may be a need to consider, for example, whether there should be a principle addressing anonymity or pseudonymity. Nevertheless, the principles as they exist today are broad enough and general enough for the purpose.

Given the Commission’s authority and its limitations, any time spent on developing or revising basic principles will not be constructive. The Commission should accept the full statement of FIPs and move on from there. Having a workshop on the need to reconsider FIPs in a couple of years is a reasonable idea, however.

C. Joint Regulation or Negotiated Regulation

Every time that the Congress or the Commission examines corporate privacy practices, a panicked industry promotes the benefits of self-regulation as loudly as it can. We have seen a long history of privacy self-regulatory efforts that go through a predictable life cycle:

1. A threat of privacy legislation or regulation sparks a new industry self-regulatory effort.
2. The new or revitalized self-regulatory program begins with much fanfare and many promises. The self-regulatory program blesses every current business activity and condemns only those activities that are already illegal, not profitable, or not planned. Much glitz, but there are actually few, if any, meaningful new privacy protections for consumers.
3. As time passes and pressure for change dissipates, the self-regulatory program slowly loses members and interest. If the self-regulatory standards inhibit new profitable activities, the standards are weakened, ignored, or not enforced.
4. The self-regulatory program disappears entirely or remains only as a shell.

This cycle has repeated itself enough times that we see no reason to continue to attempt meaningful consumer protections through this mechanism. See, for example, the history of the IRSG. We have seen the Network Advertising Initiative (NAI) go through this complete cycle once already, and are now seeing the NAI begin a second “self-regulatory cycle.” [4]

A strange event in the privacy self-regulatory world has been the evolution of TRUSTe into a for-profit company. Whether TRUSTe was useful in its non-profit mode is an open question. However, the World Privacy Forum does not see any way that self-regulation can be accomplished in a for-profit context where all of the revenues of the regulatory organization come from the ranks of those who are supposedly being regulated. This is a problem for non-profit self-regulators as well.

In the privacy arena, history demonstrates that self-regulation does not work. A major reason is that self-regulation lacks tension. Regulation by business for business and for the principal purpose of avoiding formal regulation cannot work for privacy. Consumers have no meaningful voice, and attention by the Commission or other potential regulators is inconsistent and short. Once the pressure is perceived as dissipated, there is no incentive for the self-regulators to continue with their self-imposed discipline.

If the Commission wants to continue to find ways to improve privacy protections for consumers without legislation or formal regulation, we suggest that it look for a new form of self-regulation. Our proposal is for joint regulation, a method of developing self-regulatory principles and structures based on formal participation by those who process personal information and those who are the subjects of personal information. In other words, any type of informal regulation needs the tension that comes from having two sides struggling over the proper type of protections. We see joint regulation as a form of negotiated rulemaking, but without a formal government rule resulting from the process.

In the future, the Commission should only recognize a privacy self-regulatory effort that relies on joint regulation developed from beginning to end by representatives of both industry and consumers. The effort should include some type of enforcement agreed upon by both sides. There could be a role for neutral parties to play in the development or enforcement process. The Commission, the States, and other governmental entities could also participate.

We are not sure that business will be interested in genuine joint regulation with consumers. History shows that business is only interested in privacy regulation that it can control and abandon. If one-sided privacy self-regulation is all that business will accept, then we urge the Commission to end the charade and reject those privacy self-regulation efforts as badly motivated, ineffective, short-lived, and offering nothing useful to consumers.

II. Consumer Expectations of Privacy in the Online and Offline World

The Commission asked for comments regarding consumer expectations of privacy with this question: “Are there commonly understood or recognized consumer expectations about how information concerning consumers is collected and used? Do consumers have certain general expectations about the collection and use of their information when they browse the Internet, participate in social networking services, obtain products from retailers both online and offline, or use mobile communications devices?”

Many issues could potentially be discussed in relation to this question, but we will focus on just one area: the universe of activities surrounding direct-to-consumer marketing. We believe significant consumer privacy interests are being ignored in this area, and that consumers are in fact already experiencing a variety of harms. Consumers’ expectations of privacy with regard to their information and transactions are legitimate, but what consumers think is happening to their information is far removed from the reality of current business practices.

Over the years, we have watched as databases filled with consumer information gleaned from offline and online sources have been compiled, exchanged, sold, and stitched together. Marketing has changed with the times, and has become extraordinarily sophisticated. There is a good deal of policy focus at the FTC and to some degree in Congress on the use of consumer information in online behaviorally targeted advertising. There are legitimate reasons for concern in this area. However, it must be said that online behavioral advertising is just one aspect of an entire complex of consumer data collection, exchange, use, and reuse. The universe of this challenging consumer privacy issue is large indeed, and is relatively untouched by any meaningful regulation of any sort.

At the 2009 Direct Marketing Association annual meeting this October, vendors and practitioners discussed the latest advances in real-time consumer tracking, micro-targeting to the individual, and data appending, with plentiful examples. Of note was the persistent emphasis of merging online and offline information sources. Also of note was a strong emphasis on predicting consumer behavior based on past behavior, or even on known relationships with other businesses or other consumers. The discussions at this event generally typify the industry trends.

In the past, marketers focused on acquiring information about the customer. For example, acquiring the age, gender, ethnicity, and so on of a customer was a prime goal. But now, demographic information is just the beginning. The newer model is transactional information tied to individual consumers, sliced and diced into scores and predictions.

A marketing list called Consumer TransactionBase had this to say about why a list of 77 million-plus consumers was so valuable:

Transactional data can be leveraged by direct marketers to gain powerful insight into a household’s needs and wants. Through the examination of past spending patterns, marketers are able to analyze and predict future purchasing behaviors.

Consumer TransactionBase compiles SKU-level transactional data from a variety of online and offline retailers to offer a complete view of economically active purchasing households. Additional uses for this detailed data set include modeling and analytics as well as data enhancement.

The Consumer TransactionBase file is updated quarterly. Compilation comes from a leading nationwide cooperative database of consumer purchasing activity. Company and industry usage restrictions may apply. [5] (emphasis ours)

What does all of this mean to consumers? If consumers simply go about their daily lives, are cautious with their information, conservative about who sees their Social Security Number, shred their bills and pre-approved credit card offers, use safe computing practices, and so forth, they will still have detailed information about their private and in some cases professional lives collected, bundled, bought, traded, sold, compiled, layered, appended, and in general used in various ways to target them or to deny them goods, services, and opportunities.

Right now, consumers generally do not know what is happening to them, and even if they did, they do not have sufficient rights to manage the information marketplace they find themselves in. Regardless of how cautious and informationally conservative consumers are, they do not have the ability to live a modern life and avoid being systematically profiled. Consumer profiling is currently unavoidable for the majority of consumers. We believe this truly defies consumer expectations of privacy.

The sheer volume of profiling data already being exchanged about consumers can be seen in the Experian Consumer Database. This database contains approximately 215 million consumers in 110 million living units nationwide.

The data card for the list states:

Target people by exact age, gender, estimated income, marital status, dwelling type, families with children, telephone numbers and a variety of other selections. The vast quantity of names on this database and its varied selection capabilities make this one of the largest and most flexible lists on the market today.

The data card additionally states in regards to predictive targeting:

Experian’s Quick PredictSM modeling process is designed for marketers with small to medium-size customer databases that are looking for a cost-effective modeling solution. Quick Predict gives you fast results for acquisition, retention and cross-sell campaigns and to enhance your market research efforts of current customers. We run acquisition models against Experian’s extensive consumer data resources, providing you with a steady stream of potential new customers.

To take a concrete example of a data collection practice that almost everyone can identify with: customers at retail stores who are asked for their zip code do not understand that the zip code they offer leads to a universe of additional new information about them. This practice of “data appending” in the retail environment is a significant point of data collection. While the zip code may be acquired at the retail cash register, that zip code can be and in some cases is merged with substantial amounts of other information, including information from other databases, which may include offline and online information.

This sort of data activity is often trivialized by those using the data. One frequently encountered argument is that this data activity is fine because consumers want better ads, products, and services. But there is no good empirical proof that this is what consumers want. Beyond that, it is crucial to understand that this profiling is not just used to offer goods and services; it is also used to deny consumers opportunities, products, and services. This is especially problematic when predictive analysis based on transactional data is used to categorize consumers.

Note, for example, the database of consumers who have disputed charges on their bills; certain of these customers are put into a database that is marketed as “badcustomer.” The badcustomer.com web site states: “Are your purchasing transactions being denied? Find out if you’ve been blacklisted before it’s too late.” [7]

We have a question about identity theft victims — individuals who have to dispute charges. Are they in this database? What services, goods, and opportunities will victims of identity theft be denied because they are in this database? How many lists like this exist that consumers don’t know anything about?

We also note that to get off the badcustomer list, consumers must supply detailed information online. How are consumers supposed to hear about every database list like this? How is badcustomer.com using the consumers’ information after receiving it? Is this company doing more than just taking people off of the bad customer list?

We suggest that consumer data collection is out of control, with no balancing consumer rights or requirements for transparency to counterweight the collection and usage activity. As we discussed in an earlier section of these comments, we believe the institution of a rights-based approach that combines FCRA-like rights with additional Fair Information Practices rights will address this lack of balance.

We also want to note that most consumers would be completely appalled to discover the ways they are being categorized on marketing databases and lists, and appalled at the type of information being sold about them. For example, on a recent search we found 18,684 marketing lists containing the keyword “bad credit.” We found 414 marketing lists containing the keyword “impulse.” We found 1,282 marketing lists containing the keyword “mental problems.”

These marketing lists contain millions upon millions of consumers, along with, typically, their name, age, gender, income, state, and a great deal of other detailed demographic information. Some lists also contain transactional and merged information. These lists exist outside of most regulatory structures. For example, many consumers have a vague idea that HIPAA will protect their health information no matter where that information exists. These consumers would be horrified to learn that it is not at all unusual to find highly sensitive health information offered up for sale in these lists.

Note for example the MedNet Mental Health Problems list. We think that many of the consumers named on this list are not likely to know they are on the list. We also think that many of the consumers named on this list would like the option to delete their names and identifying information from this list.

In this list, 2,985,634 consumers with “wide-ranging mental health issues” are identified, including segmented categories of people with depression, poor memory, autism, eating disorders, and other states this list identifies as “mental problems.” The data card for this list states:

“Mental health problems can create a significant burden on the afflicted individual, making them extremely receptive to any campaign that may be able to offer some assistance or relief.” [8]

Returning to the issue of targeted marketing and how consumers purportedly like it, it is unlikely that the caretaker of an autistic adult would be happy to know that this person is being targeted because he or she will be “extremely receptive” to certain types of campaigns. We have included a screen shot of the data card in the appendix of these comments.

We also think that some of the 6 million people on the Credit Card Declines marketing list would like to know they are on a list of people who have been declined for major bank cards, and would like the opportunity to delete their age, the age of their children, the gender of their child, dwelling type, ethnicity, and other information from the list and databases associated with it. [9]

There is an industry argument that consumers land on these lists and in these databases because they have given up their information freely. This may have been true at one time, but it no longer holds universally true. Consumers get on these lists just from conducting their lives; even the most informationally conservative consumer can land on these lists. This completely defies consumers’ expectations of privacy. One example is the Passport to Credit – Newly Activated Credit Cards list, a list of 18 million consumers sourced from a credit card transaction processor. The data card states:

This dynamic database is sourced from a credit card transaction processor, not from the source who issues the cards. You can select change of address, number of transactions, number of credit cards, type of credit card and more! [10]

Some lists and databases are an assault on the dignity of the people named in them. One list, Fat Burner II, targets obese and morbidly obese consumers. The data card states: “These weight watching consumers will try anything in hopes of being healthy.” [11] Another list, Free to Me – Impulse Buyers, targets people who made recent online purchases because they received something free with their purchase. The data card states: “Free To Me – Impulse Buyers are very quick to respond to offers that come in the form of contests, sweepstakes, or other free products and services.” [12]

The World Privacy Forum understands that businesses have a right to exist and to make money, and that advertising and marketing are part of the marketplace. But we also believe that there is currently no reasonable balance between the data being collected and used and what consumers can do to manage that data and their privacy. There are no perfect solutions, but we think that a rights-based framework built on approaches contained in the Fair Credit Reporting Act and on Fair Information Practices will address many of these problems and help create solutions that are equitable for all stakeholders.

Respectfully submitted,

Pam Dixon
Executive Director,
World Privacy Forum

Appendix

_____________________________________________
Endnotes

[1] Robert Gellman, Fair Information Practices: A Basic History <http://bobgellman.com/rg-docs/rg-FIPshistory.pdf>.

[2] <http://www.oecd.org/document/18/0,2340,en_2649_34255_1815186_1_1_1_1,00.html>. There are equivalent statements from the Council of Europe and from the Canadian Standards Association, but the differences are minor. The Privacy Office at the Department of Homeland Security in 2008 issued its own Fair Information Practice Principles that closely match the OECD version. Privacy Policy Guidance Memorandum (2008) (Memorandum Number 2008-1), <http://www.dhs.gov/xlibrary/assets/privacy/privacy_policyguide_2008-01.pdf>. The DHS issuance is noteworthy since it implements the first statutory reference to fair information practices in U.S. law.

[4] For a more complete discussion of the NAI and self-regulation, see World Privacy Forum, The Network Advertising Initiative: Failing at Consumer Protection and at Self-Regulation, November 2007. <http://www.worldprivacyforum.org/pdf/WPF_NAI_report_Nov2_2007fs.pdf>.

[5] Consumer TransactionBase, <http://listfinder.directmag.com/market;jsessionid=D111DD2A12B5CAE409CBCBE160539072?page=research/datacard&id=267942>. Last accessed November 6, 2009. Screenshot of this data card is available in the Appendix of these comments.

[6] Experian Consumer Database, Nextmark ID 84312, <http://listfinder.directmag.com/market;jsessionid=749F1DAB78232862B6E4A48F4C9A7120?page=research/datacard&id=84312>. Last accessed November 6, 2009. Screenshot of this data card is available in the Appendix of these comments.
