This is the November-December 2018 recap issue of CDT’s monthly EU Tech Policy Brief. It highlights some of the most pressing technology and internet policy issues under debate in Europe, the U.S., and internationally, and gives CDT’s perspective on them.

NetzDG: Report Examines Impact of Germany’s Law on Free Expression

The Center for European Policy Studies and the Counter Extremism Project recently published a report examining the impact of Germany’s NetzDG law on free expression, and its effectiveness in fighting hate speech, nine months after it entered into force. At the launch event for the report, CDT’s European Affairs Director, Jens-Henrik Jeppesen, highlighted CDT’s longstanding concerns with the law. In particular, it is impossible to determine whether content removed under the law is in fact illegal, as it is never reviewed by a court. Disagreement about what is and is not illegal speech is reflected in very low takedown rates (between 10 and 28%). This underscores the need for court decisions in free speech cases and illustrates that flaggers, reviewers, and automated tools cannot be trusted to make these decisions.

E-Evidence: Council Adopts “General Approach” Despite Concerns from Civil Society and Various Member States

During October’s Justice and Home Affairs (JHA) Council, several Member States noted the lack of review of European Production Orders by authorities other than the ones issuing them. We expressed this concern in our analysis, and this point was also raised by the European Data Protection Board in its Opinion. However, the JHA Council agreed on a “general approach” on 7 December over the objections of several Member States, and over concerns raised by several civil society groups in a joint letter to Ministers. The adopted text does not satisfactorily address these concerns. Meanwhile, the Civil Liberties (LIBE) Committee of the European Parliament held a hearing on 27 November to gather feedback on the proposal from a broad range of stakeholders. Many speakers, such as the German Judges’ Association, questioned the necessity of the proposed legislation and proposed amending the European Investigation Order as an alternative. Moreover, the rapporteur and shadow rapporteurs in LIBE have identified several areas that need further clarification to reach a sound text, including scope and safeguards. Parliamentary scrutiny of the proposal is likely to continue after the May 2019 elections.

Terrorist Content Regulation: Council Rushes Through Compromise Text

On 6 December, the Austrian Presidency rushed through a compromise text for adoption by the JHA Council. Ministers agreed on a very problematic “general approach”, but several Member States share the concerns we identified. A joint letter we submitted to Ministers with many other digital policy groups questions whether the Regulation is necessary in view of the numerous counter-terrorism initiatives already taken, and highlights serious free expression concerns. It’s now up to the European Parliament to take the time to address the issues Member States did not.

DSM Copyright Directive: No Compromise in Sight

Negotiations on Article 11 (press publishers’ right) and Article 13 (upload filtering obligation) of the Copyright Directive proposal continue to prove difficult. Ahead of the 23 November COREPER 1 meeting, we joined human rights, privacy, civil rights, and media freedom organisations, software developers, creators, journalists, radio stations, higher education institutions, and research institutions in signing an open letter highlighting ongoing concerns with the proposal. “Both the Council and the Parliament texts risk creating severe impediments to the functioning of the Internet and the freedom of expression of all,” reads the letter. Earlier in October, CDT proposed a series of amendments and recommendations for Articles 11 and 13 aimed at preserving the open nature of the internet. In particular, we call for mitigation measures for liability to be included in Article 13; for hyperlinks and insubstantial parts of text to be excluded from the scope of Article 11; and for a broad, mandatory text and data mining exception in Article 3. These contentious points were left undecided during what was supposed to be the ‘final’ trilogue meeting on 13 December. The next trilogue will take place during the first plenary of 2019.

Controversial French Surveillance Regulation to be Scrutinised by CJEU

France’s highest administrative court (“Conseil d’État”) decided to refer France’s legislation enabling expanded electronic surveillance to the Court of Justice of the European Union (CJEU) for a preliminary ruling. The decision follows from two lawsuits brought by French Data Network, La Quadrature du Net, and the Fédération FDN (a federation of non-profit Internet access providers), joined by Privacy International and CDT. We have opposed this legislation since its inception in 2015, and have called for action at the EU level to protect human rights in the surveillance context. On 4 December, CDT filed a brief (in French) with the CJEU challenging French surveillance and data retention laws.

French Parliament Adopts New Law Against ‘Fake News’

On 20 November, the French National Assembly adopted new legislation to combat ‘fake news’, particularly in the context of elections. The new law calls for increased scrutiny of online platforms in the months preceding elections, and would empower judges to order takedowns of content they deem ‘fake news’ during election campaigns. We continue to worry that any such law could be used to censor free speech, particularly given the vague definitions of the content to be considered ‘fake’. The European Commission has also identified tackling disinformation, particularly ahead of next year’s European elections, as a top priority. It recently outlined an action plan to step up efforts to counter disinformation in Europe. One key area of the plan focuses on collaboration with online platforms and industry in the context of the self-regulatory ‘Code of Practice’ adopted earlier this year. The Code delivers commitments in five main areas: scrutiny of ad placements, political and issue-based advertising, integrity of services, empowering consumers, and empowering the research community. While we believe such commitments are necessary, we remain concerned that the overall process is oriented toward pressuring platforms to remove or suppress content and accounts without meaningful safeguards.

Reasonable. You keep using that word. I do not think it means what you think it means.

In March of 2018, CDT filed a legal challenge to the FCC’s “Restoring Internet Freedom” (RIF) order, in which the FCC removed all of the net neutrality rules it had put in place in 2015. The Commission also reclassified broadband internet access service (BIAS) as an “information service” subject to the weaker regulatory authority of Title I of the Communications Act, and disavowed the remaining sources of its own authority to implement such rules at all. Instead, the FCC now relies solely on a weakened transparency requirement and market forces to ensure that ISPs refrain from leveraging their gatekeeper positions to control customers’ access to the internet or to exert influence on providers of online services.

We challenged the RIF’s rule repeal for four reasons. First, CDT has advocated for effective protections for an open internet for more than a decade; the RIF erased them all. Second, CDT was part of the legal battle that successfully defended the 2015 Open Internet Order against challenges from ISPs and industry groups, so our challenge to the repeal of those rules is a natural extension of that effort. Third, CDT believes that formal regulation of network management practices is necessary in the absence of a sufficiently competitive marketplace for internet access, and we disagree with the Commission that today’s broadband market is competitive enough to ensure market-driven self-regulation. Fourth, we strongly disagree with the Commission’s reading of a single word in the statutory definition of “information service”, a reading that would effectively preclude the application of Title II to any type of communication technology, potentially destroying the FCC’s strongest source of authority.

After filing suit in March, CDT and the other petitioners and supporting intervenors, including public interest groups, trade associations, companies, state attorneys general, and more, have been briefing the judges of the D.C. Circuit on why the court should strike down the RIF. As we’ve written before, the legal side of the net neutrality policy battle boils down to how the FCC classifies broadband. The Communications Act offers two choices: either broadband is a “telecommunications service” subject to regulation under Title II, or an “information service” covered by Title I. Essentially, telecommunications involves transmitting information, unchanged, between network endpoints, while information services involve processing, changing, or manipulating information.

In the only Supreme Court case to address the issue, National Cable & Telecommunications Association v. Brand X Internet Services (Brand X), the Court found that the internet access service “offered” by cable-based ISPs (in 2005) contained a mix of both telecommunications and information services. But since the two service classifications are subject to two mutually exclusive parts of the Act, the FCC must classify internet access service as one or the other. The key to properly classifying the service, according to the Court, was to determine whether consumers viewed the “offering” as an inseparable mix of the two kinds of services or whether there was a standalone “offering” of the telecommunications component of the cable modem service in question.

Ultimately, the Court said that the FCC’s decision to classify the cable modem service as an “information service” was reasonable because consumers saw the “offering” as a combination of information services, such as the ISP’s own web hosting and email, combined with the telecommunications service necessary to transmit data between the customer and those services. In his dissent, Justice Scalia compared the information service component to a pizzeria and the telecommunications component to pizza delivery. That analogy gives a glimpse of how prominent the information service components (email, web hosting, “home pages”) were, at least in the mind of one Justice, in how consumers viewed internet access at the time.

Now, of course, internet access and the Web have moved beyond the walled gardens initially offered by ISPs, as have consumer perceptions. The FCC, under Chairman Wheeler, recognized this in 2014, demonstrating that people now see their ISPs primarily as conduits to the rest of the internet, responsible for transmitting information between subscribers and providers of “edge” services like email, social media, and music streaming. Many, if not most, subscribers never use the information services, like ISP-hosted email, that providers bundle with internet access. So how did the Pai Commission justify its switch back to the view that internet access is an information service?

In short, the Commission ignored the consumer perception test in Brand X and focused instead on the word “capability” in the statutory definition of “information services.” The Commission reasoned that internet access provides the “capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications” because without ISPs, people would not be able to access the services at the edge of the internet that actually provide the capability to do those things.

The D.C. Circuit rejected this argument when ISPs made it in US Telecom v. FCC in 2016, but there’s another problem: to the extent that internet access can be said to provide such “capability”, it does so only by providing the telecommunications component between end users and the edge services they connect with. The FCC’s new interpretation ignores the “via telecommunications” portion of the definition; under it, even telephone service (the original Title II “telecommunications service”) would be an information service, since it provides the “capability” of generating (talking), acquiring (listening), and storing (interacting with voice mail) information. The phone system and internet access do this in the same way: by transmitting information between endpoints.

The Commission’s fall-back argument is that, regardless of consumer perception, internet access is “functionally integrated” with two services most consumers have never heard of: DNS and caching. I have written about DNS before (how it works and the privacy concerns it presents), but it is the system that matches up the text-based addresses we type into our browser bars or click on (https://cdt.org) with the IP address of the server hosting that domain. DNS is essentially the phone book of the internet and ISPs are not the only providers; there are several free, public options that also offer enhanced privacy protections (1.1.1.1, 9.9.9.9, 8.8.8.8).
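
To make the separability concrete, here is a minimal sketch in Python (standard library only) of the lookup step DNS performs. Which resolver actually answers the query is a system configuration detail: it may be the ISP’s default or a public service such as 1.1.1.1, which is exactly why DNS is not inherently bundled with the access service.

```python
import socket

# Resolve a domain name to IP addresses. This is the DNS lookup step:
# translating a human-readable name into the numeric addresses the
# network actually uses to route packets. The operating system sends
# the query to whichever resolver it is configured to use -- the ISP's
# by default, or a public one like 1.1.1.1 if the user changes it.
def resolve(hostname: str) -> list[str]:
    results = socket.getaddrinfo(hostname, None)
    # Each result tuple ends with a sockaddr; its first field is the address.
    return sorted({res[4][0] for res in results})

if __name__ == "__main__":
    print(resolve("cdt.org"))
```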

Caching is the practice of storing copies of popular content on servers close to or within the ISP’s network to reduce the time it takes to deliver that content to end users. Again, ISPs are not the only ones caching content; this is also what content delivery networks (CDNs) do. Beyond that, customers don’t ask their ISPs to cache, nor do they choose one provider over another based on its caching service. ISPs cache content because it conserves network resources, reducing the traffic load on routers, switches, and interconnection points. More importantly, neither DNS nor caching is part of what consumers think they are buying with their internet access service, namely, a high-speed connection to the internet.
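
As an illustrative sketch (header names and behavior vary by site and by whichever CDN or cache sits in the path, so results are not guaranteed), the script below fetches a URL twice and prints the headers that commonly reveal whether a cached copy was served:

```python
from urllib.request import urlopen

# Fetch a URL twice and inspect cache-related response headers.
# An "Age" header greater than zero means the response was served from
# an intermediary cache (a CDN or ISP cache) rather than generated
# fresh by the origin server. "X-Cache" is a common but non-standard
# header some CDNs use to report HIT or MISS; many sites omit it.
def inspect_cache_headers(url: str) -> None:
    for attempt in (1, 2):
        with urlopen(url) as resp:
            age = resp.headers.get("Age")
            x_cache = resp.headers.get("X-Cache")
            print(f"request {attempt}: Age={age} X-Cache={x_cache}")

inspect_cache_headers("https://cdt.org")
```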

From CDT’s perspective, even if DNS and caching are information services and inseparable from internet access, they are such a small component of the service compared to the telecommunications component (shuttling packets across the internet) that classifying internet access as an information service is like classifying air travel as an entertainment service because some planes offer in-seat movies. Or, as the petitioners’ reply brief points out, adding a few drops of fresh water to the ocean does not turn it into a lake.

The courts will decide whether the FCC’s approach, which misconstrues some words while ignoring others, is reasonable. Despite the FCC’s repeated insistence that its approach is reasonable — a word the Commission uses 144 times in its opening brief — it is not.

CDT’s Tech Talk is a podcast where we dish on tech and Internet policy, while also explaining what these policies mean to our daily lives. You can find Tech Talk on SoundCloud, iTunes, and Google Play, as well as Stitcher and TuneIn.

‘Tis the season for online shopping, but of course with that can come online fraud, online privacy violations, or online ads tipping off your spouse about what you bought them.

Thankfully, Call for Action has a guide for all online shoppers about what we should be doing to keep our data safe and secure this holiday season. Their Executive Director Ed Bartholme joined us to talk through their tips.

Privacy is a fundamental human right. Physical safety, free expression, access to justice, and economic security depend on it. For too long, Americans’ digital privacy has varied widely, hinging on the technologies and services we use, on the companies that provide those services, and on our capacity to navigate confusing notices and settings. It’s time for Congress to pass legislation providing comprehensive protections for personal information that can’t be signed away.

Civil society, industry, and policymakers across the aisle and at every level of government have called for privacy legislation. But designing meaningful, workable privacy protections is no easy task. Existing privacy regimes rely too heavily on the concept of notice and consent, placing an untenable burden on consumers and failing to rein in harmful data practices. For legislation to be more than a band-aid, we have to rethink the relationship between businesses and the people whose data they hold. We need to establish sensible limits on data collection, use, and sharing, so that people can entrust their data to companies without accepting unreasonable risk.

To advance this dialogue, CDT has put forth a draft federal privacy bill for discussion. We hope this draft will inspire feedback and collaboration from all stakeholders and serve as a resource for decision makers who seek to rebalance our privacy ecosystem in favor of users. This post outlines the objectives behind CDT’s discussion draft and presents some questions to help kickstart discussion.

Requiring fair data practices, not just notice and choice

Legal regimes and industry self-regulation have long relied on notice-and-choice or user control as a proxy for respecting individuals’ privacy. These frameworks simply require companies to provide notice of their data practices and get some kind of consent—whether implied or express—or provide users with an array of options and settings. This status quo burdens individuals with navigating every notice, data policy, and setting, trying to make informed choices that align with their personal privacy interests. With hundreds of devices, thousands of apps, and as many different policies and settings, no individual has the time or capacity to manage their privacy in this way. Even if we had the time, it’s nearly impossible to project the future risks of harm from each data transaction. Moreover, people can be harmed by data processors with whom they have no direct relationship, making control impossible.

Instead of relying on notice and choice, CDT’s draft prohibits data processing that is presumptively unfair. In describing unfair processing, we focused on practices that are likely unexpected by the average person, hard for consumers to avoid, and/or hard to do with appropriate privacy safeguards. Most of these practices involve repurposing data—using it for purposes other than providing a service that a user has affirmatively requested. The draft also prohibits deceptive practices, such as dark patterns designed to coerce or confuse users into providing their consent.

Another weakness of notice-and-choice models is their inability to address discriminatory uses of data. Commercial data can be used in ways that systematically discriminate against minority or protected classes, whether based on race, age, gender, LGBTQ status, disability, or financial status. Data-driven discrimination can be completely opaque to the affected person and often goes unnoticed even by the discriminating party. This problem is vast and demands multiple legal and policy approaches. CDT’s draft attempts to address discriminatory ad targeting by giving the Federal Trade Commission (FTC) the authority to make rules prohibiting unfair targeted advertising practices, without having to go through the burdensome process currently required for FTC rulemaking.

Affirmative obligations to protect data

Entities that collect, use, and share data have a responsibility to safeguard it and prevent misuse. CDT’s discussion draft would require covered entities to adopt reasonable data security practices and engage in reasonable oversight of third parties with whom they share personal information. These obligations recognize the reality that participating in modern society often means ceding control of one’s personal information. The entities we trust with our data should handle it with care.

The draft also requires covered entities to publish detailed disclosures of their data practices in a standardized, machine-readable format that can be scrutinized by regulators and advocates. Some have argued, understandably, that privacy policies should be shorter and easier for users to understand. However, simplifying privacy policies can unintentionally double down on the idea of privacy self-management while allowing companies to hide the details of their data processing. Our draft prioritizes detail and standardization over simplicity so that regulators, consumer advocates, and privacy researchers can effectively scrutinize covered entities on behalf of consumers.
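
To illustrate what “standardized, machine-readable” could mean in practice, here is a hypothetical sketch; the schema and field names are invented for illustration and are not taken from CDT’s draft:

```python
import json

# A hypothetical machine-readable privacy disclosure. The field names
# here are illustrative only -- CDT's draft does not prescribe this
# schema -- but they show how structured, standardized disclosures
# could be diffed, aggregated, and audited at scale by regulators and
# researchers in a way free-form privacy policies cannot.
disclosure = {
    "entity": "ExampleCo",
    "data_collected": [
        {"category": "precise_location", "purpose": "service_delivery",
         "retention_days": 30, "shared_with_third_parties": False},
        {"category": "email_address", "purpose": "account_management",
         "retention_days": None,  # retained for the life of the account
         "shared_with_third_parties": True},
    ],
}

print(json.dumps(disclosure, indent=2))
```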

Individual rights to access, correct, delete, and port data

The rights to access, correct, and delete personal information are basic requirements of any federal privacy law; many companies already provide these rights under the EU General Data Protection Regulation. However, individual rights are insufficient on their own to protect privacy. CDT’s draft provides broad individual rights, with tailored exceptions to account for technical feasibility, legitimate needs such as fraud detection and public interest research, and free expression rights. The individual rights apply not only to information directly disclosed to a covered entity, but also to information inferred by the covered entity, since inferences can often be more sensitive and opaque to users (e.g., inferring a medical condition from someone’s non-medical purchase history).

Strong enforcement and meaningful penalties

Perhaps the biggest shortfall of existing consumer privacy protections is the lack of a strong enforcement mechanism and significant fining authority. As CDT has written, the FTC today is stuck with a broken consent decree model, accomplishing all of its privacy enforcement through negotiated settlements, which usually amount to a slap on the wrist. The agency cannot fine a company for a first-time violation of our federal prohibition against deceptive or unfair trade practices; it can only levy a fine after a company has violated its own consent decree. Our draft includes the authority to levy fines that we think are fair but meaningful for a first-time violation of the law.

State legislatures and attorneys general have played an important role in protecting privacy and data security in the absence of federal action, and state AGs should continue to have enforcement authority under a federal law. However, a strong federal privacy law should also provide some regulatory and compliance certainty for covered entities. We have attempted to draft a carefully scoped state preemption provision that would provide that certainty without preempting other protections such as state civil rights laws, criminal laws, and privacy torts. Getting this language right is critical and will be one of the most difficult challenges legislators face.

What’s next?

We have taken our best shot at addressing the numerous difficult challenges that drafting privacy legislation raises. Here are some of the hardest policy and drafting questions we faced that will warrant thoughtful discussion in the new Congress.

Individual Rights: Our bill includes individual rights that in many ways mirror new EU and California laws. We felt strongly, though, that the rights and exceptions must be clear and definitive for the benefit of consumers and covered entities alike. Are these rights fair, clear, and operational?

Portability: What is the proper role of data portability rules in federal privacy legislation, and how can legislative language reflect existing technical reality?

Tech and Business Model Neutrality: Our bill applies across all unregulated sectors and sweeps in many different types of services, companies, and business models. How successful was this approach at avoiding unintended consequences?

Civil Rights: Our draft addresses unfair targeted advertising practices, particularly those that discriminate based on a protected or vulnerable class, through an FTC rulemaking. Are there additional measures we should be considering to address discriminatory data processing practices?

Free Expression: Have we appropriately tailored the privacy protections, including the individual rights (such as the right to delete personal information), to minimize the burden on First Amendment protected activities, such as publishing lawfully obtained information?

Collection, Use, and Sharing Limits: Our bill relies heavily on purpose limitations to protect the most sensitive uses of data. Are there additional use cases that should be presumptively unfair? Are our definitions for health information and other categories accurate?

Preemption: We have attempted to draft a provision that preempts state laws addressing the types of commercial data processing addressed by this law, but preserves state-level requirements that may involve data processing but are outside the scope of this bill. Is this tailoring appropriate? Have we missed any categories of state laws that should not be preempted?

Disclosures: Nearly everyone agrees that “transparency” provides limited privacy protection on its own but is a necessary component of privacy legislation. Have we struck a balance between providing individuals with meaningful information and providing regulators and advocates with more detailed information about corporate privacy practices?

Want to see more of our work towards comprehensive federal privacy legislation?

Schools collect a lot of data about students. Of course data can be a valuable tool for improving student outcomes, for instance, by identifying students who are at risk of dropping out, allowing teachers to intervene early on. (If you’re curious about how else schools put data to work, check out the Data Quality Campaign’s Data-Rich Year.) However, that same information can pose a substantial risk to students and their families if it’s not managed well, and managing that information is no simple task. For example, take the seemingly straightforward topic of data deletion. Just drag a file to the trash and empty the trash, right? Unfortunately, deleting data is much more complicated, with a number of important policy, legal, and technical considerations.

You don’t have to look too hard to find instances where student data was exposed because information that should have been deleted was not. For example, in New Orleans a school district sold laptops containing the names, addresses, and Social Security numbers of 210 students, putting them at risk of identity theft or phishing attacks. And the danger doesn’t just apply to electronic records. In Tulsa, a woman discovered student records, including both academic and personal information, alongside other discarded school supplies.

One way educational institutions can minimize risk for students and families is to cut down on how much data they collect and store (an approach often called “data minimization”), and keep only what they really need. After all, if the data doesn’t exist, it can’t be misused or breached. And there are other benefits to minimizing data: large data sets are expensive to maintain and search, and as data sets grow, updating and searching them becomes more complicated, making it harder to separate the signal from the noise.

Unfortunately, in the education context, minimizing data is a complicated task. Institutions have very good reasons to retain some types of data for a very long time. For example, students who want to pursue higher education need their high-school transcripts, whether they graduated one year ago or 50. Schools have to find a way to balance retaining and deleting data so they can provide students with the services they need, but limit the risk as much as possible.

Further complicating the difficult balance between data minimization and data retention is a convoluted legal landscape. Institutions have to comply with a thicket of both federal and state laws that don’t always fit together smoothly. At the federal level, the Family Educational Rights and Privacy Act (FERPA), the Children’s Online Privacy Protection Act (COPPA), and the Individuals with Disabilities Education Act (IDEA) all govern handling students’ data. There are state laws to consider as well, which makes the picture even more chaotic. California, for example, has at least nine laws that impact how institutions must manage student data, both education-specific and more general consumer and child privacy laws. As with the federal laws, it’s not always obvious how the laws interact with one another, or what to do if they seem to conflict.

Let’s suppose a school does manage to navigate these waters, and settles on a solid retention/deletion plan. Well, there’s still one more wrench to throw in the works: deleting data is more technically complex than it might seem. Deleting a file isn’t as simple as dragging to the trash or recycle bin and then emptying the bin. That approach actually leaves the data fairly susceptible to recovery; that is, simple forms of deletion can often be simply reversed. More care must be taken. There are other approaches (like overwriting data or destroying the storage media itself) that provide better security, but these are more expensive or time consuming. In addition to determining when to delete what data, educational institutions also have to consider how technically to delete that data.
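
As a rough sketch of why the “how” matters, the Python function below overwrites a file’s bytes before unlinking it, which defeats simple undelete tools. It is illustrative only, not a certified sanitization procedure; as the comments note, it can be undermined by SSD wear leveling and copy-on-write filesystems, where stronger measures like full-disk encryption or media destruction apply.

```python
import os

# Overwrite-before-delete for a single file. Emptying the trash
# typically removes only the filesystem's reference to the data; the
# bytes remain on disk until something overwrites them. Writing random
# data over the file first defeats simple recovery tools. Caveat: on
# SSDs and journaling or copy-on-write filesystems, the new bytes may
# land elsewhere on the device, so full-disk encryption or physical
# destruction is the stronger guarantee for truly sensitive records.
def overwrite_and_delete(path: str, passes: int = 3) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to the device
    os.remove(path)
```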

But have no fear! CDT is developing recommendations for practitioners, and the companies that work with them, about how to balance data retention and deletion that maximize the value of data and technology while protecting students’ privacy. Stay tuned for more materials in early 2019. In the meantime, try to remember to forget.

Terrorist content online: Parliament must take time to address the issues Member States did not

When the European Commission published its proposal in September, we called out several problems: (1) the lack of evidence to justify its necessity in light of the multiple counter-terrorism initiatives already taken, notably Directive 2017/541 on Combating Terrorism, and (2) its incompatibility with fundamental rights standards. And we are not alone. CDT and about 30 digital and human rights organisations and experts raised these concerns in an open letter addressed to the JHA Ministers ahead of the meeting. Academic research from Dr. Aleksandra Kuczerawy also concludes that the proposal “poses a serious risk to fundamental rights protected by the EU Charter, in particular the right to freedom of expression and access to information”. Member States unfortunately did not heed these warnings. While the Council’s text attempts to address some of our concerns, more work is required. The European Parliament will need to solve these problems.

Further clarification of “terrorist content” definition is required

A major problem in the Commission’s proposal is the vague and unclear definition of “terrorist content” and the risk that it leads to repression of legitimate speech. The Council text attempts to align the definition more closely with the definition of “terrorist offences” under Directive 2017/541 on combating terrorism. It refers to “material which may contribute to the commission of the intentional acts, as listed in Article 3(1)(a) to (i) of Directive 2017/541”. Introducing an element of intent in the assessment of whether a piece of content is considered as “terrorist” may help limit the impact on free expression. However, it must be noted that the language in the Directive itself is already broad and criminalises “glorification”, a vague term that captures a wide range of expressions.

Making reference to “fundamental rights” does not imply their protection

The Council’s text makes more references to the importance of freedom of expression and information, as well as the freedom of the press and plurality of the media (recitals 7 and 9), as elements to safeguard when adopting measures in this area. Dr. Aleksandra Kuczerawy notes that “[t]he change aims to take into account the journalistic standards established by press and media regulation, but it is doubtful that this change will actually facilitate the assessment process” when deciding on content removals. Moreover, given the limited ability of automated tools to analyse context, combined with the broad definition of “terrorist content”, the risk of erroneous removals of legal content persists. This is not a theoretical risk. Human rights groups like WITNESS, whose mission is to document human rights abuses, testify that online videos and other material providing crucially important evidence are routinely blocked and deleted by automated systems and reviewers.

Scope of services covered should be narrowed considerably

The Commission’s proposal includes in its scope a wide range of services, from global social media companies, trading platforms, and cloud and storage services to newspaper sites and privately run blogs with comment sections. The Commission reasons that what is deemed “terrorist content”, having been purged from mainstream platforms, is moving to smaller ones. In our view, the focus should be on those services demonstrably and systematically used to disseminate terrorist content with the intent of inciting violence. The definition of “hosting service provider” should be amended to focus on services that are designed and used for broad dissemination of user-uploaded content to the general public. The Council’s text may be an attempt to narrow the scope, in that Recital 10 excludes “other services provided in other layers of the internet infrastructure” and adds that “only those services for which the content provider is the direct recipient are in scope”. However, these changes have been made only in the recital, and they seem confusing. The scope should explicitly exclude cloud and storage services as well as infrastructure providers.

No conditions set for the designation of Competent Authorities

The Council’s text does not set conditions for the designation of Competent Authorities, nor does it specify how many a Member State can designate. There is no requirement that the decisions they take be subject to judicial review. This is unsatisfactory, particularly given the rule of law issues in certain Member States and the fact that Competent Authorities would be given broad powers to impose proactive measures on service providers. The Council clarifies that these measures should be taken “depending on the risk and level of exposure to terrorist content”. However, it also adds that it will be left up to the competent authority to “decide on the nature and the scope of the proactive measures”. Any authority with the power to impose proactive measures, including the use of automated tools, would need to be fully aware of the technical limitations of available technology. The proposal still provides that safeguards should consist of “human oversight and verifications where appropriate”. However, human oversight does not always address the issue of erroneous removals, particularly when content is taken down under a platform’s own terms of service.

Compromise text remains in conflict with e-Commerce Directive

The Council’s position still calls for proactive measures to “effectively address the reappearance of content which has previously been removed”. While this may be slightly softer wording than the Commission’s (“preventing the re-upload of content”), a general monitoring obligation may still be inferred, particularly when read together with the explicit derogation from the E-Commerce Directive in recital 19, which the Council has left untouched. Dr. Aleksandra Kuczerawy highlights that the E-Commerce Directive “does not foresee any exemptions to the prohibition in Article 15 […] the proposed measures therefore, contradict the EU acquis”.

All in all, it is clear that speed has trumped thorough review in the Council proceedings. The Commission insists that the Regulation must pass before the 2019 Parliament elections, but, as noted, it does not provide convincing evidence to support this urgency. We urge Members of the European Parliament to take the necessary time to consult extensively with business and civil society stakeholders, the Parliament’s internal policy services, free expression experts, and press and media organisations. CDT will offer constructive and meaningful input for Parliament’s work.

It’s that time of the year again. The holiday season is almost upon us, and with it come holiday parties, travel plans, and most importantly, gifts. There’s a good chance that some (if not most) of your gifts will be for kids and might even include the latest tech gadget. From new game consoles to network-connected toys, these gadgets collect, use, and may ultimately share a tremendous amount of information about their users, the surrounding environment, and even mom and dad.

As you start your holiday shopping, here are some things to know to better protect your children’s privacy.

What should I, as a consumer, be aware of?

Whether you are planning on buying a smart toy, a smart speaker, a smartwatch, or anything that is connected to the internet (sigh! Whatever happened to kids just wanting ponies and puppies and bikes?), get ready to roll up your sleeves and do some good old-fashioned research before you trot off to the store. First, before buying any product, here are some questions you should ask yourself:

Does the product collect any information from children?

How does it collect the information?

What do they do with the information that is collected? (And what type of security protections does the product have?)

It’s a good idea to read through the privacy policy to get a better understanding of the product and its data practices. As a parent, it is also important to educate yourself on the children’s privacy law, COPPA, and your rights under it.

Wait, what is COPPA?

Well, I’m glad you asked. The Children’s Online Privacy Protection Act (COPPA), passed in 1998, is the primary federal law in the United States protecting the digital privacy of children under the age of 13. Under COPPA, companies are required to get verifiable parental consent before collecting any personally identifiable information from children under 13, and the law gives parents the right to access this information and even have the company delete it, if they so wish. (Check here for the updates the FTC made to the definition of ‘personal information’ under COPPA in 2013.)

Who enforces COPPA?

The Federal Trade Commission (FTC) is the federal body in charge of enforcing COPPA. To date, the FTC has brought about 30 cases against companies for violating COPPA, with penalties totaling several million dollars.

While COPPA has been very important in setting privacy guidelines for companies offering products or services for children below 13, there are certain areas where some improvement is needed. For example, state attorneys general in New Mexico and New York have highlighted apparently flagrant violations of the law. It is clear more must be done, and for a law having just celebrated its twentieth birthday, reform and reconsideration seems in order. CDT recently filed comments to the FTC on steps they can take to be more effective in enforcing COPPA and protecting children’s privacy. The steps include:

Proactively investigating companies as opposed to being reactive to public data breaches;

Working with the U.S. Department of Education to come up with guidance for businesses on privacy best practices. This is especially important with the increasing use of personal devices by children in the classroom context.

If you are looking for some light reading while travelling this holiday season, you will find our detailed comments to the FTC here.

So this holiday season, consider the children in your life and how to protect their privacy rights, and we will continue to do the same!

CDT’s Tech Talk is a podcast where we dish on tech and Internet policy, while also explaining what these policies mean to our daily lives. You can find Tech Talk on SoundCloud, iTunes, and Google Play, as well as Stitcher and TuneIn.

Talking about technical standards might make some people’s eyes glaze over, but in telecommunications, if there aren’t consistent standards, our cell phones likely won’t work in all places or on all networks. Of course, there is actual, real technology layered on those standards – chips, microprocessors, circuit boards, and lots of other stuff I could never make work. A lot of these necessary components are patented. Can you sense the growing tension already?

Charles Duan, the Director of Technology and Innovation Policy at the R Street Institute, was my first guest on this episode and he helps us all understand how standards and patents can impact competition and consumers.

The Center for Democracy & Technology is excited to announce the launch of VotingWorks. In case you missed the announcement on Election Day and subsequent tech press coverage, VotingWorks aims to shake up the voting equipment market by creating a new non-profit voting systems manufacturer with the mission of being the public works for voting systems. VotingWorks will do this by developing voting equipment that 1) embodies the state of the art in usability, security, design, and development; 2) is affordable, to maximize the benefit to election jurisdictions of all sizes; 3) allows speedy, efficient voting processes; and 4) is extensible to the needs of all types of localities. And all of this will be developed in the open, for the public good.

The need here is very real. Election officials often find themselves stuck between a rock and a hard place when choosing a new voting system; there are often only a few, expensive choices, each with serious limitations in how these systems can be used, modified, improved, and studied. CDT has advised localities in procurement decisions in the past and contributed to efforts where jurisdictions are designing their own voting systems – such as the Los Angeles County VSAP project – and the common factor in all these cases is the wide variety of needs and requirements that elections present, and how few systems can meet them all.

CDT will serve as a home for VotingWorks until it becomes its own non-profit entity. This partnership means VotingWorks is working closely with CDT’s experienced team to rapidly ramp up operations and begin in earnest the development of affordable, secure, open-source voting machines for use in US public elections.

CDT and VotingWorks are ideal partners: CDT has been instrumental in many critical technology policy debates for almost 25 years. We’ve elevated the discussion around privacy and personal data control, security and surveillance, free speech online, and, of course, election security. CDT has been a particularly tireless advocate for strengthening election security, through efforts such as helping election officials across the country train their staff to defend against new threats. CDT and VotingWorks also share an important conviction: that the foundation of election security is the widespread use of paper ballots and risk-limiting audits.

Ben Adida, VotingWorks founder, is a brilliant applied security thinker, cryptographer, voting security expert, and has experience building and developing products that are easy and intuitive to use without sacrificing security or privacy. Ben is also a champion of open source and other generous licensing regimes – such as Creative Commons, where he’s a member of the Board of Directors. It’s this open approach that we all hope might translate into a common secure technology base for election operations, similar to how the open source Linux operating system now powers much of the computing infrastructure around us. We’re humbled and proud to be working together.

Furthering this collaboration, I will be joining VotingWorks’ board of advisors. Ben and I have been looking for an opportunity to work more closely together for 15 years, and we are excited to have the opportunity of a lifetime to work on a topic near and dear to both our hearts.

In Washington, D.C., a day hardly goes by when I don’t come upon multiple scooters parked on street corners, near park benches, or outside my apartment building. Lime, Bird, Spin, Skip, JUMP, and Lyft all have “dockless mobility” operations in the capital. These services generate a tremendous amount of data that could potentially improve transportation infrastructure – early evidence suggests they are already offering new transportation options to underserved communities in Washington – and cities like Detroit and Los Angeles are racing to create new data standards to collect and analyze mobility data.

These efforts raise important privacy and security concerns that deserve further consideration as cities across the country launch dockless mobility pilot programs. Next door to D.C., for example, Alexandria and Arlington, Virginia, have started their own pilots. These programs are attempting to find answers to new liability issues, ensure scooters are made available equitably, and set expectations about the scale and timeliness of data being provided to local transportation authorities. The Los Angeles Department of Transportation (LADOT) is currently undertaking its own pilot program, and the Department’s program highlights some of the relevant privacy and security issues involved.

LADOT is asking for ongoing, real-time access to trip data for scooters. While the city has suggested it is “respectful of user privacy” because its data standard asks “for no personally identifiable information about users directly,” this sort of trip data by itself is highly revealing. As Justice Sotomayor has acknowledged, tracing people’s movements reveals information that is “indisputably private in nature,” including their intimate relationships and visits to health care providers such as abortion clinics or HIV treatment centers. Monitoring location data also reveals First Amendment-protected activities such as religious and political affiliation. In the wrong hands, this information can be used to stalk or harass riders, compromising their physical safety. Ride-sharing APIs have been abused for things like spying on ex-partners, and a 2016 Associated Press study found that law enforcement officers across the country abused police databases to stalk romantic partners, journalists, and business associates. The risk of harm from exposing this information is particularly high for survivors of gender-based assault and hate-motivated violence.

We also should acknowledge that scooter riders are likely to rely on their scooters for first- or last-mile transportation, riding directly from their homes to their final destinations. This is different from car trips in cabs or Ubers, which often begin or end some distance away from a user’s final destination. This type of data collection raises the specter of surveillance and warrants public discussion about what information must be made available to government officials and at what scale.

Building on our earlier work on government data demands, we’ve called on transportation authorities to adopt clear and robust privacy and security safeguards. These policies should build on longstanding Fair Information Practices, include appropriate access controls, and address the availability of mobility data to researchers. Specifically, we recommend that LADOT (1) limit access to and use of mobility data to clearly specified purposes, (2) establish a reasonable retention and deletion policy, (3) clarify how this data will be secured or obfuscated to protect against breaches and minimize the likelihood of disclosure of identifiable data, and (4) better communicate these policies and information to riders and the public.

We believe that these pilot programs give transportation officials an opportunity to assess how they can achieve legitimate aims while minimizing the amount and granularity of data being collected. Cities must also take careful stock of the types and sensitivity of data they are asking for, determine whether each data type is necessary for enforcement, and consider how information can be obscured to reduce privacy risks, including the granularity of the location information they actually need.
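
One illustrative mitigation, sketched below in Python, is to coarsen coordinates before they are shared with a transportation authority. The precision chosen here is an assumption for illustration, not a recommendation from LADOT or any agency.

```python
# Coarsen GPS coordinates before sharing them. Rounding latitude and
# longitude to two decimal places reduces precision to roughly a
# kilometre: often enough for planning parking or bike-lane
# infrastructure, but no longer enough to identify a specific home or
# clinic. The decimal count is illustrative, not an agency standard.
def coarsen(lat: float, lon: float, decimals: int = 2) -> tuple[float, float]:
    return (round(lat, decimals), round(lon, decimals))

# Example: a trip end near a specific building becomes a ~1 km cell.
print(coarsen(38.916340, -77.031959))  # -> (38.92, -77.03)
```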

For cities to exercise true leadership in dockless mobility, they must establish policies and procedures that can be followed by cities with fewer resources and less technical capacity or expertise. We hope LADOT will take on this challenge, and we look forward to seeing how dockless mobility programs roll out across the country.