On the face of it, the EARN IT Act of 2020 appears to be a bill to protect against online exploitation of children by creating a National Commission for that purpose. However, the Electronic Frontier Foundation (EFF) has called for its immediate rejection, calling it "anti-speech, anti-security, and anti-innovation".

On the other hand, Stewart Baker, previously the general counsel for the National Security Agency (NSA), has written that the characterisation of the bill as "an all-out assault on end-to-end encryption" is unrealistic, and that it is instead designed to "encourage companies to choose designs that minimize the harm that encryption can cause to exploited kids".

Given the mixed commentary on this bill, as well as the fairly technical details within it, what are the objective implications of this act, and what are the arguments for & against?

A metaphor: Many companies deliver locked boxes and leave them unattended for a while before they're picked up. They will now need to ensure nothing illegal is in the boxes. This would require either having a duplicate of each of the keys or having a way to open their boxes without keys. Either of those can be stolen or discovered and easily used by criminals or misused by anyone at the company with access. And what happens when someone puts their own locked box inside the company's locked box? And what if they just ask someone else to deliver it, who doesn't require a box they can open?
– NotThatGuy Mar 18 at 0:47

@NotThatGuy Mass surveillance will make sure there is nobody else who will deliver it. At the very least, alternative deliverers will be breaking the law, with all the same consequences as drug dealers (for example) have.
– user253751 Mar 18 at 18:49

6 Answers

Background

Section 230 of the Communications Decency Act was not part of the original Senate legislation, but was added in conference with the House, where it had been separately introduced by Representatives Christopher Cox (R-CA) and Ron Wyden (D-OR) as the Internet Freedom and Family Empowerment Act and passed by a near-unanimous vote on the floor. It added protection for online service providers and users from actions against them based on the content of third parties, stating in part that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Effectively, this section immunizes both ISPs and Internet users from liability for torts committed by others using their website or online forum, even if the provider fails to take action after receiving actual notice of the harmful or offensive content.

The TL;DR here is that Stack Exchange (i.e. this website) cannot be sued for content you or I put on its platform. Stack Exchange is under no obligation to remove the content either, unless we violate someone's copyright.

Arguments For

“We are concerned that internet services, under the guise of Section 230, can not only block access to law enforcement — even when officials have secured a court-authorized warrant — but also prevent victims from civil recovery,” Barr argued. “ . . . Giving broad immunity to platforms that purposefully blind themselves – and law enforcers – to illegal conduct on their services does not create incentives to make the online world safer for children. In fact, it may do just the opposite.”

A January 2017 Senate report accused Backpage of facilitating online sex trafficking by stripping words like “lolita,” “little girl,” and “amber alert” from ads in order to hide illegal activity before publishing the ad, as well as coaching customers on how to post “clean” ads for illegal transactions. Judges in California and Massachusetts previously cited Section 230 in dismissing cases against Backpage.

The U.S. Justice Department doesn’t want Facebook to encrypt messages on WhatsApp and its other messaging services without giving law enforcement “backdoor” access to such conversations. The goal is to stop child pornographers, terrorists and foreign adversaries looking to disrupt U.S. institutions.

Arguments Against

The problem with EARN IT is that it would create a commission to propose policy going forward. Companies that failed to comply with that policy would no longer be protected by Section 230. Since the law itself proposes no specific policy, it is impossible to predict what this commission will suggest. More importantly, the EFF notes:

If the Attorney General disagrees with the Commission’s recommendations, he can override them and write his own instead. This bill simply gives too much power to the Department of Justice, which, as a law enforcement agency, is a particularly bad choice to dictate Internet policy.

The demand for a "golden key" for government access to encrypted data, then, isn't so much about necessity as it is expense and convenience. The problem is that no matter how clever such a skeleton key system might be, it is exceptionally fragile and bound to be misused, exploited by an adversary, or both. Reform Government Surveillance—a coalition formed by Google, Apple, Microsoft, Dropbox, and other cloud platform operators—issued a statement last May warning about the consequences of such efforts:

"Recent reports have described new proposals to engineer vulnerabilities into devices and services, but they appear to suffer from the same technical and design concerns that security researchers have identified for years," the alliance wrote. "Weakening the security and privacy that encryption helps provide is not the answer."

[Police] generally acknowledge that commercial sex—yes, sometimes involving minors and/or victims of abuse—will go on with or without digital tools to facilitate it. Shutting down Backpage didn't even make a dent in the volume of online adult ads, according to a Washington Post analysis. It simply dispersed them through a wider range of platforms. Yet politicians insist on casting classifieds websites as the biggest cause and a main hub of forced and underage prostitution. Sen. Kamala Harris (D–Calif.) has described Backpage as the world's "top online brothel."

It's fair to say that the commission might use its power for political aims rather than trying to protect children or stop sex trafficking.

+1 One could add that arguments for government-mandated or government-facilitated backdoors on encryption are nothing new, going back to the Clipper chip and probably well before they were open about it, e.g. Crypto AG.
– Fizz Mar 18 at 5:30


Hang on a minute, "you removed references to illegal activity, therefore you must be concealing illegal activity"? Seems like a catch-22 for Backpage there.
– user253751 Mar 18 at 14:08


@user253751, ...insofar as the Right Thing would be to turn away such customers outright, rather than encourage someone you know would, if not given contrary guidance, describe their business in terms implying illegality. That doesn't strike me as a catch-22 at all.
– Charles Duffy Mar 20 at 13:43

@CharlesDuffy What would you do though? If you tell the person what they did wrong, they'll stop doing exactly that, so they don't trip your filters (and they'll keep on doing the illegal thing). If you ban their account they'll just make a new account. If you call law enforcement every time, you'll probably just catch jokers and you might not have enough information to track the person down anyway (they'll start using public wifi hotspots after you track the first one by their IP address)...
– user253751 Mar 20 at 13:53


The good-faith effort to comply (re: the above "call law enforcement every time") would have kept Backpage out of trouble. And you're assuming that criminals are universally competent. That's... not reliably true. Moreover, doing legwork (say, to figure out who was in range of hotspot X at time Y and compare against folks who've been believed to be involved in sex trafficking based on other evidence) is part of a police department's job.
– Charles Duffy Mar 20 at 13:55

The proposed argument for this bill is that it will help cut down on crimes, specifically child sexual abuse. Presently, end-to-end encryption provides a near-impenetrable means for anyone (including bad actors) to send data over the internet without fearing that someone else will get ahold of it. The bill is chiefly concerned with ensuring that large sites offering encryption (such as WhatsApp, Facebook, e-mail services, etc.) take steps to ensure that they are not being used to transmit child sexual abuse material.

In order to understand the argument against this bill, I first need to explain something called "asymmetric encryption." This is not a comprehensive guide to how encryption and security work; rather, it is meant to be a quick and simple way of understanding the idea.

Let's say I want to send you a message, but I know that a third party is always listening and wants to use that message for nefarious purposes. It makes sense that I would send you the message in a code that third party doesn't know. This is easy if we can meet beforehand and agree upon a code; it is difficult if we can only discuss the code while someone is listening in. The answer to this problem is a clever scheme called "asymmetric encryption." The short version is this: each of the two communicating parties has a pair of digital "keys," one public and one private. The sender encrypts a message with the recipient's public key, and only the recipient's private key can decrypt it. If someone wanted to guess a private key without knowing it, it would take an extremely long time. Like, longer-than-the-heat-death-of-the-universe long. The encryption and decryption happen ONLY on the local devices. Your ISP, anyone monitoring your network traffic, even whoever wrote the chat app you're using -- none of them has a means of brute-forcing their way into your conversation before everyone involved is long dead.
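The key-pair idea can be made concrete with a toy RSA example (the oldest and best-known asymmetric scheme). Everything here -- the tiny primes, the raw modular exponentiation -- is deliberately simplified for illustration; real deployments use keys of 2048+ bits and padding schemes, and nobody should roll their own crypto.

```python
# Toy RSA sketch (illustrative only -- NOT secure at these key sizes).
p, q = 61, 53               # two secret primes
n = p * q                   # modulus; (e, n) is the public key
phi = (p - 1) * (q - 1)
e = 17                      # public exponent, coprime with phi
d = pow(e, -1, phi)         # private exponent (Python 3.8+ modular inverse)

message = 42                      # a message encoded as a number < n
ciphertext = pow(message, e, n)   # anyone can encrypt with the public key
recovered = pow(ciphertext, d, n) # only the private key d can decrypt

assert recovered == message
```

Note that `d` never leaves the recipient's device; the server relaying `ciphertext` learns nothing useful from it.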

Typically, if a bad actor wishes to access your data, they will instead try to install some sort of malware on your device to steal your private key. Then the bad actor can read your messages or even impersonate you. This is an issue, but thankfully it's localized: if Bob Smith chooses to install NotAVirus.exe, my data is still secure (except for any conversations I had with Bob).

Now, let's get back to the bill. Presently, it doesn't mention any of this stuff. It has a section stating that a commission (which, per the bill, should be up to date on cybersecurity) will come up with "best practices" guidelines for websites to ensure that their platforms are not hosting child abuse material. Doesn't seem that bad, right? Except, here's the thing -- the bill also wants large tech companies to start scanning through the data they're receiving to ensure that there's no child sexual abuse material in it. Thing is, as I just explained, with end-to-end encryption that's not possible. The provider has no clue what the heck you just sent; so far as it is aware, a picture of a rainbow and a picture of a crime are both just randomly assorted data. The only way to know what was in that file would be to decrypt it, and the only way to decrypt it would be to break end-to-end encryption somehow -- that is, to abandon the technology that has worked for so long.
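To see why scanning fails, consider a sketch of how server-side detection typically works: the provider hashes each upload and compares it against a database of hashes of known abuse imagery. (Real systems like Microsoft's PhotoDNA use perceptual hashes; plain SHA-256 and placeholder byte strings stand in for the idea here.) Once the content is end-to-end encrypted, the scanner has nothing to match:

```python
import hashlib
from secrets import token_bytes

# Hypothetical server-side scanner: flag uploads whose hash appears in a
# database of known-bad content hashes.
known_bad_hashes = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def server_scan(upload: bytes) -> bool:
    return hashlib.sha256(upload).hexdigest() in known_bad_hashes

# Without end-to-end encryption, the server sees plaintext and the match fires.
plaintext = b"known-bad-image-bytes"
assert server_scan(plaintext) is True

# With end-to-end encryption, the server only ever sees ciphertext.
# A one-time-pad XOR stands in for a real cipher here.
key = token_bytes(len(plaintext))
ciphertext = bytes(a ^ b for a, b in zip(plaintext, key))
assert server_scan(ciphertext) is False  # same content, but the scanner is blind
```

The ciphertext is indistinguishable from random bytes, so no hash list (or any other content filter) running on the server can recognize what it contains.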

Many are arguing that this would disrupt transactions such as banking (which was one of the first uses of public-key encryption). Those arguments are not really relevant; if you're sending a transaction to a company, it is already decrypting your data. And if the transaction can only carry things like routing numbers, it's going to be pretty darned difficult for Congress to justify disrupting banking with the claim that someone is sending child abuse material over it. Sites such as Amazon, PayPal, your bank, etc. would be utterly unaffected by this.

This would primarily affect chat applications such as WhatsApp, Facebook Messenger, WeChat (assuming you wanted to use it in the States), etc. There are still implications to this.

Chiefly-- if we install a back door for the developers/government to review data, that opens up a means for a bad actor to swipe your data. Traditionally, if they wished to steal it, they had to plant malware on your device or the other person's to steal your keys. You had to be tricked into installing it. With this, hackers can attempt to access company servers to steal information. And unfortunately, it's not uncommon for those servers to be hacked.

The other issue at stake is privacy. Edward Snowden revealed to America the extent of government data collection. While this back door would ostensibly be used only for legitimate purposes, many don't believe that. In response to the classic authoritarian claim "if you've done nothing wrong, then you have nothing to fear" many of them reply, "Then why can't I see your data?" or (my favorite) "I'm not doing anything wrong when I use the toilet either, but I'd still rather you not watch." Remember when all of those celebrity nudes leaked? None of them had done anything wrong when they sent those pictures out. How would you feel if your nude pictures were reviewed by a third party? Pictures of your siblings, children, or friends?

The final argument against this: it's easy as hell for someone to make their own end-to-end encryption app. If you google "end-to-end encryption tutorial," you'll get millions of hits. The underlying math is very well understood and has been distributed and improved upon since the 1970s. There are free packages you can download from most code repositories to get ahold of end-to-end encryption. I don't doubt that we'd catch some pedophiles if we cracked into everyone's chats, but I also don't doubt that within a week there would be a new (probably smaller, admittedly) dark web of predators using their own encryption, which they wouldn't even have to understand all that well.
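As a rough illustration of how low that barrier is, here is a (deliberately insecure) toy stream cipher built from nothing but Python's standard library -- the kind of thing anyone could cobble together from a tutorial in an afternoon. It is not safe cryptography, but its output would be just as opaque to a server-side scanner as the real thing:

```python
import hashlib

# Toy "roll your own" cipher, stdlib only. NOT secure -- it only shows how
# low the barrier to entry is. Real apps would use libsodium/PyNaCl etc.

def keystream(shared_secret: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing secret + counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(shared_secret + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(secret: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(secret, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

secret = b"agreed out-of-band"
msg = b"meet at the usual place"
ct = encrypt(secret, msg)
assert decrypt(secret, ct) == msg
```

Anyone determined to hide their traffic can write something like this in twenty lines; a mandate on major platforms would not touch it.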

Keep in mind: this is presently conjecture. If Congress approved the bill as-is and the commission then decided that best practices amount to putting up a text warning reading "don't do anything illegal," then none of these concerns would hold water.

To summarize: The chief arguments for this bill are to help catch and curb persons who are using popular websites for the transmission of child abuse. The chief arguments against this bill are that it calls upon those platforms to review the data coming through their servers, which could be used as a backdoor for bad actors or the government to get ahold of your data without your consent.

Stewart Baker lays this out fairly well, but severely downplays the impact.

The main point is that the EARN IT Act will revoke liability protections given in section 230 of the CDA, so providers will be liable for content transferred via their infrastructure.

The exception to this is if they follow "best practices" which are to be defined by a committee, and which can be unilaterally adjusted by the attorney general (right now that is William Barr, who's strongly opposed to encryption).

While Baker rejects the notion of an "all-out assault on end-to-end encryption", he does acknowledge that "critics aren’t entirely wrong about the implications of EARN IT for encryption" and that providers may be "require[d] to pay for the suffering [encryption] enables". He just thinks that the end justifies the means in this case. He is also under the impression that there is a way to "minimize the harm to children" while keeping strong encryption, but doesn't expand on how that would be possible.

He also argues that companies are free to "persuade a jury that their product design didn’t recklessly allow the spread of child pornography" (because with the act, they would be liable for that). At the same time, he argues that encryption does cause harm.

[T]his bill is a backdoor way to allow the government to ban encryption on commercial services. And even more beautifully: it doesn’t come out and actually ban the use of encryption, it just makes encryption commercially infeasible for major providers to deploy.

This might be a bit redundant to all the other answers, but I think these paragraphs from Matthew Green's analysis capture the essence of the problem as the pro-crypto camp sees it:

Because the Department of Justice has largely failed in its mission to convince the public that tech firms should stop using end-to-end encryption, it’s decided to try a different tack. Instead of demanding that tech firms provide access to messages only in serious criminal circumstances and with a warrant, the DoJ and backers in Congress have decided to leverage concern around the distribution of child pornography, also known as child sexual abuse material, or CSAM. [...]

End-to-end encryption systems make CSAM scanning more challenging: this is because photo scanning systems are essentially a form of mass surveillance — one that’s deployed for a good cause — and end-to-end encryption is explicitly designed to prevent mass surveillance. So photo scanning while also allowing encryption is a fundamentally hard problem, one that providers don’t yet know how to solve.

All of this brings us to EARN IT. The new bill, out of Lindsey Graham’s Judiciary committee, is designed to force providers to either solve the encryption-while-scanning problem, or stop using encryption entirely. And given that we don’t yet know how to solve the problem — and the techniques to do it are basically at the research stage of R&D — it’s likely that “stop using encryption” is really the preferred goal.

As the linked prior article in that quote explains, only some companies are likely to be affected:

For platforms that don’t support E2E [end-to-end] — such as (the default mode) of Facebook Messenger, Dropbox or Instagram — this isn’t an issue. But popular encrypted messaging systems like Apple’s iMessage and WhatsApp are already “dark” to those server-side scanning technologies, and presumably future encrypted systems will be too.

It goes on to discuss the more difficult issue that current CSAM scanning fundamentally relies on secret algorithms to work without being disrupted or subverted by "the little guy", while leaving open the possibility that the government could perform that kind of scanner subversion for any (non-CSAM) reason it might one day decide is desirable.

Thus one has to be (additionally) concerned that the EARN IT law will impose the use of such very specific CSAM scanning methods, decided by the government-dominated commission that the law sets up.

In contrast, Alan Z. Rozenshtein argues that the concerns are overblown because the tech companies might be able to block government initiatives in this committee:

This means that, acting as a bloc, the companies along with the legal and technological experts could block any best practices—although Eric Goldman argues that there’s no guarantee that this group will in fact vote as a bloc.

Additionally Rozenshtein argues that since Congress must ultimately approve these "best practices" (in the newer iteration of the EARN IT draft), the concerns of executive abuse are now much diminished:

Finally, before any recommendation goes into practice, Congress must enact it (Section 4(c)). This is a major change from the previous draft, which provided that the recommendations would enter into force unless Congress affirmatively rejected them. Under the new draft, Congress, not the executive branch, will have the last word on the future of encryption.

Basically, in this latter perspective, the dispute comes down to whether the government (both the executive and legislative branches) will understand the specific technological "best practice" measures well enough not to abuse the EARN IT law by imposing overbearing measures that would subvert its stated intent.

The (aforementioned) blog post by Goldman is a lot more concerned about what will happen if the EARN IT commission doesn't manage to propose anything:

The new commission is called the “National Commission on Online Child Sexual Exploitation Prevention,” but I’ll call it the “Censorship Board.” [...]

Remember that if no recommendations emerge from the Censorship Board, the partial Section 230 repeal still takes effect in four years. This poison pill will motivate recalcitrant Internet companies to “take a deal” and eventually accept odious provisions to avoid even worse outcomes.

"Following the 1988 Lockerbie Bombings, Barr floated the idea of the president convening secret military tribunals to try people accused of involvement with suspected terrorist activities. Barr revived the idea of secret military trials after the 9/11 attacks and testified in support of President George W. Bush’s decision to order them without congressional authorization."

Mr. Barr has demonstrated time and time again his absolute authoritarianism and his contempt for civil liberties, privacy, due process, and anything else most Americans seem to value.

If you do a bit of reading about Barr's previous and current views on absolute authority, and full unquestioning support of prosecutions over and above all else, and really believe that he is the best person to be given full control over how people exchange their data, pictures, opinions, stories, and other free expressions, and how all of that data can or should be used against people, then by all means, support it.

Supposing the legislation passes, the implications are rather clear: the Attorney General will have broad power to define what companies are allowed and are forced to do with user content. Companies may not be allowed to host data which they cannot decipher, leaving average users defenseless.

Furthermore, this is not mixed commentary at all: NSA officials have consistently lied to the public, so what they say has little value, if any.