Cory Doctorow: Thank you very much. Hi. So, there’s a little formality first. As a member in good standing of the Order of After-Dinner and Conference Speakers of England and Wales, I am required as the last speaker before lunch to make a joke about being the last speaker before lunch. This is that joke. Thank you.

So, I work for the Electronic Frontier Foundation. And I’ve met some of you in the halls here, and when I mention I work with EFF they say, “Oh, you guys have been around for a long time.” And it’s true. Not just Internet time. Not “this is Zcash’s second birthday so we’re the doddering old men of cryptocurrency” long time. We’ve been around for a quarter century. A legitimately long time. And I want to talk about our origin story, about the key victories that we scored really early on, a quarter century ago, that really are the reason you folks are in this room today.

I want to talk about the Crypto Wars. Not those crypto wars.

These Crypto Wars.

So, back in the 1990s, the NSA classed cryptography as a munition and imposed strict limits on civilian access to strong crypto. And there were people, as you heard Primavera speak about, who called themselves cypherpunks, crypto-anarchists, who said that this was bad policy; it was a governmental overreach and it needed to be changed. And they tried a whole bunch of different tactics to try and convince the government that this policy was not good policy.

So, they talked about how it was ineffective, right. They said: you can ban civilian access to strong cryptography and only allow access to weak crypto, like 56-bit DES, and that will not be sufficient to protect people. They made this as a technical argument. They said, “Look, we believe that you could brute-force DES with consumer equipment.” And the court said, “Well, who are we gonna believe, you or the NSA? Because the NSA, they hire all the PhD mathematicians that graduate from the Big 10 schools, and they tell us that 56-bit DES is good enough for anyone. So why should we believe you?”

And so, we did this. We built this thing called the DES Cracker. It was a quarter-million-dollar specialized piece of equipment that could sweep the entire key space of DES in a matter of days, right. So we said, “Look. Here’s your technical proof. We can blow through the security that you’re proposing to lock down the entire US financial, political, legal, and personal systems with, for a quarter-million dollars.”
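The economics of that machine are back-of-the-envelope arithmetic. A minimal sketch, where the chip count and per-chip search rate are assumed figures for illustration, not the actual specifications of EFF’s DES Cracker:

```python
# Rough arithmetic for exhausting a 56-bit DES keyspace with custom hardware.
# Chip count and per-chip search rate are assumed figures for illustration,
# not the actual specs of EFF's DES Cracker.

keyspace = 2 ** 56                    # total possible DES keys
chips = 1500                          # assumed number of key-search chips
keys_per_chip_per_sec = 60_000_000    # assumed per-chip search rate

total_rate = chips * keys_per_chip_per_sec          # keys tried per second
worst_case_days = keyspace / total_rate / 86_400    # sweep the whole keyspace
average_days = worst_case_days / 2                  # expected time to find a key

print(f"worst case: {worst_case_days:.1f} days, average: {average_days:.1f} days")
```

The point survives any reasonable choice of constants: a fixed 56-bit keyspace sits within reach of a modest hardware budget, and Moore’s law only shrinks the bill.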

And they said, “Well, maybe that’s true, but we can’t afford to have the criminals go dark, right. They’re gonna hide behind crypto and we won’t be able to spy on them.”

So in the face of all that resistance, we finally came up with a winning argument. We went to court on behalf of a guy named Daniel J. Bernstein. You’ve probably heard of DJB. He’s a cryptographer. He’s a cryptographer whose name is all over the ciphers you use now. But back then DJB was a grad student at the University of California at Berkeley. And he had written a cipher that was stronger than 56-bit DES. And he was posting it to Usenet. And we went to court and we said, “We believe that the First Amendment of the US Constitution, which guarantees the right to free speech, protects DJB’s right to publish source code.” That code is a form of expressive speech as we understand expressive speech in the US Constitutional framework.

And this worked, right. Making technical arguments didn’t work. Making economic arguments didn’t work. Making law enforcement arguments didn’t work. Recourse to the Constitution worked. We won in the district court. We won again on appeal at the Ninth Circuit. And the reason that you folks can use ciphers that are stronger than 56-bit DES, which these days you can break with a Raspberry Pi, the reason you can do that is because we won this case. [applause] Thank you.

So I’m not saying that to suck up to you, right. I’m saying that because it’s an important note in terms of tactical diversity in trying to achieve strategic goals. It turns out that making recourse to the Constitution is a really important tactical arrow to have in your quiver. And it’s not that the Constitution is perfect. And it’s certainly not true that the US always upholds the Constitution, right. All countries fall short of their goals. The goals that the US falls short of are better than the goals that many other countries fall short of. The US still falls short of those goals and the Constitution is not perfect.

And you folks, you might be more comfortable thinking about deploying math and code as your tactic, but I want to talk to you about the full suite of tactics that we use to effect change in the world. And this is a framework that we owe to this guy, Lawrence Lessig. Larry is the founder of Creative Commons and has done a lot of other important work in cyberlaw, and now works on corruption. That’s a connection I’m gonna come back to. And Larry says that there are four forces that regulate our world, four tactical avenues we can pursue.

There’s markets: what’s profitable. Founding businesses that create stakeholders for strong security turned out to be a really important piece of continuing to advance the crypto agenda, because there were people who would show up and argue for more access to crypto not because they believed in the US Constitution but because their shareholders demanded that they do that as part of their ongoing funding.

There’s norms: what’s socially acceptable. Moving crypto from the realm of math and policy into the realm of what makes people good people in the world; convincing them, for example, that allowing sensitive communications to go in the clear is a risk you impose not just on yourself but on the counterparties to your communication. I think we will eventually arrive at a place where sending sensitive data in the clear will be the technical equivalent of inviting people to a party where you close the door and chain-smoke, right. It’s your selfish laziness putting them at risk.

And then, there’s law: what’s legal. (The fourth force, architecture, is code itself: what’s technically possible. That’s the tactic you folks already know.)

Now, the rule of law is absolutely essential to the creation and maintenance of good cipher systems. Because there is no key length, there’s no cipher system, that puts you beyond the reach of law. You can’t audit every motherboard in every server in the cloud that you rely on for a little backdoor chip the size of a grain of rice that’s tapped right into the motherboard control system.

You can’t make all your friends adopt good operational security. This is a bit of the rules used by the deep packet inspection system deployed by the NSA. This was published in a German newspaper after it was leaked to them. The deep packet inspection rules that the NSA was using to decide who would get long-term retention of their communications and who wouldn’t involved looking for people who had ever searched for how to install Tor or Tails or Qubes. So if you had ever figured out how to keep a secret, the NSA then started storing everything you ever sent, in case you ever communicated with someone who wasn’t using crypto and through that conveyed some of the things that were happening inside your black-box conversations. You can’t make everybody you will ever communicate with use good crypto. And so if the state is willing to exercise illegitimate authority, you will eventually be found out by them.
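A retention selector of that kind is mechanically simple. Here is a toy sketch; the regex is a hypothetical stand-in for illustration, not the actual leaked rule:

```python
import re

# Toy model of a deep-packet-inspection retention selector: flag any traffic
# that mentions privacy tools by name. The pattern is a hypothetical stand-in
# for illustration, not the real leaked XKeyscore-style rule.

PRIVACY_TOOL_SELECTOR = re.compile(r"\b(tor|tails|qubes)\b", re.IGNORECASE)

def flag_for_retention(observed_text: str) -> bool:
    """Return True if this traffic would earn the sender long-term retention."""
    return bool(PRIVACY_TOOL_SELECTOR.search(observed_text))

print(flag_for_retention("how do I install Tails from a USB stick"))  # True
print(flag_for_retention("weather forecast for tomorrow"))            # False
```

Note the perverse incentive the talk describes: merely learning about the tools is what triggers the flag.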

You can’t audit the ciphers that every piece of your toolchain uses, including pieces that you don’t control, that are out of your hands and in the hands of third parties. One of the things we learned from the Snowden leaks was that the NSA had sabotaged the random number generator in a NIST standard in order to weaken it so that they could backdoor it and read it. And so long as the rule of law is not being obeyed, so long as you have spy agencies that are unaccountably running around sabotaging crypto standards that we have every reason to believe are otherwise solid and sound, you can never achieve real security. This turns out to be part of a much larger effort, called Bullrun in the US and Edgehill in the UK, that the NSA and GCHQ were jointly running to sabotage the entire crypto toolchain, from hardware to software to standards to random number generators.

Opsec is not going to save you. Because security favors attackers. If you want to be secure from a state, you have to be perfect. You don’t just have to be perfect when you’re writing code and checking it in. You have to be perfect all the time. You have to never make a single mistake. Not when you’re at a conference that you traveled across an ocean to get to, when you’re horribly jet-lagged. Not when your baby has woken you up at three in the morning. Not when you’re a little bit drunk. You have to make zero mistakes.

In order for the state to penetrate your operational security, they have to find one mistake that you’ve made. And they get to cycle in a new shift every eight hours to watch you. They get to have someone spell off the person who’s starting to get screen burn-in on their eyes and has to invert the screen because they can no longer focus on the letters. They just send someone else to sit down at that console and watch you. So your operational security is not going to save you. Over time, the probability that you will make a mistake approaches one.
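That “approaches one” claim is just compounding probability. A minimal sketch, where the 1% daily slip rate is an assumed figure for illustration:

```python
# If each day carries a small independent chance p of one opsec slip,
# the chance of a still-perfect record after n days is (1 - p)**n,
# which decays geometrically toward zero.
# The 1% daily slip rate is an assumed figure for illustration.

p_slip_per_day = 0.01

def p_still_perfect(days: int) -> float:
    """Probability of zero mistakes after `days` days of being watched."""
    return (1 - p_slip_per_day) ** days

for days in (30, 365, 5 * 365):
    print(f"after {days} days: {p_still_perfect(days):.6f}")
```

After a month you are probably still fine; after five years of round-the-clock shifts, the watchers have near-certainty.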

So crypto is not a tool that you can use to build a parallel world of code that immunizes you from an illegitimate, powerful state. Superior technology does not make inferior laws irrelevant.

But technology, and in particular privacy and cryptographic technology, is not useless. Just because your opsec won’t protect you forever doesn’t mean that it won’t protect you for just long enough. Crypto and privacy tools can open a space in which, for a limited time, before you make that first mistake, you can be sheltered from that all-seeing eye. And in that space, you can have discussions that you’re not ready to have in public yet. Not just discussions where you reveal that your employer has been spying on everyone in the world, but all of the discussions that have brought us to where we are today. You know, it’s remarkable to think that within our lifetimes, within living memory, it was illegal in much of the world to be gay. And now in most of those territories gay people can get married. It was illegal to smoke marijuana, and now in the country I’m from, Canada, marijuana is legal, right, in every province of the country. It was illegal to practice so-called interracial marriage. There are people who are the products of those marriages, marriages that were illegal.

So how in our lifetimes did we go from these regimes where these activities were prohibited, to ones in which they are embraced and considered normal? Well, it was because people who had a secret that they weren’t ready to talk about in public yet could have a space that was semi-public. Where they could choose their allies. They could find people who they thought they could trust with a secret. And they could whisper the true nature of their hearts to them. And they could recruit them into an ever-growing alliance of people who would stand up for them and their principles. They could whisper the love that dare not speak its name until they were ready to shout it from the hills.

And that’s how we got here. If we eliminate privacy and cryptography, if we eliminate the ability to have these semi-public conversations, we won’t arrive at a place in which social progress continues anyway. We’ll arrive at a place that will be much like the hundreds of years that preceded the legalization of these activities that are now considered normal. Where people that you love went to their graves with secrets in their hearts that they never confessed to you. Great aches that you had unknowingly contributed to, because you never knew their true selves.

So we need good tech policy, and we’re not getting it. In fact, we’re getting bad technology policy that’s getting worse by the day.

So, you may remember that over the last two years we discovered that hospitals are computers that we put sick people into. And when we take the computers out of the hospitals, they cease to be places where you can treat sick people. And that’s because of an epidemic of ransomware. There’s been a lot of focus on the bad IT policies of the hospitals. And the hospitals had some bad IT policies. You shouldn’t be running Windows XP; there’s no excuse for it, and so on.

But ransomware had been around for a long time, and it hadn’t taken down hospitals all over the world. The way that ransomware ended up taking down hospitals all over the world is that somebody took some off-the-shelf ransomware and married it to a thing called EternalBlue. And EternalBlue was an NSA exploit. They had discovered a vulnerability in Windows, and rather than taking it to Microsoft and saying, “You guys had better patch this because it’s a really bad zero-day,” they had just kept it secret, in their back pocket, against the day that they had an adversary they wanted to use it against.

Except before that could happen, someone leaked their cyberweapon. And then dumdums took the cyberweapon and married it to this old piece of ransomware and started to steal hospitals. Now, why do I call these people dumdums? Because the ransom they were asking for was $300. They didn’t even know that they’d stolen hospitals. They were just opportunistically stealing anything that was connected to an XP box and then asking for $300, in cryptocurrency, to unlock it.

So, this is not good technology policy. The NSA believes in a doctrine called NOBUS: “Nobody But Us” is smart enough to discover this exploit. Now, first of all, we know that’s not true. From the Crypto Wars, we know that the NSA does not have a monopoly on smart mathematicians, right. These were the people who said 56-bit DES was strong enough for anyone. They were wrong about that; they’re wrong about this. But even if you believe that the exploits the NSA discovered would never be independently rediscovered, it’s pretty obvious that that doesn’t mean they won’t be leaked. And once they’re leaked, you can never get that toothpaste back in the tube.

Now, since the Enlightenment, for 500 years now, we’ve understood what good knowledge creation and technology policy looks like. So let me give you a little history lesson. Before the Enlightenment, we had a thing that looks a lot like science, through which we did knowledge creation. It was called alchemy. And alchemists worked a lot like scientists. You observe two phenomena in the universe. You hypothesize a causal relationship: this is making that happen. You design an experiment to test your causal relationship. You write down what you think you’ve learned.

And here’s where science and alchemy part ways. Because alchemists don’t tell people what they think they’ve learned. And so they are able to kid themselves that the reason their results seem a little off is because maybe they made a little mistake when they were writing them down, and not because their hypothesis was wrong. Which is how every alchemist discovers for himself, the hardest way possible, that you should not drink mercury, right.

So for 500 years, alchemy produces no dividends. And then alchemists do something that is legitimately miraculous. They convert the base metal of superstition into the precious metal of knowledge, by publishing. By telling other people what they know. Not just their friends who’ll go easy on them, but their enemies, right, who, if they can’t find a single mistake in their work, then they know that their work is good. And so, as a first principle, whenever you’re doing something important, everyone should be able to criticize it. Otherwise you never know that it works. So you would hope that that’s how we would operate in the information security realm. But that’s not how we’re operating.

In 1998 Congress passed this law, the Digital Millennium Copyright Act. They then went to the European Union in 2001 and arm-twisted them into passing the European Union Copyright Directive. And both of these laws have a rule in them that says that you’re not allowed to break digital rights management. You’re not allowed to bypass a system that restricts access to a copyrighted work.

And in the early days, this was primarily used to stop people from making region-free DVD players. But now, everything’s got a copyrighted work in it, because everything’s got a system-on-a-chip in it that costs twenty-two cents and runs millions of lines of code, including the entire Linux kernel and usually an instance of BusyBox running with the default root password of “admin/admin”.

And because that’s a copyrighted work, anyone who manufactures a device, where they could make more money if they could prescribe how you use that device, can just add a one-molecule-thick layer of DRM in front of that copyrighted work. And then, because in order to reconfigure the device you have to remove the DRM, they can make removing DRM, and thus using your own property in ways that benefit you, into a felony punishable by a five-year prison sentence and a $500,000 fine.

And so there’s this enormous temptation to add DRM to everything, and we’re seeing it in everything. Pacemakers, voting machines, car engine parts, tractors, implanted defibrillators, hearing aids. There’s a new closed-loop artificial pancreas from Johnson & Johnson: it’s a continuous glucose monitor married to an insulin pump, with some machine learning intelligence to figure out what dose you need from moment to moment. And it uses proprietary insulin cartridges that have a layer of DRM in them to make sure that, to stay alive, you only feed your internal organ the material that the manufacturer has approved, so that they can charge you an extreme markup.

So that’s bad. That’s the reason we’re seeing DRM everywhere. But the real damage is what it does to security research. Because under this rule, merely disclosing defects in security that might help people bypass DRM also exposes you to legal jeopardy. So this is where it starts to get scary, because as microcontrollers are permeating everything we use, as hospitals are turning into computers we put sick people into, we are making it harder for critics of those devices to explain the dumb mistakes that the people who made them have made. We’re all drinking mercury.

And this is going everywhere. In particular, it’s going into your browser. So, several years ago the W3C was approached by Netflix and a few of the other big entertainment companies to add DRM to HTML5, because it was no longer technically simple to do DRM in browsers, given the way the APIs were changing. And the W3C said that they would do it. It’s a long, complicated story why they went into it. But I personally, and EFF, had a lot of very spirited discussions with the W3C leadership over this. And we warned them that we thought that the companies that wanted to add DRM to their browsers didn’t want to just protect their copyrights. We thought that they would use this to stop people from disclosing defects in browsers. Because they wanted not just to control their copyrights but to ensure that there wasn’t a way to get around this copyright control system.

And they said, “Oh no, never. These companies are good actors. We know them. They pay the membership dues. They would never abuse this process to come after security researchers who were making good-faith, honest, responsible disclosures,” whatever; you add your adjective for a disclosure that’s made in a way that doesn’t make you sad, right. There are all these different ways of talking about security disclosures.

And we said, alright, let’s find out. Let’s make membership in the W3C and participation in this DRM committee contingent on promising only to use the DMCA to attack people who infringe copyright, and never to attack people who make security disclosures. And the entire cryptocurrency and blockchain community who were in the W3C working groups backed us on this. In fact it was the most controversial standards vote in W3C history. It was the only one that ever went to a vote. It was the only one that was ever appealed. It was the only one that was ever published without unanimous support. It was published with 58% support, and not one of the major browser vendors, not one of the big entertainment companies, signed on to a promise not to sue security researchers who revealed defects in browsers.

So let’s talk a little about security economics and browsers. Security, obviously, is not a binary; it’s a continuum. We want to be secure from some attack. You heard someone talk about threat modeling earlier. So, say you’ve got a bank vault. You know that given enough time and a plasma torch, your adversary can cut through that bank vault. But you don’t worry about that, because your bank vault is not meant to secure your money forever; it’s meant to secure your money until a security guard walks by on their patrol and calls the police, right. Your bank vault is integrated with the rule of law. It is a technical countermeasure that is backstopped by the rule of law. And without the rule of law, your bank vault will eventually be cut open by someone with a plasma cutter.

So, security economics means factoring the expected return on a breach into the design of the system. If you have a system that’s protecting $500 in assets, you want to make sure that it will cost at least $501 to defeat it. And you assume that you have a rational actor on the other side who’s not going to come out of your breach one dollar in the hole. You assume that they’re not going to be dumdums.

So, there’s a way that this frequently goes wrong, a way that you get context shifts that change the security economics calculus. And that’s when the value of the thing that you’re protecting suddenly goes up a lot and the security measures that you’re using to protect it don’t. And all of a sudden your $501 security isn’t protecting $500 worth of stuff. It turns out that it’s protecting $5 million worth of stuff. And the next thing you know, there’s some dude with a plasma cutter hanging around your vault.
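That calculus can be written down directly, using the talk’s own illustrative figures:

```python
# Security economics in one predicate: a rational attacker only strikes
# when the value of the asset exceeds the cost of defeating the defense.
# The dollar figures are the talk's illustrative ones.

def attack_is_rational(asset_value: float, cost_to_break: float) -> bool:
    """True if a breach leaves a rational attacker better off."""
    return asset_value > cost_to_break

# Calibrated correctly: $501 of defense guarding $500 of assets.
print(attack_is_rational(asset_value=500, cost_to_break=501))        # False

# Context shift: the asset is suddenly worth $5 million, defense unchanged.
print(attack_is_rational(asset_value=5_000_000, cost_to_break=501))  # True
```

The defense budget has to track the asset’s value, which is exactly what fails to happen when that value spikes overnight.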

So this challenge is especially keen in the realm of information security, because information security is tied to computers, and computers are everywhere. And because computers are becoming integrated into every facet of our lives faster than we can even keep track of, every day there’s a new value that can be realized by an attacker who finds a defect in computers that can be widely exploited. And so every day, the cost that you should be spending to secure your computers is going up. And we’re not keeping up. In fact, computers on average are becoming less secure, because the value that you get when you attack computers is becoming higher, and so the expected adversary is getting better resourced and more dedicated.

So this is where cryptocurrency does in fact start to come into the story. It used to be that if you found a defect in widely used consumer computing hardware, you could expect to realize a few hundred or at best a few thousand dollars. But in a world where intrinsically hard-to-secure computers are being asked to protect exponentially growing cryptocurrency pools… well, you know how that works, right? You’ve seen cryptojacking attacks. You’ve seen all the exchanges go down. You understand what happens when the value of the asset being protected shoots up very suddenly. It becomes extremely hard to protect.

So, you would expect that in that world, where everything we do is being protected by computers that are intrinsically hard to protect, and where we need to keep adding more resources to protect them, states would take as their watchword making crypto as easy to implement as possible; making security as easy as possible to achieve. But the reverse is happening. Instead, states are starting to insist that we’re gonna have to sacrifice some of our security to achieve other policy goals.

So this guy used to be Prime Minister of Australia; he’s not anymore. Wait six months, and the current Prime Minister of Australia will also not be Prime Minister of Australia anymore. This guy, Malcolm Turnbull… Sorry, did I just get his name wrong? I just blew up his name. What is his name? God, he went so quickly. Malcolm Turnbull, it is Malcolm Turnbull, it’s right there on the slide. I almost called him Malcolm Gladwell.

So he gave this speech where he was explaining why he was going to make it the law that everybody had to backdoor their crypto for him. And you know, all these cryptographers had shown up, and they said, “Well, the laws of math say that we can’t do that. We can’t make you a thing that’s secure enough to protect the government and its secrets, but insecure enough that the government can break into it.”

And he said… I’m not gonna do the accent. He said, “The laws of Australia prevail in Australia. I can assure you of that. The laws of mathematics are very commendable, but the only law that applies in Australia is,” read it with me, “the law of Australia.” I mean… This may be the stupidest technology thing ever said in the history of really dumb technology utterances.

But he almost got there. And he’s not alone, right. The FBI has joined him in this call. Canada’s joined him in this call. Like, if you ever needed proof that merely having good pecs and good hair doesn’t qualify you to make good technology policy, the government of Justin Trudeau and its technology policy has demonstrated this forever. This is an equal-opportunity madness that every developed state in the world is at least dabbling in.

And we have ended up not just in a world where fighting crime means eliminating good security. I mean, it’s dumber than that, right. We’ve ended up in a world where making sure people watch TV the right way means sacrificing on security.

Now, the European Union just actually had a chance to fix this. Because that Copyright Directive that the US forced them to pass in 2001, the one with the stupid rule in it that they borrowed from the DMCA, just came up for its first major revision in seventeen years. The new Copyright Directive is currently nearly finalized; it’s in its very last stage. And rather than fixing this glaring problem with security in the 21st century, what they did was add this thing called Article 13.

So Article 13 is a rule that says that if you operate a platform where people can convey a copyrighted work to the public… So if you have a code repository, or if you have Twitter, or YouTube, or SoundCloud, or any other way that people can make a copyrighted work available; if you host Minecraft skins; then you are required to operate a crowdsourced database of all the copyrighted works that people care to add to it and claim. So anyone can upload anything to it and say, “This copyright belongs to me.” And if a user tries to post something that appears in the database, you are obliged by law to censor it. And there are no penalties for adding things to the database that don’t belong to you. You don’t even have to affirmatively identify yourself. And the companies are not allowed to strike you off that database of allegedly copyrighted works, even if they repeatedly catch you chaffing the database with garbage that doesn’t belong to you: the works of William Shakespeare, all of Wikipedia, the source code for some key piece of blockchain infrastructure, which now can’t be posted to a WordPress blog and discussed until someone at Automattic takes their tweezers and goes to the database and pulls out these garbage entries, whereupon a bot can reinsert them into the database one nanosecond later.

So this is what they did, instead of fixing anti-circumvention rules to make the Internet safe for security. So, I mentioned this is in its very last phase of discussion. And it looked like it was a done deal, and then the Italian government changed over and they flipped positions. And we’re actually maybe going to get to kill this, but only if you help. If you’re a European, please go to saveyourinternet.eu and send a letter to your MEPs. This is really important. Because this won’t be fixed for another seventeen years if it passes: saveyourinternet.eu.

So, when we ask ourselves why governments are so incapable of making good technology policy, the standard account says it’s just too complicated for them to understand, right. How could we expect these old, decrepit, irrelevant white dudes to ever figure out how the Internet works, right? If it’s too technological, you’re too old, right?

But sorting out complicated technical questions is what governments do. I mean, I work on the Internet, and so I think it’s more complicated than other people’s stuff. But you know, when I’m being really rigorously honest? I have to admit that it’s not more complicated than public health, or sanitation, or building roads. And you know, we don’t build roads in a way that is as stupid as the way we have built the Internet.

And that’s because the Internet is much more hotly contested. Because every realm of endeavor intersects with the Internet, and so there are lots of powerful interests engaged in trying to tilt Internet policy to their advantage. The TV executives and media executives who pushed for Article 13, you know, they’re not doing it because they’re mustache-twirling villains. They’re just doing it because they want to line their pockets and they don’t care what costs that imposes on the rest of us. Bad tech policy, it’s not bad because making good policy is hard. It’s bad because making bad policy has a business model.

Now, tech did not cause the corruption that distorts our policy outcomes. But it is being supercharged by the same phenomenon that is distorting our policy outcomes. And that’s what happened with Ronald Reagan, and Margaret Thatcher, and their cohort, who came to power around the same time the Apple II Plus shipped. And among the first things they did in office was dismantle our antitrust protections, and allow companies to do all kinds of things that would have been radioactively illegal in the decades previous. Like buying all their competitors. Like engaging in illegal tying. Like using long-term contracts in their supply chain to force their competitors out. Like doing any one of a host of things that might have landed them in front of an antitrust regulator and gotten them broken up into smaller pieces the way AT&T had been.

And as that happened, we ended up in a period in which inequality mounted and mounted and mounted. And forty years later, we've never lived in a more unequal world. We have surpassed the state of inequality of 18th century France, which for many years was the gold standard for just how unequal a society can get before people start chopping off other people's heads.

And unequal states are not well-regulated ones. Unequal states are states in which the peccadillos, cherished illusions, and personal priorities of a small number of rich people who are no smarter than us start to take on outsized policy dimensions. Where the preferences and whims of a few plutocrats become law. [applause]

In a plutocracy, policy only gets to be evidence-based when it doesn't piss off a rich person. And we cannot afford distorted technology policy. We are at a breaking point. Our security and our privacy and our centralization debt is approaching rupture. We are about to default on all of those debts, and we won't like what the bankruptcy looks like when that arrives.

Which brings me back to cryptocurrency and the bubble that's going on around us. The bubbles, they're not fueled by people who have an ethical interest in decentralization or who worry about overreaching state power. Those bubbles, right, all the frothy money that's in there. Not the coders who are writing it or the principled people who think about it, but all the money that's just sloshing through it and making your tokens so volatile that the security economics are impossible. That money is being driven by looters, who are firmly entrenched in authoritarian states. The same authoritarian states that people interested in decentralization say they want to get rid of. They're the ones who are buying cyber weapons to help them spy on their own populations to figure out who is fomenting revolutions so they can round them up and torture them and arrest them. So that they can be left to loot their national treasuries in peace and spin the money out through financial secrecy havens like the ones that we learned about in the Panama Papers and the Paradise Papers.

And abetting the oligarchic accumulation of wealth, that is not gonna create the kinds of states that produce the sound policy that we need to make our browsers secure. It will produce states whose policy is a funhouse mirror reflection of the worst ideas of the sociopaths who have looted their national wealth and installed themselves as modern feudal lords.

Your cryptography will not save you from those states. They will have the power of coercive force and the unblinking eye of 24/7 surveillance contractors. The Internet, the universal network where universal computing endpoints can send and receive cryptographically secure messages, is not a tool that will save us from coercive states, but it is a tool that will give us a temporary shelter within them. A space that even the most totalitarian of regimes will not be able to immediately penetrate. Where reformers and revolutionaries can organize, mobilize, and fight back. Where we can demand free, fair, and open societies with broadly-shared prosperity across enough hands that we can arrive at consensuses that reflect best evidence and not the whims of a few. Where power is decentralized.

And incidentally, having good responsive states will not just produce good policy when it comes to crypto. All of our policy failures can be attributed to a small, moneyed group of people who wield outsize power to make their bottom line more important than our shared prosperity. Whether that's the people who spent years expensively sowing doubt about whether or not cigarettes would give us cancer, or the people who today are assuring us that the existential threat that the human species is facing is a conspiracy among climate scientists who are only in it for the money.

So you're here because you write code. And you may not be interested in politics, but politics is interested in you. The rule of law needs to be your alpha and omega. Because after all, the Constitution is a form of consensus, right. It's the original consensus-seeking mechanism. Using the rule of law to defend your technology, it's the most Internet thing in the world. Let's go back to Bernstein. When we argued the Bernstein case, we essentially went on an Internet message board and made better arguments than the other people. And we convinced the people who were listening that our arguments were right. This is how you folks resolve all of your problems, right? Proof of concept. Running code. Good arguments. And you win the battle of the day.

So making change with words? That's what everybody does, whether we're writing code or writing law. And I'm not saying you guys need to stop writing code. But you really need to apply yourselves to the legal dimension, too. Thank you.

Cory Doctorow: So, we're gonna ask some questions now. I like to call alternately on people who identify as women or non-binary and people who identify as male or non-binary, and we can wait a moment if there's a woman or non-binary person who wants to come forward first. There's a mic down there and then there's a rover with a mic. Just stick up your hand.

Audience 1: As someone who's spent a lot of time involved in the Internet, I'm sure you've read the book The Sovereign Individual. And I recently read this book and it talked a lot about how the Internet would increase the sovereignty of individuals and also how cryptocurrencies will. And it predicted a massive increase in inequality as a direct result of the Internet. Could you comment on that?

Doctorow: Yeah I haven't read the book so I'm not gonna comment directly on the book. But I think it's true that if you view yourself as separate from the destinies of the people around you that it will produce inequality. I think that that's like, empirically wrong, right. Like, if there's one thing we've learned about the limits of individual sovereignty it's that you know, you have a shared microbial destiny. You know, I speak as a person who left London in the midst of a measles epidemic and landed in California right after they stamped it out by telling people that you had to vaccinate your kids or they couldn't come to school anymore.

We do have shared destinies. We don't have individual sovereignty. And even if you're the greatest and— You know, anyone who's ever run a business knows this, right. You could have a coder who's a 100X coder, who produces 100 times more lines of code than everybody else in the business. But if that coder can't maintain the product on their own, and if they're a colossal asshole that no one else can work with? Then that coder is a liability, not an asset, right. Because you need to be able to work with more than one person in order to attain superhuman objectives. Which is to say, more than one person can do. And everything interesting is superhuman, right. The limits on what an individual can do are pretty strong.

And so yeah, I think that that's true. I think that the kind of policy bent toward selfishness kind of self-evidently produces more selfish outcomes. But not better ones, right. Not ones that reflect kind of a shared prosperity and growth. Thank you.

Hi.

Audience 2: Hi. I have had the pleasure of seeing you keynote both Decentralized Web Summits, and the ideas you bring to these talks always really stay with me longer than anything else.

Doctorow: Thank you.

Audience 2: With what you've talked about here, this is honestly one of the most intimidating and terrifying topics, and I'm wondering what are some ways besides staying informed and trying not to get burned out by it all, what are some ways that people can make a difference?

Doctorow: So, I recently moved back from London to California, as I mentioned. And one of the things that that means is I have to drive now, and I'm a really shitty driver. And in particular I'm a really shitty parker. So when I have to park, [miming wild steering motions:] I do a lot of this, and then a lot of this, and then a lot of this, and a lot of this. And what I'm doing is I'm like…moving as far as I can to gain one inch of available space. And then—or centimeter. And then moving into that centimeter of available space, because that opens up a new space that I can move into. And then I move as far as I can and I open up a new space.

We do this in computing all the time, right. We call it hill-climbing. We don't know how to get from A to Zed. But we do know how to get from A to B, right. We know where the higher point of whatever it is we're seeking is—stability or density or interestingness or whatever. And so we move one step towards our objective. And from there we get a new vantage point. And it exposes new avenues of freedom that we can take. I don't know how we get from A to Zed. I don't know how we get to a better world. And I actually believe that because the first casualty of every battle is the plan of attack, that by the time we've figured out the terrain, that it would have been obliterated by the adversaries who don't want us to go there.
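The hill-climbing heuristic described here can be sketched in a few lines of Python. This is a toy illustration of the general technique, not anything from the talk; the landscape and neighbor functions are made up for the example:

```python
def hill_climb(score, neighbors, start, max_steps=1000):
    """Greedy hill-climbing: repeatedly move to the best-scoring
    neighbor of the current state; stop at a local optimum, where
    no visible step leads upward."""
    state = start
    for _ in range(max_steps):
        best = max(neighbors(state), key=score, default=state)
        if score(best) <= score(state):
            break  # no neighbor improves: we're at a (local) peak
        state = best
    return state

# Toy landscape on the integers 0..20, with its single peak at x = 7.
score = lambda x: -(x - 7) ** 2
neighbors = lambda x: [n for n in (x - 1, x + 1) if 0 <= n <= 20]
print(hill_climb(score, neighbors, start=0))  # → 7
```

The point of the analogy survives in the code: the climber never needs a route from A to Zed, only a way to tell which adjacent step is higher, and it can get stuck on a local peak when the terrain is not as simple as this one.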

And so, instead, I think we need heuristics. And that heuristic is to see where your freedom of motion is at any moment and take it. Now, Larry Lessig, he's got this framework, the four forces: code, law, norms, and markets. My guess is that most of the people in this room are doing a lot with code and markets, right. That's kind of where this conference sits in that little two-by-two. And as a result you may be blind to some of the law and norm issues that are available to you. It might be that you jump on EFF's mailing list, or if you're a European get on the EDRi mailing list. Or the mailing list for the individual digital rights groups in your own countries, like Netzpolitik in Germany, or La Quadrature du Net in France, or Open Rights Group in the UK, or Bits of Freedom in the Netherlands, and so on.

Getting on those lists and, at the right moment, calling your MEP, calling your MP, or even better yet, like, actually going down when they're holding surgeries, when they're holding constituency meetings. They don't hear from a lot of people who are technologically clued-in. Like, they only get the other side of this. And you know, I've been in a lot of these policy forums, and oftentimes the way that the other side prevails is just by making it up, right. Like one of the things we saw in this filter debate, like, we had computer scientists who were telling MEPs… You know, the seventy most eminent computer scientists in the world, right, a bunch of Turing Award winners. Vint Cerf and Tim Berners-Lee said like, "These filters don't exist and we don't know how to make 'em." And they were like, "Oh, we've got these other experts who say 'we know how to do it.'" And they had been told for years that the only reason nerds hadn't built those filters is they weren't nerding hard enough, right.

And if they actually hear from their own constituents, people who run small businesses that are part of this big frothy industry that everybody wants their national economies to participate in. Who show up at their lawmakers' offices and say, "This really is catastrophic. It's catastrophic to my business. It's catastrophic to the Internet," they listen to that. It moves the needle.

And you know, you heard earlier someone say are we at pitch now? Well, I should pitch, right? I work for Electronic Frontier Foundation. We're a nonprofit. The majority of our money comes from individual donors. It's why we can pursue issues that are not necessarily on the radar of the big foundations or big corporate donors. We're not beholden to anyone. And it's people like you, right, who keep us in business. And I don't draw money from EFF. I'm an MIT Media Lab research affiliate and they give EFF a grant that pays for my work. So the money you give to EFF doesn't land in my pocket. But I've been involved with them now for fifteen years and I've never seen an organization squeeze a dollar more. So, I really think it's worth your while; eff.org. Thank you.

Oh. Someone over here. Yes, hi.

Audience 3: Thank you very much. Really appreciate the speech. It was very inspiring.

Doctorow: Thank you.

Audience 3: Um, I think…maybe not sure how many other people feel this way, but one thing that's been hard to me about politics in general, especially in the age of social media is, you know…there's a lot of it that spreads messages of fear and anger and hatred. And sometimes it feels like when you want to say something and you want to spread a certain voice or just spread a certain message, that there's this fear of getting swept up in all these messages and ideas and things that aren't necessarily… You're not necessarily aware of your own biases and things like that. How does one stay sane, and fight for you know, the right fight?

Doctorow: God. You know, I wish I knew. I like— I'll freely admit to you I've had more sleepless nights in the last two years than in all the years before it. I mean, even during the movement to end nuclear proliferation that I was a big part of in the 80s, when I thought we were all going to die in a mushroom cloud, I wasn't as worried as I am now. It's tough.

I mean, for me, like just in terms of like, personal…psychological opsec? I've turned off everything that non-consensually shoves Donald Trump headlines into my eyeballs. You know, we talk a lot about how like, engagement metrics distort the way applications are designed. But you know, I really came to understand that that was happening about a year and a half ago. So for example, they changed the default Android search bar so that when you tapped in it, it showed you trending…searches. Well, like, nobody has ever gone to a search engine to find out what other people are searching for, right? And the trending searches were inevitably "Trump threatens nuclear armageddon." So the last thing I would do before walking my daughter to school every morning is I would go to the weather app. And I would tap in it to see the weather. And it's weather and headlines. And the only headlines you can't turn off are top headlines, and they're trend—you know, they're all "Trump Threatens Nuclear Armageddon," right?

So I realized after a month of this that what had been really the most calming, grounding fifteen minutes of my day, where I would walk with my daughter to school, and we'd talk about stuff and it was really quiet—we live on a leafy street… I'd just spend that whole time worrying about dying, right?

And so, I had to figure out how to like go through and turn all that stuff off. Now what I do is I block out times to think about headlines. So I go and I look at the news for a couple hours every day…and I write about it. I write Boing Boing, right. I write a blog about it. Not necessarily because my opinions are such great opinions. But because being synthetic and thoughtful about it means that it's not just…buffeting me, right? It becomes a reflective rather than a reflexive exercise.

But I don't know, right? I mean, I think that— And I don't think it's just the tech. I think we are living in a moment of great psychic trauma. We are living in a— You know. The reason the IPCC report was terrifying was not because of the shrill headlines. The IPCC report was terrifying because it is objectively terrifying, right.

And so, how do you make things that're… I don't know how you make things that're objectively terrifying not terrifying. I think the best we can hope for is to operate, while we are terrified, with as much calm and aplomb and thoughtfulness as is possible.

How are we for time, do you want me off? I know my clock's run out. Or can I take one more question? Stage manager? One more or… One more. Alright. And then they'll ring us off.

Audience 4: Yeahhh! Hi.

Doctorow: Better be good, though.

Audience 4: Okay, I'm ready. I work for the Media Lab, too. So, my question Cory—thank you for your talk. I think a lot of people in the cryptocurrency world think about the current systems that we exist in. And we're trying to exit those systems to some extent and create parallel…financial, you know, political institutions, what have you, versus expressing voice within the current system. How do you balance exit versus voice in the current system?

Doctorow: Well… You know, in a technol— And I said before that like, a Constitutional argument is just an Internet flame war by another means, right? So, when you're arguing about a commit and a pull request, one of the things you do is you do a proof of concept, right? You show that the thing that you're patching is real and can be exploited. Or you show that you've got unit tests to show that your patch performs well.

Those parallel exercises are useful as proofs of concept and as unit tests, right? They're prototypes that we can fold back into a wider world. And I think that… The thing I worry about is not that technologists will build technology. I want technologists to build technology. It's that they will think that the job stops when you've built the proof of concept. That's where the job starts, right? When you can prove that you've written a better algorithm, you then have to convince the other stakeholders in the project that it's worth the actual, like, non-zero cost of patching to make that work, right? Of going through the whole source tree and finding all the dependencies on the things that you're pulling out and gracefully replacing them. Because you know, when you run a big data center you can't just start patching stuff…you've got a toolchain that you have to preserve, right?

And so that's where the job starts, right? Build your proof of concept, build us a parallel financial system, build us a whatever…so that we can figure out how to integrate it into a much wider, more pluralistic world. Not so that we can separate and seastead on our little…you know, world over there. Like, it doesn't matter how great your— [applause] Thank you. Doesn't matter how great your bunker is, right? Like you can't shoot germs, right? Like if your solution allows the rest of the world to fall into chaos, and no one's taking care of the sanitation system, you will still shit yourself to death of cholera, in your bunker, because like, you can't shoot germs, right? So we need pluralistic solutions that work for all of us.