from the good-intentions-do-not-make-good-policy dept

The last two posts I wrote about SESTA discussed how, if it passes, it will result in collateral damage to the important speech interests Section 230 is intended to protect. This post discusses how it will also result in collateral damage to the important interests that SESTA itself is intended to protect: those of vulnerable sex workers.

Concerns about how SESTA would affect them are not new: several anti-trafficking advocacy groups and experts have already spoken out about how SESTA, far from ameliorating the risk of sexual exploitation, will only exacerbate it, in no small part because it disables one of the best tools for fighting it: the Internet platforms themselves.

And now there is yet more data confirming what these experts have been saying: when platforms have been available to host content for erotic services, the risk of harm to sex workers has decreased.

The September 2017 study, authored by West Virginia University and Baylor University economics and information systems experts, analyzes rates of female homicides in various cities before and after Craigslist opened an erotic services section on its website. The authors found a shocking 17 percent decrease in homicides with female victims after Craigslist erotic services were introduced.

The reasons for these numbers aren't entirely clear, but there does seem to be a direct correlation between the safety of sex workers and their ability, thanks to the availability of online platforms, to "move indoors."

Once sex workers move indoors, they are much safer for a number of reasons, Cunningham said. When you’re indoors, “you can screen your clients more efficiently. When you’re soliciting a client on the street, there is no real screening opportunity. The sex worker just has to make the split second decision. She relies on very limited and incomplete information about the client’s identity and purposes. Whereas when a sex worker solicits indoors through digital means, she has Google, she has a lot of correspondence, she can ask a lot of questions. It’s not perfect screening, but it’s better.”

The push for SESTA seems to be predicated on the unrealistic notion that all we need to do to end sex trafficking is end the ability of sex services to use online platforms. But evidence suggests that removing the "indoor" option that the Internet affords doesn't actually end sex work; it simply moves it to the outdoors, where it is vastly less safe.

In 2014, Monroe was a trafficking victim in California. She found her clients by advertising on SFRedbook, the free online erotic services website. One day, she logged into the site and discovered that federal authorities had taken it down.
Law enforcement hoped that closing the site would reduce trafficking, but it didn’t help Monroe. When she told her pimp SFRedbook was gone, he shrugged. Then he told her that she would just have to work outdoors from then on.

“When they closed down Redbook, they pushed me to the street,” Monroe told ThinkProgress. “We had a set limit we had to make a day, which was more people, cheaper dates, and if you didn’t bring that home, it was ugly.” Monroe, who asked that her last name be withheld for privacy reasons, had been working through Redbook in hotel rooms almost without incident, but working outdoors was much less safe.

“I got raped and robbed a couple of times,” she said. “You’re in people’s cars, which means nobody can hear you if you get robbed or beaten up.”

A recurrent theme here on Techdirt is that, as with any technology policy, no matter how well-intentioned it is, whether it is a good policy depends on its unintended consequences. Not only do we need to worry about how a policy affects other worthwhile interests, but we also need to consider how it affects the interest it seeks to vindicate. And in this case SESTA stands to harm the very people it ostensibly seeks to help.

Does that mean Congress should do nothing to address sex trafficking? Of course not, and it is considering many other options that more directly address the serious harms that arise from sex trafficking. Even Section 230 as it currently exists does not prevent the government from going after platforms that directly aid trafficking. But all too often regulators like to take shortcuts and target platforms simply because bad people may be using them in bad ways. It's a temptation that needs to be resisted for many reasons, not the least of which is that doing so may enable bad people to behave even worse.

from the this-is-not-a-game dept

Over the last few months, we've had a very, very small, but still vocal group of folks in our comments who have gotten angry every time we've been critical of Donald Trump -- even when we were making nearly identical complaints about him as we did about Barack Obama and Hillary Clinton. That group of people probably won't like this post very much, though I do hope they'll read it with open minds. We're not a political blog. We cover technology and innovation, as well as the legal, economic and policy issues related to those things. Over the years, that's included issues related to civil liberties and civil rights. We don't see these things as being separate. They are all connected and intertwined. We've even spent plenty of time discussing immigration, though focusing on high tech and entrepreneur immigration.

But I don't think there's any need for me to try to justify why I'm making this post on Techdirt today. This is about humanity. And if you want to complain in the comments that you don't want to read this on a "tech" site, well, then maybe take a second and think about what this says about you. Basically my entire family came to America between around 1890 and 1920 -- most of them escaping religious persecution elsewhere. My great grandmother had to hide in the bottom of a boat to escape from where she lived. Many came through Ellis Island, and were welcomed into America. My grandfathers built up businesses here. One fought bravely against Nazis (literally) in World War II for the US in Europe and North Africa, and came back to the US and built a company that (among other things) was a huge supplier for the Boy Scouts of America. While they may have struggled at times, my family came to America and was embraced by America, thrived in America and has always loved America. My wife is an immigrant. Her family moved here when she was young to give her and her siblings a better life. And that's what they found. America embraced them and they embraced America back. They're all US citizens.

All weekend long, I've been reading all sorts of accounts about President Trump's executive order. Some of it has been thoughtful. Some of it has been hysterical. Some of it has been painful. Some of it has been ridiculous.

But it all comes back to one thing: this is about our humanity.

The "excuses" that some have been spewing for the executive order make no sense. They say this is about "safety," yet there is no evidence that the people being kept out were a risk to our safety. As many have noted, not a single terrorist attack has come from people from those countries. They say this is about "extreme vetting" but ignore that refugees already go through a ridiculously long and thorough "extreme vetting" process that can take years. They say that this is just an "inconvenience" to a "small group" of people, ignoring that they are basically upending the lives of entire families -- families including those with permanent resident status, who have been valuable, contributing members to our country for years and years and years.

This is madness.

They say that this is necessary to protect us at home, but even ignoring everything above, it's hard to see how this doesn't make us less safe. How can anyone read this essay by Kirk Johnson and not realize how much harm we're doing? Johnson has devoted a big part of his life to helping Iraqis who literally put their lives at risk to help Americans, and then were ignored by America. Read what he's written and ask yourself: what foreigner will sign up to help America again in the future?

They say this is about "the rule of law," but then can't explain why Customs and Border Protection is literally ignoring court orders and refusing to even speak to members of Congress. They say this is about stopping threats at home, but that doesn't explain at all why the White House failed to have the relevant experts review the executive order before it was put in place, or why multiple national security experts have noted this will clearly make us less safe. And, of course, nothing explains why the White House directly overruled Homeland Security's determination that this wouldn't apply to permanent residents (and then required DHS to later come out and try to clarify).

Again, this goes back to a story about basic humanity. And our country massively failing.

I know there are lots of people speculating on a variety of things around all of this. There are explanations that this is part of a "shock doctrine" move to sow chaos and confusion and delegitimize organizations before the really bad policy is put into place. And perhaps that's true. It's certainly something to watch for. But, as someone who tries to look at every policy proposal based on a "does this make sense" or "does this make the world a better place" metric -- and not on a "does this help my team" scale -- it must be stated that the executive order Donald Trump signed late last week, and put into effect over the weekend, is not just a disaster. It hasn't just created a constitutional crisis, with parts of the executive branch ignoring the judiciary. It's not just a policy that is impacting millions of lives.

It's simply inhumane.

This is about humanity, and anyone who has any sense of humanity has a responsibility to speak up about it and to say that this is not right. If you get angry about this post, and think you need to insult us and attack me or this site, I would only recommend that you first take stock of your life, and think about what message you are sending out into the world when you do so.

No one should be surprised that the [Texas] Department of Public Safety overstated the number of "high risk" arrests they made as part of the border surge, as AP's Paul Weber ably reported last week. "Among the 'high-threat' incidents was a trailer that unlatched from an RV and rolled into oncoming traffic, killing another driver in a town more than 150 miles from the border. Other crimes lumped in with suspected killers and human traffickers were speeding teenagers and hit-and-runs that caused no serious injuries."

Nothing sells like fear. And the Department of Public Safety is in need of some sales. There's $800 million in border security dollars at stake. At least. The DPS would like $300 million more this year because it's just damn unsafe to share a border with a foreign country.

The DPS classified 1,800 arrests as high-threat. And apparently many of them actually were. But, according to an Associated Press analysis, many of these arrests turned out to be for drunken driving, failure to pay child support and minor drug possession. The DPS didn't bother to differentiate user amounts from smuggler amounts in its statistics. It also considered 60 counties to be border-region, which undoubtedly would tax the imaginations of even our best and brightest middle school geography teachers.

It's not just a border thing. The same numbers are used to nail down Drug War dollars. The DPS touts "high threat" traffickers, conveniently obscuring the fact that nearly 30% of the busts it references were for less than a gram of cocaine or marijuana.

In response to the AP's findings, the Department of Public Safety said it will recommend removing child support evaders from the list and signaled a willingness to stop classifying other arrests as "high threat." However, it defended the data overall, saying it isn't intended to measure border security, even though the figures are included in briefings to lawmakers.

Whatever. Lawmakers certainly believed the DPS was making border security "measurements" with its twisted facts. That's all that matters. While there are some disbelievers, like Rep. Terry Canales ("It's deceptive to say the least..."), there were certainly far more who bought into the "fear for your safety" narrative.

And why wouldn't they? Using illegal immigrants to gather tax dollars is a longstanding Texas tradition. If the DPS feels comfortable calling deadbeat dads "high threats," it's only because its role models -- a long string of indistinguishable governors (Rick Perry, Greg Abbott) and Sen. Ted Cruz -- have built sustainable political careers by capitalizing on a fearful narrative not actually backed up by numbers. And if it's not Mexican immigrants, it's ISIS setting up franchises right over the fence.


It's not just a Texas thing. The federal government does the same thing and has for years. The Drug War hasn't resulted in any less funding over the past forty years despite steep drops in crime rates and legalization efforts in multiple states. The existential threat of terrorism makes sure everyone from the DHS to local PDs have plenty of grant money to buy surveillance tech to hunt down amateur photographers and fast food thieves.

The truth is America has never been safer. Crime rates have generally remained at historic lows for much of this decade. For all the talk about a "war on cops," officers are safer than they've been in fifty years. Law enforcement wants us to believe sex trafficking is an epidemic, but long, well-funded investigations routinely fail to turn up evidence of one. Any number of law enforcement and parenting groups portray every passing vehicle and pseudonymous social media account as a potential child molester, but the pedophile playground they believe the internet to be simply doesn't exist.

All these histrionics have almost nothing to do with safety or any government agency's concern about the wellbeing of those putting money in their pockets. It's about the money. It always has been. The primary goal of any government agency is to secure the same amount of funding as last year, if not a little more. Whatever it takes to achieve that goal is deemed acceptable, even if it means using stats to lie to those controlling the purse strings.

from the but-i-want-my-flying-car! dept

If you listen to some entrepreneurs and investors, the flying car – a longstanding staple of science fiction – is right around the corner. Working prototypes exist. At least two companies already take orders for the vehicles, with deliveries promised next year.

The last decade has seen the introduction of practical consumer videoconferencing, voice recognition, drones, self-driving cars and many other items that once were found only in science fiction stories. It therefore might seem plausible that practical flying cars are around the corner. They aren't. Indeed, massive safety, infrastructure and technology problems make them a near impossibility.

The first concern is safety. While flying on a commercial airline is far safer per mile than driving oneself the same distance, it's an entirely different story if one looks at per-trip fatality rates. The Department of Transportation estimates that Americans take about 350 billion car trips per year and experience about 30,000 fatal accidents; roughly one fatal accident per 11 million trips. By contrast, there are roughly 35 million scheduled air flights around the world each year. Over the past decade, the number of commercial aviation incidents that have proved fatal has averaged 17 annually. This means about one of every 2 million commercial air flights ends in death.
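The per-trip comparison above is just division over the cited estimates; a quick sketch (using only the figures in this paragraph, which are themselves rough) makes the gap concrete:

```python
# Rough per-trip fatality rates, using the estimates cited above.
car_trips_per_year = 350e9         # ~350 billion U.S. car trips (DOT estimate)
fatal_car_accidents = 30_000       # ~30,000 fatal accidents per year

flights_per_year = 35e6            # ~35 million scheduled flights worldwide
fatal_flight_incidents = 17        # ten-year average of fatal commercial incidents

trips_per_fatal_car_accident = car_trips_per_year / fatal_car_accidents
flights_per_fatal_incident = flights_per_year / fatal_flight_incidents

print(f"~1 fatal accident per {trips_per_fatal_car_accident / 1e6:.1f} million car trips")
print(f"~1 fatal incident per {flights_per_fatal_incident / 1e6:.1f} million flights")
```

Run as written, this gives roughly one fatal accident per 11.7 million car trips versus one fatal incident per 2.1 million flights, i.e. commercial flights are several times deadlier on a per-trip basis despite being far safer per mile.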

We see these fatalities every year, despite pilots' years of intense training, planes' extensive safety equipment requirements, regular maintenance checks and airlines' need to maintain sterling reputations for safety. All of these provide far more safeguards than anything that applies to cars on the road.

It's true that there are some factors that might make flying cars safer than commercial jetliners. They would travel at lower speeds and lower altitudes, for instance. But there's no practical way to subject them to the same safety and training standards imposed on commercial airplanes if they are to become anything like a consumer product. Indeed, the per-trip fatality rate for private planes is very likely already higher than for commercial airliners, though there are no worldwide statistics available. Safety advocates could make a plausible case for banning flying cars on these grounds alone.

Even if one thinks these risks are acceptable—and they probably are, given the potential advantages of flying cars—that doesn't solve the even greater infrastructure or technological problems. The current working models of flying cars need runways to take off and land. Bringing them into regular use would require runways just about everywhere, without obviating the need for parking lots. The world's busiest airport, Atlanta's Hartsfield-Jackson, accommodates slightly fewer than 2,500 aircraft movements each day on its five runways and 4,700 acres. Any sizable office building would need its own version of Hartsfield-Jackson if people were to commute to work via their flying cars. The space to build facilities this size for flying cars simply doesn't exist anywhere near any city of any size.

New technologies could theoretically obviate the need for runways. One Japanese team has shown off a modified lightweight drone supposedly capable of vertical takeoff and landing like a helicopter. But making these vehicles practical would require breakthroughs that appear to be decades away. Existing helicopters and military "jump jets" still require a significant amount of space to land, are even noisier than commercial jets and drink huge amounts of fuel. As such, they're not really used for routine travel. Commercially produced helicopters have existed since the 1940s, yet they aren't used for scheduled commercial service anywhere in the United States. Technological breakthroughs could eventually solve these problems, but it's unlikely that a few years of flying-car development will overcome problems that have bedeviled helicopter designers for more than seven decades.

While the promised 2017 deliveries of working flying cars seem unlikely, it's far from impossible that a commercially produced civilian airplane with the kinds of retractable wings and safety equipment that would allow it to be driven on highways might make it to market within the next decade. But widely available flying cars, more likely than not, will remain clearly in the realm of science fiction.

from the the-public-should-decide-this,-says-agency-seeking-compelled-action-from-a-priva dept

The FBI's attempt to use an All Writs order to force Apple to help it break into a locked iPhone is seemingly built on a compelling case: a large-scale shooting involving people with ties to terrorist groups. This is exactly the sort of case Comey hoped would help push his anti-encryption agenda forward. Or so it seems.

But is the case really that "compelling," especially in the legal sense of the word, which requires the court to weigh the imposition on Apple against the public's interest in seeing wrongdoers punished/future terrorist attacks prevented?

Even so, there are many factors that make me underwhelmed by the San Bernardino case, too. The killers deliberately destroyed other devices but didn’t bother destroying this one. It was a work phone issued by San Bernardino County. Until some time prior to the killings it was backing up to iCloud, and the FBI was able to get those backups.

So once again, law enforcement personnel got information about some of what was on the phone. They’re missing a mere fraction of its contents. And one would guess that in the time between the last iCloud backup and the terrorist attack, Syed Farook did not suddenly start updating his work iPhone’s contacts with the addresses of his terrorist buddies, but refrained from calling or texting them (which the phone company could tell the FBI).

The FBI likely knows there's little chance much of investigative worth will be found on this phone. It wants the precedent, not the iPhone's contents. That's why Comey is telling anyone who will listen about other cases where encrypted phones have led to "dead ends." Comey's pitch for an Apple-assisted break-in presumes locked phones are literally helping people get away with murder. Friedersdorf notes that Lawfare's Ben Wittes -- arguing on behalf of the FBI -- cites testimony Comey gave to Congress in which he discusses a pregnant woman who was murdered last summer. Her phone was found near her body but the FBI has been unable to access the contents.

Again, Comey is making serious presumptions about the worth of the phone's contents to investigators. Taking a more logical approach to the facts of the case, Friedersdorf points out this phone is likely to be as useless as the phone central to the FBI's current actions.

For the first 230 years of U.S. history, police officers managed to investigate murders without the benefit of any evidence from smartphones belonging to the victim. Keep that baseline in mind, because any information that the cops get from the Louisiana victim’s device leaves them better off than all prior generations of law enforcement. And they presumably did get some useful information, because a locked device doesn’t prevent them from going to the phone company and getting access to call, text, and location data generated by the device.

It doesn’t stop the authorities from going to Uber to see when the victim last took a ride, to her cloud backup to see what was last uploaded, to her email client to see if she sent or received any messages through that medium, and on and on and on.

Admittedly, there could be something only on the device relevant to the murder. The victim could’ve recorded a quick voice memo. She could’ve jotted a note to herself. She could’ve snapped a photograph of the killer in the moment before he acted. But you’ve gotta think that the odds are against there being conclusive evidence on the phone that isn’t available elsewhere, not only because so much data is available in multiple places, but because the killer left the phone with the victim.

Encryption has turned smartphones into Chekhov's Device: if a locked phone is owned by a criminal or found with a crime victim, the automatic assumption is that it houses relevant information of high investigatory value.

Comey is carefully picking his examples when arguing for greater law enforcement access… for the most part. A mass shooting that left 14 dead. A murdered pregnant woman. A fender bender at a busy intersection?

I’d say this problem we call going dark, which as Director Clapper mentioned, is the growing use of encryption, both to lock devices when they sit there and to cover communications as they move over fiber optic cables is actually overwhelmingly affecting law enforcement. Because it affects cops and prosecutors and sheriffs and detectives trying to make murder cases, car accident cases, kidnapping cases, drug cases. It has an impact on our national security work, but overwhelmingly this is a problem that local law enforcement sees.

Yes, the FBI would like Apple to help it break into an iPhone because law enforcement needs a leg up in traffic enforcement. If the FBI is given what it wants, the precedent will allow local law enforcement agencies to seek the same assistance any time the most mundane of investigations runs into a locked phone.

James Comey said (but didn't mean) the public needs to determine for itself whether it would rather have security or safety (and presumably apply that conclusion to every member of the public who disagrees with them). But as Friedersdorf points out, every individual member of the public is free to make that decision for themselves, without having to drag in either party or 1789's All Writs Act.

The fact that there’s an unsolved murder case where some evidence might or might not be on a locked phone doesn’t seem like a compelling anecdote demonstrating the need to weaken security on everyone’s consumer devices. And if you’re worried that you’ll be killed and the lock on your smartphone will prevent the cops from finding your killer, by all means, disable the lock screen or leave a copy of your code in a safety deposit box. Everyone is free to decide when the costs of having strong security outweigh the benefits.

This is a decision any individual can make on their own. It doesn't have to involve the government compelling companies to purposefully weaken protective measures. Any member of the "nothing to hide" crowd can shut off anything meant to keep others (criminals or law enforcement) from accessing their phones' contents. Anyone who thinks the government has a "right" to access the contents of their phone should do whatever they can to ensure the government actually has that access -- provided it's limited to their own devices.

Comey says the public should consider the balance between safety and security in light of his agency's efforts. What's left unsaid is that the public has always had the ability to consider the tradeoffs and act on those conclusions. It didn't need the FBI's permission. Or Apple's. But this isn't about the public's ability to determine its own security/safety balance and apply it on a case-by-case basis. This is about the FBI seeking to have that balance determined by a magistrate judge and applied to everyone, regardless of where the public determines that balance should actually lie.

from the dishonest-doj dept

While everyone's waiting for Apple's response (due late next week) to the order to create a backdoor that would help the FBI brute force Syed Farook's work iPhone, the DOJ wasted no time in further pleading its own case, with a motion to compel. I've gone through it and it's one of the most dishonest and misleading filings I've seen from the DOJ -- and that's saying something. Let's dig in a bit:

Rather than assist the effort to fully investigate a deadly
terrorist attack by obeying this Court's Order of February 16, 2016,
Apple has responded by publicly repudiating that Order. Apple has attempted to design and market its products to allow
technology, rather than the law, to control access to data which has
been found by this Court to be warranted for an important
investigation. Despite its efforts, Apple nonetheless retains the
technical ability to comply with the Order, and so should be required
to obey it.

This part is only marginally misleading. The key point: of course Apple has designed a product that allows technology to control access, because that's how encryption works. It's as if the DOJ still doesn't understand that. Here's a simple, if unfortunate, fact for the DOJ: there are always going to be some forms of communications that it doesn't get to scoop up. Already we know that Farook and his wife destroyed their two personal iPhones. Why not just recognize that fully encrypted phones are the equivalent of that? No one seems to be whining about the destroyed iPhones and what may have been lost, even though the very fact that they were destroyed, and this one was not, suggests that if there was anything important on any of his phones, it wasn't this one. There are also things like communications between, say, a husband and wife in their own home. The DOJ can never get access to those because the two people are dead. Think of it as if their brains were encrypted and their deaths tossed away the key.

There are lots of situations where the physical reality is that the DOJ cannot recover communications. It's not the end of the world. It's never been the end of the world.

Apple, now (finally) trying to design encryption systems that make it so no one else can get in, sees this as the best way to protect the American public, because it means that its customers' own information is much safer. It means fewer phones get stolen. It means fewer people are likely to have their information hacked. It means much more safety for the vast majority of the public. And I won't even get into the fact that it was the US government's own hacking of private data that pushed many companies to move more quickly toward stronger encryption.

The government
has reason to believe that Farook used that iPhone to communicate
with some of the very people whom he and Malik murdered. The phone may contain critical communications and data prior to and around the
time of the shooting that, thus far: (1) has not been accessed; (2)
may reside solely on the phone; and (3) cannot be accessed by any
other means known to either the government or Apple. The FBI
obtained a warrant to search the iPhone, and the owner of the iPhone,
Farook's employer, also gave the FBI its consent to the search.
Because the iPhone was locked, the government subsequently sought
Apple's help in its efforts to execute the lawfully issued search
warrant. Apple refused.

"May contain" is a pretty weak standard, especially noting what I said above. Furthermore, if there were communications with Farook's victims, then shouldn't that information also be accessible via the phones of those individuals? And if they already know that there was communication between the two, much of that data should be available elsewhere, in the form of phone call metadata, for example.

Apple left the government with no option other than to apply to
this Court for the Order issued on February 16, 2016.

Actually, there are plenty of other options, including traditional detective work, looking for information from other sources or just recognizing that sometimes you don't get every piece of data that exists. And that's okay.

The Order
requires Apple to assist the FBI with respect to this single iPhone
used by Farook by providing the FBI with the opportunity to determine
the passcode. The Order does not, as Apple's public statement
alleges, require Apple to create or provide a "back door" to every
iPhone; it does not provide "hackers and criminals" access to
iPhones; it does not require Apple to "hack [its] own users" or to "decrypt"
its own phones; it does not give the government "the power
to reach into anyone's device" without a warrant or court
authorization; and it does not compromise the security of personal
information. To the contrary, the Order allows Apple to retain custody of its software at all times,
and it gives Apple
flexibility in the manner in which it provides assistance. In fact,
the software never has to come into the government's custody.

And here's where the misleading stuff really starts flowing. It absolutely is a backdoor. Anything that makes it easier for a third party to decrypt data without knowing the key is a backdoor. That's the definition of a backdoor. That it comes in the form of making it substantially easier to brute force the passcode doesn't change the fact that it's still a backdoor.
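To see why weakening brute-force protections is a backdoor, it helps to remember how tiny a numeric passcode's search space is. Here's a toy sketch of exhaustive passcode guessing. This is not Apple's actual mechanism; `check_passcode` and the example code "7294" are hypothetical stand-ins for the verification routine that iOS rate-limits and ties to the auto-erase counter. The point is that those limits, not the passcode itself, are what make guessing impractical:

```python
from itertools import product

# Hypothetical stand-in for a device's passcode check. On a real phone,
# this is the step that rate-limiting and the auto-erase counter protect.
def check_passcode(guess, secret="7294"):
    return guess == secret

def brute_force(digits=4):
    # A 4-digit code has only 10**4 = 10,000 candidates; with the retry
    # limits removed, enumerating them all is trivial.
    for combo in product("0123456789", repeat=digits):
        guess = "".join(combo)
        if check_passcode(guess):
            return guess

print(brute_force())  # finds "7294" after at most 10,000 attempts
```

With the guess-rate limits and erase-after-ten-failures protections disabled, as the order contemplates, the passcode stops being a meaningful barrier at all, which is exactly why removing those protections counts as a backdoor.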

And, yes, this impacts "every" iPhone. As Senator Ron Wyden correctly notes, if the precedent is set that Apple can be forced to do this for this one iPhone, it means it can be forced to do it for all iPhones. No, this single piece of code may not be the issue -- though there are some concerns that even creating this code could lead to some problems if the phone connects to a server -- but forcing a company to hack its own customers puts everyone at risk.

And yes, there is no legitimate way to describe this without calling it hacking Apple's own customers. The whole point of the system is to get around the fact that they don't have the key, and building a tool to disable security features and then allow a brute force attack on the passcode is very much exactly "hacking" Apple's own customers. Sure, this one still requires a warrant, but once Apple is pushed to create that kind of code -- and other companies are forced to build similar backdoors -- the technology itself is being designed with extra vulnerabilities that will put many more people at risk. It's not just about the DOJ seeing what's on this damn phone.

The fact that Apple can retain control over the software is a total red herring. No one cares about that. It's about the precedent of a court requiring a company to hack its own customers, as well as forcing it to create a backdoor that can be used in the future -- even to the point of possibly requiring such backdoors in future products.

In the past, Apple has consistently complied with a significant
number of orders issued pursuant to the All Writs Act to facilitate
the execution of search warrants on Apple devices running earlier
versions of iOS. The use of the All Writs Act to facilitate a
warrant is therefore not unprecedented; Apple itself has recognized
it for years. Based on Apple's recent public statement and other
statements by Apple, Apple's current refusal to comply with the
Court's Order, despite the technical feasibility of doing so, instead
appears to be based on its concern for its business model and public
brand marketing strategy.

And the misleading bullshit gets ratcheted up a notch. First of all, we already went through why the "Apple helped us in the past" story is wrong. This is totally different. One is giving access to unencrypted information that Apple had full access to. The other is building a system to strip away security features in order to break into an encrypted account. Very, very different. Second, the whole idea that better protecting its customers is nothing more than "a brand marketing strategy" is insulting. Shouldn't the US government want the American public to be protected from criminals, malicious hackers and attacks? The best way to do that is with encryption. The fact that consumers are demanding that they be safer is not an "Apple marketing strategy" -- it's Apple looking out for the best interests of its customers.

And I won't even dig deep into the fact that one of the big reasons why the public is clamoring for more protection these days is because the US government ran roughshod over the Constitution over the past few years to suck up all kinds of information it shouldn't have.

Later in the motion, the DOJ again argues that there's no "unreasonable burden" on Apple to hack its own customers. It trots out a similar line that was in the original application for the order, saying "what's the big deal -- we're just asking for software, and Apple makes software, so no burden."

While the Order in this case requires Apple to provide or employ
modified software, modifying an operating system -- which is
essentially writing software code in a discrete and limited manner --
is not an unreasonable burden for a company that writes software code
as part of its regular business. The simple fact of having to create
code that may not now exist in the exact form required does not an
undue burden make. In fact, providers of electronic communications
services and remote computing services are sometimes required to
write some amount of code in order to gather information in response
to subpoenas or other process. Additionally, assistance under the
All Writs Act has been compelled to provide something that did not
previously exist -- the decryption of the contents of devices seized
pursuant to a search warrant. In United States v. Fricosu..., a defendant's computer whose
contents were encrypted was seized, and the defendant was ordered
pursuant to the All Writs Act to assist the government in producing a
copy of the contents of the computer. Here, the type of assistance does not even require Apple to assist in producing the
contents; the assistance is rather to facilitate the
FBI's attempts to test passcodes.

Again, this is both ridiculous and extremely misleading. Creating brand new software -- a brand new firmware/operating system -- is fraught with challenging questions and potential security issues. It's not just something someone whips off. If done incorrectly, it could even brick the device entirely, and can you imagine how the FBI would react then? This is something that would require a lot of engineering and a lot of testing -- and still might create additional problems, because software is funny that way. Saying "you guys write software, so writing a whole new bit of software isn't a burden" is profoundly ignorant of the technological issues. Update: If you want a long and detailed post from someone who absolutely knows how iPhone forensics works, and how incredibly involved creating this software would be, go read this blog post right now. In it, Jonathan Zdziarski notes that the DOJ is flat out lying in the way it describes what it's asking Apple to do, that the work would be incredibly involved, and that it would create all sorts of risks of the code getting out.

Second, the Fricosu case is quite different. That was compelling someone to give up their own encryption key -- something that not all courts agree with by the way, as some view it as a 5th Amendment or 1st Amendment violation. That's quite different than "write a whole new software thing that works perfectly the way we want it to."

As noted above, Apple designs and implements all of the features
discussed, writes and signs the operating system software, routinely
patches security or functionality issues in its operating system, and
releases new versions of its operating system to address issues. By
comparison, writing a program that turns off features
that Apple was responsible for writing to begin with would not be
unduly burdensome.

This shows a profound technological ignorance. Yes, Apple updates its operating system all the time, but yanking out security features is a very different issue, and could have much wider impact. It might not, but to simply assume that it's easy misunderstands how software and interdependencies work. Again, the DOJ just pretends it's easy, as if Apple can just check some boxes that say "turn off these features." That's not how it works.

Moreover, contrary to Apple's recent public statement that the
assistance ordered by the Court "could be used over and over again,
on any number of devices" and that "[t]he government is asking Apple
to hack our own users," the Order is tailored for and limited to this
particular phone. And the Order will facilitate only the FBI's
efforts to search the phone; it does not require Apple to conduct the
search or access any content on the phone. Nor is compliance with
the Order a threat to other users of Apple products. Apple may
maintain custody of the software, destroy it after its purpose under
the Order has been served, refuse to disseminate it outside of Apple,
and make clear to the world that it does not apply to other devices
or users without lawful court orders. As such, compliance with the
Order presents no danger for any other phone and is not "the
equivalent of a master key, capable of opening hundreds of millions
of locks."

We discussed some of this above, but the issue is not the specific code that Apple will be forced to write, but rather the very fact that it will be (contrary to the DOJ's claim) forced to hack its own phones to eliminate key security features, in order to allow the FBI to get around the security of the phone and access encrypted content. If the court can order it for this phone, then yes, it can order it for any iPhone, and that's the key concern. Furthermore, having Apple tinker with the software can introduce security vulnerabilities -- and already this discussion has revealed a lot about how hackers might now attack the iPhone. I'm all for full disclosure of how systems work, so that's okay. But the real issue is what happens next. If Apple looks to close this "loophole" in its security in the next iPhone update, will the court then use the All Writs Act to stop it from doing so? That's the bigger issue here, and one the DOJ completely pretends doesn't exist.

To the extent that Apple claims that the Order is unreasonably
burdensome because it undermines Apple's marketing strategies or
because it fears criticism for providing lawful access to the
government, these concerns do not establish an undue burden. The
principle that "private citizens have a duty to provide assistance to
law enforcement officials when it is required is by no means foreign
to our traditions."

Again, this is a made-up talking point. Protecting user privacy, as customers demand, is not a "marketing strategy." It's a safety and security strategy. You'd think, of all agencies, the FBI would appreciate that.

Anyway, you can go through the entire 35-page filing yourself, but these were the key points, and almost all of them are misleading. It should be interesting to see Apple's response next week.

from the hidden-virtues dept

Despite a few daring experiments in the space, the dark net (or dark web, if you prefer) is generally seen as a dangerous, frightening place inhabited by terrorists, pornographers and general ne'er-do-wells. That makes a report in The Guardian about drug dealers moving online unusual, because it shows how the dark net can also be beneficial to society:

Research into internet drug markets by the European Monitoring Centre for Drugs and Drug Addiction (EMCDDA) suggested the self-regulation of online markets such as Silk Road provide a safer environment for users and dealers of illicit substances.

Feedback mechanisms similar to eBay mean customers are able to hold dealers to account for the service they provide, the report said, while remote access to the market almost eliminates the risk of violence that has long been an integral part of the black economy.

Moving online not only safeguards drug users from violence and theft when they buy drugs in the physical world, it provides a natural way for customers to provide feedback on the quality of the drugs provided. Just as with traditional e-commerce companies, drug dealers who go digital can no longer risk bad customer reviews by providing inferior or dangerous products, since their future sales are likely to suffer. As a result:

Drugs available through darknet markets tend to be of a higher purity than those available on the streets.
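The eBay-style feedback loop the EMCDDA report describes can be sketched as a toy model. This is purely illustrative -- the class and function names are made up, and no real marketplace works exactly like this:

```python
# Toy sketch of the reputation mechanism the report describes: buyers
# rate sellers, and poorly rated sellers sink in the listings, so
# selling inferior or dangerous product directly costs future sales.
# Entirely illustrative; all names here are hypothetical.

from statistics import mean

class Seller:
    def __init__(self, name: str):
        self.name = name
        self.ratings: list[int] = []  # 1-5 stars from past buyers

    def rate(self, stars: int) -> None:
        self.ratings.append(stars)

    def score(self) -> float:
        """Average rating; new sellers start at a neutral 3.0."""
        return mean(self.ratings) if self.ratings else 3.0

def ranked(sellers: list[Seller]) -> list[str]:
    """Buyers see the best-rated sellers first."""
    return [s.name for s in sorted(sellers, key=Seller.score, reverse=True)]
```

The point of the model is the incentive it creates: because `ranked` determines who gets seen (and therefore who gets sales), a seller's rational move is to keep quality high -- the dynamic the report credits for the higher purity of darknet-market drugs.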

The new report comes from the European Monitoring Centre for Drugs and Drug Addiction, which is funded by the European Union, and, as usual, is accompanied by an official comment from the relevant EU commissioner. Unfortunately, Dimitris Avramopoulos, the European Commissioner for Migration, Home Affairs and Citizenship, trots out the usual unthinking reaction to drug sales that has made the long-running and totally futile "war on drugs" one of the most destructive and counterproductive policies ever devised:

We should stop the abuse of the internet by those wanting to turn it into a drug market. Technology is offering fresh opportunities for law enforcement to tackle online drug markets and reduce threats to public health. Let us seize these opportunities to attack the problem head-on and reduce drug supply online.

That blinkered attitude ignores the important advantages that moving drug sales from the physical world to the digital one brings, not just for users and dealers, but also for society as a whole, which does not have to deal with the social and economic consequences of violence on the streets, or with the long-term damage caused by poor-quality products. Along the way, his remarks inevitably and unhelpfully reinforce the view that the dark net is evil, and thus is something to be destroyed.

The latest case in point: an internal Homeland Security Inspector General investigation of the TSA revealing massive failure on the part of the TSA across dozens of the country's busiest airports. According to a copy of the findings, undercover investigators were able to smuggle mock explosives or banned weapons through security checkpoints -- 95% of the time:

"According to officials briefed on the results of a recent Homeland Security Inspector General’s report, TSA agents failed 67 out of 70 tests, with Red Team members repeatedly able to get potential weapons through checkpoints. In one test an undercover agent was stopped after setting off an alarm at a magnetometer, but TSA screeners failed to detect a fake explosive device that was taped to his back during a follow-on pat down. Officials would not divulge the exact time period of the testing other than to say it concluded recently."

That's not just a little dysfunction, that's a wholesale systemic breakdown, and it once again raises the question of what precisely taxpayers are paying for. And while the TSA consistently insists it's making improvements every day, this is effectively the same thing that happened last year, when a "red team" member was able to smuggle a mock bomb onto an airplane. This latest survey also piggybacks on earlier reports indicating that no matter how much money we throw at the TSA, it's still awful at doing its job, whether that's passenger or luggage screening:

"That review found “vulnerabilities” throughout the system, attributing them to human error and technological failures, according to a three-paragraph summary of the review released in September. In addition, the review determined that despite spending $540 million for checked baggage screening equipment and another $11 million for training since a previous review in 2009, the TSA failed to make any noticeable improvements in that time."

Great job, team! Of course, whenever you find this level of dysfunction you can usually find a dodgy money trail -- since, just like nation destroying and rebuilding, presenting the illusion of security is extremely profitable. That likely won't be getting better anytime soon, with a lobbyist for Rapiscan Systems (the company that provides the controversial X-ray scanners used in most major airports) taking a job with the House Appropriations Committee's Homeland Security Subcommittee, which oversees the TSA budget. We've been talking about Rapiscan's overly-cozy relationship with government for five years now.

Realizing how broken the TSA truly is can easily lead one down the path of worrying that there's a much thinner line between you and being wind-dispersed fertilizer than most would care to admit. But as Robert Graham points out, the fact that the TSA is horrible at its job -- combined with the fact that planes aren't exploding -- statistically suggests that bombs are a much smaller threat to airplanes than we've historically assumed:

"This leads to a counter-intuitive result that the TSA is really not stopping any danger. If guns and bombs are getting on planes, but planes aren't falling out of the sky, it must mean that they aren't a danger. I point this out because in the end, safety is an emotional thing rather than logical. No matter how much I do the math, people do not believe it. They believe bombs are a danger to airplanes in much the same way many don't believe the safety statistics compared to driving."
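Graham's counter-intuitive point can be made concrete with some toy arithmetic. Only the miss rate (67 of 70 tests failed) comes from the Inspector General findings; the attempt counts below are made-up assumptions for illustration:

```python
# Toy illustration of Robert Graham's argument: given the red-team
# miss rate, how many bombs would we expect to reach planes for a
# hypothetical number of real smuggling attempts per year? Only the
# ~95% miss rate (67/70) comes from the report; attempt counts are
# invented for illustration.

MISS_RATE = 67 / 70  # fraction of weapons/bombs that get through

def expected_bombs_through(attempts_per_year: int) -> float:
    """Expected number of bombs reaching planes per year."""
    return attempts_per_year * MISS_RATE

# If even 100 real attempts happened a year, ~96 bombs would reach
# planes -- yet planes aren't exploding. An observed rate near zero
# implies the true number of attempts is near zero, meaning
# checkpoint screening isn't what's keeping planes safe.
for attempts in (100, 10, 1):
    print(f"{attempts} attempts -> ~{expected_bombs_through(attempts):.0f} get through")
```

The logic is a simple contrapositive: a leaky filter plus no downstream disasters means almost nothing dangerous is entering the filter in the first place.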

Note that's not to say we don't need the hidden and not-so-hidden security apparatus that exists outside of this costly new game of make believe. But these latest findings continue to prop up the argument that the TSA is a glorified high school theater troupe playing dress up, doing little more than adding a very expensive layer of dysfunction to the already frequently-dismal air travel experience.

from the after-denying-it-all dept

There are lots of apps out there for parents spying on their kids' computer/smartphone activities -- with the marketing pitch often being about how this will help "keep them safe" or some other such thing. mSpy is one of those companies, advertising right on its front page how its snooping software can "keep children safe and employees efficient." It leaves out the bit about making both distrustful, but that's another debate for another day. Brian Krebs recently revealed that a "huge trove of data" had been leaked from mSpy and was being shared around the dark web. And it exposed not just customer names but "countless emails, text messages, payment and location data" of those children and employees the company was supposedly making "safe" and "efficient."

“There is no data of 400,000 of our customers on the web,” a spokeswoman for the company told the BBC. “We believe to have become a victim of a predatory attack, aimed to take advantage of our estimated commercial achievements.”

Now, however, the company has admitted it: "Much to our regret, we must inform you that data leakage has actually taken place," spokeswoman Amelie Ross told BBC News.

"However, the scope and format of the aforesaid information is way too exaggerated."

She said that 80,000 customers had been affected. Initial reports suggested up to 400,000 customer details had been exposed.

"Naturally, we have communicated with our customers whose data could have been stolen, and described them a situation. We put in place all the necessary remedial measures and continue to work on mechanism of data encryption," she added.

We'll see. If history is any guide, the hack may be even worse. In almost every story of a big hack into corporate computer systems, the initial estimate on the number of accounts impacted is too low, and adjusted upward at a later date.

Either way, it appears that in the process of trying to make children "safe" -- the company may have ended up doing the exact opposite.

from the a-bit-late-now? dept

Back in March, we reported on a campaign in Japan seeking to raise awareness about the extreme copyright provisions in TPP. Of course, making copyright even more unbalanced is just one of many problems with TPP, and arguably not even the worst. Now activists in the country have launched a much broader attack on the whole agreement by filing a lawsuit against the Japanese government in an attempt to halt its involvement in the talks. As Mainichi reports:

A total of 1,063 plaintiffs, including eight lawmakers, claimed in the case brought to the Tokyo District Court that the Trans-Pacific Partnership pact would undermine their basic human rights such as the right to live and know that are guaranteed under the Constitution.

The envisaged pact would not only benefit big corporations but jeopardize the country's food safety and medical systems and destroy the domestic farm sector, according to their written complaint.

As well as oft-voiced concerns that Japan's key agricultural sector would be harmed, the plaintiffs are also worried that TPP will push up drug prices -- something that is a big issue for other nations participating in the negotiations. The new group rightly points out that corporate sovereignty jeopardizes the independence of Japan's judicial system, and said that the secrecy surrounding the talks:

violates the people's right to know as the document is confidential and the negotiating process will be kept undisclosed for four years after the agreement takes effect.

Although it is hard to judge how much of a threat this move represents to Japan's continuing participation in TPP, the legal firepower behind it is certainly impressive: according to the Mainichi story, there are 157 people on the legal team. At the very least, it shows that resistance to TPP and its one-sided proposals is growing -- and not just in the US. But you can't help thinking it would have been a good idea for concerned Japanese citizens to have made this move earlier, rather than leaving it to the eleventh hour, with TPP close to the finishing line.