from the nice-one,-idiots dept

The Australian Parliament has passed a law ordaining compelled access to encrypted devices and communications. The legislation was floated months ago and opened up for comment, but it appears the Australian government has ignored the numerous complaints that such a law would violate civil liberties and otherwise be an all-around bad idea. But that's OK. It's completely justified, according to the Prime Minister.

Scott Morrison, Australia’s prime minister, told local radio on Thursday that encryption laws were necessary to target Islamist terrorism, paedophile networks and organised crime. “These laws are used to catch the scum that try to bring our country down and we can’t give them a leave pass,” he said.

Sure, and if innocent people find their communications compromised by government-mandated holes, so be it. The law was rushed through Parliament in a late evening session since every moment wasted was just one more leave pass for scum. Legislators promise to review the law in 18 months to ensure it hasn't been abused or created more problems than it's solved, but let's be honest here: how often does legislation like this get clawed back after a periodic review? It's never happened in the history of the laws governing our surveillance programs, even after leaked docs exposed unconstitutional practices and widespread abuse of surveillance authorities.

Here's a short summary of the new powers the legislation hands over to law enforcement and national security agencies:

The law enables Australia’s attorney-general to order the likes of Apple, Facebook, and Whatsapp to build capability, such as software code, which enables police to access a particular device or service.

Companies may also have to provide the design specifications of their technology to police, facilitate access to a device or service, help authorities develop their own capabilities and conceal the fact that an agency has undertaken a covert operation.

This law will go into effect before the end of the year. How it will go into effect is anyone's guess. The law provides for compelled access -- including the creation of new code -- but no one seems to have any idea what this will look like in practice. The new backdoors-in-everything-but-name will be put in place by developers/manufacturers at the drop of a court order, with the onus on the smart people in the tech business to iron out all of the problems.

The law only prevents the government from demanding that "systemic weaknesses" be built into devices or programs. Everything else is left to the imagination, including the actual process of introducing code changes in multi-user platforms or targeted devices.

An actual software developer, Alfie John, has put together a splendid Twitter thread pointing out the flaws in the government's assumptions about software development. Since the compelled participants are forbidden from discussing surveillance court orders with anyone (which would include coworkers, supervisors, the general public, etc.), these requested alterations would have to be implemented in secret. The problem is coding changes go through a number of hands before they go live. Either everyone involved would need to be sworn to secrecy (which also means being threatened with jail time) or the process falls apart. Changes ordered by a court could be rejected by those higher up on the chain. Worse, the planned encryption hole could see the compelled coder being viewed as a data thief or foreign operative or whatever.
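To see why "just slip the change in quietly" collides with how software actually ships, consider the review gate most production code has to clear. The sketch below is hypothetical Python, with made-up names and policy values rather than any real company's pipeline, but the structural problem is the same everywhere: a change only merges after people who haven't been served with a secret order have read it.

```python
# Hypothetical sketch of a standard code-review gate. Names and policy values
# are illustrative; no real company's pipeline is being described.

from dataclasses import dataclass, field

@dataclass
class ChangeRequest:
    author: str
    description: str                       # visible to every reviewer
    approvals: list = field(default_factory=list)

REQUIRED_APPROVALS = 2                     # a typical branch-protection rule

def can_merge(change: ChangeRequest) -> bool:
    """A change ships only after enough people other than the author sign off."""
    return len(set(change.approvals) - {change.author}) >= REQUIRED_APPROVALS

# A compelled interception hook still has to get past reviewers who haven't
# been read into the order -- or the gag order has to swallow them too.
covert = ChangeRequest(author="compelled_dev",
                       description="adds an undocumented logging hook")
print(can_merge(covert))                   # False until colleagues approve it
```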

Law enforcement is going to have to make everyone involved in the product/device complicit and covered under the same prison threat for this to work. The more people it's exposed to, the higher the chance of leakage. And if the code will break other code -- or the request simply can't be met due to any number of concerns -- the government may ask the court to hold the company and its personnel in contempt for their failure to achieve the impossible.

To make matters worse, the company targeted with a compelled access request may be monitored for leaks before and after the request is submitted, putting employees under surveillance simply because of their profession.

In some cases, the only weakness that can be introduced will be systemic, which will run contrary to the law. How will the government handle this inevitable eventuality? Will it respect the law or will it simply redefine the term to codify its unlawful actions?

Even if all of this somehow works flawlessly, users of devices and communications platforms will be put at risk. Sure, the compelled access might be targeted, but it will teach users to distrust software/firmware updates that may actually keep them safer. The government may even encourage the forging of credentials or security certificates to ensure its compelled exploits reach their targets. And just because these backdoors theoretically only allow one government agent in at a time, that doesn't mean they aren't backdoors. They may be slightly more difficult for malicious actors to exploit, but once the trust is shattered by compelled access, other attack vectors will present themselves.
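The trust being gambled with here is the signature check a device runs before it installs an update. Here's a minimal sketch of that check -- it leans on the third-party cryptography package, and the key names are hypothetical -- to show that the whole scheme hangs on which signing keys a device is told to accept. Compel or forge one more acceptable key and "verified update" stops meaning anything.

```python
# Minimal sketch of update-signature verification, using the third-party
# "cryptography" package. Key names are hypothetical; real update pipelines
# add certificate chains, rollback protection, and more.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()          # held by the manufacturer
update = b"firmware v2.1: fixes lock-screen bypass"
signature = vendor_key.sign(update)

def device_will_install(blob, sig, trusted_pubkeys):
    """A device installs only what one of its trusted keys actually signed."""
    for pub in trusted_pubkeys:
        try:
            pub.verify(sig, blob)
            return True
        except InvalidSignature:
            continue
    return False

# The protection is only as strong as the list of trusted keys. A compelled or
# forged entry in that list turns the update channel into a delivery mechanism
# for whatever the order demands.
print(device_will_install(update, signature, [vendor_key.public_key()]))  # True
```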

It's a terrible law justified by the spoken equivalent of a bumper sticker. And it's going to end up doing serious damage -- not just in Australia, but all over the world. Bad legislation spreads like a communicable disease. If one democracy says this is acceptable, other free-world leaders will use its passage as a permission slip for encryption-targeting mandates of their own.

from the wiretaps-but-for-Whatsapp dept

Are we "going dark"? The FBI certainly seems to believe so, although its estimate of the size of the problem was based on extremely inflated numbers. Other government agencies haven't expressed nearly as much concern, even as default encryption has spread to cover devices and communications platforms.

There are solutions out there, if it is as much of a problem as certain people believe. (It really isn't… at least not yet.) But most of these solutions ignore workarounds like accessing cloud storage or consensual searches in favor of demanding across-the-board weakening/breaking of encryption.

A few more suggestions have surfaced over at Lawfare. The caveat is that both authors, Ian Levy and Crispin Robinson, work for GCHQ. So that should give you some idea of which stakeholders are being represented in this addition to the encryption debate.

The idea (there's really only one presented here) isn't as horrible as others suggested by law enforcement and intelligence officials. But that doesn't mean it's a good one. And there's simply no way to plunge into this without addressing an assertion made without supporting evidence towards the beginning of this Lawfare piece.

Any functioning democracy will ensure that its law enforcement and intelligence methods are overseen independently, and that the public can be assured that any intrusions into people’s lives are necessary and proportionate.

The same can be said for the "functioning democracy" on this side of the pond. Leaked documents and court orders have shown the NSA frequently ignores its oversight when not actively hiding information from Congress, the Inspector General, and the FISA court. Oversight of our nation's law enforcement agencies is a patchwork of dysfunction, starting with friendly magistrates who care little about warrant affidavit contents and ending with various police oversight groups that are either filled with cops or cut out of the process by the agencies they nominally oversee. We can't even get a grip on routine misconduct, much less ensure "necessary and proportionate intrusions into people's lives."

According to the two GCHQ reps, there's a simple solution to eavesdropping on encrypted communications. All tech companies have to do is keep targets from knowing their communications are no longer secure.

In a world of encrypted services, a potential solution could be to go back a few decades. It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved - they’re usually involved in introducing the parties to a chat or call. You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication. This sort of solution seems to be no more intrusive than the virtual crocodile clips that our democratically elected representatives and judiciary authorise today in traditional voice intercept solutions and certainly doesn’t give any government power they shouldn’t have.

We’re not talking about weakening encryption or defeating the end-to-end nature of the service. In a solution like this, we’re normally talking about suppressing a notification on a target’s device, and only on the device of the target and possibly those they communicate with. That’s a very different proposition to discuss and you don’t even have to touch the encryption.

Suppressing notifications might be less harmful than key escrow or backdoors. It wouldn't require a restructuring of the underlying platform or its encryption. If everything is in place -- warrants, probable cause, exhaustion of less intrusive methods -- it could give law enforcement a chance to play man-in-the-middle with targeted communications.
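To make the "extra end" concrete, here's a deliberately simplified Python sketch of a provider-run group chat. The key-wrapping and transport details are reduced to stand-ins (this is not any real messenger's protocol), but the structural point survives the simplification: whoever controls the membership list controls who can read, without touching the encryption at all.

```python
# Minimal mock of a provider-run group chat. A real system wraps a group key
# to each member's public key; here that step is a dictionary entry so the
# structural point stands out. Purely illustrative, not any real protocol.

import secrets

class ChatServer:
    """The provider: it runs the identity system and the membership list."""

    def __init__(self):
        self.members = {}                      # name -> group key "wrapped" for them

    def create_group(self, names):
        group_key = secrets.token_hex(16)      # stand-in for a real session key
        for name in names:
            self.members[name] = group_key     # stand-in for wrapping to a pubkey

    def ghost_add(self, name):
        """The 'extra end': silently hand the same key to one more participant."""
        group_key = next(iter(self.members.values()))
        self.members[name] = group_key         # no notification goes to anyone

server = ChatServer()
server.create_group(["alice", "bob"])
server.ghost_add("intercept_endpoint")         # still "end-to-end encrypted"

# Every message alice and bob can read, the ghost can read too -- no cipher
# weakened, no key escrowed, just one more name on the provider's list.
print(sorted(server.members))                  # ['alice', 'bob', 'intercept_endpoint']
```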

But there's a downside -- one that isn't referenced in the Lawfare post. If both ends of a conversation are targeted, this may be workable. But what if one of the participants isn't a target? They're left unprotected, because the warning would have to be suppressed on their device as well. Obviously it wouldn't do to let anyone the target converses with know that things are no longer normal on the target's end, since one of those participants would likely tell the target they'd seen a security warning while talking to them.
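This is also why the proposal inevitably touches the clients of people who aren't targets. Messengers that offer key verification (Signal's safety numbers, WhatsApp's security codes) warn users when the set of keys in a conversation changes. A rough sketch of that client-side check -- illustrative only, not either app's actual code -- shows the bind: either the non-target's warning fires and tips everyone off, or it gets silenced too.

```python
# Rough sketch of the client-side check a ghost-user scheme would have to
# silence. Illustrative only; not how any particular messenger is written.

def key_change_warning(pinned_keys, current_keys):
    """Clients pin the keys they last verified and warn when the set changes."""
    added = set(current_keys) - set(pinned_keys)
    if added:
        return "Safety number changed: new keys " + ", ".join(sorted(added))
    return None

# Bob isn't a target, but his client sees the extra key too. Either his
# warning fires (tipping off the whole group) or it has to be suppressed on
# his device as well, leaving a non-target silently unprotected.
pinned = {"alice_key", "bob_key"}
current = {"alice_key", "bob_key", "ghost_key"}
print(key_change_warning(pinned, current))
```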

In that respect, it is analogous to a wiretap on someone's phones. It will capture innocent conversations irrelevant to the investigation. In those cases, investigators are told to stop eavesdropping. It's unclear how the same practice will work when the communications are being harvested digitally via unseen government additions to private conversations.

This proposal seems at odds with the authors' suggested limitations, especially this one:

Any exceptional access solution should not fundamentally change the trust relationship between a service provider and its users.

When a service provider starts suppressing warning messages, the trust relationship is going to be fundamentally altered. Even if users are made aware this is only happening in rare instances involving targets of investigations, the fact that their platform provider has chosen to mute these messages means they really can't trust a lack of warnings to mean everything is still secure.

On the whole, it's a more restrained solution than others have proposed -- but it still has the built-in exploitation avenue key escrow does. It's better than a backdoor, but not by much. And the authors of this proposal shouldn't pretend the solution lives up to the expectations they set for it. Their own proposal falls short of their listed ideals… and the whole thing is delivered under the false pretense that law enforcement and intelligence agencies are subject to robust oversight.

from the an-actual-thing-that-happened dept

The FBI still hasn't updated its bogus "uncrackable phones" total yet, but that isn't stopping the DOJ from continuing its push for holes in encryption. Deputy AG Rod Rosenstein visited Georgetown University to give a keynote speech at its Cybercrime 2020 Conference. In it, Rosenstein again expressed his belief that tech companies are to blame for the exaggerated woes of law enforcement.

Pedophiles teach each other how to evade detection on darknet message boards. Gangs plan murders using social media apps. And extortionists deliver their demands via email. So, it is important for those of us in law enforcement to raise the alarm and put the public on notice about technological barriers to obtaining electronic evidence.

One example of such a barrier is “warrant-proof” encryption, where tech companies design their products or services in such a way that they claim it is impossible for them to assist in the execution of a court-authorized warrant. These barriers are having a dramatic impact on our cases, to the significant detriment of public safety. Technology makers share a duty to comply with the law and to support public safety, not just user privacy.

Rosenstein says this has resulted in a "significant detriment [to] public safety," but can't point to any data or evidence to back that claim up. The FBI's count of devices it can't access is off by at least a few thousand devices, by most people's estimates. In terms of this number alone, the "public safety" problem is, at best, only half as bad as the DOJ has led us to believe.

Going beyond that, crime rates remain at historic lows in most places in the country, strongly suggesting no crime wave has been touched off by the advent of default encryption. Law enforcement agencies aren't complaining about cases they haven't cleared -- if you exclude encryption alarmist/Manhattan DA Cyrus Vance. (Anyone hoping to have an honest conversation about encryption certainly should.)

Somehow, Rosenstein believes the public would experience a net safety gain by making their devices and personal info more easily accessed by criminals. Holes in encryption can be marked "law enforcement only," much like private property owners can hang "no trespassing" signs. But neither is actually a deterrent to determined criminals.

Rosenstein goes on to tout "responsible encryption" -- a fairy tale he created that revolves around the premise tech companies can break/unbreak encryption at the drop of a warrant. But broken encryption can't be unbroken, not even with some form of key escrow. The attack vector may change, but it still exists.
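For what it's worth, "responsible encryption" in Rosenstein's telling generally boils down to some flavor of key escrow: the content key gets wrapped once for the recipient and once more for a copy the provider or the government holds onto. Here's a toy model -- the asymmetric wrapping is reduced to a dictionary, so treat it as a diagram in code rather than an implementation -- showing why that second copy is a standing attack surface, not something that switches off when the warrant expires.

```python
# Toy model of key escrow. "Wrapping a key to someone" is reduced to a dict
# entry; the point is the permanent extra copy, not the cryptography.

import secrets

def encrypt_message(recipient, escrow_holder):
    content_key = secrets.token_hex(16)    # stand-in for a real message key
    return {
        recipient: content_key,            # the normal end-to-end copy
        escrow_holder: content_key,        # the "responsible" copy, kept forever
    }

msg_keys = encrypt_message("user", "escrow_db")

# Whoever eventually compromises the escrow database -- insider, hacker,
# foreign service -- gets the same access the warrant was supposed to gate.
# The attack vector moves; it doesn't go away.
print("escrow_db" in msg_keys)             # True, for every message, indefinitely
```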

That Rosenstein is advocating inferior encryption during a cybercrime conference speaks volumes about what the DOJ actually considers to be worth protecting. It's not businesses and their customers. It's law enforcement's access. He spends half the run time talking about security breaches involving tech companies and follows it up by suggesting they take less care securing all this info they collect.

He even goes so far as to claim better security is something customers don't want and is bad for tech companies' bottom lines.

Building secure devices requires additional testing and validation—which slows production times — and costs more money. Moreover, enhanced security can sometimes result in less user-friendly products. It is inconvenient to type your zip code when you use a credit card at the gas station, or type a password into your smartphone.

Creating more secure devices risks building a product that will be later to market, costlier, and harder to use. That is a fundamental misalignment of economic incentives and security.

The implicit statement Rosenstein's making is that ramped-up security -- including default encryption -- is nothing more than companies screwing shareholders just so they can stick it to The Man. Following this bizarre line of thought is to buy into Rosenstein's conspiracy theory: one that views tech companies as a powerful cabal capable of rendering US law enforcement impotent.

And as much as Rosenstein hammers tech companies for security breaches that have exposed the wealth of personal data they collect, he ignores the question his encryption backdoor/side door advocacy raises. This question was posed in an excellent post by Cathy Gellis at the beginning of this year:

"What is a company to do if it suffers a data breach and the only thing compromised is the encryption key it was holding onto?"

We're headed into 2019 and no one in the DOJ or FBI is willing to honestly discuss the side effects of their proposals. Rosenstein clings to his "responsible encryption" myth and the director of the FBI wants to do nothing more than make it the problem of "smart people" at tech companies he's seeking to bend to his will. No one in the government wants to take responsibility for the adverse outcomes of weakened encryption, but they're more than willing to blame everyone else any time their access to evidence seems threatened.

Rosenstein's unwavering stance on the issue makes this statement, made at the closing of his remarks, ring super-hollow.

We should not let ideology or dogma stand in the way of constructive academic engagement.

from the picking-up-the-torch-the-FBI-accidentally-dropped dept

Because no one has passed legislation (federal or state) mandating encryption backdoors, Manhattan DA Cy Vance has to publish another anti-encryption report. An annual tradition dating back to 2014 -- the year Apple announced default encryption for devices -- the DA's "Smartphone Encryption and Public Safety" report [PDF] is full of the same old arguments about "lawful access" and evidence-free assertions about criminals winning the tech arms race. (h/t Riana Pfefferkorn)

You'd think there would be some scaling back on the alarmism, what with the FBI finally admitting its locked device count had been the victim of software-based hyperinflation. (Five months later, we're still waiting for the FBI to update its number of locked devices.) But there isn't. Vance still presents encryption as an insurmountable problem, relying mainly on Apple's repeated patching of security holes cops also found useful as his leading indicator.

The report is a little shorter this year, but it contains just enough to persuade those easily swayed by emotional appeals. Vance runs through a short list of awful crimes solved by device access (child porn, assault) and another list of crimes unsolved (molestation, murder), designed to make people's hearts do all their thinking. While it's certainly true some horrible criminal acts will directly implicate device encryption, the fact of the matter is that most of the crimes involving locked phones are the type that won't make headlines or motivate lawmakers.

More than a third of these cases involve minor crimes like theft and check kiting. Another 20% consists of "sex crimes," which encompasses prostitution -- a crime where law enforcement sometimes chooses to believe the device itself is an "instrument of crime," never mind what other evidence might be hidden inside it.

So, more than half the crime involving locked phones isn't the sort of stuff that suggests encryption backdoors are the key to making New York City a safer place to reside. The stuff Vance throws in about unlocked devices producing exonerating evidence is a dodge. It's meant to show how granting law enforcement carte blanche access would be a net benefit for the public. But the examples given use stuff like cell site location info and social media app data -- things that could be obtained from third parties without having to go through the locked phone.

Then there's the other part of this argument Vance leaves completely undiscussed: if someone's phone contains exonerating evidence, it's very likely they'll provide officers with this evidence voluntarily, either by unlocking the device or handing over the relevant info/files. Using the very small percentage of cases where exonerating evidence may be recovered from locked phones as an argument for mandated backdoors is incredibly disingenuous.

And that's all this "report" is: a petition for federally-legislated encryption backdoors.

III. Federal Legislation Remains the Only Answer

[...]

For the reasons advanced in each of our prior Reports, national legislation of the sort we have proposed remains the most rational and least intrusive means to require device manufacturers to comply with lawful court orders in serious criminal cases upon a finding of probable cause.

"Most rational and least intrusive." I guess creating new security holes in millions of personal devices isn't "intrusive." And if this wasn't enough of a laugher, Vance ends his report with this sentence:

[O]ur Office stands willing to assist Congress and all relevant stakeholders in the effort to find a more rational balance among the interests of device makers, consumers and law enforcement in the regulation of smartphone encryption.

When your conclusion is that the only solution is federally-mandated encryption backdoors, you cannot honestly assert you're seeking to "balance" the interests of everyone involved. The only interest served by mandated backdoors is law enforcement's. Portraying device encryption as a threat to public safety is intellectually dishonest. Vance's own numbers undercut his threat level claims and his repeated failure to even generate serious discussion among federal legislators shows it's probably time for the Manhattan DA to retire his annual alarmism.

Multiple sources familiar with the GrayKey tech tell Forbes the device can no longer break the passcodes of any iPhone running iOS 12 or above. On those devices, GrayKey can only do what’s called a “partial extraction,” sources from the forensic community said. That means police using the tool can only draw out unencrypted files and some metadata, such as file sizes and folder structures.

Some in law enforcement may view this as confirmation of their "going dark" complaints and claim that Apple cares more about its customers than it does about fighting crime. As if that were a bad thing. Apple should care more about what its customers want and need than about government access to locked devices. A security hole is a hole that can be used by everyone who can exploit it. There's no way to prevent a flaw from being exploited by criminals even if law enforcement agencies find the exploit super-useful.

Grayshift's products are still somewhat useful, but it's going to be hard to justify a premium price for a stunted service. This new development might be Grayshift's fault. Soon after Apple announced one fix for an exploit used by Grayshift, the company bragged it could still crack phones just as easily. This appears to have prompted closer examination of the problem Apple thought it fixed with the first round of patching. The second pass has blunted the exploit's usefulness, even if it hasn't made it completely impossible to access some data contained in locked devices.

Even with the fix in play, law enforcement complaints about "darkness" are overblown. There are other technical solutions available, along with a wealth of information stored by third parties and cloud services. The more technical solutions won't scale, but that's not really something law enforcement should complain about. Security protections for phone owners shouldn't be viewed as weapons deployed against law enforcement. Phone manufacturers have an obligation to their customers to protect their personal data, and encryption is just one of the tools deployed to keep customers' information out of the hands of others. That some of the "others" are cops and investigators is just a side effect of providing solid products and service.

This won't make government critics of Apple any happier, though. And its closing of security holes is just going to lead to more demands for anti-encryption laws. Very few legislators seem interested in mandating backdoors, so these complaints aren't gaining any traction. But government agencies like the FBI have endless time and infinite resources, so the calls for backdoors will never completely cease -- not as long as there's a chance a major tragedy might prompt reckless Congressional action.

Apple's protection of its users is great, but its sincerity should be questioned when it's willing to put Chinese users' data where the Chinese government can easily access it. If it wants to be a champion for its customers, it needs to protect all of them, not just the ones it's currently convenient to protect. When you've got to explain why you're "locking out" US law enforcement but letting a foreign government walk in the front door, you're doing customer service wrong.

from the nah,-we're-ramming-it-through-anyway dept

The battle against encryption is being waged around the world by numerous governments, no matter how often experts explain, often quite slowly, that it's a really bad idea. As Techdirt reported back in August, Australia is mounting its own attack against privacy and security in the form of a compelled access law. The pushback there has just taken an interesting turn with the formation of an Alliance for a Safe and Secure Internet:

The Alliance is campaigning for the Government to slow down, stop ignoring the concerns of technology experts, and listen to its citizens when they raise legitimate concerns. For a piece of legislation that could have such far ranging impacts, a proper and transparent dialogue is needed, and care taken to ensure it does not have the unintended consequence of making all Australians less safe.

The Alliance for a Safe and Secure Internet represents an unusually wide range of interests. It includes Amnesty International and the well-known local group Digital Rights Watch, the Communications Alliance, the main industry body for Australian telecoms, and DIGI, which counts Facebook, Google, Twitter and Yahoo among its members. One disturbing development since we last wrote about the proposed law is the following:

The draft Bill was made public in mid-August and, following a three week consultation process, a large number of submissions from concerned citizens and organisation were received by the Department of Home Affairs. Only a week after the consultation closed the Bill was rushed into Parliament with only very minor amendments, meaning that almost all the expert recommendations for changes to the Bill were ignored by Government.

…

The Bill has now been referred to the Parliamentary Joint Committee on Intelligence and Security (PJCIS), where again processes have been truncated, setting the stage for it to be passed into law within months.

That's a clear indication that the Australian government intends to ram this law through the legislative process as quickly as possible, and that it has little intention of taking any notice of what the experts say on the matter -- yet again.

from the all-the-best-jobs-are-the-inside-ones dept

The problem with giving law enforcement access to so many databases full of personal info and so many tech tools to fight crime is that, inevitably, these will be abused. This isn't a law enforcement problem, per se. It's a people problem. When the job demands the best people but still needs to maintain minimum staffing levels, things like this happen:

A French police officer has been charged and arrested last week for selling confidential data on the dark web in exchange for Bitcoin.

[...]

French authorities also say the officer advertised a service to track the location of mobile devices based on a supplied phone number. He advertised the system as a way to track spouses or members of competing criminal gangs. Investigators believe Haurus was using the French police resources designed with the intention to track criminals for this service.

He also advertised a service that told buyers if they were tracked by French police and what information officers had on them.

The officer's misconduct came to light after French police shut down a dark web market. That a cop was selling cop stuff to criminals on the dark web was inevitable. If it hadn't been this investigation, any of the dozens of others happening around the world would have uncovered a law enforcement officer doing bad things. Two of the federal agents involved in the Silk Road investigation ended up being charged with money laundering and wire fraud after they stole Bitcoin and issued fake subpoenas to further their own criminal ends.

Again, it's a people problem -- one that's present anywhere people are given power and access not present in most jobs. The problem is government agencies, in particular, tend not to hold their own employees accountable and work hard to thwart their oversight. The more brazen examples of government malfeasance are enabled by the dozens of smaller infractions that go unpunished until they're the subject of a lawsuit or government investigation.

More to the point, this is exactly why no government agency -- not to mention the private companies involved -- should be allowed to utilize encryption backdoors, as the EFF's Director of Cybersecurity, Eva Galperin, pointed out on Twitter. It's not just about the hundreds of malicious hackers who will see an inviting, new attack vector. It's that no one -- public or private sector -- can be completely trusted to never expose or misuse these avenues of access. And since that's a fact of life, sometimes the best solution is to remove the temptation.

from the senior-DOJ-counsel-sitting-in-darkened-office-with-'Behind-Blue-Eyes'-on dept

The DOJ is now 0-for-2 in encryption-breaking cases. It tried to get a judge to turn an All Writs Act order into a blank check for broken encryption in the San Bernardino shooting case. Apple pushed back. Hard. So hard the FBI finally turned to an outside vendor to crack the shooter's iPhone -- a vendor the FBI likely knew all along could provide this assistance. But the DOJ wanted the precedent more than it wanted the evidence it thought it would find on the phone. It bet it all on the Writ and lost.

Other opportunities have arisen, though. A case involving wiretapping MS-13 gang members resulted in the government seeking more compelled decryption, this time from Facebook. The FBI could intercept text messages sent through Messenger but was unable to eavesdrop on calls made through the application. Facebook claimed it didn't matter what the government wanted. It could not wiretap these calls for the government without significantly redesigning the program. The government thought making Messenger less secure for everyone was an acceptable solution, as long as it gave investigators access to calls involving suspected gang members.

The case has proceeded under seal, for the most part, so it's been difficult to determine exactly what solution the government was demanding, but it appears removal of encryption was the preferred solution, which would provide it with future wiretap access if needed. If this request was granted, the government could take its paperwork to other encrypted messaging programs to force them to weaken or destroy protections they offered to users.

U.S. investigators failed in a recent courtroom effort to force Facebook to wiretap voice calls over its Messenger app in a closely watched test case, according to two people briefed on the sealed ruling.

[...]

Arguments were heard in a sealed proceeding in a U.S. District Court in Fresno, California weeks before 16 suspected gang members were indicted there, but the judge ruled in Facebook’s favor, the sources said.

The DOJ was hoping to have Facebook held in contempt for failing to comply with the decryption order. It appears this isn't going to happen. The government may need to seek outside assistance to intercept Messenger calls. But that's a much more difficult prospect than breaking into an iPhone physically in the possession of the government and it would possibly involve vulnerabilities that might be patched out of existence by developers.

The government is presumably hunting down its next test case for encryption-breaking precedent. With the Supreme Court finding for the Fourth Amendment in two recent cellphone-related cases, the chance of obtaining favorable precedent continues to dwindle.

from the [not-pictured:-the-Federal-Bureau-of-Sucking-At-Counting-Phones] dept

We don't hear much from anyone other than FBI officials about the "going dark" theory. The DOJ pitches in from time to time, but it's the FBI's baby. And it's an ugly baby. Earlier this year, the FBI admitted it couldn't count physical devices. The software it used to track uncrackable devices spat out inflated numbers, possibly tripling the number of phones the FBI claimed stood between it and justice. FBI officials like James Comey and Chris Wray said "7,800." The real number -- should it ever be delivered -- is expected to be less than 2,000.

The FBI also hasn't been honest about its efforts to crack these supposedly-uncrackable phones. Internal communications showed the agency slow-walked its search for a solution to the San Bernardino shooter's locked iPhone, hoping instead for a precedential federal court decision forcing device manufacturers to break encryption whenever presented with a warrant.

The FBI appears to have ignored multiple vendors offering solutions for its overstated "going dark" problem. At this point, it's public knowledge that at least two vendors have the ability to crack any iPhone. Israel's Cellebrite -- the company presumed to have broken into the San Bernardino phone for the FBI -- is one of them. The other is GrayShift, which sells a device called GrayKey that lets law enforcement bypass built-in protections and brute-force passcodes.
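For a sense of why brute-forcing passcodes is attractive once the rate limiting is bypassed, here's a back-of-the-envelope calculation. The tries-per-second figures are assumptions made up for illustration; GrayShift doesn't publish GrayKey's actual throughput.

```python
# Back-of-the-envelope passcode search times. Rates are illustrative guesses;
# the whole point of the Secure Enclave's throttling is to keep the realistic
# figure near the slow end, and tools like GrayKey try to sidestep it.

def worst_case_days(keyspace, tries_per_second):
    return keyspace / tries_per_second / 86_400   # seconds per day

scenarios = [("4-digit PIN", 10**4),
             ("6-digit PIN", 10**6),
             ("10-char alphanumeric", 62**10)]

for label, keyspace in scenarios:
    for rate in (10, 1_000):                      # throttled vs. unthrottled guess
        print(f"{label:>22} at {rate:>5}/s: "
              f"{worst_case_days(keyspace, rate):,.1f} days worst case")
```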

We don't know how often the FBI avails itself of these services. A pile of locked phones numbering in the thousands (but which thousands?!) suggests it is allowing the perfect (favorable court rulings and/or legislation) to be the enemy of the serviceable (vendor services).

Other federal agencies aren't waiting around for the next horrifying terrorist attack to nudge Congress towards mandating encryption backdoors. They're spending tax dollars now to take advantage of vulnerabilities that may be patched out of existence in the near future, if they haven't been addressed already. Thomas Brewster of Forbes has spent some time sifting through government records to see who's buying and how much they're spending. The FBI isn't on the list. The DEA is. But the Daddy Warbucks of federal law enforcement agencies is none other than the one voted Most In Need Of Immediate Abolishment.

According to government contract records on FPDS.gov, ICE acquired the services of GrayShift earlier this month. And it’s spent more than any other government department on GrayShift tech, with a single order of $384,000. Other branches of the Trump government, from the Drug Enforcement Administration to the Food and Drug Administration, have splashed between $15,000 and $30,000 on different models of the GrayKey, which requires physical access to an Apple device before it can break through the passcode.

ICE wants everything on the menu. In addition to spending big on cellphone-cracking devices, the agency has also thrown money at forensic tools from Cellebrite, social media tracking software, "intercept software" from a Nebraska-based vendor, and "computer support equipment" from foreign companies (one of them Russian) known for their ability to extract data from encrypted messaging services.

It would seem the agency involved in investigating the widest variety of crimes would be joining ICE in its encryption-breaking spending spree. But there's no trace of FBI expenditures to be found in these records. It may be the FBI has exempted itself from reporting this information under the theory that naming dollar amounts and/or vendors would allow wily criminals to escape its grasp. If so, it seems unlikely this refusal has a legal basis. The DEA and ICE have both allowed these records to be published and both agencies routinely engage in investigations that theoretically could be compromised by making spending data public. (The key is "theoretically." In reality, it's unlikely publishing contract data has any noticeable effect on criminal behavior.)

Moving past the FBI, there's reason to be concerned ICE is making purchases like these. Given its main concern is the speedy removal of undocumented immigrants, this tech seems to be more of a "want" than a "need." Most of the cases ICE deals with don't need to involve cracked phones and forensic searches. But because it has the tools on hand, it will make sure it gets our money's worth.

from the shorter-Five-Eyes:-we-like-encryption-that-doesn't-work dept

The Five Eyes nations -- UK, US, Australia, Canada, and New Zealand -- still think there's a way to create encryption backdoors (that they studiously avoid calling backdoors) that will let the good people in and the bad people out.

The backlash against government calls for backdoors has made these demands a bit more subdued in most Five Eyes countries. The UK government really doesn't seem to care and uses every terrorist attack as another reason to prevent law-abiding citizens from using secure encryption for their communications. Other members have taken a more measured approach, talking around the subject while legislative inroads continue unabated.

In the US, the periodic "going dark" discussions have taken on a (no pun intended) darkly comical tone as FBI and DOJ officials continue to claim harder nerding will solve the "problem" they have misrepresented for years.

The countries may be taking different approaches to undermining encryption, but they're all still looking to do this in the future if they can just find a way to sell it to the public without the actual nerds speaking up and ruining all their plans. The Register notes the Five Eyes surveillance partnership has delivered another ultimatum (that it won't call an ultimatum) about encrypted communications following a meeting in Australia. But it is taking care to couch its wants and desires in pretty words about the safety and security of the general public.

In an official communiqué on the confab, they claim that their inability to lawfully access encrypted content risks undermining democratic justice systems – and issue a veiled warning to industry.

The group is careful to avoid previous criticisms about their desire for backdoors and so-called magic thinking – saying that they have "no interest or intention to weaken encryption mechanisms" – and emphasise the importance of privacy laws.

"Privacy laws must prevent arbitrary or unlawful interference, but privacy is not absolute," the document stated.

And there it is. The only thing Five Eyes considers "absolute" is its supposed "right" to access contents of devices and communications. First, the confab talks about "mutual" cooperation, as though the tech industry is being unnecessarily resistant to undermining protections it provides to users. Five Eyes may not have the strength of conviction to actually demand encryption backdoors, but the wording here indicates what it wants is pretty much just a backdoor.

Providers of information and communications technology and services – carriers, device manufacturers or over-the-top service providers – are subject to the law, which can include requirements to assist authorities to lawfully access data, including the content of communications. Safe and secure communities benefit citizens and the companies that operate within them.

This means key escrow or having encryption removed during transit so service providers can access contents of communications. Nothing about either plan makes users safer or less accessible to malicious parties not associated with the Five Eyes partnership.

The next section's headline makes it clear who's going to be answering to who:

Rule of law and due process are paramount

In other words, if you've got a warrant, I guess you're gonna come in. This appeal to authority says providers must subject themselves to pestering governments, even if it means harming their entire userbase just so the government can go after a few users. The nod to due process really means nothing, what with indefinite gag orders accompanying demands for communications and data, and an ongoing refusal by government agencies to discuss surveillance means and methods in open court. As long as parallel construction is still a thing, due process will never be given the respect it deserves.

So, Five Eyes may be trying to make it sound like its member countries agree encryption is a valuable protection for their citizens, but what the partnership really wants is for that protection to be weakened to the point that law enforcement -- and anyone else not governed by the rule of law -- can get past it at will. No one's saying "backdoor," but they're all thinking it very loudly.