Comments for "U.S. Government to Encrypt All Laptops" (a blog covering security and security technology)
Comment from Curt Sampson on 2008-05-10 (http://www.cynic.net)
Ironically enough, having FDE on your laptop is probably a good way to get it confiscated by US Customs when you're trying to come into the country.

I am a GOVie implementing an in-house answer to the whole-disk encryption requirement list. Bear in mind that my opinions posted here are not spoken on behalf of my employer :).

I appreciate your insight into the technical deficiencies of the requirements list.

I guess I should explain our intentions. It's been said that no solution offers 100% coverage. This is especially true where an adversary can gain physical access to a machine (as with a laptop's hard disk). What we're trying to do is minimize the risk.

I think you're running into the classic butting heads of policy versus reality. Policy states that our secure laptops are not to be carried in the same container as our CAC. Policy also states that our CAC PIN is not to be written down (let alone written down and taped onto our CAC).

Reality dictates that there are probably violators out there, true. The risk is minimized through policy, though. The intersection of people that carry their laptops and CACs in the same container is very small. The intersection of that small group with people that write their PIN on their CAC is even smaller. The intersection of this very, very small group with people whose laptops get stolen is hopefully 0, or somewhere very close to it. Further, the intersection of those stolen laptops with thieves that care about the CAC + PIN is even smaller -- they're probably most interested in the value of the machine. This is what I mean by risk minimization. It's still possible for someone to get the laptop + CAC + PIN, but the chances of them doing this successfully and knowing what they've got are very, very, very (did I mention very?) small, because most .GOV workers follow policy.
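The intersection argument above is just multiplication of small probabilities. A back-of-the-envelope sketch, where every number is invented purely for illustration:

```python
# Rough model: a damaging loss needs several unlikely events to coincide.
# All probabilities below are made up for illustration only.
p_same_container = 0.05   # laptop and CAC carried together (policy violation)
p_pin_on_card = 0.02      # PIN written on the CAC (policy violation)
p_stolen = 0.01           # laptop stolen in a given year
p_thief_exploits = 0.05   # thief recognizes and uses the CAC + PIN

p_compromise = p_same_container * p_pin_on_card * p_stolen * p_thief_exploits
print(f"{p_compromise:.0e}")  # 5e-07: a few chances in ten million
```

The point isn't the specific numbers; it's that requiring several independent policy violations to coincide drives the product down fast.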

It's true that an adversary could print up a fake CAC with a custom applet on it that grabs the user's PIN. The user will know something is up, though: they won't be able to sign in to the laptop, they won't be able to VPN back to home base, etc., because the fake CAC won't have their key in its private memory. They'll call their help desk (hopefully), their CAC will be determined dead, and it will be revoked and added to the certificate revocation list. A new card will be issued, with a new PIN. It's hard for a laptop's disk encryption scheme to actually obey the CRL, as it has to decrypt the hard disk before OS services are available, so the adversary could still steal the laptop and use the original CAC to decrypt it, I suppose. Of course, the adversary could also rig up a custom laptop with a custom CCID reader and custom CAC, and leave the old CAC plugged in somewhere, allowing the new laptop to do a kind of man-in-the-middle...
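To make the CRL point concrete, here's a toy model (invented serial numbers, not any real pre-boot implementation): revocation is a network-backed check, and the pre-boot environment has no network stack, so a revoked card can still unlock the disk.

```python
# Toy model of why pre-boot disk decryption can't honor a CRL.
REVOKED_SERIALS = {0x1A2B, 0x3C4D}  # the CRL, fetched over the network

def card_ok_preboot(serial: int) -> bool:
    # Pre-boot: no OS, no network, so the CRL is unreachable.
    # The check degrades to "does the card hold the right key?".
    return True

def card_ok_online(serial: int) -> bool:
    # Once the OS is up, the revocation list can actually be consulted.
    return serial not in REVOKED_SERIALS

print(card_ok_preboot(0x1A2B))  # True: the revoked card still decrypts
print(card_ok_online(0x1A2B))   # False: the OS-level check catches it
```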

Still, anyone capable of performing this type of "fake CAC" feat has significant resources behind them. They aren't your common thief; they likely know what they're trying to get (a nation-state actor or something like it). Laptop hard disk encryption is not meant to protect against this kind of adversary. Data that must be protected against this kind of adversary should be classified at a sufficient level, as in SECRET or above (technically the classification is a measure of the damage the data could do to the US if leaked, but if a resourceful actor is attempting to gain the data, it is highly probable that this is the case). Classified data is not allowed on a laptop used in an unclassified environment (e.g. outside of a classified facility, like your home or Starbucks). In order for such an actor to gain access to such a device, they would have to have a security clearance, would have to get past armed guards, etc. Insider threat and armed enemy combatants are also threats that this solution is not meant to protect against.

A different set of safeguards is put into place on machines with classified data. The protection provided is commensurate with the security classification of the data on the device. Laptop disk encryption is meant for unclassified data, where loss will not cause significant damage to the operations of the US government. As such, it does not require the more stringent safeguards, and disk encryption should suffice.

I hope this provides a little more insight into the rationale behind the list, and I hope it dispels the idea that we're trying for a total solution. We recognize the problems; we're just trying to make it very unlikely for petty theft, a la the VA laptop case, to put unclassified but "for official use only" data at risk in the future.

Cheers, and thanks for the input,
Reid

Comment from Mike on 2007-01-15
If they could get the fundamental security right, I could see it actually making life easier for government employees. Could full disk encryption be combined with some form of multi-boot to end the need for some government employees to carry up to three different computers for classified, unclassified, and personal use?

The Army's practice of putting the private keys on ID cards seems like a good solution. It is highly unlikely any Army officer is going to put their ID in their laptop bag. But it is a pain to have to get it out every time you need to do some little personal task.

Is that Full Disk Encryption of the software only variety, or does it use the TPM?

Comment from aeschylus on 2007-01-10
"What few people are noticing is there is not one, but two kinds of pork in this competition."

Sounds delicious! :^)

Comment from Ouroboros on 2007-01-10
What few people are noticing is that there is not one, but two kinds of pork in this competition. One is the likely overinflated purchase and ongoing encryption support costs. The second, and likely more lucrative, is the IT support services that have to support FDE machines. Without encryption, if the drive gets borked, you had a fighting chance of recovering data without too much trouble. With FDE, the IT support costs for dying and dead drives will be enormous (for the recovery work alone, or for the sane alternative of a consistent incremental backup policy and hardware). I'd keep a close eye on which vendors have cosy relationships with backup/storage/NAS/SAN vendors. If they ignore the corruption problem, they'll only shoot themselves in the foot.

Still, identity theft is largely an external cost for the government, and good luck busting them for violations.

Comment from Ron on 2007-01-07 (http://www.rons-sandbox.com)
I work for a major manufacturer of commercial and military aircraft. We have had a couple of laptop thefts make the news over the last few years, and we are switching to whole disk encryption on all of our laptops.

The key is assigned by company security so no token is required. Although the laptop will boot without any need to enter or have a key, you still need a domain or local account to log in. If you use a program like Norton Commander or some Linux boot CD, you cannot use the utilities to change the passwords or view the files on the drive because the drive is encrypted and therefore unreadable without booting from the drive first.

Comment from Jeff on 2007-01-06 (http://www.voip-news-net.com)
This is a good initiative.

No, it's not going to provide 100% security against every possible attack, but it will provide very good security in general, and excellent security for cases of laptop loss or theft by ordinary thieves.

Also, we're talking about unclassified laptops here. The kind of super-sophisticated thieves who are going to forge and secretly replace CACs, surreptitiously obtain PINs, etc., aren't going to go through all this trouble to get a list of veterans' names.

Your posts are interesting and you seem to be (rightly) frustrated about the way the government approaches data processing and lifetime handling.

You use the US Navy system for Secret data processing as an example of how things should be done. Fine. Judging by your description, the Navy have a good working system. I presume that anybody caught breaking the rules is subjected to military disciplinary proceedings (btw, I cannot help wondering how many mistakes get covered up by sailors anxious to avoid punishment but that's another story).

Can these rigorous standards be applied to everything, including systems run by Federal/state employed civilians and contractors? I doubt it. It is well known that there are tradeoffs between security and convenience, and I suggest that the amount of Secret data processing carried out by the Navy is rather small in comparison with less restrictively classified data.

"The specific penalty is not important; it's not even the point."
"The secrets are safe because there are real penalties for failing to handle them properly."

Sorry, but that looks like a contradiction to me. If you could apply military-style discipline to all staff employed in government data processing then you would be onto a winner (and the US would be some sort of Stalinist state).

Let's imagine you are given Presidential Authority and a big budget to tighten government data processing security. I suggest that before starting the data and systems classification, procedural changes and punitive regime required to meet your standards, you might want to identify and eliminate all the unnecessary data that the government keeps first.

Consider what the chances are that the government will agree to stop collecting and retaining unnecessary data. I'm sorry but I just don't believe it. As the technology becomes cheaper and more efficient, the government will keep collecting data. In my country (UK), the government seems to be obsessed with storing more and more detailed records about its citizens.

I can see your point about the failure to consistently apply data handling security standards but I think you are underestimating the practical difficulties involved in applying your principles to government data processing.

I suppose my comments above could be seen as a rallying cry for apathy, which isn't my intention. I suggest that more effort should be given to educating the public about the nature of modern IT systems and the new risks they pose as well as technical and procedural guards for data processing security. The sooner the public appreciate the problem then the sooner you will be able to get people to buy into your ideas (without threatening penalties). We don't all need to be IT experts but perhaps we do all need to understand modern privacy and security.

"In the commercial world, jailing an employee for a mistake (unrealistic) or taking revenge by sacking etc will not affect the damage of the data leak after the event."

With appropriate penalties, there won't be any data leak after the event, because there won't be sensitive data on the stolen equipment.

The specific penalty is not important; it's not even the point.

The White House has unconsciously defined a new classification for sensitive information in a completely idiotic series of memos that mandate broad technical controls for a small collection of computer systems, rather than the information they pretend they want to protect. These drooling incompetents at OMB are concerned strictly with appearances, and are doing nothing to address the real problem, which is that agencies are collecting and storing personal information without knowing whether or how it is to be protected.

I have nothing against FDE per se--indeed, I would demand it for every laptop that a government agency decides this data MUST be stored on. But first I would demand that they identify where the data is already, decide whether it needs to be there, label it appropriately, protect it, and train everyone in how to handle it. Instead, they're blowing their budget on protecting the laptops they use to give their pathetic PowerPoint presentations. The vast majority of these systems don't have any personal information on them anyway, and the vast majority of those that do, shouldn't.

"It is worth remembering that most public sector employees involved in data processing roles are constantly bombarded with moronic bureaucracy; it's hardly surprising that they don't pay attention to the finer points of losing government data."

I work in IT security for a federal government agency and I'm keenly aware of the paucity of technical knowledge. This is why a full-fledged classification, not some silly parlour game, needs to be brought to bear on this problem. The Navy doesn't have a problem with compromise of Secret data stored on laptops. That's not because they're encrypting the laptops, though they may well be, and it certainly isn't because everyone who handles Secret data is an expert in IT security. They have no problem because they know where the Secret data is being stored, they keep track of it, and they control access to it. The secrets are safe because there are real penalties for failing to handle them properly.

Meanwhile, the rest of the government is blithely mishandling an ocean of personal information, and no one is doing a damned thing about it. They're just wasting money--a lot of money--on theatre.

"I'd suggest that the problem is a lack of user discipline about what they put on the laptop in the first place."

"After making an example of the first few victims who lose both devices, the laptop users might start to take this problem a bit more seriously."

I have no problem about user education and a bit of thoughtfully applied pressure to persuade users not to copy sensitive data to removable storage but getting users to take this seriously is difficult. In the commercial world, jailing an employee for a mistake (unrealistic) or taking revenge by sacking etc will not affect the damage of the data leak after the event. Encrypting laptops is a step towards mitigating the damage caused by leaking data through lost/stolen laptops which might be justifiable on a cost/risk analysis.

I don't know how the law works in the US but jail for accidental information disclosure for a public sector employee also seems a bit unlikely to me.

By definition, if you read this blog then you are more security aware than the average (US?) citizen. It is worth remembering that most public sector employees involved in data processing roles are constantly bombarded with moronic bureaucracy; it's hardly surprising that they don't pay attention to the finer points of losing government data.

Comment from Peter on 2007-01-05
We use safeboot for one client's laptops. The purpose of key escrow is when an (ex)employee forgets or refuses to unlock the company's laptop.

This isn't significantly different from GLBA requirements to keep "non-public information" protected, as the only real interpretation of that part of GLBA is the requirement of encryption. Most folks interpret that to mean that if you have names and passwords, such as in the ubiquitous "users" table, you're going to need to encrypt some or all of the contents of that table.
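As a sketch of what protecting that ubiquitous "users" table can look like in practice: for the password column, the usual approach is salted one-way hashing rather than reversible encryption (other columns, like names or account numbers, would need real encryption instead). All names and data below are invented for illustration.

```python
import hashlib
import hmac
import os

def protect_password(password: str) -> tuple[bytes, bytes]:
    # Salted and slow: even with the table stolen, passwords
    # can't simply be read back out.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

# An invented row in the "users" table:
salt, digest = protect_password("hunter2")
print(verify_password("hunter2", salt, digest))   # True
print(verify_password("wrong", salt, digest))     # False
```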

Comment from annoyed on 2007-01-05
@Sceptic
"Hard-working company man is running out of time for his report at night so he dumps a copy of a fileshare/database/whatever onto his laptop to take it home... That's what laptop FDE is intended to address..."

Better solution: put hard-working company man in prison for mishandling sensitive data. I mean, if you actually care about the problem.

"One problem with Windows is that it really doesn't care where it is creating tmp files and what the permissions on those tmp"

I think you have missed my point (I am quite happy to concede that Windows OS and Apps tend to be a bit sloppy about file management).

Scenario:
Hard-working company man is running out of time for his report at night so he dumps a copy of a fileshare/database/whatever onto his laptop to take it home. The laptop is then lost and the company may have a problem.

That's what laptop FDE is intended to address - not the finer points of TMP/swapfile forensics and suchlike. It doesn't matter what OS the employee in the scenario is using; if the data is lost then it's lost.

Note that the list of requirements for the FDE competition also requests a solution for UNIX and Linux.

Comment from annoyed on 2007-01-05
@Nicolai
"PII is defined in the privacy act but as always people start fantasizing movie plots and so expand the definition until it includes even a person's initials in some randomized order."

PII is never mentioned in the Privacy Act, which was written in 1974. The closest term that does occur is "individually identifiable information", which is just as much gibberish as PII, and occurs only once.

The most applicable term that *is* defined in the Privacy Act is a "record":

"the term 'record' means any item, collection, or grouping of information about an individual that is maintained by an agency, including, but not limited to, his education, financial transactions, medical history, and criminal or employment history and that contains his name, or the identifying number, symbol, or other identifying particular assigned to the individual, such as a finger or voice print or a photograph;"

Note the "including, but not limited to..." This is completely open-ended, which is why agencies don't know where to stop.

Meanwhile, a crucial term that is *never* defined in the Privacy Act is "identify". It could mean "identify an individual uniquely," but a name doesn't do that in most cases, and a name is an alternative to other "identifying" information. So it could also mean "to narrow the field of candidate individuals", which is the sense in the common term "identifying mark". But then, how much does information have to narrow the field before we consider it "identifying"? Is hair color enough? Hair and eye color? Last four digits of phone number? MD5 digest of the street address? IP address? (If you think IP address can't identify an individual, go talk to someone who's been sued by the RIAA.) Who knows?

Now look at the definition of "record" and tell me what a "record" is when we don't know what "identify" means.

It should be no surprise that, with this gobbledygook as starting point, agencies are defining PII to include things such as "a name combined with a telephone number". On the one hand, they're being instructed to protect all "PII" on mobile devices; on the other, they're being given no practical guidance whatsoever on what PII is. Sure, if you have medical records and SSN, that's PII, but most agencies don't have that. They have names, phone numbers, email addresses, IP addresses in logs. Are web logs PII? What about email headers?
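By the way, the "MD5 digest of the street address" question above has a concrete answer: hashing a low-entropy field does not stop it from being identifying, because the input space is small enough to enumerate. A sketch, with invented addresses:

```python
import hashlib

# Suppose an agency publishes "anonymized" MD5 digests of street
# addresses. Anyone who can list candidate addresses simply hashes
# them all and looks the digest back up.
candidates = ["12 Oak St", "34 Elm Ave", "56 Main St"]
lookup = {hashlib.md5(a.encode()).hexdigest(): a for a in candidates}

leaked_digest = hashlib.md5(b"34 Elm Ave").hexdigest()
print(lookup[leaked_digest])  # 34 Elm Ave -- still identifying
```

So a digest of "identifying" information is itself identifying whenever the original field can be guessed and checked, which is exactly why the open-ended definition is such a mess.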

This isn't movie-plot fantasy--it's simply the chaos that results when the White House says "Jump!" without saying how high. It's the fault of OMB for issuing broad mandates with short deadlines and no forethought. Agencies have already spent tens of millions trying to implement this crap. With this blanket purchase, they'll spend hundreds of millions, perhaps billions more. All that money could be used effectively if it could be targeted to protect the information that is deemed so important. But without defining that information in an actionable way, all the government can do is shovel money into the boiler and hope nobody asks any questions.

It's time to start asking questions. Now.

Comment from Ossifer X on 2007-01-05
I work for the government. My agency is still using DOS-based applications. This all sounds really neato, but I shake my head when I think of the implementation. The gubmint is an ungainly, plodding behemoth.
Comment from averros on 2007-01-04
...and of course, nobody asks the ONLY relevant question: what exactly are the secrets the government gets to guard?

Plans for new offensive wars? The evidence of massive corruption and waste?

Oh, sure, encrypt everything, to reduce the chances of accidental glimpses of the inner workings of The Gang by the general public.

Any sane person would be very concerned about this "security" initiative. Because the next logical step after encryption of data will be prohibition of keeping the same data on paper. To save the trees, no doubt.

Here goes your FOIA, and the evidence of malfeasance can now go away in a second, with a key "lost" or "accidentally destructed" - with no piles of shredded paper. Neat. Clean. Unaccountable. The vaunted Democracy is well on its way to the quite ordinary fascism.

...and the key escrow is a whip over low-level flunkies, so they won't be tempted to double-cross their political overlords. How convenient that it is "trusted" to some shadow agency. So they can read everything and assemble dossiers on the potential challengers to the ruling few.

I think this scheme would make any dictator wannabe piss himself from happiness.

Not really. One problem with Windows is that it really doesn't care where it creates tmp files or what the permissions on those tmp files are. Windows tends to be more open as to where a user can save their files compared to Linux and Unix.

Having said that, even Unix/Linux creates files in /tmp or /var/tmp, but on a more limited basis...

"FDE is being implemented because there is no file storage discipline in Windows applications"

That seems a bit harsh on Windows to me. I'd suggest that the problem is a lack of user discipline about what they put on the laptop in the first place. It's so easy to dump a large amount of data to a laptop (and memory sticks etc.) for convenience; that's why we will keep hearing about data loss on lost and improperly disposed-of laptops and disks.

Even if we all used Linux or whatever, this would still be a problem.

Comment from Nicolai on 2007-01-04
FDE is being implemented in response to the lost laptop reports so popular last summer. In fact the deadline to have them encrypted is now passed. They are moving on to PDA and will be coming back to encrypt the desktops. It isn't to protect the data on a running system, it is so they can say we lost a laptop but it's okay the disk was encrypted. FDE is being implemented because there is no file storage discipline in Windows applications. Might be in the user's directory or might be in the root directory. Only solution is encrypt them all and let God sort it out.

PII is defined in the privacy act but as always people start fantasizing movie plots and so expand the definition until it includes even a person's initials in some randomized order.

Microsoft advises against FDE in the Vista EULA (under the virtual system restrictions). I guess they'll be changing that bit of legalese and hopefully the OS won't choke since it never has had to deal with security before.

Comment from WhatIsBeingProtected on 2007-01-04
The company I work for just did this, by using PGP encryption of the entire laptop contents. However, it was not done to protect the contents. It was done to avoid fines that can be assessed if the laptop is stolen and its contents weren't encrypted.
2007-01-04T22:57:18Z2007-01-04T22:57:18Ztag:www.schneier.com,2007:/blog//2.1305-comment:136329Comment from annoyed on 2007-01-04annoyed
@Elliott
"But if there is an access path from the network service to the disk decryption key"

Even if there isn't a path to the decryption key, there's one more piece of crap vendor software presenting a new attack and fingerprinting surface on every government laptop. If there's a network vulnerability in that code, it doesn't matter whether the key is recoverable, because the laptop is already running the OS.

FDE only helps when the laptop has been turned off. Once the OS is running, FDE is nothing more than a drag on system performance.

Comment from Elliott on 2007-01-04
@annoyed: "That network component alone will be a bigger target..."

My answer to bob related only to the local encryption stack and software, not the remote access facility that is specified to be part of the solution. Sorry for not being clear.

I do expect that it is economically impossible to design and implement a network service free of vulnerabilities.

First I felt that the key management could be designed to limit the scope of leaked key data /and other exploits/ to only one notebook, and use secret splitting to allow key escrow only if multiple guarantors cooperate.

But if there is an access path from the network service to the disk decryption key, which seems plausible if the remote service is supposed to access the data when the regular user forgot her passphrase, then an exploit over the network could use that path as well. And it would work on all notebooks that are equipped with a similar software version.

An interesting variant might be if the symmetric key is stored in a TPM, and can be retrieved either with the user passphrase or with an escrowed key that works only for that machine. But then again, who guarantees that there is no way to access the user passphrase or any derived state, or to circumvent the protection of the TPM? Plus, without a non-TPM access path, the TPM might be used in a DoS attack to make many systems inaccessible at once.
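The secret-splitting idea above can be sketched with a trivial n-of-n XOR split, where no proper subset of guarantors learns anything about the key (a real escrow deployment would use Shamir's k-of-n sharing so that any quorum suffices):

```python
import functools
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret: bytes, n: int) -> list[bytes]:
    # n-1 purely random shares; the last one is chosen so that
    # XORing all n shares together reproduces the secret.
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    shares.append(functools.reduce(xor_bytes, shares, secret))
    return shares

def combine(shares: list[bytes]) -> bytes:
    return functools.reduce(xor_bytes, shares)

disk_key = os.urandom(32)           # the notebook's disk encryption key
guarantors = split(disk_key, 3)     # one share per escrow guarantor
assert combine(guarantors) == disk_key      # all three: key recovered
assert combine(guarantors[:2]) != disk_key  # any two alone learn nothing
```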

Wrong. The implementation of this will include a network software component on every laptop, because of the need for key escrow and centralized management. That network component alone will be a bigger target than all of the government laptop thefts in history ever were.

Comment from annoyed on 2007-01-04
@Cerebus
"'- No definition of PII.'"
"Incorrect. However, as a new definition, it takes a little time to settle down."

Kindly point us to the useful definition of PII. The Privacy Act is tangentially germane, but when it gets down to brass tacks, agencies are having to define PII for themselves, and they're doing a *lousy* job. You're obviously not having to deal with dozens of reports of a stolen cell phone or PDA that "might have had some home phone numbers on it". The failure to define PII in any rational way is wasting incident response resources in a profligate way, all the way up to US-CERT. Call US-CERT and ask them what proportion of their report-handling time is now spent on stolen mobile devices.

"How do you protect information on a *compromised* system?"

First of all, a stolen laptop *is* a compromised system. And you miss the point, which is that if we are defining a class of sensitive data, it needs to be tracked and protected everywhere it goes, not just on laptops. The only reason a blanket purchase is under discussion here--Bruce himself says it--is because it's easier than determining which laptops need protection. That's not a crypto problem; it's a management problem. No one knows what or where the PII is, so purchasing FDE for laptops is an expensive way to sidestep the real problem.

Imagine if DOD were putting plaintext Secret data on a few laptops, and no one knew which ones. Would it make sense for them to buy FDE for every laptop they own, when only a few had the data? Wouldn't the real problem be, obviously, that Secret data was not being handled properly?

The right way to solve this problem wouldn't waste money purchasing, installing, and maintaining FDE and a keystore infrastructure for hundreds of thousands, perhaps millions, of laptops. Instead, it would establish rules for tracking all the PII and appropriately protecting it wherever it is stored. If PII has to be stored on a laptop (e.g. by Census), then FDE could be purchased for that laptop, along with appropriate other countermeasures (anti-virus, anti-spyware, firewall, etc.). A blanket purchase accomplishes the *opposite* of what is needed, because now every drone with a laptop will think it's safe to put PII on his laptop, and storage of PII on laptops will increase, not decrease. And laptops, even with FDE, are the least secure systems in any enterprise because they keep getting connected to insecure networks.

Do you work at a government agency? If so, what care do your HR and physical security people take with personnel files? What email encryption do your HR and physical security people use? How does your incident response team deal with theft of paper files, or are they even consulted? If not, why is PII theft on a laptop handled by one group, while PII theft on paper is handled by someone else? Shouldn't there be uniform handling of PII theft, no matter how it occurs?

"In re: backups and other electronic records, these things are being addressed"

Not in any uniform or even rational way.

Remember that all this stupidity was in reaction to *one* theft of a laptop that was not supposed to have the data on it in the first place. Meanwhile, no one keeps track of which other systems have these data on them, and laptop thefts are just the tip of the iceberg, even as mobile devices are concerned--how many PDA cell phones have been stolen in the past year? Why hasn't that number been reported in the media, when laptops have? (Answer: because it's theatre.)

"'- No safeguards for transmission of PII over networks.'"
"Incorrect. Data has to be protected while in motion, and there are rules and standards for this."

Really? And which of those rules and standards apply to PII? Why not enforce those rules?

Oh, wait--what *is* PII, anyway?

"The RFP being discussed is for data *at rest* which is a newly *recognized* requirement."

The RFP is for a tiny fraction of the real problem, and is going to waste a large proportion of everyone's security budget on theatre. Just setting up the FDE on all those laptops is going to chew up major resources, and then we have follow-on costs with maintaining laptop software, infrastructure hardware and software, and help desk support for the thousands of technical problems which will arise. And what evidence do we have that there has ever been any identity fraud based on data from a stolen government laptop anyway?

"'- No requirements for labeling PII to indicate its status as such.'"
"Privacy Act again. What, you've never seen a Privacy Act cover sheet?"

Not on a laptop. Not on a web server. Actually, not anywhere, no. And I work for the government.

]]>
2007-01-04T17:58:19Z2007-01-04T17:58:19Ztag:www.schneier.com,2007:/blog//2.1305-comment:136284Comment from Israel Torres on 2007-01-04Israel Torreshttp://blog.israeltorres.org
@Josua
"The point is that requiring FDE for everybody reduces the possibilities for compromising security."

I am sure everyone here would agree that something is better than nothing, with the following understanding:

@Cerebus
"No solution is 100%."

Exactly and advocating otherwise only demonstrates ignorance of the real world.

Nobody outside the spook world can recover overwritten data, at least not in anything resembling an economically feasible way.

Proof: Assume the contrary. Go look at all the disk-recovery companies that can recover deleted material off drives that have been through floods, fires, and other disasters. We're talking about companies that will charge you several hundred or thousand dollars just to *look* at the drive.

And yet, not one single company claims to be able to recover from even a "single-overwrite" condition. Even though there's a big market for it. (Note that nobody outside the commercial arena, such as academia or non-spook government, is doing it either - there'd still be a big market for that ability there, in the form of "published papers").

Conclusion: The fact that nobody's trying to make money off doing it is proof that it's not feasibly doable.

Counter-proof: Feel free to cite a *single* verifiable example of somebody who *has* recovered data off a drive made in the last decade after a single-overwrite.

]]>
2007-01-04T17:10:24Z2007-01-04T17:10:24Ztag:www.schneier.com,2007:/blog//2.1305-comment:136268Comment from Ian Ringrose on 2007-01-04Ian Ringrosehttp://www.ringrose.name
There is one easy way to make people keep the smart card with the PIN safe. Just require that the same card + PIN can be used to take cash out of an ATM that is then deducted from their next pay packet!]]>
2007-01-04T16:26:18Z2007-01-04T16:26:18Ztag:www.schneier.com,2007:/blog//2.1305-comment:136262Comment from Elliott on 2007-01-04Elliott
@bob: Yes, monoculture makes big targets.
But unencrypted notebook drives make even bigger targets.

This is an opportunity to get a lot of security for very little money, with no damage to privacy or civil liberties. A better deal than most these days.

]]>
2007-01-04T15:54:08Z2007-01-04T15:54:08Ztag:www.schneier.com,2007:/blog//2.1305-comment:136254Comment from bob on 2007-01-04bob
Basically a good idea. But when you equip 10 million laptops with the same encryption protocol, you are painting a BIG target on that protocol.
2007-01-04T15:29:35Z2007-01-04T15:29:35Ztag:www.schneier.com,2007:/blog//2.1305-comment:136246Comment from Cerebus on 2007-01-04Cerebus
@annoyed:

"- No definition of PII."

Incorrect. However, as a new definition, it takes a little time to settle down.

"- No uniform guidelines for defining PII across government agencies."

Different agencies typically deal with different data and will need different rules. Insofar as there are common data (addresses, SSNs, etc.) there are already rules established through vehicles like the Privacy Act and HIPAA.

"- No protection for PII stored on backup tapes, paper files, compromised computer systems (other than stolen laptops), etc."

Umm, WTF? How do you protect information on a *compromised* system? In re: paper, gov't has rules on protecting that data (Privacy Act requirements). In re: backups and other electronic records, these things are being addressed, but only so much can be done at once.

This isn't to say the rules are any good or that there's a good track record, just that there *are* rules.

"- No safeguards for transmission of PII over networks."

Incorrect. Data has to be protected while in motion, and there are rules and standards for this. The RFP being discussed is for data *at rest* which is a newly *recognized* requirement.

"- No requirements for labeling PII to indicate its status as such."

Privacy Act again. What, you've never seen a Privacy Act cover sheet?

]]>
2007-01-04T15:02:15Z2007-01-04T15:02:15Ztag:www.schneier.com,2007:/blog//2.1305-comment:136244Comment from Cerebus on 2007-01-04Cerebus
@Israel: I cannot prevent people who intentionally aim at their extremities from shooting themselves in the foot. So the idiot who writes his 6-digit PIN on the card *which he uses every damn day yet can't seem to remember* and *keeps the card in the bag despite explicit training to the contrary* simply can't be stopped. But I *can* stop compromise from *most* people who are *far smarter* than that. Would you rather nothing was done? That seems to be the only course that would make you happy.

No solution is 100%. But for the threat being addressed (which is, as some above seem to understand but others do not, the threat to FOUO data through casual system theft--*not* the threat to FOUO data from *targeted* compromise) FDE is the best way to go. When coupled with off-system strong two-factor authentication, it's the best we're going to get given the systems we have available.

In re: your convoluted and artificial scenario of stealing my card and temporarily replacing it with a look-alike (but, I might stress, not a function-alike), I challenge you to carry that attack off. Let me know when you succeed. You'll pardon me if I don't wait around, mmm'kay?

@whoever was asking about TPM: BitLocker does secure boot (which is good), but the BitLocker pre-boot OS doesn't do smartcards (which is bad). In addition, BitLocker provides no way to recover systems *without* domain admin involvement (which is a law enforcement requirement), and BitLocker boot PINs are shared among all users of the device, so there's no user attribution at boot time (which is a forensic requirement). Some flaws there. In addition, TPM pre 1.2 isn't worth it, and most gov't systems are old enough that they don't have a TPM chip at all--and the product selected has to be deployed *now*, not in 3 years when everyone has a TPM 1.2 chip. Rest assured that the next competition in a few years will have more stringent requirements.

]]>
2007-01-04T14:52:06Z2007-01-04T14:52:06Ztag:www.schneier.com,2007:/blog//2.1305-comment:136237Comment from Elliott on 2007-01-04Elliott
@Gopi Flaherty: "Another strategy would be to have a physical:logical mapping that changed all the time, moving sectors around all the time."

That would make selective zeroization much harder, if not impossible.

Say you encrypt a disk with symmetric key s, then encrypt s under n password hashes h1..hn and store the resulting enc(s,h1)..enc(s,hn) on disk. This scheme allows n users to access the disk, each with their own password. Since no user knows s or the passwords of others, you can easily revoke access for user x by zeroizing her stored copy enc(s,hx).
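
A stdlib-only Python sketch of that key-wrapping scheme (XOR against a KDF output stands in for enc(); a real system would use an authenticated cipher such as AES-GCM, and all names here are illustrative):

```python
import hashlib
import secrets

KEY_LEN = 32

def wrap(s, password, salt):
    # h = KDF(password); enc(s, h) = s XOR h -- illustration only
    h = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                            100_000, dklen=KEY_LEN)
    return bytes(a ^ b for a, b in zip(s, h))

unwrap = wrap  # XOR is its own inverse

# disk key s, shared by all users but never stored in the clear
s = secrets.token_bytes(KEY_LEN)
salt = secrets.token_bytes(16)

# each user gets their own wrapped copy of s
slots = {user: wrap(s, pw, salt) for user, pw in
         [("alice", "correct horse"), ("bob", "battery staple")]}

assert unwrap(slots["alice"], "correct horse", salt) == s

# revoke bob: zeroize his slot; s and alice's access are unaffected
slots["bob"] = bytes(KEY_LEN)
assert unwrap(slots["bob"], "battery staple", salt) != s
```

Revocation never touches s itself, which is why zeroizing one slot is enough -- provided, as the comment goes on to note, no stale copy of the slot survives on disk.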

If the disk moves logical sectors around, you can zeroize one copy of enc(s,hx), but not all copies, because you don't know where other, not-yet-overwritten copies are stored.

Modern drives have a number of spare sectors used to remap bad sectors, so we already have this problem. But we can deal with k spare sectors by splitting enc(s,hx) into k+1 (or more) blocks, where k blocks are filled with random data r1,r2,...,rk and one contains enc(s,hx) xor r1 xor r2 xor ... xor rk. Now, even if k of the k+1 sectors get remapped, at least one is not, so overwriting all k+1 sectors guarantees that at least one of the blocks has no unoverwritten copy left. Unless, of course, someone got a copy before it was overwritten...
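
The k+1-block XOR splitting might look like this (a hypothetical split/combine pair, purely illustrative):

```python
import secrets

def split(secret, k):
    # k random shares, plus one share that XORs them all back to the secret
    shares = [secrets.token_bytes(len(secret)) for _ in range(k)]
    last = secret
    for r in shares:
        last = bytes(a ^ b for a, b in zip(last, r))
    return shares + [last]

def combine(shares):
    out = bytes(len(shares[0]))
    for sh in shares:
        out = bytes(a ^ b for a, b in zip(out, sh))
    return out

wrapped = secrets.token_bytes(32)   # stands in for enc(s, hx)
shares = split(wrapped, k=4)        # tolerates up to 4 remapped sectors
assert combine(shares) == wrapped

# overwriting ANY single share destroys the secret
shares[2] = bytes(32)
assert combine(shares) != wrapped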

]]>
2007-01-04T14:13:18Z2007-01-04T14:13:18Ztag:www.schneier.com,2007:/blog//2.1305-comment:136233Comment from Elliott on 2007-01-04Elliott
@quincunx: "Do you really believe that a 90-day competition will suddenly increase investment into some sort of R&D, or something? ... No effort at improving is made once you get the government teat to fill your coffers."

I agree that competition would be over before it even began if only one vendor is selected for a long time to come. The alternative is, of course, an infinite loop:

Would it make our data more secure, for less money? I am inclined to think so, provided all the selection steps are properly carried out. The selection of scope especially has a strong impact on security and cost, because the criteria should depend on the scope, and the people working within that scope should not need retraining on new software too often.

]]>
2007-01-04T13:52:45Z2007-01-04T13:52:45Ztag:www.schneier.com,2007:/blog//2.1305-comment:136230Comment from Elliott on 2007-01-04Elliott
@Stefan: Presumably the electron microscope needed to recover overwritten data would be too slow and too expensive to bundle it with the hard disk. Also, the head movement is not precise enough to reliably NOT overwrite old data.]]>
2007-01-04T13:25:57Z2007-01-04T13:25:57Ztag:www.schneier.com,2007:/blog//2.1305-comment:136229Comment from Elliott on 2007-01-04Elliott
Israel Torres wrote:
"Time to patent a locking handcuff to laptop interface..."

Well, I would not want to use one because motivated criminals are more than willing to cut hands off.

]]>
2007-01-04T13:20:36Z2007-01-04T13:20:36Ztag:www.schneier.com,2007:/blog//2.1305-comment:136218Comment from Andy B on 2007-01-04Andy B
Seems a good idea to me. In the UK we've a news story, oh, every few months, about somebody having a laptop with personal/financial/defence details nicked. An encryption policy as described, while perhaps not stopping a determined attack, will stop the typical criminal with a brick and a keen eye for the contents of people's cars. And by applying it universally, it'll cost less than trying to select what needs to be encrypted.

And you'll still have your 'Classified' system for the things you _really_ need to keep secret.

It's just raising the bar - stopping casual thieves, who're much more common than key-logging, trojan-wielding, uber-hackers.

]]>
2007-01-04T12:28:49Z2007-01-04T12:28:49Ztag:www.schneier.com,2007:/blog//2.1305-comment:136211Comment from greg on 2007-01-04greg
Remember, it's a response to a particular type of theft. The thief just wants the laptop. But he's a thief, so you can't trust him with the data. Encryption is a good idea.

Please take the tin foil hats off guys.

The real irony is that companies are allowed to sell this type of data to other companies anyway. So why all the fuss about security when the law is the problem?

]]>
2007-01-04T11:44:12Z2007-01-04T11:44:12Ztag:www.schneier.com,2007:/blog//2.1305-comment:136202Comment from HAL on 2007-01-04HAL
First, not really, other than to coordinate/attend meetings. It is 'Suite B' crypto. Second, yes, as it will show up on all sorts of things, but the eval will not be released, in order to prevent damage to the vendor if findings are negative. The vendor will be provided a copy to fix/address issues, if desired. 
2007-01-04T09:32:09Z2007-01-04T09:32:09Ztag:www.schneier.com,2007:/blog//2.1305-comment:136201Comment from annoyed on 2007-01-04annoyed
@supersnail
"Problem:- lots of confidential data on portable devices which are easy to steal; once you have possession of the hard disk you can read the data.
"Solution:- encrypt all the data."

Uh, no. The only problem that solves, if everything goes right, is theft of the device. What about viruses/trojans/back doors? What difference does full-disk encryption make when the system containing it is part of a botnet?

And you didn't state the problem correctly.

Problem: lots of personal information defined by who knows what criteria, collected by who knows what programs, stored on who knows what systems, protected by who knows what measures, backed up on who knows what media, printed on who knows what paper, shared with who knows what contracting companies, remotely accessed by who knows what teleworkers, disposed of by who knows what measures.

Guess what? The DOD has had an information classification system for many years. All of this nonsense is a poorly disguised, ill-conceived, crude, and garishly inept attempt to define a similar classification system ("PII") for the civilian sector, with the following obvious flaws:
- No definition of PII.
- No uniform guidelines for defining PII across government agencies.
- No protection for PII stored on backup tapes, paper files, compromised computer systems (other than stolen laptops), etc.
- No safeguards for transmission of PII over networks.
- No requirements for labeling PII to indicate its status as such.

Full-disk encryption of laptops doesn't even strike the real problem obliquely. It's not even wrong. It shouldn't be a model for anything other than a Three Stooges short, because it's nothing more than pure security theatre.

One *correct* solution would be to define another classification in the DOD classification system, and require all government agencies to adopt that classification and *all appropriate practices* to implement it. This might include physically separate networks for transmission of PII, clear labeling of all media containing PII, rules governing disclosure of PII, a method for declassifying PII, etc.

Of course, one would have to actually think about the problem to solve it. OMB doesn't trouble themselves with that whole thinking thing--it's just too hard. Much easier to just say we're encrypting all laptops, isn't it?

]]>
2007-01-04T09:15:07Z2007-01-04T09:15:07Ztag:www.schneier.com,2007:/blog//2.1305-comment:136200Comment from Paul Larson on 2007-01-04Paul Larson
Sorry to be cynical but...

I wonder which "Bush Pioneer" will get the contract for their totally inadequate and overpriced encryption system!

]]>
2007-01-04T09:02:10Z2007-01-04T09:02:10Ztag:www.schneier.com,2007:/blog//2.1305-comment:136198Comment from supersnail on 2007-01-04supersnail
Sorry guys, but everything about this seems a sensible response to a real security problem.

Problem:- lots of confidential data on portable devices which are easy to steal; once you have possession of the hard disk you can read the data.

Solution:- encrypt all the data.
Furthermore, put the tender out to open competition and choose the product that best suits your requirements.

I don't see how anyone could gripe about this, and I think this should be a model for any organisation that routinely holds confidential data on portable computers (i.e., 90% of the companies listed on the NYSE and NASDAQ!).

I hope this scheme is widely adopted, because if it becomes "standard practice within the industry" it gives you an opening to sue any institution that doesn't encrypt data and then lets it loose due to poor security practices.

]]>
2007-01-04T08:22:50Z2007-01-04T08:22:50Ztag:www.schneier.com,2007:/blog//2.1305-comment:136188Comment from annoyed on 2007-01-04annoyed
This is one more step into a swampy cesspool which the Office of Management and Budget dug when it overreacted to the VA laptop incident by issuing the infamous OMB 06-16 memo:

This declared an open season of vast, wasteful government spending on redundant encryption purchases on tight deadlines, to protect a class of data that had not been clearly defined, while simultaneously enshrining a nonsense phrase--"personally identifiable information", a.k.a. PII--as the code word for government stupidity in FY 2007.

Just think about that phrase for a minute. It means "information that is personally identifiable". It's gibberish. What they perhaps meant was "personally *identifying* information", and it's not clear how this is distinct in their tiny little minds from good old "personal information". But now thousands of tie-wearing, headless chickens are running around the halls of government crying PII, PII, without any real thought given to what the real objective is--presumably, that government programs not contribute to identity fraud through mismanagement of collected data.

The result is a room at every agency filled with amateur philosophers, all incompetently and futilely trying to suss out the nature of OMB's fabulous beast, the dreaded PII. Because of the divers sophomoric output of these committees, definitions of PII now in practice usually include things as aggressively moronic as publicly available names and telephone numbers, and thus we end up with these sorts of expensive mandates to encrypt all mobile devices, instead of a more sensible mandate to simply stop putting the so-called PII on mobile devices in the first place.

Meanwhile, the idiots at the top utterly fail to address the continuing real loss of personal information through myriad avenues such as improperly secured paper files, telework conducted from insecure home PCs, failure to encrypt emails between HR personnel, failure to sanitize media on excess equipment, failure to properly shred documents before disposal, etc, etc, etc.

It's absolutely, utterly, infuriating. This is the classic exemplar of a poor security tradeoff, and I hope Bruce reconsiders his endorsement.

]]>
2007-01-04T06:17:26Z2007-01-04T06:17:26Ztag:www.schneier.com,2007:/blog//2.1305-comment:136186Comment from BeauKey on 2007-01-04BeauKeyhttp://beaukey.dyndns.org/wordpress
The TPM is a 'desired' requirement. Is the TPM not mature yet? Vista (Bitlocker) can use the TPM (including key recovery).]]>
2007-01-04T06:09:37Z2007-01-04T06:09:37Ztag:www.schneier.com,2007:/blog//2.1305-comment:136185Comment from quincunx on 2007-01-04quincunx
Bruce,

No government can run like a competitive business as long as it has the non-businesslike ability to a) conscript money from its subjects, b) dilute its value by issuing credit and paper tickets, and c) borrow against the proceeds of (a) as security.

The fact that you praise what is corporate welfare is quite revealing.

"And I really like that there is a open competition to choose which encryption program to use."

An open competition, for one winner?

I suspect it is more likely that decisions will be made by political bickering, campaign contributions, and kickbacks than by real competitive merit. How could it be otherwise?

"It's certainly a high-stakes competition among the vendors, but one that is likely to improve the security of all products. "

No, this view is incorrect. The incentive fostered here is to make improvements in an effort to win a profit-guaranteed stream from the government; once the effort fails for the many, there is no improvement elsewhere. Government, being the monopsonist that it is, cannot bargain like other firms, otherwise it could drive the profit down to zero. Because that would discourage the business, the government must GUARANTEE a profit large enough to encourage a certain amount of competition, competing against all other industries in other fields. This is primarily how the 'public' utilities work. Those suckers make 9-12% profits, no matter what! How sweet it must be not to compete.

The government will set that rate way above what real markets would set, and can decide how many bidders there will be in the first place. Not all improvements will pay off - alternatives cannot be tried, because only one gets the gig and the others have no extra outlets for their work.

Thus, just like a patent, innovation is stimulated before the grant and reduced afterwards. After all, your monopoly lasts 20 years. Hell, why not simply not use it, and then sue some bastards when they do?

Do you really believe that a 90-day competition will suddenly increase investment into some sort of R&D, or something? Inventors will pull magic formulas out of their ass, that they never possessed before?

"I've long said that one of the best things the government can do to improve computer security is to use its vast purchasing power to pressure vendors to improve their security. "

Whoa, hold off there Bruce, now you are acting crazy. This is exactly the opposite of what public choice economics tells us. Guaranteed income to select corporations does not result in efficiency; rather it results in the exact opposite: waste and inefficiency. No effort at improving is made once you get the government teat to fill your coffers. Why bother, really?

Maybe one should look to see what happens in other industries before jumping the gun.

If you want to increase inefficient output - subsidize it.

If you want to decrease efficient output - tax it.

"I would expect the winner to make a lot of sales outside of the contract, and for the losers to correct their deficiencies so they'll do better next time."

Bruce, there are already many 'winners', that can be calculated by the profits they are making. Just because they are not picked by the biggest gang in town to be endowed with what is essentially corporate welfare, doesn't make them worse, in a business or security sense.

]]>
2007-01-04T06:00:46Z2007-01-04T06:00:46Ztag:www.schneier.com,2007:/blog//2.1305-comment:136180Comment from Jungsonn on 2007-01-03Jungsonnhttp://www.jungsonnstudios.com/blog/
I don't get it. Really.

What is wrong with 128/256-bit AES?
And why are they consulting Microsoft on this? Are they insane? Yeah, I doubt they lobbied the NSA on this one.

]]>
2007-01-04T05:13:45Z2007-01-04T05:13:45Ztag:www.schneier.com,2007:/blog//2.1305-comment:136179Comment from Gopi Flaherty on 2007-01-03Gopi Flaherty
@Stefan:
If data recovery after 1, 6, or 34 overwrites is possible, why doesn't the manufacturer sell his 100 GB drive with a special driver as a 34x100 GB device?

The paper I read a while ago dealt with recovering data that had been sitting on the drive for a long time. The bits were written in narrow tracks; over time, they got wider. Over-writing lots and lots of times would only get you new narrow bits. If you had a very high-res probe, you could recover the older bits.

One way to deal with this problem would be to have a one-time pad on a per-sector basis. Divide the disk up into a few sections, say 10, with 10 different pads. When the disk was idle, create a new pad for one of the sections and re-write its blocks with the new pad.

Even if your data stayed the same, it would be constantly over-written with different data.
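
A toy model of that re-padding idea (the class and method names are mine, purely illustrative; a real drive would of course do this in firmware):

```python
import secrets

SECTOR = 512

class PaddedDisk:
    """Each section is stored XORed with its own pad; re-padding a
    section rewrites every physical byte even when the logical data
    is unchanged."""

    def __init__(self, n_sections):
        self.pads = [secrets.token_bytes(SECTOR) for _ in range(n_sections)]
        self.platter = [bytes(SECTOR) for _ in range(n_sections)]  # raw bits

    def write(self, i, data):
        self.platter[i] = bytes(a ^ b for a, b in zip(data, self.pads[i]))

    def read(self, i):
        return bytes(a ^ b for a, b in zip(self.platter[i], self.pads[i]))

    def repad(self, i):
        # background task run while the disk is idle
        data = self.read(i)
        self.pads[i] = secrets.token_bytes(SECTOR)
        self.write(i, data)

disk = PaddedDisk(10)
disk.write(0, b"A" * SECTOR)
before = disk.platter[0]
disk.repad(0)
assert disk.read(0) == b"A" * SECTOR   # logical data unchanged
assert disk.platter[0] != before       # physical bits rewritten
```

Even a section whose logical contents never change keeps getting fresh physical bit patterns, which is the property Gopi is after.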

Another strategy would be to have a physical:logical mapping that changed constantly, moving sectors around all the time. Note that this scheme wouldn't guarantee that the bits would always be changing - if you had a very empty disk, with lots of sectors of zeros, you may find that, say, alternating between an all-zero block and a block with the bank account number you're hiding would result in enough long-term spread that it could be recovered. Again, only an issue IMHO if you had a nearly empty disk. Writing random data into empty blocks might help.

This reminds me of a computer a friend of mine had. The floppy drive was supposed to be double sided - you could write to both sides without flipping the disk. In fact, he described his as "four sided." One of the heads was mis-aligned enough that, if you flipped the disk, you could write two more sides of data. Presumably, the tracks were narrow enough that this didn't cause problems. I don't know how he fared with pre-written disks, or what the error rate was...

]]>
2007-01-04T04:48:18Z2007-01-04T04:48:18Ztag:www.schneier.com,2007:/blog//2.1305-comment:136173Comment from Stefan Wagner on 2007-01-03Stefan Wagnerhttp://home.arcor.de/hirnstrom/
@sceptic: I tend to agree.
If data recovery after 1, 6, or 34 overwrites is possible, why doesn't the manufacturer sell his 100 GB drive with a special driver as a 34x100 GB device?

Wouldn't it be sufficient to write random patterns 35 times to the whole disk _before_ its first use?
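
As a rough illustration of that pre-fill idea, sketched against an ordinary file standing in for a raw device (with no claim that software overwrites defeat forensic recovery on real hardware, where remapped sectors and drive caches complicate things):

```python
import os
import secrets
import tempfile

def prefill(path, size, passes=35, chunk=1 << 20):
    # Overwrite the first `size` bytes of `path` with `passes` passes
    # of random data, syncing after each pass.
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            written = 0
            while written < size:
                n = min(chunk, size - written)
                f.write(secrets.token_bytes(n))
                written += n
            f.flush()
            os.fsync(f.fileno())

# demo on a small temp file standing in for a factory-fresh disk
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(bytes(4096))          # all-zero "new" disk
prefill(tmp.name, size=4096, passes=3)
with open(tmp.name, "rb") as f:
    data = f.read()
assert len(data) == 4096 and data != bytes(4096)
os.unlink(tmp.name)
```

Run against a block device this would need root and the device's size in place of the file length; the structure of the loop is the same.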

]]>
2007-01-04T03:37:35Z2007-01-04T03:37:35Ztag:www.schneier.com,2007:/blog//2.1305-comment:136163Comment from Sceptic on 2007-01-03Sceptic
Several commentators have noted the problem that users tend to keep crypto passwords and dongles close to the laptop for convenience. I agree this is a very real problem.
Perhaps a solution would be to adopt a crypto solution that requires a hardware token/fob of some sort and mandate the following rule:

If the laptop is lost, the user *MUST* return the token/fob device when reporting the loss or explain in an interview why both items are gone.

After making an example of the first few victims who lose both devices, the laptop users might start to take this problem a bit more seriously.

@Ritchie

"On top of that, it also provides secure deletion (1, 7 or 35 passes)"

Every so often, I come across a warning that just wiping a disk once isn't good enough to ensure that the data cannot be recovered. For older disks (let's say before year 2000) I accept this but can anybody tell me more about recovering files on overwritten disks using modern hardware? Just curious.