"OMG, you may not mention that our pet poster-boy company is just as evil as the very very evil monopolist. You have to say "Oooooh, but in AMD's case, blah blah blah...", and since Apple's MacOS X includes mandatory activation even surpassing the invasiveness of Vista, you also have to say "Oooooh, but in Apple's case, blah blah blah..."."
OS X requires no activation; it does not even have a CD key. Every retail copy is identical, so it's impossible for Apple to tell if you pirate it, and that registration…

I work for the Army and, although I'm highly motivated, I sort of like this idea. It's a fun feature that I'm sure the good folks at Intel could implement and force down our throats. The idea is that all new computers should be able to read the CAC (http://www.defenselink.mil/news/Oct2000/n10102000_200010107.html). If you note the date on the previously mentioned article, they have been issuing CAC cards since Moses went through boot camp.
Just recently the US Army website announced…

The question still remains whether the user himself can trust the trusted computing platform.

If your government or seller or whatever doesn't trust you, doesn't even try in the least, how the hell are you supposed to trust him? The most logical path would be to fully distrust him, and therefore to distrust and refuse the trusted computing platform.

If your government or seller or whatever doesn't trust you, doesn't even try in the least, how the hell are you supposed to trust him? The most logical path would be to fully distrust him, and therefore to distrust and refuse the trusted computing platform.

Careful, we ARE talking about the Army here. I follow what you're saying, but this circular logic might cause someone in the Army to have an aneurysm from having more than a minimal number of neurons firing!

BTW, I have a lot of respect for the Army, as I have a lot of friends on active duty and almost became a soldier myself. (Still, I couldn't pass up a chance to make a military joke.)

The US Army includes a load of good folks (and a much smaller number of bad ones). The soldiers are not the problem; their superiors are.
To be exact, the problem is that one of their superiors got bribed by a criminal company. If someone whose duty is to manage security doesn't recognize snake oil and back doors in TPM even w…

That makes about as much sense as saying: "Mafia killers are fine; the Mafia bosses that control them from above are not." In fact, it's exactly the same, since the US Army enforces the will of the US government, just as Mafia killers enforce the will of Mafia leaders.

You're quite right, of course. If the "resistance" in Iraq confined its attacks to American soldiers, they would be freedom fighters. In reality, attacks on American troops are rare. They mostly target other Iraqis who simply aren't the "right" type of Muslim. That barely even qualifies as terrorism; it's more along the lines of a slow, decentralized holocaust.

Imagine if the French resistance in WW2 had split into separate Catholic and Protestant factions and spent all their time killing each other instead of collecting useful intelligence for the Allies. The people of Yugoslavia put aside enormous cultural differences, ceased all internal violence, and totally unified to form the largest and strongest resistance army there has ever been -- and ousted the Nazis themselves. Tito and company were probably the best example of freedom fighters since the American War of Independence.

By way of contrast, consider China during WW2. If the Chinese had cooperated, Japan would never have been able to successfully invade, let alone retain control once they were in. Chinese resistance failed because the Nationalists and Maoists were never able to put their own civil war on hold (although the Maoists apparently tried several times, which is part of the reason the people supported them after the war). It is just mind-boggling how far the Iraqi extremists are from being anything other than a plague upon their homeland.

Why would the British, Iraqi, and other foreign armies be fighting for the establishment of an American Empire? And this wasn't a "daddy had a war, I want one too" thing; this has been in conservative Republican think tanks since before Bush took office. The "flogged to death" comment was spot on; how is it bad here? You face no repercussions for criticizing the Bush administration, military officials, or anyone else (as long as you don't stray into libel, slander, or threats).

I'd say it's more like "We didn't like how you were doing things before, so we're going to change them." Call it an expansion of the Monroe Doctrine, if you will. And yes, the prospect of getting large amounts of oil from a nation other than Saudi Arabia was most certainly a factor.

Not saying that any of this SHOULD have happened. It just sounds like your reasoning is grounded solely in your dislike for Bush, and that makes a poor basis for a rational argument.

The point is: if the computer trusts someone else more than the end-user, in a security sense, then the end-user is not in control of the security of their machine. In a corporate IT context, this is (generally) a good thing. In an individually-owned computer, this is not really a good thing [schneier.com].

That's my understanding of it. The Army can do what it feels it must do to protect its own security. My fear is, as the submitter wrote, "They are a large-enough volume buyer that this might kickstart an adoption loop."

TCP and the whole concept of having trusted binaries running on your machine can indeed be a real boon in a security-conscious environment, provided that you have the tools to make use of that platform.

In itself, TCP isn't inherently evil; the idea makes sense and appears to be reasonably well conceived. What is feared is lock-in, from proprietary software makers coercing the hardware vendors into not releasing the tools to anyone but them.

There might be a glimmer of hope if the trend continues with actions such as the EU vs. Microsoft anti-monopoly suit. This kind of thing, focusing on interoperability, could well be used so that FOSS (and, through that, possibly casual Windows and other commercial users) gets access to all the tools required to fully use the system (i.e., keys, etc.).

I don't have a problem with adoption so long as its use is not mandatory. I don't believe I've seen a single proposal that would mandate using this technology in a way that could undermine the end user. Sure, it might be used to tighten up existing DRM systems. But I don't use DRM, and I have no intention of doing so in the future. So why should this bother me?

From what I understand, Trusted in this context is used as in "I entrust it with my security" rather than "I find it worthy of my trust."

No, that's a common fallacy; in fact, it's an intentionally constructed fallacy. Trusted in this context means that you have evidence to trust that the computer will behave in a specified way, particularly from the point of view of remote access. Normally, when you connect to a computer remotely, you have no way of knowing what it's doing. It could be running essentially any software at all. But if you connect to a Trusted Computer, it provides cryptographic evidence about its software configuration. Knowing what software it is running gives you grounds to know how it will behave, and to trust that behavior. That is the real meaning of Trusted Computing.

And its real use is Digital Rights Management. This doesn't just mean preventing people from playing MP3s; it means ensuring that only the software that the document author or the software vendor authorizes to open a document can open that document. There are actually good security uses for such authentication. Unfortunately, it also means that documents become much more traceable, and that the encryption keys for almost all such software, especially purchased software keys, are sitting in a database somewhere…
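The remote-attestation flow described above can be sketched in a few lines. This is a toy model, not the real TPM protocol: a real TPM signs the report with an RSA key certified by the manufacturer, whereas here an HMAC with a made-up shared key stands in for that signature, and all names and hash inputs are illustrative assumptions.

```python
import hashlib
import hmac

# Stand-in for the TPM's private key (an assumption for brevity;
# a real TPM uses an RSA key that never leaves the chip).
TPM_KEY = b"stand-in-for-the-tpm-private-key"

def attest(software_hashes):
    """The remote machine reports its boot measurements, 'signed'."""
    report = b"|".join(software_hashes)
    signature = hmac.new(TPM_KEY, report, hashlib.sha256).digest()
    return report, signature

def verify(report, signature, known_good):
    """The verifier checks the signature, then the reported configuration."""
    expected = hmac.new(TPM_KEY, report, hashlib.sha256).digest()
    if not hmac.compare_digest(signature, expected):
        return False  # report did not come from the (toy) TPM
    return set(report.split(b"|")) <= known_good

bios = hashlib.sha256(b"bios-image").hexdigest().encode()
kernel = hashlib.sha256(b"kernel-image").hexdigest().encode()
report, sig = attest([bios, kernel])
print(verify(report, sig, known_good={bios, kernel}))  # accepted
print(verify(report, sig, known_good={bios}))          # rejected: unknown kernel
```

The point of the sketch is only the shape of the protocol: the verifier learns what booted and applies its own policy; it does not control the machine directly.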

Actually, Trusted in this context means "the people in control can trust my computer to be secure against me," where "the people in control" refers to those who hold the private key to the TPM. In the case of the general public, this is the Trusted Computing Group (which includes such bastions of personal freedom as Microsoft); in the case of the Army it should be the Army, but I fear it will still be the Trusted Computing Group.

See, that's what's so bad about Trusted Computing: if the owner of the PC had…

If I am clinging to the side of a cliff, I'll take whatever rope I'm thrown. It'll probably be better than holding to the rock, waiting for my strength to give out.

My point was that users like me may someday be using one of these whether we're happy about it or not. If it comes down to a choice between using a 10-year-old machine and a new one, I may end up using this Trusted stuff. I may not like it, but I'll take it because it's better than trying to get Enemy Territory 4: The Axis Finally Win MLB Tem…

My '85 Buick Electra (I still miss him) was a Trusted Transportation Platform. It was what I had. I Trusted it to get me from home to college and around town and back. At 280,000 miles, some would think it unworthy of such trust. I Trusted it.

Now, the real fun begins: The pointing-out of the flaws in the analogy. Bring it on!

(Actually, I love car analogies for 2 reasons: they are fun to make up, and fun to shoot down (even when mine is getting sho…

Well one could say that you were the one trusting your car, while in the case of Trusted Computing the chips are leashes. Trusted Computing is not about trust, really, it's about restrictions and control.

Well, I think a correct car analogy for Trusted Computing would be not YOUR car but your DADDY's car. You would trust your daddy to issue you the keys when you needed them, and your daddy would trust you not to damage the vehicle. Of course, any time there was any conflict between you two ("dad, I swear to God that this scratch was here before!"), daddy would have the final say ("swear to anyone you want, kid, but you're grounded").

And you could only trust that your dad won't abuse his power. TPM is the same, provided that you trust that Microsoft, Apple, et al. love you like your parents do.

If your government or seller or whatever doesn't trust you, doesn't even try in the least, how the hell are you supposed to trust him? The most logical path would be to fully distrust him.

Given how often and severely government suppliers and contractors like Halliburton, Bechtels-Parsons, etc. engage in all manner of willful, obvious fraud, anyone in the government that trusts their supplier is most likely benefiting in some way from the fraud. I think the challenge wouldn't be to name all the suppliers…

TFA says:
Is TCG creating specifications for just one operating system or type of platform?
No. Specifications are operating system agnostic. Several members have Linux-based software stacks available. In addition to our work on the PC platform, we have a specification for Trusted Servers and are working to finalize specifications for other computing devices, including peripherals, mobile devices, storage and infrastructure.

No. Specifications are operating system agnostic. Several members have Linux-based software stacks available. In addition to our work on the PC platform, we have a specification for Trusted Servers and are working to finalize specifications for other computing devices, including peripherals, mobile devices, storage and infrastructure.

This doesn't answer the question at all.

It all depends on who controls the root certificates that are used by the trusted computing hardware to verify the signatures of the BIOS and of the boot image.

It all depends on who controls the root certificates that are used by the trusted computing hardware to verify the signatures of the BIOS and of the boot image.

If the FOSS fraternity is left out in the cold by the certificate authority, this will lead to some almighty class-action-type litigation. It would be utterly anti-competitive to lock out a huge potential competitor, and Europe in particular would have a field day with Microsoft. Look at the trouble MS got into merely by locking people to their browser.

It all depends on who controls the root certificates that are used by the trusted computing hardware to verify the signatures of the BIOS and of the boot image.

I'm sorry, but you don't know how Trusted Computing works. Almost everything you have been told about it is a lie.

There are no root certificates used by TC hardware to verify the signatures of the BIOS and the boot image.

What happens is that the BIOS, OS loader, and potentially the OS itself send information to the TPM chip about the hashes of the software that is loading. User software can then, if it chooses, query the TPM chip and get a cryptographically signed message telling what these hashes are. The software can use this to report the software configuration that booted.

The root certificates get involved because the TPM crypto key never leaves the chip. The TPM manufacturer has a root certificate which it uses to sign each TPM key. This way people can tell that a message actually comes from a valid TPM and not a fake. It prevents virtualization of TPMs. This is what allows software to report its configuration in a trustable way. It is what gives the system its name, Trusted Computing.
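The measurement chain described above can be sketched as a simple hash chain. This is a simplification: TPM 1.2 actually computes SHA1(oldPCR || digest) where the digest is a 20-byte value supplied by the measuring software; the stage names below are invented for illustration.

```python
import hashlib

# Toy model of a Platform Configuration Register (PCR): the TPM only
# ever *extends* a PCR -- new = SHA1(old || hash(measurement)) -- so
# no boot stage can erase the record of what ran before it.

def extend(pcr, measurement):
    return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

pcr = b"\x00" * 20  # PCRs start zeroed at reset
for stage in (b"bios-code", b"os-loader", b"kernel-image"):
    pcr = extend(pcr, stage)

# Replaying the same stages in the same order reproduces the value;
# changing any stage, or the order, yields a different digest.
replay = b"\x00" * 20
for stage in (b"bios-code", b"os-loader", b"kernel-image"):
    replay = extend(replay, stage)
print(pcr == replay)  # True
```

The signed "quote" mentioned in the parent comment is essentially the TPM signing this final PCR value with its chip-resident key.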

The TPM manufacturer has a root certificate which it uses to sign each TPM key. This way people can tell that a message actually comes from a valid TPM and not a fake. It prevents virtualization of TPMs.

Unless the root certificate gets stolen.

Not that I would ever advocate such a thing, goodness, no! It would mean that we, the computer owners, would have complete control over our property -- and then Disney might lose potential future profits! Clearly Disney's intellectual property rights trump our property rights.

While doubtless you are technically correct, for desktop computing I'm not sure it makes much difference, since only the Windows hash will allow secured files to be opened and secured apps to be run. Microsoft will easily be able to convince the MPAA/RIAA that the only safe hash is the Windows one and make the Office formats "secured" to the Windows hash. Some organisations like Debian may not wish or be able to restrict people's rights to their own machines, so there will be no reason for anyone to value their…

these stacks might involve a GPL shim and a non-GPL binary that's checked and verified by the TPM

No, the main one is TrouSerS [sourceforge.net]. It's fully open source and GPL'd. Contrary to the many lies which have been circulated about it, TC is fully compatible with Linux. In fact, that's where most of the research and development work is at this time. Trusted Grub [sourceforge.net] is another good example. It hashes the Linux kernel and some of the config files into the TPM chip before booting it. This way Linux systems can prove what kernel they booted.

Bullshit... you cannot modify the code, compile it, and still have it be "trusted". Yet more disingenuous crap from the trusted computing shills.

Of course it's not still trusted. It's different. You can't change your password and have it still verify to the same hash either, can you? The hash proves what kernel you loaded, if you load a different kernel, it'll be a different hash. What you can do, if you are in the position to trust or distrust binaries, is just mark the new kernel as trusted. No problem.
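The parent's point can be sketched as a plain hash whitelist. Everything here (function names, kernel images) is illustrative, not any real tool's API: the hash identifies a kernel build, and an administrator-maintained list decides which hashes count as trusted.

```python
import hashlib

# Illustrative whitelist: rebuilding the kernel changes its hash,
# and an administrator restores trust by whitelisting the new hash.
trusted_kernels = set()

def mark_trusted(kernel_image):
    trusted_kernels.add(hashlib.sha256(kernel_image).hexdigest())

def is_trusted(kernel_image):
    return hashlib.sha256(kernel_image).hexdigest() in trusted_kernels

old = b"vmlinuz-2.6.18"
new = b"vmlinuz-2.6.18-patched"

mark_trusted(old)
print(is_trusted(old))   # True
print(is_trusted(new))   # False -- until the admin marks it
mark_trusted(new)
print(is_trusted(new))   # True
```

As the parent says, the dispute is not whether this works but who gets to call `mark_trusted`.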

What you can do, if you are in the position to trust or distrust binaries, is just mark the new kernel as trusted. No problem.

No, there is a problem. In fact, it's a huge problem. The problem is, the users are NOT in the position to trust or distrust binaries!

Because Microsoft et al. designed the system to be secure against the user, they made it a point to withhold the private key so that all signing is done by them, not the user. Considering that the entire point of the GPL is to have the user in control…

The problem is, the users are NOT in the position to trust or distrust binaries!

Why isn't the user in the position to trust binaries? In a TPM-supporting Linux stack, the only people in control of trust or distrust are the administrators of the system. The hardware doesn't block software; the software uses the hardware to authenticate it. The software can then block it based on the rules set up by the administrator.

"The TCG design does not have any requirement that software be "certified" in order to use it. The specification talks in some length about ways of using the platform to create certificates for keys that are provably secure and yet not identify the platform they came from."

In principle then, FOSS operating systems should be able to use TPM to enhance the trust that their owners have in them, in contrast to the way in which MS systems will use it to enhance the trust that content providers have in the platform.

...in contrast to the way in which MS systems will use it to enhance the trust that content providers have in the platform.

I personally think of this as FUD to some degree, simply because if one does not buy DRMed media, it doesn't affect MS users in any way. People seem to confuse a system supporting something with its mandatory use, which hasn't even been proposed.

That's my point. A lot of people seem to be implying that mandatory use is a given, when I'm really not certain that it will ever come around. It doesn't look like a smart business decision for anyone, to me.

The way Treacherous Computing works is by only allowing privileged operations to be run by "Trusted" (i.e., cryptographically signed) binaries. Even though you could get a binary of the Linux kernel signed by the certificate authority, it destroys the point, because if you exercise your rights under the GPL by modifying and rebuilding the software, it's no longer "Trusted" because it's not signed.

Yes it's true. After you make changes to the sourcecode of software and re-compile it, it's no longer 'Trusted'.

I have a Dell laptop with a TPM chip, which was also non-functional until explicitly enabled in the BIOS. I enabled it to play with the file encryption functionality it offered, but it turned out to be impractical. Judging by the performance I get, the TPM chip seems to have a 9600 bps serial bus connecting it to the motherboard.

If you buy a business-oriented motherboard from Intel, there is generally an option for a board with TPM. My 915GEVLK has the integrated video and audio and gigabit LAN I wanted, along with TPM which I can disable in BIOS. So long as it's not drastically raising the price of the board, there's nothing wrong with letting the end user have an extra chip or two that he can choose to use or not.

I personally abhor the notion of Trusted Computing on my personal computer, but if you're using a computer provided to you by the government or a corporation for the express purpose of working, it's their right to control what goes on on that computer. It's possible that this will help to stem the tide of malware (at least in corporate environments) by rejecting execution privileges, and allow IT staff to better enforce policies about what can and cannot be run on their computers. It would also help stop things like the Free USB Key Attack [darkreading.com] (previously discussed [slashdot.org] on slashdot).

Of course, this could also make users feel like they are not trusted, and could even lead to overconfidence in the security of the system. Still, I see it as a major plus, at least unless I get saddled with it at home.

Given the way DRM is implemented, it amounts to a serial chain of single points of failure, and that's what TCM is supposed to be the basis of. As errors in military procurement are the standard, not the exception, this strikes me as, um, just a tad stupid (I think this may later emerge as the understatement of the century). In addition, for a sovereign nation it is, of course, a perfectly sensible idea to hand the on/off switch of your entire infrastructure to another nation, potentially giving rise to a whole…

All of Apple's Intel-based Macs have a TPM module, in order to restrict Mac OS X to running on genuine Apple hardware. Does this decision pave the way for Apple to become a preferred supplier, as shortly their entire model lineup will feature TPM modules with a relatively secure operating system?

TC provides a computing platform on which you can't tamper with the application software...

That's a total lie. Almost everything in that piece of propaganda masquerading as a FAQ is a lie.

If you want the truth about TC, try Seth Schoen of the EFF. He has a good summary in his recent blog entry [loyalty.org]:

What the TPM does do is support remote attestation so that a computer user can tell the computer to prove to a remote party what software it is running (if the software that's running also supports being proven in a way that the remote party understands). Then the remote party can make its own decision about whether the software is good or bad, and what it wants to do about that. The user could also choose not to offer any proof at all; however, although the user has the right to remain silent, the user's silence can and will be used against her. Not offering proof is, of necessity, the functional equivalent of offering proof of the most unacceptable and contrary-to-policy facts imaginable.

This sounds innocuous in a certain sense. We have learned to mistrust the notion of a single centralized entity that decides what we can and can't do. TCG is not that entity, and TCG is not chartering that entity; instead, we have an unlimited number of entities that potentially make their own decisions, on various scales, about what we can and can't do in particular contexts, small and large. (We don't know yet which of those entities will turn out to have enough power to set which kinds of policies, or how the network externalities will shake out. Some entities with a lot of power, like Microsoft, can try to delegate some of their power, but there are plenty of technical and business obstacles to be worked out on both sides of that sort of delegation.)

That does offer an avenue for a lot of control over you via your computer -- if someone else controls a resource that you need, there is a prospect of conditioning your access to that resource upon the provision of proof that you're running software that the resource controller considers "good". Not TCG, but the individual entities that you deal with: a bank, an entertainment company, an employer, an ISP. Furthermore, each of them could have its own independent definition of what "good" means, because there is no central signing or certifying authority. It is logically quite possible that one entity might refuse to talk to you if you're running configuration A instead of B, whereas another entity would refuse to talk to you if you're running B instead of A. (This is trivially true if each entity gave you a bootable CD and said "you can only communicate with us while you're running from our CD" -- with a TPM and the appropriate software, they can actually tell, and you probably can't fool them.)

The ISP scenario is the point at which the most pervasive possible control could be exercised. TCG has already developed a specification called Trusted Network Connect which is based on the idea that you can be forbidden to connect to a network unless you're running a software configuration that the network operator approves. This is designed for use in corporations, most of which are accustomed to having a high (but imperfect) degree of control over the software running on their employees' PCs. Of course, the technology is more general, and, as TCG told me, there is nothing to stop it from being used by the People's Republic of China, or by a commercial ISP.

Imposing this requirement on a general population has a very high cost; for one thing, it mea…

You could well end up with a choice of only two sources of information: the media conglomerate that owns your cable company, local newspaper, and local network affiliate television station, or the other conglomerate that owns your DSL service, most of the radio stations…

We recently visited a customer who seems to be on the verge of announcing that anybody accessing their systems with any sensitive information will be required to use e-Gap, a dongle-based security system from a Microsoft subsidiary (not to be confused, as Google does, with electronic Grant Application and Processing). The internal IT people told us e-Gap would refuse to allow a client to connect if it did not have working anti-virus installed, and that in order to verify this, ActiveX objects would be downloaded to inspect the system. If I have this wrong, apologies, but I'm reporting what I was told.

This is a worrying scenario. Apart from the minor issue that external users will not want to pay for the dongles and that the internal customer is seeing his IT bill spiral, Trusted Computing seems to be heading to a Mexican standoff situation as follows:

Device 1: Permit me to inspect your system by downloading and running this program.
Device 2: Only after YOU have allowed me to verify your credentials by uploading and running this program.
Device 1: No, it is I who am deciding whether you are to be trusted!
Device 2: No, it is I who am deciding that!
Device 1: Anyway, my content is digitally signed by Microsoft, and you must trust it.
Device 2: Microsoft? Not a hope in Hell. I require all downloads to be digitally signed by Steve Jobs in person with a DNA signature.

And so on. Quis custodiet ipsos custodes? And how long before an army unit gets wiped out because of a defective dongle?

What's most amusing about that is the number of fake "verification" sites it will lead to, loaded with ActiveX controls that actually are disguised rootkits... grab some large company's key, and then you could pose as them and -- since users would be used to just running ActiveX controls from that company -- nab their computers during the "security sweep." I love the irony: use a technology probably responsible for more zombified machines than any other... in order to ostensibly secure them.

Unfortunately, if this type of tech gets into citizens' living rooms, they will probably not have the option of requesting credentials from all the important services. Governments/corporations do not want to be forced to provide actual, working credentials that can hold them accountable, so I really doubt they would allow the tech (read: Wintel) to do that.

Of course, then this opens up the whole issue of a service getting 0wned and then securely propagating trusted malware.

"Federal Computer Week is reporting that the US Army will require hardware-based security via the Trusted Platform Module standard in all new PCs. They are a large-enough volume buyer that this might kickstart an adoption loop."

Let's say the US Army buys a million night-vision goggles. Would that mean bird-watchers would throw away their good old binoculars and go in for this one?

The TPM is actually a very sound functional and business requirement in the Army... it provides for centralised surveillance and…

So now the question is, will it be legal to transfer (or as they say, "convey") GPLv3 [slashdot.org] software to a Trusted Computer? It violates the principle that users must be able to alter their software in such a way that remote servers can't tell. Will that make it illegal to run GPLv3 software on a TC?

It makes sense for the Army to require TCP. Stolen or lost laptops wouldn't immediately result in a security leak. But this can be achieved more cheaply, more quickly, and (here comes the key point) with more control on the Army's side. Linux can encrypt documents in just the same way TCP offers to; the difference lies in the open-source concept: it inherently gives you the ability to check the security of your system (provided you can read code, but I guess the Army can afford to hire someone who does).

TCP requires you to trust the person or group that made the security for you. You put yourself completely into the hands of the corporation(s) that create your TCP platform, and you are fully dependent on their ability to come up with a good protection scheme. Not to mention that you have to trust, implicitly, that they do not want to spy on you and that they are better than their adversaries.

With TCP you hand over the responsibility for security. But you also hand over control. And it has the potential to lure you into a false sense of security, which invariably leads to slacking. More than once I've seen this kind of neglect in a high-security area (I've had my share of time in that field), with people relying so heavily on the technical implementations that they forgo the most basic security measures called for by common sense, because "Hell, what DO we have that security concept for, if I can't trust it fully?"

A country's armed forces ought to have the power to demand the full source code of every application running on their computers, and the resources to write all their own software wherever necessary. There is no shortage of Open Source applications they could use as starting points...

If I gather correctly, the TPM takes care of providing decryption keys to the operating system once it can confirm the system is in a known state. What I still don't understand is how this "known state" together with the necessary decryption keys are communicated to the TPM in the first place. Is there a central authority that takes care of this? If so, how would this affect Open Source operating systems?
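As a rough answer to the "known state" question: no central authority is needed for this part. A secret can be "sealed" to the current PCR values, and the chip releases it only when the live PCR state matches the state at sealing time. The toy class below models just that idea; it is not the real TPM_Seal interface, and all the stage names are invented.

```python
import hashlib

class ToyTPM:
    """Illustrative model of PCR extend plus sealed storage."""

    def __init__(self):
        self.pcr = b"\x00" * 20        # PCR starts zeroed at reset
        self._vault = []               # (sealed_pcr, secret) pairs

    def extend(self, measurement):
        # new PCR = SHA1(old PCR || hash(measurement)): append-only record
        self.pcr = hashlib.sha1(
            self.pcr + hashlib.sha1(measurement).digest()
        ).digest()

    def seal(self, secret):
        # bind the secret to the *current* PCR state
        self._vault.append((self.pcr, secret))

    def unseal(self):
        # release only the secrets whose sealed state matches right now
        return [s for sealed_pcr, s in self._vault if sealed_pcr == self.pcr]

tpm = ToyTPM()
tpm.extend(b"known-good-kernel")
tpm.seal(b"disk-encryption-key")
print(tpm.unseal())       # key released: state matches

tpm.extend(b"rootkit")    # any further measurement changes the PCR
print(tpm.unseal())       # nothing released: state no longer matches
```

In this model the "known state" is simply whatever state the owner sealed under, which is why who controls the sealing (and the signing keys) is the contested question in the rest of this thread.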

Give the power to disable software used by the US military to tech companies. Brilliant; why didn't anybody think of this earlier? Will software vendors be permitted to run validation servers on SIPRNet?

ATTENTION DOD EMPLOYEE:
MICROSOFT HAVE DISABLED THIS SYSTEM AS WE ARE IN THE PROCESS OF NEGOTIATING A GOVERNMENT CONTRACT WITH IRAN. THE FUNCTIONALITY REQUIRED TO WAGE WAR WILL BE RESTORED WHEN THIS TRANSACTION COMPLETES.

TCG/TPM stuff, though not completely finished (the DAA mechanism introduced in v1.2 is a good example of how the TCG adapted to outside criticism, and they're starting to work on v1.3) and surely not understood (the word "trust" is a huge factor in that), is having the same effect as PKI a few years back. Except that nowadays, times of ignorance and fear (in particular of the big companies behind the TCG) multiply this effect by thousands. "Trust" is more and more acting as the point of concentration of security problems, its complexity being coupled with new, emerging (and very innovative) threats.

First, think of the TPM as a chip that provides standard cryptographic functions (RSA, SHA-1, HMAC, AES), so instead of doing them in software, anyone will be able to use hardware implementations. Furthermore, there are facilities for key creation and management. With this special focus on a "security chip" (such chips already existed in various forms), the designers hope to drastically improve the level of security in modern computing (95% of email is spam, botnets of millions of computers, hackers making huge money from their work, ransomware, etc.).

Obviously this TECHNOLOGY (and please always keep this in mind: it's a tool, to be used by other applications, most importantly OSes, to improve security; apart from secure boot, which is not compulsory at the moment, there's no obligation to use the TPM even if it's there) is not perfect; it will evolve. It will have to CONVINCE, to earn TRUST. As I keep saying to my Trusted Computing colleagues, I think the challenges set by the opponents of the TCG are actually a means to improve the security of this technology (but beware of popularity-seeking criticism; not all the criticisms are well founded).
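For what it's worth, the standard functions named above have everyday software counterparts; a TPM mainly moves them, and the keys, into hardware. A quick illustration with Python's standard library (the key and messages here are made up for the example):

```python
import hashlib
import hmac

# Software equivalents of two of the primitives a TPM offers:
# a SHA-1 digest (as used for boot measurements) and an HMAC
# (as used for keyed integrity checks).
digest = hashlib.sha1(b"boot-measurement").hexdigest()
mac = hmac.new(b"secret-key", b"attestation-report", hashlib.sha1).hexdigest()

print(len(digest))  # 40 hex chars: SHA-1 is a 160-bit (20-byte) digest
print(len(mac))     # 40 hex chars: HMAC-SHA-1 output is the same size
```

The difference with a TPM is not the math but where the keys live: in software the key above sits in memory; in the chip it never leaves the hardware.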

This would be a really worrying thing, but the fact is TPM has already won. It won the instant that Apple adopted TPM and the communities who were publicly worrying and complaining about Palladium and Trusted Computing for all those years went suddenly silent and shrugged the instant that nebulous notions like "freedom" came into conflict with solid, purdy white plastic.

Here is the thing: TPM's adoption was waiting not on an adoption cycle exactly, but an apathy cycle. TPM was never something that the consumer was supposed to approve of, want, or even really know was there. The adoption of TPM was mostly counting on the consumer not having any idea what they were buying, counting on the blinking 12:00 effect, counting on the idea that most consumers would not even know TPM was in their computer until the first time that they try to do something and the computer says "no".

TPM isn't there for the consumer. It's there to protect the computer from the consumer. It's there to allow software and content vendors to trust your computer, to trust your computer to act in their interests and not yours. TPM is being done for the benefit of those vendors, not the consumer. This means that in order for TPM to win, it isn't necessary for the consumer to "adopt" it. All that has to happen is for the consumer to fail to actively reject it when it is quietly dropped into the hardware they were going to buy anyway.

And that's already happening. So although the military would legitimately represent an adoption cycle (the military, of course, has a legitimate and logical need to create networks within which the machinery is trusted and the user is absolutely not), it doesn't really matter. The military isn't the kind of adoption TPM needs to reach enough critical mass that vendors can begin requiring it in new applications, I don't think; it's not like military hardware is going to be used to run lots of games and DRMed consumer media, as far as I know. The worrying thing is TPM's level of adoption in the consumer segment, since that's where it has the potential to do actual harm. And that's already begun, and so far nothing is happening to stop it...

You're in the Army. You're in the field under fire. You have a hardened Army laptop. You are sending and receiving vital messages back and forth with another unit directing fire around your position. Your laptop doesn't have any software or files on it that are personal to you. Not your music. Not your games, etc. What it has is a trusted and fool-proof means of sending and receiving messages that you can trust with your life and the lives of your unit. Therefore, you trust the info on your Army-issued laptop.

To say that "the Army" is requiring all PCs to do anything is questionable at best. What this appears to apply to is the enterprise systems. That's maybe a couple hundred servers that fall under the command of NETCOM. I see no mention of NETCOM having responsibility for things like desktops, agency-by-agency servers, etc. Never can tell, though.

BZZZT, wrong... with a Linux-based software stack, you should be able to sign your own code and thus ensure that only code you've signed, and code signed by others YOU trust, can be run...
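The idea of "only code I trust can run" can be sketched in a few lines. This is a toy illustration only: it uses a SHA-256 hash allowlist as a stand-in for real signatures, whereas actual Linux mechanisms in this space (e.g. IMA/EVM or dm-verity) verify cryptographic signatures against keys you enroll yourself.

```python
# Toy sketch of "only code I trust can run": before executing a file,
# check its digest against a locally maintained allowlist. A bare hash
# allowlist stands in here for real signature verification.
import hashlib
import os
import tempfile

def file_digest(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def may_execute(path, allowlist):
    """Permit execution only if the file's digest is one we approved."""
    return file_digest(path) in allowlist

# Create a "binary" we have reviewed and trust
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"#!/bin/sh\necho trusted\n")
    trusted_path = f.name

allowlist = {file_digest(trusted_path)}
print(may_execute(trusted_path, allowlist))  # digest matches the allowlist

# Any modification changes the digest, so the check fails
with open(trusted_path, "ab") as f:
    f.write(b"echo tampered\n")
print(may_execute(trusted_path, allowlist))
os.unlink(trusted_path)
```

The key property is that trust anchors (the allowlist, or in real systems the signing keys) are under the owner's control, which is exactly the distinction the parent comment is drawing.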

Signing your own code is not what he's talking about. Signed, and encrypted, code downloaded to run on your machine from elsewhere and how it is used is totally at the mercy of what vendors stipulate can be done with it. If they want an effective way of timebombing software because you haven't paid up then they have the framework to do that. If they want to break data protection laws and start communicating usage statistics and other sordid details, encrypted and safe from prying eyes, then they now have a means for doing that. It also means that it is almost certainly going to be nigh on impossible to switch to a competing vendor's products.

Some people seemingly have no idea what the "trust" in Trusted Computing actually means. What it means is that external people and organisations, particularly software vendors, content companies, etc., get a way to trust my computer or equipment. Whether I can trust the computer or electronic equipment I own, and what the software running on it actually does, is an entirely different matter. It's a fundamental shift in the idea of how computers work that will probably end in anarchy and chaos.

There is no excuse for the military to use any Microsoft-provided software, other than the expectations of the purchasing agents to "retire" into fat civilian jobs.

As much as I dislike many Microsoft products, I can't let this go by. The federal procurement system is too complicated for some random purchasing agent to have much influence over major procurement decisions. The reason that the federal government, including the military, buys Microsoft Office, is that they are trying to save money by purcha

...minimizing costs and risks. Compared to the other costs of doing business, the cost of a Microsoft Office license is minimal.

WTF are you smoking? Between the legendary insecurity of Microsoft software and formats, and the fact that the formats are proprietary (meaning they will be expensive to archive and maintain), MS Office is the worst possible thing for the military to use!

The module holds keys, but the Army will not be able to control the installation of keys into the module.
How does this make the system trustworthy?

Think of the Army as an ISP. Whenever a computer tries to connect to their network, they can query the TPM module to verify that the configuration of the machine matches what they allow - not only that they have allowed it (not sure how their network looks, but think "on the domain" here), but also that someone hasn't snuck in, sat at an authorized-and-logg
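The query-and-verify flow described above can be sketched as a simplified remote attestation. This is a model, not the real TPM protocol: a real TPM signs a "quote" over its PCR values with a key that never leaves the chip, whereas here an HMAC with a pre-shared device key stands in for that signature.

```python
# Simplified sketch of the attestation handshake described above.
# An HMAC with a pre-shared device key stands in for the TPM's signed
# quote; the verifier's nonce prevents replay of an old, good report.
import hashlib
import hmac
import os

DEVICE_KEY = os.urandom(32)  # provisioned into the "TPM" at enrollment

def machine_state_digest(measurements):
    """Fold the measured boot components together, mimicking a PCR value."""
    pcr = b"\x00" * 32
    for m in measurements:
        pcr = hashlib.sha256(pcr + hashlib.sha256(m).digest()).digest()
    return pcr

def quote(nonce, measurements):
    """Client side: report machine state, bound to the verifier's nonce."""
    state = machine_state_digest(measurements)
    sig = hmac.new(DEVICE_KEY, nonce + state, hashlib.sha256).digest()
    return state, sig

def verify(nonce, state, sig, expected_state):
    """Server side: check both the signature and the allowed configuration."""
    expected_sig = hmac.new(DEVICE_KEY, nonce + state, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected_sig) and state == expected_state

approved = [b"approved bootloader", b"approved kernel", b"approved policy"]
golden = machine_state_digest(approved)   # the configuration the network allows

nonce = os.urandom(16)                    # fresh challenge per connection
state, sig = quote(nonce, approved)
print(verify(nonce, state, sig, golden))  # True: matches allowed config

tampered = [b"approved bootloader", b"rootkit", b"approved policy"]
state, sig = quote(nonce, tampered)
print(verify(nonce, state, sig, golden))  # False: configuration differs
```

Note that this lets the verifier trust the machine's reported state; as other comments point out, it does nothing to help the machine's owner trust the verifier.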