What's the difference? In most cases, only the
use to which you put the tool. Security is
a fascinating subject because it exercises both
your logical, problem-solving side—what would an
attacker have to compromise to get from point A
to point B—and your conscience.

You've often heard that security has to be
designed in, not bolted on. That makes everyone
in information technology a security professional,
whether it says “security” on your business card
or not. And as a security professional, you have
to consider security threats at two levels: the
many small attacks from people who want to copy
credit-card numbers, send spam and deface
web sites, and the larger, slower attack from
those who want to destroy our civilized way of
life on the Net, with all its messy free speech,
and institute a tidy regime of surveillance and
“digital rights management”.

Professor Lawrence Lessig, in Code and
Other Laws of Cyberspace, makes the most
powerful case for considering your beliefs and
your politics when you go to work on technology.
Code is law. How you build a system affects how
some users of the system can regulate others.
So the security you put into place to protect you
from small attacks should not facilitate the one
large attack on freedom itself.

It's important to let your conscience guide your
technical decisions, but it's just as important to
back up your political positions with the facts
about the technologies to which they apply.
Proposals for “trusted computing” are the subject
of justifiable concern among freedom lovers.
Nobody wants to give up the PC for a sealed box with a
so-called Fritz chip, named after authoritarian
US Senator Ernest “Fritz” Hollings, that would
prevent you from running a free operating system
or recording your own music.

But Fritz chip hysteria is sometimes misdirected
at new technologies or proposed specifications
that wouldn't take away your freedom to run the
software of your choice and might even have
some beneficial applications. Is the Trusted
Computing Platform Alliance unfairly maligned?
Read the article on TCPA by David Safford, Jeff
Kravitz and Leendert van Doorn on page 50, then
get their free TCPA code and decide for yourself.

You can give a big boost to your personal
information security by encrypting your home
directory. Making it work seamlessly is tricky,
though, and Mike Petullo addresses the hard parts
head-on on page 62.
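Whatever tool you use, encrypting a home directory rests on turning a passphrase into a fixed-size key. As a rough sketch (the function name and parameters here are illustrative, not the API of any particular tool), Python's standard library can show that derivation step:

```python
import hashlib
import os

# Sketch only: derive_key is a hypothetical helper, not any real tool's API.
# The common building block is stretching a passphrase into key material
# with a salt, so identical passphrases don't yield identical keys.
def derive_key(passphrase: str, salt: bytes, iterations: int = 100_000) -> bytes:
    # PBKDF2 makes brute-forcing the passphrase slow; the salt is stored
    # in the clear alongside the encrypted data.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"),
                               salt, iterations)

salt = os.urandom(16)            # stored alongside the encrypted data
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32            # suitable as an AES-256 key
```

The same passphrase and salt always give the same key, which is what lets you unlock the directory again at the next login.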

The US National Security Agency's SE Linux is one
of the hottest topics in security today, and Faye
Coker gives us an introduction in Kernel Korner
on page 20. Russell Coker follows up on page
56 with a report on what happens if you give out
the root password—can the SE Linux rules alone
protect the system?

Daniel R. Allen has written a helpful article
on one of the most common Linux security tools,
OpenSSH, and Mick Bauer continues his series on
OpenLDAP, a multifunctional directory service.
There's plenty of thought-provoking information
this issue, so stay informed and, in the immortal
words of the Google employee handbook, “Don't
be evil.”

Now ask yourself: do you want to redesign the memory bus of your computer just so you can read your own documents (because that is what it would take)? If the answer is "no", I suggest you stay as far away from TCPA as humanly possible.

Code is rules; code is expression; code is code. The law comes in when the rules are broken. Ideally, the law has been properly thought through and the rules designed with the best method in mind; the law then serves as a guide to determine the best course of action.

Code is not law; laws are passed in Congress and enforced by the authorities empowered to do so. By saying code is law, you're implying that the federal government is somehow watching over our shoulders. If the government is watching over our shoulders, then it is using valuable and limited resources in the wrong place. Programmers are not criminals and should not be 'scared straight' when the majority of us aren't criminals.

This is succumbing to an unfair and unconstitutional imposition on our personal rights. Linux in and of itself never needed any outside help on security; it seemed to have been doing just fine! This *is* an affront to our rights. You say it is important to back up one's political positions with the facts about the technology. The fact about this technology is that it is a form of eavesdropping and extortion. Potentially, a web site can refuse access to a user if the user does not have this chip installed or enabled on his computer. This is like a bank refusing to let a person deposit money because he doesn't wear a certain blazer.

And no, code is *not* law; law is law. You don't walk around saying that the lyrics to a song are law, or that the script to a movie is law. Code is *not* law. To say otherwise is to surrender the power of the law!

'Don't be evil'! What kind of way is that to sign off? Have you encountered some kind of evil recently that prompted you to say that?

...must be pretty stupid to resort to spamming a GNU/Linux magazine/site, given the level of technological knowledge of its readers.

If a product's or project's developers show such utter stupidity, how useful can the application be?

I keep a list of applications that I normally recommend to companies and individual users new to GNU/Linux. I thoroughly review and test these applications before recommending them. My clients have learned, over the years, to trust my judgment and advice when it comes to selecting and implementing various applications and technologies.

I can assure the linuxcad developers that I will never look at linuxcad because of this spamming incident.

Spam is a pestilence that must be wiped out. It's seldom that a spamming incident hands you the spammer's identity outright. Thanks to your sheer stupidity, I know exactly who the spammer is in this case.

This chip sounds like a nightmare for testing and auditing. Of course, testing has always been a difficulty in the field of cryptography.

What are the possible uses of the chip? It seems it might be useful for some organizations' concept of DRM. As a general security device it seems problematic. The question is, "Who _owns_ the keys?" Who can determine what they will be? A hardware chip is a deterministic device. For those keys that do not include the optional user secret, it seems the answer is the manufacturer of the chip. It also seems a near-impossible task to test that the chip does anything with a user secret other than storing it.

And so, it seems to me, the end result of this hardware chip is that any user who relies on it for general security automatically adds the manufacturer and its employees to the user's circle of trust. I guess this is something of an improvement over a software solution, to the extent that the chip can only be modified or replaced by someone who has physical access to the machine.

Some questions for those who would use this chip for general security: What are the possible impacts of selling my machine (with the chip) to someone else? What are the possible impacts of my machine (with the chip) being stolen? Will this chip prompt the creation of a new realm of "identity theft"?
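The "who owns the keys" worry can be made concrete with a toy model. This is an invented sketch, not the real TCPA/TPM interface (the class and method names are hypothetical), but it shows why a key burned in at manufacture puts the manufacturer inside your circle of trust:

```python
import hashlib
import hmac
import os

# Toy model of "sealed storage" on a trust chip. Invented sketch, NOT the
# real TCPA/TPM API. The point: the endorsement key is fixed at manufacture,
# so the chip maker, not the machine's owner, ultimately controls sealing.
class ToyChip:
    def __init__(self):
        # Chosen at manufacture; the owner of the machine never picks it.
        self._endorsement_key = os.urandom(32)

    def seal(self, data: bytes, user_secret: bytes, platform_state: bytes) -> bytes:
        # Bind data to the current platform state plus the optional user
        # secret. (Toy: we only bind with a MAC; a real chip also encrypts.)
        key = hmac.new(self._endorsement_key,
                       user_secret + platform_state, hashlib.sha256).digest()
        tag = hmac.new(key, data, hashlib.sha256).digest()
        return tag + data

    def unseal(self, blob: bytes, user_secret: bytes, platform_state: bytes) -> bytes:
        key = hmac.new(self._endorsement_key,
                       user_secret + platform_state, hashlib.sha256).digest()
        tag, data = blob[:32], blob[32:]
        if not hmac.compare_digest(tag, hmac.new(key, data, hashlib.sha256).digest()):
            raise ValueError("platform state or secret changed; refusing to unseal")
        return data

chip = ToyChip()
good_state = hashlib.sha256(b"trusted boot chain").digest()
blob = chip.seal(b"my data", b"user secret", good_state)
recovered = chip.unseal(blob, b"user secret", good_state)

# A changed platform state (say, a modified bootloader) makes unsealing fail.
bad_state = hashlib.sha256(b"modified bootloader").digest()
try:
    chip.unseal(blob, b"user secret", bad_state)
    tamper_detected = False
except ValueError:
    tamper_detected = True
```

Note that nothing the user supplies removes `_endorsement_key` from the computation, which is exactly the auditing problem raised above: you cannot test from outside what the chip does with it.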

Frankly, what I want to see is the hardware design specs for the chips. Very little else will really reassure me. Sure, I don't know how to read them, but knowing that they're available, and that there are people out there who /can/ read them and will have, would be reassuring in the extreme.

It's not just "who owns the keys" but "who manages the keys". Users will insist that they can specify what software can legitimately run on their systems, so they need to be able to mark code as trusted. But managing this combination of manufacturer keys (which might trust code you *don't* want to run) and local code you do want to run will be rather complex.
I'm not convinced it adds significantly more security than a properly-implemented execute (-x) bit, where the user or administrator has to explicitly approve which programs can run.
So, for most users, TCPA replaces a problem of file and ACL management with a problem of key management, which you can only offload to the vendors at the cost of handing over control (as well as money).
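The execute-bit alternative mentioned above can be sketched concretely. This is only an illustration of the idea (the file and paths are throwaway temporaries), not a hardened policy:

```python
import os
import stat
import tempfile

# Sketch of execute-bit allowlisting: a freshly created file carries no
# execute bit, so it cannot run until an administrator explicitly grants +x.
with tempfile.TemporaryDirectory() as d:
    tool = os.path.join(d, "tool.sh")
    with open(tool, "w") as f:
        f.write("#!/bin/sh\necho approved\n")

    # New files get mode 666 & ~umask at most, never an execute bit,
    # so the kernel refuses to exec this program as created.
    before = os.access(tool, os.X_OK)

    # Explicit administrator approval: grant the owner's execute bit.
    os.chmod(tool, os.stat(tool).st_mode | stat.S_IXUSR)
    after = os.access(tool, os.X_OK)
```

Here `before` is False and `after` is True: the approval decision lives in ordinary file metadata the administrator already knows how to manage, with no vendor-held keys involved.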