Comments for "Full Disclosure and the Boston Farecard Hack" -- a blog covering security and security technology.

Comment from Bill McGonigle on 2008-09-08 (http://blog.bfccomputing.com):
If we assume these guys are doing something valuable, that the 'system' is biased against them, and that they can be bullied by well-funded adversaries, the only prudent approach would be to set up a legal defense fund to protect said individuals. The EFF might get involved if it's a high-enough profile case, but that's hardly reassuring. To counter the chilling effects, one needs to remove the potential downside cost of publishing.
Comment from JBD on 2008-09-03 (http://jbd.mee.nu):
I don't get the whole "free speech" defense of what amounts to breaking and entering. Nor do I believe that somehow free speech is in any way enhanced by these bright fellas nitpicking at marginal security risks. And the notion that free speech is being in some way curtailed by the judge is curious to me. It looks more to me like he's curtailing commercial action (providing deliverables that transfer value), actions that plainly infringe on straightforward property rights. And by any measure, speech is more free, and worth less, now than it has ever been. Keeping these guys from their Vegas end zone dance didn't threaten anybody's free speech rights.

Free speech rights ensure that citizens can criticize the government without persecution. It is not criticism of the government to break into a turnstile and steal money - whether by hammer or by 'warcart' - it is stealing. Teaching other people how to do it is aiding and abetting burglary.

I'm a computer security moron, so I won't attempt to address those areas, or the ethics of the vendors and how they sell their products, topics which are best left to our host and you regulars.

Comment from moo on 2008-08-28:
@David: "It does seem that informing the potential victim and the company that offers the vulnerable item is praiseworthy. Just telling everyone seems alarmist, publicity-seeking and encourages people to attempt break-ins they'd otherwise not do."

You are advocating security by obscurity. The reason it doesn't work was explained in the article -- if they don't go public, there is no pressure on the companies to actually fix the problems.

If it was only the company with the insecure service or product that bore the costs of that security, there would be no problem (the market would sort it out). The problem is that security issues are often externalities, where the cost of dealing with them is dumped on someone else. For example, when governments, corporations and banks lose people's private identifying information or financial details, and fraud is later committed using that leaked info, it is the affected individuals who have to deal with the mess.

Even though the fraud was made possible by the negligence of the government or corporation, and even though it is enabled by the banks and financial institutions who accept that information and give out loans or whatever to the fraudsters, it is the *individual* whose identity was used in the fraud who ends up having to deal with it. Maybe the defrauded bank eats some of the costs, but the individual can definitely be affected to a large degree too, having their credit history trashed, having to dispute with collection agencies or banks, having to file police reports, monitor or freeze their credit report, etc.

None of that would be necessary (or at least, a lot less necessary) if companies and governments were held properly accountable for the security of our information when they collect it and store it and share it around. Instead, the costs of their behaviour are externalized onto us, their customers.

Comment from Alex on 2008-08-27:
@Dave: right. The same goes for credit-card fraud. Credit-card companies just bear the costs, as additional security measures would cost them more. It's always economics.

Comment from David on 2008-08-27:
Wouldn't the transit "companies" demand more security in the fare cards themselves if they felt the pinch economically? Have they suffered losses that demand changes yet? Maybe the cost of repair is higher than the losses from exploits? Time will tell....

Comment from David on 2008-08-27:
What's sad is the huge hoopla granted to those who discover "vulnerabilities" versus discovering "exploits."

Windows can be smashed, making most door locks useless. But nobody demands that window makers produce smash-proof glass, or that all buildings stop using breakable glass.

Doors to banks allow robbers to enter with guns and steal money, yet the solution isn't to prevent it.

While each vulnerability can be judged on its own merits, and on the actual crimes committed by exploiting it, disclosures are very often more alarmist than helpful.

Most exploits are simple social engineering tricks, or physical break-ins.

It does seem that informing the potential victim and the company that offers the vulnerable item is praiseworthy. Just telling everyone seems alarmist, publicity-seeking and encourages people to attempt break-ins they'd otherwise not do.

Comment from Pat Cahalan on 2008-08-27 (http://padraic2112.wordpress.com):
What we need, actually, is a timelocked publication method.

"My paper on your security vulnerabilities is in timelock, and will be released in 2 months. There's nothing I can do about it now. You've got two months to fix the problem."
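Pat's "timelock" could, in principle, be built without trusting any third party, using the time-lock puzzle construction of Rivest, Shamir and Wagner: publish the encrypted paper, and mask the key behind a computation that takes a predictable amount of inherently sequential work. A minimal sketch, assuming this construction (the function names and the toy parameters below are mine, not anything from the thread):

```python
import math
import random

def make_puzzle(secret: int, t: int, p: int, q: int):
    """Create a time-lock puzzle. The creator, knowing the factors p and q,
    can shortcut the t squarings by reducing the exponent mod phi(n)."""
    n = p * q
    phi = (p - 1) * (q - 1)
    a = random.randrange(2, n)
    while math.gcd(a, n) != 1:          # need a coprime to n for Euler's theorem
        a = random.randrange(2, n)
    e = pow(2, t, phi)                  # fast for the creator: 2^t mod phi(n)
    b = pow(a, e, n)                    # equals a^(2^t) mod n
    return n, a, t, secret ^ b          # secret masked with the slow-to-compute value

def solve_puzzle(n: int, a: int, t: int, masked: int) -> int:
    """Anyone else must perform t sequential modular squarings --
    no known shortcut without the factorization of n."""
    b = a % n
    for _ in range(t):
        b = b * b % n
    return masked ^ b
```

The release date is set by choosing t to match an estimate of squarings-per-second on fast hardware; once the puzzle is published, even its creator "can do nothing about it" short of releasing the factors early.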

"Finding a bug in existing software is in general not a subject for... ...hence only provides a small contribution to the advancement the scientific knowledge."

Although I agree with your viewpoint for scientific journals and the like, I would not agree with it in other areas, and I suspect that this is, at the end of the day, one of the real issues.

For instance, the U.S., like many other parts of the world, has laws about "fit for purpose" not just for tangible objects (goods) but for intangibles (services, contracts, etc.) as well, all of which appear to be based around either equitability or loss.

To most, it appears that only software companies, via EULAs, have avoided responsibility for "fit for purpose" in their products.

In reality the situation is different: all businesses and organisations, where possible, seek protection via some kind of liability limitation (agreement / licence) or risk externalisation (insurance).

However, a significant factor is usually overlooked, and that is "time's arrow", which traces a "golden thread" through risk and liability in society, and thus legislation.

Software, and the products based around it, are less than half a generation in age, and their potential utility-to-cost ratio effectively doubles every year. This gives rise to products with expected lifetimes of just a few months, comparable to the development times involved.

This, by any human standard, is a pace beyond comprehension, and as a result risk, liability and legislation are significantly behind the reality of the industry.

One aspect of this is that there is no formally recognised method by which product failings, and the best practices that arise from mature reflection on them, can become recognised before they are obsolete.

Therefore either the industry needs new methods that keep pace with its rate of change, or it needs to slow its rate of progress to match the methods that currently exist.

Of the two, only the former appears to be a likely way forward for the industry, even though it is considerably harder to do. The latter would have to be addressed outside the industry, by the likes of government (legislation), and in reality would only serve to protect vested interests and therefore hog-tie the industry.

However, "new methods" would require buy-in from the industry as a whole and, importantly, that all forms of research be acceptable and have appropriate forums for discussing current issues. Appropriate methods of measurement (metrics) would also need to be found, so that risk can be quantified and dealt with in a more traditional manner (like insurance, etc.).

Unfortunately, I cannot see this happening without external influence from those with sufficient power to ensure it happens, which is the rub...

My best guess so far is compulsory "product liability" and "user liability" insurance, as is seen for the manufacturers and users of cars, etc.

Comment from Peter Galbavy on 2008-08-27:
@miw: I agree that finding a bug is not a suitable subject for publication; however, detailing how you found a bug, a new class of exploit, or some other failure, and bringing it to the attention of your peers, is a valid subject, which is what this is about. People don't do presentations on finding yet another buffer overflow in sendmail/bind/et al., but they should and do present on how they build frameworks for diagnosing such bugs, and for preventing them in the future.

Comment from MathFox on 2008-08-27:
@Miw, you seem to ignore the size of the break... The students broke both the magstripe and the chipcard system used in Boston public transport. They could create farecards and add credit to them. It certainly is relevant work if you consider that security evaluation of RFID systems is a young area of research and reports on only a few class breaks have been published.

I do think that consumers and professional buyers of systems should get proper and reliable information about the security of the systems they buy. The maker hardly ever says anything other than "perfectly secure," even for something that is "easily broken" in practice. Should we "trust the vendors" when millions a year are at stake?

Comment from miw on 2008-08-27:
@phil: Finding a bug in existing software is in general not a subject for a publication in a scientific journal or conference proceedings. Similarly, finding a bug in a security product should in general not be regarded as serious security research. A high-profile system may have modest security requirements, and hence a perfectly reasonable design. Breaking that security layer therefore only provides a small contribution to the advancement of scientific knowledge.

Comment from altjira on 2008-08-27 (http://www.gregrperry.com/blog):
@rob

While your argument holds true for deciding whether to design and build a structure to resist a 100-year event vs. a 100,000-year event using extreme value analysis, I don't think it reflects the realities that Bruce is talking about. Perhaps building something reasonably well and then responding quickly to discovered flaws is the optimal solution. You're right in that you can't aim to build something invulnerable, but the cutting-edge researchers are not a threat - it's the script kiddies who are the threat. Pure research promotes our understanding of what can go wrong - just like Tacoma Narrows taught us about aerostatic flutter. But researchers don't cause massive losses. Ignoring the results of research causes massive losses.

It is perfectly acceptable from an engineering standpoint to put in required maintenance; or, since unquantifiable threats cannot be engineered against, to design a maintenance system to respond to the unforeseeable. We already find it necessary to maintain systems like that - they're called fire fighters. Maybe one day the people that patch faulty computer systems will earn the same respect.

Comment from nerdboy on 2008-08-26:
I thought you might be interested in this:

A blog post by Megan McArdle about a town in Holland that removed its street signs, which made people feel less safe; they drove more attentively and in fact had fewer accidents. It involves the difference between feeling and reality in security.

Comment from rob on 2008-08-26:
One missing aspect of this discussion is security requirements analysis: as an engineer, it is important to define a credible threat against which you will be secure (anything else is throwing money away and not engineering). The essay implies that all commercial products should start from the assumption that their threat is highly educated research students with high-level funding for equipment. This doesn't match my expectations of the bad guys for most products. Perfection is the enemy of the good.

Comment from LC on 2008-08-26 (http://www.lcstyle.net/blog):
Bruce got it right: the damage has already been done. The students' First Amendment rights were effectively trampled on, and they were not able to give their talk. Today it's a week's delay; tomorrow the delay will be a year. In the future, it will be indefinite.

"The U.S. court has since seen the error of its ways -- but the damage is done. The MIT security researchers who were prepared to discuss their Boston findings at the DefCon security conference were prevented from giving their talk."

Comment from mcb on 2008-08-26:
I'm not a lawyer, and don't play one on TV, but it seems to me some really serious wiseguys might have replaced their planned DEFCON presentation with a detailed examination of the affidavit submitted by the MBTA, a public document which included a "confidential vulnerability assessment report" detailing the very flaws the MBTA sought to conceal.

Comment from Davi Ottenheimer on 2008-08-26 (http://davi.poetry.org/blog):
@ R. Scott Buchanan

Exactly what I found as well. A former colleague of George W. Bush and an appointee of Reagan, Judge Woodlock has a history of striking down first amendment rights:

"Woodlock said he had initially assumed that activists were exaggerating when they likened the protest zone near Canal Street to an internment camp. But he said that after touring the area for 90 minutes Wednesday, he concluded that comparison was 'an understatement.'
[...]
'One cannot conceive of other elements [that could be] put in place to create a space that’s more of an affront to the idea of free expression than the designated demonstration zone,' Woodlock said.

Nonetheless, Woodlock said that unruly demonstrators at other political events have made the precautions necessary to foil protesters who might hurl objects at delegates arriving on buses"

Bruce, nice essay, but you barely touch on the root issue.

A judge in America made a horrible error of judgment (pro-corporation, anti-individual liberty) under the mistaken guise of security.

This is not an exception, it is the rule of the neo-con.

Comment from Alex on 2008-08-26:
The kidnap analogy is correct in theory, but in practice, Bruce, I would like to see whether you would refuse to pay a ransom for a loved one.
Sure, in the long term it is the only sensible thing to do. But the benefits of not paying are shared by many, while the costs of not paying are carried by the few who, by not paying, directly put the life of a loved one at stake.
Comment from moo on 2008-08-26:
I like the "don't pay kidnappers" thing.

By the way, the same idea applies to terrorist events too---the way to defeat the terrorists is to simply refuse to be terrorized, refuse to give up our freedoms and our privacy in the name of fighting terrorism, refuse to change the way we live our lives. It may suck if terrorists succeed in bombing a school bus or whatever, but we must refuse to change our decision-making or our beliefs in response to incidents like that. It is the very height of cowardice to give in to such pressure. To allow terrorists to set the boundaries of discourse merely by smashing some airplanes into a building is extremely weak.

The most annoying thing is that on 9/11, 2001 it was immediately obvious to me that America was going to react, and react in a big way. In that way, they gave the terrorists a satisfaction which they wholly do not deserve. In the end, I think the reaction has been kind of pathetic and mostly counterproductive, with the media and the corporates and the politicians all ruthlessly exploiting 9/11 and inducing unwarranted fear in the American population, to further their own agendas. Compare it with the reaction of the UK after the London subway bombings---the Americans were ripe to be victims of a "shock and awe" campaign, while years of violence between the IRA and the British had prepared the UK to react pragmatically and with courage and stoicism.

Like kidnappers, there's only one proper response to terrorism: hunt them down and kill them. Any other kind of reaction ultimately does more harm than good. Aside from terrorism, we should be trying to improve our foreign policy and trying to treat more fairly with the people of other nations rather than selfishly exploiting them... but terrorism is basically crime, and should be investigated and avenged using the criminal justice system---not by trying to turn a whole country into cowering "victims".

Comment from 2Simple on 2008-08-26:
Why isn't the simple truth made open: insecure systems are another form of access capitalism, selling access and manipulation for some price or favors. Problems create uncertain risk and allow for cheaper negotiation through intimidation. The powers that be do not want accountability, except improper accountability.

The result is prohibition-style outlaws and law-breaking everywhere; it's fun, it's cool, and it's very ugly for the country. Oh well, it creates more incentives for a country to have more programmers, the new army of future wars.

Full Disclosure is said to be dead. It would be good to read some comments on what it has turned into. Perhaps Negotiated Disclosure? Uncertain Disclosure? Limited Disclosure?

Comment from Fred P on 2008-08-26:
For those interested, Groklaw has a relatively good legal analysis of the Boston case.

Comment from grge on 2008-08-26:
It simply isn't true that the MBTA only found out about the exploits 10 days before the conference.

The students informed the MBTA about the exploit well ahead of the conference. The MBTA didn't act on it - well, except for trying to squelch the release of the info.

Ironically, they not only didn't succeed in silencing the researchers, they themselves released more details through court briefs than the presentation contains.

Dumb. Dumber. Dumbest.

Comment from Phil on 2008-08-26:
@miw: "The majority of cases are an easy hack on a high profile system. This is not a good way to use research funding."

I disagree. If high profile systems are vulnerable to so many easy hacks, why shouldn't this be brought to attention? If people don't know about the problem, or the extent of it, it won't get fixed.

Comment from miw on 2008-08-26:
Breaking the security of a commercial product should no longer be regarded as a scientific endeavour. Such publications extremely rarely contribute to the scientific understanding of security measures. The majority of cases are an easy hack on a high-profile system. This is not a good way to use research funding.

Comment from SteveJ on 2008-08-26:
An important part of defending public disclosure is to ensure that any vulnerability which attracts a gagging order is publicised as widely as possible, in as much detail as possible, and spun as hard as possible to blame the flaw on whoever launched the gagging attempt.

If the effect of a gagging order is to *increase* the extent to which the flaw and its details are published, and increase the negative publicity for the organisation concerned, then they will be less keen to seek them. They might still do so sometimes, because dealing with the court case is an additional burden on the individual researcher (and perhaps university), with a chilling effect on future security research. But that requires a gagger taking the long view: the immediate benefits are negated.

Furthermore, courts are sometimes reluctant to apply gags in cases where the information is already in the wild, since the gag would not have the practical effect sought, of preserving secrecy.

So, if something is banned in some jurisdiction, then (subject to your own legal position) read it, discuss it, and distribute it. If banning speech doesn't work, then speech won't be banned so often...

Comment from Alan on 2008-08-26:
The MIT students should have ignored the court order and presented at DefCon. Our freedoms can only be taken away when we allow them to be taken away.

Comment from Eam on 2008-08-26:
@Mike B:
You can't blame the security researchers for wanting credit for their work, but you do bring up an interesting idea:

Say a week before the Black Hat schedule was published, these rascally goofballs anonymously released the details of the vulnerability. Everyone would know they were the researchers (since they had a presentation ready so quickly after the disclosure), but it would be a pretty tenuous link in court.

It's not about fame and fortune (trust me on the fortune bit). It's about how science is done. You present, publish and otherwise communicate your results to others in your field. It's your job as a researcher.

When you don't, well, you're not doing your job, and even universities won't pay you forever for doing nothing (unless you're the accountant ;)).

Comment from Josh O on 2008-08-26:
Very succinct. We already knew this, but I'm going to bookmark it to send to those "less in the know" when they don't understand this issue.

@Rich: Excellent point. There are many things like this. I don't want to delve into politics, but overzealously prosecuting soldiers for their treatment of POWs will result in fewer enemies being "caught" alive. It becomes easier to just shoot them. It truly is astounding how many policies have the negative effect of increasing the problem (like the red-light cameras from yesterday's post). It doesn't matter if it works, as long as it looks like they're doing something, or their ulterior motives get satisfied (increased revenue).

Comment from Seth on 2008-08-26:
I agree with Mike B.: one solution for prior restraint is the doomsday scenario. Provide the full details (if feasible, with cookbook instructions for cracking the system) to several foreigners, with instructions that they release them unless the talk is given at the conference, and to ignore any instructions otherwise.

Since that communication takes place prior to any court order, the court can't prevent it (or claw it back, that's why foreigners are specified). Likewise, while the court could order me to instruct them not to release, they can't be forced to follow those instructions.
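Seth's pre-distribution scheme can be made more robust with threshold secret sharing: split the decryption key for the full details among, say, five parties abroad, so that no single one can be compelled or bribed, yet any three can reconstruct it. A minimal sketch of Shamir's scheme over a prime field (the function names and field size are my own illustration, not anything from the thread):

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; all share arithmetic is in this field

def split_secret(secret: int, n_shares: int, threshold: int):
    """Shamir split: sample a random polynomial of degree threshold-1
    with f(0) = secret, and hand out the points (x, f(x))."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, n_shares + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x = 0 recovers f(0), the secret,
    from any `threshold` (or more) of the shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME-2, PRIME) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

Any subset below the threshold reveals nothing about the key, which is exactly the property a "release unless the talk happens" arrangement wants.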

(I remember how badly the NSA lost when it tried to prevent one of the initial "Zero Knowledge" theory papers from being given a number of years ago; the resulting talk was quite amusing.)

Comment from Paeniteo on 2008-08-26:
@Mike B: "First and foremost of these is do not pre-announce your intention to present your findings at a large conference. "

Not that easy if said conference reviews your article prior to accepting it and also announces things like schedules, programmes, abstracts and the like in advance.

Comment from Rich on 2008-08-26:
Consider an alternative. You discover a major vulnerability. If you disclose it to the company or government agency, you will be prosecuted. What are your alternatives? We know that an active black market exists for vulnerabilities. You could make a LOT of money selling your vulnerability. The incentives push in that direction.

Is that good public policy?

Comment from Mike B on 2008-08-26:
I always feel that half the problems seen with vulnerability "disclosure" stem from the security researcher's desire for fame and notoriety. Without the need to stand behind a podium and get the gold star, the information could be disclosed anonymously, first to the vendor, then to the public, and the researcher would be spared all of the attendant legal trouble. There are plenty of ways in this day and age to launder information and leak news to the public in an untraceable fashion, but the researcher's quest for fame and credit puts them in the path of the inevitable blowback.

Surely, you say, without the ability to garner some reward from their efforts, where is the motivation for researchers to continue their research? Well, for one, there is a large group of people who simply enjoy the thrill of discovery, and I am sure that the majority of security researchers are happy in their work. Second, perhaps they can take a lesson from national intelligence services (or perhaps Larry David) and realize that one can be anonymous and still tell people. If worked properly, people can guess, assume or "know" you were the discoverer without ever being able to prove it in a court of law. One way could involve the presentation of "follow-up research" shortly after the vulnerability is leaked and made public by the "anonymous researcher".

Also, it amazes me how naive some security researchers are. It's like they never expect to be sued, and thus have few countermeasures prepared to protect themselves. First and foremost of these is do not pre-announce your intention to present your findings at a large conference. Right or wrong, vendors will panic and sue, and you will be in for a world of hassle. Black Hat et al. need to list talks as "TBA Hacking Talk" and then surprise people on the day. Anything else is just poor OPSEC. Another countermeasure is pre-disclosure to third-party foreign nationals, outside the jurisdiction of domestic courts, who have instructions to release the information on their own.

Yes, it would be nice to live in a world where researchers aren't hassled, but it would also be nice to live in a world without door locks. Just because the law is on your side doesn't mean you can ignore precautions.

Comment from Gerrie on 2008-08-26:
I am interested in the timelines here...

In the Dutch case, the researchers told NXP about the issues they discovered in early March, planning to present the work in September. NXP then tried to prevent them from doing so after having known the details for a few months already; the judge rightly threw it out.

In this case (unless the original reporting is a bit misleading), the vendor heard about the talk, from a third party, approximately 10 days before the presentation? In which case a short-term restriction is probably sensible (to give the vendor some time to at least try something if they wanted), with long-term gagging being rejected, which is pretty much what happened.

Clearly the authors had everything ready at least a month earlier to submit their talk for review. Why did they not notify the relevant parties then?

Comment from R. Scott Buchanan on 2008-08-26:
The judge in question, Douglas Woodlock, has a history of showing contempt for the First Amendment, so the prior restraint issue didn't surprise a lot of locals. And honestly, given the current legal and political climate in the US, it would have been silly to expect a district court judge to do anything but side with Gabauskas's ignorant goons anyway.

Comment from igloo on 2008-08-26:
@Roy - and, unfortunately, the Australian way, and that of almost every other 'Western' system. We pay lip service to the principle, and even have laws to protect whistleblowers, but it doesn't stop the 'system' shafting these public-minded citizens!!!

Comment from Roy on 2008-08-26:
Death to the whistleblowers! It's the American Way.

Comment from D0R on 2008-08-26 (http://d0rblog.wordpress.com/):
The problem is that the security of a product cannot be immediately evaluated (and appreciated) by customers. It's a "hidden" value. Therefore companies choose not to spend too much money on it.