Dan Geer on Trade-Offs and Monoculture

In the April 2007 issue of Queue, Dan Geer writes about security trade-offs, monoculture, and genetic diversity in honeybees:

Security people are never in charge unless an acute embarrassment has occurred. Otherwise, their advice is tempered by "economic reality," which is to say that security is means, not an end. This is as it should be. Since means are about tradeoffs, security is about tradeoffs, but you already knew that.

"Security people are never in charge unless an acute embarrassment has occurred"

As far as Management is concerned,

"The lunatics never run the asylum"

For that, folks, is how technical people of all types are seen by those on mahogany row.

It is not simply a case of not speaking the same language, with the techs having to learn the walk; in most cases it is a fundamental difference in outlook.

The simple fact is that technical types tend to be "risk averse," and management only make it up the greasy pole by taking risks.

Technical types like longevity in their jobs; most senior managers know they are going to last 18 months on average, so they are usually gone before their risks come back to bite them. So rather than being risk averse, they actively seek high risk for short-term, high-gain returns.

Understand this and you start to understand what makes the mandarins tick (like little nihilists with their bombs).

Very astute observations, Clive. That is as well put as I've ever seen the corporate IT/executive struggle described.

My personal feeling is that though this paper is interesting, it isn't anything that anyone doesn't already know. It's just a complicated, ego-stroking way of saying something very basic with a bunch of ten-dollar words.

We all hate the idea of hegemony. It's in the bones of most tech types. I refuse to believe that any one brand is more secure than the others; the reality is that while everyone knows Microsoft creates insecure software, so does every other company out there. Nobody is perfect, and open source isn't the answer either, because Linux is just as vulnerable; it's just not widely used enough for anyone who matters to care.

Until people begin to realize that the answer lies in a holistic solution, this will be the same argument ad infinitum. I've dealt with tons of security professionals who bemoan that they're not important until something's gone wrong. Until IT personnel can figure out a way to sell the suits on the fact that being proactive far surpasses being reactive for the common good, we're going to continue to spiral downward.

Sadly the analogy is flawed, and I don't think I'm being pedantic by pointing out the differences since they defeat the value of the comparison.

First, a virus infection doesn't blow up computers, wipe out a company, and leave room for its competitor (or competitor's IT department) to move in and take over. The LAN gets pwned by some malware, and business carries on. The risk for IT is nowhere near as dire as it is for bees, so strategies bees use are obviously more drastic. Surviving as a business != surviving the next email worm.

Second, diversity in bees is radically different from diversity in operating systems. If a queen bee laid a bunch of eggs that hatched into ants, earthworms, and spiders, that would be bad, even if they were somehow trying to cooperate with the hive's goals. Yes, spiders are much better for combat with other insects than bees, but the stockholders are only interested in honey production. The ants bring back food but they bury it in the ground; our workers can't interop because the tunnels are too narrow, so again, not good for honey production.

From a perspective other than "being hacked = corporate death", this sort of comparison looks pretty silly, yes?

Agreed. A major difference between organisms and computers is that life requires continuity: when no viable members of a species exist, the species is extinct. There is no (natural) reboot or "recovery from backup" process to bring the species back.

For computers, you can generally take the system offline, restore the system back to a recent consistent state, develop a fix or workaround, and bring the system online again.

Unless the application is critically dependent on continuity of service, computer systems are not vulnerable *in the same way* as biological species.

"Technical types like longevity in their jobs; most senior managers know they are going to last 18 months on average, so they are usually gone before their risks come back to bite them."

I think that's an overly cynical view of what causes the "technical types" and top management to see things differently. The fact is, they have different scopes of concern.

The "technical types" (by that I take it to mean the IT and infosec people) are only tasked to see to it that the system is operating the way it should, so that Operations, Marketing, Strategy etc. can use it to make money for the company. They are not in a position to make the call that shorter time-to-market trumps robustness, for example. Senior management, on the other hand, is directly on the hook for making sure that the company continues to exist and makes money for its shareholders. They are in a position to make the decision to trade security for other considerations.

@Jay:
"becasue Linux is just as vulnerable, it's just not widely used enough for anyone who matters to care."
On the contrary, Linux and Unix firewalls are used whenever people do care.
It's easy to say "they're all the same". It usually isn't true.

In Ottawa, there is an ongoing Commission of Inquiry into the 1985 crash of Air India flight 182. (http://www.cbc.ca/news/airindia/)

Lifting a quote from the CBC's page: "Daniel Lalonde, who was an 18-year-old Burns security guard at Montreal's Mirabel airport in 1987 when Air India flight 182 took off, told the Air India Inquiry that he believes the doomed plane left Montreal without a full baggage search because airline officials decided it would cost too much to delay the flight."

There's a treasure trove of information coming from this Inquiry, which is prying apart the secrecy of the agencies involved; secrecy which looks like it was there to cover up incompetence.

I agree that in practice Unix and Linux are more secure, but in 99% of cases that is because the people using them take the time to make them so.

The point I'm trying to make is that if everyone used Linux, the collective knowledge of attackers would focus on exploits in Linux. This would reveal just as many issues and holes as Microsoft. I do believe that Linux is inherently better, but it's a value judgement based on my belief in the concept behind it.

A determined attacker will breach a well-guarded Unix/Linux environment just as easily as a Microsoft one. The environment doesn't create security in itself; the people admining it do that. It just so happens that most security people prefer Unix/Linux because it is easier for them to do what they want to do with it.

>It just so happens that most security people prefer Unix/Linux because it is easier for them to do what they want to do with it.

For the most part, I agree with Jay. The coding practices and overall design are not necessarily more secure than modern Windows. (Except that Windows has no concept of chroot/jail, so that part is a better design in Linux/BSD.)

But with Linux, FreeBSD, and OpenBSD, one thing that does make all of them easier to secure is that better security monitoring tools are easily available at the touch of a few commands (apt-get, yum, ports, etc.). So for a few hours' more effort I can have a machine set up not only with a web server, but also a reverse proxy, a file integrity checker (a la Tripwire-type tools), and log parsing/alerting tools. With most other OSes, you have to manually find these tools and download them if they are free; otherwise you normally have to get approval to buy such tools and make room in the budget.
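To illustrate the "few commands" point on a Debian/Ubuntu-style system, a setup along these lines is possible; the package names below are illustrative examples and vary by distribution, so treat this as a sketch, not a hardening guide:

```shell
# Refresh the package index, then pull in the pieces mentioned above.
apt-get update
apt-get install -y apache2    # web server
apt-get install -y nginx     # can be configured as a reverse proxy
apt-get install -y aide      # file integrity checker (Tripwire-like)
apt-get install -y logwatch  # log parsing and summary alerting

# Build the initial AIDE integrity database; later runs compare
# the filesystem against this baseline and report changes.
aideinit
aide --check
```

The same idea applies with yum on Red Hat derivatives or the ports tree on the BSDs; the point is that the tooling is a package command away rather than a procurement cycle away.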

So in my case Linux/BSD servers are easier to secure because I can just do what it takes to secure them and not have to beg a manager for the resources to secure them.

Hence if I had a monoculture of FreeBSD machines, I bet the concepts Dan G. proposes would not apply, because my overall design limits attackers and the system is always reporting its status.

I think it's as it should be when senior management tempers IT management with "economic reality." However, I think it is unfair to then turn around and make IT management the scapegoat when an incident occurs in such a situation.

It's unreasonable to expect IT to make reasonable decisions today when they will be scapegoated by tomorrow's 20/20 hindsight. This is what makes CYA security necessary from the perspective of IT management: after an incident, I rarely hear senior management defend IT by pointing out that the money saved by not implementing unreasonable controls directly contributed to other value. (For example, more infants die of medical causes than of abduction, and by not spending money on upgrading to a high-tech IT security monitoring system we were able to spend $x more toward reducing the infant death rate by 10%.)

If management wants reasonable decisions, they need to be reasonable with their hindsight.

There is certainly a difference in attitude to security between people working on closed-source versus open-source products. Closed-source vendors have a habit of dragging their feet over responding to security vulnerability reports, and then when they do bring out a patch, they often get it wrong.

While I agree with you to a certain extent, I think this is an oversimplification. The article you cited states both viewpoints. I personally wouldn't rely on any open source developer for security patches.

To be completely honest, I wouldn't be sure of what they did. Lacking the knowledge to determine that their work is what they say it is, I could be installing something that itself creates a backdoor. The backdoor would eventually be exposed, but how long would it take? A perfect example of this is "hacker" websites: 90% of the script-kiddy tools they offer for download are infected, which is why they give them away free.

I think you are right that open source is typically more responsive; my issue is that the emphasis needs to be put on creating secure systems instead of throwing something together to beat your competitor to market.