Cyber War: Microsoft a weak link in national security

Most reviews of Richard A. Clarke's new book Cyber War haven't noted Clarke's …

"Microsoft has vast resources, literally billions of dollars in cash, or liquid assets reserves. Microsoft is an incredibly successful empire built on the premise of market dominance with low-quality goods."

Who wrote those lines? Steve Jobs? Linux inventor Linus Torvalds? Ralph Nader? No, the author is former White House adviser Richard A. Clarke in his new book, Cyber War: The Next Threat to National Security and What to Do About It.

It has been a few months since Clarke's latest opus appeared, but it's still making quite a splash. Clarke, after all, was the guy who repeatedly warned the White House about Al Qaeda before September 11, 2001. As a result, he has quickly become the most publicly identifiable person on the subject.

"While it may appear to give America some sort of advantage," Cyber War warns, "in fact cyber war places this country at greater jeopardy than it does any other nation." The enormous dependence of our financial and energy networks on the 'Net opens us up to potentially devastating online attacks. "It is the public, the civilian population of the United States and the publicly owned corporations that run our key national systems, that are likely to suffer in a cyber war."

Large scale movement

Clarke takes readers through various famous cyberwar incidents, most notably the Distributed Denial of Service (DDoS) attack on Estonia back in 2007, but how bad could such events really get?

The hypothetical answer is on page 64. There Clarke deputizes you as Assistant to the President for Homeland Security and takes you through a scenario of doom. The National Security Agency has just sent a critical alert to your BlackBerry: "Large scale movement of several different zero day malware programs moving on Internet in US, affecting critical infrastructure."

But by the time you get to your office, one of the DoD's main networks has already crashed; computer system failures have caused huge refinery fires around the country; the Federal Aviation Administration's air traffic control center in Virginia is collapsing, and the hits just keep coming.

"The Chairman of the Fed just called," the Secretary of the Treasury tells you. "Their data centers and their backups have had some sort of major disaster. They have lost all their data." Power blackouts are sweeping the country. Thousands of people have already died. "There is more going on," Clarke narrates, "but the people who should be reporting to you can't get through."

File under fiction

Clarke's book has gotten tons of play with this sort of stuff—check out, for example, the scary interview he did with Terry Gross on NPR's Fresh Air. But little of it impresses his critics.

"File under fiction," begins Ryan Singel's review over at Wired. "Like in real war, truth is the first casualty." Singel warns that the tome is based on hypothetical scenarios (see above) or alarmist and inaccurate rehashings of various cyber emergencies. Plus, we note the book has no references or index.

Ditto, says Evgeny Morozov in the Wall Street Journal. "We do not want to sleepwalk into a cyber-Katrina," he writes, "but neither do we want to hold our policy-making hostage to the rhetorical ploys of better-informed government contractors." Clarke is one of four partners in the Good Harbor Consulting security firm.

But even his detractors acknowledge that some of Clarke's broad arguments make sense—most notably his warning that the Pentagon can't assume that the energy and financial sectors will effectively defend themselves from cyber attacks.

"At the beginning of the age of cyber war," Clarke ruefully notes, "the US government is telling the population and industry to defend themselves."

Money talks

Why has the national response to this problem been so slow? Lack of consensus on what to do and fear of the "R-word"—government regulation, Clarke contends. Then there's Reason Number Five on his list, which basically boils down to "Microsoft."

"Some people like things the way they are," Clarke obliquely observes. "Some of those people have bought access." Microsoft, he notes, is a prominent member of OpenSecrets.org's "Heavy Hitters" political donor list. Most of the list's stars are trade associations. "Microsoft is one of only seven companies that make the cut."

The software giant's largesse has shifted from Republicans back in the Clinton antitrust days to Obama, he continues, but the agenda is always clear: "Don't regulate security in the software industry, don't let the Pentagon stop using our software no matter how many security flaws it has, and don't say anything about software production overseas or deals with China."

Clarke tries to be fair. He notes that Microsoft didn't originally intend its software for critical networks. But even his efforts at fairness are unflattering. Microsoft's original goal "was to get the product out the door and at a low cost of production," he explains. "It did not originally see any point to investing in the kind of rigorous quality assurance and quality control process that NASA insisted on for the software used in human space-flight systems."

But people brought in Microsoft programs for critical systems anyway. "They were, after all, much cheaper than custom-built applications." And when the government launched its Commercial Off-the-Shelf (COTS) program to cut expenses, Microsoft software migrated to military networks. These kinds of cost-cutting reforms "brought to the Pentagon all the same bugs and vulnerabilities that exist on your own computer," Clarke writes.

Floating i-brick

The former White House advisor cites the 1997 USS Yorktown incident as a consequence. The Ticonderoga-class ship's whole operational network was retrofitted with Windows NT. "When the Windows system crashed, as Windows often does, the cruiser became a floating i-brick, dead in the water."

In response to this "and a legion of other failures," the government began looking into the Linux operating system. The Pentagon could "slice and dice" this open source software, pick and choose the components it needed, and more easily eliminate bugs.

Clarke says that, in response:

[Microsoft] went on the warpath against Linux to slow the adoption of it by government committees, including by Bill Gates. Nevertheless, because there were government agencies using Linux, I asked NSA to do an assessment of it. In a move that startled the open-source community, NSA joined that community by publicly offering fixes to the Linux operating system that would improve its security. Microsoft gave me the very clear impression that if the US government promoted Linux, Microsoft would stop cooperating with the US government. While that did not faze me, it may have had an effect on others. Microsoft's software is still being bought by most federal agencies, even though Linux is free.

The company took a similarly hard line towards the banking and financial industry, Cyber War says, rebuffing access requests from security specialists for Microsoft code. When banks threatened to use Linux, Microsoft urged them to wait for its next operating system—Vista.

"Microsoft insiders have admitted to me that the company really did not take security seriously, even when they were being embarrassed by frequent highly publicized hacks," Clarke confides. Sure enough, when Apple and Linux began to offer serious competition, Microsoft upgraded quality in recent years. But what the company did first was to lobby against higher government security standards.

"Microsoft can buy a lot of spokesmen and lobbyists for a fraction of the cost of creating more secure systems," concludes Clarke's section on the software firm. "They are one of several dominant companies in the cyber industry for whom life is good right now and change may be bad."

Required to do so

Given the considerable amount of criticism Cyber War has come in for, we're not endorsing Clarke's nightmare version of Microsoft's history. And we're more than a little nervous about some of his prescriptions for "change." These include government rules ordering the big ISPs "to engage in deep-packet inspection for malware."

Although these provisions should include high standards for privacy, "the ISPs must be given the legal protection necessary" so they won't fear being sued for stopping malware, viruses, DDoS attacks, and worms. "Indeed, they must be required to do so by new regulations," Clarke insists.

But many of the reviews and notices of Cyber War gloss over one of the principal observations of the book: the privatization of government over the last two decades may have saved cash but compromised the government's ability to defend crucial portions of America from big and small attacks on the 'Net. That's a concern that bears further discussion, whatever you think of Clarke's scary cyber stories.

Matthew Lasar
Matt writes for Ars Technica about media/technology history, intellectual property, the FCC, and the Internet in general. He teaches United States history and politics at the University of California at Santa Cruz. Email: matthew.lasar@arstechnica.com // Twitter: @matthewlasar

99 Reader Comments

As much disdain for MS and their poor security as I have had in the past, I don't think regulation is needed or helpful. MS will be happy to fix things as the market changes. They have no choice: evolve or die. There are plenty of other options to choose from. I think they realize they don't have a choice.

Now, regulation on campaign contributions and astroturfing? That would be good for everyone but the politicians.

Great, so he uses an example from 1997 to demonstrate what exactly about the computer landscape of 2010? Please don't give talking heads like this credibility in a forum where they know nothing...it is WORSE for our security. If he can't actually discuss the security model right now, he's just spouting political nonsense.

Wow, Clarke may know a great deal about terrorist organizations, but he apparently hasn't been keeping up to date with the application security field very well. 2001 called, Mr. Clarke, and it's glad you haven't bothered learning anything since then.

If you want to have a truly secure application layer, it will cost you at least $25k per line of code; at least, that was the estimated cost of bug-free software over a decade ago. There is no commercial vendor anywhere that does that, because it would bankrupt any private company, and the OSS movement certainly doesn't have that sort of engineering discipline (read up on the methodology used to develop Space Shuttle code: it is laborious and actively punishes creativity, not the sort of project that attracts people for free). And you can't stop at the OS layer. As Microsoft will tell you, in this day and age when Adobe and Apple software are to blame for much of Windows malware, every piece of software running on a system that parses a file or accepts network traffic is an attack vector. So if you want the level of assurance that Clarke is talking about, you are looking at a massive, massive DoD undertaking. In the process you would also lose much of the current level of capability in the conventional sense while waiting for IT systems to completely reinvent themselves.
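Taking the commenter's decade-old $25k-per-line figure at face value, a quick back-of-the-envelope calculation shows why no private vendor works this way. The code-base sizes below are hypothetical, chosen only for illustration:

```python
# Back-of-the-envelope math for the figure quoted above: roughly
# $25,000 per line of "truly secure" code (a decade-old, very rough
# estimate). The code-base sizes here are hypothetical.
COST_PER_LOC = 25_000  # USD per line, as cited in the comment

codebases = {
    "small control application": 100_000,          # lines of code
    "large commercial OS (rough order)": 40_000_000,
}

for name, loc in codebases.items():
    cost_billions = loc * COST_PER_LOC / 1e9
    print(f"{name}: {loc:,} LOC -> ~${cost_billions:,.1f} billion")
```

Even the "small" case lands in the billions, and a full commercial OS would run to the order of a trillion dollars, which is the commenter's point about bankruptcy.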

If you aren't willing to go that route, Microsoft is a better choice than most. Any attack targeting computer systems on the level that Clarke is talking about will be intentionally crafted with those systems in mind, so the obscurity that Linux and OS X offer is completely worthless (which is also why Google's supposed switch from Windows is either misreporting or represents a complete lack of basic security understanding on Google's part). That leaves built-in defenses, which MS leads the pack at (Microsoft is at the point where Windows itself is pretty sound, and they are engineering as much as they can to mitigate attacks against the application layer; DEP, ASLR, UAC, etc. are all really there for apps more than for the OS itself), and the ability to manage systems en masse, which also favors MS. Clarke clearly hasn't been talking much to anyone who actually handles enterprise infosec, or he might not have come across as ignorant as the average Slashdot commenter.

Matthew... editor's critique: Pick a topic and stick with it. You go from national security, to politics and finance, to computer security, to corporate law and finance, back to security, to software stability... It's very difficult to follow the point you're trying to make (which, based on the way the evidence reads, is just a pile of sticks). You keep conceding the opposition's case in the supporting commentary for each piece of evidence. It reads as quite disorganized (which may be the fault of your *real* editor). As it stands, the piece is barely coherent. It tries to link too many completely unrelated and imaginary premises to a conclusion that (as I noted) doesn't follow. Microsoft sucks because an 11-year-old operating system had stability problems? Richard Clarke "warned the White House about 9/11" and that's his credibility? Fat lot of good he did us. Linus Torvalds is the computer security fairy come to magick our worries away?

I'm not a programmer, and I don't know much about coding or operating systems. But wasn't it at those conventions where people try to hack into operating systems and find backdoors that Windows was said to be much more secure, by the analogy of "a secure home in a bad neighborhood versus an unlocked cottage in an open meadow"? I feel like the problem with cyber terrorism and security lies with the users. I'm sure we have tons of people who are knowledgeable in the field, but it is hard relaying what they know to other people such as political correspondents and whatnot.

When I think government, I think of a lot of old men and women who know mostly only politics and little about other issues. That's why they put together teams to study smaller subjects such as technology and science: because they themselves don't know much about them. If our politicians were better educated, we would have a better-educated government, a higher awareness of sensitive topics, and more logical decisions, rather than decisions based purely on what the teams tell them is best (heck, if they knew more, they might be able to better participate in and communicate with the studies being conducted, and have a better understanding of which decision to make). So all in all, I think it falls back on education, which I feel is kind of in the backseat right now, especially with all those dollar signs floating in the other direction.

And as an addendum to my last point, attempts to regulate security have been pretty miserable. SOX offers no real guidance, just a stick to hit people with after the fact. PCI is a nice way of legislating an audit business, but other than being a pain in the ass for companies that aren't in the audit business, it hasn't improved security substantially. HIPAA doesn't even comprehensively cover the data it actually claims to protect (hence Google strutting around touting that Google Health doesn't need to be HIPAA compliant).

And in the case of Microsoft, what would you legislate that they aren't already doing? No commercial vendor on the planet does anything as comprehensive as the SDL. Threat modeling - check. Automated analysis via fuzzing, vulnerability scanners, and static and dynamic source analysis - check. Manual reviews of specific high-impact code - check. Yearly training for every single developer, tester, and PM that actually works directly for MS - check. Blacklisting of dangerous functions - check. Review and tracking of all giblets - check. Full post-mortem and root-cause analysis of every patched vulnerability - check. For fuck's sake, I dream of implementing something near that comprehensive at my enterprise. What the heck else would you add to the MS SDL through legislation?

I don't really foresee this changing anytime soon. Windows will continue to be the dominant system for years yet, mostly because most of the software that anybody has heard of is written for Windows. Macs have made a pretty good jump into things over the past few years, by first securing the semi-niche market of graphic design and the like. These days, most anybody who considers themselves an artist uses a Mac, if they have the choice. But to most, still, a computer = Windows.

I personally prefer Linux, and have used it for years. That being said, I still keep a copy of Windows installed for games, because the majority of PC games are still written for Windows only [on that note, go Valve!]. But even then, I generally have 2-3 firewalls, and at least a couple AV packages running, and generally locked down just about as tight as I can make it, and I still get more crap on Windows than I ever have on Linux, which I haven't really even bothered to fully secure. Again, this is because there is still not much software [read: malware] written for Linux-based OSs yet. Linux/Unix is just by design more secure than Windows. You can't really get around that with even the best software, and anybody who has written and/or debugged anything written for Windows knows that it's very rarely the best software anyways, and often stolen from other systems and "claimed" as a Windows original [see Windows SuperFetch].

Basically, until the software development community and companies decide that they want to write software for systems other than Windows, there won't be any major switch. Once people begin to realize that the same software is available for all systems, and one system is cheaper, faster, and safer than the other, then and only then will the common folk begin to switch.

Please explain your evidence for asserting this. Last I checked, there are still plenty of remote execution flaws in various components of Linux, various elevation-of-privilege flaws, and a crap ton of bloat that every distro decides to ship out of the box. I also haven't seen Linus strutting around evangelizing the SDL process used by Linux developers; instead I hear most of them reciting the bullshit "many eyes" argument, as if there are actually many eyes that are both trained at security code review and looking at Linux (they sure as hell didn't catch the non-randomness of the Debian OpenSSL vulnerability for several years). Of course Secunia, DeepSight, the NVD, etc. might just be full of MS bias by also reporting Linux vulnerabilities found in the wild (they aren't). The truth of the matter, something you would be familiar with if you didn't live in an echo chamber, is that MS is miles ahead in "secure by design". The absence of malware has very little to do with the absence of either design or implementation flaws. Incidentally, the most secure design in the world matters not at all if there are implementation-level flaws, and the fact that you don't distinguish between the two suggests you aren't that knowledgeable on the subject. For example, quantum crypto is by design fairly bulletproof, but the damn clever Norwegians cracked it anyway by going after the implementation flaws in the system they tested.

Also, if you are running 2-3 firewalls and a couple AV packages you really don't know what you are doing. Ports don't get double closed because you have two firewalls blocking them (and if they are client based firewalls you are probably causing all sorts of conflicts). Two virus scanners just bog down your system and conflict with each other. Really the only thing you achieve is to create an unmanageable system.

Cue the trolls saying that "locked down" Windows has greater security than Macs or Linux...

Well there are really three meaningful comparisons:

- Most lax settings
- Default settings
- Most secure settings

The bulk of the "locked down [...]" bits refer to Windows at maximum security (via gpedit, mostly) vs. defaults for the other two, plus chatter by Pwn2Own people. Windows has a larger attack surface in any case due to market penetration, but most technically secure and most practically secure are different -- and it's not really honest to measure only "practically" secure.

As far as I know, there isn't really a good way to do a fair comparison between the three big OS's without doing something Pwn2Own-ish, but that's not the best controlled experiment in the world ...

Windows has a larger attack surface in any case due to market penetration

You misuse "attack surface." Attack surface is essentially a measure of potential entry points into an application: a product of complexity, size, and accessibility (i.e., how data gets into the application). The attack surface of Windows has steadily decreased with each major release because MS has removed services that are on by default and listen to network traffic, client apps that parse files (these have mostly been moved to Live Essentials, which isn't installed by default), and so forth (not that it matters, since the OEMs go right ahead and install every app under the sun and increase the attack surface right back). Out of the box, its attack surface is substantially smaller than that of the likes of Ubuntu, which includes a crap ton of additional apps (GIMP, OpenOffice, etc.) that increase Ubuntu's attack surface.

Their market penetration increases the economic incentive for malware targeting the platform. It also increases the amount of third-party software available for the platform, which can indirectly increase the attack surface, but that largely doesn't matter, since malware will only target third-party apps that also have huge market penetration (QuickTime, Flash, Acrobat, and still occasionally the Sun JVM, though as client-side Java continues to die like it deserves to, this last vector will be decreasingly exploited).
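The definition above can be sketched as a toy model: the attack surface is a set of entry points, vendor hardening shrinks it, and OEM preloads grow it right back. Every service and app name below is invented for illustration:

```python
# Toy model of "attack surface" as a set of entry points, per the
# definition above: code that listens to the network or parses input.
# All names here are hypothetical and purely illustrative.
default_services = {"network stack", "remote admin service", "file-share daemon"}
bundled_parsers = {"image viewer", "document reader"}

# Vendor hardening: turn off default-on network listeners.
hardened = default_services - {"remote admin service", "file-share daemon"}

# OEM preloads grow the surface right back, as the comment notes.
oem_preloads = {"trial AV agent", "media player", "browser toolbar"}
shipped = hardened | bundled_parsers | oem_preloads

print(f"hardened base: {len(hardened)} entry points")
print(f"as shipped:    {len(shipped)} entry points")
```

The point of the sketch is only that the two numbers move independently: the vendor controls the first, the OEM the second.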

Quote:

As far as I know, there isn't really a good way to do a fair comparison between the three big OS's without doing something Pwn2Own-ish

Sure there is - you can measure the number and severity of vulnerabilities found in released versions of the platform. It isn't the end-all, be-all, but security is ultimately the measure of vulnerability (not to be confused with risk, which is the measure of impact and likelihood). The reason those of us in infosec stand by the security gains in Windows is that each subsequent version quantifiably reduces the number of vulnerabilities, both in total and on average over a period of time, relative both to previous versions of Windows and to competitors.
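A rough sketch of what that measurement looks like in practice, counting vulnerabilities per release and averaging their severity. All the scores below are invented for illustration; a real analysis would pull CVE records from a feed such as the NVD:

```python
# Crude vulnerability-trend comparison of the kind described above:
# per release, count the vulnerabilities found and average their
# severity. All scores are made-up example data, not real CVEs.
from statistics import mean

vulns_by_release = {  # release -> CVSS-style severity scores
    "v1": [9.3, 7.5, 7.2, 9.0, 6.8, 7.5],
    "v2": [7.5, 6.4, 9.3, 5.0],
    "v3": [6.8, 5.4, 4.3],
}

for release, scores in vulns_by_release.items():
    print(f"{release}: {len(scores)} vulns, mean severity {mean(scores):.1f}")
```

With real data the comparison gets harder (reporting bias, differing disclosure practices, severity-scoring changes), which is why the comment hedges that raw counts are not the end-all, be-all.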

I remember a software security lecture from my university years. The only truly secure system is a computer sitting in a physical cage with multiple security details, no Internet connection, and unplugged. If it has contact with the outside world, it can and will be hacked, given enough incentive.

That said, the large bulk of security breaches happen as an aftermath of human error anyway. In fact, it's probably a lot easier to just bribe personnel than to organize a coherent hacking attempt, as far as industrial espionage is concerned.

I'm not sure I agree with all your points. Securing the underlying operating system alone would bring enormous benefits, especially where these systems will be used for critical infrastructure. Remember, we're not necessarily talking about every home desktop. Really, these "modern" OS's (Windows and *-nix) were not designed with a fraction of the concern for security that Multics had in the 1960's. These defensive measures you mention (DEP, ASLR, UAC) are quite useful--defense in depth and all that. But, they are no substitute for security by design, possibly with formal methods. It would be worth it to put this type of security in the core of a commercial OS.

I think it is a valid criticism that Windows has a history of being far worse than most, and the allegation that they lobbied and played dirty tricks to stay in the game is quite damning. Their corporate culture has had all the symptoms that one sees in an organization that produces low-quality software. If you had actually developed on that platform in the previous decade and had the experience to have a basis for comparison, there'd be no question in your mind that this was terribly low-quality stuff. Yes, they are much better than they have been, but considering the risks, I think it is not anywhere near good enough. Sadly, Windows is still a huge system; that alone creates security issues. No amount of SDL practice is ever going to satisfactorily solve this problem. It's going to require a major new architecture to really fix it.

Admittedly, the application layer is a tougher nut and clearly formal methods are not going to be applied to every application. SELinux with MAC policies is a good approach--there was a Windows equivalent that was acquired a few years ago that did something similar, but I can't remember the name at the moment. Again, it is not necessary to secure everything to this level, just those things that touch the critical infrastructure and critical organizations.

Really, if I had my druthers all this infrastructure stuff would be on SELinux and even desktop systems in sensitive organizations would be SELinux with carefully selected applications with carefully chosen MAC policies. Anything else I wouldn't connect to the network--that includes any version of Windows, Mac OS X, or other variants of Unix or Linux.

I also don't think that this doomsday scenario is out of the question. The problem is that our ability to respond to a powerful state actor attack would be severely compromised because we are so heavily dependent on our computers and networks. Seriously, if the attack is serious enough you might not even be able to make a phone call to coordinate a response.

Quote:

The National Security Agency has just sent a critical alert to your BlackBerry: "Large scale movement of several different zero day malware programs moving on Internet in US, affecting critical infrastructure."

But by the time you get to your office, one of the DoD's main networks has already crashed; computer system failures have caused huge refinery fires around the country; the Federal Aviation Administration's air traffic control center in Virginia is collapsing, and the hits just keep coming.

"The Chairman of the Fed just called," the Secretary of the Treasury tells you. "Their data centers and their backups have had some sort of major disaster. They have lost all their data." Power blackouts are sweeping the country. Thousands of people have already died.

But there are concepts behind computer security that are fundamental and transcend operating systems. Why in the world would we attach computer systems that regulate refineries to the Internet? What is the business argument for ATC computers being accessible from, or even having access to, the Internet? My questions are rhetorical. I'm not doubting that there are real-world examples of computers which control X-ray machines and other medical equipment that are infected with malware and spyware, nor am I doubting that there are CNC machines out there controlled by computers running Windows 98 (which cannot be patched) and accessible via the Internet. My questions are really meant to ask why we are moving in a direction where the scenario Clarke describes could be possible, simply because we are connecting computers to the Internet that probably shouldn't be attached to it in the first place.

SELinux is good stuff. Linux is definitely competitive with Windows in terms of the security in depth components available.

Unfortunately, SELinux with full MAC is so hard to run that no mainline distribution can support it. Fedora and Red Hat ship targeted policies for service daemons, but if you want full MAC, you need a custom distribution tuned for your environment.

Of course, the main advantage that Linux has over Windows is that an organization is perfectly entitled to create their own distribution and do full MAC.

I am sorry, this is garbage. These kinds of articles are so bad for security it's not even funny. Honestly, the people with Microsoft hate always go back to examples from before 2000; it's insane. Microsoft is producing some of the best software around right now, and all this negative BS commentary is only making them stronger, because they know perfection is needed to avoid idiotic commercials and bitter company statements. I love using Linux, FreeBSD, and Windows. I know the strengths and weaknesses of all of them, and together they're very powerful. Anyone who thinks they are "secure" by simply switching companies is an idiot.

I generally have 2-3 firewalls, and at least a couple AV packages running, and generally locked down just about as tight as I can make it, and I still get more crap on Windows than I ever have on Linux

2 to 3 firewalls? A couple of AV packages? And your Windows machine still gets attacked by malware, you say.

You might not realise it, but the fact that you have "2 or 3 firewalls and a couple of AV packages" betrays your complete lack of knowledge on the subject of security. You are officially not qualified to participate in security discussions. The fact that you are apparently still being attacked by malware probably has to do with the software you are downloading illegally, or porn, or (heaven forbid) you're using a hacked copy of Windows. It's something YOU are doing. It's not Windows, it's not Microsoft, it's you.

I'll tell you what: you use Windows only for Steam, you say? Format your hard drive, re-install your legitimate copy of Windows, install Steam, keep Windows and Steam updated properly, and have the bog-standard firewall running. Don't even bother installing AV. Your system is now virus and malware free, and shall remain as such as long as you only use Steam to play games. I think you'll find that your system runs better now also, as it doesn't have to deal with the many conflicts occurring on account of your multiple AV packages and firewalls.

While I have no love for Microsoft products, I don't think the products themselves are currently to blame. Years ago, sure - if your MS-supplied browser is susceptible to a buffer overflow that leads to someone gaining control of your PC, yeah, shame on you, MS. But it does seem like they've finally started to focus on tightening things up and turning on more sane (security-wise) defaults.

What's currently at fault I believe is the culture that they have created in the IT field. While I do sometimes run into very skilled IT folks in large MS shops, I also run into a bunch of morons that have pointed and clicked their way through their career without truly understanding the underlying technology. This I blame MS for - they've marketed "ease of use" both for end users and IT personnel as a way to save big wads of cash. What that did was bring in a bunch of fools who believed that you didn't need to understand how things work to run a large network of PCs; people who only understand email as it pertains to Exchange; people who freak out when asked to do something at a command line driven device. Those are not IT people, they are paper tigers and/or trained monkeys - take your pick.

The bottom line is that security is expensive - if you want to secure a small office or a huge enterprise (both of which in this day and age can have extremely sensitive and/or crucial data), you need to pay for someone who knows not just MCSE bullet points, but something about security. You need management that understands that having skilled IT people is like insurance - if you let them design things properly and stand behind them on security policy, you'll likely be safe. Skimp on your IT budget or hire some IT fool who just wants to spend money on the latest shiny thing he/she saw in some trade rag, and you are seriously risking the future of your business.

Great, so he uses an example from 1997 to demonstrate what exactly about the computer landscape of 2010? Please don't give talking heads like this credibility in a forum where they know nothing...it is WORSE for our security. If he can't actually discuss the security model right now, he's just spouting political nonsense.

It illustrates the use in government of software where it has absolutely no fudging business. Software may change quickly, but civil and military bureaucracy does not...

It will take a few disasters for this to sink in.

Of course some companies have idiots who are just doing whatever they can to get by. That's life, not MS's fault. You think Ubuntu is hard to use? It does not matter what you are using, because some human can fu$% it up. All software has an unknown number of severe vulnerabilities. Secure software is tested software. I have about 50 podcasts I can send to anyone interested in the subject.

Honestly, I don't see the point of this whole article. Granted, MS might not be releasing the most secure software in the world, but the gov't and DOD aren't doing their part either. It was only in 2009 that the Air Force started adopting Vista. If they aren't willing and/or able to keep up with the latest tech policies and operating systems, then how would it matter if MS provided the absolute best software, if by the time it's officially adopted it's already obsolete?

On a side note, Linux might not be better, due to its user base not always being the brightest or most willing to change. (Being kind there, btw.) Also, going back to custom software isn't much better, since there are dozens of contracts out for custom software that are long over budget and over schedule. Operating systems and secure networking aside, off-the-shelf is often the best option when used properly. Hopefully there'll be a shift as more modern mentalities of computer usage work their way up through the military.

The former White House advisor cites the 1997 USS Yorktown incident as a consequence. The Ticonderoga-class ship's whole operational network was retrofitted with Windows NT. "When the Windows system crashed, as Windows often does, the cruiser became a floating i-brick, dead in the water."

I don't read that quote pull as a jab at Microsoft 2010 based on 1997 software, but simply as a historical incident that illustrates the consequences of using a class of software where it should probably not be used. Maybe some fault is MSFT's; definitely some belongs to the contractor that designed the ship's systems.

The article author prefaced the quote as such, but perhaps most Arsians are so savvy they can't get past the "well, duh!" factor. The ship could just as easily have crashed on Mac OS 9, or on one of the flaky distros of Linux at the time.

The point is not the overall system failure, but the response of the government's partner whose component caused the failure:

Quote:

[Microsoft] went on the warpath against Linux to slow the adoption of it by government committees, including by Bill Gates. .... Microsoft gave me the very clear impression that if the US government promoted Linux, Microsoft would stop cooperating with the US government.

THAT is the problem, and it is the Microsoft of well before 1997 through Steve Ballmer's reign today. They could have worked with their customer (in a matter of national security, no less), but why do that when you can spend that money on lobbyists instead? As in...

Quote:

"Microsoft can buy a lot of spokesmen and lobbyists for a fraction of the cost of creating more secure systems," concludes Clarke's section on the software firm.

Of course Microsoft has the right to make fiscally sound decisions, but they used (and possibly still use, depending on your perspective) the monopolistic lever to cut costs to an extreme degree. A degree that, to any rational non-political observer, would take them right out of the bid for such a project. Thank goodness malware started wrecking WinXP to such a degree that Bill Gates belatedly started to get a clue, but security never got the kind of focus that, say, Chinese pirated copies of Windows had.

Why would you trust a partner with such an incompatible goal? Microsoft has every right to talk, but the government was and continues to be a fool to listen. Some things _should_ be expensive, because they are _hard_. The operating system underpinning critical national infrastructure, in my opinion, is one of those things.

Let me guess... the next class of aircraft carrier will run entirely on Flash?

Can someone explain to me why everything needs to be connected to the internet? Refineries shouldn't be connected to the internet IMO. DoD should just invest in their own network. It may cost a lot but I'm sure China can spot us a few bucks

The evidence for why Linux/UNIX is more secure than Win32 by design is in the API.

Ever compared a POSIX API to a Win32 API for performing a similar task? Chances are the POSIX API will be relatively simple and comprehensible, while the Win32 API will be baroque and complex. Unsurprisingly, when trying to program against Win32, there's a high likelihood developers will get it wrong in interesting and subtle ways that invite exploitation.

Only an OS you can totally customize and audit has the potential of being highly secure. OSX and Windows are ruled out from the start. All you Windows and OSX people really need to learn a thing or two about security.

The former White House advisor cites the 1997 USS Yorktown incident as a consequence. The Ticonderoga-class ship's whole operational network was retrofitted with Windows NT. "When the Windows system crashed, as Windows often does, the cruiser became a floating i-brick, dead in the water."

Oh jeez, not this old myth again. The Yorktown was running an app called the Remote Database Manager; some guy entered a zero, the app crashed, and all the clients of the app on the network crashed with it. That's it. Windows didn't crash, the custom application crashed. Y'know, the type of custom app this idiot thinks will make us safer.

I heard Terry Gross's interview w/ Richard Clarke on NPR, and I really think the Microsoft-bashing posters and the anti-Microsoft-bashing posters are kind of missing the point as to what Clarke was actually trying to get at. His key point is that critical infrastructure systems, like the power grid, the Federal Reserve, the air traffic control system, etc. REALLY need to be secured, regardless of whether they are privately or publicly owned.

Microsoft stands as an obstacle to this, not because their operating system is less secure than commercially available alternatives, but because they would oppose any regulation which would potentially interfere with either their ability to continue with their current software development model or their ability to continue selling large numbers of software licenses to the US government. This ISN'T because Microsoft has a completely cavalier attitude about security, just that any change to the status quo is bad for them, since right now they are king of the market as far as OS and application software goes. If Apple or Red Hat were in a similar position of dominance, they would probably oppose change as well. But they're not, and Microsoft is, so it's Microsoft that's the problem, and not Apple or Red Hat.

Let us take a step back from the O/S wars and the rhetoric regarding M$. The lesson that can be learnt from this article is a very valuable one: place all your eggs in one basket and see what happens when a 10 ton truck happens to ride over the basket full of your eggs!

Here are some very interesting comments:

Quote:

"Some people like things the way they are," Clarke obliquely observes. "Some of those people have bought access." Microsoft, he notes, is a prominent member of OpenSecrets.org's "Heavy Hitters" political donor list. Most of the list's stars are trade associations. "Microsoft is one of only seven companies that make the cut."

Quote:

The software giant's largesse has shifted from Republicans back in the Clinton antitrust days to Obama, he continues, but the agenda is always clear: "Don't regulate security in the software industry, don't let the Pentagon stop using our software no matter how many security flaws it has, and don't say anything about software production overseas or deals with China."

Quote:

"Microsoft can buy a lot of spokesmen and lobbyists for a fraction of the cost of creating more secure systems," concludes Clarke's section on the software firm. "They are one of several dominant companies in the cyber industry for whom life is good right now and change may be bad."

M$ is the 1000-pound gorilla in the room. Their pockets are very deep. They can afford to lobby governments around the world. That carries a lot of clout, and this is what Clarke is going on about. Let's forget the issues of the operating systems that M$ puts out and the flaws that go with them. I believe that Clarke is on the money on this: it is an over-reliance on M$ to the extent that it can affect national security. Let us not forget that the Chinese have a very effective national IT policy that keeps out what they don't want their folk to see - and that leads to a very high level of sophistication in IT. It has been alleged by the Pentagon that their systems have been under some form of probing - from within China. And what about terror organisations that could also gain very specific skills in IT?

The problem with relying on an M$ O/S is that it is everywhere, available to be broken down and reverse engineered to find the flaws. Same with OSX and Linux. But the big problem is that an M$ O/S is in every country in the world.

The root of the problem according to Clarke is the political clout that M$ has. It is the same with the entertainment industry - too much clout and that is why "the Man" will win - through regulation and legislation...

And one point I forgot to mention in my previous posting: I believe that where a government department needs the highest form of security, it is best to use a proprietary O/S - one designed for the express use of the department, along with custom apps written for that O/S. It's the only way it is going to work...