Posted
by
Roblimo
on Tuesday May 24, 2011 @08:03AM
from the watch-those-centrifuges-spin-out-of-control dept.

From the article: "SCADA systems -- computer systems that control industrial processes -- are one of the ways a computer hack can directly affect the real world. Here, the fears multiply. It's not bad guys deleting your files, or getting your personal information and taking out credit cards in your name; it's bad guys spewing chemicals into the atmosphere and dumping raw sewage into waterways. It's Stuxnet: centrifuges spinning out of control and destroying themselves. Never mind how realistic the threat is, it's scarier."
What worries Bruce Schneier most is that industry leader Siemens is keeping its SCADA vulnerabilities secret, at least in part due to pressure from the Department of Homeland Security.

You all keep on pissing and moaning about Iranian nukes, while part of the new Saudi arms deal is to protect future Saudi nuclear ambitions, which, by the way, also involves Pakistan [google.com] (had to use Google cache to get the whole article)

And what did this clown [foreignpolicy.com] ever do to deserve all those medals?

The image the author creates is of a machine spinning at such velocity that it explodes in a shower of fragments. While that makes for great copy, it's hardly what happened. In reality, Stuxnet caused the affected centrifuges to alter their rotational speed by only a few percent, which resulted in lower material yield in the cascading purification process. This result has several advantages over a "self-destructing" centrifuge: 1) a destroyed centrifuge is an obvious problem which would trigger immediate investigation.

Government officials care only about their own lives and those of their friends and family. That's why we can have wars. Note that the draft exemptions are generally met by the family of anyone who is involved in mandating a draft.

The Japanese nuclear plant in Fukushima ran on Siemens computers that the Stuxnet worm was programmed to infect; in fact, the virus was found in Fukushima systems last year. Makes you wonder why the cooling system wasn't functioning. Maybe the tsunami caused failures which Stuxnet made the reactors unable to handle. There were failures at four other plants in Japan, and German and South African reactors shut down. Were they using Siemens systems as well?

Stuxnet doesn't "target" anything other than Windows SCADA systems (which should cause concern when you see those three words together...), notably those from Siemens. Anywhere you've got one of those SCADA systems, you've got a possibility of Stuxnet. It's just that Iran was using them for their process control systems for the enrichment plant.

Stuxnet targets a Siemens centrifuge controller that's programmed by an (air-gapped) Windows machine. Unfortunately this same basic pattern repeats itself all over the place.

For any given SCADA system --- regardless of manufacturer --- you're extremely likely to see it connected to a modern PC, typically a windows machine. Even if the Windows machine is just running a terminal program, it's connected.

What Stuxnet showed us is that these Windows boxes are a critical vulnerability, even if they're just an ingredient in the programming chain, even if the box is separated by an air gap. I'm sure Israel/US would have found a way to those centrifuge controllers, but without the Windows infection vector it would have been a whole hell of a lot more difficult.

But in all seriousness, the question I would ask is: how many known USB-drive infection and privilege-escalation vulnerabilities can you download for a 1-year-unpatched* Mac right now? How many can you download for the same Windows machine? In each category, how many have already been weaponized? How many of these can be tied together with widespread malware vectors that will get them near the machine to be infected?

Stuxnet doesn't "target" anything other than Windows SCADA systems (which should cause concern when you see those three words together...), notably those from Siemens.

You might want to do a little more research on the matter. Stuxnet's code has been picked apart: the trojan was designed to infect SCADA systems, but only to attack very specific hardware configurations.

Stuxnet's payload was designed to (1) spin the uranium centrifuges used by Iran at certain known-to-be-destructive RPMs, (2) lie to the monitoring software that was supposed to prevent out-of-bounds conditions and set off alarms if they occurred, and (3) should (1) and (2) not ruin the centrifuges, go dormant and reawaken to try (1) and (2) again.
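The three-step payload described above can be sketched as a toy control loop. Everything here (class and variable names, RPM figures) is invented for illustration; this is not Stuxnet's actual code, just the attack pattern it embodied:

```python
# Illustrative sketch of the pattern described above: (1) push a
# destructive setpoint, (2) replay previously recorded benign readings
# to the monitor, (3) stay dormant between attack windows and retry.
# All names and numbers are hypothetical.

SAFE_RPM = 63_000          # nominal operating speed (made-up figure)
DESTRUCTIVE_RPM = 84_600   # out-of-bounds speed (made-up figure)

class CompromisedController:
    def __init__(self):
        self.recorded = []      # benign readings captured while dormant
        self.attacking = False

    def record(self, reading):
        # While dormant, quietly record normal sensor values.
        self.recorded.append(reading)

    def commanded_rpm(self):
        # (1) During an attack window, the hardware gets the bad setpoint.
        return DESTRUCTIVE_RPM if self.attacking else SAFE_RPM

    def reported_rpm(self, step):
        # (2) Meanwhile the monitor sees replayed recordings, so no
        # out-of-bounds alarm ever fires.
        if self.attacking and self.recorded:
            return self.recorded[step % len(self.recorded)]
        return SAFE_RPM

c = CompromisedController()
for r in (SAFE_RPM, SAFE_RPM + 10, SAFE_RPM - 5):
    c.record(r)
c.attacking = True
assert c.commanded_rpm() == DESTRUCTIVE_RPM   # hardware gets the bad setpoint
assert c.reported_rpm(0) == SAFE_RPM          # monitor sees a benign replay
```

The point of the sketch is the decoupling in steps (1) and (2): the value sent to the hardware and the value shown to the operator come from two different code paths, which is exactly what makes the attack invisible to the monitoring layer.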

Stuxnet is completely harmless unless you happen to attach the exact same hardware the Iranians had plugged into their SCADA controllers. Just to be very clear: Stuxnet's payload was specifically crafted to attack the known configuration of Iran's uranium centrifuge program.

Actually, Stuxnet does have a target. Each Siemens system has a unique serial number. Stuxnet only manipulated systems with certain serial numbers, and it so happens that these serial numbers only existed in Siemens systems in Iran.
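The serial-number targeting described above amounts to a simple fingerprint gate in front of the payload. A hypothetical sketch (the identifiers below are invented, not real Siemens serial numbers):

```python
# Hypothetical sketch of the targeting logic described above: activate
# only when the connected system matches a known fingerprint; stay
# inert everywhere else. IDs are invented for illustration.

TARGET_IDS = {"6ES7-315-A", "6ES7-417-B"}   # invented identifiers

def should_activate(system_id: str) -> bool:
    """Return True only for the specific systems the payload targets."""
    return system_id in TARGET_IDS

assert should_activate("6ES7-315-A")       # matching system: payload fires
assert not should_activate("6ES7-999-Z")   # any other plant: stays dormant
```

This is why the worm could spread widely yet do visible damage only in one place: on every non-matching system the gate fails and nothing happens.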

Negligible amounts, if any. The real reason it would end up in the atmosphere would have been the fault of the engineers working at the plant. If you don't know how to build control systems, you shouldn't be building a nuke in the first place.

Anyhow, SCADA networks, because of how practically all of them are designed, should be separated from untrusted networks anyway, preferably with all control going through some bridge that wouldn't pass "wrong" things. Better yet, all control should pass through a human.
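The "bridge that wouldn't pass wrong things" idea can be sketched as an allowlisting gateway between the untrusted side and the control network. Command names and bounds below are invented for illustration:

```python
# A minimal sketch of a command-filtering bridge: forward only commands
# on an explicit allowlist, and only with values inside sane bounds.
# Command names and limits are hypothetical.

ALLOWED = {
    "set_valve": (0, 100),      # percent open
    "set_speed": (0, 65_000),   # rpm
}

def bridge(command: str, value: float):
    """Return (command, value) if permitted, else raise."""
    if command not in ALLOWED:
        raise PermissionError(f"command {command!r} not on allowlist")
    lo, hi = ALLOWED[command]
    if not lo <= value <= hi:
        raise ValueError(f"{command} value {value} outside [{lo}, {hi}]")
    return command, value

assert bridge("set_valve", 50) == ("set_valve", 50)   # sane request passes
try:
    bridge("set_speed", 84_600)   # out-of-bounds request is refused
except ValueError:
    pass
```

Note the deny-by-default shape: anything not explicitly listed is rejected, which is the opposite of how most bolted-on SCADA gateways actually behave.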

I'm sorry, but I must have missed the proof that the US or Israel was responsible for Stuxnet. But of course actual facts are old-fashioned and can be quite problematic when they don't support your world view.

... I can see not publicizing vulnerabilities. We don't, for instance, want our military publicly posting our vulnerabilities, because they sure as anything aren't going to ask for public patches. Public disclosure only really works if someone in the public can help. On the other hand, if you are running legacy systems in any number of unknown locations, you can't apply the patches anyway.

We always talk about how bad obfuscation is as a security vector. However, it is a vector. Knowledge of a thing is a prerequisite for attacking it.

In the event that a cyber attack did cause collateral damage (unlikely, in this case, but maybe not for future ones), whoever is pressing the launch button had better be in uniform.

Why? Military operations against actual targets are legitimate acts of military aggression. The Laws of Armed Conflict (LOAC) are the legal basis for determining whether an act is a legitimate act of war or a war crime.

This is why we don't prosecute fighter pilots for targeting a bus with a JDAM that is known to be carrying Al-Qaeda fighters.

Civilian casualties are regrettable, but kinetic operations are not going to be shelved on that basis alone.

Wow, that's cold.

It's easy for you to say that when it's not your wife and children on that bus. If they were, you might have a different view on whether or not merely being a uniformed cog in an industrial death machine should allow "regrettable" murders to be shrugged off as if they were heavy rainfall.

Funny thing is, I thought I was taught in high school that the "can't prosecute me, I was just following orders, sir" defence was smashed apart at Nuremberg. Apparently that wasn't the case?

I'm not sure it would have done much good. The general consensus is that this was a case of a determined attacker with a lot of resources, not some nutter on the Internet with a copy of the latest Virus Generator Toolkit (TM).

How much weight we should give that opinion is something I'm not going to discuss.

In any case, you think a determined attacker is going to be put off by a small thing like that? Hell, if it boils down to it you either organise double agents to apply for jobs at the target site or you target someone who already works there with a brown envelope full of unmarked, non-sequential notes. The latter is high risk, but find the right person, someone who's in debt up to their eyeballs and has been keeping it from their family for some time perhaps, and away you go.

Why would you need heavy security for something that is air-gapped? If the Bad Guys (TM) get physical access you've lost anyway! We didn't even require passwords for access, because the keyboard was locked in the control cabinet. The only time it was networked was when someone hooked up a modem so it could be remotely debugged or upgraded. After which the modem was disconnected.

Imagine a power plant that takes little to no intervention throughout the year. At most, the engineer(s) only need to make adjustments when changing out fuel rods or during an emergency. Now imagine the engineers who make these changes make $200k+ a year and your company has 10 such reactors. Introduce the internet and... profit!!!

I'd imagine it would be because the company that makes the machines you're controlling only makes drivers and control software for their own special computer systems that you have to buy from them. The advantage there is that if any part of the system goes wrong, from computer to end product, you have a single point of contact to get support from.
I think the mistake that many people on /. make is thinking that everyone is a 'computer guy', where in reality the people running these computers just know how to use them.

Quite a lot of people lately. Management wants to see their production on their office computers. They're easy to network and of course easy to hack. Siemens is not the only one vulnerable here, *cough* Schneider *cough* Schweitzer *cough* ABB...

Who is running industrial systems with direct contact with Internet anyway?

Here's a thought: why does it matter? So far there has been only one demonstrated attack on a SCADA system, and that attack didn't use the internet as its vector.

SCADA systems benefit greatly from being connected to the world, but not directly. There should be many tiers of security both virtual and physical. It is the physical security here that was lacking. The best airgap in the network doesn't help you if one of your underlings plugs an infected USB stick into a machine on the process control network.

I would leave an exposed SCADA interface in the open. After Stuxnet it should be clear that securing SCADA interfaces should be done at a higher level -- by putting them in a separate VPN, etc. Whether the vulnerabilities are public or not doesn't change whether a given setup is secure or insecure by design...

Now imagine the scenario where you have windows machines on the same network as your SCADA devices because the tools you've bought or built work this way. Someone attaches an unauthorized device to your network and fail, fail.

Now, I think we can probably agree that you can and should take steps to prevent something like that from happening, but there is the issue of getting from point A, where your network is insecure, to point B, which requires at least buying or developing a whole bunch of new software.

Aren't those development tools rather than run-time tools? If so, isolate your system and get serious about how you allow stuff to be moved over to it.

Not really. The process control is done on real-time controllers, but visualization is usually on windows machines. Data historians, configuration databases, OPC servers, etc are often Windows servers. Add to that that hotfixes and service packs have to be vendor approved before putting them on the live system. This means that those systems often run whatever was approved at the time of installation, which can be years out of date.

Many SCADA and DCS systems are also horribly insecure, have default or hard coded administrative passwords, etc. What doesn't help is that they are often managed by people who are good at the actual process stuff, but not necessarily at security or system administration.
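One cheap mitigation for the default-password problem described above is a periodic audit of every device against the known factory credentials. A hedged sketch; the credential list and the login callable are hypothetical stand-ins, not a real device API:

```python
# Defensive sketch: flag devices that still accept a known factory
# default credential. The credential list and `try_login` interface
# are hypothetical.

KNOWN_DEFAULTS = [("admin", "admin"), ("admin", "1234"), ("root", "root")]

def audit(devices, try_login):
    """Return the devices that accept any factory-default credential.

    `try_login(device, user, password)` is an assumed callable that
    returns True on successful authentication.
    """
    flagged = []
    for dev in devices:
        if any(try_login(dev, u, p) for u, p in KNOWN_DEFAULTS):
            flagged.append(dev)
    return flagged

# Toy stand-in for a real login attempt:
creds = {"plc-1": ("admin", "admin"), "plc-2": ("alice", "s3cret!")}
fake_login = lambda dev, u, p: creds[dev] == (u, p)
assert audit(["plc-1", "plc-2"], fake_login) == ["plc-1"]
```

Even a crude script like this catches the low-hanging fruit that, per the comment above, the process people running these systems rarely have time to look for.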

The one who should tell its customers about the problem is Siemens. But it will play the problem down because it might affect the sales of the next batch of stuff.

The evil hacker will just buy a bunch of systems, analyze them, and find the vulnerabilities. This is completely independent of the disclosure. Stuxnet was developed before this disclosure, and I think the vulnerabilities used by Stuxnet are still there.

This is why security by obscurity does not work in the real world.

Most definitely. Comments about someone not being able to afford to buy the devices notwithstanding, it is very much what someone would do if they were to attack a system or come up with a new Stuxnet.

And yet obscurity is a valuable tool in security. Absolutely it should not be the only tool - but discarding it completely is like saying we should discard firewalls because firewalls can't stop all attack vectors. They are all tools, and none is sufficient security on its own - but many different tools used in conjunction can make for a formidable defense.

The one who should tell its customers about the problem is Siemens. But it will play the problem down because it might affect the sales of the next batch of stuff.

Speaking of the real world, this is something that doesn't happen with a typical control systems vendor. Your typical vendor releases pages of errata. Your typical vendor knows what you have purchased, and your typical vendor typically comes running in with a fix, be it hardware or software, or a temporary workaround while they come up with the fix.

This has happened to us on many occasions. In one instance, a certain series of commands issued on the display graphic would cause an alarm manager to stop responding.

Except that the bad guys have already been made aware of the vulnerability, since Stuxnet is out there for anyone to analyze. Do you think the Iranians have not been picking apart Stuxnet and trying to figure out how they can use it?

That's specifically not what they're doing...telling the affected people about it. They're keeping that information to themselves- because it might reveal the exploits in question. As for not disclosing because the bad-guys might figure it out...heh...keep fooling yourselves folks. The bad-guys almost always KNOW about them- it's why they call 'em "0-dayz".

Actually, it's probably the CIA, NSA and other TLAs that truly want the security holes. They're just using the DHS as the mouthpiece to convince the companies to keep quiet and not plug the holes. After all, without those holes, Stuxnet (and likely other worms/viruses/trojans) wouldn't be as effective as they apparently have been.

I'm not so sure: Obviously, assorted sinister TLAs are happy to exploit available holes; but all but the really stupid ones have to realize that they don't exactly live in a unipolar world when it comes to writing viruses, and that the US (and its assorted western buddies) have a lot to lose in an atmosphere of general SCADA-smashing.

If all SCADA systems become deeply vulnerable, who loses more? Industrial or post-industrial societies with high levels of complexity that could be on the edge of collapse with a few days of supply chain disruption, or the dusty low-GDP countries of the world where disenfranchised hackers, cheap laptops (and/or exploits provided by friendly powers using them as proxies) are still easily available?

There's another layer here. Having vulnerabilities allows TLAs to do sneaky things, empowers them, and helps them do their job. And you're right, those things cut both ways. But, having them cut both ways ALSO empowers the TLAs. They all got a big budget boost in the aught years. Security was suddenly a really important thing.

If the CIA etc. really wanted to infect these things, why wouldn't they just infect the machines at the factory, then use a front company to sell them to Iran or whoever on the cheap?

I remember seeing a JAG episode once where the spooks deliberately allowed some bad guys to steal an F-14 and extract its control software (knowing that the software was to be given to Iran for an upgrade of its F-14s, and knowing that the software was deliberately defective), and I see no reason it couldn't happen in the real world.

Responsible disclosure meant giving the company time to fix the vulnerabilities before you released the info to the public.
So the companies stalled and never fixed the vulnerabilities, tried to sue the researchers, etc.
Responsible disclosure came to mean giving the company a set amount of time to fix the vulnerabilities before you released the info to the public.
So the companies kept threatening researchers, called the grace period "extortion" and such, stalled for more time to "test", and still didn't fix the vulnerabilities.

A hole or bug's lifetime is forever. If you find a bug or a hole and you choose to ignore it, it will not go away. It will be there waiting for its moment to ruin your morning. Maybe bugs and holes are not as important as people dedicated to the racket industry think. So if you can't fix them in the morning, you can fix them after tea, as long as you fix them today.

Believe me, I know all about it, as I'm a guy who designs and puts those links in. I've been making a big push with the uppers for more secure links, and it is starting to get in now: licensed bands with AES, etc. Previously it would all be modpacs or Canopy links. We're starting to move to licensed stuff with AES now, but damn near everything in this area was already done like this. Anyone could pretty much get on the network with the right know-how. Most of the stuff is the water control systems for the city.

I do control/command systems as well, and there are many reasons why there's little if any security. One of them is that in the (rare) case of a reboot, you want the system back online automatically as quickly as possible. You don't want to wait until 9am when the first employee who knows the password can type it in... hence no passwords. Note: I'm NOT saying it's a good thing!

Heh... They're "thinking" about using crypto on things like the radio links. They're "concerned" about things like "latency" (here's a hint: if you're worried about injecting 1-2 characters' worth of transmission-time delay at 9600 baud, you're doing it wrong), so the industry's been reticent about trying to at least lock down some aspects of the remote links. The biggest problem is the downtime of some systems, in addition to the overall expense, while they retrofit to higher data rates and end-to-end encryption.
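The 9600-baud point is easy to verify with back-of-the-envelope arithmetic. Assuming standard 8N1 framing (10 bits on the wire per byte), each extra byte costs about a millisecond:

```python
# Back-of-the-envelope check of the latency claim above: at 9600 baud
# with 8N1 framing, one byte takes ~1 ms to transmit, so even a
# 16-byte crypto overhead (e.g. an IV or a block of padding) adds
# under 17 ms -- far below anything an operator would notice.

BAUD = 9600
BITS_PER_BYTE = 10          # 8 data bits + start + stop bits (8N1)

def tx_time_ms(n_bytes: int) -> float:
    """Transmission time in milliseconds for n_bytes at 9600 baud, 8N1."""
    return n_bytes * BITS_PER_BYTE / BAUD * 1000

per_char = tx_time_ms(1)
overhead = tx_time_ms(16)   # a hypothetical 16-byte crypto overhead
assert 1.0 < per_char < 1.1   # about 1.04 ms per character
assert overhead < 20          # well under a human-noticeable delay
```

In other words, the "latency" objection evaporates under a minute of arithmetic, which is the commenter's point.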

SCADA security isn't. I'm sorry but it's true. And the entire "security industry" is talking just like all the slashtards commenting.

Doing security right in this environment is non-trivial. The SCADA/ICS vendor community isn't providing it because SCADA/ICS customers aren't asking for it. The downside, of course, is that the SCADA/ICS customer is NOT the individual who is going to suffer when the screwups happen. The SCADA/ICS vendors and customers have little incentive to change that.

Astonishingly, yes, at least if you count ProQuest [umi.com]. Not that I'd bother reading it (or at least anything but the background material) if I were you -- it was basically about hooking up a SCADA emulator to Snort and an alert correlator to make a testbed you could deploy potential attacks against to see if your filter configuration worked.

Long ago, I worked as an IT admin for a grocery company that owned its own bakery, ice cream, drink, etc. plants. The "industrial control systems" I saw in use were the worst-engineered pieces of junk I've ever encountered. I am talking unpatched Windows 95 systems running a crappy VB 4 UI that talked to a poorly written VxD to control the ice cream mixer, which was a massive piece of equipment that could easily kill someone standing too close to it.

From what I hear from my friends (one of whom used to program for defense embedded systems), it is all like that. Terrible platforms, terrible code, security through obscurity (if that), etc.

There's no practical way to defend the embedded system from the device which programs it. So while it's true that the tools used to program embedded systems are often primitive, it has little to do with attacking them.

The fact that the video signals from some of our drones are broadcast unencrypted over the air?

As the person somewhat responsible for managing encryption keys for wireless telemetry and wireless process control systems on my site, I call bullshit. Either that or you did your thesis in the '80s, old-timer.

The problem with such systems is that they can be 'infected' through the programming platform. In the case of Stuxnet [wikipedia.org] it was the PCs used to program the PLCs that were infected. And one of the vectors of infection was the use of infected USB flash drives on these (Windows) systems. Programming PLCs is often done through a direct cable connection, so while keeping industrial control systems off 'The Internet' may be a good idea, it isn't sufficient to prevent such an attack.

there should also be strict government oversight to ensure the vulnerabilities are being fixed.

... And that the fixes don't make it to other governments. See: VUPEN's alleged Chrome exploit.

VUPEN released a video of the exploit in action to demonstrate a drive-by download attack that successfully launches the calculator app without any user action.

The exploit shown in this video is one of the most sophisticated codes we have seen and created so far as it bypasses all security features including ASLR/DEP/Sandbox (and without exploiting a Windows kernel vulnerability), it is silent (no crash after executing the payload), it relies on undisclosed (0day) vulnerabilities discovered by VUPEN and it works on all Windows systems (32-bit and x64).

VUPEN, which sells vulnerability and exploit information to business and government customers, does not plan to provide technical details of the attack to anyone, including Google.

Guess it depends on who you think the "bad guys" are. I say, show the world and let the good 'n malicious duke it out -- hint: Bug fixes are often easier to code than full exploits.

Hi, I’m Dirk Gebert, system manager for security for Siemens Industrial Automation Systems. I’m on the team working on the topic mentioned in this article.
We are posting updates on this website: http://www.siemens.com/industrialsecurity [siemens.com]. Let me know if you have questions that are not answered there.