The US House of Representatives has overwhelmingly passed a bill that would direct almost $400m toward research designed to shore up the nation's cybersecurity defenses.
The Cybersecurity Enhancement Act would approve $108.7m over five years to continue a cybersecurity scholarship program. In return, students would …

COMMENTS

Yeah

It is not like security work isn't the most complex element of software development. If you want to know cyber security you have to code, and not in some poncy language like .Net, PHP or Java; you need to code in assembly and have an intimate knowledge of where the instructions meet the hardware.

And the only way to code well in assembly is to understand how higher-level languages are compiled. This is computer science on steroids, and most CS students are shite at coding anyhow, so you are looking at a very small number who will really be able to comprehend cyber security.

Yes and No

I am one of those assembly language programmers possessing an "intimate knowledge of where the instructions meet the hardware". I am very fond of programming on bare metal, but I've had to incorporate platforms such as Java, .Net and PHP because that's where the businesses are. This is where the security battles will need to be fought.

It's true that platforms have internal bugs (I'm looking at you, PHP), and unmanaged languages can leave open low-level overflow attacks (C). However, in my experience most security bugs exist at a much higher level, where low-level hardware knowledge isn't applicable.

That said, I agree CS programs have been watered down, but this is a reaction to the market no longer needing those skills. Until the market appreciates us more, I could not in good conscience advise a student to become an assembly language expert.

Question...

perpetuating the fundamental error

This initiative is likely to do little more than perpetuate the error of considering "cyber security" a technological issue. It isn't - it's a conceptual issue. Its current state of weakness is a function of the same appalling quality of risk judgement that is increasingly evident in national policy decision-making (Katrina, Homeland Security, banking &c.). We have become so dependent on rule-based systems (both technological and social/legislative) that we have effectively ceased to be able to think flexibly and holistically. As a result we race behind the bad guys fixing a cascade of symptoms, unable to recognise, let alone address, the fundamental disease.

Contrary to popular opinion, software development is not such an overwhelmingly complex activity that it's impossible to create error-free code. You just have to pay attention, really understand what you're doing, and, most importantly, actually care about what you're delivering. It seems the majority of developers/programmers don't, don't and don't - not because they use abstracted high-level development tools but because they rely on such tools to absolve them from taking the personal responsibility for getting it right. It's an attitude problem before all, and is no different from the almost universal desire of our student population to get the degree without having to make the effort required to actually learn the subject.

We need people in charge of our security (and that includes not only "security specialists" but also application and service designers, programmers, testers, deployers, service managers and users) who actively seek to bear the requisite responsibility for fulfilling that task. Such people will make sure of their own accord that they are sufficiently competent to do so. Absent that attitude, no training programme will help.

Why

It's about time...

I was wondering, what with the DMCA/Patriot Act nonsense practically outlawing creative hacking, when they would wake up and realise that the best person to have on your security detail is Somebody Who Can, instead of somebody who just thinks they can.