President Barack Obama focused on a number of new cybersecurity proposals that will encourage greater information sharing between the government and corporations. How boards of directors and CXOs can build the proper foundation to address today's IT security challenges is the topic of Cybersecurity Boardroom Workshop 2015, a 2-day seminar that well-known cybersecurity expert Edgar Perez will conduct in the financial centers of Dubai, Hong Kong, Seoul, Singapore, London and New York City. This is the first seminar developed for leaders for whom cybersecurity preparedness is a relatively new yet critically important area in which to be intelligently conversant.

Thanks for your comments. I completely agree that responsibility extends beyond IT and should include executive-level management, up to and including the board. I do find this to be a two-front battle: one front is engaging executives who have no appetite for technological detail, and the other is getting IT to open communication lines that have hardened over time. This article was meant for an IT audience and needed to be focused as such; a broader conversation could have explored some of the ideas you mention. I appreciate and share your perspective.

Deena Coffman's analysis that responsibility for IT security should be taken away from IT departments and moved elsewhere is certainly very helpful. Today's mainstream IT security is broken beyond repair; however, typical IT departments still try to conceal that ugly fact and are therefore not very likely to fix that huge problem.

But finding a new scapegoat in the form of an Information Security department that does not report to IT won't help much either - most likely it will just result in buying more add-on products and hiring more experts without solving the underlying problem. Rather, the bank's top management now needs to accept ownership of the IT security misery itself. Why? Because the bank's core business depends so heavily on IT - much more than in any other industry.

The next step is to do an honest root cause analysis of why, for heaven's sake, today's mainstream IT is so vulnerable. The outcome is quite likely to reveal that today's mainstream IT is built on vulnerable hardware and software platforms - on PC technology (initially designed for single-user systems) and on a networking architecture that was initially designed for closed user groups whose members could trust each other.

True - for many years a huge amount of work has been done, and huge amounts of money are still spent each year, trying to retrofit security into an environment that was never meant to be secure from day one. Looking at the poor shape of security in mainstream IT today, most people will readily agree that the success of those attempts has been rather limited so far.

Top bankers are likely to understand that it takes a lot of effort and money to create a much more robust and secure IT infrastructure. They are also likely to understand the ugly risks of not doing so.

Now they need to make a bold decision to move away from vulnerable IT infrastructures - and yes, there are better alternatives around, and their IT department can find them when directed to do so. Top management also needs to tell its bean counters about that new priority.

The outcome will be a much more robust IT infrastructure for banking and payment purposes that is not so easy for the average hacker to manipulate. Security by obscurity? In part, yes - but there is nothing wrong with that approach; in the real world it usually works quite well. And fortunately, operating and administering that new IT will turn out to be much less costly than today's mainstream IT.

I agree that cybersecurity poses a challenge to both students and institutions. Earning the education and credentials necessary to get a job in cybersecurity costs students a lot of time and money, and schools have to keep up with the constant changes in the field. Given that the field is pretty new, it's not terribly surprising that there is little coordination among the schools, businesses and accrediting institutions involved in educating and hiring cybersecurity professionals. Hopefully we'll begin to see more collaboration among them in 2015.

I couldn't agree more, though I feel that the issue stems from underdeveloped cybersecurity curricula and failed coordination between academic institutions, hiring organizations, and accrediting institutions. Cybersecurity is a fairly new field, and the combination of information security and physical security leaves some broad gaps and very heavy biases among "qualified" cybersecurity professors. Many physical security practitioners who moved laterally into the cybersecurity field may be more focused on biometrics, mantraps, and concertina wire. IT professionals who evolved into cybersecurity practitioners may fail to see the value in physical deterrence and mitigation. There has to be an even balance, and folks must stop referring to physical security as something separate from cybersecurity.

Many academic institutions are racing to develop new curricula to keep up with the expanding demand for cybersecurity practitioners. The increase in job market demand leads to increases in students and, ultimately, student revenue and tuition fees. The problem here is that a student may spend thousands of dollars and anywhere from two to eight years studying cybersecurity. Yet without the proper certifications, which are required by many employers, the student will still be hard-pressed to find a post-graduation job that pays a reasonable salary. There has to be a way to bridge this gap, not only between academic institutions and the organizations that hire their graduates, but also between academic institutions and accrediting organizations, such as CompTIA and ISC2. Common nomenclature and standardized curricula are vital to a healthy and qualified cybersecurity workforce.