I've been watching the brouhaha over the article in WSJ for most of a month now, with some bemusement. Essentially, 95% of the 'informed opinion' in the infosec blogosphere has been along the following lines:

- The WSJ is irresponsible to have published this piece;
- The journalist is even more irresponsible to have penned it;
- It is outrageous!! Something Must Be Done!! Prepare the noose!!

What I haven't seen anyone cover in depth as yet is the concern that information security controls on the corporate desktop are so pathetic that an editorial piece in WSJ can blow them wide open. Que? Aren't the bloggers completely missing the point?

I've never bought the argument of 'security by obscurity' which they seem to be arguing for. We in the infosec profession should be redoubling our efforts to design and apply sound desktop security controls, not bleating at the journalist who says "The King has no clothes". As to those 'infosec pros' who are baying for her blood, shame on you. Shooting the messenger won't alter the fact that desktop security stinks.

Isn't this just the same argument as with full disclosure of security vulnerabilities? Most of the profession are outraged that someone would even consider posting an exploit in a public forum, let alone doing so without giving the relevant party time to analyse it, create and test a fix, and then wait N months for everyone to implement the patch. Hackers, meanwhile, argue very convincingly that unless exploits are at least disclosed "responsibly", the underlying vulnerabilities will never be fixed because vendors are far too busy adding new bells and whistles. They say that crackers, the criminal underground and 'terrists' will eventually discover the self-same vulnerabilities and exploit them for criminal purposes, and the world as we know it will come to a sticky end. Both points of view have merit but the real issue is that FAR TOO MUCH SOFTWARE HAS BLATANT BUGS THAT CREATE SECURITY VULNERABILITIES BECAUSE SECURITY IS NOT A DEVELOPMENT OR SALES IMPERATIVE. In that context, the full/responsible disclosure argument is simply irrelevant bickering.

I'm looking forward to the WSJ's forthcoming editorials blowing open web security, multifactor authentication, database security and all those other oxymorons so beloved of the 'infosec profession'.

Although the survey and case studies are European in origin, I'm sure the general discussion and ideas on the thorny issue of measuring information security awareness programs, and in fact measuring information security as a whole, are broadly applicable. Three-quarters of the Europeans surveyed said they have to do security awareness as a compliance requirement. I didn’t realize it was such a high proportion.

References in the report to the lack of consensus and evolving good practices indicate the variety of awareness and metrics techniques in use. I was interested to see markedly different opinions on the value of CBT (Computer Based Training) or posters, for example, and ambiguity throughout the report about "training" vs "awareness" (NIST SP800-50 speaks to the difference, as does the NASCIO report noted below). I heartily agree with the implication that security awareness should be a rolling year-long event, continually updated to reflect current issues, rather than a sporadic/once-a-year training course (the dreaded 'sheep dip'!) or, even worse, the once-a-career induction course, no matter how effective classroom-based training may be.

The awareness topic list on page 5 of the report seems 'about right' to me although there are many other topics perhaps worth covering (e.g. software development, database security, privacy ...) if you are creative about it, which also helps keep the program fresh and interesting. All in all, it's 20 pages well worth reading.

"Since a holistic approach to security revolves around people, cultural change is needed to truly ensure that employees and contractors understand their IT security responsibilities and take them seriously."

The report promotes the value of continuous, long-term, broad-based security awareness activities in addition to more narrowly focused and spasmodic training activities.

"Continuous and ongoing awareness and training activities for state employees (and contractors) could help prevent a major state crisis ... Cultural change to the fabric of the state government workforce is needed to make IT security and the ethical use of state IT resources as ubiquitous as technology. Since that cultural change involves changing the way that state employees perceive IT security, consistency and patience are necessary ingredients. Isolated presentations or training sessions, while a good start, will not lead to the creation of a long-term culture of IT security. After all, state employees, like everyone else, have many plates to juggle and may not retain the entirety of the aweareness and training content to which they hjave been exposed, expecially upon the passage of months or years. Hence, regularized and constant reminders in mand forms are needed the enact this cultural shift ... Consistency is a key factor. One isolated presentation does not make for adequate awareness. Presentations on a more frequent basis can help to keep IT security at the forefront of government officials' agendas so that executive and legislative support does not wane over the long term."

Absolutely! This is probably the key reason that old-fashioned "security awareness" programs (usually consisting of sporadic and uncoordinated security training sessions in fact) do not achieve the instant results that are anticipated. People who naively expect security awareness to turn things around within a few weeks or months are missing the point: genuine cultural change takes continuous gentle pressure in the right direction over years not weeks.

"Innovative approaches may serve to spark IT security awareness in the minds of many state employees. By starting with a marketing campaign of sorts for IT security, a state can start to build a culture of IT security vigilance."

Again, I agree wholeheartedly. With the marketer's hat on, NoticeBored's security awareness posters (for example) are effectively 'advertising' information security as a whole, with a touch of humor and a little information on the monthly awareness topics for good measure. A distinctive logo on all the materials helps bind them into a whole, while the underlying messages in all the materials reinforce the fundamental core values in information security such as: confidentiality, integrity and availability; risk and control; and prevention, detection and correction. This is quite clearly a branding technique. [By the way, that idea suggests to me a novel way of measuring the effectiveness of security awareness programs, namely using the same techniques that marketers use to assess the effectiveness of advertising programs. Surveys might for example assess the recall of key program images, sayings and messages by representatives of the target audiences, and measure the retention of information security concepts compared to 'competing' awareness initiatives such as health-and-safety or legal compliance.]
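By way of a sketch, that brand-recall idea could be scored exactly the way marketers score advertisement recall. The survey data and campaign-element names below are entirely invented for illustration:

```python
# Hypothetical survey responses: each set holds the campaign elements
# one respondent could recall unprompted (all names are made up).
responses = [
    {"infosec_poster", "health_safety_poster"},
    {"infosec_poster"},
    {"health_safety_poster"},
    {"infosec_poster", "legal_compliance_notice"},
]

def recall_rate(responses, element):
    """Fraction of respondents who recalled a given campaign element."""
    return sum(element in r for r in responses) / len(responses)

# Compare the security campaign against 'competing' awareness initiatives
for element in ("infosec_poster", "health_safety_poster", "legal_compliance_notice"):
    print(element, recall_rate(responses, element))
```

Tracked quarter by quarter, a rising recall rate for the security campaign relative to the competing initiatives would be a simple, marketer-friendly awareness metric.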

As you read the report, do check out the sidebars with numerous examples of security awareness activities from several states. Many of them have a public outreach element with security awareness activities targeted beyond state employees.

The NASCIO report quotes Insider Security Threats: State CIOs Take Action Now! published earlier this year from which the graph above is taken. The obvious increase in incidents on the graph presumably reflects better incident reporting processes (otherwise there seems to have been a severe lapse of security since 2005) but the proportion of insider vs external hacker attacks is interesting. Insiders, of course, have ready access to the information required to do their jobs and often much wider access to information due to the practical problems of trying to enforce 'need to know' outside of a military context. When insiders go bad, therefore, they can cause a lot of damage without triggering the intruder alerts that (some) hackers trip. Other insiders are often best placed to identify and report internal security incidents, provided they are aware of their responsibilities and know what to look out for - in other words, security awareness is a very important element of control against the insider threat.

The report also touches on the difficulties of getting executive support for security awareness and offers some practical tips, essentially starting with specific high-level security awareness activities targeting the very executives who should understand and fund awareness.

Go ahead: print out both reports, sit yourself down somewhere quiet with a cup of coffee, red-pen them and cogitate. There are good ideas and complementary approaches in both of them. I certainly came away with a number of interesting thoughts and quotations that will appear on the NoticeBored site and our awareness materials in due course.

Aug 24, 2007

WatchGuard has released some short audio clips making fun of IT help desk callers and IT pros alike. If you're into multimedia presentations, they might liven up your security awareness seminars, along with their video clips on drive-by malware and rootkits.

Aug 22, 2007

We've received loads of similar malware spams today, all basically the same structure with minor differences and spelling mistakes (see above).

The links vary but we understand that at least one attempts to infect visitors' PCs with a downloader Trojan. Good, up-to-date antivirus software should trap it, but do not rely on this as your sole control: not all antivirus programs recognize it.

A quick search of my spam/deleted box for emails containing the string "account number" reveals a whole bunch of them received so far today.
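That sort of search is easy to script if your mail client can export messages. A minimal sketch using Python's standard email parser (the messages below are invented stand-ins for a real spam folder):

```python
import email
from email import policy

# Invented stand-ins for raw messages in a spam/deleted-items folder
raw_messages = [
    "Subject: Membership details\n\nYour account number is enclosed. Click here!",
    "Subject: Lunch\n\nSee you at noon.",
    "Subject: Urgent notice\n\nConfirm your account number immediately.",
]

def suspect_subjects(raws, needle="account number"):
    """Return the subjects of messages whose body contains the search string."""
    hits = []
    for raw in raws:
        msg = email.message_from_string(raw, policy=policy.default)
        if not msg.is_multipart() and needle in msg.get_content().lower():
            hits.append(str(msg["Subject"]))
    return hits

print(suspect_subjects(raw_messages))
```

In practice you'd iterate over an exported mbox file (Python's `mailbox` module reads those directly) rather than a hard-coded list, but the matching logic is the same.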

The spams are believed to be the result of a new variant of the Storm worm that has been very active for weeks. The SANS Internet Storm Center has some technical info on it and there's more on F-Secure's blog.

The usual advice "Don't click on dubious links" applies here. Now might be a good time for your security awareness person to inform your fellow employees in calm, helpful tones about the threat. PLEASE do not add to the problem by circulating wild warning emails with "Please tell everyone you know!" or similar - leave the job to the professionals and the news media. Oh and don't forget to check that your antivirus software is updating itself regularly.

The CSO Executive Council is running a series of surveys to assess security metrics practices. The latest survey report revealed that two thirds of respondents do not gather security program data in order to create statistical reports for senior management, and followed up with the following sample of explanations:

- Not requested, and no value to security program at this point.
- Lack of management interest in seeing security metrics.
- Lack of interest by senior management.
- No funding.
- Embryonic.
- Information is gathered and presented to senior IT security management.
- Security organization is not established due to budget constraints.
- Nobody is asking, and I would not know what to prepare.
- Time, not sure what to measure.
- No good collection method.
- Didn't start to do it yet. We plan to do it in the near future.
- Data points too qualitative.
- No manpower.
- No formal security program.
- Not my responsibility.
- No demand from The Clueless.
- Not my role.
- Narrative reports are provided, not statistical.
- Not needed for awareness, budgeting, etc.
- Haven't developed metrics.
- Management doesn't know that they want this.
- Not requested.
- Don't have the requisite systems in place.
- Insufficient resources to gather automatic and consistent metrics.

"Lack of interest from senior management" caught my eye and "No demand from The Clueless" made me smile but rather than simply accepting this sad state of affairs, how about running some security awareness activities to give senior managers a clue? If information is seen as a valuable organizational asset, the need to protect it is a natural and easy step (and if not, you have more fundamental issues!). If protecting information assets is important, measuring the extent of protection and identifying improvement opportunities is also important, isn't it? So there we are: an executive security awareness program in one paragraph.

Aug 21, 2007

Educational Security Incidents (ESI) is a blog comprising brief summaries of (mostly privacy related) security incidents culled from the news media. These are intended to be used for security awareness purposes: analysis and deconstruction of the incidents can indeed be used for case studies or just to pep up other awareness materials.

Stories of security incidents from within the organization are even more powerful, although in highly political organizations they are quite likely to be suppressed by those involved. I know of at least one Internal Audit function that uses incidents in this way, regardless of the company politics: they produce an annual booklet describing chosen incidents, in each case outlining the background to the situations and the impacts, and usually they add some subsequent commentary about how the controls were (belatedly) changed for the better. The booklet becomes a control, governance, security and fraud education resource for management. Nice!

Aug 18, 2007

Details of audits of US Army websites and blogs run by soldiers, disclosed under the Freedom of Information Act, reveal that far more security policy breaches occur on official Department of Defense websites than on soldiers' blogs. The audits were conducted by the Army Web Risk Assessment Cell, a special unit with a remit (evidently) to minimize unauthorized disclosure of sensitive military information via the Web.

It seems to me the Army was absolutely right to highlight the information security risks relating to blogging. I believe the audit results reflect the outcome of a highly successful security awareness program. If this issue had not been addressed so effectively, I'm convinced there would have been far more noncompliance issues, in other words this is a lesson to us all.

In respect of security awareness programs and policy compliance, 'the military' have a significant advantage over most of us in that the workforce is specifically trained to respect authority and follow orders - or at least, that is the classical view. In fact, I understand modern soldiers are increasingly being taught to think for themselves and operate autonomously, albeit within a highly structured (literally 'regimented') operating framework and, when necessary, at gunpoint. The traditional approach to the blogging security issue would presumably have been to send out an order banning blogging. What actually happened was more subtle: Army bloggers were instructed to register their blogs with commanding officers and pre-clear what they publish. 'Blog responsibly' is a rather softer message but the audit results seem to indicate its effectiveness in this situation.

Aug 16, 2007

I have been researching the origins of ISO27k, particularly the bit before it was launched as BS7799 in 1995, to complete the 'definitive history' on ISO27001security.com.

I dimly recall using an A5/booklet version of the Code of Practice for Information Security released by BSI DISC as PD003 in 1993, and an accompanying informational booklet PD005. I have also heard about but can't quite remember a "Users code of practice for security" released by the UK's National Computing Centre (NCC) in the late 80s/early 90s, which I believe was largely derived from a Royal Dutch/Shell information security policy manual.

Does anyone reading this have copies of PD003, PD005, the NCC document or Shell's original policy manual, please, or other relevant information from that pre-1995 period? If so, please contact me (gary@isect.com). I'd really appreciate your help to set the record straight.

Lawyers acting for the US Federal Trade Commission in an anti-trust case against a food company released inadequately-redacted documents, thereby disclosing highly sensitive proprietary information about the company's competitive strategies - "dozens of trade secrets" according to the Washington Post article. The failed redaction attempt involved pasting black blocks over the relevant text but since the original text was there 'underneath', it was a simple matter to remove the blocks from the electronic documents published. After being alerted to the gaffe, the lawyers printed and scanned the redacted documents: the hidden text cannot be revealed from the published scanned images, but of course it's all too late since Associated Press got the originals.
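The failure mode is easy to model. In the toy sketch below (the 'document' and its contents are entirely invented), redaction is merely an overlay object drawn above an intact text layer, so deleting the overlay restores everything:

```python
# Toy model of overlay "redaction": the text layer survives untouched
# underneath the black rectangle, as in the FTC filing. The document
# contents here are invented for illustration.
document = {
    "text": "Secret pricing strategy: undercut rival by 12%",
    "overlays": [{"shape": "black_rect"}],
}

def naive_redact_view(doc):
    """What a casual viewer sees: the overlay hides the text."""
    return "#" * len(doc["text"]) if doc["overlays"] else doc["text"]

def strip_overlays(doc):
    """What anyone with an editor can do: delete the overlay objects."""
    return {"text": doc["text"], "overlays": []}

print(naive_redact_view(document))       # looks redacted on screen
print(strip_overlays(document)["text"])  # the original text, intact
```

Printing and re-scanning (as the lawyers eventually did) works because rasterizing flattens the layers into pixels; proper redaction tools achieve the same effect by actually deleting the text from the file.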

Aug 14, 2007

That's the third on a list of a dozen observations on security failures by a bunch of Gartner security consultants. The list is highly cynical but most of the observations ring true. Here's another dozen.

Why is it, I wonder, that 'security awareness' has come to be so firmly equated with 'posters' and/or [generally annual] 'training sessions'? It's such a lame paradigm and does a huge disservice to those of us working on creative security awareness programs.

What we need is a security awareness awareness program. I'm just off to the printers to get some posters done. Anyone want to sign up for a security awareness awareness training session next August?

An Australian businessman chasing an AU$100m deal with some Nigerian businessmen has lost AU$1.7m in what sounds like a classic 419 advance fee fraud.

"[T]he scam started a year ago in Japan before spreading to other countries, and then ended in Amsterdam where he came for an appointment with his alleged business partners. After advancing large sums of money, supposedly for such things as notary fees, the Australian man finally started getting the idea that he was being ripped off, police said. He alerted Dutch police who were then able to arrest the three suspected swindlers in an Amsterdam hotel where they had arranged to meet the Australian with a suitcase full of money claiming it would soon be his."

Being a businessman, I guess he assessed the potential reward and decided that a 1.7% advance was worth the risk, but no more.

Aug 13, 2007

Through a series of nearly 300 “lessons”, the authors of Lessons Learned in Software Testing (~$27 from Amazon) share around 60 years of accumulated wisdom about how to test application systems - not so much which buttons to press but more how to establish and manage a test team, plan the work and dynamically adjust the testing process according to what is found and how much time is left.

Aug 9, 2007

In a shining example of integrity, transparency and customer service, 365 Main, a data center company that promises extremely high levels of availability, has published details of a serious power failure that took out service to over 40% of its San Francisco colocation clients for as much as 45 minutes. The diary of events describes the frantic investigative engineering work required to analyze and resolve a problem in the backup power systems, finally traced to a timing issue (one of the nastiest forms of software bug!) in a PLC (Programmable Logic Controller, the kind of component found in SCADA - Supervisory Control and Data Acquisition - systems) that failed to clear its memory reliably when the diesel generator control units reset. Although I'm not a SCADA security expert, the fact that the failure occurred after a number of set/reset events sounds like a memory leakage and buffer overflow problem to me, but then I'm reading another textbook about software security testing at the moment so it's on my mind.
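The general class of bug is easy to illustrate. The toy controller below is purely my invention (the real Hitec fault details were not published), but it shows how state that a buggy reset fails to clear can accumulate silently across events until one surge too many trips the system:

```python
class ToyController:
    """Toy model of a controller whose reset fails to clear residual state.

    Invented for illustration: the real PLC fault at 365 Main involved
    memory not clearing reliably when the generator controls reset;
    the threshold and mechanics here are made up.
    """
    def __init__(self):
        self.residue = 0      # state that *should* be zeroed on each reset
        self.running = True

    def surge_and_reset(self):
        self.residue += 1     # buggy reset leaves a little state behind
        if self.residue >= 5: # enough accumulated residue and control fails
            self.running = False

ctrl = ToyController()
for _ in range(5):            # five power surges in rapid succession
    ctrl.surge_and_reset()
print(ctrl.running)           # standby power control has now failed
```

The nasty property, as the diary shows, is that any test involving fewer consecutive events than the threshold passes cleanly, which is exactly why routine weekly start tests never caught it.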

In the course of explaining the failure, the company outlines the design of its "N+2" standby power system using ten 2.1MW diesel generators, two of which are backups in case of maintenance or failure of the remaining eight. This level of power system investment is evidently sufficient to deliver 99.99% availability ("four nines") in an area subject to "dozens of surges and utility failures" during the last five years, although it is patently insufficient to reach five nines. Close but no cigar.
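To put numbers on those nines, here is the standard downtime-budget arithmetic (the 45-minute figure comes from the incident above; everything else is just the definition of availability):

```python
MINUTES_PER_YEAR = 365.25 * 24 * 60

def allowed_downtime_minutes(availability):
    """Minutes of downtime per year permitted at a given availability level."""
    return (1 - availability) * MINUTES_PER_YEAR

four_nines = allowed_downtime_minutes(0.9999)   # roughly 52.6 minutes/year
five_nines = allowed_downtime_minutes(0.99999)  # roughly 5.3 minutes/year

# A single 45-minute outage just about fits within the four-nines budget
# for the year, but blows the five-nines budget many times over.
print(round(four_nines, 1), round(five_nines, 1))
```

So one bad afternoon consumed nearly the entire annual four-nines allowance, which is why sustained five-nines claims demand a fundamentally different (and far more expensive) level of redundancy, not just incremental investment.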

Describing the rapid sequence of five power surges as a "unique event" implies that they had not previously tested the power systems under the specific conditions that led to the failure. This is known as Sod's Law or Murphy's Law, I'm not sure which. The preventive maintenance and testing regime looks reasonable by most standards i.e. "preventative maintenance logs on the Hitec generators are currently available for customer review. All generators in San Francisco pass weekly start tests and monthly load tests where diesels are started and run at full load for 2 hours. Both of these tests simulate a loss of utility and the auto start function is accurately tested." That said, however, if I were advising them [which I am not!], I would probably suggest running occasional on-load tests for much longer - perhaps 24 to 48 hours or more - to ensure that the diesel tanks, pumps/valves and pipes are clear, to confirm their capacity for exceptional long-term outages, and to refresh the diesel in the tanks. One of our clients experienced a backup generator on-load failure due to a blockage between the diesel header tank and the main diesel tank: the header capacity was sufficient for short on-load tests but not for a multi-hour power failure.

Reading between the lines of the diary a little, it looks as if the company had 'full and frank exchanges' with senior management at Hitec, the supplier of the no-break diesel generators and controls. The fact that they name the supplier is perhaps indicative of a frosty chill in the business relationship, but equally could imply their confidence in the way the supplier responded to the incident.

Anyway, this is all fascinating and will probably form the basis of a case study in our forthcoming awareness module on physical security and environmental services for IT, due for release in October, or perhaps a later as-yet-unplanned module on application security. As with this month's case study based on the ongoing Ferrari-McLaren spying incident, real world cases often make more convincing classroom assignments. The trick is to summarize and crystallize the key factors into a format suitable for discussion.

My favourite discovery in the 101 is a neat little tool called SyncToy. At last I can replicate the files and directories from my desktop on the laptop, work for a while on the laptop and then re-synchronize to put the altered files back on the desktop. It works well. Having never quite got the hang of the built-in Windows functions for the same thing, I'm pleased to find a tool so easy to configure and use.

A majority of US Internal Revenue Service (IRS) employees failed a social engineering penetration test recently, despite their "security awareness training" warning them of the threat:

"In a test sample, nearly 60 percent of 102 IRS employees were duped into handing over their access information, the IG said in a report released today. TIGTA auditors used social-engineering methods to survey the degree of compliance with data security. Posing as help-desk representatives, they called IRS line employees, including managers and contractors, and asked for their assistance to correct a computer problem. They requested that the employee provide a user name and temporarily change his or her password to one TIGTA callers suggested. TIGTA test callers convinced 61 of the 102 employees to comply with the requests. Only eight of the 102 employees in the sample contacted the appropriate offices to report or validate the test calls, the report said. The sample employees were from across IRS’ business units and geographic regions."

I'm uncertain exactly what is meant by "security awareness training" - is it security awareness (ongoing, continuous awareness activities), security training (periodic training courses/classes) or some hybrid? The full report only refers to compulsory annual security awareness activities. Anyway, whatever they are doing is evidently having some effect (some tested employees did attempt to verify the calls) but pushing that proportion towards 100% will be very tough. ['Course I know a company that could help ... we'll be releasing an updated social engineering awareness module later this year.]

"Would update its security awareness program to include training on computer intrusions and unauthorized access and use existing media, such as the annual security training and security awareness week, to communicate IRS security standards on password protection procedures."

and also that it

"Had incorporated the topic of social engineering into its mandatory annual Online Security Awareness Training, which included examples and scenarios of attempts used to gain access to IRS systems. In addition, the IRS stated periodic reminders would be issued in the forms of (1) all-employee notices that would be included with employees’ Earnings and Leave statements and (2) articles in the computer security newsletter."

Obviously I would agree with the emphasis on security awareness but find the notion of an annual awareness event something of a joke. Putting notes on payslips and circulating a "computer security newsletter" are reasonable ideas but still nothing like enough to raise awareness. Is it any wonder so many IRS employees remain clueless?

Aug 4, 2007

CA is seeking $200m (!) from Rocket Software, claiming they used CA's intellectual property for their own database management system.

"The management software giant said in the complaint that Rocket hired programmers and software developers formerly employed by CA or Platinum technology International, which CA acquired in 1999. These employees (Mark Pompeii, Robert Schulien, Michael Skopec, and David Rowe), used CA's source code and development environment to fashion Rocket's software tools for the IBM DB2 relational database management system, CA alleged."

The transfer of intangible intellectual property in the form of employees' accumulated knowledge and experience is a frequent cause of trade secret disputes. The courts have a tough time differentiating deliberate theft and abuse of trade secrets from the application of general knowledge, experience, competencies and skills. Employees who move to a new employer inevitably find it hard to stop thinking in terms of their previous position, and may unintentionally transfer proprietary information to the new one. Former employers have problems proving that proprietary information was disclosed or taken by leavers, especially without hard evidence (e.g. data transfer media, email records etc.).

Aug 2, 2007

"Agency computer systems are vulnerable because many lack basic controls, and one of the best ways to improve information technology security is to improve the metrics for how departments measure how these basic controls are implemented."

Golly. Those in charge of rewriting FISMA have figured out that they probably need information security metrics to track government departments' performance.

OK guys, the next baby step is to work out what metrics are needed.

I'll put money on "number of security incidents" being one of the 'cutting edge security metrics' about to be proposed, followed shortly by some bright spark noticing and promoting NIST SP 800-55 as The Answer.

It's old news but I've just been reading about the Department of Veterans Affairs' response to the theft of a laptop containing Personally Identifiable Information from a VA employee's home. The response is in several parts:

1. "The VA announced the appointment of a special adviser for information security.2. "Members of the senior management team were forced to retire or resign and the hapless employee and his line managers were all sacked."3. The Secretary of State has ordered all VA employees "to complete an annual data privacy and cyber-security awareness training course immediately".4. Senior officials at the VA have been ordered "to compile an inventory of all workers and contractors who need access to sensitive data."5. Senior department managers have been told "to remind staff to protect sensitive information" 6. A "security review of all laptops" has been ordered.

The 'annual data privacy and cyber-security awareness training course' caught my eye. An annual course?! VA employees are likely to be riding high on a wave of security awareness arising from bad publicity about the incident, but putting them through some sort of training course once a year is more or less pointless. Imagine if roads only had one speed sign every 1,000 miles. Or if big trucks only beeped once when reversing. Or if Coca Cola put all its budget into one advertisement per year. It's crazy. The VA (or rather their US Government bosses) are missing a trick. Telling managers to 'remind staff' about their responsibilities to protect sensitive information, if that's all they said, is hardly proactive.

Elsewhere I've been reading that "Your common sense is the world's best firewall. Just make sure that you turn it on." Likewise, Ira Winkler is constantly telling us that common sense is not common: what people need is "common knowledge", i.e. the information to make sound judgments about information security.

Citigroup seems to 'get it', judging by their advertisement for a London-based information security engineer: "The candidate will be a good communicator, capable of producing viable solutions by distilling the assumptions and requirements of the customers. The candidate will be required to promote information security awareness through interaction with technology peers and customers."

Aug 1, 2007

An IT professional has been accused of hacking into a former employer's server to steal 4,000 confidential documents:

"A press note issued by S. Balu, Deputy Superintendent of Police, Cyber Crime Cell, said police had arrested M.S. Ramasamy, a 37-year-old software engineer from Avadi, on charges of hacking and stealing confidential and proprietary information from the server of Caterpillar, a US-based construction and mining company ... When contacted, Mr. Balu said the accused had gained access to the company’s server headquartered at Peoria in Illinois, US, using another employee’s user ID and password and downloaded over 4,000 confidential documents. A closed circuit camera had visuals of him accessing the server at the time when the files were downloaded. "