Welcome to NBlog, the NoticeBored blog

Apr 30, 2008

A 46-page academic paper by Richard Thompson Ainsworth of Boston University School of Law describes "zappers" - programs designed to divert some sales transactions from the normal sales processing and accounting systems. Fraudsters with sufficient access to an organization's sales systems (e.g. small business owners) sometimes use zappers either to misappropriate the entire income from the diverted sales (the sales never go through the books, so the whole value is stolen from the company) or to manipulate the recorded value (for example, to pocket the VAT/GST/sales tax component).

So-called "zap" and "super-zap" programs have existed for decades in the mainframe world. They allow intervention on databases, overriding normal access constraints to manipulate the data, and potentially the programs, directly. They are supposed to be used only under carefully controlled emergency conditions, for instance to modify or delete a rogue data record that is somehow blocking an entire batch from processing. Most competent sysprogs (systems programmers) and systems administrators have the knowledge and capability to run zap programs, and can potentially meddle with the systems in a virtually unstoppable and undetectable manner - if they are careful, anyway: well-written applications have built-in integrity checks and other controls that at least identify and flag direct interventions. Unfortunately, if the sysprogs also have the capability to suspend or edit the audit trails, or substitute hacked programs, or subvert the operating system calls, or ... or ... all bets are off. Remember this possibility if you ever hear a sysprog at a financial institution bragging about the speed of his new Ferrari.
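The record-level integrity checks mentioned above can be sketched in code. What follows is a minimal, hypothetical illustration - not any real product's mechanism: the application attaches a keyed hash (HMAC) to each record as it writes it, so a record edited directly with a zap tool, bypassing the application, fails verification at the next integrity check.

```python
# Hypothetical sketch of a per-record integrity control. The key name and
# record fields are invented for illustration.
import hmac, hashlib

KEY = b"application-held secret"  # ideally kept out of reach of sysprogs/DBAs

def seal(record: dict) -> dict:
    """Return a copy of the record with an application-computed MAC attached."""
    payload = repr(sorted(record.items())).encode()
    sealed = dict(record)
    sealed["mac"] = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return sealed

def verify(record: dict) -> bool:
    """Recompute the MAC over everything except the stored MAC and compare."""
    claimed = record.get("mac", "")
    payload = repr(sorted((k, v) for k, v in record.items() if k != "mac")).encode()
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claimed, expected)

row = seal({"id": 42, "amount": 99.95})
assert verify(row)        # untouched record verifies
row["amount"] = 0.01      # a direct 'zap' edit outside the application...
assert not verify(row)    # ...is flagged on the next integrity check
```

Of course, as noted above, this control only holds while the key and the checking code stay out of the meddler's reach: a sysprog who can read the key or patch the verifier defeats it.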

Going back to sales zappers, the paper points out differences in the ways such frauds are detected in the US and the EU. In the States, the evidence suggests that income tax investigations "often" (or rather, occasionally!) catch zapper users, while in the EU they are more likely to be caught by sales tax investigations. This raises an obvious question: why not do both? And while you're at it, why not take a close look at those "shrinkage" stock losses - the ones that conceal employee as well as customer thefts of goods?

Apr 29, 2008

Trust is an important concept in security but few awareness programs give it the coverage it deserves. This month’s NoticeBored module brings together trust, integrity, fraud in an IT context, and touches on closely related concepts such as honesty, governance and whistleblowing.

We’ve all had our share of disappointments and incidents in life due to misplaced trust in someone or something. Such painful experiences are all lessons from life’s School of Hard Knocks: with hindsight, we like to think, things would have been different. On the upside of risk, we are sometimes pleasantly surprised when people and systems deliver on their promises, or better still exceed our expectations. That is how trust is built up.

Trust comes in two flavors: blind faith means we ‘just trust’ something or someone with no rational basis beyond our belief system. In most cases, however, trust must be earned, in other words a level of trust is established gradually over a period of successful interaction and performance. By the same token, trust can be damaged or destroyed by negative events – when a person, organization or system “lets us down”, we are naturally more dubious about it the next time.

There can be immense personal satisfaction in being trusted and respected by someone else. Computer systems and other inanimate objects may not have feelings but those that prove their worth accrue value above those that are unreliable in practice. How would you feel about, say, a heart monitor that sporadically shut down or gave nonsensical readings? Do you dread getting into an elevator that sometimes jerks or stops between floors? That subconscious sense of unease tinged with fear is the result of not being able to trust something.

Technological controls alone are seldom adequate to reduce the risks, which puts the emphasis on human controls through training and education, policies and procedures, and various forms of management supervision (including, by the way, the IT audits we covered last month).

In relation to information specifically, trust brings up related subjects such as integrity and fraud. The NoticeBored awareness materials explore these concepts through presentations, briefing/discussion papers, case studies and more. We’re delivering a bundle of 30 different types of awareness material - too much for all but our largest customers to use in full, perhaps, but that’s not the intention. Customers are encouraged (through the ‘awareness activities’ paper provided) to review the materials and pick out the pieces that are most appropriate for them, given their circumstances and the maturity of their awareness programs.

May’s NoticeBored security awareness module is out now. If you're not already a NoticeBored customer, see what you're missing on the NoticeBored website.

Apr 26, 2008

"ISACA has tapped its global network of leading IT governance, control, security, and assurance experts to develop a widely embraced framework to help ensure the quality, consistency, and reliability of IT assessments. ITAF also contains a helpful set of good practice-setting guidelines and procedures."

The ITAF content is largely a repackaging of existing ISACA standards and guidelines in the areas of IT audit, assurance and governance. I'm pleased also to note that the ISO27k standards merit a mention.

Apr 25, 2008

City of London Police officers thinking of transferring information on USB memory sticks can self-assess the risks using a questionnaire. It's a simple idea really: an officer's answers to a few questions determine a 'risk score', leading either to approval of the intended use (or rather a requirement to seek approval from the relevant level of management authority, and/or to use USB sticks with additional security controls) or to disapproval.

As a self-assessment, the system depends on users answering honestly and accurately, so it is open to both deliberate abuse and inadvertent error. However, that risk is offset to some extent by the police's compliance procedures and structures. Furthermore, it's better than nothing: without the tool, officers presumably make such decisions on a more arbitrary basis, assuming they even consider the security risks. At the very least it raises security awareness (assuming it is suitably promoted, for instance by being embedded in standard operating procedures).

Automating USB risk assessment is interesting at another level too. The decision tree in this instance is relatively simple - much simpler than for many other information security risks, yet still complex enough to benefit from being presented as a structured questionnaire. The output is based simply on the net total of the scores from each question, and maps to just a few possible recommendations. Someone has had to write the questions and determine the score ranges and recommendations, somehow. The assessment could therefore give inappropriate advice in circumstances it does not anticipate (for example, if the officer has already lost numerous USB sticks).
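To make the mechanics concrete, here is a minimal sketch of such a scored questionnaire in Python. The questions, weights and thresholds are entirely invented for illustration - the actual content of the police tool isn't public here - but the structure (sum the weights of the 'yes' answers, then map the total into a recommendation band) matches the description above.

```python
# Hypothetical scored self-assessment questionnaire; all content is invented.
QUESTIONS = [
    ("Does the data include personal information?", 3),
    ("Will the stick leave police premises?", 2),
    ("Is the stick encrypted?", -2),  # a mitigating control lowers the score
    ("Is the data irreplaceable (no other copy exists)?", 1),
]

def assess(answers):
    """Sum the weights of 'yes' answers and map the total to a recommendation."""
    score = sum(weight for (_, weight), yes in zip(QUESTIONS, answers) if yes)
    if score <= 1:
        return score, "Approved: proceed, following standard handling procedures."
    elif score <= 4:
        return score, "Seek line-manager approval and use an encrypted stick."
    else:
        return score, "Not approved: escalate to the information security manager."

score, advice = assess([True, True, False, False])
print(score, advice)  # prints: 5 Not approved: escalate to the information security manager.
```

The hard-coded thresholds illustrate the blog's point: whoever sets the weights and bands has effectively pre-decided every possible outcome, for better or worse.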

Contrast this with, for example, the assessment of security risks relating to a software application. There are so many elements to the risk, and so many potential outputs, that it is infeasible to automate the assessment - or is it? Some sort of artificial intelligence/knowledge-based system is possible, and could arguably give better answers than either of the two usual alternatives: asking an information security person to assess the risks, or skipping the assessment altogether.

The Auditor General in Manitoba has released high level guidance on the role of audit committees, the need for 'legislative' (legal compliance) audits and more on their website. They also offer some basic advice on policy development.

PS Sorry for the long blogging pause - I've been at an ISO conference in Kyoto.

Apr 4, 2008

BT has admitted to secretly using spyware to monitor the web surfing habits of tens of thousands of its British broadband customers. According to BT, this was merely a technical trial. Allegedly no personal data were collected, since machines were identified "by anonymous code numbers" (presumably IP addresses - hardly anonymous) and content keywords were recorded rather than website addresses (so what? It's still unethical, and possibly illegal, interception in my book).

A supermarket security breach late last year/early this year compromised over 4 million credit/debit cards and led to thousands of fraudulent transactions. The breach has been blamed on malware on the store's servers. That the store systems were apparently PCI DSS compliant doesn't exactly inspire confidence in the system of independent security audits, but on the other hand it's a reminder that malware is an omnipresent threat.

NBlogger is ...

Dr Gary Hinson PhD MBA CISSP has an abiding interest in human factors - the ‘people side’ as opposed to the purely technical aspects of information security. Gary's career stretches back to the mid-1980s as both practitioner and manager in the fields of IT system and network administration, information security and IT auditing. He has worked and consulted in the pharmaceuticals/life sciences, utilities, IT, engineering, defense, financial services and government sectors, for organizations of all sizes. Since 2003, he has been creating security awareness materials for clients (www.NoticeBored.com) and supporting users of the ISO27k standards (www.ISO27001security.com). In conjunction with Krag Brotby, he wrote "PRAGMATIC security metrics" (www.SecurityMetametrics.com). He is a keen radio amateur, often calling but seldom heard by distant stations on the HF bands.