Friday, December 31, 2010

Simon Biles is a founder of Thinking Security Ltd., an Information Security and Risk Management consultancy firm based near Oxford in the UK.

The annual process of creating resolutions that we can break within a matter of minutes is a tradition for many of us – no new year would be complete without the heartfelt conviction that you will [delete as applicable] go to the gym / drink and/or smoke less / organise your garage etc., and that this will clearly make your life better, more complete, and make you “a better person”. I, for one, will be drinking less, going to the gym more and organising my garage – I know I will, because I’ve said I will on the first day of the New Year …

Aside from that though, here are some Information/Computer Security resolutions that you might like to give some thought to adding to your list. Like all good resolutions, these have value, but they are much more effective if you actually keep them up!

(1) Good for your security, and good for the environment – if you aren’t using the computer or router, switch it off or put it to sleep. It’s pretty challenging to break into a computer that is switched off; your carbon emissions go down and, more importantly if you are a climate change sceptic like me, so does your electricity bill. Unless your computer is performing an active task, why leave it on? Wake times from sleep are negligible on modern systems, and if you really can’t spare 30 seconds for a full boot – I think you might need to re-evaluate your life …

(2) We’re all guilty of this one, and I know many security professionals who admit the same: we reuse passwords. We have one or two _good_ passwords (complex, 8 to 10 characters etc.) that we use for everything, assuming that because the password is strong it protects us. The trouble is that not all websites are created equal – just because we trust Amazon doesn’t mean we should trust bargainbooksonline.cz, yet we do. True, some of us look for SSL certificates and the like, but to be honest, if the site then stores the password in plain text in a MySQL database that is accessible to the world and his dog, that makes no difference. As much as you can – don’t reuse passwords...
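The plain-text storage problem described above is entirely avoidable on the server side. A minimal sketch (Python; the function names are my own, purely illustrative) of salted password hashing, so that a breached database yields neither the password nor anything reusable on another site:

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Return (salt, digest); the plain-text password is never stored."""
    if salt is None:
        salt = os.urandom(16)  # a unique random salt per user defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Re-derive the digest with the stored salt and compare."""
    _, digest = hash_password(password, salt)
    return digest == stored_digest
```

Of course, even perfect hashing at one site doesn’t help you if a *different*, sloppier site stores your reused password in the clear – which is precisely the author’s point.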

Thursday, December 23, 2010

Craig Ball is a Texas lawyer who limits his practice to service as a court-appointed special master and consultant in computer forensics and electronic discovery.

An op-ed piece in the New York Times called Begging for Your Pay describes the humiliating ordeal of freelancers forced to hound clients to recover thousands of dollars of compensation. It brings to mind the occasional posts on computer forensics lists where our colleagues vent about unpaid invoices.

Sometimes, the unpaid forensic examiners wonder if they must throw good money after bad or whether they can refuse to appear at deposition or trial. Other times, the posts prompt discussion of recourse—lawyerly letters, withheld work product or lawsuits. More than anything, these list posters seek commiseration: reassurance that they are not the only knucklehead who let a deadbeat run up a big tab.

Sometimes, we are so anxious to get new business that we don’t protect ourselves against bad business. Bad business is worse than no business because bad business costs you money.

Any business that extends credit to its customers risks non-payment. We don’t think of computer forensics as a business that extends credit, but when you work without retainer or the retainer is used up, you’re financing your client’s investigation and must take steps to limit your credit risk.

In tough times, clients will impose on your goodwill to finance litigation. If you want to be their bank, be sure you’re adequately compensated and acting in compliance with credit regulations, then be prepared for default. A mechanic has a lien on the car being repaired, but you can’t sell client data to defray unpaid bills.

Here are some of the steps I take to ensure prompt payment and guard against default...

Dr Chris Hargreaves is a lecturer at the Centre for Forensic Computing at Cranfield University in Shrivenham, UK.

In August 2010 it was announced that Google Wave would not be continuing as a stand-alone product, having been available to the general public for just 2-3 months. In that time period, it is unlikely that much research had begun into the digital artefacts left by Google Wave. However, if an investigation of a machine from that time-period required an examination of a suspect’s use of Google Wave, such research would need to be retrospectively carried out. This article discusses the advantages and disadvantages of pre-emptive and reactive digital forensics research.

Predicting the future is quite hard. This can be evidenced by the many quotes that are used as examples of failed predictions. Unfortunately the provenance of some of these quotes is questionable, but many are attributable:

“This ‘telephone’ has too many shortcomings to be seriously considered as a means of communication” – Western Union (1878)
“Heavier-than-air flying machines are impossible” – Lord Kelvin (1895)
“A rocket will never be able to leave the Earth’s atmosphere” – New York Times (1936)
“I think there is a world market for about five computers” – Thomas J. Watson (1943)
“Computers in the future may weigh no more than 1.5 tons” – Popular Mechanics (1949)
“There is no reason anyone would want a computer in their home” – Ken Olson (1977)
“640K ought to be enough for anybody” – Bill Gates (1981)

Despite the difficulty in predicting the success or failure of a particular technology, this is precisely what is required in order to conduct pre-emptive research in digital forensics. In this article, pre-emptive research refers to any research conducted that is not in response to a current investigation and is conducted in order to acquire some knowledge in advance of encountering a particular technology in a real investigation. Reactive research is the opposite, and is research that is conducted during an investigation in response to encountering some artefacts left by a suspect’s use of a particular technology...

Please consider submitting similar sample documents for your own jurisdiction to act as examples for other investigators. Discussion of issues surrounding search warrants is encouraged and should be directed towards the forums.

Tuesday, November 30, 2010

Sam Raincock from SRC is an IT and telecommunications expert witness specialising in the evaluation of digital evidence. She also provides training and IT security consultancy.

and he did it via remote access...

When evaluating computer forensics cases the tricky part is often not just evaluating what is found but determining how it came to reside there.

“It was downloaded via a web browser because I identified it in Temporary Internet Files...”
“I reconstructed the webpage and the image was downloaded as part of the page presented as SR1...”
“There is also evidence in the Internet History to support the proposition that the image was downloaded as part of the webpage...”
“Access to this website occurred after use of the search term ‘Forensic Focus’...”

However, sometimes computer forensics isn’t just about what happened and proving intent, it’s also about proving whodunit and ensuring the correct person is prosecuted for the crime they committed.

In the simplest of scenarios, it may be that an organisation has a policy (or not, as the case may be) of sharing user accounts or that the computer is used in a location where multiple people have access to it. In these situations, it may be that the perpetrator alleges that someone else is responsible or that there is doubt about who is the culprit.

Beyond Reasonable Doubt?

If a case is not investigated fully, it could fall at the first hurdle no matter how strong the evidence of the crime is. Ultimately, in a Criminal Court in the UK, the Prosecution needs to prove the case against an accused beyond reasonable doubt. There are books written on the meaning of this phrase and suffice it to say I am neither qualified nor knowledgeable enough to comment on its full meaning. However, in essence, it is built upon the fundamental principles that a person is innocent until proven guilty and that a Judge/Jury/Magistrate must be sure that the person is guilty (and if not, they should return a verdict of not guilty). Hence, this may present a problem for prosecuting computer cases where it can be clearly shown other people were accessing the computer...

Monday, November 29, 2010

Lecturer in computer science and digital forensics, and owner of SecurityBible Networks, George is also a well respected member and contributor in the Forensic Focus forums, posting under the username DarkSYN.

If you decide to base your project on one of these suggestions please contact us so that we can discuss making your work available to other researchers and practitioners. By doing so you will be making an immediate and positive impact on the field of digital forensics.

Please note: Projects marked with an asterisk (*) have been suggested by practitioners who are willing to discuss the subject matter in further detail and provide a limited degree of guidance to those students who have already formally agreed the project with their own supervisor. Contact us for details.

New project suggestions are always welcome and should be submitted here together with a short description and an indication of what level of support/guidance, if any, you are happy to provide to students.

Wednesday, November 10, 2010

Forensic Focus: Stephen, can you tell us something about your background?

Stephen Mason

Stephen Mason: After leaving school in 1972 and spending six months at a bank in London, I joined the army (1973-1982). I served in what used to be known as the Royal Army Ordnance Corps as an Ammunition Technician. This work involved the inspection, repair and disposal of military ammunition, and included what is colloquially known as bomb disposal (this includes military bombs found from previous wars (known as explosive ordnance disposal ‘EOD’) and improvised explosive devices ‘IED’, commonly known as terrorist bombs).

I left the army to take a degree in 1982. My first degree is in history and educational philosophy, and I then took further qualifications to become a Barrister. I was called to the Bar in 1988.

Forensic Focus: How did you become involved in writing on electronic signatures?

Stephen Mason: In the autumn of 2002 I realised that few people knew anything about electronic signatures, so I sent in a book proposal to LexisNexis. It was duly accepted, and I wrote the text in the spring and summer of 2003. I had already written about the topic, and wanted to write a book that was useful to lawyers, ordinary users and technical people to illustrate the different types of electronic signatures that the law recognizes. This book covers over 100 jurisdictions with case law, and it is now in the second edition (Electronic Signatures in Law (2nd edn, Tottle Bloomsbury Professional Publishing, 2007)), and I am presently updating the text for a third edition in 2011.

Forensic Focus: You have been responsible for two books on electronic evidence, both of which are a first, and both are substantial texts. What made you do it?

Stephen Mason: Once the book on electronic signatures was published, my publisher wrote to ask me if there was sufficient material for a book on electronic evidence and electronic disclosure. I was convinced there was, although there was a gap between the initial e-mail (2004) and the first book being published (2007). Now in its second edition (Electronic Evidence (2nd edn, LexisNexis Butterworths, 2010)), I intend it to be a useful guide to lawyers and digital evidence specialists covering, as it does, 11 jurisdictions: Australia, Canada, England & Wales, Hong Kong, India, Ireland, New Zealand, Scotland, Singapore, South Africa and the United States of America.

The second book came about as a result of my work on the first book. I realised that the issue is global in nature, which is why I put together an additional 35 jurisdictions and edited the second book: International Electronic Evidence (British Institute of International and Comparative Law, 2008), covering: Argentina, Austria, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Egypt, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Italy, Japan, Latvia, Lithuania, Luxembourg, Malta, Mexico, Netherlands, Norway, Poland, Romania, Russia, Slovakia, Slovenia, Spain, Sweden, Switzerland, Thailand and Turkey.

Stephen Mason: Once I finished my book on electronic signatures, I realised that most legal journals would not really focus on the practical legal issues and case law that I expected to occur in these fields as the century progressed. This is why I began the journal. It has gone through three name changes, partly because of my attempt to get the title right, partly to ensure people understand what the journal covers. I include articles, legal developments and case reports from judges, lawyers, academics and digital evidence specialists. I aim to cover the industry in relation to digital evidence and electronic signatures from across the world. I also include reports on technical advances and book reviews. Additionally, I publish case reports and translations into English of cases relating to electronic evidence and electronic signatures from across the world...

Monday, November 01, 2010

Forensic Focus is looking for someone within the UK legal profession who might be interested in joining the current group of Forensic Focus columnists by writing a monthly column on computer forensics/computer crime issues.

This is not a paid position but would be useful for anyone wishing to raise their profile within the computer forensics community and present the perspective of UK legal professionals involved in this field. If you're interested, or would like to recommend someone who might be, please contact admin@forensicfocus.com

Thursday, October 28, 2010

I am really pleased to say that I completed the Mizuno Amsterdam Half Marathon on Sunday 17th October 2010 in aid of the Cystinosis Foundation UK.

Please may I thank everyone on Forensic Focus that sponsored and supported me. My family and I are really touched by everyone's generosity and we are going to raise close to 3000 Pounds Sterling from the event. Every penny raised will go to researching improvements to drugs and ultimately a cure.

I am a trustee of the foundation and my 9 year old daughter has cystinosis, which is a chronic genetic metabolic disease. It is very rare, with only around 2000 patients in the Western world. The race was particularly challenging for me as exactly 6 weeks prior to it I competed in the Lichfield 10k event to warm up for the main run. As it turned out, in the 10k run I managed to tear the meniscus in my left knee. This meant that I was unable to train or exercise further for the half marathon and had to take a chance and just go for it on the day.

I flew to Amsterdam on the 16th and stayed near to the Olympic Stadium where the event was due to finish. On the day all went OK. Things were going well for me up until 15km, when my legs didn't want to work as this point was further than I had ever run before. I pushed on though and every step was very painful but I reached the stadium, where after half a lap I crossed the finishing line with massive relief and a sense of great achievement!

The Dutch crowds and bands along the route were an enormous help and very motivating. I finished in 2 hours and 31 minutes which I was quite happy with. The run has really helped raise awareness of cystinosis and people have been so generous.

Thursday, October 21, 2010

Dr Chris Hargreaves is a lecturer at the Centre for Forensic Computing at Cranfield University in Shrivenham, UK.

This month's article is based very loosely around a recent 5-minute talk from Gary Wolf (link here) which explores the concept of ‘self-tracking’ (the trend for people to record aspects of their life) and how this can now be performed to a much greater extent than was previously possible due to changes in technology. The talk discusses the monitoring of heart rates, sleep patterns, consumption of caffeine, food and alcohol etc. While many of these could be recorded simply with a pen and paper, the talk also introduces a variety of new digital devices that automate the collection, recording and in some cases transmission of this ‘self-tracking’ data. This article ponders the implications of such devices for digital forensics.

Several technologies are mentioned in the referenced TED talk, including general-purpose technologies such as Twitter and iPhones that can be used for ‘self-tracking’ of diet or exercise, but it also discusses dedicated devices. These include technologies such as Nike+ (tracking distances and times), Fitbit (fitness and sleep monitoring), Polar WearLink+ (heart rate) and the Zeo Sleep Tracker (sleep monitoring). Beyond those covered in the talk, technologies already in common use that record information about our lives include games consoles such as the Nintendo Wii (amount of time spent playing a particular game or using other features such as the web browser) and GPS devices (locations visited). There are also other upcoming technologies, for example those which capture and record the total electrical power consumption of your home.

It does not require too much imagination to foresee how data from such devices could be potentially useful (particularly as evidence related to alibis, for example). Really, any additional source of potential digital evidence should be welcomed, and this is particularly true for devices that are difficult to tamper with (there is not yet an evidence eliminator for electricity usage monitors as far as I am aware). There is also an additional benefit from using digital evidence in this way – rather than relying on digital evidence from a single PC or device, multiple, independent devices can be examined for evidence that supports (or refutes) the current working hypothesis of what events occurred. More data sources can only increase the accuracy of any inferences drawn from the evidence...

Wednesday, October 20, 2010

Sam Raincock from SRC is an IT and telecommunications expert witness specialising in the evaluation of digital evidence. She also provides training and IT security consultancy.

In digital forensics we are often asked to determine the presence of evidence. However, what happens when we do not find anything? How do we prove something wasn’t there?

Proving something is present is generally a trivial problem – you find it, it’s there. Of course the complex part is explaining how it came to reside on a digital device and the circumstances surrounding it….that’s what the field of digital forensics is all about. However, proving something isn’t there and/or was never there are also questions we are asked to comment on. Take the following for example:

· Examine this laptop and establish if it has accessed the website http://www.forensicfocus.com.

· Examine this mobile telephone and determine if it sent a text message with the content “Forensic Focus”.

Let’s look at the first example. In the event there is “no evidence of access to http://www.forensicfocus.com found”, what remains is proving (or commenting on) a negative. However, just because you do not find any evidence of connections to the site, does this imply no connections ever occurred?

There are three main possibilities to consider. Firstly, the techniques used in your examination did not facilitate finding the evidence even though it is present. For example, if we simplistically relate this to an examination where only the live Internet history is examined initially, it is possible that a subsequent examination could determine some deleted Internet history and further evidence may be established.

Secondly, you did find the evidence but were unable to determine how to interpret it so you didn’t establish its meaning. For example, you found a partial registry file in deleted space but did not have the knowledge to interpret it and extract the evidence.

Thirdly, there is no evidence on the device of any connections occurring to http://www.forensicfocus.com. So no connection ever occurred?

Even given the last situation, with a computer, often the absence of any evidence is not evidence that it was never present. This is due to the fact that on a computer, data can be deleted and overwritten. Hence, it is possible that an event occurred but evidence of it is no longer available...

Friday, October 15, 2010

David Sullivan has over 15 years recruitment experience and has spent the last 6 years running his own computer forensics recruitment consultancy, Appointments-UK

We all over-complicate things and this is certainly true when seeking a new job. Essentially, to be successful at a Computer Forensics interview you just need to demonstrate two things:

1. You have the technical skills needed to perform to a high standard;

2. You are a likeable person. This is described in numerous ways such as interpersonal skills, company fit etc, etc, but when it comes down to it I would argue strongly that essentially it comes down to whether the interviewer likes you. This is especially important in CF where you are likely to be working long hours, maybe in a hostile environment and often in stressful situations where personality clashes can cause real problems.

In this article we are going to focus on the second point - making sure we are as likeable as possible as, after all, if two people have very similar technical skills guess who gets the job? Think about it like this - when you have contacted a company or a recruiter, or when you have sat in an interview, how much have you thought about helping the potential employer to actually like you?

Who is David Herron?

This whole process starts way before you get to the interview room which I will demonstrate with the example of a CV I received a couple of months ago with the following cover note:

‘I have just finish my degree in BSc (Hons) Forensic Computing with Third Class Honours awarded and I am seeking employment. I heard of your agency when one of your reps who I think was called David Herron or David Sullivan came into our university 2 years ago to give a talk on your agency.’

Who is David Herron?!! I thought he was a line-backer at Kansas – I am David Sullivan. Agency?! We aren’t an agency, we are a Professional Search firm! Although my initial reaction was to laugh out loud that somebody had taken so little care in their cover note my next thought was that I was not going to make any effort at all to help this person. Maybe I just have issues about needing to be loved due to being ignored by my parents when I was five, but I bet that you too can remember a time when you bristled due to somebody having made no effort to know anything about you before they made contact.

On the other hand I do occasionally (very occasionally I should add) receive an email from a prospective jobseeker saying how much they have enjoyed my articles. OK, so having read my articles we both know that is unlikely to be strictly true but it doesn’t really matter – straight away I am keen to help this person purely as they have made me feel good about myself. Even if I can’t help them I am happy to spare the time to talk about the market and help them improve their CV – it is just human nature...

Thursday, October 14, 2010

Simon Biles is a founder of Thinking Security Ltd., an Information Security and Risk Management consultancy firm based near Oxford in the UK.

“You have to know the past to understand the present” – Dr. Carl Sagan

If you have been kind enough to read some of the other articles that I’ve written here on Forensic Focus, you may have noticed that I have a bit of a penchant for historical references (and quotes, and clichés, but for now – please focus on the references!) – something that some of my History teachers might be astonished by, given how long they spent trying to get me to learn who killed whom in 1066, though possibly nothing compared to the amazement from my English teachers that I’m writing anything at all. But we’ll move on from that swiftly. We are operating in a field that has only been around for, by all accounts (OK, let’s leave Babbage out of it), not even a century, yet we seem to have run out of innovation. It’s a bit embarrassing actually – we cover it up nicely by making things a bit smaller, or a bit shinier – but really we’re all aware of the fact that, nice as these superficial improvements are, we’re no closer to innovation than a fresh coat of paint on a room is to a Van Gogh.

I’ve known this for a while – not that it stops me from wanting shiny things – but it really came to my attention with “cloud computing”. I don’t know how many of you are aware (or for that matter how many of you would care, really, when it comes down to it) but the British Government has, in its published ICT Strategy (PDF here) proposed the “g-cloud”. This was created by our previous, Labour, government and published January this year, but it doesn’t seem to have gone away under our current, ConDem (I _love_ that abbreviation), rule. I don’t know who’s to blame for the daft name, or for the fact that, whilst “g-cloud” is number 2 in the strategy “Information Security” is number 10 – but nonetheless we have it, and so, as a fully paid up consultant, I was trying to figure out what is required to jump on the bandwagon and charge good money to secure “clouds”.

Fortunately, what I discovered was that I’d already been securing “clouds” for the last 10 years, and, as I pointed out earlier – there is nothing new, just a nice new shiny name, and some (ranging in quality) pretty web interfaces. Now, as a bit of a UNIX head and command line aficionado, the latter is of no great interest to me, so I’m left with a new name …

Friday, September 24, 2010

Dr Chris Hargreaves is a lecturer at the Centre for Forensic Computing at Cranfield University in Shrivenham, UK.

This month I wanted to discuss programming, specifically whether learning a programming language is useful for a digital forensic practitioner. I have been unable to find any surveys or polls capturing the proportion of practitioners who can program, or what the language of choice is for those that do. However, anecdotally, my personal experience has left me surprised by the low proportion of practitioners who have programming experience. By ‘programming’ I am not suggesting that all practitioners should be re-implementing Encase, FTK etc. but that when appropriate, being able to write short simple scripts could be useful in a digital forensics context.

One motivation for being able to write bespoke code for digital forensics is the ability to extract data from binary file formats. It is fairly uncontroversial to say that digital forensics is a fast moving field. New versions of operating systems, new applications and new uses of existing applications emerge frequently, and this results in new digital objects that need to be understood. The digital forensics research community can identify relevant digital evidence using a variety of reverse engineering techniques. However, the output of this research may only result in a schema describing the patterns in the raw data and the rules of how they should be interpreted. If an extraction tool was not developed as part of the research, in order to make use of cutting edge advancements there are several options: manually extract information using the published schema; wait for it to be implemented as a feature in a mainstream forensic package; or develop custom code that extracts information according to the identified data structures. Manual extraction does not scale well for large volumes of data, and waiting for implementation in a commercial package introduces an element of uncertainty about when results could be available. Therefore, the ability to write simple, custom code means that data structures can be interpreted that would otherwise be ignored and more digital evidence can be recovered...
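As an illustration of the kind of “short simple script” meant here, the sketch below walks a binary blob and unpacks fixed-layout records. The record layout is entirely hypothetical – it stands in for whatever published schema the research produced – but the pattern (offset tracking plus `struct.unpack_from`) is the core of most such extraction scripts:

```python
import struct
from datetime import datetime, timezone

def parse_records(data):
    """Parse a hypothetical schema: each record is a 4-byte little-endian
    Unix timestamp, a 2-byte length, then that many bytes of UTF-8 text."""
    records = []
    offset = 0
    while offset + 6 <= len(data):
        # "<IH" = little-endian unsigned int (timestamp) + unsigned short (length)
        ts, length = struct.unpack_from("<IH", data, offset)
        offset += 6
        text = data[offset:offset + length].decode("utf-8", errors="replace")
        offset += length
        records.append((datetime.fromtimestamp(ts, tz=timezone.utc), text))
    return records
```

A real script would add sanity checks (magic numbers, plausible timestamp ranges) before trusting any field, since carved data is frequently partial or corrupt.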

Thursday, September 23, 2010

Sam Raincock is an IT and telecommunications expert witness specialising in the evaluation of digital evidence. She also provides training and IT security consultancy.

Throughout the world, there has been a recent surge in students studying computer forensics. Some courses encourage placement years or work experience to allow students to expand on their academic knowledge and obtain some practical experience.

Times haven’t changed. I remember as an undergraduate I was delighted to work in IT placements for an investment bank. However, as I look back now, what do internships/placements really provide the company employing you? Are the projects you work on as a student worth any money?

Now that I reflect on my own placements – what did they really get for their money? I believe the main gain was a 6 month interview process with no commitment to hire me at the end. That’s a good deal for a company looking for the very best candidates and with the cash flow to find them.

What did I get from them? Well I received a salary and at the time that’s what I mainly valued. However, now when I consider what I really obtained, I realise I actually gained something far more valuable and costly to them – I received the time of some of their most talented staff.

In the world of forensics, it’s quite a grim situation for students looking to gain work experience or placements, with few places available and high competition. Unlike computing placements, where a company may give a student a coding project or assign them to IT support, the forensic world is all about looking at case evidence, most of which involves legal or confidential matters. This may raise issues around justifying the skillset and experience of the person conducting the work. In some situations, a firm may simply not be able to justify a student working on a case. Hence, a lot of forensic firms do not hire placement students...

Wednesday, September 22, 2010

Simon Biles is a founder of Thinking Security Ltd., an Information Security and Risk Management consultancy firm based near Oxford in the UK.

So what is "Information Security" anyway? The traditional model taught to all InfoSec newbies is based around the “CIA Triad” – this isn’t some weird American-Chinese governmental underground society – rather it is the “holy trinity” of Confidentiality, Integrity and Availability that is used to define security. It’s been around for over 20 years and, dig as I might, I couldn’t find the original source (if anyone knows – please tell me!). It hasn’t stood unchallenged – more on that later – but certainly it is still in daily use, and if your InfoSec professional doesn’t know what it stands for, it’s time to get a new professional! In any case, it isn’t a bad place to start, so here are the component parts for you:

Confidentiality – this relates to the secrecy of the information in question. Confidentiality comes in many and varied shades – from things that you actively want everybody to know all the way through to things that you want nobody to know. These levels of secrecy relate to the “protective marking” of documents in Government departments – we are all familiar with the concept of “Top Secret”; the full set is as follows: “NPM” (Not Protectively Marked – i.e. anyone can know), “Protect”, “Restricted”, “Confidential”, “Secret” and “Top Secret”. They are listed in a document called the Security Policy Framework (http://www.cabinetoffice.gov.uk/media/111428/spf.pdf) which is publicly available. Figuring out the required level of confidentiality for a given item of information is important – the higher the required confidentiality, the more expensive and difficult the process of securing it from others becomes – thus you only want to apply controls where necessary, rather than spending a fortune protecting something that is either of no consequence or that everyone already knows!

Integrity – this relates to the “quality” of the information. Is it the same as when it was entered? Has it been corrupted? Such a corruption could be accidental or deliberate, but the effect, in either case, is that the information can no longer be used or trusted. It could be as simple as a wrong digit in a phone number, or as complex as accounting fraud, but both are compromises of integrity. Again, the effort made and cost expended in maintaining integrity should be proportional to the value and type of the information – a one-bit error in a JPEG library (which uses lossy compression anyway) may go completely unnoticed; a one-bit error in a bank account balance probably won’t...
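In practice, integrity is commonly verified with a cryptographic hash: even a one-bit change produces a completely different digest. A minimal sketch using Python's standard hashlib (the record contents here are purely illustrative):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

record = b"Account balance: 1000.00"
baseline = sha256_digest(record)

# Flip a single bit in the record: the digest no longer matches,
# so the corruption is detected even though only one bit changed.
tampered = bytearray(record)
tampered[0] ^= 0x01
assert sha256_digest(bytes(tampered)) != baseline
```

This is why evidence files are hashed at acquisition time: a matching digest later demonstrates the data is bit-for-bit unchanged.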

Wednesday, September 08, 2010

David Benford from Blackstage Forensics is raising money for the Cystinosis Foundation by running in the Amsterdam Mizuno half marathon on the 17th of October. David's daughter suffers from Cystinosis, a very rare and serious condition which is as yet incurable, and he is running to help fund research into improvements in medication and ultimately a cure.

Forensic Focus is supporting David and would like to encourage all our members to make a donation, no matter what size, via David's own page at http://www.justgiving.com/david-benford. David is currently 27% of the way towards his target and urgently needs your support in the final weeks before the race next month.

Please give what you can and, on behalf of David, many, many thanks in advance.

Friday, September 03, 2010

While there is still a need for handwriting analysis experts, modern document authentication now takes place primarily in the digital domain. Frequently a document such as a contract or letter of intent comes into question during litigation, and we are asked to verify whether it is authentic or fraudulent.

One of the first things computer forensic experts check during a document evaluation is metadata. Files such as Microsoft Word documents can contain hidden information known as metadata. Metadata is “data about the data.” If we were to use an analogy, if you were to investigate a homicide in which a gun was used, the metadata would be everything about the gun, including fingerprints on the handle and trigger, the type of bullet fired, the time and date it was fired, and the number of times it was fired. The metadata embedded in a Microsoft Word document might reveal: the creator name, company name, when the file was created, where the file was saved, total editing time and potentially much more. This list is not exhaustive; it just offers a peek at what most document metadata contains. Any of these elements can be used to show whether or not a document is authentic.
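Some of this embedded metadata is easy to demonstrate: a Word document in the newer .docx format is just a ZIP archive, and the core properties live in docProps/core.xml. A minimal sketch using only the Python standard library (field names follow the OOXML core-properties schema; error handling is omitted):

```python
import zipfile
import xml.etree.ElementTree as ET

# Namespaces used inside docProps/core.xml (OOXML core properties
# plus the Dublin Core elements it borrows).
NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
    "dcterms": "http://purl.org/dc/terms/",
}

def read_core_properties(docx_path: str) -> dict:
    """Pull creator/created/modified metadata from a .docx file."""
    with zipfile.ZipFile(docx_path) as z:
        root = ET.fromstring(z.read("docProps/core.xml"))
    return {
        "creator": root.findtext("dc:creator", default="", namespaces=NS),
        "created": root.findtext("dcterms:created", default="", namespaces=NS),
        "modified": root.findtext("dcterms:modified", default="", namespaces=NS),
    }
```

Note this reads only the document's own embedded record, entirely independent of the Windows filesystem timestamps, which is exactly why the two can be cross-checked against each other.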

Unexpected Metadata Revelations

If someone is surreptitiously trying to backdate a contract created in Microsoft Word, one thing they might do is set the clock back and then save the document with an earlier date. A casual look at the computer might show that Windows records the document as created or modified on the earlier date. However, a deeper inspection of the document itself might reveal that the metadata embedded in the document is inconsistent with the Windows time/date stamps. For example, Windows might show a Last Modified date of Jan. 23, 2005 while the metadata embedded in the document itself might show a much later date and even a different author.

The document metadata can also reveal the total document editing time. When a document is intentionally backdated by setting the clock back and then resaving the document, the total editing time indicated can be unrealistically high, sometimes showing that the document was edited for years. Since typical document editing time is measured in hours or days, when we see a document that has been edited for years we become understandably suspicious.
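The cross-checks described above reduce to a simple consistency test. The sketch below is illustrative only – the parameter names and the 30-day editing-time threshold are assumptions for the example, not fixed forensic rules:

```python
from datetime import datetime, timedelta

def backdating_red_flags(fs_modified: datetime,
                         meta_modified: datetime,
                         total_edit_time: timedelta,
                         tolerance: timedelta = timedelta(minutes=5)) -> list:
    """Return red flags suggesting a backdated document.

    fs_modified     -- the Windows (filesystem) Last Modified timestamp
    meta_modified   -- the 'modified' date embedded in the document metadata
    total_edit_time -- the embedded total editing time
    """
    flags = []
    # A genuine save leaves the two modified dates in close agreement.
    if meta_modified - fs_modified > tolerance:
        flags.append("embedded modified date is later than the filesystem date")
    # Editing time measured in years suggests the clock was set back.
    if total_edit_time > timedelta(days=30):
        flags.append("total editing time is implausibly long")
    return flags
```

Neither flag is proof on its own; they simply mark documents that deserve the deeper inspection described above.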

Metadata used in conjunction with other elements of computer forensics such as internet activity, examination of emails and Windows time/date stamps can be used to determine if a document is the real deal or a forgery.

Is The Document Worth The Paper It’s Printed On?

Recently we have looked at a number of agreements and letters of intent that were provided to us on paper. When the authenticity of a document is questioned, the electronic version is somehow almost always difficult to obtain. However, in those cases where we are able to examine the electronic version, a very different story often emerges, illuminated by the bright light of metadata.

Friday, August 20, 2010

In a recent intellectual property case for which we were retained, among the electronically stored information (ESI) that the plaintiff sought for production were internal company blogs and wikis used by the defendant’s developers to discuss new product ideas, as well as the design and coding of the alleged offending application. Included in the discovery were sites created using Microsoft® SharePoint® and MediaWiki software (and others). The discovery order was crafted with the typical “readily accessible” and “native format” language that seems totally irrelevant to sites which maintain dynamic content.

Due to the nature of the business, none of the sites for which production was requested was required to be managed in accordance with standards for business compliance such as Sarbanes-Oxley or the European Union Data Protection Directive. All were informal sites created by the development team to support collaboration with other team members. It is arguable whether there was any affirmative “duty to preserve” since it appeared that the developers were totally unaware of any intellectual property concerns related to their work.

Thus, the issues that arose during production were two-fold: What constituted “readily accessible” in sites in which the content is frequently changing and for which point-in-time recovery (PiTR) solutions do not exist? The producing party’s view was that snapshots of the current site with resolution and recursion on internal links to one level of depth was sufficient, but how to produce those snapshots in a form which was reasonably complete but did not constitute a hardship for the producing party? Initial attempts using various web crawlers were abandoned after the output far exceeded the volume of space actually occupied by the site itself! And given that the site content is, at least in part, database driven, what is the impact of continued site use, after the alleged point of infringement, on the database contents?

As for “native format”, how does one handle those sites which convert uploaded content from one form to another using processes which are undocumented and proprietary? Even if the conversion process is well documented, what assurances exist that metadata will be preserved? Many Content Management Systems support import/export programs which convert documents from their native format to a format more easily viewed from the Web (e.g. PDF or HTML). In many cases, valuable metadata is removed by the conversion process...

Wednesday, August 18, 2010

In this month's installment, I will take a break from a specific problem and talk about a fundamental issue with deep forensics: Scalability.

Scalability is simply the ability of our forensic tools and processes to perform on larger data sets. We have all witnessed the power of Moore's law. Hard drives are getting bigger and bigger – a 2 TB SATA hard drive can be had for well under $100. With massive storage space being the norm, operating systems and software are leveraging it more and more. For instance, my installation of Windows 7 with Office is ~50 GB. Browsers cache more data and many temporary files are created. Since Windows Vista introduced the TxF layer for NTFS, transactional file systems are now the norm, and the operating system keeps restore points, Volume Shadow Copies and previous versions. Furthermore, a lot of old, deleted file data no longer gets overwritten.

This "wastefulness" is a boon to forensic investigators. Many more operating and file system artifacts are being created. Data is spread out across L1, L2 and L3 caches, RAM, Flash storage, SSDs and hard drive caches. For instance, the thumbnail cache now stores data from many volumes, and Windows Search happily indexes a lot of user data, creating artifacts and allowing analysis of its data files.

That was the good news. The bad news is that most of this data is in more complex, new and evolving formats, requiring more developer effort to stay current. For instance, I am not aware of any forensic tool that analyzes Windows Search databases – not that I have had time to look (if you know of such a tool, please post in the forum topic – see below). Worse than that is the need to thoroughly analyze the data. Traditionally, the first step is to acquire the data to an evidence file (or a set thereof). The data must be read, hashed, compressed and possibly encrypted. All this takes time, despite new multi-threaded and pipelined acquisition engines appearing (for instance in EnCase V6.16). High speed hardware solutions are also more prevalent. Luckily, this step is linear in time, meaning that acquiring a full 2 TB hard drive will take twice as long as a full 1 TB drive. Note that unwritten areas of hard drives are usually filled with the same byte pattern (generally 00 or FF) and these areas compress highly, yielding faster acquisition rates.
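That linear relationship makes acquisition time easy to estimate with back-of-the-envelope arithmetic. The sustained rate below is an assumed figure for illustration; real rates vary with hardware, interface and how well the data compresses:

```python
def acquisition_hours(drive_bytes: int, mb_per_second: float) -> float:
    """Estimate imaging time at a constant sustained rate (linear in size)."""
    return drive_bytes / (mb_per_second * 1024 * 1024) / 3600

# Doubling the drive size doubles the estimate: a 2 TB drive at an assumed
# sustained 100 MB/s takes twice as long as a 1 TB drive at the same rate.
one_tb = 1024 ** 4
estimate_1tb = acquisition_hours(one_tb, 100)      # ~2.9 hours
estimate_2tb = acquisition_hours(2 * one_tb, 100)  # ~5.8 hours
```

The compressible unwritten areas mentioned above effectively raise the sustained rate, which is why real acquisitions of sparsely used drives often beat such estimates.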

Simon Biles is a founder of Thinking Security Ltd., an Information Security and Risk Management consultancy firm based near Oxford in the UK.

Calling something a “Holy Grail” is an interesting choice of term – the intended meaning is well known to most of us, i.e. something miraculous that will solve all of your problems. Given that it’s supposedly a cup, bowl or dish, however, it hardly links sensibly to password management. Nonetheless, Single Sign-On (henceforth SSO, to save me from RSI) is supposedly the “Holy Grail” of Authentication.

SSO is the answer to the dilemma that we were left with at the end of the last article – we want complex passwords, difficult to break, that change often, on all the systems that a user has access to … A rather entertaining (if a little dated now) paper from Microsoft tells us that each user has 25 accounts that require passwords, and types, on average, 8 passwords a day – and this is a paper about web-browsing habits, not including primary logons to machines or other legacy work systems. What is more interesting is that each user, on average, has 4.5 passwords, each used on 3.9 websites (I love averages – how else can you have ½ a password and .9 of a website!). Looking through some other literature suggests that some people manage way, way more than this – one user in particular dealing with 97 separate and distinct password-protected systems.

This was recognized as an issue a long time ago, well before we needed to remember our eBay, Amazon and Twitter passwords. Project Athena at the Massachusetts Institute of Technology (MIT) started in 1983 and developed the Kerberos SSO protocol. Kerberos, named after the three-headed dog of Greek mythology (Harry Potter fans please note – it was called Kerberos before “Fluffy”), operates on the principle of an authentication server that you authenticate to once with your password; assuming you get that right, it grants you a “ticket” that you can take to any other service, which identifies you and confirms your authentication. The good news is that all of this happens behind the scenes and all you have to do is remember one password. To be fair, it’s a little bit more complicated than that, with some quite fun encryption ideas regarding authentication of source and time stamps to prevent replay attacks. However, it is, in my opinion at least, the daddy of all SSO – so much so that Microsoft Active Directory authentication is, at least almost, Kerberos. (If you want to know a bit more, you can read this, but I can’t claim that you’ll be awake at the end of it.) It can be quite enlightening, if you like that sort of thing (and I do), to watch a Wireshark trace of a Kerberos exchange – Wireshark has quite a good built-in understanding of the protocol and you can see the various tickets moving around the system. You can also have a go at recording and replaying them to see if the time stamps really do work … Kerberos also has an interesting sideline in identifying machines to other machines, effectively allowing SSO between clients and servers as well as wetware users...
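The ticket idea can be illustrated with a grossly simplified sketch – real Kerberos uses symmetric encryption, a ticket-granting service and mutual authentication, none of which appear here. This toy version shows only the two properties discussed above: a service trusts a ticket it can verify against a shared key, and timestamps limit replay (all names, values and the key are invented for illustration):

```python
import hashlib
import hmac
import json

# Key shared between the authentication server and the service.
# In real Kerberos this role is played by the service's long-term key.
SERVICE_KEY = b"illustrative-service-key"

def issue_ticket(user: str, now: float) -> dict:
    """Authentication server: sign a claim of identity plus a timestamp."""
    payload = json.dumps({"user": user, "issued": now}).encode()
    mac = hmac.new(SERVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload, "mac": mac}

def accept_ticket(ticket: dict, now: float, max_age: float = 300.0):
    """Service: verify the MAC, then reject stale tickets to block replays."""
    expected = hmac.new(SERVICE_KEY, ticket["payload"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, ticket["mac"]):
        return None  # forged or corrupted ticket
    claim = json.loads(ticket["payload"])
    if now - claim["issued"] > max_age:
        return None  # too old: a recorded ticket replayed later fails
    return claim["user"]
```

Replaying a captured ticket, as suggested with the Wireshark exercise above, fails here for the same reason it fails in the real protocol: the timestamp inside the signed payload has gone stale.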

Tuesday, August 10, 2010

Simon Biles is a founder of Thinking Security Ltd., an Information Security and Risk Management consultancy firm based near Oxford in the UK.

Authentication and Authorisation (please note the “s” is _not_ a spelling error!) are fundamental to information security – identifying who a user is (authentication) and what they are allowed (authorised) to do lets us restrict access to data in such a way that only the rightful, permitted people can access, modify or copy it. In the current day and age we have a habit of lumping the two together under the term “Identity and Access Management”, but personally I think it is wise to remember that they are separate and distinct processes, handled at different times and by different parts of the computer that you are using.

Let’s start off with Authentication – the “prove who you are” part. Authentication can be performed in several ways – typically described as: something you know, something you have or something you are. Each of these is called a “factor” and, logically, combine two or more of them and you have “multi-factor” authentication. The password is an example of the first of these factor types, although there are other things, such as the questions you answer for your password reset (the name of your first pet goldfish, your favourite teacher at school, how many warts you have between your toes, that kind of thing). The things that you have are items like smartcards or dongles, whereas the things that you are include all of the biometric measurements – fingerprints, voice recognition and the like. Each has its inherent issues – people forget things, lose things, and, rather frighteningly, people can have bits of them taken – numerous examples abound in film: “Angels and Demons” springs to mind as the most recent, but I recall “Thunderball” also makes use of the concept.

However, by far the most common, cheapest and most familiar form of authentication is the password. Passwords are now ubiquitous – if you have a computer and an internet connection, you have a password. Most operating systems implement them in some form by default, usually for elevation of privileges to an administrative account – those of us who have been using UNIX or its derivatives for the last few decades (or indeed some earlier multi-user operating systems) can all have a good laugh now and congratulate Microsoft and Apple for having caught up. They’ve been around a lot longer than that, though: a little research turned up written references to the use of passwords as far back as the Roman legions, and I’ve every confidence that they were in use well before that in some form or another. The Roman implementation of passwords was exemplary, though, and, quite frankly, something that many modern password implementations could learn from – controlled distribution, frequent changes and traceability of distribution. If you are interested there is more to be found here (wordinfo.info).

The commonality of passwords has created a unique problem (and, from a forensic viewpoint, an opportunity): people have terrible memories for things, especially where there is little context to give them a clue, so they tend to reuse the same password many times. Where, as professionals, we can enforce certain technical restrictions on the user (password length and complexity requirements, change durations, non-repeatability etc.), we quickly find that the user subverts the process one way or another – the most ubiquitous method being the dreaded Post-it note! Consider the “professional” intelligence agents operating until recently in the US: a 27-character password committed to memory could have been a significant issue for the forensic officers; however, having it written down alongside the computer … well, need I say more?
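The technical restrictions mentioned – length, complexity and non-repeatability – are straightforward to express in code. A minimal sketch; the specific thresholds here are illustrative, not a recommended policy:

```python
import re

def meets_policy(password: str, history: list) -> bool:
    """Illustrative password policy: minimum length, at least three
    character classes, and no reuse of a previous password."""
    if len(password) < 10:
        return False
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    if sum(bool(re.search(c, password)) for c in classes) < 3:
        return False
    return password not in history  # non-repeatability check
```

Of course, as the paragraph above notes, no amount of enforcement in code stops the policy being subverted with a Post-it note.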

To enter you MUST be a current UK student studying for a BSc, MSc or PhD in a computer science, information security, engineering or computer forensics discipline. You must not be in full or part time employment in a professional role or have obtained a job offer in such a role.

Competition Details

As a telecommunications and computing forensic specialist, one of the biggest challenges is to learn how to write simply and succinctly so your reports/articles can be understood by all.

With this in mind, to enter you need to prepare an article, no more than 1,000 words, discussing one of the following topic areas:

You may provide a summary of the topic area or select a sub-topic on which to concentrate your article. The article should be in a language and style suitable for a non-technical layperson. Your chosen topic should not discuss software products and should be entirely your own work. Articles containing information copied from other sources and not appropriately cited will be instantly disqualified.

The article will be assessed by Samantha Raincock for technical accuracy, readability, structure and the general chosen topic area. One overall winner will be selected and the author will receive the training prize as well as their article being published on Sam Raincock Consultancy’s website. All decisions are final.

Closing date for submission is by 12:00 (noon) BST on 16th August 2010. The article should be emailed to sam@raincock.co.uk in Word format.

* One free training place is available. Recent graduates (2010) without a job offer may apply. The prize will include two days of training and all training materials. It does not include accommodation, subsistence or travel costs. The prize is non-transferable and is only valid for the investigating connection records training course on 23rd – 24th August 2010 in Birmingham. There is no alternative prize. You will need to provide proof of your student status and your identity. In the unlikely event the training course is cancelled the prize will become null and void and SRC will not be responsible for any costs incurred.

Wednesday, June 30, 2010

This month sees the launch of the Forensic Focus Preferred Training program which aims to highlight the very best digital forensics training available.

The first course offered is "Mobile Forensics: Investigating Connection Records" delivered in the UK by Sam Raincock of SRC. This is a 2 day course providing in-depth analysis of connection records and a comprehensive overview of cell site analysis techniques. A 10% discount is available for a limited period.

While some may curse Windows Vista for all its changes, for us forensic investigators it also introduced interesting new 'features'. One is the integration of Windows (Desktop) Search into the operating system. Most corporations have been reluctant to adopt Vista; however, more and more Windows XP systems are being replaced by Windows 7 equivalents. Windows 7 also contains Windows Search and enables it by default. It can actually be challenging to disable, so one can conclude that Windows Search is becoming a relevant source of information in the forensic analysis of Windows systems.

What is not widely known is that Windows Search uses the Extensible Storage Engine (ESE) to store its data. This is the same engine that Microsoft Exchange uses. Because ESE uses a proprietary database format, little information about it is available in the public domain. As a consequence, it is unclear how well different forensic tools support the ESE database format.

Several years after the introduction of Windows Vista and Windows Search, currently only a handful of forensic analysis tools seem to provide support for the Windows Search database, even though it can be a valuable source of evidence. This paper provides an overview of the ESE database format and the Windows Search database and what it might contribute to your investigations...
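Even without a full parser, one small piece of the format is public enough to be useful for triage: an ESE database carries a recognisable signature near the start of its header. The sketch below assumes the value documented by reverse-engineering efforts such as the libesedb project (0x89ABCDEF at offset 4, stored little-endian); treat it as a quick screening check, not authoritative format documentation:

```python
# ESE database header signature 0x89ABCDEF, stored little-endian,
# so the raw bytes at offset 4 are EF CD AB 89 (per public
# reverse-engineering work such as the libesedb project).
ESE_SIGNATURE = b"\xef\xcd\xab\x89"

def looks_like_ese(header: bytes) -> bool:
    """Cheap triage check: does this file header carry the ESE signature?"""
    return len(header) >= 8 and header[4:8] == ESE_SIGNATURE
```

Running such a check across a seized volume can surface ESE stores (Windows Search's Windows.edb among them) that deserve closer examination.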

Tuesday, June 29, 2010

When a crime happens, the time of the events may be critical to the legal case. However, how are these times established? Is it the time alleged by the witness? When the CCTV system captured the image? When the computer said the person left their home? When the satellite navigation system recorded they arrived? When the mobile was cell sited in the area? Or is it all of the above?

There is an abundance of studies addressing the accuracy of witness evidence. However, what about the accuracy of times provided by witnesses?

- I know it was 13:05 because I looked at my watch

- I walked past the newsagents and it was open so it must have been after 13:30

- I had just had my lunch, watched the news and left at 13:40

Without looking at your watch/clock (or computer), what time is it? What time does your watch/clock say? What time is it really?

Just like humans, digital devices may tell the incorrect time. In fact, they often do. Hence, when analysing events, it is crucial to compare like with like, otherwise the chronology may become scrambled and the evidence contradictory. In this article, I will discuss the issue of accurate digital device time and some basic techniques to assist in questioning and approximating the correct timings...
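One basic technique for comparing like with like is to record the device's displayed time against a trusted reference at seizure, then apply the resulting offset to recovered timestamps. A minimal sketch (it assumes the offset stayed constant over the period of interest, which is itself something to question):

```python
from datetime import datetime, timedelta

def clock_offset(device_time: datetime, reference_time: datetime) -> timedelta:
    """Offset of a device clock relative to a trusted reference,
    e.g. the examiner's synchronised clock at the time of seizure."""
    return device_time - reference_time

def to_real_time(device_timestamp: datetime, offset: timedelta) -> datetime:
    """Convert a timestamp recorded by the device back to real time."""
    return device_timestamp - offset
```

With each device's events corrected onto a common timeline, apparently contradictory accounts – the watch, the CCTV, the mobile – can often be reconciled.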

Friday, June 25, 2010

The survey should take no more than 2 or 3 minutes to complete and does not require any personal or contact information.

This year the survey has been expanded to include more detailed questions on employment issues with the aim of identifying trends across the industry which can be reported back to the Forensic Focus readership.

Thank you very much in advance for completing this short survey, your responses will have a direct influence on the future of Forensic Focus and I hope provide useful information to everyone working in this field.

Tuesday, June 22, 2010

In 2007, New Jersey Governor Jon Corzine made the news twice for a single event. The first time was the report of a car accident on the Garden State Parkway in which he was seriously injured. The second time was a report, which appeared a few days later, detailing how the governor's account of the accident had been contradicted by a witness: his automobile. Since 2000, most US cars have been equipped with a “black box” known as the Motor Vehicle Event Data Recorder. Standards for a common data set, including protections against data theft, altering of vehicle information, odometer fraud and misuse of collected data on owners and drivers, are the subject of IEEE 1616a, Standards for Motor Vehicle Event Data Recorders (MVEDRs). In spite of such protections, a number of states have enacted privacy protections which regulate the recovery and use of such data.

A 2007 Computerworld article was entitled “Photocopiers: The newest ID theft threat”. In 2010, CBS News was able to recover Personal Health Information and other personally identifiable information (PII) from the hard disk drives of copiers found in a warehouse in New Jersey. These copiers had been leased by various health care, law enforcement, financial and other institutions.

The focus of these stories was the risk to personal privacy, but to forensic examiners and eDiscovery personnel there is a more significant issue: when does the data contained in such devices constitute evidence deserving of preservation and a possible subject of discovery? More important, perhaps, is determining when a thorough investigation demands examining information contained in a peripheral that is not normally the subject of a forensic examination.

A couple of recent cases presented to our offices illustrate when and how such concerns arise.

Case 1: A branch office of a financial services company becomes concerned that confidential information is in the possession of unauthorized employees and outsiders. This arises after a client notices a securities trade that was undertaken on their behalf but without their knowledge or consent. Internal IT personnel examined each of the office computers and found no evidence of malware, keyloggers or possession of PII except by authorized personnel. An outside digital forensics (DF) firm was brought in to investigate and found no evidence of an intrusion or extrusion. A former IT administrator was the principal suspect, but he had been gone for over 6 months and his account had been disabled. A second DF firm was brought in to confirm the findings of the original firm.

The second firm noticed, as did the first, that the small office used a Linksys wireless access point (WAP) in lieu of a wired network. Interviews with then-current IT personnel and attempts to “sniff” the wireless network confirmed that WPA2-PSK was used and that the key was strong. The SSID was not advertised. Using the Web administrative console, the second DF firm determined that the firmware was not the Linksys default, but a modified kernel based upon Sveasoft Talisman. Further examination showed that it had been configured for port mirroring, which was also not the default. The former IT administrator had set up a rogue access point which, effectively, doubled as the secure access point for the business...

Thursday, June 17, 2010

A co-worker was looking into a strange issue with the acquisition of a flash drive. It seemed that the acquisition hash changed every time the drive was acquired. The write switch was off. Even a software or hardware write blocker did not prevent this odd effect.

My co-worker did isolate some sector differences between the individual acquisitions. She found that it was a series of sectors located in “Unallocated Clusters”.

Looking at the raw sector data, it changed every time the sector was refreshed: a series of hex patterns like “44 00” that would sometimes change to “40 00”, “18 00” or “00 00”.

We then used a disk editor to read the same sector, and the same behavior persisted – the same results with other tools, and on different computers.
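The manual comparison my co-worker performed is easy to automate: hash each acquisition, and where the hashes differ, walk the images sector by sector to isolate exactly which sectors changed. A minimal sketch (512-byte sectors assumed; the images here are byte strings for illustration):

```python
import hashlib

SECTOR = 512

def image_hash(image: bytes) -> str:
    """Acquisition hash of a whole image (MD5 for illustration)."""
    return hashlib.md5(image).hexdigest()

def diff_sectors(image_a: bytes, image_b: bytes) -> list:
    """Return the sector numbers at which two reads of the same device
    differ -- the kind of differences isolated by hand above."""
    assert len(image_a) == len(image_b), "acquisitions must be the same size"
    return [
        n for n in range(len(image_a) // SECTOR)
        if image_a[n * SECTOR:(n + 1) * SECTOR] != image_b[n * SECTOR:(n + 1) * SECTOR]
    ]
```

Mapping the differing sector numbers back to the file system (here, “Unallocated Clusters”) is what pointed us toward the hardware rather than the software.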

The Hardware

I then paused and thought about the way most flash drives are created. A controller chip sits on the USB bus and communicates with the host machine and the actual flash storage chip. The flash storage chip is usually a flat, thin rectangular chip with a series of pins on both sides. Some flash drives have more than one such flash chip.

The controller is responsible for performing the actual sector reads and writes. Should the write switch be set to “read only” mode at device insertion time, the controller tells the host that the drive is read-only and ignores or fails any write requests. The host then uses this bit of information in the file system driver and mounts the drive in read-only mode.

For instance, the $LogFile on improperly dismounted (“dirty”) NTFS drives would not be processed in order to roll back partial transactions.

Furthermore, the controller usually drives a LED to show disk activity and that the device actually is plugged in...

Dr Chris Hargreaves is a lecturer at the Centre for Forensic Computing at Cranfield University in Shrivenham, UK.

Ethical issues in science are commonplace; examples such as cloning, climate change and genetic engineering are all subject to different ethical debates. Some subjects have clearly defined areas of potential ethical problems; for example, in psychology, much consideration is given to the welfare of human participants involved in any experiments conducted. This would involve the consideration of concerns such as participants’ confidentiality, privacy, consent, right to withdraw etc. However, the welfare of human participants in experiments is not the only form of ethical debate, and in some research areas there are other particular issues, such as animal rights, or indeed whether a particular technology should be researched at all. This article is not an attempt to identify all the potential ethical issues that digital forensics research could be subject to, but instead highlights a particular issue – the potential impact of making the results of some digital forensics research publicly available.

To take a simple (and fictitious) example in the case of research into ‘evidence removal’ tools: suppose research into a product revealed that while the software removed evidence from several locations on the disk, there were also several other locations where evidence was not erased and could therefore be recovered. From a forensic point of view these are very interesting findings, and it would be beneficial to share the results so that when the use of this particular product is encountered in an investigation, evidence could be more easily recovered. However, the publication of these results also has adverse consequences. Firstly, users of that software who run it in an attempt to hide evidence of unlawful activity may then decide to switch to a more effective product that does erase the data areas in question. Secondly, the developer of the software may decide to take the published research and use it to develop updates that fix the problem so that the software now erases the locations in question. In both of these cases, the publication of the results could mean that in future, an analyst may be deprived of useful evidence...

Thursday, June 03, 2010

Forensic Focus: Sam, can you tell us something about your background and how you became involved in computer forensics?

Sam Raincock

Sam Raincock: Prior to university, I’d never considered computing as a potential career; in fact, I hadn’t really used computers apart from playing games. I decided I wanted to be a physicist and solve the world’s particle physics problems. After embarking on a physics degree, I became more interested in computers (even though they were running 3.1 and Solaris!) I made the radical decision to change my degree course to a BSc in computer science even though I was a complete novice in the area. However, I learnt very quickly and really enjoyed the challenges and problem solving. I was also lucky to work in two summer internships in IT departments at Morgan Stanley during my degree, so I at least had an appreciation of bigger businesses.

After my undergraduate degree, I embarked on research into the human factors involved in 3D imagery on 3D display systems – again a completely different area from my previous computing experience. However, I liked the mathematical challenges and the combination with human vision psychology. At the same time I was also working in contracting in web development and providing tutorials in all different types of computer science for undergraduates. I really enjoyed the teaching but I found being a programmer quite monotonous – it was a great insight into proving that a full time role in computer programming was not for me.

The research time and ability to develop a questioning mind is invaluable in my career today but after a few years of academic research I was really looking for a business driven challenge. An opportunity with Keith Borer Consultants, a forensic science company based in Durham, was presented to me and I started working as a mobile telephone/cell site examiner. There was lot of flexibility and encouragement and I was able to perform research and development into whichever digital fields I wished to explore - so I opted for them all!

Forensic Focus: What services does Sam Raincock Consultancy offer? What is a typical working week like?

Sam Raincock: The business is primarily concerned with providing a combination of computer investigation/expert witness services and IT security assessment services to corporates and solicitors alike. I love solving complex problems, so I particularly enjoy computer forensic cases involving technically complex scenarios or software system assessments (how does software A produce B logs, and what do C logs actually mean?).

In the telecommunications field, SRC can offer a full range of services but primarily concentrates on taking expert instructions in complex connection record and cell site analysis cases and providing advice to other companies in these types of cases.

My current passion is working in the breadth of the IT security fields with particular interest in ISMSs, the effective use of encryption, procedures for forensic labs, corporate investigations, process improvement post incident and the ISO 27001 standard. Very recently, I was accepted to work as an assessor with A2LA on digital and telecommunications lab assessments. I am very excited to be a part of the new American forensic lab standards.

I also provide training in all of the above and in my ‘spare’ time I write papers and perform research. I am also studying for the CISSP and ISO 27001 lead auditor certs.

Currently, a typical week is very long – around 80 hours if not more. Working in so many fields is quite a challenge, but there is also the business element too – I have become my own accountant, marketing manager, IT manager etc. However, I love the all-round skills it is providing me with and how it enables me to work with a diverse range of partners, bodies and clients...

Monday, May 24, 2010

A couple of months ago, one of my clients, an Investigating Officer from a Law Enforcement Agency, asked me to extract some of the files from an image copy of a hard disk. The total number of files to be copied was 1,030. Sounds easy, right? This job of a few clicks turned out to be a nightmare when I found that I was short 2 files in my destination folder. I had selected 1,030 files to be copied, but in the end only 1,028 files were copied. More surprisingly, the EnCase Copy operation reported ‘Status: Completed’. But where did the other 2 files go? Why had EnCase produced ‘Status: Completed’ when, in fact, 2 files were missing? Referring back to the image copy, I found that many of the files shared the same filename.

It is very common for an analyst to encounter evidence files which have the same filename. This circumstance arises when:

1. There are 2 files with the same name, but they are located in different folders.
2. There are 2 files with the same name, but with different MAC times.

A standard procedure before analyzing a case is to recover all files and folders. When you use the recovery options, deleted files will be recovered, and sometimes these deleted files have the same filenames as existing files, but with different MAC times.

But is it possible that EnCase can fail to copy all the files under these circumstances? Readers who are new to EnCase may ask: why do you need to copy the evidence files in the first place? Put simply, in computer forensic methodology, after the analysis phase we present our findings to our clients. So we usually copy out the relevant evidence files using EnCase so that our clients can access them on their own workstations, without examining the whole hard disk image. The real question, then, is this: how sure are you that the selected files were properly copied to your designated folder? Is it sufficient to rely on the EnCase notification window after the copying process has been executed?
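
Rather than trusting a tool's completion dialog, an examiner can verify a copy independently by comparing counts and hashes. The sketch below is purely illustrative (the helper names are my own and this is not EnCase functionality); it assumes exported files should match their sources byte for byte, and it will flag duplicate-named sources that collide in a flat destination folder:

```python
import hashlib
from pathlib import Path

def hash_file(path: Path) -> str:
    """SHA-256 of a file's contents, read in chunks to handle large files."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copy(selected: list[Path], destination: Path) -> list[Path]:
    """Return the selected source files that are missing from the destination,
    or whose destination copy does not hash-match the source. Two sources
    sharing one filename can leave only one file in a flat folder, so at
    least one of them will be flagged here."""
    copied = {p.name: p for p in destination.iterdir() if p.is_file()}
    problems = []
    for src in selected:
        dst = copied.get(src.name)
        if dst is None or hash_file(src) != hash_file(dst):
            problems.append(src)
    return problems
```

A simple count check (the number of selected files versus the number of files in the destination) would already have caught the two missing files in the case above; the hash comparison additionally catches silent corruption.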

Friday, May 14, 2010

In my last column, I discussed the concept of prior probability, that is to say, the likelihood that conclusion A can be derived from fact B with no additional data. In medical diagnosis, prior probability is estimated in order to determine the need for, and type of, additional investigation.

Another tool used by clinicians is the positive predictive value (PPV). In essence, the PPV is the likelihood that a positive result for a given test will confirm the operative hypothesis (diagnosis). All things being equal, choosing the procedure with the highest positive predictive value will be the single most useful step in confirming the clinician’s suspicion.
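
For readers unfamiliar with the arithmetic, PPV can be derived via Bayes' theorem from a test's sensitivity and specificity together with the prior probability (prevalence). A minimal sketch, using purely illustrative numbers rather than figures from any real diagnostic test:

```python
def positive_predictive_value(sensitivity: float,
                              specificity: float,
                              prevalence: float) -> float:
    """P(condition present | test positive), via Bayes' theorem."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A test that is 90% sensitive and 95% specific, applied where the
# prior probability of the condition is 10%:
print(round(positive_predictive_value(0.90, 0.95, 0.10), 3))  # prints 0.667
```

Note how strongly the result depends on the prior: the same test applied where the prevalence is only 1% yields a PPV of roughly 15%, which is exactly why estimating prior probability comes first.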

Of what relevance is this to digital forensics?

As I commented previously, it appears that US courts, especially civil courts, are increasingly limiting the scope of discovery out of concern that discovery may violate expectations of privacy or be too burdensome to the producing parties. In a recent case in which I was involved, the judge required the requesting party to propose an alternative to production of forensic images of an entire enterprise network’s computers solely to search for possible instances of the plaintiff’s intellectual property (engineering drawings) located on the defendant’s computers. Instead, the court restricted the discovery to only those devices which were used to store or manipulate files of the same type as the engineering drawings and, of course, to only those documents which were reasonably accessible.

In addition, some judges are now following the principle of “one bite of the apple”, i.e., limiting production to a single request. Not surprisingly, though not always successfully, this has led to the notion of discovery for the purpose of discovery; the classic slippery slope...

Thursday, May 13, 2010

Simon Biles is a founder of Thinking Security Ltd., an Information Security and Risk Management consultancy firm based near Oxford in the UK.

Language is a funny thing – even though we may speak the same basic language, the nuances, construction and vocabulary are very individual. I find listening to my children a wonderful thing – sometimes I hear either my words, or those of my wife, but more often I hear distinct phrases and words that are unique to them. This month, they have challenged me, in an oft-played game, to insert words that are uniquely theirs, and not mine, into this column – so, embedded somewhere in the next eight hundred or so words are two words that have been given to me. Next month I’ll publish the name of anyone who can tell me which two they are!

Given that two people may interpret any given word in vastly different ways depending on their backgrounds, how do we ensure there is a consensus of understanding? We operate in a field that has very definite concepts – true or false, on or off, zero or one – binary choices. There are few shades of uncertainty (all smart comments about quantum computing to /dev/null please) – it’s there or it isn’t, and unless we are called upon to give our opinions as experts, we are bound, at least ethically if not legally, to make statements of fact. I personally find it an immense problem, though, that so often there are no really clear definitions of terms – or at least no clear definitions that you can easily present to a customer (or worse, a jury).

To add a further problem, for me at least, I subscribe to a code of ethics that prohibits the use of “FUD” in dealing with customers (see http://www.csoonline.com/article/217983/The_FUD_Factor). “Fear, Uncertainty and Doubt” must be among the biggest drivers of Information Security sales, as a quick survey of some major security vendors confirms...

Encryption, and the lack thereof, changed my life. In the early 1990s I realized that encryption was very underused and that in the near future it would become essential for most people and companies. At that time, hardly any users encrypted their data. Even in the financial sector, good encryption was seldom applied. Thus, I chose Cryptography as the focus of my Masters in Computer Science. My thesis researched the synergistic properties of compressing data before cryptographically hashing it.
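
The thesis itself is not described in detail here, but the ordering it examined, compressing data before hashing it, can be sketched in a few lines (zlib and SHA-256 are illustrative choices on my part, not necessarily what the thesis used):

```python
import hashlib
import zlib

def compress_then_hash(data: bytes) -> str:
    """Compress the payload first, then hash the compressed bytes."""
    compressed = zlib.compress(data, level=9)
    return hashlib.sha256(compressed).hexdigest()

digest = compress_then_hash(b"example payload " * 64)
print(digest)
```

One practical consequence of this ordering is that the digest depends on the exact compressor and settings: recompressing the same data with a different level or library yields a different hash.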

When the large forensic company I currently work for decided to create an enterprise-level product, I worked on the cryptographic protocol: the authentication and encryption algorithms, their FIPS 140-2 validation and their implementation. This took a long time, proving the well-known fact that well-designed protection is not a simple or quick task. The proper selection of algorithms, threat modeling, secure coding practices, entropy and key management are just some of the many facets I had to address. Finally, my co-inventors and I obtained a patent protecting this intellectual property.

I was very careful because I knew firsthand how disastrous a lack of protection can be - it was the trigger for my divorce...

Friday, May 07, 2010

Dr Chris Hargreaves is a lecturer at the Centre for Forensic Computing at Cranfield University in Shrivenham, UK.

Traditional academic publications, e.g. journal papers and conference proceedings, are peer reviewed, and there are now many examples of these that specifically cover digital forensics (e.g. Digital Investigation, Journal of Digital Forensic Practice). However, a considerable amount of useful forensic research is available from what are, in traditional academic terms, considered less reliable sources of information (such as blogs, non-peer-reviewed papers and forum posts). This article highlights the strengths of these media for distributing the results of digital forensic research, but also discusses the value that is added when posted results include even a brief discussion of the methods used to obtain them and an open discussion of the limitations of the research.

One of the main advantages of peer-reviewed publications in a journal or in conference proceedings is that one or more other people in the field have examined the work and independently decided that the paper is suitable for publication. This peer-review process ensures that the author has discussed and explained contradictory theories and considered whether the results obtained are general or due to carefully chosen specific experiments. It also ensures that conclusions drawn are well supported by evidence and that enough information is included for experiments to be repeated and the results verified. The criteria by which a publication is judged suitable can vary, but are likely to include technical accuracy, whether the results can be generalised, relevance, timeliness, etc. This process is in place to ensure that the published work has a certain level of quality...

Monday, April 19, 2010

An editorial panel is currently reviewing the ACPO Good Practice Guide for Computer-Based Electronic Evidence and is seeking the views of interested parties from both the law enforcement and private communities of users and service providers, the IT sector, and academia. The panel's remit is to update the content to ensure it is current and relevant, and to see if there is any area of digital forensics not included in the guide that would benefit from inclusion. Participants are asked to complete a survey at www.surveymonkey.com/s/YTZVX2W and submit their views.

Wednesday, April 07, 2010

Craig Ball is a Texas lawyer who limits his practice to service as a court-appointed special master and consultant in computer forensics and electronic discovery.

I recently posted an open letter to judges on a blog that caters to an e-discovery audience. I asked judges to stop ordering parties to turn over their systems to the opposing side's computer forensic examiners and argued that most civil forensics work should be reserved to neutral examiners.

Now, while you rush to warm the tar and pluck the feathers, please hear me out.

Yes, most of my work as a computer forensic examiner is done as a neutral, but I do a lot as a partisan on either side of civil cases. Even so, use of a neutral isn't something that uniquely benefits me. It's something any competent, ethical examiner can and should do. What I'm advocating won't hurt you; in fact, it'll likely add to your job satisfaction.

Here's what I posted:

"Your Honors:

I just read another opinion where the Court decided to let one side's computer expert examine an opposing party's computers. The Court seemed more concerned with who would pay for the exam than what its consequences might be.

I'm a lawyer and computer forensic examiner, and I make part of my living doing just the sort of examinations the court ordered. I've done a whole bunch of them. So, while part of me wants to encourage courts to order more forensic exams — and I can surely attest to their efficacy in resurrecting data thought gone and exposing case-making evidence — the angel at my ear requires me to softly whisper, "WHAT THE HECK WERE YOU THINKING, JUDGE?!?..."

Thursday, April 01, 2010

April 1st sees the announcement of two new organisations for digital forensics professionals. The International Federation of Forensic Examiners in Europe (IFFEE) and the Department of Justice Expert Examiners (DoJEE) group in the US both aim to provide practitioners with a variety of benefits ranging from professional liability insurance to full immunity against prosecution and covert extraction to a country of their choosing. These new organisations are also running a special promotion for the next 24 hours offering new members a complimentary packet of felt tip pens to improve the presentation of their reports - anyone wishing to take advantage of this great offer should email idprefercrayons@iffeeanddojee.com immediately.