I am not sure why it is called 'forensic recovery' rather than just recovery software.

I know I call my software 'forensic recovery', but it does include comprehensive logs of dates, file sizes and attributes, as well as which sectors of the hard drive were used to make up each file. Thus fragmented files can be reconstructed from the log information. Files should also be hashed if you want the forensic label.

Unless I am missing something, logs appear to be missing from the program, along with 'F1' help. The export function just handles names and dates in a non-standard .CSV file.

In answer to this point, and in all fairness to ReclaiMe, I added the word 'forensic' to the webinar title myself when uploading it to our YouTube channel (primarily in an attempt to clarify the intended audience). Any criticism - and I understand the point being made - should be aimed at me!

While I admit that data recovery and computer forensics are different fields, they are still closely related. Computer forensics software is generally designed to retrieve and analyze data from healthy or slightly damaged filesystems. By contrast, data recovery software like ReclaiMe Pro aims to extract all the data possible even from severely damaged filesystems, including after whatever deletion attempts may have been made. While the success rate varies, data recovery software will typically produce better results in cases where more effort was spent to get rid of the data.

More than that, many filesystems already in wide use, like ReFS or Btrfs, are still not supported by forensic tools, not to mention RAID/NAS reconstruction. In such cases, data extraction is possible only with data recovery software rather than forensic software.
With due respect, your contrast between forensic and data recovery software misses the point. The features Mr. Cotgrove describes reflect the necessity in forensics that results meet a legal standard of reliability. That reliability is bolstered by the process being logged, and by the results being repeatable and verifiable. Irrespective of whether the subject filesystem is slightly or severely damaged, forensic software should produce sufficient metadata for an examiner to reconstruct the results, even if using a different tool. Ultimately, it is the expert's opinion that counts, not the tool.

Please understand that none of this is intended to diminish the value of ReclaiMe, but in forensics, we don't like black boxes.

See, there are two types of logs/traces that can theoretically be produced.

First are decision-logic traces, relating to the data recovery process itself. These are more or less a trade secret, especially where the underlying knowledge was produced by reverse engineering.

Second are file/data location traces, like "this file was produced from such-and-such sectors on said disk, and the timestamps came from that sector". These can easily be provided and contain nothing untoward. They can be used, for example, to verify the data with WinHex or another disk editor (the "different tool" mentioned above). The reliability of metadata (like timestamps) for deleted files on a modern CoW filesystem is a moot point, but that is a whole different can of worms.
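To illustrate how such a location trace supports independent verification, here is a minimal Python sketch: given a hypothetical trace listing the extents a file was built from, it re-reads those sectors from a disk image and checks the result against a logged hash. The extent tuple layout, sector size, and function names are all assumptions for illustration, not ReclaiMe's actual format.

```python
import hashlib

SECTOR_SIZE = 512  # assumed; many modern disks use 4096-byte sectors

def rebuild_from_trace(image_path, extents):
    """Re-read the sectors named in a location trace and return the data.

    `extents` is a list of (start_sector, sector_count, byte_length)
    tuples -- a hypothetical trace layout, not any tool's real one.
    """
    data = bytearray()
    with open(image_path, "rb") as img:
        for start, count, length in extents:
            img.seek(start * SECTOR_SIZE)
            # Read whole sectors, then trim to the logical byte length
            data += img.read(count * SECTOR_SIZE)[:length]
    return bytes(data)

def verify(image_path, extents, expected_sha256):
    """Independently verify a recovered file against its logged hash."""
    digest = hashlib.sha256(rebuild_from_trace(image_path, extents)).hexdigest()
    return digest == expected_sha256
```

An examiner could run the same check by hand in WinHex or any disk editor; the point is only that the trace carries enough information to make the result repeatable.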

Considering these "copy traces", can anyone suggest a common format for them, if one even exists?
Possibly NOT exactly the answer to your question, but what I find an interesting approach is the one used by FTK Imager: recreating the filesystem items or filelist in a .csv [1]. Though possibly not the best format in terms of access/indexing/efficiency, it remains nonetheless the most "portable" and "cross-platform/cross-application" one. As an example, it would allow producing a graphical/browsable interface, like the one Francesco put together here: http://www.forensicfocus.com/Forums/viewtopic/t=11359/ while allowing easy search/access by everyone else from scripts or *whatever*.

Of course a more "proper" database could be more suitable, but IMHO it would create interchange problems, or the need for a specific database engine/tool. The only candidate could possibly be SQLite, since it is Public Domain and there are several tools that can deal with that format.

jaclaz

[1] Most probably a field would be needed containing either a range of absolute sectors (for files found to be contiguous) or the name of a separate file containing the list of sectors (for non-contiguous files).
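As a concrete sketch of the SQLite option: one parent row per recovered file plus a child table of extents would cover footnote [1] without needing a separate sector-list file for fragmented files. The table and column names below are illustrative assumptions, not an existing standard.

```python
import sqlite3

# Hypothetical schema: metadata per file, plus ordered extents so
# non-contiguous files need no separate sector-list file.
SCHEMA = """
CREATE TABLE files (
    id     INTEGER PRIMARY KEY,
    path   TEXT NOT NULL,
    size   INTEGER,
    mtime  TEXT,
    sha256 TEXT
);
CREATE TABLE extents (
    file_id      INTEGER REFERENCES files(id),
    seq          INTEGER,   -- order of the extent within the file
    start_sector INTEGER,
    sector_count INTEGER
);
"""

def open_trace_db(path=":memory:"):
    """Create a new copy-trace database with the sketch schema above."""
    con = sqlite3.connect(path)
    con.executescript(SCHEMA)
    return con
```

Since SQLite files are single, self-contained, and readable by many third-party tools, this keeps much of the portability of CSV while allowing indexed queries.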

We will probably be releasing an update addressing these issues in a couple of weeks. For a copy "trace", we have settled for the moment on one CSV file per copied file. The fields are, for each extent, the disk location and the extent format (the compression method for compressed extents, NTFS resident, or just a plain extent).