I heard good things about R-Studio, so tried it; the Windows and Linux versions crashed every time I tried to scan these drives, and the Emergency version seemed to do the same thing (it kept restarting at odd times for no apparent reason).

I've been having some email conversations with the RTT support folks, and they are aware of my issues; I'm wondering if anyone else out there has had the same problem?

It *appears* that recovering HUGE partitions (these are single 3 TB partitions with around 2 TB of deleted files on them) might require HUGE amounts of RAM (I have 8 GB, but R-Studio uses it all up after creating a scan file of about 12 GB, then dies). At least, that's my guess.

Has anyone tried recovering from a single huge partition that contains a huge amount of data on it, like I am, and done it successfully? I am using the intelligent scanning function as well, so there's a lot of data R-Studio needs to store; I'm just surprised it's keeping so much of it in RAM as opposed to offloading more of it to the hard drive.

Which version do you use, 32- or 64-bit? Assuming that you have 8 GB of RAM installed, your OSes are 64-bit Windows and Linux. Am I right?

I successfully used the 64-bit version of R-Studio to recover a 5 TB RAID 5 consisting of 5 disks (approx. 3 TB of digital movies). But it was done on a dual-Xeon computer with 64 GB of RAM running under 64-bit Windows Server 2008.

Alt wrote:Which version do you use, 32- or 64-bit? Assuming that you have 8 GB of RAM installed, your OSes are 64-bit Windows and Linux. Am I right?

I successfully used the 64-bit version of R-Studio to recover a 5 TB RAID 5 consisting of 5 disks (approx. 3 TB of digital movies). But it was done on a dual-Xeon computer with 64 GB of RAM running under 64-bit Windows Server 2008.

Yes, Windows 7 Pro x64, and the 64-bit version of R-Studio; it uses up all 8 GB before crashing.

I'm almost tempted to build myself a server-class machine just to see if I can get this working. :-/

I *did* order another 8 GB of RAM, which arrives tomorrow - this machine uses the last generation of Intel Core processor and chipset, so it can only take 16 GB on the board; I could upgrade or build another machine with the newest series and get 32 GB, but I'm thinking you really need a server-class machine to accomplish this kind of task. A local computer store does this kind of recovery with R-Studio.

The only other options are accepting what extundelete restored (about half my files), giving the disks to this local store, or paying $1400-$2700 to a local data restoration service which *claims* near 100% success with this sort of thing... it'd take a long time and a lot of cleverness and computing power to reconstruct these files, I'm sure. EXT4 is not the most undeletion-friendly filesystem. And I don't even have the benefits of a RAID 5 or RAID 6 to fall back on. :-/

I assume that you saved the scan info to a file. You may load it later and resume the scan. Moreover, even if R-Studio crashes, the file remains valid.
You can scan the disk in parts; read more about it in the R-Studio online help: Advanced Scan.
You need to select only the Ext file system for the scan to reduce memory load. Most likely, you'll have to keep the Extra Search for Known File Types option selected.

But there is a possibility that R-Studio crashes at a certain point on the disk rather than because it has eaten up all the available memory. To check that, specify the area on the disk (quite approximately) in the Advanced Scan dialog box and run the scan. If R-Studio crashes there, you may create an exclusive region on the disk, scan it, and recover files from it.

I understand that all that requires a lot of time, but that's the way I see to get around the crash.

Alt wrote:I assume that you saved the scan info to a file. You may load it later and resume the scan. Moreover, even if R-Studio crashes, the file remains valid.
You can scan the disk in parts; read more about it in the R-Studio online help: Advanced Scan.
You need to select only the Ext file system for the scan to reduce memory load. Most likely, you'll have to keep the Extra Search for Known File Types option selected.

But there is a possibility that R-Studio crashes at a certain point on the disk rather than because it has eaten up all the available memory. To check that, specify the area on the disk (quite approximately) in the Advanced Scan dialog box and run the scan. If R-Studio crashes there, you may create an exclusive region on the disk, scan it, and recover files from it.

I understand that all that requires a lot of time, but that's the way I see to get around the crash.

Oh, if I resume the scan from where it crashed, it crashes again, and very soon after. That is, if I load the previous scan and resume, the R-Studio process quickly grows to 8 GB, the new scan file it creates quickly grows to the same size as the original, and it crashes again, so "resuming" doesn't work. I *could* do it piece by piece, but then there'd be no connection between the two sets of scan results - what if some files span the two sets of results? Their connection would be lost if I did that.

I already am following all the suggestions for reducing memory (selecting only EXT filesystems, limiting search patterns to just the 5 types I'm looking for, etc.), to no avail.

I could use the Exclusive Regions method, but, again, would that not miss cases where files span 2 different regions?
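One common way to reduce that spanning risk when scanning a disk in parts (a sketch of the general technique, not a built-in R-Studio feature) is to overlap adjacent regions by more than the largest file you expect, so every file falls entirely inside at least one region:

```python
def split_with_overlap(disk_bytes, parts, overlap):
    """Split a disk into `parts` regions, each extended into its neighbors
    by `overlap` bytes. Any file smaller than `overlap` that crosses a
    nominal boundary still lies wholly within one region."""
    step = disk_bytes // parts
    regions = []
    for i in range(parts):
        start = max(0, i * step - overlap)
        end = disk_bytes if i == parts - 1 else (i + 1) * step + overlap
        regions.append((start, min(end, disk_bytes)))
    return regions

# Example: a 3 TB disk in 6 regions with an 8 GiB overlap
TB, GiB = 1000**4, 1024**3
for start, end in split_with_overlap(3 * TB, 6, 8 * GiB):
    print(f"{start:>16,} - {end:>16,}")  # byte ranges to type into each scan
```

The cost is rescanning the overlap bytes twice and possibly seeing duplicate files in two result sets, but nothing is silently lost at a boundary.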

Wouldn't it be smarter for R-Studio not to keep so much scan info in RAM while running, but to write more of it to the scan file (or another metadata storage file)? It might've been more convenient for the programmers that way, but it limits R-Studio's usefulness for huge partitions. A well-designed scanner doesn't NEED to keep everything in memory at once, even if that simplifies the program's design (speaking as a programmer here).

Alt wrote:I meant a slightly different thing: start scanning from the point where R-Studio crashed, without loading the scan info. Does R-Studio still crash?

Ah - that's one I haven't yet tried, as I'm concerned about the issue of files spanning the sections scanned. Probably not a HUGE issue, but enough that it makes me wonder if I'd miss some files due to that.

R-Studio doesn't have any way to associate files that span multiple sections scanned separately, does it?

Well, the area where R-Studio can potentially crash may be quite small, so only one or two files might be missed.
Excluded areas are areas that R-Studio fills with a bad-sector pattern. That means files spanning across those areas would have that pattern in them. Maybe some of those files could still be recovered.

Alt wrote:Well, the area where R-Studio can potentially crash may be quite small, so only one or two files might be missed.
Excluded areas are areas that R-Studio fills with a bad-sector pattern. That means files spanning across those areas would have that pattern in them. Maybe some of those files could still be recovered.

It fills them with the pattern virtually, during recovery, correct? It doesn't actually write to the disk being recovered?

Incidentally, I made 48 GB available for swap and added 8 GB more to my motherboard, maxing out the memory at 16 GB (this is a previous-generation i7 chip, so I can't bump it up to 32 GB, unfortunately).

In any case, R-Studio has been running all night and says it's 95% finished (which is a lot further than it got before), and it's only using 9 GB of RAM at the moment, so I'm crossing my fingers that this scan will finish properly.

tbessie wrote:
It fills them with the pattern virtually, during recovery, correct? It doesn't actually write to the disk being recovered?

Sure! R-Studio never, ever tries to write anything to the disk.

tbessie wrote:
In any case, R-Studio has been running all night and says it's 95% finished (which is a lot further than it got before), and it's only using 9 GB of RAM at the moment, so I'm crossing my fingers that this scan will finish properly.