The whole project started like this: I wanted to practice my C# skills, so I was searching for something to implement, and I thought that making an SSD testing suite based on real-life scenarios would be great! Today almost all sites (except TPU) do their SSD reviews using exclusively synthetic benchmarks, which of course don't reveal the whole story.
The methodology I followed is similar to the one TechPowerUp's W1zzard uses (many thanks to him for the initial idea and for his help with everything I asked). I run some real-life scenarios, like copying a large file from one folder to another or decompressing a large archive, and measure the execution time, meaning the time each test needs until it's completed. Every series of tests can be run up to five times in a row (I have the option to make it ten), and I take the average reading as the final result for each test. Before each test I flush the disk cache, and after a whole batch run I restart the system to erase the read cache. The program is fully automatic: after you select the tests and the number of batch runs, all you have to do is hit "Start" and then sit back and enjoy (or read a newspaper, or go out for a walk). After 3-4 hours (depending on the number of batch runs you selected) the program finishes testing and your results are ready. If you want, with the press of a button the results are forwarded to Excel for further analysis (graphs etc.). Of course, as time passes, and if family, reviews etc. leave me some free time, I will add more tests and options in order to increase the program's features and usability.
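The batch loop described above (run each test several times, flush the cache before each run, average the timings) can be sketched roughly like this. This is only an illustration, not the actual C# suite; `flush_disk_cache` is a commented-out placeholder for the Windows-specific flush step:

```python
import statistics
import subprocess
import time

def run_batch(test_cmd, runs=5):
    """Run one test command `runs` times and return the average wall time."""
    times = []
    for _ in range(runs):
        # flush_disk_cache()  # placeholder: the real suite flushes the
        #                     # disk cache before each run (Windows-specific)
        start = time.perf_counter()
        subprocess.run(test_cmd, check=True)
        times.append(time.perf_counter() - start)
    return statistics.mean(times)
```

Averaging several runs smooths out one-off hiccups; restarting between whole batches (as described above) is what clears the OS read cache.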

Now for the hard part: I won't release it to the public, firstly because it is strictly a review tool (although I still don't know where, or if, I will do SSD reviews, at least on the same scale that I'm involved in PSU reviews), and secondly because I spent many, many hours of my restricted free time finishing it. Also, most of the programs I use in the tests are licensed, and on top of that I don't want to be blamed for broken SSDs, corrupted Windows installations etc., since this proggy really gives the HDD/SSD a very hard time.

Let's now look at the tests I included in this testing suite.

Windows startup time. Pretty straightforward, I think. I measure the time all startup programs require from the moment the Windows kernel is loaded. This is done via the Windows API, so the timing is really accurate.

File copy. I copy a large file (3.1 GB) from one folder to another, on the same drive, and measure the time it takes for this procedure to finish.
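Timing a copy like this is simple in principle; a minimal sketch (not the suite's actual code, and skipping the same-drive and cache-flush details it relies on) would be:

```python
import shutil
import time

def time_file_copy(src, dst):
    """Copy src to dst and return the elapsed wall-clock seconds.

    The real suite copies a 3.1 GB file on the same drive with the
    disk cache flushed first; this sketch skips both details.
    """
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    return time.perf_counter() - start
```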

WinRAR test. I extract a highly compressed file (linux-3.2.9.tar.bz2) and again measure the time needed for this.

Antivirus check. The time Avira needs to scan the system32 folder, including all subfolders and compressed files.

Windows Experience Index. Windows has a really handy tool to measure disk performance (WinSAT). I use the disk portion of this test and check how much time it needs to finish.

Photoshop CS5 startup time. In this test I start Photoshop CS5, open a large RAW format photo and then terminate the application.

Photoshop image processing. I start Photoshop CS5, open ten large RAW-format photos, apply a demanding action to all of them (one by one) and, once all are finished, close the application.

PCMark 7 test 1. Here I run the systemstorage test and measure its execution time. I also save the results in an XML file.

PCMark 7 test 2. This time I run the secondarystorage test. Again, the results are saved in XML format.

Office 2010 installation time. I log the time Office 2010 needs to install on the test disk, with the installation files already located on the same disk.

In the future I may add more tests, although I think these are already enough (maybe I will replace some with other, more suitable tests).

ps. If you think this post doesn't belong here, please move it wherever it would be more appropriate.

It seems worth looking into and testing how SSDs behave when they encounter compressed content.

As you can see, the SandForce controllers get high speeds because they compress everything going to the flash memory chips, but when they get something that's hard to compress further, they start to suck at both writing and reading:

In contrast, other controllers are simply better, or have better flash chips; they may be slower when writing, but they manage to maintain high speed when reading:

It would suck to penalize a Crucial M4 (like the one in the picture above) or a Samsung 830 just because it can only do 100 MB/s when writing the general stuff that all your tests (except maybe the WinRAR one) in the picture you posted would produce. When it comes to writing compressed data, the Crucial and Samsung would achieve much higher write speeds than the SandForce drives, and they'd also be able to read that data back from the drive fast, unlike the SandForce controllers.

So I don't know what a proper test would be. I'm thinking of maybe simulating how converting a video to an iPhone/iPod-friendly format would go.

Copy a long 20-40 Mbps 1080p video with PCM (uncompressed) sound to the SSD and convert it with x264.exe to 640x480 in fast mode (so the CPU isn't a factor).
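Scripted, such a test might look roughly like this. The x264 command line here is purely illustrative: the flags and filenames are assumptions (the resize filter in particular is only available in x264 builds compiled with swscale support), so treat it as a sketch of the idea, not a working recipe:

```python
import subprocess
import time

# Illustrative command line only -- the exact flags are assumptions,
# and the resize filter requires an x264 build with swscale support.
X264_CMD = [
    "x264", "--preset", "fast",
    "--video-filter", "resize:width=640,height=480",
    "-o", "out.264", "input_1080p.mkv",
]

def time_encode(cmd=X264_CMD):
    """Time an external encode so disk behaviour during the run
    can be compared across drives."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start
```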

This way you'll have compressed content (read slower by SandForce controllers) muxed with compressible content (the PCM audio) going into the x264 encoder, which generates compressed video (incompressible for SandForce) plus either PCM audio (compressible) or compressed audio (incompressible), depending on how you configure it. Then simulate copying it from the SSD to another drive, or to memory, to see how fast the drives read the compressed content back.

As the output resolution is also low, you'll get a high fps while encoding, which means the SSD will also have to seek often to fetch the video and audio chunks from the source file while writing compressed data to the disk at the same time.

If you want to really make it hard, you could use 2-4 or even more videos and write a script (or I could make one for you) that would merge the videos into a larger one, somewhat like how this video was made (watch it in a few minutes, if YouTube allows it and converts it): http://youtu.be/T9gUSgGD1ss

A script like that would involve lots of random seeks coupled with uncompressed and compressed reads and writes, so the SSDs would "sweat" a bit.
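The random-seek part of such a stress script could be sketched as below; the chunk size, read count, and seeding are arbitrary placeholders, not anything from an actual tool:

```python
import os
import random

def seek_stress(path, chunk=1 << 20, reads=100, seed=0):
    """Read `reads` random chunks from `path`, mimicking the random
    seeks an encoder doing interleaved audio/video reads would cause.

    Chunk size and read count are arbitrary placeholders.
    """
    rng = random.Random(seed)
    size = os.path.getsize(path)
    total = 0
    with open(path, "rb", buffering=0) as f:  # unbuffered, so reads hit the OS
        for _ in range(reads):
            f.seek(rng.randrange(max(1, size - chunk)))
            total += len(f.read(chunk))
    return total
```

A real stress script would interleave this with writes to the same drive, which is where the controller really starts to sweat.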

ps. I just read what you mean by the WinRAR test:

Quote:

Winrar test. I extract a highly compressed file (linux-3.2.9.tar.bz2) and measure again the time needed for this.

See, this is where there's a problem. If that 3 GB file extracts to 10-20 GB, you can have SandForce-based drives reading the 3 GB at a 150 MB/s average and then writing the uncompressed data to disk at 300 MB/s (because the drive internally does hardware compression on the data)... or you have drives like the Crucial M4 and Samsung 830 that read the 3 GB at 450 MB/s and then write the uncompressed data at 160-200 MB/s... all may finish the job at about the same time, but which behavior is better in the real world?
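Plugging the ballpark figures from the paragraph above into a quick back-of-the-envelope calculation (assuming a 10 GB output, from the stated 10-20 GB range, and that the read and write phases don't overlap, which real drives do, so these are only rough bounds) shows the two profiles landing in the same general ballpark:

```python
MB_PER_GB = 1024

def extract_time(archive_gb, output_gb, read_mbs, write_mbs):
    """Rough extraction time in seconds, assuming the read and the
    write phases don't overlap (real drives pipeline them)."""
    return archive_gb * MB_PER_GB / read_mbs + output_gb * MB_PER_GB / write_mbs

sandforce = extract_time(3, 10, 150, 300)  # ~55 s: slow compressed read, fast write
m4 = extract_time(3, 10, 450, 160)         # ~71 s: fast read, slower raw write
```

These particular numbers are illustrative, but they show why a single elapsed-time figure can hide two very different read/write trade-offs.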

ps. Another excellent test would be compiling the source code of a large application, Firefox for example, because it has GBs of source code and binary objects generated while compiling.

First review (or rather, article) finished with the use of this program.
You can find it here.

I think it will be highly interesting to everyone, since I compare an HDD to a modern SSD and show how much the latter affects the performance of the whole system. The article is in Greek, but all the graphs are in English, so you will easily get the idea.

Some pics of the circuit I made to measure SSD power consumption. I used two high-side current-sense monitors and tested it today on an Agility 3; it seems to work fine. Now I will hook an oscilloscope or a LabJack up to it to capture high-frequency readings, since I noticed frequent changes in the readings with my Fluke.
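For what it's worth, turning the sense-monitor samples into power numbers is just Ohm's law plus the monitor's gain. The shunt value and gain below are made-up placeholders, not the actual circuit values:

```python
def current_from_sense(v_out, r_sense_ohm, gain):
    """Load current from a high-side sense monitor's output voltage:
    V_out = I * R_sense * gain  =>  I = V_out / (gain * R_sense)."""
    return v_out / (gain * r_sense_ohm)

def avg_power(v_supply, sense_samples, r_sense_ohm=0.01, gain=50):
    """Average power from a series of sampled sense voltages.
    The shunt value (10 mOhm) and gain (50) are placeholders."""
    currents = [current_from_sense(v, r_sense_ohm, gain) for v in sense_samples]
    return v_supply * sum(currents) / len(currents)
```

With fast scope or LabJack captures, averaging over many samples like this is what turns the jumpy instantaneous readings into a usable consumption figure.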

It is a ghetto mod I know, but this is just the testing board
Soldering those SOICs wasn't an easy task either, at least with a normal soldering iron...