Most of my interests, duties, and responsibilities surround infrastructure. Virtualization, storage, networking, and the architecture of it all. Ask me about the latest video card out there, and not only will I probably not know about it, I might not care. Eventually problems crop up on the desktop though, and the hands have to get dirty. This happened to me after some issues arose over recently purchased SSD drives.

The Development Team wanted SSDs to see if they could make their high end workstations compile code faster. Sounded reasonable to me, but the capacity and price point just hadn’t been there until recently. When it was decided to move forward on the experiment, my shortlist of recommended drives was very short. I specifically recommended the Crucial M4 line of SSDs. There are a number of great SSD drives out there, but the M4 has a good reputation, and also sits in my workstation, my laptop, and my NAS in my home lab. I was quite familiar with the performance numbers they were capable of.

It didn’t take long to learn that, through a series of gaffes, what was ultimately ordered, delivered, and installed on those Developer workstations were not the Crucial M4 SSD drives that have such a good reputation, but the Crucial V4 drives. The complaints were quite startling: code compile times increased significantly – in fact, more than doubling over their spinning disk counterparts. When there are cries to bring back the 7200 RPM SATA drives, there must be a problem. It was time to jump into the fray to see what was up. The first step was simply to verify that the disks were returning expected results.

The testing

I used the venerable Iometer for testing, and set it to the following conditions:

Single Worker, running for 5 minutes

80000000 Sectors

64 Outstanding I/Os

8KB block size

35% write / 65% read

60% random / 40% sequential

Each test was run three times, then averaged. Just like putting a car on a dyno to measure horsepower, the absolute numbers generated were not of tremendous interest to me; environmental conditions can affect them too much. I was looking at how these performance numbers related to each other.
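For readers without Iometer handy, the workload described above can be roughly approximated with a short script. This is only an illustrative sketch – the scratch-file name, file size, and runtime are my own assumptions, and the numbers it produces are in no way comparable to Iometer's results; it just shows the shape of the test (8 KB blocks, 35% write / 65% read, 60% random / 40% sequential, three runs averaged).

```python
import os
import random
import time

BLOCK = 8 * 1024               # 8 KB block size
FILE_SIZE = 8 * 1024 * 1024    # small scratch file, purely for illustration
RUNTIME = 1.0                  # seconds per run (Iometer ran for 5 minutes)

def run_once(path="scratch.bin"):
    # Pre-create the scratch file so reads always have data behind them.
    with open(path, "wb") as f:
        f.write(b"\0" * FILE_SIZE)
    blocks = FILE_SIZE // BLOCK
    payload = os.urandom(BLOCK)
    ops = 0
    pos = 0
    deadline = time.monotonic() + RUNTIME
    with open(path, "r+b") as f:
        while time.monotonic() < deadline:
            # 60% random / 40% sequential offset selection
            if random.random() < 0.60:
                pos = random.randrange(blocks)
            else:
                pos = (pos + 1) % blocks
            f.seek(pos * BLOCK)
            # 35% write / 65% read mix
            if random.random() < 0.35:
                f.write(payload)
            else:
                f.read(BLOCK)
            ops += 1
    os.remove(path)
    return ops / RUNTIME  # rough operations per second

# Run three times and average, as in the testing above.
avg_iops = sum(run_once() for _ in range(3)) / 3
print(f"average ops/sec over 3 runs: {avg_iops:.0f}")
```

Note this runs through the OS page cache with a single synchronous worker, so it says nothing about outstanding I/Os – a real tool like Iometer or fio is needed for that.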

For the sake of clarity, I’ve simplified the list of test systems to the following:

VM = A VM in a vSphere 5.0 cluster against an EqualLogic PS6100 array with 7200 RPM SATA drives

I also tested under different settings (block sizes, etc.), but the results were pretty consistent. Something was terribly wrong with the Crucial V4 SSDs. Or, they were just something terrible.

The results

Here are the results.

For the first two charts, the higher the number, the better.

For the next two charts, the lower the number, the better.

You might be thinking this is an unfair test because it compares different systems. This was done to show that it wasn’t just one system demonstrating the miserable performance of the V4. So, just to pacify curiosity, here are the results of the same tests on a single system that had the V4, then had it swapped out for the M4.

For the blue numbers, the higher the better. For the red numbers, the lower the better.

If one looks at the specs of the M4 and V4, there is nothing too unexpected. Sure, one is SATA II while the other is SATA III, but the results speak for themselves: this was not an issue of bus speed. The performance of the M4 drives was very good – exactly as expected. The performance of the V4 drives was terrible – far worse than anything I’ve seen out of an SSD. This isn’t a one-off “bad drive” situation either, as there is a bag full of them that all perform the same way. They’ve been tested in brand new workstations and in workstations a few years old – again, the same result across the board for all of them. Surely the V4 is not the only questionable SSD out there; I’m sure there are some pretty hideous ones lining store shelves everywhere. I’m sharing this experience to show the disparity between SSDs so that others can make smart purchasing decisions.
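As a quick sanity check on the bus-speed point: even SATA II's usable bandwidth dwarfs anything a 7200 RPM disk can sustain, so the link alone can't explain sub-spinning-disk results. A back-of-the-envelope calculation (SATA uses 8b/10b encoding, so 10 bits on the wire carry 8 data bits):

```python
def sata_usable_mb_per_s(line_rate_gbps):
    # 8b/10b encoding: only 8 of every 10 line bits are data
    data_bits_per_s = line_rate_gbps * 1_000_000_000 * 8 / 10
    return data_bits_per_s / 8 / 1_000_000  # bits -> bytes -> MB

print(sata_usable_mb_per_s(3.0))  # SATA II:  300.0 MB/s ceiling
print(sata_usable_mb_per_s(6.0))  # SATA III: 600.0 MB/s ceiling
```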

As for the comparison of code compile times, I’ll be saving that for another post.

Conclusion

It was interesting to see how quickly SSDs were dismissed internally before any real testing had been performed to verify they were actually working correctly. Speculation within the group even went so far as to suggest SSDs weren’t a serious option in the workplace. This false impression concerned me, as I knew how much flash is changing the enterprise storage industry. Sure, SSDs can be weird at times, but the jury is in: SSDs change the game for the better. Perhaps the disinterest in testing was simply because the subject wasn’t their area of focus, or they had other things to think about. Whatever the case, it was certainly a lesson for me in how quickly results can be misinterpreted.

So if anything, this experience tells me a few things about SSDs:

Check the specific model number, and make sure that it matches the model you desire.

Stick with makes and models that you know.

If you’ve never used a particular brand and model of SSD, test it first. I’m tempted to say, test it no matter what.

Stay far, far away from the “value” SSDs out there. It almost seems like solid state thievery. I can only imagine the number of folks who have underperforming drives like the V4 and wonder what all the fuss about SSDs is. At least with a bad spinning disk, you could tear it apart and make the world’s strongest refrigerator magnet. Bad SSDs aren’t even good as a paperweight.
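On the first takeaway – verifying the exact model number – the model string can even be checked programmatically before drives are deployed. Here is a minimal Linux-only sketch; the `/sys/block` path is real sysfs, but the helper name and overall approach are my own, and a tool like smartctl or vendor utilities would be more thorough:

```python
import os

def drive_models(sys_block="/sys/block"):
    """Return a dict of block device name -> model string, where available."""
    models = {}
    if not os.path.isdir(sys_block):
        return models  # non-Linux systems: nothing to report
    for dev in os.listdir(sys_block):
        model_path = os.path.join(sys_block, dev, "device", "model")
        try:
            with open(model_path) as f:
                models[dev] = f.read().strip()
        except OSError:
            pass  # virtual devices (loop, ram, etc.) have no model entry
    return models

for dev, model in drive_models().items():
    print(dev, model)  # confirm you got an "M4", not a "V4", before imaging
```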