A Backblaze study of hard drive failures shows that drives don't fail any more often when allowed to run hotter than their recommended operating temperature

InfoWorld|May 14, 2014

Do hard drives live longer if you keep them cooler than their recommended operating temperature? The short answer, according to a new study, is no.

Backblaze, the online backup provider that previously produced a report analyzing which brands of hard drives failed most often, has delved once again into the wealth of statistics generated by its pools of hard drives to determine how much of an effect lower operating temperature has on drive longevity.

Backblaze uses a custom-designed "Backblaze Storage Pod" for its drives, something it claims helps keep drives at consistent temperatures within the 15°C to 30°C range recommended by most manufacturers. Temperature variation within a given pod might well affect drive lifetime, so Backblaze examined 34,000 drives to see what correlation might exist. From what turned up, no correlation could be discerned -- at least not overall.

The key was to determine whether failure rates went up uniformly across drive makes and models, something Backblaze stated could be tough to determine. Different drives run at different temperatures by default; drives labeled as energy-efficient tend to run cooler.

Backblaze claims some individual models of drive did have higher failure rates when run outside of their comfort zone -- the Seagate Barracuda, for instance, and an older model of Hitachi drive. But when taking into account all 17 models of drive in its pool at once, Backblaze didn't find a statistically significant correlation. In other words, while individual models might be flaky when run a little hot, hard drives as a whole -- at least as tested -- aren't.
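Backblaze hasn't published the exact statistical procedure behind that conclusion, but the kind of test involved can be sketched simply. One common approach for relating a continuous variable (average drive temperature) to a binary outcome (failed or not) is the point-biserial correlation, checked for significance with a t-statistic. The sketch below uses synthetic data in which failure is independent of temperature, so it should find no significant correlation; the data, the 4% failure rate, and the choice of test are illustrative assumptions, not Backblaze's actual method or numbers.

```python
import math
import random

def point_biserial(temps, failed):
    """Correlation between a continuous variable (avg drive temperature)
    and a binary outcome (drive failed during the observation window)."""
    n = len(temps)
    mean_t = sum(temps) / n
    std_t = math.sqrt(sum((t - mean_t) ** 2 for t in temps) / n)
    n_fail = sum(failed)
    p = n_fail / n                      # overall failure rate
    q = 1 - p
    mean_fail = sum(t for t, f in zip(temps, failed) if f) / max(n_fail, 1)
    mean_ok = sum(t for t, f in zip(temps, failed) if not f) / max(n - n_fail, 1)
    return (mean_fail - mean_ok) * math.sqrt(p * q) / std_t

# Synthetic data: 10,000 drives at temperatures across the 15-30 C band,
# with a failure probability that does NOT depend on temperature.
random.seed(42)
temps = [random.uniform(15, 30) for _ in range(10_000)]
failed = [random.random() < 0.04 for _ in range(10_000)]  # ~4% failure rate

r = point_biserial(temps, failed)
n = len(temps)
# t-statistic for testing r != 0; roughly |t| < 1.96 means the correlation
# is not statistically significant at the 5% level.
t_stat = r * math.sqrt((n - 2) / (1 - r * r))
print(f"r = {r:.4f}, t = {t_stat:.2f}")
```

With failure drawn independently of temperature, r comes out near zero and the t-statistic falls inside the non-significance band, which is the shape of the "no correlation overall" finding described above. A per-model breakdown, as in Backblaze's report, would simply run the same comparison within each of the 17 models.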

Other studies about drive temperatures have drawn varying conclusions. Google didn't see a correlation, but Microsoft and the University of Virginia did. But one major difference with Backblaze's work is that, like its previous study, it names which drive makes and models were part of the pool.

The Microsoft study, conducted in 2012, isn't clear about which drives were being used, although there's one hint Seagate might have figured into the running: "Our measured failure rates also exceed the AFR rates that manufacturers mention in their datasheets [Seagate ES 2011]." The Google study, conducted between 2005 and 2006, also doesn't mention specific manufacturers -- "we do not show a breakdown of drives per manufacturer, model, or vintage due to the proprietary nature of these data" -- although it does say it covered "several models from many of the largest disk drive manufacturers and from at least nine different models."

Some have challenged Backblaze's methodology before. TweakTown criticized its previous drive reliability report, citing (among other things) how Backblaze sourced the drives, the type of enclosure used, and the company's motives overall. But Robin Harris, writing for ZDNet, defended Backblaze for "providing statistically significant information about their hard drive experience -- something millions of consumers are hungering for," and for echoing what he -- as someone who has worked in the storage industry -- claims to have heard on his own as well about which drives are worth the money.