Microsoft Security Essentials Tanks Another Antivirus Test

The latest test of consumer-side security products by the well-regarded Dennis Technology Labs evaluated seven commercial products along with Microsoft's free antivirus. The commercial products all turned in decent scores, while Microsoft came in vastly lower. In one test, Microsoft managed a rare below-zero score.

Microsoft Security Essentials is free, which is great, but its protection has been getting slammed in antivirus tests over the last few months. The vast majority of antivirus products manage to pass AV-Test certification; Microsoft did not. In November, and again in January, Microsoft failed certification. The Microsoft product team issued a rebuttal, arguing that the test in question didn't reflect the product's real-world protection. However, a new test just released by London-based Dennis Technology Labs puts Microsoft in last place, far behind all of its competition.

Where AV-Test and AV-Comparatives generally include twenty or more products in a test, Dennis Labs has focused on eight vendors in the consumer area: AVG, BitDefender, ESET, Kaspersky, McAfee, Microsoft, Norton, and Trend Micro. The commercial products all did well enough, some of them very well indeed.

Accuracy Ratings
The Dennis Labs accuracy test aims to measure a product's ability to "block all threats and allow all legitimate applications." Products gain points both for correctly blocking threats and for correctly leaving legitimate software alone; they also lose points for blocking legitimate software and for failing to identify malware. The best possible score is 400 points; the worst, -1000 points. With 388.5 points, Norton Internet Security (2013) came close to the maximum. All the rest earned at least 300 points, except Microsoft, which took a paltry 30 points.

Some products lost a significant number of points to false positives; Trend Micro, in particular, would have scored noticeably higher without those deductions. Not Microsoft. It earned its low score strictly through poor detection of threats, with no deduction at all for false positives.

Overall Protection
Dennis Labs researchers also rated each product on its ability to resist real-world malware attacks. A detailed point system "gives credit to products that deny malware any opportunity to tamper with the system and penalizes heavily those that fail to prevent an infection." The best protection, completely defending the system against attack, is worth three points. Letting the malware launch initially but then cleaning all hazardous traces is worth two points. Finally, if the security software managed to terminate a running malicious process without actually cleaning up traces, that gets one point.

As for the heavy penalties, those kick in when the malware totally gets past all defenses, or if the system is damaged after the security product's response. Every such failure reduces the overall score by five points. With 100 samples tested, the best possible score is 300, the worst, -500.
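The arithmetic behind that point system can be sketched in a few lines. This is a minimal illustration based only on the weights the article describes (+3, +2, +1, -5 over 100 samples); the function name and the assumption that every sample falls into exactly one of the four outcomes are mine, not Dennis Labs'.

```python
def protection_score(defended, neutralized, terminated, compromised):
    """Compute a Dennis Labs-style protection score over 100 samples.

    defended:    malware never got a foothold         (+3 each)
    neutralized: malware ran, but all traces cleaned  (+2 each)
    terminated:  malicious process killed, traces left (+1 each)
    compromised: infection succeeded or system damaged (-5 each)
    """
    # Assumption: each of the 100 samples lands in exactly one bucket.
    assert defended + neutralized + terminated + compromised == 100
    return 3 * defended + 2 * neutralized + 1 * terminated - 5 * compromised

# The bounds stated in the article fall out directly:
print(protection_score(100, 0, 0, 0))  # best case: 300
print(protection_score(0, 0, 0, 100))  # worst case: -500
```

It also shows why a sub-zero total is hard to achieve: every outright compromise costs five points, so a product must fail on a substantial fraction of the 100 samples before the penalties outweigh its successes.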

Norton topped this list too, with 289 points, and all the rest earned at least 200 points. All but Microsoft, that is. In a rare sub-zero result, Microsoft scored -70 points.

Dennis Labs also released a report on five Enterprise-level security products and five SMB products. Tested in the Enterprise group, Microsoft System Center Endpoint Protection fared even worse than the free consumer antivirus, with negative scores in both accuracy and protection.

McAfee displayed the second-lowest score in both of the consumer-side tests, but McAfee technology fared much worse in the other two tests. McAfee "VirusScan, HIPs and SiteAdvisor" earned very low scores in the Enterprise group, and McAfee Security-as-a-Service actually came in below zero for accuracy in the SMB test.

Conclusions
Simon Edwards, Technical Director of Dennis Technology Labs, observed, "It’s interesting to see how badly Microsoft does in the consumer and enterprise tests, particularly when noting that its products also fared poorly in the last AV-Test report. As you no doubt know Microsoft was dismissive of that test but my view is that if lots of different tests, from competing test houses that use different methodologies/approaches, reach similar conclusions then those conclusions start to be appear increasingly convincing."

I have to agree. In my own hands-on testing, Microsoft Security Essentials has never performed well. At the other end, Norton Internet Security is both a PCMag Editors' Choice and the only product to receive the top AAA rating in the Dennis Labs test. Kaspersky and ESET, both of which took AA ratings, also do well in my tests.

A post on Sophos's NakedSecurity blog praised the Dennis Technology Labs test, noting that "Conducting these types of tests is not easy or even straight-forward, but Dennis Technology showed that it is indeed possible." It seems to me that Microsoft should look at what the top-rated vendors are doing and try doing the same. Sure, it's good to emphasize "customer-focused processes," but it's even better to do that and pass the lab tests.

Neil Rubenking served as vice president and president of the San Francisco PC User Group for three years when the IBM PC was brand new. He was present at the formation of the Association of Shareware Professionals, and served on its board of directors. In 1986, PC Magazine brought Neil on board to handle the torrent of Turbo Pascal tips submitted by readers. By 1990, he had become PC Magazine's technical editor, and a coast-to-coast telecommuter. His "User to User" column supplied readers with tips...