There are almost as many anti-virus programs for the Mac as there are families of malware, and a constant question among Mac users is whether to use one and, if so, which one to use. Last November I began a project to test Mac anti-virus programs to see what malware they are capable of detecting. This document describes the second round of testing, in which I look at a total of 20 different anti-virus programs using somewhat different methods than those used in the first test.

Before getting into the test itself, it is important to point out its scope. This is not an attempt to compare anti-virus programs across the board. This test examines only one aspect of the engines being tested: what malware is detected by a manual scan. It did not attempt to measure how well any engine blocks an active infection attempt, and it contains absolutely no information about the feature sets, performance, or stability of any of the tested engines. Do not use this test as the sole metric for evaluating anti-virus software. Keep in mind that I would actively recommend against a few of the anti-virus programs that scored highly in this test!

Methods

In this test, a total of 128 samples were collected, covering 24 different malware families. Samples were organized into folders by malware family. Samples that came from VirusTotal had names consisting of the SHA1 “fingerprint” of the file. Samples that did not originally come from VirusTotal were uploaded to VirusTotal, then renamed to match the SHA1 name assigned by VirusTotal. This was done to simplify identification of which malware was detected. Any samples that consisted of archives (zip files, disk image files, etc.) were expanded/opened, and both the archive and its contents were placed in a sub-folder named after the SHA1 “fingerprint” of the archive.
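As a rough sketch of this organization step (the folder layout and helper names here are my own, for illustration only, not taken from the actual testing workflow), hashing a sample and filing it under its family might look like:

```python
import hashlib
from pathlib import Path

def sha1_name(path: Path) -> str:
    """SHA1 hex digest of a file, matching the VirusTotal-style name."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        # Read in chunks so large samples don't have to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def organize_sample(sample: Path, family: str, root: Path) -> Path:
    """Move a sample into a per-family folder, renamed to its SHA1."""
    dest_dir = root / family
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / sha1_name(sample)
    sample.rename(dest)
    return dest
```

Naming every file by its hash makes it trivial to cross-reference a scanner's log entries against the VirusTotal records for each sample.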

Attempts were made to ensure that all samples were valid. VirusTotal results are sometimes inconclusive, and files can be flagged as malware when they really are not, so a number of samples were rejected during the collection phase. Two items (components of the DiabloMiner app) were removed after testing, when it was shown that no anti-virus software detected them and it was determined that DiabloMiner is actually a legitimate program misused by DevilRobber.

Testing was done in a virtual machine in Parallels. A base Mac OS X 10.8.2 system was set up in a virtual machine, fully updated and with no third-party software installed. A snapshot was created of this system. Then, over the course of several days, 20 different anti-virus programs were obtained and installed in fresh copies of this virtual machine, ending with 20 different snapshots in Parallels, each containing this base system and one of the anti-virus programs to be tested. Once installation was complete, a single day was chosen to open each snapshot and update each anti-virus program, then save a new snapshot of the updated state. The final result was a set of identical systems, each with a fully up-to-date copy of one of the anti-virus programs as of that particular date.

Once that was done, network access was shut off so that testing could proceed over multiple days without the definitions (and thus the results) changing. Each system was run in Parallels, and the folder containing the malware was copied onto the desktop of the test system. (If necessary, any active or on-access scanning was disabled to allow this to be done unimpeded.) Then a manual scan of that malware folder was performed. Most anti-virus software allowed a specific folder to be selected for manual scanning, but some required scanning the entire user folder or even the entire virtual hard drive. In any case, the only malware on the system was in the malware folder, so the results were equivalent.

After testing, the results were tabulated. This was a difficult process in some cases, as many anti-virus programs provide no options for saving scan results. (Some provide command-line tools that can be used for scanning, but only the GUI scanner was used. That is what the average user would be using, and using some command-line tools may lead to claims of differences in scanning between the command-line and GUI versions.) In the case of malware samples consisting of multiple files, the malware was considered to have been detected if any single item in the folder containing the sample’s files was identified.
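The folder-level rule described above (a sample counts as detected if any of its files is flagged) is easy to express. Here is a hypothetical sketch of that tabulation logic; the data structures are invented for illustration:

```python
def tabulate(samples: dict, flagged: set) -> dict:
    """Map each sample (keyed by its SHA1 folder name) to True if the
    scanner flagged ANY of the files belonging to that sample."""
    return {sha1: any(path in flagged for path in paths)
            for sha1, paths in samples.items()}

# A multi-file sample is detected even if only one component is flagged.
samples = {
    "1a2b": ["1a2b/archive.zip", "1a2b/payload"],
    "3c4d": ["3c4d/app"],
}
results = tabulate(samples, flagged={"1a2b/payload"})
```

This "any file" rule is deliberately generous to the scanners: an engine that catches only the payload inside an archive still gets credit for the whole sample.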

Data

The complete data can be downloaded as either a Numbers spreadsheet or a PDF file. (An Excel file was not provided because exporting to Excel dropped some of the conditional formatting rules that make the data more readable.) Detection rates (defined as the percentage of samples that were detected) varied widely, from 98% down to 6%. Half of all anti-virus engines tested performed at 93% or better, and almost 3/4 of the engines earned a “passing grade” (79% and up). Six performed at 66% or lower.
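The detection-rate figures are straightforward percentages. The counts below are hypothetical, chosen only to illustrate the arithmetic, under the assumption that the two dropped DiabloMiner items left 126 scored samples:

```python
import statistics

def detection_rate(detected: int, total: int) -> float:
    """Detection rate as the percentage of samples detected."""
    return 100.0 * detected / total

# Hypothetical per-engine detection counts out of 126 scored samples.
counts = [124, 122, 120, 118, 117, 110, 100, 83, 8]
rates = [detection_rate(c, 126) for c in counts]
median_rate = statistics.median(rates)  # the "half performed at X% or better" figure
```

Under these made-up counts, 124/126 rounds to 98% and 8/126 rounds to 6%, matching the shape of the reported range.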

There were 59 samples of what would be considered active malware, omitting malware that is “extinct.” Among those samples alone, detection rates ran the full gamut, from 100% down to 0%. The same ten engines that topped the overall results once again performed at 93% or better on active malware, and a full 3/4 of the engines performed at 73% or better. Only five fell below 60%, with one setting the record by not detecting any active malware at all. For the most part, each engine’s detection rate for all malware was very close to its rate for active malware, although differences as high as 15% were seen.

Among the samples of malware, detection rates varied from being detected by all 20 engines down to only being detected by 7 engines. On average, samples were detected by about 15 engines.

Conclusions

Although it is important to keep in mind that this is only one measure of the quality of each of the tested anti-virus engines, it is not an unimportant one. Obviously, although it is not feasible for any anti-virus software to detect 100% of all malware, a good engine should be capable of coming as close to that number as possible. This is especially true in the Mac world, where the limited number of malware families means that detection rates of very close to 100% should be possible. As expected, some engines did indeed perform to that standard.

Other engines did not fare so well. However, it is important to keep in mind that Mac OS X already does an admirable job of protecting against malware. At this time, there is no known malware capable of infecting a Mac running a properly-updated version of Mac OS X 10.6 or later, with all security settings left at the default (at a minimum). The role of anti-virus software must be taken into consideration, and some compromises in detection rate may be desirable to get desired behavior (or avoid bad behavior). Someone who wants a low-impact engine for scanning e-mail messages for Windows viruses will have very different needs than someone who needs to protect a computer from an irresponsible teenager who will download and install anything that catches his/her attention.

When choosing anti-virus software, always take the full set of features into account, as well as seeking out community feedback regarding stability and performance. Be sure that you know how to uninstall the software before installing it, in case it causes problems and needs to be removed.

For more on the topic of protecting your Mac against malware, see my Mac Malware Guide.

Notes

Why change the methods?

In my first round of testing, 51 samples were tested against 16 engines. That sample size was really too small, though it is difficult to find a large number of samples of Mac malware, since there are so few malware families for the Mac. There were also a few other problems with that sample set, including one Windows .exe file that was erroneously included as Mac malware (though it should still have been detected as Windows malware) and a few minor disagreements about whether certain items should have been included.

One primary goal of my second round of testing was to not only scan a larger set of samples, but to more carefully screen each sample to ensure that it was appropriate for inclusion. Although there will still probably be some discussion of whether certain items are appropriate or not, this set is overall much higher-quality than the previous one.

Another problem some people had with the original test was that some samples were archives of varying kinds (mostly zip files). Not all anti-virus engines are capable of looking inside archives, and of those that are capable, not all will do so by default. For this reason, I chose to expand any such archives and include both the archive and the contents in the sample set.

One of the biggest issues had to do with the way the testing was done. I originally did all the testing in a one-day period, while my computer was booted into a test system on an external hard drive. The testing environment was therefore destroyed when the testing was completed, leaving no way to settle questions about which engine had been used or to supply other unrecorded information. In the second round of testing, I changed my approach to prevent this: I used a series of snapshots in a Parallels virtual machine. By cutting off network access and opening a specific snapshot, I could repeat testing under the same conditions and gather additional information that might be requested in the future.

Anti-virus software tested

The following anti-virus programs were tested:

Anti-virus engine tested - Distribution

avast! Free Antivirus 7.0 (37781) - free
Avira Mac Security 1.0.0.64 - free
BitDefender 2.21 (2.21.4959) - free (Mac App Store)
ClamXav 2.3.4 (271) - free
Comodo Antivirus 1.1.214829.106 - free
Dr. Web Light 6.0.6 (201207050) - free (Mac App Store)
ESET Cybersecurity 4.1.86.4 - time-limited trial
F-Secure Anti-virus for Mac 0.1.11361 - time-limited trial
iAntivirus 1.1.2 (280) - free (Mac App Store)
Kaspersky Security 13.0.2.458 - time-limited trial
MacKeeper 2012 2.2 (2.2) - registered copy
MacScan 2.9.4 - time-limited trial
McAfee All Access Internet Security 2.0.0.0 (1233) - time-limited trial
Norton Anti-Virus 12.4 (73) - time-limited trial
ProtectMac 1.3.1 - time-limited trial
Sophos Anti-Virus for Mac 8.0.10C - free
Trend Micro Titanium 2.0.1279 - time-limited trial
VirusBarrier 10.7.1 (448) - time-limited trial
VirusBarrier Express 1.1.6 (79) - free (Mac App Store)
WebRoot SecureAnywhere 8.0.2.103 - time-limited trial

Objections

There are a few objections that some may have with this test, so allow me to address them in advance.

First, some will object that this is a rather artificial test, and not a real-world one. Although it would obviously be better to test by trying to infect a system with a variety of malware and determining whether each anti-virus software would block the infection, this is impractical. Not only would it be exceedingly time consuming with only a few samples, but it would be fairly meaningless as well, since Mac OS X is currently able to block all known malware through a variety of methods. Testing with static samples may be less informative, but it does give valuable information about the completeness of each engine’s virus definitions database.

The sample size may also be inadequate for reasonable testing. 128 samples is far better than the 51 from my previous test, but it’s still a bit low. Of course, so is the number of malware families for the Mac. By my count, only 35 different malware families have ever been capable of affecting Mac OS X, and given such scarcity, samples are bound to be hard to come by for someone not affiliated with an anti-virus company. In my opinion, the samples used are a pretty good selection of malware, but of course, the set could be improved on in the future.

Finally, some may object to the fact that more than half of the samples are what would be considered “extinct” malware, since such samples are no longer a real threat to anyone. However, information about what malware has been detected historically by an anti-virus engine is important for predicting future accuracy. In fact, looking at the data, it is clear that there is a correlation between overall detection rate and detection rate for active malware only. There’s also the fact that some people may be looking for anti-virus software for old, legacy systems that may have malware infections from years past still in place. Of course, separating out the active malware only does have its uses, such as identifying which programs are improving and which are falling behind, which is why I included a summary of those numbers in the data as well as the overall statistics.

Special cases

There were a few special cases in various aspects of the testing.

iAntivirus apparently does not feature any kind of mechanism for updating its definitions. (This is confirmed by a Symantec employee in the Norton forums.) This means that, although I was using the latest version of iAntivirus, its definitions were more than two months old. (Which would explain why it did so much worse against recent malware!)

The MacKeeper trial version refused to update the virus definitions unless it had actually been registered. Fortunately, I had been given a serial number by Zeobit recently, so I went ahead and registered so that I could update the definitions. This was the only commercial product that was not used in its time-limited trial mode.

F-Secure evidently has a bit of a problem with its GUI when running in Parallels. It frankly does not work at all. Fortunately, F-Secure tech support was able to give me a work-around that allowed me to test it anyway, by enabling screen sharing in the virtual machine and then connecting from the “real” system on my Mac and controlling the software from there. A little weird, but it worked. Note that this is specific to running F-Secure in Parallels, and is not an issue when installed conventionally.

Updates

There were a couple of minor transcription errors (malware that was marked as not detected when it actually was) that were brought to my attention and have now been fixed in the data. I will be reviewing the data further to make sure there aren’t any other mistakes. Such things are bound to happen when combing through thousands of data points, many of which had to be collected through screenshots, but my apologies to everyone for the errors!

41 Comments

Thomas, thanks for all the work!!! You included many that others had mentioned in your previous test. Like myself, there are probably many who use both Macs and Windows…and files from both platforms. Your work is appreciated!!!

Many did substantially better this time. There are many theories that could be discussed for why that is, but I personally think that the larger sample size made for a more fair test, so that missing a few items didn’t hurt quite so badly.

Beyond the minimal use required for testing, I have no first-hand experiences with Kaspersky, or most of the other programs I tested. I’d rather not pass on any second-hand information, one way or the other, here.

Sophos is right up there in the top three, and the differences between them are likely just chance differences caused by a still somewhat small sample size. I like Sophos because it is lightweight, with good detection rates, and in my testing has never caused any problems. However, I have not tested most of the other anti-virus programs as much as Sophos, so that should not be taken as any kind of comparison with them.

I tried Sophos and it screwed up my system so badly I had to reinstall completely. So I won’t be messing with Sophos anymore. Currently I have no AV installed, but I am thinking of installing Avast. I used Avast on my Windows laptop and loved it, though the annoying popup and spoken confirmation that definitions had been updated was quite bothersome. I wonder if it does the same thing on the Mac? Has anyone here used Avast without any complications?

Thomas – I’m a little confused now… if a user visits the Apple forums and insists on using an av package, which should we recommend? The ‘old standbys’ were ClamXav and Sophos (and I actually installed and used Sophos for over a month so that I _could_ reliably recommend it) but it seems that while Sophos did pretty well in ’round two’ ClamXav didn’t fare so well (again). So what should we recommend now? Thanks, Clinton

Well, I don’t want to get too much into recommendations here. (I’m trying to stay impartial, and stick to “just the facts” in this particular post, though of course others are welcome to say what they like, within reason.) However, I do make some recommendations in my Mac Malware Guide.

Although avast! certainly has a high detection rate on its side, keep in mind that there are many other factors that should determine which anti-virus software you use. Do not rely solely on testing like this to make the choice. Evaluate the features that you feel you need that each program offers, and make sure that you educate yourself as to what the risks actually are before installing anything. (See my Mac Malware Guide for that kind of information.)

It would be useful to add the built-in XProtect anti-malware into this list. To properly test XProtect:

1) First, ensure the malware is quarantined by downloading it via a web browser or mailing it to yourself.

2) Open the malware as a user would – that is, double-click any zip files, and double-click the malicious app (as opposed to internal components of those apps, that other AV may detect but would not be the initial infection vector for a user).

Testing XProtect alongside these other apps would be unfair and misleading. It only detects a fraction of the malware that they do. Apple uses a multi-layered approach to protect you from malware, and XProtect only covers some of it. It will not protect against something like Crisis, for example, because Crisis sneaks in through Java vulnerabilities. XProtect cannot catch that sort of thing, so there is no definition for Crisis in XProtect. Apple protects against that by forcing users to keep Java updated and preventing them from running insecure Java web plug-ins.

I do what everyone says not to do, with good luck doing it. I play around on the dark web, and I use two anti-malware programs running in real time. I have been doing this for more than two years with no problems. (I have daily rotating clones if anything ever happens.)

I run Intego and Sophos together and put each in the other’s trusted or no-scan area. Intego and Avast are a little flaky together, so I found a good match.

Avast is far superior to Sophos in catching PC malware sites. They both have high numbers of PC malware signatures. I test Avast with my Intego every 3 months to see if their coding has changed a bit so the two could coexist together.

I don’t like the new Intego 2013 software, so when Virusbarrier X6 is dropped I will no longer be using Intego unless they change their 2013 simplicity. No need to pay money for AV when others are just as good and just as simplistic. Intego 2013 “is another also ran.”

4 days without an update of Intego virus definitions??? Hi again, I wrote to you earlier today but I can’t find my message again. I purchased the Intego Premium Bundle 2013 on Jan 30. Since then they have updated their virus definitions only once. Thanks a lot for this very interesting blog/website of yours. My question is: How often are virus definitions supposed to be updated in an AV program? It is known that new viruses appear every day, so how trustworthy can an AV program such as Intego, which has published ONE virus definitions update in 4 days, be?? I wrote twice to Intego, both on Facebook and to their support email, asking about that, but have not received a reply or explanation yet. What do you think? I feel less and less safe with Intego. I was infected 3 times earlier in January (before I had Intego) with both Trojan and Exploit viruses, so I am really afraid now, especially when I see that I have purchased an AV program that publishes so few virus updates. I have a brand new MacBook Air, by the way. Thanks very much in advance for your comments.

Mac malware variants don’t come daily. Be very well assured that Intego is on the ball; you don’t have to worry one bit that they are not updating. The other Mac AV programs that are updating a couple of times a day are updating PC signatures. While Intego has PC sigs, they only have some, and by no means the amount Sophos for Mac has (“Protects against 4461796 threats”; 796 of that number are the Mac sigs, and 796 is just an example guess). The rest are PC sigs.

While I don’t care for the new Intego 2013 graphic interface, which oversimplifies what was the best-designed Mac anti-malware program ever made, Intego VirusBarrier X6 is what all Mac AV should be judged by, and it is very sad to see it go. Intego is in the top 3, so you should have no worries.

I’m a new mac user and bought the Kaspersky Universal because of the multi device protection. Was a little disappointed by its performance in your test.
I guess my only complaint, other than it being a little resource-heavy, is that it won’t scan some password-protected files on a different user account without my constantly entering the password. I usually run it from an administrator account. After spending so much money, I guess I’ll stick with it until it causes more problems than it solves. But avast! is beginning to look pretty good.

Thanks! I was expecting the second round….
Interesting how the antivirus programs have updated since your first tests…
I just don’t understand one thing: if the VirusTotal database is available, why don’t anti-virus vendors also update their engines based on the VirusTotal database?

In my personal point of view, MacKeeper should be excluded from this comparison test. That program and that company should be completely ignored. I hate their campaign against ClamXav, how they keep sending popups in my browser, and how they completely f***ed my sister’s MacBook to the point that she had to format it, do a clean OS X install, and even use my Mac to recover some of her data via TechTool Pro (somehow the program managed to screw up the HD directory).

That’s what happens when you get a 14-year-old with no experience and a very badly programmed program!

Strange that “VirusBarrier” and “VirusBarrier Express” do not have the same database…
Thanks again!

I’ve been through most of the titles here. Avast would be my choice except for one major problem: it consistently eats 80-100% of one processor almost all the time, whether idling, opening files, downloading, scanning, watching videos, whatever. It makes watching hi-res video, even from files resident on the computer, almost impossible. Every animation becomes jerky. I’m using an iMac with a 2.4 GHz Core Duo and 4 GB of RAM, and while it’s not an up-to-date speed demon, it shouldn’t slow things this much.

I moved to Sophos, and it’s better on the interference. However, I use a few Java apps, and they take bloody forever to start, even if they’re in the exceptions list. I guess it insists on scanning every .jar used or java file referenced.

Keep in mind two things: first, no anti-virus software will be able to scan any kind of encrypted (ie, password protected) files. As for multipart archives, I haven’t ever actually tested it, but I would suspect that those would also give at least some anti-virus software trouble.

Second, any anti-virus apps from the Mac App Store will be unable to scan many locations. All App Store apps are very strictly sandboxed, which means they only have access to a very limited portion of the file system, and can only access a slightly larger portion of the file system with the user’s explicit permission (ie, by giving access to specific files/folders through an Open dialog).

Sophos can scan every single compressed archive, including multi-part files, successfully! Its on-access feature sometimes takes a while to open very large files (large = with a lot of files, not size), but it even alerts me if there’s a virus (mostly Windows viruses)…

I came back here to say this: when I choose to manually scan a file with Dr. Web Light, if I click stop scanning before it finishes, Finder suddenly crashes and all (all = all) my processes freeze… Then (after minutes), when I can open Activity Monitor (from the Dock, because it hides automatically and Finder stops responding), I can close Dr. Web, but there are actually a lot of leftover Dr. Web processes (roughly between 3 and 7), and some of them use 50% to 70% of the CPU. When I close all of them after closing the main Dr. Web process, everything comes back to normal INSTANTLY!

I do not recommend Dr. Web (personally, since this may only happen to me). I had thought that by giving my password and granting permission, the antivirus (from the App Store) could actually work as a “normal” application and the only downside would be the manual scan…

Anyway, I thought this info should be useful to someone and I get back to Sophos…

I’ve always used and trusted ClamXav. It’s passive right after installation and lets you scan on demand if that’s all you really want to do. I’m hoping that after ClamAV updates some definitions, you will find it ranks higher on your list. I’ve always liked that the ClamXav application uses the ClamAV scan engine.

As somebody getting my first Mac this week after 25 years of PCs (I used Kaspersky for some time), and who will need to continue to interact with files from PCs, can somebody offer some guidance on the following:
(1) best options for ensuring email scanning within Apple Mail
(2) Word macro viruses
(3) if I am running a Windows 7 machine in Coherence mode in Parallels, do I also have to have SEPARATE antivirus/malware protection running on the Windows machine, and if so, will running something like Sophos on the Mac and Kaspersky on the Windows Parallels machine interfere with each other?

With respect to your comment about AV apps from the Mac App Store being limited in what they can scan, I wonder how the BitDefender you can purchase directly from their website (v2.3) does? I wonder if that would be something to consider in your next test.

Also, I don’t quite understand how VirusBarrier Express can do so well considering it’s from the Mac App Store too. Perhaps the database or scanning engine is just better?

The malware detected is unrelated to what folders the app is capable of scanning. App Store AV software will probably use the same definition database as any non-App Store counterpart, but cannot scan all parts of the hard drive.

Thanks for the article, great job. Avast uses a “Web Shield”. I’m wondering if it’s safe to use that shield. What about privacy? Is the traffic between the Mac and the Avast proxy encrypted?
Do you have to register the AV software? (privacy?)

Though, it doesn’t make sense to me how any company could just give away a program, and then also put time and resources into continually offering malware updates (all at no charge, mind you) for which other companies get paid over $50.00 per year. Call me cynical, but what’s in it for them?

I am not very experienced or knowledgeable about this subject; however, I currently use Bitdefender and MacScan, and I have observed that neither has yet found any virus on my Mac. The only difference between the two is that MacScan finds TRACKING COOKIES that I can isolate.

I have these questions, and I hope you will be able to answer my concerns:
1) The tracking cookies may not be viruses, but isn’t this something worth removing?
2) Are tracking cookies a real concern? Would it be safe to keep them?
3) Would my Mac be able to remove these cookies by itself or by any other means?

Tracking cookies are not a threat. Cookies are simply very small bits of data that web sites are allowed to store on your computer. Cookies stored by a particular site are available only to that site. Tracking cookies are just cookies used by a site to track your visits, or used by an ad network to track your behavior with respect to its ads (shown on other sites). They cannot harm you in any way. Worst case, an ad network could track which of the sites hosting its ads you had visited.

When you say you tested for viruses, did this include spyware such as keystroke loggers? Some of my bank alias login data was stolen half a year ago, and I’m convinced it was via some kind of spyware (since it was login alias data only). I installed the trial of MacScan, cleaned out a bunch of stuff, and that seemed to do it, but now I want to install a permanent solution, something with fewer negative comments. So, do your test results include possible spyware, or is that term something that these companies have just made up?

Spyware is simply a name for malware whose purpose is to steal your personal information. Most malware would fall into that category these days. As for keyloggers, if they are known to be used for malicious purposes only, they should be detected by decent anti-virus software. If they are legit commercial keyloggers, they may or may not be detected as “potentially unwanted applications” by some anti-virus software.

So I think I’ve read this correctly, and I don’t think I noticed it… did you do any research on the system drag or ‘heaviness’ the antivirus added to the system? I was running Kaspersky but it made my older Macbook beachball constantly. I think I’d be willing to sacrifice some security for performance.

Nope, this test only looked at what malware samples each anti-virus program was able to detect. There are many other important considerations involved in choosing anti-virus software that were not tested here, such as false positive rate, stability and impact on the system’s performance.


This page and all contents (unless otherwise noted) copyright 2011-2014 by Thomas Reed.
For questions or comments, please contact me.