My problem is that I have tried very hard to understand why the AV companies are so upset over this; and I cannot. I asked Luis Corrons, Technical Director of PandaLabs, to explain it to me; and he very kindly agreed (and I am particularly grateful since he is tackling the explanation in a foreign language). I have to say that I am still personally not convinced – but this is how the conversation went.

Luis Corrons, technical director, PandaLabs

It would appear that Cyveillance have only tested static signature detection capabilities. This may be because they lack the time or money required to perform detailed testing such as that carried out by companies like AV-Test.org.

Static signature detection was the very first technology implemented in an antivirus, and although it is good at detecting known malware, it is clearly not enough on its own. Panda Security (as well as many other vendors) has recognised this for years, which is why most of the major vendors have been developing proactive technologies such as behaviour analysis/blocking, cloud-based detection, etc. Had Cyveillance performed tests that looked at these other technologies alongside static signature detection, I believe the detection results would have been decidedly different, and the media attention would no doubt have diminished.
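To make the distinction concrete, here is a deliberately toy sketch in Python – hypothetical names throughout, and nothing like any vendor's real engine – of why a static signature can only catch malware that has already been analysed, while a behavioural layer can still catch an unseen variant:

```python
import hashlib

# Toy static signature detection: a lookup of the file's hash against a
# database of hashes of previously analysed malware samples.
KNOWN_MALWARE_HASHES = {hashlib.sha256(b"known-virus-body").hexdigest()}

def static_signature_scan(file_bytes: bytes) -> bool:
    """Detects only samples whose exact hash has been seen before."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_MALWARE_HASHES

# A toy proactive layer ignores what the file *is* and watches what it *does*.
SUSPICIOUS_ACTIONS = {"disable_antivirus", "overwrite_boot_sector"}

def behaviour_blocker(observed_actions: list[str]) -> bool:
    """Flags a process whose runtime behaviour looks malicious."""
    return any(action in SUSPICIOUS_ACTIONS for action in observed_actions)

# An unseen variant (one byte changed) slips past the signature...
variant = b"known-virus-body!"
print(static_signature_scan(variant))  # False: the new variant is missed
# ...but can still be caught when it misbehaves at runtime.
print(behaviour_blocker(["read_file", "overwrite_boot_sector"]))  # True
```

A test that only exercises `static_signature_scan` would report the product as blind to the variant, even though the behavioural layer would have stopped it – which is essentially Corrons' objection to the methodology.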

Kevin Townsend
The bit that I’ve never really understood about the AV industry’s concerns about this report is this: if Cyveillance took brand new copies of the AV packages and installed them on clean PCs, and then chucked the virus samples at them, how is this a false test?

Granted, it does not say that this package is better than that one – but that wasn’t the purpose. The purpose was to say, OK, AV #1 took n days to learn about this new virus while AV #2 took n+4 days to learn about the same virus. I simply cannot see where this is wrong.

Luis Corrons
Imagine you are writing about cars: “The airbag is essential – it’s just not as good as they tell us”, because a company doing business in the same sector ran a test with crash-test dummies and showed that the airbag alone saved lives only 15% of the time. I can understand why car makers would be upset by that: they would say that the safety of the person in the car should be measured globally, using all the safety measures the car has, as happens in real life. Using the airbag together with the seat-belt would increase that percentage, and the same goes for the ABS-EDS, etc.

The same is happening with antivirus. Cyveillance says that proactive technologies are really important, and that relying on signatures alone is a mistake. And they are 100% right. I’ve been saying the same for the last 7 years: signature detection techniques are great, but they are more than 20 years old now and they have limitations. Antivirus vendors know that, and that’s why most antivirus solutions include a number of proactive protection layers.

I understand that testing a security product under real-life conditions is really hard; not many people are able to do it, and it takes a lot of time and resources. So why wouldn’t Cyveillance do a real-life test? Because it is more expensive, and the results wouldn’t be that bad – which wouldn’t really benefit Cyveillance, since they want to “show” how bad antivirus solutions are, whether or not that’s true.

Kevin Townsend
That doesn’t really wash. You are saying that only the airbag manufacturer can test whether it works, because only the airbag manufacturer understands how airbags work and how difficult testing them in genuine crash conditions really is. But in this analogy, the problem is that the airbag manufacturer is saying that the airbags will protect you 100% of the time (that’s my translation of what the VB100 award says), when in real life people are still getting injured despite the airbags. Similarly, people with AV products are still getting infected, and VirusTotal clearly shows that AV products DO NOT work against 100% of viruses. Airbags (like AV) are essential; but they are not as good as they tell us – not if they claim to guarantee 100% protection during a crash when I can demonstrate this is not the case.

Luis Corrons
I’m afraid I did not explain myself properly. In my analogy I was talking about car manufacturers, not airbag manufacturers. Users don’t buy airbags; they buy cars. Car manufacturers go to third-party companies to obtain safety certifications. And then a company that claims it can improve safety by building better roads says that airbags are not good enough. That company should be talking about the car’s safety as a whole, not just airbags; and even though the results would look much better from the car manufacturer’s point of view, they would still show that better roads are needed.

Regarding VB100, I think it is a test that measures some of the features in the products, much as Cyveillance is doing. I think the best test is one that really reproduces what happens in real life: the threat is sent to the user, the user receives it, it tries to enter / install itself on the system, and so on. In these kinds of tests it is almost impossible for any vendor to obtain a 100% result, but they show what happens in real life. No more, no less.