Antivirus Comparisons and Tests. Great info!

I got this from another forum; you can download the PDF file, and it's actually a great read.

In this month's Computer Shopper, the leading antivirus products were put to the test, with frightening results. It makes interesting reading, but to cut a long story short, the best are Steganos Anti-Virus 2007, Kaspersky Lab Anti-Virus 6, and AVG Anti-Virus (because it's free).

I just got done reading it all. NOD and KAV were the best as usual, but Avast was last. Can anyone tell me if these tests are any good? I wish I knew how to post the pictures instead of this PDF file that has the pictures and story.

The sample size of the test is very small. I would place more weight on the AV-Comparatives tests. There is one coming out at the beginning of March.
All the AVs tested by AV-Comparatives are good products. Pick one that suits your personal preference and spend time learning all about it.

Do not base any judgements on this phony test. It is the biggest load of BS I have ever seen. Getting NOD is a good call, though: it is very light on resources and has a strong detection rate and strong heuristics.

I admit that I believe that NOD is clearly superior, but I would not ditch Avast if I had been happy with it before this "test." In my opinion Avast gives adequate protection for anyone not a "dangerous surfer." I have used it at various times without problems.

The test set is by far too small to base any judgement on. I think both Andreas Marx (AV-Test) and IBK (AV-Comparatives) use a lot more than 1,000,000 files by now.

Heck, the "quick" reference test set I use to check my heuristic/generic detection rate is about 120,000 samples.
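To put some rough numbers on why a tiny test set is meaningless: the uncertainty around a measured detection rate shrinks only with the square root of the sample count, so a magazine-sized test says very little. Here is a quick sketch using a simple normal-approximation confidence interval (the sample counts and detection figures are made up purely for illustration):

```python
import math

def detection_ci(detected, total, z=1.96):
    """Approximate 95% confidence interval for a detection rate,
    using the normal approximation to the binomial distribution."""
    p = detected / total
    margin = z * math.sqrt(p * (1 - p) / total)
    return max(0.0, p - margin), min(1.0, p + margin)

# Same measured 80% detection rate, very different certainty:
lo1, hi1 = detection_ci(24, 30)          # magazine-style test, 30 samples
lo2, hi2 = detection_ci(80000, 100000)   # large-scale test, 100,000 samples

print(f"30 samples:      80%, interval width {100 * (hi1 - lo1):.1f} points")
print(f"100,000 samples: 80%, interval width {100 * (hi2 - lo2):.2f} points")
```

With 30 samples the interval spans roughly 28 percentage points, so two products could differ by a whole product tier and still be statistically indistinguishable; with 100,000 samples the interval is a fraction of a point.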


I found a comment in relation to this by the tester(s):

Testing anti-virus software is a controversial subject. Some anti-virus companies have complained that, by handling live viruses, we are being 'unethical'. Some don't believe us when we say their products do not detect certain files and demand proof, which we are happy to provide. Others point out that the number of viruses used by computer magazines is so small as to generate meaningless results. Many magazines don't even bother testing with real viruses and instead rely on the veracity of the certificates awarded to products by the security industry.

There are many different ways to test anti-virus software and we believe that, at the very least, our method demonstrates that it is disingenuous for anti-virus companies to claim that their products provide 100 per cent protection.

More than the number of samples, I doubt the quality of the samples they've used...


Perhaps more significant is the age of the samples. If someone runs a test using the most recent malware discovered, then there is a greater chance that the AV software does not have a signature for it. In such a case, the test is more biased towards how quickly AV companies detect and add new malware (plus the scanner's heuristic abilities) than the more established tests with larger samples would be.

Given the increasing attempts by malware writers to avoid detection (both by morphing their code in various ways and also targeting it at smaller groups to reduce the chance of AV companies picking it up), such attributes are likely to become more important in future, but should really be tested for separately.


I rather agree with this hypothesis.

For example, avast! is well known among its users for being very slow to add new malware samples, so the fact that avast! didn't perform well in such a test should tell you something...