
AV: low detection rates.

The vendors whine (as they always do) but not very convincingly in some cases:

Randy Abrams, director of technical education at ESET, said that the report "is a textbook example of how to do anti-malware testing fundamentally wrong". He said that a sample set of 1,708 unconfirmed malicious files were used, and as ESET sees around 200,000 unique new samples each day, 1,708 is not a statistically significant sample set.

A lot of what he sees will be the same malware: variations on a theme or obfuscations of older malware. There is no way they are all "unique" or "new".

I believe that when you are talking about zero-day or obfuscated malware, 1,708 is a statistically significant sample. However, if they had really confirmed that these items were malicious, then all should have remained in the test, detected or not.

Basically the test didn't demonstrate anything we didn't already know: signature-based detection is going to struggle if it doesn't have a signature.

Also, isn't it as much about prevention as detection? A lot of vendors actually sell security suites with sandboxes, behavioural analysis and so on, which may deny access to the main system or kill processes that are about to do something malicious.

It is not entirely clear from the report, but it looks as if they just loaded the files and checked whether the scanners flagged anything suspicious.

A more robust test would be to actually open/execute them and see if they were allowed to run.

EDIT:

The article appears flawed, as ESET (NOD32) seems to have had the best initial detection rate, at 37%.

People are going to have to move to whitelisting. I think we might be at the point where it is easier to decide what is OK to run than to try to include a signature for every malicious piece of software.
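The core of whitelisting is just that inversion: instead of checking files against a list of known-bad signatures, you allow execution only for files on a known-good list. A minimal sketch of hash-based allowlisting (the hard-coded hash set is a hypothetical stand-in for a centrally managed policy):

```python
import hashlib
from pathlib import Path

# Hypothetical allowlist of SHA-256 digests for approved executables.
# In a real deployment this would come from managed policy, not a literal set;
# the digest below is the SHA-256 of an empty file, used purely for illustration.
APPROVED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_allowed(path: Path) -> bool:
    """Default-deny: allow only files whose hash is on the approved list."""
    return sha256_of(path) in APPROVED_HASHES
```

The appeal is exactly what the poster describes: anything not on the list is blocked by default, so no signature race is needed. The cost is maintaining the list every time legitimate software updates.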

I have GPOs that prevent unauthorized executables on student machines. I have also used SRPs to prevent executables from running out of the temp folders on staff machines.
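The matching logic behind an SRP path rule like that is simple: a "Disallowed" rule on a directory covers everything beneath it. SRPs themselves are configured through Group Policy, not code, but the evaluation can be sketched as follows (the deny paths and file names here are hypothetical examples):

```python
from pathlib import PureWindowsPath

# Hypothetical deny rules mirroring SRP "Disallowed" path rules on temp folders.
DENY_DIRS = [
    PureWindowsPath(r"C:\Users\someuser\AppData\Local\Temp"),
    PureWindowsPath(r"C:\Windows\Temp"),
]

def execution_allowed(exe_path: str) -> bool:
    """Return False if the executable sits anywhere under a denied tree."""
    p = PureWindowsPath(exe_path)
    for deny in DENY_DIRS:
        try:
            p.relative_to(deny)  # succeeds only if p is under deny
            return False
        except ValueError:
            continue
    return True
```

This is the opposite policy to whitelisting (default-allow with denied locations), but it catches the common case of malware dropping and launching itself from a temp directory.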

I have disabled all JavaScript in our PDF viewers, and I would love to block PDF attachments at our mail filter, but I am not fond of lynch mobs.

With this setup, I don't recall having even one infection. The only incidents I can think of were a few false positives from a security agent that interfered with our AV software.

"Those of us that had been up all night were in no mood for coffee and donuts, we wanted strong drink."

I also think that people need to move to whitelisting for proper security. I have seen this problem over and over again: people think they are safe because they have Norton or Avast or AVG or, god forbid, even McAfee. Then, when you show them that they are actually spamming everybody, they look at you, startled. I think antivirus is good for keeping the older strains off your system, but there is not much it can do on a system with a bunch of dumb users.

I really think the new age of viruses is going to target browsers as the initial attack vector and propagate through the network however they see fit. The reason I believe browsers will be the biggest hot spot is that the browser is becoming the new interface for users. Google is building a web OS, which I think they are calling Chrome OS. Though the web browser is the most used application on a majority of computers, it doesn't seem like a good idea for it to be so pivotal. Despite what many claim, the internet is still a dangerous place, and even experienced users can be duped by a well-crafted phishing attack.