Award-winning news, views, and insight from the ESET security community

RSA, AMTSO, the Universe and Everything

There was an AMTSO (Anti-Malware Testing Standards Organization) panel session here at RSA, where Larry Bridwell, Righard Zwienenberg, Andreas Marx, Roel Schouwenberg and Neil Rubenking talked about AMTSO and what it does (and what it hopes to do). And I added to my list of qualifications for being involved with the organization: current vendor representative, ex-tester, ex-corporate customer, inveterate wordsmith, and now haberdasher. Sorry, but all the t-shirts have now gone. ;-)

In the latter part of the session, members of the audience asked some very relevant questions, and I'd like to address a few of them here, though I'm in "IMHO (In My Humble Opinion)" mode here, not speaking for AMTSO as a whole or even for the rest of the Board of Directors.

Stop me if you've heard this before (or at any rate something rather like it), but VirusTotal is not, isn't intended to be, and cannot be a measure of detection performance. It does give you a feel for the likelihood that a single submitted file is malicious (and that's a Really Useful thing), but it doesn't tell you definitively whether the file is malicious, it doesn't tell you for sure that a given product does or doesn't detect that file, and it doesn't take accurate account of false positives. This imposes strict limitations on its usefulness statistically, so a statement like "only X products detected" or "only X% of malware is detected…" is hard to justify on the basis of VirusTotal reports.
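To see why counting per-file verdicts can't stand in for a detection metric, consider a toy sketch (entirely hypothetical data and engine names, nothing to do with real VirusTotal output): a naive "detection rate" computed from flags on malicious files rewards an engine that simply flags everything, because the metric never looks at false positives on clean files.

```python
# Toy illustration with made-up data: why per-file verdict counts
# don't measure detection performance.
# Each entry: (file_is_actually_malicious, {engine_name: flagged})
files = [
    (True,  {"EngineA": True,  "FlagEverything": True}),
    (True,  {"EngineA": True,  "FlagEverything": True}),
    (True,  {"EngineA": False, "FlagEverything": True}),
    (False, {"EngineA": False, "FlagEverything": True}),  # clean file
    (False, {"EngineA": False, "FlagEverything": True}),  # clean file
]

def naive_detection_rate(engine):
    """The misleading metric: fraction of malicious files flagged."""
    malicious = [verdicts for is_mal, verdicts in files if is_mal]
    return sum(v[engine] for v in malicious) / len(malicious)

def false_positive_rate(engine):
    """What the naive metric ignores: fraction of clean files flagged."""
    clean = [verdicts for is_mal, verdicts in files if not is_mal]
    return sum(v[engine] for v in clean) / len(clean)

for engine in ("EngineA", "FlagEverything"):
    print(engine, naive_detection_rate(engine), false_positive_rate(engine))
# "FlagEverything" scores a perfect detection rate while misclassifying
# every clean file -- the naive count alone cannot tell the difference.
```

The same problem applies in reverse: a multi-engine scan result for a single file says something about that file, but aggregating such results into "only X% of products detected the threat" ignores how each engine was configured and what it would do in a real deployment.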

Trying to compare one test to another, even tests by a single testing organization, is of limited use. Even if you have the visibility into details of methodology (and methodology is not all you need to know to evaluate a test fully), there's so much variance in methodological approach, sample sets, validation and so on, that realistic metametrics of that sort are not practical, and I don't see that as a likely target for AMTSO in the foreseeable future. By the way, I'm not saying at all that variation in methodology is a bad thing. AMTSO isn't about standardizing test procedures: it's about raising test standards, which is a very different thing. IMHO…

AMTSO does not develop tests: rather, it develops criteria for evaluating tests, generates documentation intended to facilitate the development of better tests, and on occasion assesses a review's compliance with AMTSO's principles.

Proactive/frozen testing has served us pretty well for quite a while as a measure of proactive detection, but its usefulness is declining as the threatscape changes, being somewhat tied to static testing. (Mind you, I'd still rather see a good static test than a bad dynamic test…)

Inevitably, the question of certification came up. Clearly, AMTSO is not in a position to certify products/companies, even if we saw that as our job, which we don't: we don't have the resources. Similarly, certifying individuals or organizations would also require resources and partnerships that aren't in place right now, though allowing testers a degree of self-assessment will take us in that direction.

However, AMTSO can't offer a way to certify that Mrs Miggins Patented Test Labs will always offer good tests, because inevitably, some tests from a busy lab will be better than others. So it's a pity if the review analysis procedure is regarded as a punishment for "bad testers": in fact, some testers are now starting to see it as a possible way of pre-validating a test that hasn't taken place yet, and (still speaking personally) I see that as hugely positive.

Towards the end of the session, I found myself putting together some thoughts on the testing/evaluation process, drawing both on my time as a tester and on my part in the corporate procurement process.

And yes, those are (generally) two very different roles: I told you I'd had a chequered past! Interestingly, Larry Bridwell summed up at the end of the session with some thoughts along similar lines.

But those are thoughts, I guess, that can probably wait for a future blog.