Introduction

It was an all-new platform for this month's VB100 comparative, with our first look at Microsoft's latest server‑grade operating system variant, Windows Server 2016. Although the platform was officially released only shortly before the test got under way, previews had been available for some time, and being largely similar to the widely deployed desktop equivalent Windows 10, we hoped that security developers would have had plenty of time to ensure their products supported the new environment to the full. However, any new setup will inevitably bring some surprises, so we were more than usually keen to see just how well products would perform this month.

Platform and test sets

Installation, as usual using standard install media acquired through the MSDN programme, proved fairly simple, and the preparation of our test systems, including the addition of our standard selection of common tools, went smoothly too. Trials of our test automation systems, such as our performance measurement tools, brought up nothing untoward either, with all necessary tweaks already having been made for Windows 10 testing. However, one major change in the platform was clear from the start: for the first time on a Windows Server platform, the built-in Windows Defender anti-malware solution was enabled by default. Many of our regular participants had warned us to watch out for this, noting that their products had not been provided with a suitable means of disabling the protection automatically, and that there had been numerous reports of disabled setups reverting unexpectedly to an enabled state. We therefore opted to shut Defender down ourselves and monitor its status closely throughout testing.

As is our standard approach for speed and performance measures, baselines were taken with the system in its default state, which in this case meant with Defender enabled. The aim of this approach is to enable users to judge how much of an impact each solution has on the speed of a standard system compared to a basic, unmodified installation. This time, however, we quickly noticed something of a problem – almost every product appeared to be running much faster than the baseline times, with significantly lower resource usage too.

The resulting flood of negative numbers made our usual speed graphs rather difficult to read, and after some analysis and consideration we eventually decided to break from our standard practice and rebuild the baseline measures using unprotected, bare systems rather than the default setup for the chosen platform. This gave us a much more easily consumed set of speed data for this report. To give some indication of how Windows Defender fits into the picture, we've included for reference in the main performance and on-access lag time charts the performance numbers that were originally intended to be baselines, although as Defender was not officially submitted for testing, no full set of detection data is available.
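To illustrate why the original baselines produced negative numbers, here is a minimal sketch (not VB100's actual tooling; all timings are hypothetical) of how a lag figure is derived as the difference between a task's time with the product installed and the same task's baseline time. With Defender in the baseline, a lighter product can come out "faster than baseline"; against a bare system the figure is a readable slowdown.

```python
# Hypothetical timings in seconds for the same standard task.
baseline_with_defender = 42.0  # default install, Windows Defender enabled
baseline_bare = 30.0           # unprotected, bare system
product_time = 35.0            # system with the product under test installed

# Lag is simply "product time minus baseline time" for the same task.
lag_vs_defender = product_time - baseline_with_defender  # -7.0 (negative!)
lag_vs_bare = product_time - baseline_bare               # +5.0

print(lag_vs_defender, lag_vs_bare)
# -7.0 5.0
```

Rebasing against the bare system keeps all lag figures non-negative and the graphs directly comparable between products.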

The test deadline was set for 2 November, a little later than usual due to the VB conference having taken up a lot of our time in October. Our sample sets were frozen on 2 November and we used the latest WildList available at the time, v4.033. As always, our clean sets were updated and tidied in preparation for the test, with the latest version comprising around 850,000 files and 180GB of data.

With all preparations complete, we settled down to find out how the products would fare on the new platform.

AhnLab's products tend to pop up in our tests at fairly random intervals, but generally put in decent performances. The latest server edition looks slick and glossy with a clean and clear layout, and proved to run pretty smoothly on Windows Server 2016 with only a single incident of the product GUI crashing out. Our performance measures showed pretty low use of resources and a fairly sizeable slowdown of our set of standard tasks, although they still ran through a good bit faster than with Windows Defender operational. File read times were also a little slow on first encounter of items, but again mostly better than with Defender enabled, and they sped up considerably on repeat runs. Scanning speeds looked decent too, particularly over the local system partition.

Detection was strong in the response sets, dropping off fairly considerably in the offline reactive sets. The core certification sets were handled nicely though, and AhnLab kicks off this month's test with a VB100 award.

Avast is a much more regular participant in our tests, with a 100% pass rate in recent years. The vendor's business edition has a slick and attractive appearance that is similar to that of its home‑user offerings, with simplicity on the surface and a wealth of configuration options available. The product demonstrated good stability for the most part – the only issue noted occurred at the end of the offline RAP test, when the entire machine froze and required a reboot. This was not reproducible however, and occurred at a time of high stress, so didn't dent the stability rating too heavily.

Speeds were not the fastest on demand, but on-access lag times were light, especially in the warm runs. RAM usage was low, CPU use a little high, and our set of tasks ran through a touch slowly. Detection was strong in the response sets, tailing off somewhat in the offline part of the sets. The certification sets presented no difficulties though, and Avast maintains its clean run of passes.

Avira's products show up in most of our tests and generally put in strong performances. The server version has the usual simple, angular appearance with a strong set of controls under the covers, and this month held up well under the pressure of testing with no stability problems noted. Scanning speeds were decent, while file read times look fast thanks to there being limited scanning on-read by default. Performance measures show slightly elevated resource usage and a noticeable but not too heavy impact on our set of activities.

Detection was decent too, and there were no issues in the clean sets. There was a clean run over the WildList sets on demand, but on access we noted a couple of items that were not being alerted on; further checking revealed a detection was being prevented by the cloud lookup system, a problem Avira picked up on rapidly and fixed without intervention from us. Nevertheless, it was enough to deny Avira a VB100 award this month, despite another good showing.

Bitdefender is one of the very few vendors to have maintained a perfect scorecard over the last two years. Its business product has a minimalist appearance, with large fonts and clear messaging making up for minimal controls (most of which are provided in a separate management system). Stability was for the most part very good indeed, although during one large scan job we did note the PC freezing up and needing a restart, an incident which was not repeated and only occurred during unusually heavy usage.

Scanning speeds were pretty decent to start with and blasted through in no time on repeat runs, while file read lags were low and resource consumption also nominal, with a low impact on our set of activities. Detection was very strong indeed, dropping off a little into the proactive sets, and with a flawless run through the certification sets another VB100 award is well earned by Bitdefender.

CYREN's venerable Command product has picked up a good string of passes of late, its earlier, long-running issues with false positives seemingly now in the past, although the product remains the unchallenged title holder in the 'most retro interface' category. Stability was reasonable in everyday use, although scanning unusually large sets of malware seemed like a fairly sure-fire way to crash the GUI. Scanning speeds were sluggish, file read lags pretty hefty, and our set of activities took a long time to complete, with the low resource usage figures more a reflection of the long period over which the numbers were averaged than of any particular efficiency.

Detection was very strong in the reactive sets, distinctly lower in the proactive tests where the product had no access to cloud lookups, and the WildList was nicely covered. The clean sets were once again handled without issues, and another VB100 award goes to CYREN.

Defenx seems to be back in our regular lineup after a brief absence, having replaced its previous technology provider with K7. The product interface is clean and clear with good controls and plenty of information available, and seemed to brush off any attempt to stress it, earning top marks for stability. Scanning speeds were decent with some good optimization in the warm runs, while file read times weren't slowed down too much and our set of tasks completed in good time too, with minimal resource usage.

Detection was a little lower than most this month, but within acceptable bounds, and with another clean run over the certification sets a VB100 award is easily won by Defenx.

Another member of the 12/12 club with a perfect pass record in recent years, eScan's server edition has a very bright and colourful tiled main screen, with other areas including the ample set of configuration options looking a little less slick, but generally working well. We noted a single GUI crash, during normal usage, as well as a single problem with logging not behaving as expected, but nothing too serious. Scanning speeds were impressive, file access lags pretty light for the most part, and our set of activities wasn't hit too hard, with reasonable resource consumption.

Detection, assisted by the Bitdefender engine, was strong with a slight drop into the proactive sets, and a good showing in the core sets earns eScan another VB100 award.

Fresh from celebrating an epic 100th VB100 pass, ESET returns this month to extend that splendid record of passes still further. The product is highly polished and professional-looking with plenty of data displayed and easy access to a comprehensive set of configuration options. Stability was impeccable once again with no wobbles even under seriously heavy loads, and speeds were good too, with fast scan times, light slowdown of file reads and a pretty reasonable impact on our sets of activities; resource use wasn't excessive either.

Detection was excellent with good scores even into the offline proactive sets, and yet another perfect run through the certification sets easily earns ESET its 101st VB100 award.

ESTsoft has a pretty decent record in our tests, with some good runs of passes over the last few years. The current product is pleasant to look at overall, with decent controls available, although fonts look a little wonky in places. Stability was good, with only a single issue noted, related to log exporting. Scanning speeds were fairly fast, file read lag times mostly very low, although executables were held up rather longer than other file types, at least on first visit, with warm times much better. Our set of tasks was slowed down a little but not too much, with resource consumption barely detectable.

Using the Bitdefender engine, detection was, as expected, very solid indeed, and with no problems in the certification sets another VB100 award is comfortably earned by ESTsoft.

Fortinet's FortiClient is another extremely reliable participant in our tests, with passes in all Windows comparatives in the last few years. The product interface is fairly basic with minimal options provided and styling pared down for maximum simplicity. It proved mostly reliable, although a few update attempts failed and had to be re-run, and we did see a couple of unexpected restarts. Scanning speeds were fairly slow, on-access lags a little high but showing some improvement on repeat visits to the same files, and our set of tasks was somewhat slowed down, with resource consumption a little elevated at busy times.

Detection was very strong in the response sets, dropping considerably into the offline proactive sets, and the core certification sets were dealt with very tidily, earning Fortinet another VB100 award.

G DATA's business solution is a proper corporate offering with an MMC console to provide deployment and central control, and a local agent with limited configuration to keep the user informed of any issues. As usual, deployment and operation proved a little more involved than with straightforward monolithic solutions, but it seemed to work pretty well with some practice, and proved robustly resistant to the stresses of the test, earning a perfect rating for stability. Scanning speeds were reasonable initially and very fast indeed in the warm runs, with file read lags showing a similar improvement on repeat visits. Our set of activities was distinctly slower than the baseline measures, with pretty heavy use of resources too.

Detection was very strong as usual, with good scores across the sets, and another perfect showing in the core sets earns G DATA another VB100 award.

Ikarus seems slightly more prone to false positives than most, but has managed to pick up a decent scattering of passes of late. This month the product looked much the same as ever, the interface somewhat blocky and clunky but reasonably usable, and it proved impressively stable with no problems noted at all.

Scanning speeds started out decent and became excellent on repeat runs, while file lags were fairly significant on first seeing things but again improved impressively after initial settling in. Our set of tasks completed in very good time.

Detection was solid, with a sharpish drop into the proactive sets, and a good job handling the certification sets earns Ikarus another VB100 award.

K7's history in our tests shows an impressive run of success of late, with passes in all Windows comparatives in the last couple of years. The product has a rugged appeal with a good set of controls within easy reach, and proved pretty stable once again with only a single glitch – a fairly minor one where an update failed to complete first time but got the job done without difficulty on re-running. Scanning speeds were slow to start with but a lot quicker on second attempt, while file read lag times were a little high, improving somewhat in the warm runs. Our set of tasks wasn't slowed down too much though, and resource usage was low.

Detection was reasonable, a little lower than the bulk of participants but still respectable, and the core certification sets were handled accurately, earning K7 a VB100 award.

Kaspersky's history in our tests is complicated somewhat by the vendor's large number of product lines, which appear in differing combinations depending on the test. The server solution is a full enterprise offering leveraging the MMC system for its main interface and controls, which are provided in the comprehensive depth one would expect and seem fairly simple to navigate and operate. There were no stability problems noted, earning the product a 'Solid' rating. Scanning speeds were not the fastest, and overheads seemed a little heavy too, with a long time taken to complete our set of tasks and fairly high use of RAM and CPU cycles.

A relative newcomer to VB100 testing, NANO has accumulated a nice little set of passes. The product looks clean and simple, and managed to complete all tests without the slightest sign of instability. Scanning speeds were steady and not too slow, while file read lags were slow over archives but not bad elsewhere. Our set of tasks was somewhat slowed down, and resource usage was also noticeable, but not too heavy.

Detection still lags behind the leaders somewhat but continues to improve steadily, and with a good run through the certification sets, another VB100 award goes to NANO.

PC Pitstop's unusual whitelisting-heavy approach has earned it some stellar detection rates of late, although a tendency to false alarm has meant no certification for a while. The interface is focused on software vulnerabilities, with some information on malware protection and basic configuration controls. The GUI itself remained reasonably stable, but we saw a number of fatal blue-screen incidents at all stages of the test (to be fair, we should note that the product is mainly geared towards the consumer market and not intended for use on server platforms). With so many issues noted, no speed or performance data could be gathered, but we did at least manage to complete all the detection tests. These showed once again some superb detection rates in the RAP sets, but also a high FP rate and some issues with the WildList, meaning there is no VB100 award for PC Pitstop once again.

Quick Heal's server version has a stark black-and-white colour scheme, leavened only occasionally with touches of green or red, and a clean, pared-down layout which nevertheless manages to provide a decent set of configuration options. Stability was good, with just a single incident observed of the scanner snagging. Scanning speeds were slowish, overheads not too bad to start with and barely discernible on repeat visits, while our set of activities wasn't too badly slowed down but resource consumption was on the high side.

Detection was strong, with a steady but not too steep decline through the sets, and the certification sets were nicely dealt with, earning Quick Heal a VB100 award.

Quick Heal's Seqrite product line is aimed at the larger enterprise, but the main interface closely resembles other Quick Heal products, once again using a monochrome look to lend gravitas. Stability was good again, with just that single file tripping up the scanner and, this time, an update attempt returning an error on first try. Scanning speeds and file access lag times were acceptable, with a fairly large hit on our set of activities and somewhat elevated resource consumption.

Detection was pretty decent across the board, including in the certification sets where no issues were noted, duly earning Quick Heal's Seqrite another VB100 award.

Chinese giant Tencent continues to build a steady run of passes in our tests. The latest edition looks bright and glossy with clear controls and a decent set of configuration options provided. Stability was dented only by an incident on one install where the on-access protection seemed to take rather a long time to kick in. Scanning speeds were on the slow side, and with minimal protection on-read our file access measures show low impact. Detection was strong, and with another good run through the certification sets, Tencent picks up another VB100 award.

The 'TAV' edition of Tencent's PC Manager uses only in-house technology without the third-party engine that is enabled in the standard edition, and has been achieving certification fairly reliably over the last year or so. In look and feel it's not much different, with the same shiny, colourful interface. Stability this time was perfect, with no problems observed. Scanning speeds were a little faster than the mainline product, while once again on-read protection was largely absent. Our set of activities was minimally impacted, and resource use was low.

Detection figures are absent thanks to the developers requesting to be excluded from the RAP test, but the core certification sets were handled properly and a VB100 award is earned.

TrustPort's products have been stalwarts of our tests for many years now and can usually be relied upon to produce excellent detection levels thanks to their multi-engine approach. This month, the GUI came in a fairly attractive pale blue, with some nice clear information on the surface and a good set of options under the hood.

Stability was dented by a single incident while scanning some clean files, bringing up an API error message but not apparently affecting the actual scan. Scanning speeds were a little slow, file read overheads fairly high at first but showing some serious improvement later on. Detection was excellent, with a little drop into the proactive sets, and another perfect run through the certification sets earns TrustPort a VB100 award.

¶ System drive size measured before product installation.
* Product not fully tested; only speed and performance data available.
N/T = Not tested. N/A = Not applicable (e.g. product does not have an applicable option).

Reactive and Proactive (RAP) tests

Product | Reactive Set 1* | Reactive Set 2* | Proactive Set 1§ | Proactive Set 2§ | Reactive average | Proactive average | Weighted average‡
AhnLab V3Net for Windows Server | 93.3% | 91.1% | 54.5% | 48.3% | 92.2% | 51.4% | 78.6%
Avast Business Security | 96.2% | 93.7% | 62.0% | 55.2% | 94.9% | 58.6% | 82.8%
Avira Antivirus Server | 88.8% | 86.5% | 62.5% | 59.6% | 87.6% | 61.1% | 78.8%
Bitdefender Endpoint Security | 95.7% | 92.6% | 69.3% | 66.6% | 94.1% | 67.9% | 85.4%
CYREN Command Anti-Malware | 98.6% | 95.8% | 59.8% | 56.8% | 97.2% | 58.3% | 84.2%
Defenx Security Suite | 79.5% | 77.6% | 52.4% | 46.4% | 78.5% | 49.4% | 68.8%
eScan Internet Security Suite Server Edition | 94.9% | 91.9% | 69.0% | 65.9% | 93.4% | 67.4% | 84.8%
ESET Endpoint Antivirus | 95.8% | 94.9% | 77.9% | 70.3% | 95.3% | 74.1% | 88.2%
ESTsoft ALYac | 97.3% | 95.3% | 69.5% | 66.7% | 96.3% | 68.1% | 86.9%
Fortinet FortiClient | 95.4% | 92.7% | 62.5% | 56.4% | 94.0% | 59.4% | 82.5%
G DATA Antivirus Business | 98.5% | 96.8% | 76.2% | 68.4% | 97.7% | 72.3% | 89.2%
Ikarus anti.virus | 94.9% | 93.4% | 63.4% | 60.7% | 94.1% | 62.0% | 83.4%
K7 Total Security | 84.7% | 78.3% | 55.6% | 51.2% | 81.5% | 53.4% | 72.1%
Kaspersky Anti-Virus 10 for Windows Servers | N/T | N/T | N/T | N/T | N/T | N/T | N/T
NANO Antivirus Pro | 82.1% | 62.4% | 47.3% | 43.2% | 72.3% | 45.2% | 63.3%
PC Pitstop PC Matic Home Security | 99.9% | 99.9% | 99.98% | 99.97% | 99.91% | 99.97% | 99.9%
Quick Heal AntiVirus Server Edition | 93.6% | 89.3% | 69.1% | 67.3% | 91.4% | 68.2% | 83.7%
Quick Heal Seqrite Antivirus Server Edition | 93.5% | 89.3% | 69.1% | 67.3% | 91.4% | 68.2% | 83.7%
Tencent PC Manager | 95.5% | 91.9% | 68.4% | 66.6% | 93.7% | 67.5% | 85.0%
Tencent PC Manager - TAV | N/T | N/T | N/T | N/T | N/T | N/T | N/T
TrustPort Antivirus 2016 | 98.8% | 99.0% | 71.4% | 68.9% | 98.9% | 70.2% | 89.3%

* Set 1 = samples discovered 1 to 5 days before testing; Set 2 = samples discovered 6 to 10 days before testing.
§ Set 1 = samples discovered 1 to 5 days after updates frozen; Set 2 = samples discovered 6 to 10 days after updates frozen.
‡ Weighted average gives equal emphasis to the two reactive weeks and the whole proactive part.
N/T = Not tested.
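As an illustrative recomputation (not VB100's own tooling), the table's averages can be checked from the per-set figures. Using AhnLab's published numbers: the reactive and proactive averages are simple means of their two sets, while the weighted average counts each reactive week and the whole proactive part once each.

```python
# AhnLab's figures from the RAP table above, used as a worked example.
reactive = [93.3, 91.1]   # reactive Set 1 and Set 2 (pre-freeze samples)
proactive = [54.5, 48.3]  # proactive Set 1 and Set 2 (post-freeze samples)

reactive_avg = round(sum(reactive) / 2, 1)    # 92.2
proactive_avg = round(sum(proactive) / 2, 1)  # 51.4

# Equal emphasis on each reactive week and on the proactive part as a whole.
weighted = round((reactive[0] + reactive[1] + proactive_avg) / 3, 1)

print(reactive_avg, proactive_avg, weighted)
# 92.2 51.4 78.6
```

These match the table's 92.2%, 51.4% and 78.6% entries for AhnLab.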

Conclusions

Another VB100 comparative completed, and once again pass rates were pleasingly high with most products reaching the standard required for certification. Of those that didn't make it, one was hit by a freak error of the sort that hits everyone from time to time, while the other provides a rather different approach to protection which doesn't fit too well with the rigid requirements of the VB100 scheme.

Elsewhere, detection rates were mostly good and stability was impressive too, with most products rated higher than 'Fair'. Our speed measures proved something of a headache this month thanks to the newness of the platform, but hopefully still provided some fairly clear and actionable data for admins and purchasers.

We return next time with our annual visit to Linux, generally a far smaller field of products but always an interesting experience for the test team.

