AV-Comparatives Real-World Test Results

AV-Comparatives runs a monthly “Whole Product Dynamic Real-World Protection” test. The organization has just released its third set of results this year, covering April 2016. And we’re pretty happy about how our products have been faring so far!

The results of AV-Comparatives’ April “Whole Product Dynamic Real-World Protection” test.

The guys at AV-Comparatives run extremely thorough tests. To properly ascertain how security products perform against real-world threats, they scour the Internet for actual compromised sites. That’s a tough job, considering that compromised sites are often taken down quickly once they’re discovered. In this latest test, they installed systems with Windows 7 SP1 and a set of fully updated third-party software. Finding malicious sites capable of exploiting fully patched software is no small feat, and we commend them on their diligence.

In all of AV-Comparatives’ real-world tests this year, we’ve blocked 100% of the threats thrown at us. (We’re one of only two vendors to score three in a row! #hattrick) We do, however, suffer from a few false positives. That’s mostly because our product uses logic that makes decisions based on a sample’s prevalence. If none or very few of our customers have encountered one of the samples used in the test, there’s a chance we’ll block it based on its uniqueness. Andreas Clementi and his team (to their credit) are quite sly when it comes to finding clean samples that none of our own customers have encountered before, which is why we often end up with some false positives in their tests. But we’re not stressing out too much about them. None of the samples are critical system files, and we have special logic in our products to ensure we’ll never hit false positives on files that might break systems or software. In the end, for us, it’s more important to block every threat we encounter than to avoid every potential hiccup.
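To give a rough idea of how prevalence-based logic works, here’s a minimal sketch. Everything in it (the threshold, the file names, the reputation score) is hypothetical and greatly simplified; it’s an illustration of the general technique, not our actual implementation.

```python
# Hypothetical sketch of prevalence-based verdict logic.
# Thresholds, file names, and scores are illustrative only.

PREVALENCE_THRESHOLD = 5  # samples seen by fewer customers count as "rare"

# Hypothetical set of files that could break a system if blocked.
# These are never convicted on rarity alone.
CRITICAL_FILES = {"ntoskrnl.exe", "kernel32.dll", "winload.exe"}

def verdict(file_name: str, prevalence: int, malice_score: float) -> str:
    """Decide whether to block or allow a scanned file.

    prevalence   -- number of customers who have seen this exact sample
    malice_score -- 0.0 (clean) to 1.0 (malicious), from other analysis
    """
    if file_name.lower() in CRITICAL_FILES:
        # Special logic: never false-positive on files that might
        # break systems or software; require strong evidence instead.
        return "block" if malice_score >= 0.9 else "allow"

    if malice_score >= 0.5:
        return "block"  # other signals already flag it as malicious

    if prevalence < PREVALENCE_THRESHOLD:
        # Rare or unique samples are treated with suspicion -- this is
        # where false positives on never-before-seen clean files arise.
        return "block"

    return "allow"

# Example: a clean but unique file, like those in a tester's FP set.
print(verdict("obscure_tool.exe", prevalence=1, malice_score=0.1))  # block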

Not that we’re resting on that logic… if you recall from a past post, we’re in the process of developing improved automation for collecting and analyzing legit files. We want to outdo Clementi & team when it comes to hunting down clean samples.

We’d like to thank Andreas and everyone else at AV-Comparatives for all the hard work they put in. Their testing is important to our industry, and these tests provide us with an invaluable measure of how well we’re protecting our customers.


