Whole Product Dynamic
“Real-World” Protection Test
February-June 2016
Language: English
July 2016
Last revision: 7th July 2016
www.av-comparatives.org
Content

Introduction
Test Procedure
Settings
Preparation for every testing day
Testing Cycle for each malicious URL
Test Set
Tested products
Test Cases
Results
Summary Results (February-June)
Award levels reached in this test
Copyright and Disclaimer
Introduction
Malicious software poses an ever-increasing threat, not only because the number of malware programs is
growing, but also because the nature of the threats is changing. Infection vectors are shifting from
simple file-based methods to distribution via the Internet. Malware is increasingly focussing on users, e.g. by
deceiving them into visiting infected web pages, installing rogue/malicious software or opening emails
with malicious attachments. The scope of protection offered by antivirus programs is extended by the
inclusion of e.g. URL-blockers, content filtering, reputation systems and user-friendly behaviour-blockers.
If these features are perfectly coordinated with the signature-based and heuristic detection, the
protection provided against threats increases.
In this test, all protection features of the product can be used to prevent infection - not just signatures
or heuristic file scanning. A suite can step in at any stage of the process – accessing the URL,
downloading the file, creating the file on the local hard drive, file access and file execution – to
protect the PC. This means that the test provides the most realistic way of determining how well the
security product protects the PC. Because all of a suite’s components can be used to protect the PC, it is
possible for a product to score well in the test by having e.g. very good behavioural protection, but a
weak URL blocker. However, we would recommend that all parts of a product should be as effective as
possible. It should be borne in mind that not all malware enters computer systems via the Internet, and
that e.g. a URL blocker is ineffective against malware introduced to a PC via a USB flash drive or over the
local area network.
In spite of these new technologies, it remains very important that the signature-based and heuristic
detection abilities of antivirus programs continue to be tested. Even with all the protection features
available, the growing frequency of zero-day attacks means that some computers will inevitably become
infected. As signatures can be updated, they provide the opportunity to recognize and remove malware
which was initially missed by the security software. The newer, “non-conventional” protection
technologies often offer no means of checking existing data stores for already-infected files, which can
be found on the file servers of many companies. Those new security layers should be understood as an
addition to good detection rates, not as a replacement.
The Whole-Product Dynamic “Real-World” Protection test is a joint project of AV-Comparatives
and the University of Innsbruck’s Faculty of Computer Science and Quality Engineering. It is
partially funded by the Republic of Austria.
The methodology of our Real-World Protection Test has received the following awards and
certifications:
• Constantinus Award – given by the Austrian government
• Cluster Award – given by the Standortagentur Tirol – Tyrolean government
• eAward – given by report.at (Magazine for Computer Science) and the Office of the
Federal Chancellor
• Innovationspreis IT – “Best Of” – given by Initiative Mittelstand Germany
Test Procedure
Testing dozens of antivirus products with hundreds of URLs each per day is a great deal of work, which
cannot be done manually (as it would involve visiting thousands of websites in parallel), so it is
necessary to use some sort of automation.
Lab Setup
Every potential test case to be used in the test is run and analysed on a clean machine without antivirus
software, to ensure that it is a suitable candidate, i.e. that it is malicious and works under the test
conditions. If the malware meets both of these criteria, the source URL is added to the list to be tested
with security products. Any test cases which turn out not to be appropriate are excluded from the test set.
Every security program to be tested is installed on its own test computer. All computers are connected to
the Internet. Each system is manually updated every day, and each product is updated before every
single test case. Each test PC has its own external IP address. We make special arrangements with ISPs to
ensure a stable Internet connection for each PC, and take the necessary precautions (with specially
configured firewalls etc.) not to harm other computers (i.e. not to cause outbreaks).
Software
The tests were performed under Microsoft Windows 7 Home Premium SP1 64-Bit, with updates as of 1st
February 2016. Some further installed software includes:
Vendor      Product                 Version
Adobe       Flash Player ActiveX    20.0
Adobe       Flash Player Plug-In    20.0
Adobe       Acrobat Reader          11.0
Apple       QuickTime               7.79
Microsoft   Internet Explorer       11.0
Microsoft   Office Home             2013
Microsoft   .NET Framework          4.5.2
Mozilla     Firefox                 43.0.4
Oracle      Java                    1.8.0.66
VideoLAN    VLC Media Player        2.1.5
The use of up-to-date third-party software and an updated Microsoft Windows 7 SP1 64-Bit makes it
much harder to find working exploits in the field for the test. Users should always keep their systems and
applications up-to-date, in order to minimize the risk of being infected through exploits which use
unpatched software vulnerabilities.
Settings
We use every security suite with its default settings. Our Whole-Product Dynamic Protection Test aims to
simulate real-world conditions as experienced every day by users. If user interactions are shown, we
choose “Allow” or equivalent. If the product protects the system anyway, we count the malware as
blocked, even though we allow the program to run when the user is asked to make a decision. If the
system is compromised, we count it as user-dependent. We consider “protection” to mean that the
system is not compromised. This means that the malware is not running (or is removed/terminated) and
there are no significant/malicious system changes. An outbound-firewall alert about a running malware
process, which asks whether or not to block traffic from the user's workstation to the Internet, is too
little, too late, and is not considered by us to be protection.
Preparation for every testing day
Every morning, any available security software updates are downloaded and installed, and a new base
image is made for that day. Before each test case is carried out, the products have some time to
download and install newer updates which have just been released, as well as to load their protection
modules (which in several cases takes some minutes). In the event that a major signature update for a
product is made available during the day, but fails to download/install before a test case starts, the
product will at least have the signatures that were available at the start of the day. This replicates the
situation of an ordinary user in the real world.
Testing Cycle for each malicious URL
Before browsing to each new malicious URL we update the programs/signatures (as described above).
New major product versions (i.e. the first digit of the build number is different) are installed once at the
beginning of the month, which is why in each monthly report we only give the main product version
number. Our test software monitors the PC, so that any changes made by the malware will be recorded.
Furthermore, the recognition algorithms check whether the antivirus program detects the malware. After
each test case the machine is reset to its clean state.
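As an illustration of this cycle, here is a minimal sketch in Python. Every helper name in it (restore_snapshot, update_security_product, snapshot_state, browse_to, diff_state) is a hypothetical placeholder for lab tooling; none of these names come from AV-Comparatives' actual test software.

```python
# Hypothetical sketch of the per-URL testing cycle described above.
# All vm.* helpers are assumed lab tooling, not names from the report.

def run_test_case(vm, url: str):
    vm.restore_snapshot("clean")      # reset to the day's clean base image
    vm.update_security_product()      # products update before every test case
    baseline = vm.snapshot_state()    # record files, registry and processes
    vm.browse_to(url)                 # open the malicious URL
    vm.wait(minutes=5)                # give behaviour blockers time to react
    return vm.diff_state(baseline)    # changes made by the malware, if any
```

The recorded changes are then classified as described in the Protection section below.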
Protection
Security products should protect the user’s PC. It is not very important at which stage the protection
takes place. It could be while browsing to the website (e.g. protection through URL Blocker), while an
exploit tries to run, while the file is being downloaded/created or when the malware is executed (either
by the exploit or by the user). After the malware is executed (if not blocked before), we wait several
minutes for malicious actions and also to give e.g. behaviour-blockers time to react and remedy actions
performed by the malware. If the malware is not detected and the system is indeed
infected/compromised, the process goes to “System Compromised”. If a user interaction is required and
it is up to the user to decide if something is malicious, and in the case of the worst user decision the
system gets compromised, we rate this as “user-dependent”. Because of this, the yellow bars in the
results graph can be interpreted either as protected or not protected (it’s up to each individual user to
decide what he/she would probably do in that situation).
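A minimal sketch of this classification, assuming the test harness records just two facts per case: whether the system ended up compromised, and whether a user decision was required along the way:

```python
# Outcome classification as described above. If a prompt appeared and we
# answered "Allow" but the product still protected the system, the case
# counts as blocked; if the system was compromised after a prompt, the
# worst user decision would compromise it, so it is user-dependent.

def classify(system_compromised: bool, user_decision_required: bool) -> str:
    if not system_compromised:
        return "Blocked"
    if user_decision_required:
        return "User-dependent"
    return "Compromised"
```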
Due to the dynamic nature of the test, i.e. mimicking real-world conditions, and because of the way
several different technologies (such as cloud scanners, reputation services, etc.) work, such tests
cannot be repeated or replicated in the way that e.g. static detection-rate tests can. Nevertheless, we
log as much data as reasonably possible to support our findings and results. Vendors are
invited to provide useful log functions in their products that can provide the additional data they want
in the event of disputes. After each testing month, manufacturers are given the opportunity to dispute
our conclusion about the compromised cases, so that we can recheck if there were maybe some problems
in the automation or with our analysis of the results.
In the case of cloud products, we can only consider the results that the products achieved in our lab at
the time of testing; sometimes the cloud services provided by the security vendors are down due to faults
or maintenance downtime by the vendors, but these cloud-downtimes are often not disclosed to the
users by the vendors. This is also a reason why products that rely too heavily on cloud services (and do
not make use of local heuristics, behaviour blockers, etc.) can be risky: in such cases the protection
provided by the products can decrease significantly. Cloud signatures/reputation should be implemented
in products to complement the other local/offline protection features, not to replace them
completely, as a cloud service being offline would leave the PCs exposed to higher risk.
Test Set
We aim to use visible and relevant malicious websites/malware that are currently out there and present
a risk to ordinary users. We usually try to include as many working drive-by exploits as we can find –
these are usually well covered by practically all major security products, which may be one reason why
the scores look relatively high. The rest are URLs that point directly to malware executables; this causes the
malware file to be downloaded, thus replicating a scenario in which the user is tricked by social
engineering into following links in spam mails or websites, or installing some Trojan or other malicious
software.
We use our own crawling system to search continuously for malicious sites and extract malicious URLs
(including spammed malicious links). We also search manually for malicious URLs. If our in-house crawler
does not find enough valid malicious URLs on one day, we have contracted some external researchers to
provide additional malicious URLs (initially for the exclusive use of AV-Comparatives) and look for
additional (re)sources.
In this kind of testing, it is very important to use enough test cases. If an insufficient number of
samples are used in comparative tests, differences in results may not indicate actual differences in
protective capabilities among the tested products1. Our tests use more test cases (samples) per product
and month than any similar test performed by other testing labs. Because of the higher statistical
significance this achieves, we consider all the products in each results cluster to be equally effective,
assuming that they have a false-positives rate below the industry average.
1 Read more in the following paper: http://www.av-comparatives.org/images/stories/test/statistics/somestats.pdf
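To illustrate why the number of test cases matters, here is a rough sketch using the standard binomial approximation of a 95% confidence interval; the protection rate and sample sizes below are illustrative values, not figures from this report.

```python
from math import sqrt

def ci_halfwidth(p: float, n: int, z: float = 1.96) -> float:
    # Half-width of an approximate 95% confidence interval for a
    # measured rate p over n independent test cases.
    return z * sqrt(p * (1 - p) / n)

print(ci_halfwidth(0.98, 1868))  # ~0.006 -> about +/-0.6 percentage points
print(ci_halfwidth(0.98, 300))   # ~0.016 -> about +/-1.6 percentage points
```

With few samples, the uncertainty around a measured rate can easily exceed the gaps between products, which is why small comparative tests may not reveal real differences.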
Tested products
For this test, we normally use the Internet security suite, as any protection features that prevent the
system from being compromised can be used. However, a vendor can choose to enter their basic antivirus
product instead, if they prefer. The main versions of the products tested in each monthly test run are
shown below:
Vendor          Product                         February  March  April  May    June
Avast           Free Antivirus                  11.1      11.1   11.2   11.2   12.1
AVG             Internet Security               2016      2016   2016   2016   2016
Avira           Antivirus Pro                   15.0      15.0   15.0   15.0   15.0
Bitdefender     Internet Security               2016      2016   2016   2016   2016
BullGuard       Internet Security               16.0      16.0   16.0   16.0   16.0
Emsisoft        Anti-Malware                    11.5      11.5   11.7   11.8   11.8
eScan           Internet Security               14.0      14.0   14.0   14.0   14.0
ESET            Smart Security                  9.0       9.0    9.0    9.0    9.0
F-Secure        Safe                            2016      2016   2016   2016   2016
Fortinet        FortiClient (with FortiGate)2   5.4       5.4    5.4    5.4    5.4
Kaspersky Lab   Internet Security               2016      2016   2016   2016   2016
Lavasoft        Ad-Aware Pro Security           11.10     11.10  11.10  11.10  11.11
McAfee          Internet Security               18.0      18.0   18.0   18.0   18.0
Microsoft       Security Essentials             4.8       4.8    4.8    4.9    4.9
Quick Heal      Total Security                  16.0      16.0   17.0   17.0   17.0
Sophos          Endpoint Security and Control   10.3      10.3   10.6   10.6   10.6
Tencent         PC Manager (English)            11.2      11.2   11.4   11.4   11.4
ThreatTrack     Vipre Internet Security Pro     9.3       9.3    9.3    9.3    9.3
Trend Micro     Internet Security               10.0      10.0   10.0   10.0   10.0
Test Cases
Test period                   Test cases
1st to 23rd February 2016     394
1st to 25th March 2016        449
1st to 25th April 2016        368
2nd to 25th May 2016          350
1st to 23rd June 2016         307
TOTAL                         1868
2 The cloud-based behaviour-analysis feature of Fortinet is only available to enterprise customers who
have also purchased a FortiGate.
Summary Results (February-June)
Test period: February – June 2016 (1868 test cases)3

                        Blocked   User dependent   Compromised   Protection rate4   Cluster5
F-Secure, Trend Micro   1868      -                -             100%               1
Bitdefender             1866      -                2             99.9%              1
Kaspersky Lab           1863      -                5             99.7%              1
Avira                   1862      -                6             99.7%              1
AVG                     1860      -                8             99.6%              1
ThreatTrack Vipre       1859      -                9             99.5%              1
ESET, Tencent           1841      -                27            98.6%              2
Emsisoft                1814      53               1             98.5%              2
Avast                   1840      -                28            98.5%              2
eScan                   1825      -                43            97.7%              2
BullGuard               1805      24               39            97.3%              2
Lavasoft                1817      -                51            97.3%              2
Fortinet                1803      -                65            96.5%              3
Sophos                  1796      2                70            96.2%              3
McAfee                  1788      -                80            95.7%              3
Quick Heal              1748      56               64            95.1%              3
Microsoft               1764      -                104           94.4%              3

Protection rate = Blocked % + (User dependent %)/2
3 Interested users who want to see the exact protection rates and FP rates for every month can see the
monthly updated interactive charts on our website: http://chart.av-comparatives.org/chart1.php
4 User-dependent cases are given half credit. For example, if a program blocks 80% by itself, and
another 20% of cases are user-dependent, we give half credit for the 20%, i.e. 10%, so it gets 90%
altogether.
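As a worked example of this half-credit rule, here is a short sketch that computes the protection rate and checks it against the Emsisoft row of the table above (1814 blocked, 53 user-dependent, 1 compromised out of 1868):

```python
def protection_rate(blocked: int, user_dependent: int, total: int) -> float:
    # Half credit for user-dependent cases, as defined in footnote 4.
    return (blocked + user_dependent / 2) / total * 100

print(round(protection_rate(1814, 53, 1868), 1))  # 98.5, matching the table
```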
5 Hierarchical clustering method: defining clusters using average linkage between groups (Euclidean
distance) based on the protection rate (see the dendrogram in the award illustration section below).
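For readers who want to experiment with this grouping approach, here is a sketch using SciPy's stock hierarchical clustering with the parameters named in footnote 5 (average linkage, Euclidean distance) on the protection rates from the summary table; it illustrates the method and will not necessarily reproduce the report's exact cluster boundaries.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Protection rates from the summary table, as a one-dimensional data set.
rates = np.array([100, 99.9, 99.7, 99.7, 99.6, 99.5, 98.6, 98.5, 98.5,
                  97.7, 97.3, 97.3, 96.5, 96.2, 95.7, 95.1, 94.4]).reshape(-1, 1)

Z = linkage(rates, method="average", metric="euclidean")  # average linkage
print(fcluster(Z, t=3, criterion="maxclust"))  # cut the tree into 3 groups
```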
The graph below shows the overall protection rate (all samples), including the minimum and maximum
protection rates for the individual months.
Whole-Product “False Alarm” Test (wrongly blocked domains/files)
The false-alarm test in the Whole-Product Dynamic “Real-World” Protection Test consists of two parts:
wrongly blocked domains (while browsing) and wrongly blocked files (while downloading/installing). It is
necessary to test both scenarios because testing only one of the two above cases could penalize products
that focus mainly on one type of protection method, either URL filtering or on-
access/behaviour/reputation-based file protection.
a) Wrongly blocked domains (while browsing)
We used around one thousand randomly chosen popular domains. Blocked non-malicious domains/URLs
were counted as false positives (FPs). The wrongly blocked domains have been reported to the respective
vendors for review and should now no longer be blocked.
By blocking whole domains, security products not only risk causing a loss of trust in their warnings,
but may also cause financial damage (besides the damage to website reputation) to the domain
owners, including loss of e.g. advertising revenue. For this reason, we strongly recommend that vendors
block whole domains only where the domain's sole purpose is to carry/deliver malicious code, and
otherwise block just the malicious pages (as long as they are indeed malicious). Products which tend to
block URLs based e.g. on reputation may be more prone to this, and may also score higher in protection
tests, as they may block many unpopular/new websites.
b) Wrongly blocked files (while downloading/installing)
We used around two thousand different applications listed either as top downloads or as
new/recommended downloads from various download portals. The applications were downloaded from the
original software developers’ websites (instead of the download portal host), saved to disk and installed
to see if they are blocked at any stage of this procedure. Additionally, we included a few clean files that
were encountered and disputed over the past months of the Real-World Protection Test.
The duty of security products is to protect against malicious sites/files, not to censor or limit the access
only to well-known popular applications and websites. If the user deliberately chooses a high security
setting, which warns that it may block some legitimate sites or files, then this may be considered
acceptable. However, we do not regard it to be acceptable as a default setting, where the user has not
been warned. As the test is done at points in time and FPs on very popular software/websites are usually
noticed and fixed within a few hours, it would be surprising to encounter FPs with very popular
applications. Due to this, FP tests which are done e.g. only with very popular applications, or which use
only the top 50 files from whitelisted/monitored download portals would be a waste of time and
resources. Being infected by a malware sample that affects only a few users is no consolation to the
affected user, and the same applies to false positives. While it is preferable that FPs do not affect many
users, it should be the goal to avoid having any FPs and to protect against any malicious files, no matter
how many users are affected or targeted. Prevalence of FPs based on user-base data is of interest for the
internal QA testing of AV vendors, but for the ordinary user it is important to know how accurately their
product distinguishes between clean and malicious files.
The below table shows the numbers of wrongly blocked domains/files:

                                   Wrongly blocked clean domains/files   Wrongly blocked
                                   (blocked / user-dependent6)           score7
ThreatTrack Vipre                  1 / 0 (1)                             1
AVG, Bitdefender, Kaspersky Lab    2 / 0 (2)                             2
ESET                               3 / 0 (3)                             3
Lavasoft                           4 / 0 (4)                             4
Avast                              5 / 0 (5)                             5
Microsoft, Sophos                  6 / 0 (6)                             6
Fortinet                           7 / 0 (7)                             7
Avira, Tencent                     9 / 0 (9)                             9
McAfee                             14 / 0 (14)                           14
Quick Heal                         14 / 1 (15)                           14.5
Emsisoft                           3 / 23 (26)                           14.5
average                            (16)                                  16
eScan                              40 / 0 (40)                           40
F-Secure                           46 / 0 (46)                           46
Trend Micro                        52 / 0 (52)                           52
BullGuard                          57 / 1 (58)                           57.5
To determine which products have to be downgraded in our award scheme due to the rate of wrongly
blocked sites/files, we backed up our decision by using statistical methods and by looking at the average
scores. The following products with above-average FPs have been downgraded: BullGuard, eScan, F-
Secure and Trend Micro.
6 Although user-dependent cases are extremely annoying for the user (especially on clean files), they
were counted as only half for the "wrongly blocked rate" (as for the protection rate).
7 Lower is better.
Prevalence of the FPs
According to some vendors, their own FPs are not seen at all in their user base (zero prevalence) or have
a very low prevalence. Nevertheless, we want to give the best possible overview of prevalence data for
the benefit of users of all our tested products. The table below shows the number of FPs for each product
according to our amalgamated prevalence assessment, for which we used several sources of prevalence
data.
Some products may block files based solely on their prevalence, i.e. if a vendor does not have any data
for a particular file, their product may treat it as a threat. This of course helps to block many malicious
files, but at the same time it can lead to higher false-alarm rates by blocking clean files which currently
have zero or very low prevalence in the user base of the particular vendor.
                    Very low   Low   Medium   High
Avast               5          0     0        0
AVG                 0          1     0        1
Avira               7          1     0        2
Bitdefender         1          0     0        1
BullGuard           14         19    14       11
Emsisoft            11         10    3        2
eScan               18         12    4        6
ESET                1          0     2        0
Fortinet            4          3     0        0
F-Secure            27         14    5        0
Kaspersky Lab       2          0     0        0
Lavasoft            1          2     0        1
McAfee              6          4     2        2
Microsoft           2          4     0        0
Quick Heal          5          6     3        2
Sophos              0          3     1        2
Tencent             3          1     3        2
ThreatTrack Vipre   0          1     0        0
Trend Micro         35         15    1        1
Key to prevalence ratings8
Very low: probably fewer than a hundred users
Low: probably several hundreds of users
Medium: probably several thousands of users
High: probably several tens of thousands of users
8 These relate to our aggregated prevalence data, not to the data of the individual vendors.
Illustration of how awards were given
The dendrogram (using average linkage between groups) shows the results of the hierarchical cluster
analysis. It indicates at what level of similarity the clusters are joined. The red dotted line defines the
level of similarity. Each intersection indicates a group (in this case 3 groups)9. Products that had
above-average FPs (wrongly blocked score) are marked in red (and downgraded according to the
ranking system below).
Ranking system

             Protection score cluster10
             4          3           2           1
< ∅ FPs      Tested     Standard    Advanced    Advanced+
> ∅ FPs      Tested     Tested      Standard    Advanced
9 As all products scored highly (over 90%) in this test, we have used three instead of four clusters.
10 See the protection score clusters in the Summary Results table above.
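Read as a mapping, the ranking table can be sketched as follows; this is a hypothetical rendering of the two rows above, not code from the report.

```python
def award(cluster: int, fps_above_average: bool) -> str:
    # Cluster 1 is the best protection cluster; above-average FPs cost
    # one rank, and "Tested" is the floor (see the ranking table above).
    order = ["Tested", "Standard", "Advanced", "Advanced+"]
    base = {4: "Tested", 3: "Standard", 2: "Advanced", 1: "Advanced+"}[cluster]
    if fps_above_average:
        return order[max(order.index(base) - 1, 0)]
    return base

print(award(1, True))   # Advanced (a cluster-1 product with above-average FPs)
print(award(3, False))  # Standard
```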
Award levels reached in this test
The awards are decided and given by the testers based on the observed test results (after consulting
statistical models). The following awards are for the results reached in this Whole-Product Dynamic
“Real-World” Protection Test:
AWARD LEVELS   PRODUCTS

ADVANCED+      Bitdefender, Kaspersky Lab, Avira, AVG, ThreatTrack
ADVANCED       F-Secure*, Trend Micro*, ESET, Tencent, Emsisoft, Avast, Lavasoft
STANDARD       eScan*, BullGuard*, Fortinet, Sophos, McAfee, Quick Heal, Microsoft
TESTED         -
* downgraded by one rank due to the score of wrongly blocked sites/files (FPs); see the false-alarm section above
Expert users who do not care about wrongly blocked files/websites (false alarms) are free to rely on the
protection rates in the Summary Results section instead of our awards ranking, which takes FPs into consideration.
Copyright and Disclaimer
This publication is Copyright © 2016 by AV-Comparatives®. Any use of the results, etc. in whole or in
part, is ONLY permitted with the explicit written agreement of the management board of
AV-Comparatives, prior to any publication. AV-Comparatives and its testers cannot be held liable for any
damage or loss which might occur as a result of, or in connection with, the use of the information
provided in this paper. We take every possible care to ensure the correctness of the basic data, but
liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives.
We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of
any of the information/content provided at any given time. No-one else involved in creating, producing
or delivering test results shall be liable for any indirect, special or consequential damage, or loss of
profits, arising out of, or related to, the use (or inability to use), the services provided by the website,
test documents or any related data.
For more information about AV-Comparatives and the testing methodologies please visit our website.
AV-Comparatives (July 2016)

More Related Content

What's hot

5 howtomitigate
5 howtomitigate5 howtomitigate
5 howtomitigatericharddxd
 
IRJET- A Study on Penetration Testing using Metasploit Framework
IRJET- A Study on Penetration Testing using Metasploit FrameworkIRJET- A Study on Penetration Testing using Metasploit Framework
IRJET- A Study on Penetration Testing using Metasploit FrameworkIRJET Journal
 
Stephanie Vanroelen - Mobile Anti-Virus apps exposed
Stephanie Vanroelen - Mobile Anti-Virus apps exposedStephanie Vanroelen - Mobile Anti-Virus apps exposed
Stephanie Vanroelen - Mobile Anti-Virus apps exposedNoNameCon
 
PHDays 9: new methods of Vulnerability Prioritization in Vulnerability Manage...
PHDays 9: new methods of Vulnerability Prioritization in Vulnerability Manage...PHDays 9: new methods of Vulnerability Prioritization in Vulnerability Manage...
PHDays 9: new methods of Vulnerability Prioritization in Vulnerability Manage...Alexander Leonov
 
Introduction to Metascan Client
Introduction to Metascan ClientIntroduction to Metascan Client
Introduction to Metascan ClientOPSWAT
 
Cyber Defense Forensic Analyst - Real World Hands-on Examples
Cyber Defense Forensic Analyst - Real World Hands-on ExamplesCyber Defense Forensic Analyst - Real World Hands-on Examples
Cyber Defense Forensic Analyst - Real World Hands-on ExamplesSandeep Kumar Seeram
 
White Paper - Are antivirus solutions enough to protect industrial plants?
White Paper - Are antivirus solutions enough to protect industrial plants?White Paper - Are antivirus solutions enough to protect industrial plants?
White Paper - Are antivirus solutions enough to protect industrial plants?TI Safe
 
Full-System Emulation Achieving Successful Automated Dynamic Analysis of Evas...
Full-System Emulation Achieving Successful Automated Dynamic Analysis of Evas...Full-System Emulation Achieving Successful Automated Dynamic Analysis of Evas...
Full-System Emulation Achieving Successful Automated Dynamic Analysis of Evas...Lastline, Inc.
 
An evaluation of two host based intrusion prevention systems
An evaluation of two host based intrusion prevention systemsAn evaluation of two host based intrusion prevention systems
An evaluation of two host based intrusion prevention systemsUltraUploader
 
Av is dead long live managed endpoint security
Av is dead   long live managed endpoint securityAv is dead   long live managed endpoint security
Av is dead long live managed endpoint securitySolarwinds N-able
 
CyberCentral Summit 2018 in Prague
CyberCentral Summit 2018 in PragueCyberCentral Summit 2018 in Prague
CyberCentral Summit 2018 in PragueAlexander Leonov
 
IE Exploit Protection
IE Exploit ProtectionIE Exploit Protection
IE Exploit ProtectionKim Jensen
 
PHDays 8: Vulnerability Databases. Sifting thousands tons of verbal ore
PHDays 8: Vulnerability Databases. Sifting thousands tons of verbal orePHDays 8: Vulnerability Databases. Sifting thousands tons of verbal ore
PHDays 8: Vulnerability Databases. Sifting thousands tons of verbal oreAlexander Leonov
 
Kaspersky Security center 10 documentation
Kaspersky Security center 10  documentationKaspersky Security center 10  documentation
Kaspersky Security center 10 documentationTarek Amer
 
Sa No Scan Paper
Sa No Scan PaperSa No Scan Paper
Sa No Scan Papertafinley
 
Understanding & Addressing OWASP’s Newest Top Ten Threat: Using Components wi...
Understanding & Addressing OWASP’s Newest Top Ten Threat: Using Components wi...Understanding & Addressing OWASP’s Newest Top Ten Threat: Using Components wi...
Understanding & Addressing OWASP’s Newest Top Ten Threat: Using Components wi...Sonatype
 
The CIS Critical Security Controls the International Standard for Defense
The CIS Critical Security Controls the International Standard for DefenseThe CIS Critical Security Controls the International Standard for Defense
The CIS Critical Security Controls the International Standard for DefenseEnclaveSecurity
 
Enterprise Vulnerability Management - ZeroNights16
Enterprise Vulnerability Management - ZeroNights16Enterprise Vulnerability Management - ZeroNights16
Enterprise Vulnerability Management - ZeroNights16Alexander Leonov
 
Matteo Meucci Software Security in practice - Aiea torino - 30-10-2015
Matteo Meucci   Software Security in practice - Aiea torino - 30-10-2015Matteo Meucci   Software Security in practice - Aiea torino - 30-10-2015
Matteo Meucci Software Security in practice - Aiea torino - 30-10-2015Minded Security
 
Analysis Of Adverarial Code - The Role of Malware Kits
Analysis Of Adverarial Code - The Role of Malware KitsAnalysis Of Adverarial Code - The Role of Malware Kits
Analysis Of Adverarial Code - The Role of Malware KitsRahul Mohandas
 

What's hot (20)

5 howtomitigate
5 howtomitigate5 howtomitigate
5 howtomitigate
 
IRJET- A Study on Penetration Testing using Metasploit Framework
IRJET- A Study on Penetration Testing using Metasploit FrameworkIRJET- A Study on Penetration Testing using Metasploit Framework
IRJET- A Study on Penetration Testing using Metasploit Framework
 
Stephanie Vanroelen - Mobile Anti-Virus apps exposed
Stephanie Vanroelen - Mobile Anti-Virus apps exposedStephanie Vanroelen - Mobile Anti-Virus apps exposed
Stephanie Vanroelen - Mobile Anti-Virus apps exposed
 
PHDays 9: new methods of Vulnerability Prioritization in Vulnerability Manage...
PHDays 9: new methods of Vulnerability Prioritization in Vulnerability Manage...PHDays 9: new methods of Vulnerability Prioritization in Vulnerability Manage...
PHDays 9: new methods of Vulnerability Prioritization in Vulnerability Manage...
 
Introduction to Metascan Client
Introduction to Metascan ClientIntroduction to Metascan Client
Introduction to Metascan Client
 
Cyber Defense Forensic Analyst - Real World Hands-on Examples
Cyber Defense Forensic Analyst - Real World Hands-on ExamplesCyber Defense Forensic Analyst - Real World Hands-on Examples
Cyber Defense Forensic Analyst - Real World Hands-on Examples
 
White Paper - Are antivirus solutions enough to protect industrial plants?
White Paper - Are antivirus solutions enough to protect industrial plants?White Paper - Are antivirus solutions enough to protect industrial plants?
White Paper - Are antivirus solutions enough to protect industrial plants?
 
Full-System Emulation Achieving Successful Automated Dynamic Analysis of Evas...
Full-System Emulation Achieving Successful Automated Dynamic Analysis of Evas...Full-System Emulation Achieving Successful Automated Dynamic Analysis of Evas...
Full-System Emulation Achieving Successful Automated Dynamic Analysis of Evas...
 
An evaluation of two host based intrusion prevention systems
An evaluation of two host based intrusion prevention systemsAn evaluation of two host based intrusion prevention systems
An evaluation of two host based intrusion prevention systems
 
Av is dead long live managed endpoint security
Av is dead   long live managed endpoint securityAv is dead   long live managed endpoint security
Av is dead long live managed endpoint security
 
CyberCentral Summit 2018 in Prague
CyberCentral Summit 2018 in PragueCyberCentral Summit 2018 in Prague
CyberCentral Summit 2018 in Prague
 
IE Exploit Protection
IE Exploit ProtectionIE Exploit Protection
IE Exploit Protection
 
PHDays 8: Vulnerability Databases. Sifting thousands tons of verbal ore
PHDays 8: Vulnerability Databases. Sifting thousands tons of verbal orePHDays 8: Vulnerability Databases. Sifting thousands tons of verbal ore
PHDays 8: Vulnerability Databases. Sifting thousands tons of verbal ore
 
Kaspersky Security center 10 documentation
Kaspersky Security center 10  documentationKaspersky Security center 10  documentation
Kaspersky Security center 10 documentation
 
Sa No Scan Paper
Sa No Scan PaperSa No Scan Paper
Sa No Scan Paper
 
Understanding & Addressing OWASP’s Newest Top Ten Threat: Using Components wi...
Understanding & Addressing OWASP’s Newest Top Ten Threat: Using Components wi...Understanding & Addressing OWASP’s Newest Top Ten Threat: Using Components wi...
Understanding & Addressing OWASP’s Newest Top Ten Threat: Using Components wi...
 
The CIS Critical Security Controls the International Standard for Defense
The CIS Critical Security Controls the International Standard for DefenseThe CIS Critical Security Controls the International Standard for Defense
The CIS Critical Security Controls the International Standard for Defense
 
Enterprise Vulnerability Management - ZeroNights16
Enterprise Vulnerability Management - ZeroNights16Enterprise Vulnerability Management - ZeroNights16
Enterprise Vulnerability Management - ZeroNights16
 
Matteo Meucci Software Security in practice - Aiea torino - 30-10-2015
Matteo Meucci   Software Security in practice - Aiea torino - 30-10-2015Matteo Meucci   Software Security in practice - Aiea torino - 30-10-2015
Matteo Meucci Software Security in practice - Aiea torino - 30-10-2015
 
Analysis Of Adverarial Code - The Role of Malware Kits
Analysis Of Adverarial Code - The Role of Malware KitsAnalysis Of Adverarial Code - The Role of Malware Kits
Analysis Of Adverarial Code - The Role of Malware Kits
 

Similar to Avc prot 2016a_en

AV Comparatives 2013 (Comparación de Antivirus)
AV Comparatives 2013 (Comparación de Antivirus)AV Comparatives 2013 (Comparación de Antivirus)
AV Comparatives 2013 (Comparación de Antivirus)Doryan Mathos
 
ANTIVIRUS
ANTIVIRUSANTIVIRUS
ANTIVIRUSfauscha
 
AV-Comparatives Performance Test
AV-Comparatives Performance TestAV-Comparatives Performance Test
AV-Comparatives Performance TestHerbert Rodriguez
 
Astaro Orange Paper Oss Myths Dispelled
Astaro Orange Paper Oss Myths DispelledAstaro Orange Paper Oss Myths Dispelled
Astaro Orange Paper Oss Myths Dispelledlosalamos
 
Is Using Off-the-shelf Antimalware Product to Secure Your Medical Device a Go...
Is Using Off-the-shelf Antimalware Product to Secure Your Medical Device a Go...Is Using Off-the-shelf Antimalware Product to Secure Your Medical Device a Go...
Is Using Off-the-shelf Antimalware Product to Secure Your Medical Device a Go...Jose Lopez
 
From Code to Customer: How to Make Software Products Secure
From Code to Customer: How to Make Software Products SecureFrom Code to Customer: How to Make Software Products Secure
From Code to Customer: How to Make Software Products SecureKaspersky
 
AV-Comparatives’ 2017 business software review
AV-Comparatives’ 2017 business software reviewAV-Comparatives’ 2017 business software review
AV-Comparatives’ 2017 business software reviewJermund Ottermo
 
Outpost Security Pro 7.5: What's Inside?
Outpost Security Pro 7.5: What's Inside?Outpost Security Pro 7.5: What's Inside?
Outpost Security Pro 7.5: What's Inside?Lubov Putsko
 
Vulnerability , Malware and Risk
Vulnerability , Malware and RiskVulnerability , Malware and Risk
Vulnerability , Malware and RiskSecPod Technologies
 
Pileup Flaws: Vulnerabilities in Android Update Make All Android Devices Vuln...
Pileup Flaws: Vulnerabilities in Android Update Make All Android Devices Vuln...Pileup Flaws: Vulnerabilities in Android Update Make All Android Devices Vuln...
Pileup Flaws: Vulnerabilities in Android Update Make All Android Devices Vuln...MOBIQUANT TECHNOLOGIES
 
Packet capture and network traffic analysis
Packet capture and network traffic analysisPacket capture and network traffic analysis
Packet capture and network traffic analysisCARMEN ALCIVAR
 
Penetration testing dont just leave it to chance
Penetration testing dont just leave it to chancePenetration testing dont just leave it to chance
Penetration testing dont just leave it to chanceDr. Anish Cheriyan (PhD)
 
How to automate your DevSecOps successfully
How to automate your DevSecOps successfullyHow to automate your DevSecOps successfully
How to automate your DevSecOps successfullyManuel Pistner
 
Cscu module 03 protecting systems using antiviruses
Cscu module 03 protecting systems using antivirusesCscu module 03 protecting systems using antiviruses
Cscu module 03 protecting systems using antivirusesAlireza Ghahrood
 

Similar to Avc prot 2016a_en (20)

AV Comparatives 2013 (Comparación de Antivirus)
AV Comparatives 2013 (Comparación de Antivirus)AV Comparatives 2013 (Comparación de Antivirus)
AV Comparatives 2013 (Comparación de Antivirus)
 
Avc per 201304_en
Avc per 201304_enAvc per 201304_en
Avc per 201304_en
 
ANTIVIRUS
ANTIVIRUSANTIVIRUS
ANTIVIRUS
 
Performance dec 2010
Performance dec 2010Performance dec 2010
Performance dec 2010
 
AV-Comparatives Performance Test
AV-Comparatives Performance TestAV-Comparatives Performance Test
AV-Comparatives Performance Test
 
Avc per 201206_en
Avc per 201206_enAvc per 201206_en
Avc per 201206_en
 
Astaro Orange Paper Oss Myths Dispelled
Astaro Orange Paper Oss Myths DispelledAstaro Orange Paper Oss Myths Dispelled
Astaro Orange Paper Oss Myths Dispelled
 
Is Using Off-the-shelf Antimalware Product to Secure Your Medical Device a Go...
Is Using Off-the-shelf Antimalware Product to Secure Your Medical Device a Go...Is Using Off-the-shelf Antimalware Product to Secure Your Medical Device a Go...
Is Using Off-the-shelf Antimalware Product to Secure Your Medical Device a Go...
 
From Code to Customer: How to Make Software Products Secure
From Code to Customer: How to Make Software Products SecureFrom Code to Customer: How to Make Software Products Secure
From Code to Customer: How to Make Software Products Secure
 
AV-Comparatives’ 2017 business software review
AV-Comparatives’ 2017 business software reviewAV-Comparatives’ 2017 business software review
AV-Comparatives’ 2017 business software review
 
Mac review 2012_en
Mac review 2012_enMac review 2012_en
Mac review 2012_en
 
Outpost Security Pro 7.5: What's Inside?
Outpost Security Pro 7.5: What's Inside?Outpost Security Pro 7.5: What's Inside?
Outpost Security Pro 7.5: What's Inside?
 
Top 10 antiviruses
Top 10 antivirusesTop 10 antiviruses
Top 10 antiviruses
 
Vulnerability , Malware and Risk
Vulnerability , Malware and RiskVulnerability , Malware and Risk
Vulnerability , Malware and Risk
 
It kamus virus security glossary
It kamus virus   security glossaryIt kamus virus   security glossary
It kamus virus security glossary
 
Pileup Flaws: Vulnerabilities in Android Update Make All Android Devices Vuln...
Pileup Flaws: Vulnerabilities in Android Update Make All Android Devices Vuln...Pileup Flaws: Vulnerabilities in Android Update Make All Android Devices Vuln...
Pileup Flaws: Vulnerabilities in Android Update Make All Android Devices Vuln...
 
Packet capture and network traffic analysis
Packet capture and network traffic analysisPacket capture and network traffic analysis
Packet capture and network traffic analysis
 
Penetration testing dont just leave it to chance
Penetration testing dont just leave it to chancePenetration testing dont just leave it to chance
Penetration testing dont just leave it to chance
 
How to automate your DevSecOps successfully
How to automate your DevSecOps successfullyHow to automate your DevSecOps successfully
How to automate your DevSecOps successfully
 
Cscu module 03 protecting systems using antiviruses
Cscu module 03 protecting systems using antivirusesCscu module 03 protecting systems using antiviruses
Cscu module 03 protecting systems using antiviruses
 

More from Andrey Apuhtin

Shadow pad technical_description_pdf
Shadow pad technical_description_pdfShadow pad technical_description_pdf
Shadow pad technical_description_pdfAndrey Apuhtin
 
Ftc cdt-vpn-complaint-8-7-17
Ftc cdt-vpn-complaint-8-7-17Ftc cdt-vpn-complaint-8-7-17
Ftc cdt-vpn-complaint-8-7-17Andrey Apuhtin
 
Hutchins redacted indictment
Hutchins redacted indictmentHutchins redacted indictment
Hutchins redacted indictmentAndrey Apuhtin
 
Dr web review_mob_july_2017
Dr web review_mob_july_2017Dr web review_mob_july_2017
Dr web review_mob_july_2017Andrey Apuhtin
 
Nexusguard d do_s_threat_report_q1_2017_en
Nexusguard d do_s_threat_report_q1_2017_enNexusguard d do_s_threat_report_q1_2017_en
Nexusguard d do_s_threat_report_q1_2017_enAndrey Apuhtin
 
Pandalabs отчет за 1 квартал 2017
Pandalabs   отчет за 1 квартал 2017Pandalabs   отчет за 1 квартал 2017
Pandalabs отчет за 1 квартал 2017Andrey Apuhtin
 
Lookout pegasus-android-technical-analysis
Lookout pegasus-android-technical-analysisLookout pegasus-android-technical-analysis
Lookout pegasus-android-technical-analysisAndrey Apuhtin
 
Apwg trends report_q4_2016
Apwg trends report_q4_2016Apwg trends report_q4_2016
Apwg trends report_q4_2016Andrey Apuhtin
 
News berthaume-sentencing-jan2017
News berthaume-sentencing-jan2017News berthaume-sentencing-jan2017
News berthaume-sentencing-jan2017Andrey Apuhtin
 
Windows exploitation-2016-a4
Windows exploitation-2016-a4Windows exploitation-2016-a4
Windows exploitation-2016-a4Andrey Apuhtin
 

More from Andrey Apuhtin (20)

Shadow pad technical_description_pdf
Shadow pad technical_description_pdfShadow pad technical_description_pdf
Shadow pad technical_description_pdf
 
Ftc cdt-vpn-complaint-8-7-17
Ftc cdt-vpn-complaint-8-7-17Ftc cdt-vpn-complaint-8-7-17
Ftc cdt-vpn-complaint-8-7-17
 
Hutchins redacted indictment
Hutchins redacted indictmentHutchins redacted indictment
Hutchins redacted indictment
 
Dr web review_mob_july_2017
Dr web review_mob_july_2017Dr web review_mob_july_2017
Dr web review_mob_july_2017
 
Dmarc
DmarcDmarc
Dmarc
 
Nexusguard d do_s_threat_report_q1_2017_en
Nexusguard d do_s_threat_report_q1_2017_enNexusguard d do_s_threat_report_q1_2017_en
Nexusguard d do_s_threat_report_q1_2017_en
 
Pandalabs отчет за 1 квартал 2017
Pandalabs   отчет за 1 квартал 2017Pandalabs   отчет за 1 квартал 2017
Pandalabs отчет за 1 квартал 2017
 
Sel03129 usen
Sel03129 usenSel03129 usen
Sel03129 usen
 
Cldap threat-advisory
Cldap threat-advisoryCldap threat-advisory
Cldap threat-advisory
 
Lookout pegasus-android-technical-analysis
Lookout pegasus-android-technical-analysisLookout pegasus-android-technical-analysis
Lookout pegasus-android-technical-analysis
 
Rand rr1751
Rand rr1751Rand rr1751
Rand rr1751
 
Apwg trends report_q4_2016
Apwg trends report_q4_2016Apwg trends report_q4_2016
Apwg trends report_q4_2016
 
Browser history
Browser historyBrowser history
Browser history
 
Software
SoftwareSoftware
Software
 
Antivirus
AntivirusAntivirus
Antivirus
 
Https interception
Https interceptionHttps interception
Https interception
 
Wilssc 006 xml
Wilssc 006 xmlWilssc 006 xml
Wilssc 006 xml
 
News berthaume-sentencing-jan2017
News berthaume-sentencing-jan2017News berthaume-sentencing-jan2017
News berthaume-sentencing-jan2017
 
Windows exploitation-2016-a4
Windows exploitation-2016-a4Windows exploitation-2016-a4
Windows exploitation-2016-a4
 
Mw stj 08252016_2
Mw stj 08252016_2Mw stj 08252016_2
Mw stj 08252016_2
 

Recently uploaded

Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...MyIntelliSource, Inc.
 
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsUnveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsAlberto González Trastoy
 
What is Binary Language? Computer Number Systems
What is Binary Language?  Computer Number SystemsWhat is Binary Language?  Computer Number Systems
What is Binary Language? Computer Number SystemsJheuzeDellosa
 
HR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.comHR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.comFatema Valibhai
 
why an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfwhy an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfjoe51371421
 
Salesforce Certified Field Service Consultant
Salesforce Certified Field Service ConsultantSalesforce Certified Field Service Consultant
Salesforce Certified Field Service ConsultantAxelRicardoTrocheRiq
 
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer DataAdobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer DataBradBedford3
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVshikhaohhpro
 
Engage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyEngage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyFrank van der Linden
 
Asset Management Software - Infographic
Asset Management Software - InfographicAsset Management Software - Infographic
Asset Management Software - InfographicHr365.us smith
 
Cloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackCloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackVICTOR MAESTRE RAMIREZ
 
Project Based Learning (A.I).pptx detail explanation
Project Based Learning (A.I).pptx detail explanationProject Based Learning (A.I).pptx detail explanation
Project Based Learning (A.I).pptx detail explanationkaushalgiri8080
 
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio, Inc.
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideChristina Lin
 
Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...aditisharan08
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software DevelopersVinodh Ram
 
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...gurkirankumar98700
 
Introduction to Decentralized Applications (dApps)
Introduction to Decentralized Applications (dApps)Introduction to Decentralized Applications (dApps)
Introduction to Decentralized Applications (dApps)Intelisync
 
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...harshavardhanraghave
 

Recently uploaded (20)

Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
 
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsUnveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
 
What is Binary Language? Computer Number Systems
What is Binary Language?  Computer Number SystemsWhat is Binary Language?  Computer Number Systems
What is Binary Language? Computer Number Systems
 
HR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.comHR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.com
 
why an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfwhy an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdf
 
Salesforce Certified Field Service Consultant
Salesforce Certified Field Service ConsultantSalesforce Certified Field Service Consultant
Salesforce Certified Field Service Consultant
 
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer DataAdobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTV
 
Engage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyEngage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The Ugly
 
Asset Management Software - Infographic
Asset Management Software - InfographicAsset Management Software - Infographic
Asset Management Software - Infographic
 
Cloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackCloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStack
 
Project Based Learning (A.I).pptx detail explanation
Project Based Learning (A.I).pptx detail explanationProject Based Learning (A.I).pptx detail explanation
Project Based Learning (A.I).pptx detail explanation
 
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
 
Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software Developers
 
Exploring iOS App Development: Simplifying the Process
Exploring iOS App Development: Simplifying the ProcessExploring iOS App Development: Simplifying the Process
Exploring iOS App Development: Simplifying the Process
 
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
 
Introduction to Decentralized Applications (dApps)
Introduction to Decentralized Applications (dApps)Introduction to Decentralized Applications (dApps)
Introduction to Decentralized Applications (dApps)
 
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
 

Avc prot 2016a_en

  • 1. Whole Product Dynamic “Real-World” Protection Test – (February-June 2016) www.av-comparatives.org - 1 - Whole Product Dynamic ““““Real-World”””” Protection Test February-June 2016 Language: English July 2016 Last revision: 7th July 2016 www.av-comparatives.org
  • 2. Whole Product Dynamic “Real-World” Protection Test – (February-June 2016) www.av-comparatives.org - 2 - Content Introduction............................................................ 2 Content................................................................... 2 Test Procedure......................................................... 4 Settings.................................................................. 5 Preparation for every testing day .............................. 5 Testing Cycle for each malicious URL ......................... 5 Test Set .................................................................. 6 Tested products ....................................................... 7 Test Cases ............................................................... 7 Results ................................................................... 7 Summary Results (February-June) .............................. 8 Award levels reached in this test............................... 14 Copyright and Disclaimer.......................................... 15
  • 3. Whole Product Dynamic “Real-World” Protection Test – (February-June 2016) www.av-comparatives.org - 3 - Introduction Malicious software poses an ever-increasing threat, not only due to the number of malware programs increasing, but also due to the nature of the threats. Infection vectors are changing from simple file- based methods to distribution via the Internet. Malware is increasingly focussing on users, e.g. by deceiving them into visiting infected web pages, installing rogue/malicious software or opening emails with malicious attachments. The scope of protection offered by antivirus programs is extended by the inclusion of e.g. URL-blockers, content filtering, reputation systems and user-friendly behaviour-blockers. If these features are perfectly coordinated with the signature-based and heuristic detection, the protection provided against threats increases. In this test, all protection features of the product can be used to prevent infection - not just signatures or heuristic file scanning. A suite can step in at any stage of the process – accessing the URL, downloading the file, formation of the file on the local hard drive, file access and file execution – to protect the PC. This means that the test achieves the most realistic way of determining how well the security product protects the PC. Because all of a suite’s components can be used to protect the PC, it is possible for a product to score well in the test by having e.g. very good behavioural protection, but a weak URL blocker. However, we would recommend that all parts of a product should be as effective as possible. It should be borne in mind that not all malware enters computer systems via the Internet, and that e.g. a URL blocker is ineffective against malware introduced to a PC via a USB flash drive or over the local area network. In spite of these new technologies, it remains very important that the signature-based and heuristic detection abilities of antivirus programs continue to be tested. Even with all the protection features available, the growing frequency of zero-day attacks means that some computers will inevitably become infected. As signatures can be updated, they provide the opportunity to recognize and remove malware which was initially missed by the security software. The newer, “non-conventional” protection technologies often offer no means of checking existing data stores for already-infected files, which can be found on the file servers of many companies. Those new security layers should be understood as an addition to good detection rates, not as a replacement. The Whole-Product Dynamic “Real-World” Protection test is a joint project of AV-Comparatives and the University of Innsbruck’s Faculty of Computer Science and Quality Engineering. It is partially funded by the Republic of Austria. The methodology of our Real-World Protection Test has received the following awards and certifications, including: • Constantinus Award – given by the Austrian government • Cluster Award – given by the Standortagentur Tirol – Tyrolean government • eAward – given by report.at (Magazine for Computer Science) and the Office of the Federal Chancellor • Innovationspreis IT – “Best Of” – given by Initiative Mittelstand Germany
Test Procedure

Testing dozens of antivirus products with hundreds of URLs each per day is a great deal of work which cannot be done manually (it would involve visiting thousands of websites in parallel), so some form of automation is necessary.

Lab Setup

Every potential test case is first run and analysed on a clean machine without antivirus software, to ensure that it is a suitable candidate, i.e. that the URL is still reachable and that the sample actually compromises an unprotected system. If the malware meets both of these criteria, the source URL is added to the list to be tested with security products. Any test cases which turn out not to be appropriate are excluded from the test set.

Every security program to be tested is installed on its own test computer. All computers are connected to the Internet. Each system is manually updated every day, and each product is updated before every single test case. Each test PC has its own external IP address. We make special arrangements with ISPs to ensure a stable Internet connection for each PC, and take the necessary precautions (with specially configured firewalls etc.) not to harm other computers (i.e. not to cause outbreaks).

Software

The tests were performed under Microsoft Windows 7 Home Premium SP1 64-bit, with updates as of 1st February 2016. Further installed software included:

Vendor      Product                 Version
Adobe       Flash Player ActiveX    20.0
Adobe       Flash Player Plug-In    20.0
Adobe       Acrobat Reader          11.0
Apple       QuickTime               7.79
Microsoft   Internet Explorer       11.0
Microsoft   Office Home             2013
Microsoft   .NET Framework          4.5.2
Mozilla     Firefox                 43.0.4
Oracle      Java                    1.8.0.66
VideoLAN    VLC Media Player        2.1.5

Using up-to-date third-party software and a fully updated Windows 7 SP1 64-bit system makes it much harder to find working in-the-field exploits for the test. Users should always keep their systems and applications up-to-date, in order to minimize the risk of being infected through exploits which use unpatched software vulnerabilities.
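To make the lab-setup step above concrete, here is a minimal sketch of how such a candidate-validation loop could look. This is an illustration only: the helper methods (restore_snapshot, visit_url, system_compromised) are hypothetical stand-ins for the lab's actual automation, which is not published.

    # Minimal sketch of the candidate-validation step (illustrative only).
    # The VM helper methods are hypothetical stand-ins for the real automation.

    def validate_candidates(candidate_urls, clean_vm):
        """Keep only URLs that are reachable and actually compromise a
        clean, unprotected system."""
        suitable = []
        for url in candidate_urls:
            clean_vm.restore_snapshot("clean")       # start from a known-good state
            reachable = clean_vm.visit_url(url)      # browse to the candidate URL
            if reachable and clean_vm.system_compromised():
                suitable.append(url)                 # a valid in-the-field test case
        return suitable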
Settings

We use every security suite with its default settings. Our Whole-Product Dynamic Protection Test aims to simulate the real-world conditions experienced every day by users. If user interactions are shown, we choose "Allow" or the equivalent. If the product protects the system anyway, we count the malware as blocked, even though we allow the program to run when the user is asked to make a decision. If the system is compromised anyway, we count the case as user-dependent.

We consider "protection" to mean that the system is not compromised. This means that the malware is not running (or is removed/terminated) and there are no significant or malicious system changes. An outbound-firewall alert about an already-running malware process, asking whether or not to block traffic from the user's workstation to the Internet, is too little, too late, and is not considered by us to be protection.

Preparation for every testing day

Every morning, any available security software updates are downloaded and installed, and a new base image is made for that day. Before each test case is carried out, the products have some time to download and install any newer updates which have just been released, as well as to load their protection modules (which in several cases takes some minutes). If a major signature update for a product is made available during the day but fails to download/install before a test case starts, the product will at least have the signatures that were available at the start of the day. This replicates the situation of an ordinary user in the real world.

Testing Cycle for each malicious URL

Before browsing to each new malicious URL, we update the programs/signatures (as described above). New major product versions (i.e. where the first digit of the build number is different) are installed once at the beginning of the month, which is why in each monthly report we give only the main product version number. Our test software monitors the PC, so that any changes made by the malware are recorded. Furthermore, the recognition algorithms check whether the antivirus program detects the malware. After each test case, the machine is reset to its clean state.

Protection

Security products should protect the user's PC; it is not very important at which stage the protection takes place. It could be while browsing to the website (e.g. protection through a URL blocker), while an exploit tries to run, while the file is being downloaded/created, or when the malware is executed (either by the exploit or by the user). After the malware is executed (if it was not blocked before), we wait several minutes for malicious actions, and also to give e.g. behaviour blockers time to react to and remedy actions performed by the malware. If the malware is not detected and the system is indeed infected/compromised, the case is rated "System Compromised". If a user interaction is required and it is up to the user to decide whether something is malicious, and with the worst possible user decision the system becomes compromised, we rate this as "user-dependent". Because of this, the yellow bars in the results graph can be interpreted either as protected or as not protected (it is up to each individual user to decide what he/she would probably do in that situation).
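As an illustration of the testing cycle and the three possible verdicts described above, the following sketch shows how one test case could be processed. All VM helper methods are hypothetical; the real monitoring and recognition algorithms are of course far more involved.

    # Sketch of the per-URL testing cycle (illustrative only; helper methods
    # are hypothetical stand-ins for the lab automation).
    import time

    def run_test_case(product_vm, url, wait_minutes=5):
        product_vm.update_signatures()        # products update before every test case
        product_vm.visit_url(url)             # browse to the malicious URL
        time.sleep(wait_minutes * 60)         # give behaviour blockers time to react
        if not product_vm.system_compromised():
            verdict = "Blocked"               # protected at some stage of the chain
        elif product_vm.user_decision_was_required():
            verdict = "User-dependent"        # worst-case user choice compromises
        else:
            verdict = "System Compromised"
        product_vm.restore_snapshot("clean")  # reset the machine for the next case
        return verdict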
Due to the dynamic nature of the test, i.e. mimicking real-world conditions, and because of the way several different technologies (such as cloud scanners, reputation services, etc.) work, such tests cannot be repeated or replicated in the way that e.g. static detection-rate tests can. Nevertheless, we log as much data as reasonably possible to support our findings and results. Vendors are invited to provide useful log functions in their products that can deliver the additional data they want in the event of disputes. After each testing month, manufacturers are given the opportunity to dispute our conclusions about the compromised cases, so that we can recheck whether there were any problems in the automation or in our analysis of the results.

In the case of cloud products, we can only consider the results that the products achieved in our lab at the time of testing. Sometimes the cloud services provided by the security vendors are down due to faults or maintenance, but these cloud downtimes are often not disclosed to users. This is one reason why products relying too heavily on cloud services (and not making use of local heuristics, behaviour blockers, etc.) can be risky: in such cases the protection provided can decrease significantly. Cloud signatures/reputation should be implemented to complement the other local/offline protection features, not to replace them completely, as e.g. an unreachable cloud service would leave PCs exposed to higher risk.

Test Set

We aim to use visible and relevant malicious websites/malware that are currently out there and present a risk to ordinary users. We usually try to include as many working drive-by exploits as we can find – these are usually well covered by practically all major security products, which may be one reason why the scores look relatively high. The rest are URLs that point directly to malware executables; this causes the malware file to be downloaded, replicating a scenario in which the user is tricked by social engineering into following links in spam mails or on websites, or into installing a Trojan or other malicious software.

We use our own crawling system to search continuously for malicious sites and extract malicious URLs (including spammed malicious links). We also search manually for malicious URLs. If our in-house crawler does not find enough valid malicious URLs on a given day, we have contracted external researchers to provide additional malicious URLs (initially for the exclusive use of AV-Comparatives) and look for additional (re)sources.

In this kind of testing, it is very important to use enough test cases. If an insufficient number of samples is used in comparative tests, differences in results may not indicate actual differences in protective capabilities among the tested products¹. Our tests use more test cases (samples) per product and month than any similar test performed by other testing labs. Because of the higher statistical significance this achieves, we consider all the products in each results cluster to be equally effective, assuming that they have a false-positive rate below the industry average.

¹ Read more in the following paper: http://www.av-comparatives.org/images/stories/test/statistics/somestats.pdf
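To illustrate why a large number of test cases matters, the following sketch computes a simple normal-approximation 95% confidence interval for a measured protection rate. This is our own illustrative calculation, not part of AV-Comparatives' published methodology (the paper linked in the footnote discusses the statistics in detail).

    # Illustrative only: a normal-approximation 95% confidence interval for a
    # measured protection rate, showing how more test cases narrow the
    # uncertainty around the result.
    import math

    def protection_rate_ci(protected, total, z=1.96):
        p = protected / total
        margin = z * math.sqrt(p * (1 - p) / total)
        return p - margin, p + margin

    # With ~1868 cases (as in this test), a measured 98.5% is pinned down tightly:
    print(protection_rate_ci(1840, 1868))   # roughly (0.979, 0.991)
    # With only 100 cases, the same measured rate is far less conclusive:
    print(protection_rate_ci(98, 100))      # roughly (0.953, 1.007)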
Tested products

For this test we normally use the Internet security suite, as any protection feature that prevents the system from being compromised can be used. However, a vendor can choose to enter its basic antivirus product instead, if preferred. The main versions of the products tested in each monthly test run are shown below:

Vendor          Product                         Feb     Mar     Apr     May     Jun
Avast           Free Antivirus                  11.1    11.1    11.2    11.2    12.1
AVG             Internet Security               2016    2016    2016    2016    2016
Avira           Antivirus Pro                   15.0    15.0    15.0    15.0    15.0
Bitdefender     Internet Security               2016    2016    2016    2016    2016
BullGuard       Internet Security               16.0    16.0    16.0    16.0    16.0
Emsisoft        Anti-Malware                    11.5    11.5    11.7    11.8    11.8
eScan           Internet Security               14.0    14.0    14.0    14.0    14.0
ESET            Smart Security                  9.0     9.0     9.0     9.0     9.0
F-Secure        Safe                            2016    2016    2016    2016    2016
Fortinet        FortiClient (with FortiGate)²   5.4     5.4     5.4     5.4     5.4
Kaspersky Lab   Internet Security               2016    2016    2016    2016    2016
Lavasoft        Ad-Aware Pro Security           11.10   11.10   11.10   11.10   11.11
McAfee          Internet Security               18.0    18.0    18.0    18.0    18.0
Microsoft       Security Essentials             4.8     4.8     4.8     4.9     4.9
Quick Heal      Total Security                  16.0    16.0    17.0    17.0    17.0
Sophos          Endpoint Security and Control   10.3    10.3    10.6    10.6    10.6
Tencent         PC Manager (English)            11.2    11.2    11.4    11.4    11.4
ThreatTrack     Vipre Internet Security Pro     9.3     9.3     9.3     9.3     9.3
Trend Micro     Internet Security               10.0    10.0    10.0    10.0    10.0

² The cloud-based behaviour-analysis feature of Fortinet is only available to enterprise customers who have also purchased a FortiGate.

Test Cases

Test period                   Test cases
1st to 23rd February 2016     394
1st to 25th March 2016        449
1st to 25th April 2016        368
2nd to 25th May 2016          350
1st to 23rd June 2016         307
TOTAL                         1868
Summary Results (February-June)

Test period: February – June 2016 (1868 test cases)³

                        Blocked   User dependent   Compromised   Protection rate⁴   Cluster⁵
F-Secure, Trend Micro   1868      -                -             100%               1
Bitdefender             1866      -                2             99.9%              1
Kaspersky Lab           1863      -                5             99.7%              1
Avira                   1862      -                6             99.7%              1
AVG                     1860      -                8             99.6%              1
ThreatTrack Vipre       1859      -                9             99.5%              1
ESET, Tencent           1841      -                27            98.6%              2
Emsisoft                1814      53               1             98.5%              2
Avast                   1840      -                28            98.5%              2
eScan                   1825      -                43            97.7%              2
BullGuard               1805      24               39            97.3%              2
Lavasoft                1817      -                51            97.3%              2
Fortinet                1803      -                65            96.5%              3
Sophos                  1796      2                70            96.2%              3
McAfee                  1788      -                80            95.7%              3
Quick Heal              1748      56               64            95.1%              3
Microsoft               1764      -                104           94.4%              3

³ Interested users who want to see the exact protection rates and FP rates for every month can see the monthly updated interactive charts on our website: http://chart.av-comparatives.org/chart1.php
⁴ Protection rate = blocked % + (user-dependent %)/2; user-dependent cases are given half credit. For example, if a program blocks 80% by itself, and another 20% of cases are user-dependent, we give half credit for the 20%, i.e. 10%, so it gets 90% altogether.
⁵ Hierarchical clustering method: clusters are defined using average linkage between groups (Euclidean distance) based on the protection rate (see the dendrogram below).
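The protection-rate formula from footnote 4 can be written out as a short calculation; the worked example below reproduces Emsisoft's row from the table above.

    # Protection rate as defined in this report: blocked % plus half of the
    # user-dependent %, since user-dependent cases are given half credit.

    def protection_rate(blocked, user_dependent, total):
        return (blocked + 0.5 * user_dependent) / total * 100

    # Emsisoft's row above: 1814 blocked, 53 user-dependent, 1 compromised.
    print(round(protection_rate(1814, 53, 1868), 1))   # 98.5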
The graph below shows the overall protection rate (all samples), including the minimum and maximum protection rates for the individual months.
Whole-Product "False Alarm" Test (wrongly blocked domains/files)

The false-alarm test in the Whole-Product Dynamic "Real-World" Protection Test consists of two parts: wrongly blocked domains (while browsing) and wrongly blocked files (while downloading/installing). It is necessary to test both scenarios, because testing only one of the two could penalize products that focus mainly on one type of protection method, either URL filtering or on-access/behaviour/reputation-based file protection.

a) Wrongly blocked domains (while browsing)

We used around one thousand randomly chosen popular domains. Blocked non-malicious domains/URLs were counted as false positives (FPs). The wrongly blocked domains have been reported to the respective vendors for review and should now no longer be blocked.

By blocking whole domains, security products not only risk causing a loss of trust in their warnings, but may also cause financial damage (besides the damage to website reputation) to the domain owners, including loss of e.g. advertising revenue. We therefore strongly recommend that vendors block whole domains only where the domain's sole purpose is to carry/deliver malicious code, and otherwise block just the malicious pages (as long as they are indeed malicious). Products which tend to block URLs based e.g. on reputation may be more prone to this, and may also score higher in protection tests, as they may block many unpopular/new websites.

b) Wrongly blocked files (while downloading/installing)

We used around two thousand different applications listed either as top downloads or as new/recommended downloads on various download portals. The applications were downloaded from the original software developers' websites (rather than from the download portal host), saved to disk and installed, to see whether they were blocked at any stage of this procedure. Additionally, we included a few clean files that were encountered and disputed over the past months of the Real-World Protection Test.

The duty of security products is to protect against malicious sites/files, not to censor or limit access to well-known popular applications and websites only. If the user deliberately chooses a high security setting which warns that it may block some legitimate sites or files, this may be considered acceptable. However, we do not regard it as acceptable as a default setting, where the user has not been warned. As the test is done at points in time, and FPs on very popular software/websites are usually noticed and fixed within a few hours, it would be surprising to encounter FPs with very popular applications. For this reason, FP tests done e.g. only with very popular applications, or using only the top 50 files from whitelisted/monitored download portals, would be a waste of time and resources. Users do not care whether they are infected by malware that affects only them, and likewise they do not care whether a false positive affects only them. While it is preferable that FPs do not affect many users, the goal should be to avoid having any FPs and to protect against any malicious files, no matter how many users are affected or targeted. The prevalence of FPs based on user-base data is of interest for the internal QA testing of AV vendors, but for ordinary users it is important to know how accurately their product distinguishes between clean and malicious files.
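A minimal sketch of the clean-file part of this procedure is shown below; the helper methods are hypothetical stand-ins for the lab automation, and the logic is simplified to a download, install and check loop.

    # Sketch of the clean-file false-alarm check (illustrative only; helper
    # methods are hypothetical stand-ins for the real automation).

    def false_positive_test(product_vm, clean_app_urls):
        false_positives = []
        for url in clean_app_urls:
            product_vm.restore_snapshot("clean")
            path = product_vm.download(url)        # fetch from the developer's site
            product_vm.install(path)               # save to disk and install
            if product_vm.blocked_at_any_stage():  # download, save or install blocked?
                false_positives.append(url)
        return false_positives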
The table below shows the numbers of wrongly blocked domains/files:

                                  Wrongly blocked clean domains/files
                                  (blocked / user-dependent⁶)           Wrongly blocked score⁷
ThreatTrack Vipre                 1 / 0  (1)                            1
AVG, Bitdefender, Kaspersky Lab   2 / 0  (2)                            2
ESET                              3 / 0  (3)                            3
Lavasoft                          4 / 0  (4)                            4
Avast                             5 / 0  (5)                            5
Microsoft, Sophos                 6 / 0  (6)                            6
Fortinet                          7 / 0  (7)                            7
Avira, Tencent                    9 / 0  (9)                            9
McAfee                            14 / 0 (14)                           14
Quick Heal                        14 / 1 (15)                           14.5
Emsisoft                          3 / 23 (26)                           14.5
average                           (16)                                  16
eScan                             40 / 0 (40)                           40
F-Secure                          46 / 0 (46)                           46
Trend Micro                       52 / 0 (52)                           52
BullGuard                         57 / 1 (58)                           57.5

To determine which products have to be downgraded in our award scheme due to wrongly blocked sites/files, we supported our decision with statistical methods and the average scores. The following products with above-average FPs have been downgraded: BullGuard, eScan, F-Secure and Trend Micro.

⁶ Although user-dependent cases are extremely annoying for the user (especially on clean files), they were counted as only half a point for the wrongly blocked score (as for the protection rate).
⁷ Lower is better.
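The wrongly blocked score in the right-hand column follows directly from footnote 6; the sketch below reproduces the half-weighting with three rows from the table.

    # Wrongly blocked score: user-dependent blocks of clean domains/files are
    # counted as half a point, as described in footnote 6.

    def wrongly_blocked_score(blocked, user_dependent):
        return blocked + 0.5 * user_dependent

    print(wrongly_blocked_score(14, 1))   # Quick Heal -> 14.5
    print(wrongly_blocked_score(3, 23))   # Emsisoft   -> 14.5
    print(wrongly_blocked_score(57, 1))   # BullGuard  -> 57.5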
Prevalence of the FPs

According to some vendors, their own FPs are not seen at all in their user base (zero prevalence) or have a very low prevalence. Nevertheless, we want to give the best possible overview of prevalence data for the benefit of users of all the tested products. The table below shows the number of FPs for each product according to our amalgamated prevalence assessment, for which we used several sources of prevalence data. Some products may block files based solely on their prevalence, i.e. if a vendor does not have any data for a particular file, its product may treat the file as a threat. This of course helps to block many malicious files, but at the same time it can lead to higher false-alarm rates, by blocking clean files which currently have zero or very low prevalence in the user base of that particular vendor.

                    Very low   Low   Medium   High
Avast               5          0     0        0
AVG                 0          1     0        1
Avira               7          1     0        2
Bitdefender         1          0     0        1
BullGuard           14         19    14       11
Emsisoft            11         10    3        2
eScan               18         12    4        6
ESET                1          0     2        0
Fortinet            4          3     0        0
F-Secure            27         14    5        0
Kaspersky Lab       2          0     0        0
Lavasoft            1          2     0        1
McAfee              6          4     2        2
Microsoft           2          4     0        0
Quick Heal          5          6     3        2
Sophos              0          3     1        2
Tencent             3          1     3        2
ThreatTrack Vipre   0          1     0        0
Trend Micro         35         15    1        1

Key to prevalence ratings⁸:
Very low:  probably fewer than a hundred users
Low:       probably several hundreds of users
Medium:    probably several thousands of users
High:      probably several tens of thousands of users

⁸ These relate to our aggregated prevalence data, not to the data of the individual vendors.
Illustration of how awards were given

The dendrogram (using average linkage between groups) shows the results of the hierarchical cluster analysis. It indicates at what level of similarity the clusters are joined. The red line drawn on the dendrogram defines the level of similarity; each intersection with it indicates a group (in this case, three groups)⁹. Products with an above-average wrongly blocked score (FPs) are marked in red and downgraded according to the ranking system below.

Ranking system

              Protection score     Protection score   Protection score   Protection score
              Cluster¹⁰ 4          Cluster 3          Cluster 2          Cluster 1
< ∅ FPs       Tested               Standard           Advanced           Advanced+
> ∅ FPs       Tested               Tested             Standard           Advanced

(∅ = the average wrongly blocked score; see the previous section.)

⁹ As all products scored highly (over 90%) in this test, we have used three clusters instead of four.
¹⁰ See the protection-score clusters in the summary results table above.
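As an illustration of the clustering and ranking logic described above, the sketch below clusters the rounded protection rates from the summary table and maps clusters to awards. The scipy parameters mirror the stated method (average linkage, Euclidean distance, three clusters), but whether this exactly reproduces the report's dendrogram depends on the precise input data, so it should be read as an approximation rather than the lab's own code.

    # Illustrative sketch: hierarchical clustering of the protection rates
    # (average linkage, Euclidean distance) and the award mapping from the
    # ranking-system table above.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Rounded overall protection rates from the summary table (19 products):
    rates = np.array([100.0, 100.0, 99.9, 99.7, 99.7, 99.6, 99.5,
                      98.6, 98.6, 98.5, 98.5, 97.7, 97.3, 97.3,
                      96.5, 96.2, 95.7, 95.1, 94.4]).reshape(-1, 1)
    Z = linkage(rates, method="average", metric="euclidean")
    labels = fcluster(Z, t=3, criterion="maxclust")   # three clusters, as used here

    RANKS = ["Tested", "Standard", "Advanced", "Advanced+"]

    def award(cluster, above_avg_fps):
        """Map a protection cluster (1 = best) to an award level, downgrading
        one rank for an above-average wrongly blocked score."""
        return RANKS[max(len(RANKS) - cluster - (1 if above_avg_fps else 0), 0)]

    print(award(1, above_avg_fps=False))   # Advanced+  (e.g. Bitdefender)
    print(award(1, above_avg_fps=True))    # Advanced   (e.g. F-Secure)
    print(award(2, above_avg_fps=True))    # Standard   (e.g. eScan)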
Award levels reached in this test

The awards are decided and given by the testers based on the observed test results (after consulting statistical models). The following awards are given for the results reached in this Whole-Product Dynamic "Real-World" Protection Test:

AWARD LEVEL   PRODUCTS
ADVANCED+     Bitdefender, Kaspersky Lab, Avira, AVG, ThreatTrack
ADVANCED      F-Secure*, Trend Micro*, ESET, Tencent, Emsisoft, Avast, Lavasoft
STANDARD      eScan*, BullGuard*, Fortinet, Sophos, McAfee, Quick Heal, Microsoft
TESTED        -

* downgraded by one rank due to the score of wrongly blocked sites/files (FPs); see the ranking system above

Expert users who do not care about wrongly blocked files/websites (false alarms) are free to rely on the protection rates above, instead of our award ranking, which takes FPs into consideration.
Copyright and Disclaimer

This publication is Copyright © 2016 by AV-Comparatives®. Any use of the results, etc. in whole or in part, is ONLY permitted with the explicit written agreement of the management board of AV-Comparatives, prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No-one else involved in creating, producing or delivering test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use (or inability to use) the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives (July 2016)