Anti-Virus Comparative




File Detection Test of
Malicious Software
includes false alarm test



Language: English
March 2013

Last Revision: 4th April 2013

www.av-comparatives.org




  Table of Contents

   Tested Products
   Introduction
   Graph of missed samples
   Results
   False positive/alarm test
   Award levels reached in this test
   Copyright and Disclaimer








Tested Products



       AhnLab V3 Internet Security 8.0                              Fortinet FortiClient 5.0
       avast! Free Antivirus 8.0                                    G DATA AntiVirus 2013
       AVG Anti-Virus 2013                                          Kaspersky Anti-Virus 2013
       AVIRA Antivirus Premium 2013                                 McAfee AntiVirus Plus 2013
       Bitdefender Antivirus Plus 2013                              Microsoft Security Essentials 4.2
       BullGuard Antivirus 2013                                     Panda Cloud Free Antivirus 2.1.1
       eScan Anti-Virus 14.0                                        Sophos Anti-Virus 10.2
       Emsisoft Anti-Malware 7.0                                    Symantec Norton Anti-Virus 2013
       ESET NOD32 Antivirus 6.0                                     ThreatTrack Vipre Antivirus 2013
       F-Secure Anti-Virus 2013                                     Trend Micro Titanium AntiVirus+ 2013








Introduction

Before proceeding with this report, readers are advised to first read the methodology documents as
well as the information provided on our website.

The malware test set was frozen on 22nd February 2013 and consisted of 136,610 sample variants.
The products were updated on 28th February 2013 and tested under Microsoft¹ Windows 7 64-bit.
The following twenty up-to-date products were included in this public test (the most current
versions available at the time of testing):

       AhnLab V3 Internet Security 8.0.7.5                      Fortinet FortiClient 5.0.1.0199
       avast! Free Antivirus 8.0.1482                           G DATA AntiVirus 23.1.0.2
       AVG Anti-Virus 2013.0.2899                               Kaspersky Anti-Virus 13.0.1.4190 (e)
       AVIRA Antivirus Premium 13.0.0.3185                      McAfee AntiVirus Plus 12.1.253
       Bitdefender Anti-Virus+ 16.26.0.1739                     Microsoft Security Essentials 4.2.233.0
       BullGuard Antivirus 13.0.256                             Panda Cloud Free Antivirus 2.1.1
       eScan Anti-Virus 14.0.1400.1364                          Sophos Anti-Virus 10.2
       Emsisoft Anti-Malware 7.0.0.18                           Symantec² Norton Anti-Virus 20.2.1.22
       ESET NOD32 Antivirus 6.0.306.3                           ThreatTrack Vipre Antivirus 6.1.5493
       F-Secure Anti-Virus 12.77.100                            Trend Micro Titanium AntiVirus Plus 6.0.1215


Please try the products³ on your own system before making a purchase decision based on these
tests. There are also other program features and important factors to consider (e.g. price, ease of
use/management, compatibility, graphical user interface, language, support, etc.). Although very
important, the file detection rate of a product is only one aspect of a complete anti-virus
product. AV-Comparatives also provides a whole-product dynamic “real-world” protection test, as
well as other test reports covering different aspects/features of the products. We invite users to
look at our other tests, not only at this type of test, so that they are aware of the other types
of tests and reports we have been providing for several years, such as our Whole-Product
“Real-World” Protection Test. A good file detection rate is still one of the most important,
deterministic and reliable features of an anti-virus product. Additionally, most products provide
at least some kind of functionality to block (or at least warn about) malicious actions, e.g.
during the execution of malware, when all other on-access and on-demand detection/protection
mechanisms have failed (these protection features are evaluated in other types of tests that we
provide on our website).



¹ We additionally tested Windows Defender as included in Windows 8; we observed the same results as
achieved by Security Essentials under Windows 7.
² We added Symantec Norton to this test, even though Symantec did not apply for inclusion in our
test series, as the results were highly demanded by our readers and the press.
³ Information about additional third-party engines/signatures used inside the products: BullGuard,
Emsisoft, eScan and F-Secure are based on the Bitdefender engine. The tested version of G DATA
(2013) was based on the Avast and Bitdefender engines. G DATA 2014 is based on the Bitdefender
engine; the results for G DATA 2013 in this report are not applicable to G DATA 2014.





Most products run with their highest settings by default. Certain products switch to highest
settings automatically when malware is found (e.g. Avast, Symantec, Trend Micro), which makes it
impossible to test against various malware with true “default” settings. In order to get comparable
results, we set the few remaining products to highest settings, or leave them at lower settings, in
accordance with the respective vendors. We kindly ask vendors to provide strong settings by
default, i.e. to set their default settings to the highest levels of detection (as e.g. AVIRA and
Kaspersky do), especially for scheduled scans or scans initiated by the user. This is usually
already the case for on-access and/or on-execution scans (e.g. ESET and Kaspersky use higher
heuristic settings when files are executed). We allow the few remaining vendors which do not use
highest settings in on-demand scans to choose to be tested with higher settings, as they already
use higher settings by default on-access/on-execution; this way the results of the file detection
test are closer to real-world usage (where the on-access settings are higher). We also ask vendors
to remove paranoid settings from the user interface which are too high ever to be of any benefit to
common users (e.g. AVG, F-Secure, Sophos). Below are some notes about the settings used (scan all
files, scan archives, scan for PUA and scan with cloud are always enabled):

AVG, F-Secure, Sophos: asked to be tested and awarded based on their default settings (i.e. without
their advanced heuristics / suspicious-file detections / thorough-scan setting).
AVIRA, Kaspersky: asked to be tested with heuristics set to high/advanced.

Several products make use of cloud technologies, which require an active internet connection. Our
tests are performed with an active internet connection. Users should be aware that detection rates
may in some cases be drastically lower if the scan is performed offline (or when the cloud is
unreachable for various reasons). The cloud should be considered an additional benefit/feature to
increase detection rates (as well as response times and false-alarm suppression), not a full
replacement for local offline detection. Vendors should make sure that users are warned if
connectivity to the cloud is lost, e.g. during a scan, as this may considerably reduce the
protection provided and render an initiated scan useless. While in our test we are able to check
whether the clouds of the security vendors are reachable, users should be aware that being online
does not necessarily mean that the cloud connection of the products they use is working properly.
In fact, products with cloud functionality sometimes have network issues due to which no security
is provided through the cloud, without the user being informed. In the near future, tools will be
available to verify the proper functioning of cloud-supported products.
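
As a purely illustrative sketch (not an AV-Comparatives or vendor tool), such a pre-scan
connectivity check with a user warning could look like the following; the endpoint URL and the
timeout value are hypothetical placeholders.

    # Illustrative sketch only: warn the user before a scan if the cloud lookup
    # service cannot be reached. The endpoint URL below is a hypothetical placeholder.
    import urllib.error
    import urllib.request

    CLOUD_ENDPOINT = "https://cloud.example-vendor.invalid/health"  # hypothetical

    def cloud_reachable(url: str = CLOUD_ENDPOINT, timeout: float = 5.0) -> bool:
        """Return True if the cloud endpoint answers within the timeout."""
        try:
            with urllib.request.urlopen(url, timeout=timeout):
                return True
        except (urllib.error.URLError, OSError):
            return False

    if not cloud_reachable():
        print("WARNING: cloud service unreachable - detection during this scan may be degraded.")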

The test set used was built by consulting telemetry data, with the aim of including mainly
prevalent samples from the last weeks/months which are or were hitting users in the field. We
applied a clustering method to classify similar files, keeping one sample (the most prevalent one)
per variant. This allows us to evaluate prevalent similar samples while reducing the size of the
set without introducing bias. Each miss is therefore intended to represent one missed group/variant
of files.
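
A minimal sketch of this reduction step, assuming each sample record already carries a variant
label and a prevalence count (the field names below are assumptions made for illustration):

    # Illustrative sketch: keep one representative sample (the most prevalent one)
    # per variant/cluster. The field names "variant_id" and "prevalence" are assumed.
    def reduce_test_set(samples):
        """samples: iterable of dicts, e.g. {"sha256": "...", "variant_id": "X", "prevalence": 42}"""
        best_per_variant = {}
        for s in samples:
            current = best_per_variant.get(s["variant_id"])
            if current is None or s["prevalence"] > current["prevalence"]:
                best_per_variant[s["variant_id"]] = s
        # Each remaining entry stands for one group/variant, so each miss in the
        # test represents one missed group/variant of similar files.
        return list(best_per_variant.values())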

Even though we deliver various tests and show different aspects of anti-virus software, users are
advised to evaluate the software themselves and form their own opinion of it. Test data or reviews
just provide guidance on some aspects that users cannot evaluate themselves. We suggest and
encourage readers to also research other independent test results provided by well-known and
established independent testing organizations, in order to get a better overview of the detection
and protection capabilities of the various products across different test scenarios and test sets.
A list of testing labs can be found on our website.






The malware detection rates are grouped by the testers after looking at the clusters built with the
hierarchical clustering method. By using clusters, there are no fixed thresholds to reach, as the
thresholds change depending on the results. The testers may adjust the groupings rationally rather
than relying solely on the clusters, so that if, for example, all products were to score badly in
future, they would not all receive high rankings regardless. We are still evaluating whether to
apply clusters also to the FP ranges, or to set the threshold for “Many” at 11 instead of 16.

                                        Detection Rate Clusters/Groups
                          (given by the testers after consulting statistical methods)

                                                   4           3            2            1
                     Very few (0-2 FPs)         TESTED     STANDARD     ADVANCED     ADVANCED+
                     Few (3-15 FPs)             TESTED     STANDARD     ADVANCED     ADVANCED+
                     Many (16-50 FPs)           TESTED     TESTED       STANDARD     ADVANCED
                     Very many (51-100 FPs)     TESTED     TESTED       TESTED       STANDARD
                     Crazy many (over 100 FPs)  TESTED     TESTED       TESTED       TESTED
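
Read as a lookup, this table combines a product's detection-rate cluster (1 = best group, 4 =
lowest) with its false-alarm range, where every FP range beyond “Few” costs one award level. A
small sketch of that mapping, using only the values shown in the table above:

    # Sketch of the award lookup defined by the table above.
    # Detection clusters: 1 = best group ... 4 = lowest group.
    AWARD_BY_CLUSTER = {1: "ADVANCED+", 2: "ADVANCED", 3: "STANDARD", 4: "TESTED"}
    ORDER = ["TESTED", "STANDARD", "ADVANCED", "ADVANCED+"]

    def fp_penalty(false_positives: int) -> int:
        """Award levels deducted for the given number of false alarms."""
        if false_positives <= 15:      # "Very few" (0-2) and "Few" (3-15): no deduction
            return 0
        if false_positives <= 50:      # "Many" (16-50)
            return 1
        if false_positives <= 100:     # "Very many" (51-100)
            return 2
        return 3                       # "Crazy many" (over 100): always TESTED

    def award(cluster: int, false_positives: int) -> str:
        base = ORDER.index(AWARD_BY_CLUSTER[cluster])
        return ORDER[max(0, base - fp_penalty(false_positives))]

    # Example: a cluster-1 product with 19 false alarms ("Many") drops one level to ADVANCED.
    assert award(1, 19) == "ADVANCED"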




Graph of missed samples (lower is better)




The graph above shows the test results against the “out-of-the-box” malware detection provided by
Microsoft Windows. In Windows 8, this is provided by Windows Defender, which is pre-installed by
default with the operating system. The equivalent in Windows 7 is Microsoft Security Essentials,
which is not pre-installed but can easily be added free of charge via the Windows Update service.
We use this baseline to show the added value, in terms of static malware detection, that
third-party anti-virus solutions provide to users. This comparison will from now on also be
provided in other tests, e.g. monthly in our Real-World Protection Tests (for products
participating in the public main test series), available on our website.








Results

Total detection rates (clustered in groups):

Please also consider the false alarm rates when looking at the file detection rates below⁴.

     1.  G DATA 2013                                 99.9%
     2.  AVIRA                                       99.6%
     3.  F-Secure                                    99.5%
     4.  Bitdefender, eScan, BullGuard,
         Panda, Emsisoft                             99.3%
     5.  Kaspersky                                   99.2%
     6.  Fortinet, Vipre                             98.6%
     7.  AVG, Trend Micro                            98.4%
     8.  Sophos, McAfee                              98.0%
     9.  Avast                                       97.8%
     10. ESET                                        97.5%
     11. AhnLab                                      92.3%
     12. Microsoft                                   92.0%
     13. Symantec                                    91.2%


The test set used contained 136,610 recent/prevalent samples from the last weeks/months.
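
As the percentages are rounded to one decimal place, only the approximate number of missed
groups/variants behind each result can be recovered from the set size; a quick sketch:

    # Rough back-calculation of missed groups/variants from the rounded detection rates.
    TEST_SET_SIZE = 136_610

    def approx_misses(detection_rate_percent: float) -> int:
        return round(TEST_SET_SIZE * (1 - detection_rate_percent / 100))

    print(approx_misses(99.9))   # ~137 missed groups/variants
    print(approx_misses(91.2))   # ~12,022 missed groups/variants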

Hierarchical Cluster Analysis

This dendrogram shows the results of the cluster analysis⁵. It indicates at what level of
similarity the clusters are joined. The red line drawn on the dendrogram defines the chosen level
of similarity; each intersection indicates a group.




⁴ We estimate the remaining error margin on the final percentages to be below 0.2%.
⁵ For more information about cluster analysis, see e.g. this easy-to-understand tutorial:
http://strata.uga.edu/software/pdf/clusterTutorial.pdf
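
For readers who want to experiment with the grouping idea themselves, below is a minimal sketch
using SciPy's hierarchical clustering on the one-dimensional detection rates from the results
table; the linkage method and the cut distance are illustrative choices, not the exact parameters
used by the testers.

    # Illustrative hierarchical clustering of the detection rates
    # (not the testers' exact procedure).
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    rates = np.array([99.9, 99.6, 99.5, 99.3, 99.2, 98.6, 98.4, 98.0,
                      97.8, 97.5, 92.3, 92.0, 91.2]).reshape(-1, 1)

    Z = linkage(rates, method="average")               # build the dendrogram hierarchy
    groups = fcluster(Z, t=1.5, criterion="distance")  # cut at an illustrative distance
    print(groups)  # equal labels = same detection-rate group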






False positive/alarm test
In order to better evaluate the quality of the file detection capabilities of anti-virus products
(i.e. their ability to distinguish good files from malicious files), we provide a false alarm test.
False alarms can sometimes cause as much trouble as a real infection. Please consider the false
alarm rate when looking at the detection rates, as a product which is prone to causing false alarms
achieves high detection scores more easily.

False Positive Results

Number of false alarms found in our set of clean files (lower is better):
     1.  Microsoft                              0         very few FPs (0-2)

     2.  Fortinet                               5
     3.  Kaspersky, Sophos                      6
     4.  AVIRA                                  8
     5.  Bitdefender, BullGuard, ESET           9         few FPs (3-15)
     6.  F-Secure                              11
     7.  Avast                                 14
     8.  McAfee                                15

     9.  AhnLab, G DATA                        19
     10. AVG, eScan                            21
     11. Trend Micro                           22
     12. Symantec                              23         many FPs (16-50)
     13. Panda                                 28
     14. Vipre                                 30
     15. Emsisoft                              38

Details about the discovered false alarms (including their assumed prevalence) can be seen in a
separate report available at: http://www.av-comparatives.org/images/stories/test/fp/avc_fp_mar2013.pdf




A product that detects a high percentage of malicious files but suffers from false alarms is not
necessarily better than a product which detects fewer malicious files but generates fewer false
alarms.








Award levels reached in this test

AV-Comparatives provides a ranking award. As this report also contains the raw detection rates and
not only the awards, expert users who, for example, do not care about false alarms can rely on the
detection rate alone if they wish.

                                 AWARDS                                       PRODUCTS⁶
                 (based on detection rates and false alarms)

                               ADVANCED+                                      AVIRA
                                                                              F-Secure
                                                                              Bitdefender
                                                                              BullGuard
                                                                              Kaspersky

                               ADVANCED                                       G DATA*
                                                                              eScan*
                                                                              Panda*
                                                                              Emsisoft*
                                                                              Fortinet
                                                                              Sophos
                                                                              McAfee
                                                                              Avast
                                                                              ESET

                               STANDARD                                       Vipre*
                                                                              AVG*
                                                                              Trend Micro*

                               TESTED                                         AhnLab*
                                                                              Symantec*

                     *: these products received lower awards due to false alarms

The awards are not based only on detection rates; false positives found in our set of clean files
are also considered. The “Detection Rate Clusters/Groups” table earlier in this report shows how
the awards are assigned.




⁶ Microsoft security products are no longer included on the awards page, as their out-of-the-box
detection is included in the operating system and is therefore considered out of competition.






Copyright and Disclaimer

This publication is Copyright © 2013 by AV-Comparatives e.V. ®. Any use of the results, etc. in whole
or in part, is ONLY permitted after the explicit written agreement of the management board of AV-
Comparatives e.V., prior to any publication. AV-Comparatives e.V. and its testers cannot be held liable
for any damage or loss which might occur as a result of, or in connection with, the use of the
information provided in this paper. We take every possible care to ensure the correctness of the basic
data, but a liability for the correctness of the test results cannot be taken by any representative of
AV-Comparatives e.V. We do not give any guarantee of the correctness, completeness, or suitability
for a specific purpose of any of the information/content provided at any given time. No one else
involved in creating, producing or delivering test results shall be liable for any indirect, special or
consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the
services provided by the website, test documents or any related data. AV-Comparatives e.V. is a
registered Austrian Non-Profit-Organization.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

                                                                      AV-Comparatives e.V. (April 2013)



