Home Anti-Virus Protection
January - March 2013
Dennis Technology Labs
www.DennisTechnologyLabs.com

This report aims to compare the effectiveness of anti-malware products provided by well-known security companies.

The products were exposed to internet threats that were live during the test period. This exposure was carried out in a realistic way, closely reflecting a customer's experience.

These results reflect what would have happened if a user had been running one of the products and had visited an infected website.

EXECUTIVE SUMMARY

Products tested
AVG Anti-Virus Free 2013
Avast! Free Antivirus 7
BitDefender Internet Security 2013
ESET Smart Security 6
Kaspersky Internet Security 2013
McAfee Internet Security 2013
Microsoft Security Essentials
Norton Internet Security 2013
Trend Micro Internet Security 2013

The effectiveness of paid-for anti-malware security suites varies widely, but all beat Microsoft's free product.
Every product was compromised at least once. The most effective were compromised just once, while the least effective (Microsoft Security Essentials) was compromised by one third of the threats.

Blocking malicious sites based on reputation is an effective approach.
Those products that prevented users from visiting the malicious sites in the first place gained a significant advantage. If the malware cannot download onto the victim's computer, the anti-malware software faces less of an ongoing challenge.

Some anti-malware programs are too harsh when evaluating legitimate software.
Most of the products generated at least one false positive. Trend Micro Internet Security 2013 was the least accurate, blocking 18 legitimate applications. AVG Anti-Virus Free 2013, Kaspersky Internet Security 2013, Norton Internet Security 2013 and Microsoft Security Essentials were the most accurate in this part of the test.

Which was the best product?
The most accurate programs were Symantec's Norton Internet Security 2013, Avast! Free Antivirus 7 and Kaspersky Internet Security 2013, all of which won our AAA award in this test.

Simon Edwards, Dennis Technology Labs, 2nd April 2013
CONTENTS

Executive Summary
1. Total Accuracy Ratings
2. Protection Ratings
3. Protection Scores
4. Protection Details
5. False Positives
6. The Tests
7. Test Details
8. Conclusions
Appendix A: Terms Used
Appendix B: FAQs
Endnotes
1. TOTAL ACCURACY RATINGS

The total accuracy ratings provide a way to judge how effectively the security programs work by looking at a single graph. Anti-malware software should not just detect threats; it should allow legitimate software to run unhindered as well. The results below take into account how accurately the programs treated threats and handled legitimate software.

[Chart: Total Accuracy. The total accuracy ratings take into account successes and failures with both malware and legitimate applications.]

We ran two distinct tests: one that measured how the products handled internet threats and one that measured how they handled legitimate programs. The ideal product would block all threats and allow all legitimate applications.

When a product fails to protect the system against a threat, it is compromised. When it warns against, or even blocks, legitimate software, it generates a 'false positive' result.

Products gain points for stopping threats successfully and for allowing users to install and run legitimate software. Products lose points for failing to stop threats and for handling legitimate files incorrectly.

Each product then receives a final rating based on its performance in each of the 'threat' and 'legitimate software' tests. These results show a combined accuracy rating, taking into account each product's performance with both threats and non-malicious software. There is a maximum possible score of 400 and a minimum of -1,000.

See 5. False Positives for detailed results and an explanation of how the false positive ratings are calculated.
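The report does not spell out the exact formula for combining the two tests, but the published figures are consistent with a simple sum of each product's protection rating (section 2, maximum 300, minimum -500) and its false positive rating (section 5.5, maximum 100, minimum -500), with the percentage taken against the 400-point maximum. The sketch below illustrates that reading; the function and figures used in the examples are drawn from sections 2 and 5.5, while the function name itself is ours, not the lab's.

```python
# A minimal sketch, assuming Total Accuracy = Protection Rating + False Positive Rating.

def total_accuracy(protection_rating: float, false_positive_rating: float) -> tuple[float, int]:
    """Return the combined rating and its percentage of the 400-point maximum."""
    total = protection_rating + false_positive_rating
    return total, round(100 * total / 400)

# Example: Norton Internet Security 2013 (protection 288, false positives 100)
print(total_accuracy(288, 100))  # (388, 97)  -> matches the published AAA result
# Example: Microsoft Security Essentials (protection 27, false positives 100)
print(total_accuracy(27, 100))   # (127, 32)
```

Under this reading the same sum reproduces every row of the table that follows.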
TOTAL ACCURACY RATINGS

Product                               Total Accuracy Rating   Percentage   Award
Norton Internet Security 2013         388                     97%          AAA
Avast! Free Antivirus 7               381                     95%          AAA
Kaspersky Internet Security 2013      379                     95%          AAA
BitDefender Internet Security 2013    355                     89%          A
ESET Smart Security 6                 348                     87%          A
Trend Micro Internet Security 2013    331.3                   83%          B
McAfee Internet Security 2013         317                     79%          C
AVG Anti-Virus Free 2013              296                     74%          -
Microsoft Security Essentials         127                     32%          -

Awards
The following products win Dennis Technology Labs awards:

Norton Internet Security 2013
Avast! Free Antivirus 7
Kaspersky Internet Security 2013
BitDefender Internet Security 2013
ESET Smart Security 6
Trend Micro Internet Security 2013
McAfee Internet Security 2013
2. PROTECTION RATINGS

The following results show how each product was scored for its accuracy in handling malware only. They do not take false positives into account.

Neutralize (+1)
If the product terminated a running threat, the result was a neutralization. The product protected the system and was awarded one point.

Neutralize, complete remediation (+2)
The product was awarded a bonus point if, in addition to stopping the malware, it removed all hazardous traces of the attack.

Defense (+3)
Products that prevented threats from running 'defended' the system and were awarded three points.

Compromise (-5)
If the threat ran uninhibited on the system, or the system was damaged, five points were deducted.

The best possible protection rating is 300 and the worst is -500.

[Chart: Protection Ratings. With protection ratings we award products extra points for completely blocking a threat, while removing points when they are compromised by a threat.]

How we calculate the ratings
Norton Internet Security 2013 defended against 97 of the 100 threats. It gained three points for each defense (3x97), totaling 291. It neutralized two threats (1x2), gaining two further points and bringing the subtotal to 293. One compromise (-5x1) reduced the final rating to 288.

AVG Anti-Virus Free 2013 scored much lower, although it protected the system against 95 per cent of the threats. This is because it often failed to completely remediate the neutralized threats. It defended 63 times, neutralized threats 32 times and was compromised five times. Its score is calculated like this: (3x63) + (1x32) + (-5x5) = 196.

The score weighting gives credit to products that deny malware any opportunity to tamper with the system and penalizes heavily those that fail to prevent an infection.

It is possible to apply your own weightings if you feel that compromises should be penalized more or less heavily. To do so, use the results from 4. Protection Details; a worked sketch follows below.
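As a worked illustration of the weighting above, the sketch below recomputes the published ratings from the counts in section 4. It follows the report's own arithmetic (defense +3, neutralization +1, a further +1 for complete remediation, compromise -5). The report does not publish how many neutralizations earned the remediation bonus, so the examples use plain neutralizations only, as in the report's own worked examples; the function and variable names are ours.

```python
# A minimal sketch of the protection rating arithmetic described in section 2.
# The weights are taken from the report; the breakdown of remediation bonuses is not published.

WEIGHTS = {
    "defended": 3,      # threat prevented from running
    "neutralized": 1,   # running threat terminated
    "remediated": 2,    # neutralized and all hazardous traces removed
    "compromised": -5,  # threat ran uninhibited or damaged the system
}

def protection_rating(defended: int, neutralized: int, remediated: int, compromised: int) -> int:
    return (WEIGHTS["defended"] * defended
            + WEIGHTS["neutralized"] * neutralized
            + WEIGHTS["remediated"] * remediated
            + WEIGHTS["compromised"] * compromised)

# The report's own worked examples:
print(protection_rating(97, 2, 0, 1))   # Norton Internet Security 2013 -> 288
print(protection_rating(63, 32, 0, 5))  # AVG Anti-Virus Free 2013      -> 196
```

Changing the values in WEIGHTS is all that is needed to apply your own weighting to the counts in section 4.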
PROTECTION RATINGS

Product                               Protection Rating
Norton Internet Security 2013         288
Avast! Free Antivirus 7               286
Kaspersky Internet Security 2013      280
Trend Micro Internet Security 2013    271
BitDefender Internet Security 2013    270
ESET Smart Security 6                 260
McAfee Internet Security 2013         232
AVG Anti-Virus Free 2013              196
Microsoft Security Essentials         27
3. PROTECTION SCORES

The following illustrates the general level of protection, combining defended and neutralized results. No distinction is made between these different levels of protection: either a system is protected or it is not. The protection scores simply indicate how many times each product prevented a threat from compromising the system.

[Chart: Protection Scores]

PROTECTION SCORES

Product                               Protected Score
Avast! Free Antivirus 7               99
Norton Internet Security 2013         99
Kaspersky Internet Security 2013      98
BitDefender Internet Security 2013    97
Trend Micro Internet Security 2013    97
ESET Smart Security 6                 96
AVG Anti-Virus Free 2013              95
McAfee Internet Security 2013         94
Microsoft Security Essentials         67

(Average: 97 per cent)
4. PROTECTION DETAILS

The security products provided different levels of protection. When a product defended against a threat, it prevented the malware from gaining a foothold on the target system [i]. A threat might have been able to exploit or infect the system and, in some cases, the product neutralized it, either after the exploit ran or later. When it could not, the system was compromised.

[Chart: Protection Details. The chart shows how the products handled the attacks, ordered according to their protection scores. For overall protection scores see 3. Protection Scores.]

PROTECTION DETAILS

Product                               Defended   Neutralized   Compromised
Avast! Free Antivirus 7               96         3             1
Norton Internet Security 2013         97         2             1
Kaspersky Internet Security 2013      96         2             2
BitDefender Internet Security 2013    94         3             3
Trend Micro Internet Security 2013    94         3             3
ESET Smart Security 6                 92         4             4
AVG Anti-Virus Free 2013              63         32            5
McAfee Internet Security 2013         84         10            6
Microsoft Security Essentials         62         5             33
5. FALSE POSITIVES

5.1 False positive incidents

A security product needs to be able to protect the system from threats while allowing legitimate software to work properly. When legitimate software is misclassified, a false positive is generated.

We split the results into two main groups because most products we test take one of two basic approaches when handling legitimate programs: they either warn that the software is suspicious or take the more decisive step of blocking it. Blocking a legitimate application is more serious than issuing a warning because it directly hampers the user.

[Chart: False Positive Incidents, split into warnings and blockings. Products that generated false positives tended either to warn users about legitimate software or to block it completely.]
FALSE POSITIVE INCIDENTS

False Positive Type   Product                               Total
Warnings              Trend Micro Internet Security 2013    0
                      ESET Smart Security 6                 0
                      McAfee Internet Security 2013         0
                      BitDefender Internet Security 2013    1
                      Avast! Free Antivirus 7               3
                      Kaspersky Internet Security 2013      0
                      Microsoft Security Essentials         0
                      AVG Anti-Virus Free 2013              0
                      Norton Internet Security 2013         0
Blockings             Trend Micro Internet Security 2013    18
                      ESET Smart Security 6                 5
                      McAfee Internet Security 2013         5
                      BitDefender Internet Security 2013    4
                      Avast! Free Antivirus 7               1
                      Kaspersky Internet Security 2013      1
                      Microsoft Security Essentials         0
                      AVG Anti-Virus Free 2013              0
                      Norton Internet Security 2013         0

5.2 Taking file prevalence into account

The prevalence of each file is significant. If a product misclassified a common file then the situation would be more serious than if it blocked a less common one. That said, it is usually expected that anti-malware programs should not misclassify any legitimate software.

The files selected for the false positive testing were organized into five groups: Very High Impact, High Impact, Medium Impact, Low Impact and Very Low Impact. These categories were based on download numbers as reported by sites including Download.com at the time of testing. The ranges for these categories are recorded in the table below:

FALSE POSITIVE PREVALENCE CATEGORIES

Impact category     Prevalence (downloads in the previous week)
Very High Impact    more than 20,000
High Impact         1,000 - 20,000
Medium Impact       100 - 999
Low Impact          25 - 99
Very Low Impact     fewer than 25
5.3 Modifying scores

The following set of score modifiers was used to create an impact-weighted accuracy score. Each time a product allowed a new legitimate program to install and run, it was awarded one point. It lost points (or fractions of a point) if and when it generated false positives. We used the following score modifiers:

FALSE POSITIVE PREVALENCE SCORE MODIFIERS

False positive action   Impact category     Score modifier
Blocked                 Very High Impact    -5
                        High Impact         -2
                        Medium Impact       -1
                        Low Impact          -0.5
                        Very Low Impact     -0.1
Warning                 Very High Impact    -2.5
                        High Impact         -1
                        Medium Impact       -0.5
                        Low Impact          -0.25
                        Very Low Impact     -0.05

5.4 Distribution of impact categories

Products that scored highest were the most accurate when handling the legitimate applications used in the test. The best score possible is 100, while the worst would be -500 (assuming that all applications were classified as Very High Impact and were blocked). In fact the applications were not all Very High Impact; the table below shows the true distribution:

FALSE POSITIVE CATEGORY FREQUENCY

Prevalence Rating   Frequency
Very High Impact    31
High Impact         34
Medium Impact       15
Low Impact          10
Very Low Impact     10
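The sketch below shows how the impact-weighted rating in section 5.5 can be assembled from these modifiers, under our reading of the description above: an application that installs and runs earns one point (including one the product merely warned about), a blocked application earns nothing, and every false positive deducts its score modifier. The per-incident impact categories are not published, so the example outcomes are purely illustrative; only the modifier table and the one-point-per-allowed-application rule come from the text.

```python
# A minimal sketch of the false positive accuracy scoring (section 5.3), assuming the
# interpretation described in the lead-in. Sample outcomes are illustrative, not published data.

SCORE_MODIFIERS = {
    ("blocked", "very high"): -5.0,  ("warning", "very high"): -2.5,
    ("blocked", "high"):      -2.0,  ("warning", "high"):      -1.0,
    ("blocked", "medium"):    -1.0,  ("warning", "medium"):    -0.5,
    ("blocked", "low"):       -0.5,  ("warning", "low"):       -0.25,
    ("blocked", "very low"):  -0.1,  ("warning", "very low"):  -0.05,
}

def false_positive_rating(outcomes):
    """outcomes: one entry per legitimate application, either "allowed" or (action, impact)."""
    rating = 0.0
    for outcome in outcomes:
        if outcome == "allowed":
            rating += 1.0
        else:
            action, impact = outcome
            if action == "warning":
                rating += 1.0  # the program still installs and runs after a warning
            rating += SCORE_MODIFIERS[(action, impact)]
    return rating

# Illustrative only: 99 applications allowed cleanly plus one Medium Impact blocking
outcomes = ["allowed"] * 99 + [("blocked", "medium")]
print(false_positive_rating(outcomes))  # 98.0
```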
5.5 False positive ratings

Combining the impact categories with the weighted scores produces the following false positive accuracy ratings. When a product misclassified a popular program it faced a stronger penalty than if the file was more obscure.

FALSE POSITIVE RATINGS

Product                               Accuracy Rating
Microsoft Security Essentials         100
AVG Anti-Virus Free 2013              100
Norton Internet Security 2013         100
Kaspersky Internet Security 2013      99
Avast! Free Antivirus 7               95
ESET Smart Security 6                 88
BitDefender Internet Security 2013    85
McAfee Internet Security 2013         85
Trend Micro Internet Security 2013    60.3
6. THE TESTS

6.1 The threats

Providing a realistic user experience was important in order to illustrate what really happens when a user encounters a threat on the internet. For example, in these tests web-based malware was accessed by visiting an original, infected website using a web browser, and not downloaded from a CD or internal test website.

All target systems were fully exposed to the threats. This means that any exploit code was allowed to run, as were other malicious files. They were run and permitted to perform exactly as they were designed to, subject to checks made by the installed security software. A minimum time period of five minutes was provided to allow the malware an opportunity to act.

6.2 Test rounds

Tests were conducted in rounds. Each round recorded the exposure of every product to a specific threat. For example, in 'round one' each of the products was exposed to the same malicious website. At the end of each round the test systems were completely reset to remove any possible trace of malware before the next test began.

6.3 Monitoring

Close logging of the target systems was necessary to gauge the relative successes of the malware and the anti-malware software. This included recording activity such as network traffic, the creation of files and processes, and changes made to important files.

6.4 Levels of protection

The products displayed different levels of protection. Sometimes a product would prevent a threat from executing, or at least from making any significant changes to the target system. In other cases a threat might be able to perform some tasks on the target (such as exploiting a security vulnerability or executing a malicious program), after which the security product would intervene and remove some or all of the malware. Finally, a threat may be able to bypass the security product and carry out its malicious tasks unhindered. It may even be able to disable the security software.

Occasionally Windows' own protection system might handle a threat while the anti-virus program ignored it. Another possible outcome is that the malware crashes for various reasons.

The different levels of protection provided by each product were recorded following analysis of the log files.

If malware failed to perform properly in a given incident, perhaps because of the very presence of the security product rather than any specific defending action that the product took, the product was given the benefit of the doubt and a Defended result was recorded.

If the test system was damaged, becoming hard to use following an attempted attack, this was counted as a compromise even if the active parts of the malware had eventually been removed by the product.

6.5 Types of protection

All of the products tested provided two main types of protection: real-time and on-demand. Real-time protection monitors the system constantly in an attempt to prevent a threat from gaining access. On-demand protection is essentially a 'virus scan' that is run by the user at an arbitrary time.

The test results note each product's behavior when a threat is introduced and afterwards. The real-time protection mechanism was monitored throughout the test, while an on-demand scan was run towards the end of each test to measure how safe the product determined the system to be.

Manual scans were run only when a tester determined that malware had interacted with the target system. In other words, if the security product claimed to block the attack at the initial stage, and the monitoring logs supported this claim, the case was considered closed and a Defended result was recorded.
7. TEST DETAILS

7.1 The targets

To create a fair testing environment, each product was installed on a clean Windows XP Professional target system. The operating system was updated with Windows XP Service Pack 3 (SP3), although no later patches or updates were applied.

We test with Windows XP SP3 and Internet Explorer 7 due to the high prevalence of internet threats that work with this combination. We also want to collect a full year's worth of Windows XP-based test data before upgrading to Windows 7. The prevalence of these threats suggests that there are many systems with this level of patching currently connected to the internet.

At the time of testing Windows XP was still being used heavily by consumers and businesses. According to Net Applications, which monitors the popularity of operating systems and web browsers, nearly as many people were using Windows XP as Windows 7. Windows XP was running on 39.5 per cent of PCs, while Windows 7 was installed on 44.4 per cent [ii].

Additionally, our aim is to test the security product, not the protection provided by keeping systems completely up to date with patches and other mechanisms.

A selection of legitimate but vulnerable software was pre-installed on the target systems. These applications posed security risks, as they contained known security issues. They included versions of Adobe Flash Player, Adobe Reader and Java.

A different security product was then installed on each system. Each product's update mechanism was used to download the latest version with the most recent definitions and other elements. Because the tests were carried out in real-time with live malicious websites, the products' update systems were allowed to run automatically and were also run manually before each test round. The products were also allowed to call home should they be programmed to query databases in real-time. Some products might automatically upgrade themselves during the test; at any given time of testing, the very latest version of each program was used.

Target systems used identical hardware, including an Intel Core 2 Duo processor, 1GB RAM, 160GB hard disk and DVD-ROM drive. Each was connected to the internet via its own virtual network (VLAN) to avoid cross-infection of malware.

7.2 Threat selection

The malicious web links (URLs) used in the tests were not provided by any anti-malware vendor. They were picked from lists generated by Dennis Technology Labs' own malicious site detection system, which uses popular search engine keywords submitted to Google. It analyses sites that are returned in the search results from a number of search engines and adds them to a database of malicious websites.

In all cases, a control system (Verification Target System - VTS) was used to confirm that the URLs linked to actively malicious sites. Malicious URLs and files are not shared with any vendors during the testing process.

7.3 Test stages

There were three main stages in each individual test:

1. Introduction
2. Observation
3. Remediation

During the Introduction stage, the target system was exposed to a threat. Before the threat was introduced, a snapshot was taken of the system. This created a list of Registry entries and files on the hard disk. The threat was then introduced.

Immediately after the system's exposure to the threat, the Observation stage was reached. During this time, which typically lasted at least 10 minutes, the tester monitored the system both visually and using a range of third-party tools. The tester reacted to pop-ups and other prompts according to the directives described in 7.5 Observation and intervention.
If hostile activity towards other internet users was observed, such as when spam was being sent by the target, this stage was cut short.

The Observation stage concluded with another system snapshot. This 'exposed' snapshot was compared to the original 'clean' snapshot and a report generated. The system was then rebooted.

The Remediation stage is designed to test the products' ability to clean an infected system. If a product defended against the threat in the Observation stage, this stage was skipped. Otherwise, an on-demand scan was run on the target, after which a 'scanned' snapshot was taken. This was compared to the original 'clean' snapshot and a report was generated.

All log files, including the snapshot reports and the product's own log files, were recovered from the target. In some cases the target became so damaged that log recovery was considered impractical. The target was then reset to a clean state, ready for the next test.

7.4 Threat introduction

Malicious websites were visited in real-time using the web browser. This risky behavior was conducted using live internet connections. URLs were typed manually into the browser.

Web-hosted malware often changes over time. Visiting the same site over a short period can expose systems to what appear to be a range of threats (although it may be the same threat, slightly altered to avoid detection). Also, many infected sites will only attack a particular IP address once, which makes it hard to test more than one product against the same threat.

In order to improve the chances that each target system received the same experience from a malicious web server, we used a web replay system. When the verification target systems visited a malicious site, the page's content, including malicious code, was downloaded, stored and loaded into the replay system. When each target system subsequently visited the site, it received exactly the same content.

The network configurations were set to allow all products unfettered access to the internet throughout the test, regardless of the web replay systems.

7.5 Observation and intervention

Throughout each test, the target system was observed both manually and in real-time. This enabled the tester to take comprehensive notes about the system's perceived behavior, as well as to compare visual alerts with the products' log entries.

At certain stages the tester was required to act as a regular user. To achieve consistency, the tester followed a policy for handling certain situations, including dealing with pop-ups displayed by products or the operating system, system crashes, invitations by malware to perform tasks and so on. This user behavior policy included the following directives:

1. Act naively. Allow the threat a good chance to introduce itself to the target by clicking OK to malicious prompts, for example.
2. Don't be too stubborn in retrying blocked downloads. If a product warns against visiting a site, don't take further measures to visit that site.
3. Where malware is downloaded as a Zip file, or similar, extract it to the Desktop then attempt to run it. If the archive is protected by a password, and that password is known to you (e.g. it was included in the body of the original malicious email), use it.
4. Always click the default option. This applies to security product pop-ups, operating system prompts (including the Windows firewall) and malware invitations to act.
5. If there is no default option, wait. Give the prompt 20 seconds to choose a course of action automatically.
6. If no action is taken automatically, choose the first option. Where options are listed vertically, choose the top one. Where options are listed horizontally, choose the left-hand one.
7.6 Remediation

When a target is exposed to malware, the threat may have a number of opportunities to infect the system, and the security product has a number of chances to protect the target. The snapshots explained in 7.3 Test stages provided information that was used to analyze a system's final state at the end of a test.

Before, during and after each test, a 'snapshot' of the target system was taken to provide information about what had changed during the exposure to malware. For example, comparing a snapshot taken before a malicious website was visited to one taken after might highlight new entries in the Registry and new files on the hard disk.

Snapshots were also used to determine how effective a product was at removing a threat that had managed to establish itself on the target system. This analysis gives an indication of the levels of protection that a product has provided.

These levels of protection have been recorded using three main terms: defended, neutralized and compromised. A threat that was unable to gain a foothold on the target was defended against; one that was prevented from continuing its activities was neutralized; while a successful threat was considered to have compromised the target.

A defended incident occurs where no malicious activity is observed with the naked eye or third-party monitoring tools following the initial threat introduction. The snapshot report files are used to verify this happy state.

If a threat is observed to run actively on the system, but not beyond the point where an on-demand scan is run, it is considered to have been neutralized. Comparing the snapshot reports should show that malicious files were created and Registry entries were made after the introduction. However, as long as the 'scanned' snapshot report shows that either the files have been removed or the Registry entries have been deleted, the threat has been neutralized.

The target is compromised if malware is observed to run after the on-demand scan. In some cases a product might request a further scan to complete the removal. We considered secondary scans to be acceptable, but continual scan requests were ignored once no progress was being made. An edited 'hosts' file or altered system file also counted as a compromise.

7.7 Automatic monitoring

Logs were generated using third-party applications, as well as by the security products themselves. Manual observation of the target system throughout its exposure to malware (and legitimate applications) provided more information about the security products' behavior. Monitoring was performed directly on the target system and on the network.

Client-side logging
A combination of Process Explorer, Process Monitor, TcpView and Wireshark was used to monitor the target systems. Regshot was used between each testing stage to record a system snapshot. A number of scripts created by Dennis Technology Labs were also used to provide additional system information. Each product was able to generate some level of logging itself.

Process Explorer and TcpView were run throughout the tests, providing a visual cue to the tester about possible malicious activity on the system. In addition, Wireshark's real-time output, and the display from the web proxy (see Network logging, below), indicated specific network activity such as secondary downloads.

Process Monitor also provided valuable information to help reconstruct malicious incidents. Both Process Monitor and Wireshark were configured to save their logs automatically to a file. This reduced data loss when malware caused a target to crash or reboot.

Network logging
All target systems were connected to a live internet connection, which incorporated a transparent web proxy and a network monitoring system. All traffic to and from the internet had to pass through this system. The network monitor was a dual-homed Linux system running as a transparent router, passing all web traffic through a Squid proxy.

An HTTP replay system ensured that all target systems received the same malware as each other. It was configured to allow access to the internet so that products could download updates and communicate with any available 'in the cloud' servers.
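The outcome classification described in 7.6 lends itself to a simple decision rule. The sketch below is our summary of that logic, not the lab's actual tooling; the field names are illustrative, and in practice the decision was made by testers analysing snapshot reports and monitoring logs.

```python
# A minimal sketch of the defended / neutralized / compromised classification in section 7.6.

from dataclasses import dataclass

@dataclass
class TestOutcome:
    malicious_activity_observed: bool  # any activity seen after the threat was introduced
    active_after_scan: bool            # malware still running after the on-demand scan
    traces_removed_by_scan: bool       # files/Registry entries gone in the 'scanned' snapshot
    system_files_altered: bool         # e.g. an edited 'hosts' file
    system_damaged: bool               # target hard to use after the attempted attack

def classify(outcome: TestOutcome) -> str:
    # Compromise: malware survives the scan, system files are altered, or the system is damaged.
    if outcome.active_after_scan or outcome.system_files_altered or outcome.system_damaged:
        return "compromised"
    # Defense: no malicious activity observed at all following the threat introduction.
    if not outcome.malicious_activity_observed:
        return "defended"
    # Neutralization: the threat ran, but the on-demand scan removed its traces.
    if outcome.traces_removed_by_scan:
        return "neutralized"
    return "compromised"

# Example: an exploit ran but the on-demand scan removed every trace of it
print(classify(TestOutcome(True, False, True, False, False)))  # neutralized
```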
8. CONCLUSIONS

Where are the threats?
The threats used in this test were genuine, real-life threats that were infecting victims globally at the same time as we tested the products. In almost every case the threat was launched from a legitimate website that had been compromised by an attacker.

The types of infected or malicious sites were varied, which demonstrates that effective anti-virus software is essential for those who want to use the web on a Windows PC. Most threats installed automatically when a user visited the infected webpage. The infection was often invisible to a casual observer.

Where does protection start?
There were a significant number of compromises in this test, as well as a relatively large number of neutralizations. The strongest products blocked the site before it was even able to deliver its payload. The weakest tended to handle the threat after it had started to interact with the target system.

Sorting the wheat from the chaff
Norton Internet Security 2013 scored highest in terms of malware protection, while Avast! Free Antivirus 7 took a close second place. Kaspersky Internet Security 2013 came third.

Norton Internet Security 2013 was compromised just once and neutralized two threats; Avast! Free Antivirus 7 was also compromised only once but neutralized three threats. Kaspersky Internet Security 2013 was compromised twice and neutralized two threats. The products from BitDefender and Trend Micro were compromised three times and neutralized three threats each.

However, anti-malware products also need to be able to distinguish between malicious and non-malicious programs. This is where Trend Micro's product particularly failed to excel. Trend Micro Internet Security 2013 misclassified legitimate applications often, blocking 18 legitimate programs. This was far more than any other product in this test.

In contrast, Microsoft Security Essentials generated no false positives but was notably poor at protecting the system from malware. It failed to prevent 33 per cent of the threats from compromising the system.

Overall, considering each product's ability to handle both malware and legitimate applications, the winner was Norton Internet Security 2013, followed closely by Avast! Free Antivirus 7 and then Kaspersky Internet Security 2013. All three win the AAA award.

Anti-virus is important (but not a panacea)
This test shows that even with a relatively small sample set of 100 threats there is a significant difference in performance between the anti-virus programs. Most importantly, it illustrates this difference using real threats that attacked real computers at the time of testing.

The average protection level of the tested products is 97 per cent (see 3. Protection Scores). This figure is much lower than some detection results typically quoted in anti-malware marketing material.

The presence of anti-malware software can be seen to decrease the chances of a malware infection even when the only sites being visited are proven to be actively malicious. That said, not one product produced a 100 per cent protection rate, while most generated false positive results.
APPENDIX A: TERMS USED

Compromised: Malware continues to run on an infected system, even after an on-demand scan.
Defended: Malware was prevented from running on, or making changes to, the target.
False Positive: A legitimate application was incorrectly classified as being malicious.
Introduction: Test stage where a target system is exposed to a threat.
Neutralized: Malware or exploit was able to run on the target, but was then removed by the security product.
Observation: Test stage during which malware may affect the target.
On-demand (protection): Manual 'virus' scan, run by the user at an arbitrary time.
Prompt: Questions asked by software, including malware, security products and the operating system. With security products, prompts usually appear in the form of pop-up windows. Some prompts don't ask questions but provide alerts. When these appear and disappear without a user's interaction, they are called 'toasters'.
Real-time (protection): The 'always-on' protection offered by many security products.
Remediation: Test stage that measures a product's ability to remove any installed threat.
Round: Test series of multiple products, exposing each target to the same threat.
Snapshot: Record of a target's file system and Registry contents.
Target: Test system exposed to threats in order to monitor the behavior of security products.
Threat: A program or other measure designed to subvert a system.
Update: Code provided by a vendor to keep its software up to date. This includes virus definitions, engine updates and operating system patches.
APPENDIX B: FAQS

This test was unsponsored. The test rounds were conducted between 24th January 2013 and 28th March 2013 using the most up-to-date versions of the software available on any given day. All products were able to communicate with their back-end systems over the internet.

The products selected for this test were chosen by Dennis Technology Labs. Samples were located and verified by Dennis Technology Labs. Products were exposed to threats within 24 hours of the same threats being verified; in practice there was only a delay of up to three to four hours.

Details of the samples, including their URLs and code, were provided to partner vendors only after the test was complete. The sample set comprised 100 actively-malicious URLs and 100 legitimate applications.

Do participating vendors know what samples are used, before or during the test?
No. We don't even know what threats will be used until the test starts. Each day we find new ones, so it is impossible for us to give this information before the test starts. Neither do we disclose this information until the test has concluded.

What is the difference between a vendor and a partner vendor?
Partner vendors contribute financially to the test in return for a preview of the results, an opportunity to challenge results before publication and the right to use award logos in marketing material. Other participants first see the results on the day of publication and may not use award logos for any purpose.

Do you share samples with the vendors?
Partner vendors are able to download all samples from us after the test is complete. Other vendors may request a subset of the threats that compromised their products in order to verify our results. The same applies to client-side logs, including the network capture files. There is a small administration fee for the provision of this service.

What is a sample?
In our tests a sample is not simply a set of malicious executable files that runs on the system. A sample is an entire replay archive that enables researchers to replicate the incident, even if the original infected website is no longer available. This means that it is possible to reproduce the attack and to determine which layer of protection it was able to bypass. Replaying the attack should, in most cases, produce the relevant executable files. If not, these are usually available in the client-side network capture (pcap) file.
WHILE EVERY EFFORT IS MADE TO ENSURE THE ACCURACY OF THE INFORMATION PUBLISHED IN THIS DOCUMENT, NO GUARANTEE IS EXPRESSED OR IMPLIED AND DENNIS PUBLISHING LTD DOES NOT ACCEPT LIABILITY FOR ANY LOSS OR DAMAGE THAT MAY ARISE FROM ANY ERRORS OR OMISSIONS.

ENDNOTES

[i] Our testing policy gives each product the benefit of the doubt when malware fails to run. This is because, whether the product intentionally blocked the threat or not, a real user would not have been infected in similar circumstances. See 6.4 Levels of protection for our full set of layered protection policies.

When we exposed the system protected by Avast! Free Antivirus 7 to the threats, we saw a large number (21 per cent) of cases in which the malware did not manifest itself. The product neither produced any visible alerts nor generated logs to indicate the detection or blocking of an attack. However, the live threats involved continued to work against unprotected systems, and our analysis shows that the Avast! system remained secure throughout. For this reason we categorised these incidents as Defended results.

This is a statement from Avast!: "We were not able to reproduce the behavior in our test lab. We will continue to work with Dennis Technology Labs to better understand the cause of the problem. Needless to say, this is obviously just a little glitch as none of the affected systems have been compromised."

[ii] http://news.cnet.com/8301-10805_3-57567081-75/windows-8-ekes-out-2.2-percent-market-share/