HOME ANTI-VIRUS PROTECTION
OCTOBER - DECEMBER 2012
Dennis Technology Labs
www.DennisTechnologyLabs.com

This report aims to compare the effectiveness of anti-malware products provided by well-known security companies.

The products were exposed to internet threats that were live during the test period. This exposure was carried out in a realistic way, closely reflecting a customer's experience. These results reflect what would have happened if a user was using one of the products and visited an infected website.

EXECUTIVE SUMMARY

Products tested

AVG Internet Security 2013
BitDefender Internet Security 2013
ESET Smart Security 5
Kaspersky Internet Security 2013
McAfee Internet Security 2013
Microsoft Security Essentials
Norton Internet Security 2013
Trend Micro Internet Security 2013

The effectiveness of paid-for anti-malware security suites varies widely, but all beat Microsoft's free product.
Nearly every product was compromised at least once. The most effective were compromised just once or not at all, while the least effective (Microsoft Security Essentials) was compromised by 41 per cent of the threats.

Blocking malicious sites based on reputation is an effective approach.
Those products that prevented users from visiting the malicious sites in the first place gained a significant advantage. If the malware can't download onto the victim's computer then the anti-malware software faces less of an ongoing challenge.

Some anti-malware programs are too harsh when evaluating legitimate software.
Most of the products generated at least one false positive. Trend Micro Internet Security 2013 was the least effective, blocking 21 legitimate applications. Kaspersky Internet Security 2013, Norton Internet Security 2013 and Microsoft Security Essentials were the most effective in this part of the test.

Which was the best product?
The most accurate program was Symantec's Norton Internet Security 2013, the only product to receive our AAA award in this test. Its performance was closely followed by that of Kaspersky Internet Security 2013 and ESET Smart Security 5, both of which earned AA awards.

Simon Edwards, Dennis Technology Labs, 31st December 2012
CONTENTS

Executive summary
Contents
1. Total Accuracy Ratings
2. Protection Ratings
3. Protection Scores
4. Protection Details
5. False Positives
6. The Tests
7. Test Details
8. Conclusions
Appendix A: Terms Used
Appendix B: FAQs
1. TOTAL ACCURACY RATINGS

The total accuracy ratings provide a way to judge how effectively the security programs work by looking at a single graph. Anti-malware software should not just detect threats. It should allow legitimate software to run unhindered as well. The results below take into account how accurately the programs treated threats and handled legitimate software.

[Chart: Total Accuracy, scored out of 400. The total accuracy ratings take into account successes and failures with both malware and legitimate applications.]

We ran two distinct tests: one that measured how the products handled internet threats and one that measured how they handled legitimate programs. The ideal product would block all threats and allow all legitimate applications.

When a product fails to protect the system against a threat it is compromised. When it warns against, or even blocks, legitimate software then it generates a 'false positive' result. Products gain points for stopping threats successfully and for allowing users to install and run legitimate software. Products lose points for failing to stop threats and when they handle legitimate files incorrectly.

Each product then receives a final rating based on its performance in each of the 'threat' and 'legitimate software' tests. These results show a combined accuracy rating, taking into account each product's performance with both threats and non-malicious software. There is a maximum possible score of 400 and a minimum of -1,000.

See 5. False Positives for detailed results and an explanation of how the false positive ratings are calculated.
TOTAL ACCURACY RATINGS

Product                               Total Accuracy Rating   Percentage   Award
Norton Internet Security 2013         388.5                   97%          AAA
Kaspersky Internet Security 2013      368                     92%          AA
ESET Smart Security 5                 359.5                   90%          AA
BitDefender Internet Security 2013    348.9                   87%          A
Trend Micro Internet Security 2013    340.1                   85%          A
AVG Internet Security 2013            335.5                   84%          B
McAfee Internet Security 2013         305.5                   76%          C
Microsoft Security Essentials         30                      8%           -

Awards

The following products win Dennis Technology Labs awards:

AAA: Norton Internet Security 2013
AA:  Kaspersky Internet Security 2013, ESET Smart Security 5
A:   BitDefender Internet Security 2013, Trend Micro Internet Security 2013
B:   AVG Internet Security 2013
C:   McAfee Internet Security 2013
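The report does not state the combination formula explicitly, but every row of the table above is consistent with the total accuracy rating being the simple sum of the protection rating (section 2) and the false positive rating (section 5.5); that sum also yields the stated range of -1,000 to 400. The following Python sketch, under that assumption, reproduces the published figures for four of the products:

```python
# Sketch: total accuracy as protection rating + false positive rating.
# Assumption (not spelled out in the report): the combined rating is a
# simple sum; this matches every row of the table (e.g. 289 + 99.5 = 388.5).

MAX_TOTAL = 400.0  # stated maximum possible score

# Component ratings taken from the tables in sections 2 and 5.5.
protection = {
    "Norton Internet Security 2013": 289,
    "Kaspersky Internet Security 2013": 268,
    "ESET Smart Security 5": 275,
    "Microsoft Security Essentials": -70,
}
false_positive = {
    "Norton Internet Security 2013": 99.5,
    "Kaspersky Internet Security 2013": 100,
    "ESET Smart Security 5": 84.5,
    "Microsoft Security Essentials": 100,
}

for product, rating in protection.items():
    total = rating + false_positive[product]
    print(f"{product}: {total} ({100 * total / MAX_TOTAL:.0f}%)")

# Norton Internet Security 2013: 388.5 (97%)
# Kaspersky Internet Security 2013: 368 (92%)
# ESET Smart Security 5: 359.5 (90%)
# Microsoft Security Essentials: 30 (8%)
```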
2. PROTECTION RATINGS

The following results show how each product was scored for its accuracy in handling malware only. They do not take into account false positives.

Defense (+3)
Products that prevented threats from running 'defended' the system and were awarded three points.

Neutralize (+1)
If the product terminated a running threat the result was a neutralization. The product protected the system and was awarded one point.

Neutralize, complete remediation (+2)
The product was awarded a bonus point if, in addition to stopping the malware, it removed all hazardous traces of the attack.

Compromise (-5)
If the threat ran uninhibited on the system, or the system was damaged, five points were deducted.

The best possible protection rating is 300 and the worst is -500.

[Chart: Protection Ratings. With protection ratings we award products extra points for completely blocking a threat, while removing points when they are compromised by a threat.]

How we calculate the ratings

Norton Internet Security 2013 defended against 96 of the 100 threats. It gained three points for each defense (3x96), totaling 288. It neutralized three threats with full remediation (2x3), gaining six further points and bringing the subtotal to 294. One compromise (-5x1) reduced the final rating to 289.

Trend Micro Internet Security 2013 scored lower, despite being the only product not to suffer a compromise. It sometimes failed to completely remediate the neutralized threats. It defended 90 times, neutralized threats with complete remediation three times, and neutralized without complete remediation seven times. Its score is calculated like this: (3x90) + (2x3) + (1x7) = 283.

The score weighting gives credit to products that deny malware any opportunity to tamper with the system and penalizes heavily those that fail to prevent an infection.

It is possible to apply your own weightings if you feel that compromises should be penalized more or less heavily. To do so use the results from 4. Protection Details.
PROTECTION RATINGS

Product                               Protection Rating
Norton Internet Security 2013         289
Trend Micro Internet Security 2013    283
ESET Smart Security 5                 275
BitDefender Internet Security 2013    268
Kaspersky Internet Security 2013      268
AVG Internet Security 2013            244
McAfee Internet Security 2013         217
Microsoft Security Essentials         -70
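Since the weighting scheme is stated explicitly above, the protection rating can be expressed as a one-line function. This sketch reproduces the report's two worked examples; the split of each product's neutralizations comes from the text in section 2:

```python
# Sketch of the stated weighting: +3 per defense, +1 per neutralization,
# +2 per neutralization with complete remediation, -5 per compromise.
# With 100 threats the possible range is -500 to +300.

def protection_rating(defended, neutralized_full, neutralized, compromised):
    return 3 * defended + 2 * neutralized_full + neutralized - 5 * compromised

# Norton: 96 defenses, 3 neutralizations with full remediation, 1 compromise.
print(protection_rating(96, 3, 0, 1))  # 289

# Trend Micro: 90 defenses, 3 full remediations, 7 plain neutralizations.
print(protection_rating(90, 3, 7, 0))  # 283
```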
3. PROTECTION SCORES

The following illustrates the general level of protection, combining defended and neutralized results. There is no distinction made between these different levels of protection. Either a system is protected or it is not.

[Chart: Protection Scores. The protection scores simply indicate how many times each product prevented a threat from compromising the system.]

PROTECTION SCORES

Product                               Protection Score
Trend Micro Internet Security 2013    100
Norton Internet Security 2013         99
ESET Smart Security 5                 98
BitDefender Internet Security 2013    97
Kaspersky Internet Security 2013      96
AVG Internet Security 2013            95
McAfee Internet Security 2013         93
Microsoft Security Essentials         59

(Average: 92 per cent)
4. PROTECTION DETAILS

The security products provided different levels of protection. When a product defended against a threat, it prevented the malware from gaining a foothold on the target system. A threat might have been able to exploit or infect the system and, in some cases, the product neutralized it either after the exploit ran or later. When it couldn't, the system was compromised.

[Chart: Protection Details. The graph shows details of how the products handled the attacks. They are ordered according to their protection scores. For overall protection scores see 3. Protection Scores.]

PROTECTION DETAILS

Product                               Sum Defended   Sum Neutralized   Sum Compromised
Trend Micro Internet Security 2013    90             10                0
Norton Internet Security 2013         96             3                 1
ESET Smart Security 5                 92             6                 2
BitDefender Internet Security 2013    91             6                 3
Kaspersky Internet Security 2013      96             0                 4
AVG Internet Security 2013            86             9                 5
McAfee Internet Security 2013         78             15                7
Microsoft Security Essentials         36             23                41
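As section 3 notes, the protection score draws no distinction between a defense and a neutralization, so each product's score is simply the sum of its Defended and Neutralized counts in the table above. A quick sketch verifying this against the section 3 table:

```python
# Sketch: protected = defended + neutralized (no distinction between the
# two). The tuples repeat the details table above; the printed scores
# match the table in section 3.

details = {  # product: (defended, neutralized, compromised)
    "Trend Micro Internet Security 2013": (90, 10, 0),
    "Norton Internet Security 2013": (96, 3, 1),
    "ESET Smart Security 5": (92, 6, 2),
    "BitDefender Internet Security 2013": (91, 6, 3),
    "Kaspersky Internet Security 2013": (96, 0, 4),
    "AVG Internet Security 2013": (86, 9, 5),
    "McAfee Internet Security 2013": (78, 15, 7),
    "Microsoft Security Essentials": (36, 23, 41),
}

scores = {p: d + n for p, (d, n, _) in details.items()}
for product, score in scores.items():
    print(f"{product}: {score}")  # Trend Micro: 100, Norton: 99, ...

average = sum(scores.values()) / len(scores)
print(f"Average: {average}")  # 92.125, reported as 92 per cent
```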
5. FALSE POSITIVES

5.1 False positive incidents

A security product needs to be able to protect the system from threats, while allowing legitimate software to work properly. When legitimate software is misclassified a false positive is generated.

We split the results into two main groups because most products we test take one of two basic approaches when attempting to protect the system from the legitimate programs. They either warn that the software was suspicious or take the more decisive step of blocking it.

Blocking a legitimate application is more serious than issuing a warning because it directly hampers the user.

[Chart: False Positive Incidents, split into warnings and blockings per product. Products that generated false positives tended to either warn users about legitimate software, or they blocked it completely.]
FALSE POSITIVE INCIDENTS

False Positive Type   Product                               Total
Warnings              Trend Micro Internet Security 2013    0
                      BitDefender Internet Security 2013    2
                      McAfee Internet Security 2013         2
                      ESET Smart Security 5                 2
                      Norton Internet Security 2013         0
                      AVG Internet Security 2013            2
                      Microsoft Security Essentials         0
                      Kaspersky Internet Security 2013      0
Blockings             Trend Micro Internet Security 2013    21
                      BitDefender Internet Security 2013    9
                      McAfee Internet Security 2013         4
                      ESET Smart Security 5                 3
                      Norton Internet Security 2013         1
                      AVG Internet Security 2013            1
                      Microsoft Security Essentials         0
                      Kaspersky Internet Security 2013      0

5.2 Taking file prevalence into account

The prevalence of each file is significant. If a product misclassified a common file then the situation would be more serious than if it blocked a less common one. That said, it is usually expected that anti-malware programs should not misclassify any legitimate software.

The files selected for the false positive testing were organized into five groups: Very High Impact, High Impact, Medium Impact, Low Impact and Very Low Impact. These categories were based on download numbers as reported by sites including Download.com at the time of testing. The ranges for these categories are recorded in the table below:

FALSE POSITIVE PREVALENCE CATEGORIES

Impact category    Prevalence (downloads in the previous week)
Very High Impact   >20,000
High Impact        1,000 - 20,000
Medium Impact      100 - 999
Low Impact         25 - 99
Very Low Impact    <25
5.3 Modifying scores

The following set of score modifiers was used to create an impact-weighted accuracy score. Each time a product allowed a new legitimate program to install and run it was awarded one point. It lost points (or fractions of a point) if and when it generated false positives. We used the following score modifiers:

FALSE POSITIVE PREVALENCE SCORE MODIFIERS

False positive action   Impact category    Score modifier
Blocked                 Very High Impact   -5
                        High Impact        -2
                        Medium Impact      -1
                        Low Impact         -0.5
                        Very Low Impact    -0.1
Warning                 Very High Impact   -2.5
                        High Impact        -1
                        Medium Impact      -0.5
                        Low Impact         -0.25
                        Very Low Impact    -0.05

5.4 Distribution of impact categories

Products that scored highest were the most accurate when handling the legitimate applications used in the test. The best score possible is 100, while the worst would be -500 (assuming that all applications were classified as Very High Impact and were blocked). In fact the distribution of applications in the impact categories was not restricted only to Very High Impact. The table below shows the true distribution:

FALSE POSITIVE CATEGORY FREQUENCY

Prevalence Rating   Frequency
Very High Impact    30
High Impact         35
Medium Impact       15
Low Impact          10
Very Low Impact     10
5.5 False positive ratings

Combining the impact categories with weighted scores produces the following false positive accuracy ratings.

[Chart: False Positive Ratings. When a product misclassified a popular program it faced a stronger penalty than if the file was more obscure.]

FALSE POSITIVE RATINGS

Product                               Accuracy Rating
Kaspersky Internet Security 2013      100
Microsoft Security Essentials         100
Norton Internet Security 2013         99.5
AVG Internet Security 2013            91.5
McAfee Internet Security 2013         88.5
ESET Smart Security 5                 84.5
BitDefender Internet Security 2013    80.9
Trend Micro Internet Security 2013    57.1
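The false positive rating follows from sections 5.3 and 5.4: a product starts from the best possible score of 100 (one point per correctly handled legitimate application) and loses the relevant modifier for each incident. The report does not publish each incident's impact category, so the examples in this sketch are illustrative; the one exception is Norton, whose single blocking and 99.5 rating together imply a Low Impact file:

```python
# Sketch of the section 5.3 scoring. The starting score is 100 (one point
# per legitimate application allowed); each false positive subtracts the
# modifier for its action and impact category.

MODIFIERS = {
    ("blocked", "very high"): -5.0,  ("warning", "very high"): -2.5,
    ("blocked", "high"): -2.0,       ("warning", "high"): -1.0,
    ("blocked", "medium"): -1.0,     ("warning", "medium"): -0.5,
    ("blocked", "low"): -0.5,        ("warning", "low"): -0.25,
    ("blocked", "very low"): -0.1,   ("warning", "very low"): -0.05,
}

def fp_rating(incidents):
    """incidents: a list of (action, impact_category) pairs."""
    return 100 + sum(MODIFIERS[i] for i in incidents)

# Norton's single blocking and final rating of 99.5 imply a Low Impact
# file (100 - 0.5 = 99.5):
print(fp_rating([("blocked", "low")]))  # 99.5

# Hypothetical product: blocks one Very High Impact file and warns
# about one Medium Impact file.
print(fp_rating([("blocked", "very high"), ("warning", "medium")]))  # 94.5
```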
6. THE TESTS

6.1 The threats

Providing a realistic user experience was important in order to illustrate what really happens when a user encounters a threat on the internet. For example, in these tests web-based malware was accessed by visiting an original, infected website using a web browser, and not downloaded from a CD or internal test website.

All target systems were fully exposed to the threats. This means that any exploit code was allowed to run, as were other malicious files. They were run and permitted to perform exactly as they were designed to, subject to checks made by the installed security software. A minimum time period of five minutes was provided to allow the malware an opportunity to act.

6.2 Test rounds

Tests were conducted in rounds. Each round recorded the exposure of every product to a specific threat. For example, in 'round one' each of the products was exposed to the same malicious website. At the end of each round the test systems were completely reset to remove any possible trace of malware before the next test began.

6.3 Monitoring

Close logging of the target systems was necessary to gauge the relative successes of the malware and the anti-malware software. This included recording activity such as network traffic, the creation of files and processes and changes made to important files.

6.4 Levels of protection

The products displayed different levels of protection. Sometimes a product would prevent a threat from executing, or at least making any significant changes to the target system. In other cases a threat might be able to perform some tasks on the target (such as exploiting a security vulnerability or executing a malicious program), after which the security product would intervene and remove some or all of the malware. Finally, a threat may be able to bypass the security product and carry out its malicious tasks unhindered. It may even be able to disable the security software.

Occasionally Windows' own protection system might handle a threat while the anti-virus program ignored it. Another outcome is that the malware may crash for various reasons.

The different levels of protection provided by each product were recorded following analysis of the log files. If malware failed to perform properly in a given incident, perhaps because of the very presence of the security product rather than any specific defending action that the product took, the product was given the benefit of the doubt and a Defended result was recorded. If the test system was damaged, becoming hard to use following an attempted attack, this was counted as a compromise even if the active parts of the malware had eventually been removed by the product.

6.5 Types of protection

All of the products tested provided two main types of protection: real-time and on-demand. Real-time protection monitors the system constantly in an attempt to prevent a threat from gaining access. On-demand protection is essentially a 'virus scan' that is run by the user at an arbitrary time.

The test results note each product's behavior when a threat is introduced and afterwards. The real-time protection mechanism was monitored throughout the test, while an on-demand scan was run towards the end of each test to measure how safe the product determined the system to be. Manual scans were run only when a tester determined that malware had made an interaction with the target system. In other words, if the security product claimed to block the attack at the initial stage, and the monitoring logs supported this claim, the case was considered closed and a Defended result was recorded.
7. TEST DETAILS

7.1 The targets

To create a fair testing environment, each product was installed on a clean Windows XP Professional target system. The operating system was updated with Windows XP Service Pack 3 (SP3), although no later patches or updates were applied. We test with Windows XP SP3 and Internet Explorer 7 due to the high prevalence of internet threats that work with this combination. The prevalence of these threats suggests that there are many systems with this level of patching currently connected to the internet.

At the time of testing Windows XP was still being used heavily by consumers and businesses. According to Net Applications, which monitors the popularity of operating systems and web browsers, nearly as many people were using Windows XP as Windows 7. Windows XP was running 39.5 per cent of PCs, while Windows 7 was installed on 44.4 per cent [1]. Additionally, our aim is to test the security product and not the protection provided by keeping systems completely up to date with patches and other mechanisms.

A selection of legitimate but vulnerable software was pre-installed on the target systems. These programs posed security risks, as they contained known security issues. They included versions of Adobe Flash Player, Adobe Reader and Java.

A different security product was then installed on each system. Each product's update mechanism was used to download the latest version with the most recent definitions and other elements. Due to the dynamic nature of the tests, which were carried out in real-time with live malicious websites, the products' update systems were allowed to run automatically and were also run manually before each test round was carried out. The products were also allowed to call home should they be programmed to query databases in real-time. Some products might automatically upgrade themselves during the test. At any given time of testing, the very latest version of each program was used.

Target systems used identical hardware, including an Intel Core 2 Duo processor, 1GB RAM, 160GB hard disk and DVD-ROM drive. Each was connected to the internet via its own virtual network (VLAN) to avoid cross-infection of malware.

7.2 Threat selection

The malicious web links (URLs) used in the tests were not provided by any anti-malware vendor. They were picked from lists generated by Dennis Technology Labs' own malicious site detection system, which uses popular search engine keywords submitted to Google. It analyses sites that are returned in the search results from a number of search engines and adds them to a database of malicious websites.

In all cases, a control system (Verification Target System - VTS) was used to confirm that the URLs linked to actively malicious sites. Malicious URLs and files are not shared with any vendors during the testing process.

7.3 Test stages

There were three main stages in each individual test:

1. Introduction
2. Observation
3. Remediation

During the Introduction stage, the target system was exposed to a threat. Before the threat was introduced, a snapshot was taken of the system. This created a list of Registry entries and files on the hard disk. The threat was then introduced.

Immediately after the system's exposure to the threat, the Observation stage is reached. During this time, which typically lasted at least 10 minutes, the tester monitored the system both visually and using a range of third-party tools. The tester reacted to pop-ups and other prompts according to the directives described below (see 7.5 Observation and intervention).

[1] http://news.cnet.com/8301-10805_3-57567081-75/windows-8-ekes-out-2.2-percent-market-share/
In the event that hostile activity to other internet users was observed, such as when spam was being sent by the target, this stage was cut short.

The Observation stage concluded with another system snapshot. This 'exposed' snapshot was compared to the original 'clean' snapshot and a report generated. The system was then rebooted.

The Remediation stage is designed to test the products' ability to clean an infected system. If a product defended against the threat in the Observation stage then this stage was skipped. Otherwise an on-demand scan was run on the target, after which a 'scanned' snapshot was taken. This was compared to the original 'clean' snapshot and a report was generated.

All log files, including the snapshot reports and the product's own log files, were recovered from the target. In some cases the target may become so damaged that log recovery is considered impractical. The target was then reset to a clean state, ready for the next test.

7.4 Threat introduction

Malicious websites were visited in real-time using the web browser. This risky behavior was conducted using live internet connections. URLs were typed manually into the browser.

Web-hosted malware often changes over time. Visiting the same site over a short period of time can expose systems to what appear to be a range of threats (although it may be the same threat, slightly altered to avoid detection). Also, many infected sites will only attack a particular IP address once, which makes it hard to test more than one product against the same threat.

In order to improve the chances that each target system received the same experience from a malicious web server, we used a web replay system. When the verification target systems visited a malicious site, the page's content, including malicious code, was downloaded, stored and loaded into the replay system. When each target system subsequently visited the site, it received exactly the same content.

The network configurations were set to allow all products unfettered access to the internet throughout the test, regardless of the web replay systems.

7.5 Observation and intervention

Throughout each test, the target system was observed both manually and in real-time. This enabled the tester to take comprehensive notes about the system's perceived behavior, as well as to compare visual alerts with the products' log entries.

At certain stages the tester was required to act as a regular user. To achieve consistency, the tester followed a policy for handling certain situations, including dealing with pop-ups displayed by products or the operating system, system crashes, invitations by malware to perform tasks and so on. This user behavior policy included the following directives:

1. Act naively. Allow the threat a good chance to introduce itself to the target by clicking OK to malicious prompts, for example.
2. Don't be too stubborn in retrying blocked downloads. If a product warns against visiting a site, don't take further measures to visit that site.
3. Where malware is downloaded as a Zip file, or similar, extract it to the Desktop then attempt to run it. If the archive is protected by a password, and that password is known to you (e.g. it was included in the body of the original malicious email), use it.
4. Always click the default option. This applies to security product pop-ups, operating system prompts (including Windows firewall) and malware invitations to act.
5. If there is no default option, wait. Give the prompt 20 seconds to choose a course of action automatically.
6. If no action is taken automatically, choose the first option. Where options are listed vertically, choose the top one. Where options are listed horizontally, choose the left-hand one.
7.6 Remediation

When a target is exposed to malware, the threat may have a number of opportunities to infect the system. The security product also has a number of chances to protect the target. The snapshots explained in 7.3 Test stages provided information that was used to analyze a system's final state at the end of a test.

Before, during and after each test, a 'snapshot' of the target system was taken to provide information about what had changed during the exposure to malware. For example, comparing a snapshot taken before a malicious website was visited to one taken after might highlight new entries in the Registry and new files on the hard disk. Snapshots were also used to determine how effective a product was at removing a threat that had managed to establish itself on the target system. This analysis gives an indication as to the levels of protection that a product has provided.

These levels of protection have been recorded using three main terms: defended, neutralized, and compromised. A threat that was unable to gain a foothold on the target was defended against; one that was prevented from continuing its activities was neutralized; while a successful threat was considered to have compromised the target.

A defended incident occurs where no malicious activity is observed with the naked eye or third-party monitoring tools following the initial threat introduction. The snapshot report files are used to verify this happy state.

If a threat is observed to run actively on the system, but not beyond the point where an on-demand scan is run, it is considered to have been neutralized. Comparing the snapshot reports should show that malicious files were created and Registry entries were made after the introduction. However, as long as the 'scanned' snapshot report shows that either the files have been removed or the Registry entries have been deleted, the threat has been neutralized.

The target is compromised if malware is observed to run after the on-demand scan. In some cases a product might request a further scan to complete the removal. We considered secondary scans to be acceptable, but continual scan requests were ignored if no progress was determined. An edited 'hosts' file or altered system file also counted as a compromise.

7.7 Automatic monitoring

Logs were generated using third-party applications, as well as by the security products themselves. Manual observation of the target system throughout its exposure to malware (and legitimate applications) provided more information about the security products' behavior. Monitoring was performed directly on the target system and on the network.

Client-side logging

A combination of Process Explorer, Process Monitor, TcpView and Wireshark was used to monitor the target systems. Regshot was used between each testing stage to record a system snapshot. A number of Dennis Technology Labs-created scripts were also used to provide additional system information. Each product was able to generate some level of logging itself.

Process Explorer and TcpView were run throughout the tests, providing a visual cue to the tester about possible malicious activity on the system. In addition, Wireshark's real-time output, and the display from the web proxy (see Network logging, below), indicated specific network activity such as secondary downloads. Process Monitor also provided valuable information to help reconstruct malicious incidents. Both Process Monitor and Wireshark were configured to save their logs automatically to a file. This reduced data loss when malware caused a target to crash or reboot.

Network logging

All target systems were connected to a live internet connection, which incorporated a transparent web proxy and a network monitoring system. All traffic to and from the internet had to pass through this system. The network monitor was a dual-homed Linux system running as a transparent router, passing all web traffic through a Squid proxy.

An HTTP replay system ensured that all target systems received the same malware as each other. It was configured to allow access to the internet so that products could download updates and communicate with any available 'in the cloud' servers.
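The defended/neutralized/compromised decision described in section 7.6 can be summarized as a small decision procedure over the observation-stage and post-scan evidence. This is only an illustrative condensation: the flag names are hypothetical, and in practice the verdicts were reached by comparing the Regshot snapshot reports and monitoring logs rather than by running code like this:

```python
# Illustrative condensation of the verdict logic; the flag names are
# hypothetical. In the test itself the evidence came from comparing the
# 'clean', 'exposed' and 'scanned' snapshots plus the monitoring logs.

def classify(ran_before_scan, ran_after_scan, traces_removed,
             system_files_altered):
    if ran_after_scan or system_files_altered:
        # Running after the on-demand scan, or an edited 'hosts' file or
        # other altered system file, counts as a compromise.
        return "Compromised"
    if ran_before_scan:
        # Active earlier but stopped by (or before) the on-demand scan.
        return ("Neutralized, complete remediation" if traces_removed
                else "Neutralized")
    # No malicious activity observed after the initial introduction.
    return "Defended"

print(classify(False, False, False, False))  # Defended
print(classify(True, False, True, False))    # Neutralized, complete remediation
print(classify(True, True, False, False))    # Compromised
```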
8. CONCLUSIONS

Where are the threats?

The threats used in this test were genuine, real-life threats that were infecting victims globally at the same time as we tested the products. In almost every case the threat was launched from a legitimate website that had been compromised by an attacker. The types of infected or malicious sites were varied, which demonstrates that effective anti-virus software is essential for those who want to use the web on a Windows PC. Most threats installed automatically when a user visited the infected webpage. This infection was often invisible to a casual observer.

Where does protection start?

There were a significant number of compromises in this test, as well as a relatively large number of neutralizations. The strongest products blocked the site before it was even able to deliver its payload. The weakest tended to handle the threat after it had started to interact with the target system.

Sorting the wheat from the chaff

Trend Micro Internet Security 2013 scored highest in terms of malware protection, while Norton Internet Security 2013, ESET Smart Security 5 and BitDefender Internet Security 2013 took second, third and fourth places respectively. Norton Internet Security 2013 was compromised once and neutralized three threats; ESET Smart Security 5 was compromised twice and neutralized six threats; while BitDefender was compromised three times and neutralized six threats.

However, anti-malware products need to be able to distinguish between malicious and non-malicious programs. This is where Trend Micro's product particularly failed to excel. Trend Micro Internet Security lost out on first place because it misclassified legitimate applications too often. It blocked 21 legitimate programs, far more than any other product in this test.

In contrast, Microsoft Security Essentials generated no false positives but was notably poor at protecting the system from malware. It failed to prevent 41 per cent of the threats from compromising the system.

Overall, considering each product's ability to handle both malware and legitimate applications, the winner was Norton Internet Security 2013, followed closely by Kaspersky Internet Security 2013 and ESET Smart Security 5.

Anti-virus is important (but not a panacea)

This test shows that even with a relatively small sample set of 100 threats there is a significant difference in performance between the anti-virus programs. Most importantly, it illustrates this difference using real threats that attacked real computers at the time of testing.

The average protection level of the tested products is 92 per cent (see 3. Protection Scores), which is a significant value for two reasons. First, it is very close to the average figures published in previous Dennis Technology Labs reports over the years. Second, it is much lower than some detection results typically quoted in anti-malware marketing material.

The presence of anti-malware software can be seen to decrease the chances of a malware infection even when the only sites being visited are proven to be actively malicious. That said, only one product produced a 100 per cent protection rate, while most generated false positive results.
APPENDIX A: TERMS USED

Compromised: Malware continues to run on an infected system, even after an on-demand scan.

Defended: Malware was prevented from running on, or making changes to, the target.

False Positive: A legitimate application was incorrectly classified as being malicious.

Introduction: Test stage where a target system is exposed to a threat.

Neutralized: Malware or exploit was able to run on the target, but was then removed by the security product.

Observation: Test stage during which malware may affect the target.

On-demand (protection): Manual 'virus' scan, run by the user at an arbitrary time.

Prompt: Questions asked by software, including malware, security products and the operating system. With security products, prompts usually appear in the form of pop-up windows. Some prompts don't ask questions but provide alerts. When these appear and disappear without a user's interaction, they are called 'toasters'.

Real-time (protection): The 'always-on' protection offered by many security products.

Remediation: Test stage that measures a product's ability to remove any installed threat.

Round: Test series of multiple products, exposing each target to the same threat.

Snapshot: Record of a target's file system and Registry contents.

Target: Test system exposed to threats in order to monitor the behavior of security products.

Threat: A program or other measure designed to subvert a system.

Update: Code provided by a vendor to keep its software up to date. This includes virus definitions, engine updates and operating system patches.
APPENDIX B: FAQS

- This test was unsponsored.
- The test rounds were conducted between 27th September 2012 and 4th December 2012 using the most up-to-date versions of the software available on any given day.
- All products were able to communicate with their back-end systems over the internet.
- The products selected for this test were chosen by Dennis Technology Labs.
- Samples were located and verified by Dennis Technology Labs.
- Products were exposed to threats within 24 hours of the same threats being verified. In practice there was only a delay of up to three to four hours.
- Details of the samples, including their URLs and code, were provided to partner vendors only after the test was complete.
- The sample set comprised 100 actively-malicious URLs and 100 legitimate applications.

Do participating vendors know what samples are used, before or during the test?
No. We don't even know what threats will be used until the test starts. Each day we find new ones, so it is impossible for us to give this information before the test starts. Neither do we disclose this information until the test has concluded.

What is the difference between a vendor and a partner vendor?
Partner vendors contribute financially to the test in return for a preview of the results, an opportunity to challenge results before publication and the right to use award logos in marketing material. Other participants first see the results on the day of publication and may not use award logos for any purpose.

Do you share samples with the vendors?
Partner vendors are able to download all samples from us after the test is complete. Other vendors may request a subset of the threats that compromised their products in order for them to verify our results. The same applies to client-side logs, including the network capture files. There is a small administration fee for the provision of this service.

What is a sample?
In our tests a sample is not simply a set of malicious executable files that runs on the system. A sample is an entire replay archive that enables researchers to replicate the incident, even if the original infected website is no longer available. This means that it is possible to reproduce the attack and to determine which layer of protection it was able to bypass. Replaying the attack should, in most cases, produce the relevant executable files. If not, these are usually available in the client-side network capture (pcap) file.

WHILE EVERY EFFORT IS MADE TO ENSURE THE ACCURACY OF THE INFORMATION PUBLISHED IN THIS DOCUMENT, NO GUARANTEE IS EXPRESSED OR IMPLIED AND DENNIS PUBLISHING LTD DOES NOT ACCEPT LIABILITY FOR ANY LOSS OR DAMAGE THAT MAY ARISE FROM ANY ERRORS OR OMISSIONS.