
Exploiting the Testing System


Published at the International Antivirus Testing Conference. Viorel Canja, Head of BitDefender Labs, Bitdefender.



  1. Exploiting the testing system
     Viorel Canja, Head of BitDefender Labs
  2. Contents
     - What does the title mean?
     - Testing detection on wildcore
     - Testing detection on zoo collections
     - Retrospective detection tests
     - Examples
     - Feedback from the industry
     - Q&A
  3. What does the title mean?
     Purpose of tests:
     - to define metrics and measure the performance of AV products
     - to find an approximation of the real-world performance of AV products
     - to give feedback to AV researchers about their products
     - to allow users to make an informed decision
  4. What does the title mean?
     "Define: exploit"
     - use or manipulate to one's advantage
     - draw from; make good use of
     - overwork: work excessively hard
  5. What does the title mean?
     - To use the limitations of the testing procedure to one's advantage.
     - The focus is on those actions which have questionable benefits for the user.
  6. Types of tests
     - detection tests on wildcore
     - detection tests on zoo collections
     - retrospective detection tests
  7. Testing detection on wildcore
     What is wildcore?
     - "WildCore is a set of replicated virus samples that represents the real threat to computer users."
     - "When a virus is reported to us by two or more Reporters, it's a pretty good indication that the virus is out there, spreading, causing real problems to users. We consider such a virus to be 'In the Wild'."
  8. Testing detection on wildcore
     - The WildCore samples are known to all AV companies as soon as the set is published.
     - Tests are likely to be performed on exactly the same samples. This is always the case for non-replicating malware samples.
  9. Testing detection on wildcore
     - Quick hack: just sign all the samples with dumb (aka automatic) signatures.
     - Disable heuristics to avoid false positives (if the testbed is already known, there is no need for technology that detects previously unknown threats).
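The "quick hack" above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual engine: a "dumb" signature is just an exact hash of a known sample, and heuristics are switched off because the testbed is already known.

```python
import hashlib

def dumb_signature(data: bytes) -> str:
    """A 'dumb' (automatic) signature: a hash of the whole sample,
    matching only that exact file and nothing else."""
    return hashlib.sha256(data).hexdigest()

class HackedScanner:
    """Illustrative scanner tuned for a known testbed: exact-hash
    signatures for every published sample, heuristics disabled."""

    def __init__(self) -> None:
        self.signatures: set[str] = set()
        # Disabled to avoid false positives; on a known testbed there is
        # no need for technology that detects unknown threats.
        self.heuristics = False

    def sign_all(self, samples: list[bytes]) -> None:
        # Blindly sign every sample in the published set.
        for sample in samples:
            self.signatures.add(dumb_signature(sample))

    def scan(self, data: bytes) -> bool:
        return dumb_signature(data) in self.signatures

scanner = HackedScanner()
scanner.sign_all([b"wildcore-sample-1", b"wildcore-sample-2"])
print(scanner.scan(b"wildcore-sample-1"))   # exact published sample: detected
print(scanner.scan(b"wildcore-sample-1x"))  # trivial one-byte variant: missed
```

Note how the one-byte variant slips through: the hack scores 100% on the published set while adding nothing for users facing even slightly different samples.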
  10. Testing detection on zoo collections
     - The zoo should contain a large number of files so that the statistics are as accurate as possible.
     - Threats should be replicated (where applicable), or large numbers of samples should be used for polymorphic malware or malware that is re-generated on the server.
     - The zoo should not contain garbage.
  11. Testing detection on zoo collections
     Hacks:
     - use customized settings for the test: set heuristics to paranoid mode, automatically sign all previously missed samples, and white-list all previously reported false positives
     - automatically sign all samples detected by at least one other AV product, just to be on the "safe" side
  12. Testing detection on zoo collections
     Hacks (2):
     - add detection routines for garbage that is usually found in collections: known false positives of other products, damaged executables, and files produced by various analysis tools
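The "sign whatever anyone else detects" hack from the previous slide can be sketched as a consensus rule. The scanner names and hashes below are made up for illustration; the point is that samples get signed purely from other products' verdicts, with no analysis of the files themselves.

```python
# Hypothetical verdicts from other products, keyed by sample hash.
other_verdicts = {
    "Av01": {"hash1", "hash2"},
    "Av02": {"hash2", "hash3"},
}

def auto_sign_consensus(sample_hashes: set, verdicts: dict, min_hits: int = 1) -> set:
    """'Safe side' hack: sign every sample that at least min_hits
    other products already detect, without analyzing the file."""
    signed = set()
    for h in sample_hashes:
        hits = sum(1 for detected in verdicts.values() if h in detected)
        if hits >= min_hits:
            signed.add(h)
    return signed

# hash4 is detected by nobody, so it alone is left unsigned.
print(auto_sign_consensus({"hash1", "hash3", "hash4"}, other_verdicts))
```

With min_hits=1 this also inherits every other product's false positives and garbage detections, which is exactly why it inflates zoo scores without helping users.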
  13. Retrospective detection tests
     - Signature databases are frozen at a certain moment.
     - Detection is tested against samples received after that moment.
     - Testing should be done with default settings, because most products are marketed as "install and forget" and the majority of users will never change the settings.
  14. Retrospective detection tests
     - Has the disadvantage that it gives no credit for proactive detections added by generic routines created for malware families that first appear after the signatures are frozen.
     - Such routines (or signatures) will proactively detect subsequent variants of the same family.
     - Favors aggressive heuristics if not correlated with false-positive tests.
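The retrospective methodology on the two slides above can be sketched as a small harness. The dates, sample records, and the toy "generic routine" are invented for illustration; the structure (freeze date, score only post-freeze samples) follows the slides.

```python
from datetime import date

def retrospective_rate(samples: list, freeze_date: date, detect) -> float:
    """Score only samples received after the signature freeze, so every
    hit must come from generic/heuristic routines in the frozen engine."""
    post_freeze = [s for s in samples if s["received"] > freeze_date]
    if not post_freeze:
        return 0.0
    hits = sum(1 for s in post_freeze if detect(s["data"]))
    return hits / len(post_freeze)

samples = [
    {"received": date(2007, 1, 10), "data": b"old-family-variant"},
    {"received": date(2007, 2, 5),  "data": b"new-family-sample"},
    {"received": date(2007, 2, 20), "data": b"old-family-variant-2"},
]

# Toy 'generic routine': catches any variant of the old, known family.
detect = lambda d: d.startswith(b"old-family")

print(retrospective_rate(samples, date(2007, 2, 1), detect))  # 0.5
```

The harness shows the disadvantage noted above: a generic routine written for a family that only emerges after the freeze would still miss here, because the frozen engine cannot contain it, even though in practice such routines do protect users against subsequent variants.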
  15. Examples
     Automatic signing:
     - Av01 (1st pair): TR/Zapchast.CP
     - Av02: Collected.Z
     - Av03: W32/KillAV.3B84!tr
     - Av04: Trojan.Downloader.Asks
     - Av05: Program:Win32/SpySheriff (threat-c)
     - Av06: Trojan.Gen
     - Av07: Win32:Trojan-gen. {Other}
     - Av08: Win32/Dewnuttin.B
     - Av09: W32/Tofger.CD
     - Av10: Application/KillApp.A
     - Av11 (2nd pair): TROJ_PROCKILL.DJ
     - Av12: Trojan.Xtssksastsm
     - Av13 (1st pair): Trojan.Win32.Zapchast.cp
     - Av14 (2nd pair): application ProcKill-DJ
     - Av15: Win32/ProcKill.1hj!Trojan
     - Av16: Trojan.Zapchast.CT
  16. Examples
     Detecting other products' false positives:
     - Av01: Backdoor.X
     - Av02: FalseAlarm.Av01.Backdoor.X
  17. Feedback from the industry
     Automatic sample processing...
     - is a must, given the number of samples received
  18. Feedback from the industry
     ...and adding detection based on the output of other AVs:
     - illegal, immoral, plain wrong
     - bad idea
     - it's common practice
     - it probably started as an attempt to have common names
     - there is no other way
  19. Feedback from the industry
     Reporting packed files:
     - if they are not malicious, we should not detect them
     - some packers should be blacklisted, while others are too widely used and must be allowed
     - an unfortunate necessity
     - professional companies do not need to use dodgy packers
  20. Feedback from the industry
     White-listing clean apps instead of black-listing malware:
     - it's not possible
     - does not scale
     - it's OK in controlled environments
     - a better and better idea as time passes
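The white-listing model debated above can be sketched as a default-deny policy. The hash set and binaries below are hypothetical; the sketch only shows the mechanism, not a verdict on whether it scales.

```python
import hashlib

# Hypothetical whitelist of known-clean application hashes.
KNOWN_CLEAN = {hashlib.sha256(b"trusted-app-v1").hexdigest()}

def allow_execution(binary: bytes) -> bool:
    """Default-deny: only whitelisted binaries may run. Unknown code is
    blocked whether it is malicious or merely unlisted, which is the
    scaling problem raised in the industry feedback above."""
    return hashlib.sha256(binary).hexdigest() in KNOWN_CLEAN

print(allow_execution(b"trusted-app-v1"))   # True
print(allow_execution(b"unknown-binary"))   # False
```

In a controlled environment with a short, stable software inventory the whitelist stays small and this works; against the full universe of legitimate software, keeping KNOWN_CLEAN complete is the hard part.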
  21. The end...
     Q&A