
The VTC experience



Presented at the International Antivirus Testing Workshop 2007 by Prof. Dr. Klaus Brunnstein, University of Hamburg, Germany


  1. About the aVTC experience
     Dr. Klaus Brunnstein, Professor emeritus, Department of Informatics, University of Hamburg
     President, International Federation for Information Processing (IFIP)
     AV workshop Reykjavik (F-Prot), May 16-17, 2007
     - Background: Uni Hamburg's IT Security curricula
     - Development of aVTC @ Uni-Hamburg
     - Methods used in aVTC tests, lessons learned
     - Demand for inherently secure systems
  2. Abstract
     Title:  The VTC experience
     Author: Klaus Brunnstein, University of Hamburg, Germany
     Abstract: Established in 1987, the Virus Test Center (VTC) at Hamburg University was the first
     lab where students learned how to analyse security threats, especially those related to
     malicious software, and to prepare software solutions to counter them (later, other labs
     worked on chipcard security, biometrics and incident-response methods). After initial
     projects (including Morton Swimmer's ANTIJERU), Vesselin Bontchev (coming from the virus lab
     of the Bulgarian Academy, Sofia) joined VTC in 1992 and started his AntiVirus test suite;
     Vesselin was probably the first ever to systematically organise AV tests, and his experience
     taught several AV experts and their companies how to improve their products. When Vesselin
     left (for Iceland), a series of student projects was started in which students could learn to
     organise and maintain a malware database, prepare testbeds, develop criteria for testing, and
     perform AV/AM tests with special emphasis on the detection quality of AntiVirus and
     AntiMalware products. VTC results were sometimes controversially received, especially when
     the author announced that product tests would also address detection of non-replicating
     malware (aka trojans); at that time, some AV producers withdrew their products from the test
     (some of which rejoined later, after having been convinced that AntiVirus-only tests are too
     restrictive).

     The paper describes the methods used by VTC in maintaining testbeds and how tests were
     performed, especially also addressing problems found in testing. After the principal
     investigator finished his teaching career (in fall 2004), VTC was closed for lack of students
     devoting time to test procedures.
  3. Agenda: Chapter 1
     - Background: Uni Hamburg's IT Security curricula
     - Development of aVTC @ Uni-Hamburg
     - Methods used in aVTC tests
     - Demand for inherently secure systems
  4. 1.1 Background: Hamburg's IT Security Curricula
     - Working Group AGN (Applications in Science; K. Brunnstein), responsible for education and research in IT Security
     - WS 1987/88: first lecture "IT Security and Safety"
     - Pre-cycle: winter 1987/88 - summer 1989
     - Curriculum IT-Sicherheit (IT Security):
       1st cycle: winter 1989/90 - summer 1991
       2nd cycle: winter 1991/92 - summer 1993
       3rd cycle: winter 1993/94 - summer 1995
       4th cycle: winter 1995/96 - summer 1997
       5th cycle: winter 1997/98 - summer 1999
       6th cycle: winter 1999/00 - summer 2001
     - Mean: 50 students per cycle (optional in diploma)
  5. 1.2 Background: Hamburg's IT Security Curricula
     Lecture 1: Introduction to IT Security and Safety
     - Survey of dependability/vulnerability studies
     - Survey of IT misuse: hackers, crackers, viruses, worms
     - Basic IT paradigms and selected IT-induced risks
     - Case studies of IT-relevant incidents in organisations and enterprises; security and safety issues and policies
     - Legal aspects: data protection, computer crime legislation, copyright and intellectual property rights
  6. 1.3 Background: Hamburg's IT Security Curricula
     Lecture 2: Concepts of Secure & Safe Systems I
     - Problems of "Quality", ISO 9000 etc.
     - IT security and safety models & IT security criteria: TCSEC/TNI, ITSEC, CTCPEC, US FC, MSFR, JCSEC, R-ITSEC, Common Criteria
     - Reference monitor concepts
     - Implementations of virtual systems
     - Intrusion Detection (IDES) / Avoidance (IDA)
  7. 1.4 Background: Hamburg's IT Security Curricula
     Lecture 3: Concepts of Secure & Safe IT Systems II
     - Encryption methods (general, DES, RSA, Clipper)
     - Database/information systems security: problems and solutions (DBMS, RDBMS)
     - Communication and network security
     Lecture 4: Risk and Incident Analysis
     - Case studies: incidents of IT-based systems - network, mainframe and PC attacks; bank networks/accidents; flight management (EFCS) and other accidents
     - Methods of risk analysis
     - Large-system backup solutions
     - Methods of reverse engineering
     - Methods of computer emergency response
  8. AGN: Anwendungen der Informatik in Geistes- und Naturwissenschaften (Applications of Informatics in the Humanities and Natural Sciences)
  9. 1.5a Background: Reverse Engineering Course
     (anti)Virus Test Center = antiMalware laboratory:
     - Local network, clients with a flexible hub-switching concept
     - Intel-based workstations
     - VMware as basic platform, to "contain" malicious events
     - DOS (boot viruses & trojans)
     - W32 systems (file, macro & script viruses/worms; trojans)
     Reverse-engineering courses (1/year):
     - 10 days (2 weeks): survey of malware, methods of analysis
     - Practice of reverse engineering
     - Certificate (examination + analysis of unknown malware)
  10. 1.5b Background: Reverse Engineering Course - Generating Replicated Code
     - "Goat (= victim) object": executable "pure" content of different types and lengths
     - "Infection process": virus or worm executed in a protected environment to avoid uncontrolled spreading; different goat objects assure that infection works under all relevant circumstances (also needed for proper detection)
     - Generations of infection:
       the original virus infects the 1st generation
       1st-generation infection generates the 2nd generation
       2nd-generation infection generates the 3rd generation
       => assurance: 1st & 2nd generations are infectious (viral/wormy)
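The generation chain above can be modelled as a toy Python sketch. The "infection" here is just a marker copy standing in for executing the real sample in a contained environment; the marker `VIR!` and the goat names are purely illustrative, not VTC tooling:

```python
def replicate(sample: bytes, goat: bytes) -> bytes:
    """Toy 'infection': copy the viral marker from the sample into
    the goat file (stand-in for running the sample in a contained
    environment and letting it infect the goat)."""
    marker = sample[:4]
    return marker + goat

def is_infectious(obj: bytes) -> bool:
    """A replicated object counts as 'alive' if it carries the marker."""
    return obj.startswith(b"VIR!")

virus = b"VIR!" + b"payload"
gen1 = replicate(virus, b"goat-COM-1")  # original infects 1st generation
gen2 = replicate(gen1, b"goat-EXE-2")   # 1st generation infects 2nd
gen3 = replicate(gen2, b"goat-COM-3")   # 2nd generation infects 3rd

# assurance check from the slide: generations 1 and 2 are infectious
print(is_infectious(gen1) and is_infectious(gen2))  # True
```

The point of the chain is the assurance step: only samples whose offspring are themselves infectious were accepted into the testbed as "alive".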
  11. 1.5c Background: Reverse Engineering Course - Dynamic Analysis: Observing Replication
     - Hardware-based analysis: logic analyser; system tracing using special hardware (PERISCOPE) to observe and control dynamic behaviour; performance monitoring
     - Software-based analysis:
       - Event tracing
       - Interrupt observation: INTSPY
       - HelpPC makes essential details (interrupts, BIOS and DOS structures, DOS commands, hardware specs) available
     - Code tracing:
       - Practicing debugging (breakpoints, triggers etc.)
       - Tool: SoftICE
       - Problem: analysis of malware using anti-debugging methods
  12. 1.5d Background: Reverse Engineering Course - Basics of Code Analysis
     - General: differential analysis - comparing infected vs. uninfected code
     - 16-bit/W32 code: disassembly (mostly Sourcer); separation of code/data, library functions, documentation of code, ...
     - Macro code: specialised decompiler (or "manual" work)
       - Macro viruses/worms exist in source (VBA) AND p-code => analysis must address BOTH VBA and p-code
       - VBA: "high-level language", easy to understand (and reprogram)
       - p-code: generating source code with an editor
       - Remark: several viruses may reconstruct deleted source code from p-code
     - Script code: specialised decompiler, mostly "manual" work
       - Problem: VBS may deeply influence MS W32 system structures; therefore, good system knowledge is required
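Differential analysis as named above can be sketched minimally in Python: compare an uninfected goat file against its infected counterpart and report the size growth and the first differing offset. The byte strings are fabricated examples, not real samples:

```python
def differential_analysis(clean: bytes, infected: bytes):
    """Byte-level comparison of an uninfected goat file with its
    infected counterpart: return (size growth, offset of the first
    differing byte). A minimal sketch of the idea, not VTC tooling."""
    growth = len(infected) - len(clean)
    first_diff = next(
        (i for i, (a, b) in enumerate(zip(clean, infected)) if a != b),
        min(len(clean), len(infected)),  # files identical up to the shorter length
    )
    return growth, first_diff

# fabricated goat: a tiny DOS-style stub padded with zero bytes
clean = b"\xb8\x00\x4c\xcd\x21" + b"\x00" * 11
# fabricated infection: entry point overwritten with a jump, body appended
infected = b"\xe9\x10\x00" + clean[3:] + b"VIRUSBODY"
print(differential_analysis(clean, infected))  # (9, 0)
```

In practice, the growth and the location of the modification (entry point, appended body) already tell the analyst a great deal about the infection mechanism before any disassembly starts.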
  13. 1.5e Background: Reverse Engineering Course - Dynamic and Static Analysis: Understanding Camouflage
     Self-protection of malware against detection:
     - Hiding interactions: e.g. replacement of interrupts
     - Self-encrypting malware: many viruses self-encrypt, with the decryption routine often the only part left in plain code
     - Oligo- and polymorphic (obfuscated) code:
       - Changes the static layout of the code by changing the sequence of instructions (e.g. the order of loading registers before a procedure invocation) where the semantics are not affected
       - Oligomorphic code: few different variations of code (same effect)
       - Polymorphic code: many different instantiations of code (same effect)
       - Problem: malware "signatures" are combinations of static code patterns (combined with AND, OR, NOT and wildcards) that help identify viruses and distinguish different "variants"
       - Such code requires specific detection routines (the scanning process is slowed)
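The wildcard-signature idea on this slide can be illustrated with a small Python sketch. The hex-string format with `??` as a one-byte wildcard is an assumption for illustration, not any real scanner's signature syntax, and only a single AND-combined fragment is shown:

```python
import re

def hexsig_to_regex(sig: str) -> "re.Pattern[bytes]":
    """Compile a hex signature like 'B8 ?? ?? CD 21' (?? = one-byte
    wildcard) into a bytes regex. A sketch of the wildcard idea on
    the slide, not any real scanner's signature format."""
    pattern = b"".join(
        b"." if part == "??" else re.escape(bytes([int(part, 16)]))
        for part in sig.split()
    )
    return re.compile(pattern, re.DOTALL)  # DOTALL: wildcard matches any byte

# fabricated sample: MOV AX,4C00h / INT 21h (DOS terminate call)
code = bytes([0xB8, 0x00, 0x4C, 0xCD, 0x21])
sig = hexsig_to_regex("B8 ?? ?? CD 21")
print(bool(sig.search(code)))  # True
```

Wildcards let one signature cover oligomorphic variants that differ only in the bytes the virus mutates; fully polymorphic code defeats this and requires the "specific detection routines" the slide mentions.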
  14. 1.6 Background: Hamburg's IT Security Curricula
     - Additional lectures on: mathematical cryptography (Prof. Kudlek), data protection etc.
     - Seminar on "Actual Problems of IT Security and Safety" (every semester)
     - Practice in reverse engineering
     - Virus Test Center: practical student work with malware/tests
     - Other labs: biometric systems, secure chipcards
     - Examination work: about 100 diploma/master theses
     - Dissertations: e.g. Vesselin Bontchev on viruses; Klaus-Peter Kossakowski on principles of incident response systems; Morton Swimmer (2005) on new AV methods
  15. 1.7 Background: Hamburg's IT Security Curricula
     Hamburg Bachelor curriculum 2001-2006:
     - Lecture (4 hours/week) for ALL students in the 3rd year: "Foundations of Secure/Safe Systems" (GBI)
     - 250 students per semester (mandatory)
     - Essential elements:
       - Legal requirements (data protection, crime law, SOX etc.)
       - Definitions of security & safety, technical requirements
       - Survey of hacking/intrusion techniques, malware
       - Sources of insecurity: paradigms, protocols, weaknesses, ...
       - Concepts of security: cryptography, secure OS & DB, firewalls, Kerberos, AntiMalware, intrusion detection
       - Risk analysis / risk management
  16. Agenda: Chapter 2
     - Background: Uni Hamburg's IT Security curriculum
     - Development of aVTC @ Uni-HH
     - Methods used in aVTC tests
     - Demand for inherently secure systems
  17. 2.1 Development of aVTC @ Uni-Hamburg
     - Phase 0: 1978: engagement in technical aspects of data protection; seminars, lectures and diploma theses
     - Phase 1: 1987: analysis of the Jerusalem virus (received from Hebrew University)
       1st antivirus: Morton Swimmer's "antiJeru.exe"
       KGB hack / K.-P. Kossakowski: foundation of the 1st CERT
       1991: Michelangelo virus: antiMich.exe distributed 30k times
     - Phase 2: 1990-1995: Vesselin Bontchev: 1st professional AV tests; Vesselin is the "best teller of this saga"
     - Phase 3: 1994-2004: VTC established with student testers
       1997/98: 1st malware test (against the protest of some AV companies)
       October 2004: aVTC closed (Prof. emeritus - no more students)
  18. 2.2 Survey of tests at aVTC @ Uni-Hamburg
     - Scanner test July 2005: detection of mobile viruses (diploma thesis)
     - Scanner test July 2004
     - Scanner test April 2003
     - Scanner test December 2002 ("Heureka-2")
     - Scanner test March 2002
     - Scanner test October 2001 ("Heureka(-1)")
     - Scanner test July 2001
     - Scanner test April 2001
     - AntiVirus Repair Test (ART 2000-11)
     - Comment on Sophos' reaction to VTC test report August 2000
     - Scanner test August 2000
     - Scanner test April 2000
     - Pre-released scanner test February 2000
     - Scanner test September 1999
     - Scanner test March 1999
     - Scanner test October 1998
     - Scanner test Computer Bild (June 1998)
     - Scanner test February 1998
     - Scanner test July 1997
     - Scanner test February 1997
     - Scanner test July 1994
  19. Agenda: Chapter 3
     - Background: Uni Hamburg's IT Security curricula
     - Development of aVTC @ Uni-HH
     - Methods used in aVTC tests
       - 3A Survey of methods
       - 3B Survey of test results
       - 3C Lessons learned
     - Demand for inherently secure systems
  20. 3A.1 Test System: Lab Network
     (Diagram: a Win NT server and clients 1-3 running DOS, Win 95, Win NT and WXP, connected via 100 Mbit Ethernet using Microsoft NetBEUI)
  21. 3A.2a Test server
     Win NT server (1) hardware:
     - Pentium 200 MHz, 64 MB RAM, 2 GB hard disk (boot)
     - 2 x 4.3 GB for data/reports
     - 2 x 9.1 GB for the virus database (mirrored)
     - 3 network cards: 2 x 100 MBit/s, 1 x 10 MBit/s
     - Protected against electrical faults (UPS: APC 420 VA)
     Operating system: Windows NT Server 4.0 SP 6
     Network:
     - 1 x 10 MBit/s BNC for 20 DOS clients
     - 1 x 100 MBit/s via 2 cascaded switches for all other clients with 10 MBit/s cards
     - 1 x 100 MBit/s via a 100 MBit/s hub for the other clients
  22. 3A.2b Test clients
     The Windows clients (9) have the following hardware:
     - 2 x Pentium 133 MHz, 64 MB RAM, 2 GB hard disk, 10 MBit/s
     - Pentium 90 MHz, 32 MB RAM, 1 GB hard disk, 100 MBit/s
     - Pentium II 350 MHz, 64 MB RAM, 2 GB hard disk, 100 MBit/s
     - Pentium 233 MMX MHz, 64 MB RAM, 2 GB hard disk, 100 MBit/s
     - Pentium II 233 MHz, 64 MB RAM, 4 GB hard disk, 100 MBit/s
     - Pentium II 350 MHz, 64 MB RAM, 4 GB hard disk, 100 MBit/s
     - Pentium MMX 233 MHz, 196 MB RAM, 4 GB hard disk, 100 MBit/s
     - Pentium III, 128 MB RAM, 4 GB hard disk, 100 MBit/s
     - 2 x Pentium IV 1.7 GHz, 512 MB RAM, 40 GB hard disk, 100 MBit/s
  23. 3A.3 Test System: Databases
     Boot virus database:
     - Saved as images of boot sectors and master boot records
     - File extensions: BOO, IMG, MBR
     File virus database:
     - File extensions: COM, EXE, CMD, SYS, BAT
     - The directory structure is created from the virus names
     - The files are kept in their original structure
  24. 3A.4 Test System: Directory structure
     Main directories:
     - CARO: the three main scanners identify the virus identically (implies that the CARO naming conventions are valid)
     - NYETCARO: one or two scanners identified the virus
     - UNKNOWN: none of the three scanners identified the virus, but the files replicate
     In early tests also:
     - OS/2: viruses natively working under OS/2
     - WINDOWS 95: viruses natively working under Windows 95
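The three-way sort into CARO / NYETCARO / UNKNOWN can be sketched as a small Python predicate (a hypothetical helper; parsing of the actual scanner report files is omitted):

```python
def classify(reported_names):
    """Sort a replicating sample by scanner agreement, as on the
    slide: all three reference scanners report the same name ->
    CARO; one or two report a name -> NYETCARO; none -> UNKNOWN.
    `reported_names` holds the virus name each of the three
    scanners reported, or None if the scanner missed the sample."""
    hits = [n for n in reported_names if n is not None]
    if len(hits) == 3 and len(set(hits)) == 1:
        return "CARO"
    if hits:
        return "NYETCARO"
    return "UNKNOWN"

print(classify(["Cascade.1701"] * 3))          # CARO
print(classify([None, "Cascade.1701", None]))  # NYETCARO
print(classify([None, None, None]))            # UNKNOWN
```

Note that UNKNOWN still requires the sample to replicate; non-replicating, undetected files did not enter the virus testbed at all.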
  25. 3A.5 Early Test System Size (1997)
     - Boot virus database:     3,910 images,  1,004 viruses
     - File virus database:    84,236 files,  13,014 viruses
     - Macro virus database:    2,676 files,   1,017 viruses
     - Macro malware database:     61 files,      89 malware
     - File malware database:     213 files,     163 malware
  26. 3A.6b Test System: Size April 2003
     "Full Zoo":
     - 21,790 file viruses in 158,747 infected files
     - 8,001 different file malware in 18,277 files
     - 664 clean file objects for the false-positive test
     - 7,306 macro viruses in 25,231 infected documents
     - 450 different macro malware in 747 macro objects
     - 329 clean macro objects for the false-positive test
     - 823 different script viruses in 1,574 infected objects
     - 117 different script malware in 202 script objects
     "ITW Zoo":
     - 11 boot viruses in 149 infected images/sectors
     - 50 file viruses in 443 infected files
     - 124 macro viruses in 1,337 infected documents
     - 20 script viruses in 122 infected objects
  27. 3A.7a Preprocessing of new objects (#1/4)
     1. Unzip the archives
     2. Reset all file attributes
     3. Sort all files into the main categories (boot, file, macro)
     4. Restore the normal file extensions (e.g. .EX_ ==> .EXE)
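Step 4 can be sketched in Python. The slide only gives `.EX_ ==> .EXE`, so every other entry in the mapping below is an assumption for illustration:

```python
# Mapping from 'compressed' DOS extensions to their originals.
# Only EX_ -> EXE appears on the slide; the other entries are assumed.
KNOWN_EXTENSIONS = {"EX_": "EXE", "CO_": "COM", "SY_": "SYS"}

def restore_extension(filename: str) -> str:
    """Restore the normal file extension of a renamed sample,
    e.g. 'GOAT1.EX_' -> 'GOAT1.EXE' (sketch of preprocessing step 4)."""
    stem, dot, ext = filename.rpartition(".")
    restored = KNOWN_EXTENSIONS.get(ext.upper())
    return f"{stem}.{restored}" if dot and restored else filename

print(restore_extension("GOAT1.EX_"))   # GOAT1.EXE
print(restore_extension("README.TXT"))  # README.TXT (unchanged)
```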
  28. 3A.7b Preprocessing of new objects (#2/4)
     1. Remove all known non-viruses with Dustbin
     2. Search for duplicate files (binary identical):
        - First step: only the new files
        - Second step: new files against the old database
        - Third step: delete all duplicate files
     3. Replicate all new files to test whether they are "alive" (partially applied in test 1997-07)
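The duplicate search can be sketched with content hashing in Python (a hypothetical stand-in; the original tooling is not documented beyond the slide):

```python
import hashlib

def find_duplicates(files):
    """Return names of files whose content is binary-identical to an
    earlier file, keeping the first occurrence. `files` maps file
    name -> content bytes. Sketch of the duplicate-search step."""
    seen = {}    # content digest -> first file name carrying it
    dupes = []
    for name, content in sorted(files.items()):
        digest = hashlib.sha256(content).hexdigest()
        if digest in seen:
            dupes.append(name)
        else:
            seen[digest] = name
    return dupes

samples = {"a.com": b"\x90\x90", "b.com": b"\x90\x90", "c.com": b"\xcc"}
print(find_duplicates(samples))  # ['b.com']
```

Hashing gives the same result as a pairwise binary compare but scales to the tens of thousands of files in the VTC zoo; the two-step process on the slide (new files first, then new against old) keeps the comparison set small.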
  29. 3A.7c Preprocessing of new objects (#3/4)
     1. Scan the new files and the previous databases with F-Prot, Dr. Solomon and AVP to create report files
     2. Move the non-viruses (trojans, droppers, germs) into a special directory
     3. Preprocess the reports using CARO.BAT
     4. If a virus is operating-system specific, it is sorted into the corresponding subdirectory below the specific OS directory (Win95, WinNT, OS/2)
  30. 3A.7d How CARO.BAT works (#4/4)
     - The subdirectory name is created from the virus name: the dots between family name, sub-family, main variant and sub-variant are substituted with backslashes, and all characters except a-z, 0-9, "-" and "_" are substituted with "_". If a file with the same name already exists, the new file in this directory is renamed.
     - If F-Prot, Dr. Solomon or AVP identifies a virus by name, the file is moved into the corresponding subdirectory below the NYETCARO directory.
     - If all three scanners identify a virus by the same name, the file is moved into the corresponding subdirectory below the CARO directory.
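The name-to-directory rules can be sketched in Python (a stand-in for the batch file, not its original code; the example virus name is illustrative):

```python
import re

def caro_path(virus_name: str) -> str:
    """Turn a CARO-style virus name into a subdirectory path, per the
    CARO.BAT rules on the slide: dots separating family / sub-family /
    variant / sub-variant become backslashes, and any character
    outside a-z, 0-9, '-' and '_' is replaced with '_'."""
    parts = virus_name.lower().split(".")
    safe = [re.sub(r"[^a-z0-9_-]", "_", part) for part in parts]
    return "\\".join(safe)

print(caro_path("W97M.Melissa.A"))  # w97m\melissa\a
```

The scheme makes the directory tree itself the index: every variant of a family lands under the family's subtree, so directory listings double as testbed catalogues.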
  31. 3A.8 Test Procedures: Testing boot viruses
     For practical reasons, no infected floppy disks are tested (a method for such tests is available but not practiced).
     1. Using SIMBOOT:
        - Used to scan the boot images
        - Simulates the changing of infected floppy disks
        - Simulates the user inputs to scan the next floppy disk
     2. If SIMBOOT fails, direct test: scan the images directly
     Remark: several AV products crash under SIMBOOT.
  32. 3A.9 Test Procedures: Testing file/macro viruses
     - Heuristic mode
     - Reports only (no repair)
     - Experience: some scanners crash upon detecting viruses improperly
     - Scan a small number of files at a time (it's easier to restart the scanner), directory by directory:
       CARO, NYETCAROA, NYETCAROB, ..., NYETCARO, UNKNOWN, OS/2 (early tests), WINDOWS 95, Windows NT, Windows XP
  33. 3A.10 Test Procedures for file/macro viruses
     1. Start the test version of the OS
     2. Install the scanner
     3. Scan and save the report to the network
     4. Reboot with the master system
     5. Delete the test version and restore it from backup
     6. Start from the beginning
  34. 3A.11 Test Results, Evaluation
     1. UNIX tools and AWK scripts are used to evaluate the reports; when scanner diagnostics change, the scripts must be adapted.
     2. Create an alphabetical list which contains, for each directory, the directory name and the number of files in it.
     3. Analyse how many files are scanned and recognized in each scanner report.
     4. Sort and join the reports (directory listing vs. preprocessed scanner report).
     5. Evaluate the joined report.
     6. Quality assurance.
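Steps 2-5 boil down to joining the testbed listing with a scanner's report and computing a detection rate. A minimal Python stand-in for the UNIX sort/join and AWK scripts (the paths are fabricated examples):

```python
def detection_rate(testbed, flagged):
    """Percentage of testbed files a scanner flagged as infected:
    join the testbed file listing with the preprocessed scanner
    report (both modelled here as sets of paths) and divide."""
    detected = len(testbed & flagged)
    return round(100.0 * detected / len(testbed), 1)

testbed = {
    "caro/cascade/1701/goat1.com", "caro/cascade/1701/goat2.com",
    "unknown/sample1.com", "unknown/sample2.com",
}
flagged = {
    "caro/cascade/1701/goat1.com", "caro/cascade/1701/goat2.com",
    "unknown/sample1.com",
}
print(detection_rate(testbed, flagged))  # 75.0
```

The per-directory counting in step 2 serves the same purpose as the set intersection here: it lets the evaluation report detection rates per virus family as well as overall.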
  35. 3B.1a Test results (e.g. 2003-04: 1st WXP test)
     0README.1ST   - Latest notes
     0XECSUM.TXT   - Executive summary of test report 2003-04
     1CONTENT.TXT  - This file
     2PROLOG.TXT   - Background, aims of this test
     3INTRO.TXT    - Introduction: background, aims, development of VTC tests
     4TESTCON.TXT  - Conditions which a scanner must fulfil in order to be tested
     5PROTOCO.TXT  - Detailed description of VTC test protocols
     6jWXP.TXT     - Detailed results: Windows XP file, macro and script virus and malware results
     6mCMP32.TXT   - Detailed results: comparison of 32-bit results from tests 2002-12 and 2003-04 (Win XP, Win 98, Win 2k)
     7EVAL-WXP.TXT - Windows XP results: evaluation, grading of WXP products
     7EVAL-CMP.TXT - W32 platforms: comparison, evaluation, grading of W32 products
     8PROBLMS.TXT  - Problems and bugs experienced during tests
     9EPILOG.TXT   - Summary, future test plans, and final comment
     DISCLAIM.TXT  - Disclaimer: about usage of this document
  36. 3B.1b Test report structure (cont.)
     Evidence for reproducibility of test results:
     A1ITW00b.TXT - "In-The-Wild" list of PC viruses (October 2002)
     A2SCANLS.TXT - List of scanners/versions and parameters, including information on the producer
     A4TSTDIR.TXT - Directory of the testbeds (content of A3TSTBED.ZIP)
     A5CODNAM.TXT - Code names of AV products in VTC tests
     Separate appendix:
     A3TSTBED.ZIP - Index of file, macro and script virus & infected-object databases, both full and "In-The-Wild"; index of macro and script malware databases; and index of non-viral and non-malicious objects used in the false-positive test (all pkZIPped)
  37. 3B.2 Development of testbeds
              File viruses/malware      Boot    Macro viruses     Script viruses/malware
              viruses objects malware   viruses viruses objects   viruses objects malware
     --------+------------------------+-------+-----------------+-----------------------
     1997-07:  12,826  83,910     213     938    3,387     617     2,036      72
     1998-03:  14,596 106,470     323   1,071    4,464   1,548     4,436     459
     1998-10:  13,993 112,038   3,300     881    4,804   2,159     9,033     191
     1999-09:  17,561 132,576   6,217   1,237    5,286   3,546     9,731     329
     2000-04:  18,359 135,907   6,639   1,237    5,379   4,525    12,918     260
     2000-08:   5,418  15,720     500     306
     2001-04:  20,564 140,703  12,160   1,311    5,723   6,233    19,387     627    477
     2002-12:  21,790 158,747  18,277            7,306  25,231       823   1,574
     2003-04:  21,790 158,747  18,277            7,306  25,231       823   1,574
  38. 3B.3 Example of test result: file/macro/script zoo virus detection rates
     Scanner  | File virus | Macro virus | Script virus
              | detection  | detection   | detection
     ---------+------------+-------------+--------------
     AVP      |   100.~    |   100.~     |    98.9
     BDF      |    82.9    |    99.0     |    72.4
     CMD      |    98.5    |    99.9     |    89.1
     DRW      |    98.3    |    99.4     |    94.7
     FSE      |   100.~    |   100.~     |    99.5
     INO      |    98.7    |    99.9     |    94.7
     NAV      |    98.3    |    99.6     |    96.8
     NVC      |    97.8    |    99.8     |    87.6
     RAV      |    96.7    |    99.9     |    96.1
     SCN      |    99.8    |   100.0     |    99.6
     ---------+------------+-------------+--------------
     Mean     |    97.1%   |    99.8%    |    92.9%
     Mean>10% |    97.1%   |    99.8%    |    92.9%
     Student testers preferred and developed graphical representations (see the next slides).
  39.-41. Antivirus test 2002-12: detection under Windows 2000 (three chart slides)
  42. 3B.4a Grading of AV/AM products
     Definition (1): A "perfect AntiVirus (AV) product"
     1. detects ALL viral samples "In-The-Wild" AND at least 99.9% of zoo samples, in ALL categories (file, boot, macro and script-based viruses), always with the same high precision of identification and in every infected sample,
     2. detects ALL ITW viral samples in compressed objects for all (now: 5) popular packers, and
     3. NEVER issues a false-positive alarm on any sample which is not viral.
     Remark: detection of "exotic viruses" is presently NOT rated.
  43. 3B.4b Grading of AV/AM products
     Definition (2): A "perfect AntiMalware (AM) product"
     1. is a "perfect AntiVirus product", that is: 100% ITW detection AND >99% zoo detection AND high precision of identification AND high precision of detection AND 100% detection of ITW viruses in compressed objects AND a 0% false-positive rate,
     2. AND also detects the essential forms of malicious software, at least in unpacked form, reliably at high rates (>90%).
     Remark: detection of "exotic malware" is presently NOT rated.
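The two definitions can be expressed as a small grading predicate in Python. This is a sketch: the slides state 99.9% zoo detection in Definition (1) but >99% in Definition (2); the stricter value is used here, and the precision-of-identification criteria are not modelled:

```python
def grade(itw_rate, zoo_rate, packed_itw_rate, false_positives,
          malware_rate=None):
    """Apply the 'perfect product' criteria from the slides: 100% ITW
    detection, >=99.9% zoo detection, 100% packed-ITW detection and
    zero false positives make a perfect AV product; a perfect AM
    product additionally detects >90% of non-replicating malware."""
    perfect_av = (itw_rate == 100.0 and zoo_rate >= 99.9
                  and packed_itw_rate == 100.0 and false_positives == 0)
    if malware_rate is None:
        return "perfect AV" if perfect_av else "not perfect"
    return ("perfect AM"
            if perfect_av and malware_rate > 90.0 else "not perfect")

print(grade(100.0, 99.95, 100.0, 0))        # perfect AV
print(grade(100.0, 99.95, 100.0, 0, 85.0))  # not perfect
```

As the grading tables on the following slides show, no tested product met all criteria at once; the point system ("Excellent", ranked places) graded how close each product came.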
  44. 3B.4c Example of product grading
     Test category             "Perfect"                             "Excellent"
     ------------------------------------------------------------------------------------------
     WXP file ITW test:        AVP,DRW,FSE,NAV,SCN                   INO,RAV
     WXP macro ITW test:       AVP,DRW,FSE,INO,NAV,SCN               BDF,CMD,NVC,RAV
     WXP script ITW test:      AVP,CMD,DRW,FSE,NAV,NVC,RAV,SCN       INO
     ------------------------------------------------------------------------------------------
     WXP file zoo test:        ---                                   AVP,FSE,SCN
     WXP macro zoo test:       SCN                                   AVP,FSE,CMD,INO,RAV,NAV,NVC,DRW,BDF
     WXP script zoo test:      ---                                   SCN,FSE
     ------------------------------------------------------------------------------------------
     WXP file pack test:       AVP,FSE,SCN                           DRW
     WXP macro pack test:      SCN                                   AVP,DRW
     ------------------------------------------------------------------------------------------
     WXP file FP avoidance:    AVP,BDF,CMD,FSE,INO,NAV,NVC,RAV,SCN   DRW
     WXP macro FP avoidance:   BDF,INO,NAV,SCN                       RAV
     ------------------------------------------------------------------------------------------
     WXP file malware test:    ---                                   FSE,AVP,SCN
     WXP macro malware test:   AVP,FSE,SCN                           CMD,RAV,NVC,INO,NAV,BDF,DRW
     WXP script malware test:  ---                                   SCN,FSE,AVP,NAV
  45. 3B.4d Example of product grading
     "Perfect" Windows XP AntiVirus product: =NONE= (20 points)
     "Excellent" Windows XP products:
       1st place: SCN (18 points)
       2nd place: AVP, FSE (13 points)
       4th place: NAV (11 points)
       5th place: DRW (10 points)
       6th place: INO (9 points)
       7th place: RAV (8 points)
       8th place: BDF, CMD, NVC (6 points)
     "Perfect" Windows XP AntiMalware product: =NONE= (26 points)
     "Excellent" Windows XP AntiMalware products:
       1st place: SCN (22 points)
       2nd place: AVP, FSE (17 points)
       4th place: NAV (13 points)
       5th place: DRW (11 points)
       6th place: INO (10 points)
       7th place: RAV (9 points)
       8th place: BDF, CMD, NVC (7 points)
  46. 3B.5a Symbian MobilePhone Malware: Threats
     The advent of mobile malware:
     - Platforms (Symbian, EPOC, ...) conceived to support easy implementation of applications
     - Programming in script languages; no exclusion of potentially harmful functions
     - Example platform: Symbian OS
       - Presently known: 12 different strains (families) of self-replicating (= viral) or non-self-replicating (= trojanic) malware, with 100 variants or modifications
       - Malicious functions: most specimens are "proof-of-concept" malware (viruses/trojans), but some have a dangerous payload
       - Examples of dangerous payloads:
         - Reorganize the directory of telephone numbers
         - Send an MMS to every entry in the telephone directory (a real "pay load")
  47. 3B.5b Symbian MobilePhone Malware Test: Products <ul><li>14 Products in aVTC test (versions: May 2005): </li></ul><ul><ul><li>ANT AntiVir (H&B EDV) (Germany) </li></ul></ul><ul><ul><li>AVA AVAST (32) (Czech Republic) </li></ul></ul><ul><ul><li>AVG Grisoft Antivirus (Czech Republic) </li></ul></ul><ul><ul><li>AVK AntiVirus Kit (GData) (Germany/Russia) </li></ul></ul><ul><ul><li>AVP AVP (Platinum) (Russia) </li></ul></ul><ul><ul><li>BDF BitDefender (AntiVirus eXpert) (Romania) </li></ul></ul><ul><ul><li>FPW FProt FP-WIN (Iceland) </li></ul></ul><ul><ul><li>FSE F-Secure AntiVirus (Finland) </li></ul></ul><ul><ul><li>IKA Ikarus Antivirus (Austria) </li></ul></ul><ul><ul><li>MKS MKS_vir 2005 (Poland) </li></ul></ul><ul><ul><li>NAV Norton AntiVirus /Symantec (USA) </li></ul></ul><ul><ul><li>NVC Norman Virus Control (Norway) </li></ul></ul><ul><ul><li>SCN NAI VirusScan /McAfee (USA) </li></ul></ul><ul><ul><li>SWP Sophos AntiVirus (Sweep) (UK) </li></ul></ul>
  48. 3B.5c Symbian MobilePhone Malware Test: Testbed <ul><li>Testbed (all specimens known May 12, 2005): </li></ul><ul><li>Cabir 22 Variants (a ... v), </li></ul><ul><li> 1 dropper (installing variants .b, .c, .d) </li></ul><ul><li>Commwarrior 2 Variants (a-b) </li></ul><ul><li>Dampig 1 Variant (a) </li></ul><ul><li>Drever 3 Variants (a-c) </li></ul><ul><li>Fontal 1 Variant (a) </li></ul><ul><li>Hobbes 1 Variant (a) </li></ul><ul><li>Lasco 1 Variant (a) </li></ul><ul><li>Locknut 2 Variants (a, b) </li></ul><ul><li>Mabir 1 Variant (a) </li></ul><ul><li>MGDropper (Metal Gear trojan) 1 Variant (a) </li></ul><ul><li>Mosquitos 1 Variant (a) </li></ul><ul><li>Skulls 11 Variants (a-k); </li></ul><ul><li>52 modifications of Skulls.D </li></ul><ul><li>In total: 12 strains (=„families“) with 100 variants/modifications. </li></ul>
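The testbed tally above can be cross-checked with a short sketch; the family names and per-family counts are copied from the slide (counting the Cabir dropper as one sample), while the consistency check against the stated totals is my own addition.

```python
# Variant counts per Symbian malware family in the May 2005 aVTC testbed.
testbed = {
    "Cabir": 22 + 1,     # 22 variants (a ... v) plus 1 dropper
    "Commwarrior": 2,    # variants a-b
    "Dampig": 1,
    "Drever": 3,         # variants a-c
    "Fontal": 1,
    "Hobbes": 1,
    "Lasco": 1,
    "Locknut": 2,        # variants a, b
    "Mabir": 1,
    "MGDropper": 1,      # Metal Gear trojan
    "Mosquitos": 1,
    "Skulls": 11 + 52,   # 11 variants (a-k) plus 52 modifications of Skulls.D
}

families = len(testbed)        # 12 strains ("families")
total = sum(testbed.values())  # 100 variants/modifications, as stated
```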
  49. 3B.5d Symbian MobilePhone Malware Test: Results <ul><li>Rank/Product Detected (135 samples) DetectionRate(%) Grade </li></ul><ul><li>( 6) ANT 92 68.15 Risky </li></ul><ul><li>( 6) AVA 53 39.26 Risky </li></ul><ul><li>( 6) AVG 119 88.15 Risky </li></ul><ul><li>( 2) AVK 131 97.04 Very Good </li></ul><ul><li>( 1) AVP 134 99.26 Excellent </li></ul><ul><li>( 4) BDF 126 93.33 Good </li></ul><ul><li>(13) FPW 13 9.63 Unacceptable </li></ul><ul><li>( 2) FSE 132 97.78 Very Good </li></ul><ul><li>( 6) IKA 57 42.22 Risky </li></ul><ul><li>( 6) MKS 55 40.74 Risky </li></ul><ul><li>( 6) NAV 81 60.00 Risky </li></ul><ul><li>(13) NVC 5 3.70 Unacceptable </li></ul><ul><li>( 4) SCN 123 91.11 Good </li></ul><ul><li>( 6) SWP 60 44.44 Risky </li></ul>
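The detection rates in the table are simply detected samples over the 135-sample testbed, as a percentage. A minimal sketch reproducing them; the detected counts are taken from the table, while the slide does not state numeric thresholds for the grades, so grading is not recomputed here.

```python
# Detected-sample counts (out of 135) from the aVTC Symbian malware test.
detected = {"ANT": 92, "AVA": 53, "AVG": 119, "AVK": 131, "AVP": 134,
            "BDF": 126, "FPW": 13, "FSE": 132, "IKA": 57, "MKS": 55,
            "NAV": 81, "NVC": 5, "SCN": 123, "SWP": 60}

TESTBED_SIZE = 135

def detection_rate(hits, total=TESTBED_SIZE):
    """Detection rate as a percentage, rounded to two decimals."""
    return round(100.0 * hits / total, 2)

rates = {product: detection_rate(n) for product, n in detected.items()}
# e.g. AVP: 134/135 -> 99.26 %, NVC: 5/135 -> 3.70 %
```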
  50. 3C Lessons learned for AV-Test Centers <ul><li>1) Continuous improvement of knowledge & skills of AV test personnel (courses, events) </li></ul><ul><li>2) Publish test methods in detail, as a basis for analysis of test methods & discourse about improvement </li></ul><ul><li>3) Work with trusted AV companies but avoid dependency </li></ul><ul><li>4) A sound test database requires uniform naming as well as proper quality-assurance measures (including publication of problems, including one's own failures) </li></ul><ul><li>5) Send test results (incl. missed samples) to AV companies for analysis & verification of results. </li></ul><ul><li>6) Publish all details of tests (methods, problems, findings) to allow for expert analysis (NOT samples!) </li></ul>
  51. Agenda: Chapter 4 <ul><li>Background: Hamburg´s IT Security Curriculum </li></ul><ul><li>Development of aVTC @ Uni-HH </li></ul><ul><li>Methods used in aVTC tests </li></ul><ul><li>Demand for inherently secure systems </li></ul>
  52. 4.1 Contemporary Solution: „Tower of IT“ <ul><li>[Diagram: a protected LAN connects to the WAN through a „tower“ of protective layers (KryptoBox, Firewall, Intrusion Detection, AntiMalware), with malicious information arriving from outside; Zone Red: no protection, Zone Blue: high protection, Zone Yellow: partial protection] </li></ul>
  53. 4.2 Requirements for Inherently Safe & Secure Systems <ul><li>Basic requirement: for all IT systems in a ubiquitous network (including devices in personal contact), manufacturers specify and guarantee essential functions and features. </li></ul><ul><li>Requirement #1: „SafeComputing“ (SC): </li></ul><ul><li>The SC architecture guarantees: functionality of processes, persistence & integrity of objects, encapsulation of processes, graceful degradation (!), benign recovery (!) </li></ul><ul><li>Requirement #2: „SecureNetworking“ (SN): </li></ul><ul><li>The SN protocol guarantees: confidentiality, integrity, authenticity of sender/receiver, reliability of transfer, non-repudiation (!), non-deniability (!) </li></ul><ul><li>Requirement #3: Assurance of functional adequacy: </li></ul><ul><li>All functions and features must be specified and implemented in a way that permits adequate assurance of the specifications. </li></ul>
  54. 4.3 Residual Risks in Ubiquitous Computing <ul><li>Future Secure and InSecure Networlds: </li></ul><ul><li>FreeNetwork: local anomalies even stand-alone; no protection against attacks in the FreeNetwork </li></ul><ul><li>SecureNetwork: no anomalies (locally strong); inherently secure against attacks; protection from import of anomalies, attacks and flooding </li></ul>
  55. 4.4 Enforcement of Inherent Security <ul><li>Path #1: Manufacturers establish and enforce adequate quality and standards. </li></ul><ul><li>Example: steam-boiler quality enforced through the „Dampfkessel-Überwachungs-Verein“ (now: TÜV) </li></ul><ul><li>Presently, no such self-organisation of the ICT industry is available. </li></ul><ul><li>Path #2: Directives (EU, presidential) and laws enforce protection of customers (persons AND enterprises), including damage compensation and preventive actions. </li></ul><ul><li>Example: customer-protection legislation in the USA etc. following Nader´s book „Unsafe at Any Speed“ </li></ul>