Malware Classification Using Structured Control Flow

Presented at AusPDC 2010 as part of ACSW. Full paper available on home page.

Speaker notes
  • My name is Silvio Cesare. My coauthor for this paper is Dr Yang Xiang. We are both from Central Queensland University, and our research investigates malware classification using structured control flow.
  • The first topic I’d like to address is the motivation for this research and why it is a significant topic to investigate. Our research focuses on better methods to detect and classify malware, but what is malware? Malware is characterized as hostile, intrusive or annoying software, and it is a pervasive problem in distributed and networked computing. This global problem motivates the detection of malware, which is necessary for a secure environment. Identifying malware variants provides great benefit in early detection and presents a useful defense against malware threats.
  • A variety of schemes exist to statically classify malware. In a purely static approach, the malware is never executed. Static approaches have been applied using statistical measures such as n-grams, or dissimilarity measures such as the edit distance over the malware’s raw content. Classification using control flow is considered superior to n-grams and edit distances over raw content, because control flow can be identified as an invariant characteristic across strains in a family of malware. The alternative techniques perform poorly because small changes in the malware source code can significantly affect the byte-level content; this is not true of control flow. Control flow is an effective feature to fingerprint malware, but the extraction of these features can be hindered when the malware hides its real content using the code packing transformation.
  • The code packing transformation is an obfuscation method applied to malware as a post-processing stage to hide its real content. Some legitimate software is packed, but so is the majority of malware. In one study, 79% of malware seen in that month was found to be packed. In 2006, it was reported that 50% of malware from that year were repacked versions of existing malware. The typical behaviour of a packed program is to dynamically generate the hidden code at runtime and then execute it. The goal of automated unpacking is to reverse the code packing transformation so that the hidden content is revealed.
  • Transcript

    • 1. Silvio Cesare and Yang Xiang, School of Management and Information Systems, Centre for Intelligent and Networked Systems, Central Queensland University
    • 2. Motivation
      • Malware - hostile, intrusive, or annoying software or program code.
      • Malware is a pervasive problem in distributed and networked computing.
      • Detection of malware is necessary for a secure environment.
      • Detection of malware variants provides great benefit in early detection.
    • 3. Introduction
      • A variety of schemes exist to statically classify malware.
        • N-grams, edit distances, control flow.
      • Control flow can be identified as an invariant characteristic across strains in a family of malware.
      • Control flow analysis is hindered by malware hiding the real code and contents using the ‘code packing transformation’.
    • 4. Introduction to Code Packing
      • Hides the malware’s real contents using encryption and compression.
        • Some legitimate software is packed.
        • 79% of malware in one month during 2007 was packed [1].
        • 50% of malware in 2006 were repacked versions of existing malware [2].
      • Typical behaviour of packed program - at runtime, the hidden code is dynamically generated and then executed (self decompressing).
      • Automated unpacking extracts the hidden code by simulating the malware until the hidden content is revealed.
      • [1] Panda Research, “Mal(ware)formation statistics – Panda Research Blog,” 2007; http://research.pandasecurity.com/archive/Mal_2800_ware_2900_formation-statistics.aspx
      • [2] A. Stepan, “Improving proactive detection of packed malware,” Virus Bulletin Conference, 2006.
    • 5. Our Contribution
      • A novel system that approximately identifies control flow (flowgraph) signatures using the decompilation technique of structuring, and then uses those signatures to classify a query program against a malware database.
      • A fast application level emulator that provides automated unpacking and is capable of real-time desktop use.
      • A novel algorithm to determine when to stop emulation, using entropy analysis.
      • We implement and evaluate our ideas in a prototype system that performs automated unpacking and malware classification.
    • 6. Related Work
      • Automated unpacking
        • Whole System Emulation – Pandora’s Bochs, Renovo
        • Dynamic Binary Instrumentation – Saffron
        • Native Execution – OmniUnpack, Saffron
        • Virtualization - Ether
      • Malware classification
        • N-grams, n-perms of raw contents
        • Edit distance between basic blocks, inverted index and bloom filters.
        • Flowgraphs – Exact and approximate. Call graphs and control flow graphs.
          • ‘A Fast Flowgraph Based Classification System for Packed and Polymorphic Malware on the Endhost’.
    • 7. Problem Statement
      • A database exists containing malware signatures.
      • The system is given a query program; the goal is to determine whether it is malicious.
      • Find the similarity between the query program and each of the malware in the database.
        • Similarity is a real number between 0 and 1.
        • Similarity is based on shared and invariant characteristics or features.
      • If similarity exceeds a threshold, declare program as a malicious variant.
    • 8. Our Approach
      • Identify code packing using entropy analysis.
      • Unpack the program using application level emulation, using entropy analysis to detect when unpacking is complete.
      • Identify characteristics – control flow graphs of each procedure – and generate signatures using ‘structuring’.
        • Structuring decompiles the procedure into source code like control flow.
        • Result is a string.
      • Use the string edit distance and an approximate dictionary search to measure the dissimilarity (and thus similarity) of each procedure to the database signatures.
      • Accumulate similarities of signatures for a final result. A similarity equal to or greater than 0.6 indicates a variant.
    • 9.  
    • 10. Identifying Packed Binaries
      • Entropy analysis identifies the amount of ‘information’ in a text.
      • Compressed and encrypted content has high entropy.
      • Packed malware contains compressed or encrypted content.
      • By looking for a sequence of high entropy blocks of data, we identify the binary as packed (sketched below).
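The entropy check on slide 10 can be made concrete with a short sketch. The block size, entropy threshold, and run length below are illustrative assumptions rather than the parameters used in the paper; the idea is simply that compressed or encrypted data approaches 8 bits of entropy per byte, so a run of high-entropy blocks marks a binary as likely packed.

```python
# Minimal sketch of entropy-based packing detection. BLOCK_SIZE, HIGH_ENTROPY
# and RUN_LENGTH are illustrative values, not those used in the paper.
import math

BLOCK_SIZE = 256      # bytes per block
HIGH_ENTROPY = 6.5    # bits/byte; compressed or encrypted data approaches 8
RUN_LENGTH = 8        # consecutive high-entropy blocks required

def shannon_entropy(block: bytes) -> float:
    """Shannon entropy of a block, in bits per byte."""
    counts = [0] * 256
    for b in block:
        counts[b] += 1
    total = len(block)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

def looks_packed(data: bytes) -> bool:
    """Flag the data as packed if it contains a run of high-entropy blocks."""
    run = 0
    for off in range(0, len(data) - BLOCK_SIZE + 1, BLOCK_SIZE):
        if shannon_entropy(data[off:off + BLOCK_SIZE]) >= HIGH_ENTROPY:
            run += 1
            if run >= RUN_LENGTH:
                return True
        else:
            run = 0
    return False
```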
    • 11. Unpacking - Application Level Emulation
      • A more efficient approach than the whole system emulation employed by existing automated unpackers.
      • Implemented using interpretation.
      • Emulates:
        • The non privileged x86 Instruction Set Architecture.
        • Virtual memory, including segmentation.
        • Windows Structured Exception Handling.
        • The most common functions in the Windows API.
        • Linking and Loading.
        • Thread and Process management.
        • OS specific structures.
    • 12. Verifying Emulation
      • Automate testing the correctness of emulation.
      • Emulate the malware in parallel to running the malware in a debugger.
      • Verify program state is the same between emulator and debugger.
      • Some instructions and APIs behave differently when debugged.
        • Debugger can rewrite these instructions on the fly to maintain correctness.
    • 13. Detecting Completion of Hidden Code Extraction
      • Need to detect when the hidden code is revealed, and emulation should stop.
        • Known as the Original Entry Point (OEP)
      • Existing literature identifies execution of dynamically generated content by tracing writes to and execution of memory.
        • But multiple layers of dynamically generated code exist. How to know when to stop?
      • Our solution: use entropy analysis to identify packed data that hasn’t been accessed during execution – meaning unpacking hasn’t processed that data and is therefore not complete.
      • Emulation also stops if an unimplemented API is executed. A sketch of the stopping check follows.
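A minimal sketch of that stopping condition, assuming the emulator records which regions were flagged as packed (high entropy) when the image was loaded, and which byte ranges have been read or written during execution. The half-open (start, end) range representation and helper are assumptions for illustration.

```python
# Minimal sketch of the stop heuristic: emulation may stop only when no
# packed (high-entropy) region remains untouched by execution.

def overlaps(a, b):
    """True if two half-open byte ranges (start, end) intersect."""
    return a[0] < b[1] and b[0] < a[1]

def unpacking_complete(packed_regions, accessed_ranges):
    """Every region identified as packed data must have been accessed;
    an untouched packed region means another unpacking stage is pending."""
    return all(any(overlaps(p, r) for r in accessed_ranges)
               for p in packed_regions)
```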
    • 14. Flowgraph Based Signature Generation
      • Once unpacked:
      • Disassemble image.
      • Identify procedures.
      • Translate to an intermediate representation.
      • Build control flow graphs.
      • Transform control flow graphs to strings using structuring.
      • Calculate the weight of each string as the ratio of its size to the sum of all string sizes (a sketch follows).
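A minimal sketch of the weighting step, assuming each procedure has already been structured into a signature string as described on the next slide.

```python
def weigh(signatures):
    """Weight each signature string by its share of the total signature length."""
    total = sum(len(s) for s in signatures)
    return [(s, len(s) / total) for s in signatures]
```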
    • 15. Signatures Using Structuring
      • Transformation of a CFG to a string uses a variation of the structuring algorithm used in the DCC decompiler.
        • When a CFG can’t be structured, a goto is generated.
      • The source code like output is transformed to a smaller but semantically equivalent string of tokens representing control flow constructs like if() or while().
      • Similar control flow graphs have similar string signatures.
      • String signatures are amenable to string algorithms such as the edit distance; a toy structuring sketch follows.
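A toy illustration of how a structured procedure becomes a token string. The token alphabet ('I', 'E', 'W', 'B', 'R', 'G') and the nested-tuple input form are assumptions for illustration; the actual grammar produced by the structuring algorithm in the paper may differ, but the point stands that similar control flow yields similar strings.

```python
# Toy sketch: emit a token-string signature from an already structured
# control flow tree. Token names are illustrative assumptions.

def signature(node):
    kind, children = node[0], node[1:]
    if kind == "seq":                      # sequence of constructs
        return "".join(signature(c) for c in children)
    if kind == "if":                       # if(){...} with optional else{...}
        s = "I{" + signature(children[0]) + "}"
        if len(children) > 1:
            s += "E{" + signature(children[1]) + "}"
        return s
    if kind == "while":                    # while(){...}
        return "W{" + signature(children[0]) + "}"
    return {"block": "B", "return": "R", "goto": "G"}[kind]

# Two variants of the same procedure produce similar strings:
v1 = ("seq", ("block",), ("while", ("seq", ("if", ("block",)), ("block",))), ("return",))
v2 = ("seq", ("block",), ("while", ("seq", ("if", ("block",), ("block",)), ("block",))), ("return",))
print(signature(v1))   # BW{I{B}B}R
print(signature(v2))   # BW{I{B}E{B}B}R
```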
    • 16.
      • The relationship between a control flow graph, a high level structured graph, and a signature.
    • 17. Malware Classification
      • Finding similar signatures, or strings, is done by searching the malware database using an approximate dictionary search.
      • The similarity ratio, s, is a measure of similarity between two signatures, calculated from the distance between their strings using the Levenshtein (edit) distance (sketched below).
        • The Levenshtein distance is the number of insertions, deletions and substitutions needed to transform one string into the other.
      • Using a similarity ratio of s = 0.9, we calculate the number of errors, e, or distance, allowed in the dictionary search.
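A minimal sketch of the similarity ratio between two signature strings, assuming the ratio is the edit distance normalised by the length of the longer string; this particular normalisation is an assumption consistent with the slide, not a quote from the paper.

```python
def levenshtein(a: str, b: str) -> int:
    """Number of insertions, deletions and substitutions to turn a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def similarity_ratio(a: str, b: str) -> float:
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

# With a required ratio s = 0.9, a signature of length n tolerates roughly
# e = (1 - 0.9) * n errors in the approximate dictionary search.
```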
    • 18. Malware Classification Algorithm
      • Similarity ratios for each control flow graph in the query binary are found based on the best approximate match in the malware database.
      • The asymmetric similarity is the sum of the weighted similarity ratios.
      • Two weights are possible for each matching flowgraph – the weight from the malware database, and the weight from the query binary – resulting in two asymmetric similarities.
      • The final result, program similarity, is the product of the two asymmetric similarities (a sketch follows).
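A minimal sketch of the accumulation step, assuming each program is represented as a list of (signature, weight) pairs with weights summing to 1, and that best_match(signature, other_program) returns the best similarity ratio found by the approximate dictionary search (0.0 if nothing falls within the allowed distance). best_match is a stand-in for the search described on the previous slides.

```python
def asymmetric_similarity(program, other, best_match):
    """Sum of similarity ratios weighted by `program`'s flowgraph weights."""
    return sum(w * best_match(sig, other) for sig, w in program)

def program_similarity(query, malware, best_match):
    """Product of the two asymmetric similarities (query- and malware-weighted)."""
    return (asymmetric_similarity(query, malware, best_match) *
            asymmetric_similarity(malware, query, best_match))

# A query is declared a variant of a database malware when
# program_similarity(...) >= 0.6 (slide 8).
```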
    • 19. Evaluation Unpacking Synthetic Samples
      • Tested the prototype against 14 public packing tools, using them to pack the Windows programs hostname.exe (shown) and calc.exe.
      • Results indicate accurate detection of the original entry point, and a speed suitable for adoption in real-time desktop Antivirus.
      Unpacking performance:

      | Name             | Time (s) | Num. Instr. |
      |------------------|----------|-------------|
      | mew              | 0.13     | 56042       |
      | fsg              | 0.13     | 58138       |
      | upx              | 0.11     | 61654       |
      | packman          | 0.13     | 123959      |
      | npack            | 0.14     | 129021      |
      | aspack           | 0.15     | 161183      |
      | pe compact       | 0.14     | 179664      |
      | expressor        | 0.20     | 620932      |
      | winupack         | 0.20     | 632056      |
      | yoda’s protector | 0.15     | 659401      |
      | rlpack           | 0.18     | 916590      |
      | telock           | 0.20     | 1304163     |
      | acprotect        | 0.67     | 3347105     |
      | pespin           | 0.64     | 10482466    |

      Unpacking accuracy:

      | Name             | Revealed code and data | Number of stages to real OEP | Stages unpacked | % of instr. to real OEP unpacked |
      |------------------|------------------------|------------------------------|-----------------|----------------------------------|
      | upx              | 13107 | 1  | 1  | 100.00 |
      | rlpack           | 6947  | 1  | 1  | 100.00 |
      | mew              | 4808  | 1  | 1  | 100.00 |
      | fsg              | 12348 | 1  | 1  | 100.00 |
      | npack            | 10890 | 1  | 1  | 100.00 |
      | expressor        | 59212 | 1  | 1  | 100.00 |
      | packman          | 10313 | 2  | 1  | 99.99  |
      | pe compact       | 18039 | 4  | 3  | 99.98  |
      | acprotect        | 99900 | 46 | 39 | 98.81  |
      | winupack         | 41250 | 2  | 1  | 98.80  |
      | telock           | 3177  | 19 | 15 | 93.45  |
      | yoda’s protector | 3492  | 6  | 2  | 85.81  |
      | aspack           | 2453  | 6  | 1  | 43.41  |
      | pespin           | err   | 23 | err | err   |
    • 20. Evaluation of Flowgraph Based Classification
      • Tested classifying the Klez (shown bottom left), Netsky (shown bottom right), and Roron families of malware.
      • Results show high similarities between malware variants.
      Klez variant similarities:

      |   | a    | b    | c    | d    | g    | h    |
      |---|------|------|------|------|------|------|
      | a |      | 0.84 | 1.00 | 0.76 | 0.47 | 0.47 |
      | b | 0.84 |      | 0.84 | 0.87 | 0.46 | 0.46 |
      | c | 1.00 | 0.84 |      | 0.76 | 0.47 | 0.47 |
      | d | 0.76 | 0.87 | 0.76 |      | 0.46 | 0.45 |
      | g | 0.47 | 0.46 | 0.47 | 0.46 |      | 0.83 |
      | h | 0.47 | 0.46 | 0.47 | 0.45 | 0.83 |      |

      Netsky variant similarities:

      |    | aa   | ac   | f    | j    | p    | t    | x    | y    |
      |----|------|------|------|------|------|------|------|------|
      | aa |      | 0.78 | 0.61 | 0.70 | 0.47 | 0.67 | 0.44 | 0.81 |
      | ac | 0.78 |      | 0.66 | 0.75 | 0.41 | 0.53 | 0.35 | 0.64 |
      | f  | 0.61 | 0.66 |      | 0.86 | 0.46 | 0.59 | 0.39 | 0.72 |
      | j  | 0.70 | 0.75 | 0.86 |      | 0.52 | 0.67 | 0.44 | 0.83 |
      | p  | 0.47 | 0.41 | 0.46 | 0.52 |      | 0.61 | 0.79 | 0.56 |
      | t  | 0.67 | 0.53 | 0.59 | 0.67 | 0.61 |      | 0.61 | 0.79 |
      | x  | 0.44 | 0.35 | 0.39 | 0.44 | 0.79 | 0.61 |      | 0.49 |
      | y  | 0.81 | 0.64 | 0.72 | 0.83 | 0.56 | 0.79 | 0.49 |      |
    • 21. Evaluation of Flowgraph Based Classification (cont)
      • Examined similarities between unrelated malware and programs (left).
      • Evaluated likely occurrence of false positives by calculating the similarities between the set of Windows Vista system programs, which are mostly not similar to each other (right).
      • Most programs showed a low similarity to others.
      Similarity histogram for Windows Vista system programs:

      | Similarity | Matches |
      |------------|---------|
      | 0.0        | 105497  |
      | 0.1        | 2268    |
      | 0.2        | 637     |
      | 0.3        | 342     |
      | 0.4        | 199     |
      | 0.5        | 121     |
      | 0.6        | 44      |
      | 0.7        | 72      |
      | 0.8        | 24      |
      | 0.9        | 20      |
      | 1.0        | 6       |

      Similarities between unrelated malware and programs:

      |           | cmd.exe | calc.exe | netsky.aa | klez.a | roron.ao |
      |-----------|---------|----------|-----------|--------|----------|
      | cmd.exe   |         | 0.00     | 0.00      | 0.00   | 0.00     |
      | calc.exe  | 0.00    |          | 0.00      | 0.00   | 0.00     |
      | netsky.aa | 0.00    | 0.00     |           | 0.19   | 0.08     |
      | klez.a    | 0.00    | 0.00     | 0.19      |        | 0.15     |
      | roron.ao  | 0.00    | 0.00     | 0.08      | 0.15   |          |
    • 22. Conclusion
      • Malware can be classified according to similarity between flowgraphs.
      • We proposed algorithms to perform fast unpacking and to classify malware.
      • Automated unpacking was demonstrated to be effective on synthetically packed samples, and fast enough for desktop Antivirus.
      • Finally, we demonstrated that by using our classification system, real malware variants could be identified.
