
Behind the scenes of malware operators. Insights and countermeasures. CONFidence 2018, Kraków, 05.06.2018


Modern cybercrime runs highly sophisticated campaigns that challenge, or even evade, the state of the art in defense and protection. Every day, users worldwide are fooled by new techniques and threats that fly under the radar, such as new 0-days or attack vectors. We passively monitored how these attacks are conducted on real installations and unveiled the modus operandi of malware operators. In this presentation, we share our recent findings and the trends we observed in the wild from our analysis of 3 million software downloads, involving hundreds of thousands of Internet-connected machines. During the talk, we provide insights from our investigation, such as the effects of code-signing abuse, the compromise of cloud providers' operations, the use of automatically generated domains for social engineering, and the business model behind modern malware campaigns. We also discuss the problem of "unknown threats", showing how the Internet's threat landscape is still largely unexplored and how badly it impacts millions of users. We conclude with a proof-of-concept system that we designed, which uses machine learning to generate human-readable detection rules. Our system represents a potential mitigation for the problem of "unknown threats" and an assistance tool for analysts globally.



  1. Behind the scenes of malware operators. Insights and countermeasures. Dr. Marco Balduzzi @embyte Kraków, 05.06.2018
  2. 2010 Europe US Asia
  3. 200,000 ASes (2010) → 50,000,000 ASes (2018)
  4. Benign Software Malicious Software
  5. Unknown Software
  6. Experiment ● 3 million software binaries – Downloaded and executed – Not white-listed ● From hundreds of thousands of Internet-connected machines ● 2 years later: best-effort labeling – Internal DBs + VT
  7. KNOWN = 17% ?
  8. 69% of MACHINES Executed Unknown Content!
  9. GOAL → Reduce the ‘unknowns’ ←
  10. APPROACH Learn from the visible, the ‘known’ Condense this knowledge into an intelligent system Let the system decide for us
  11. What do users download and execute? ● Very “unprevalent” software ● Software whose download URL is not white-listed – E.g., Microsoft update URLs are white-listed and thus excluded
  12. Distribution Model ● Popular websites host more malicious files than benign ones ● Heavy use of file-hosting providers like Softonic, CloudFront and MediaFire
  13. Droppers and PUPs ● Embedded in questionable software ● Re-packaging ● Actors need to maximize distribution
  14. Social Engineering Will Never Die! ● Adware ● Domains resembling media-streaming websites ● Also observed in malvertising
  15. Social Engineering Will Never Die! ● FakeAV ● Domains resembling antivirus software companies ● wmicrodefender27.nl offers malware disguised as Windows Defender Antivirus to Dutch users
  16. Code Signing Adoption in Malware ● Malware is signed more often than benign software ● Browser-downloaded malware is signed most often ● First-stage vs second-stage malware
  17. Code Signing Abuse ● Stuxnet – Targets SIMATIC WinCC, a SCADA/HMI system by Siemens ● Signature from Realtek Semiconductor – Later revoked ● Signature from JMicron Technology
  18. Code Signing Abuse ● Massive hack against Sony Pictures (2014) ● Valid certificates sold in the underground ● Acquired by the actors operating the Destover campaign
  19. Fraudulent Certs ● Social Engineering
  20. Poor Validation at CA Level ● First option ● Applies to PKI classes 2 and 3 as well ● Examples are Comodo and Certum
  21. Fraudulent Certs ● Stolen, upon compromise or leak
  22. Questionable “organizations” ● Sign and distribute both benign and unwanted software ● Mainly PUPs
  23. Software Distribution ● 4 categories: Browsers, Windows Processes, Java, Acrobat
  24. Software Distribution ● Popularity of the four categories (Browsers, Windows Processes, Java, Acrobat)
  25. Software Distribution ● Infection rates across the four categories
  26. Software Distribution ● Observation: unpatched Windows?
  27. Software Distribution ● Observation: malicious? Sound rec, custom calendar, etc.
  28. Browser Infections ● Chrome beats the other browsers – most prevalent ● IE automatically patched by corporate policies?
  29. Business Model of Operators ● Campaign 1 → Campaign 2 → Campaign 3 ?
  30. Business Model of Operators ● Malware operators stick to their malware campaign of choice ● Case: Ransomware→Ransomware transitions account for 80% ● Reasons: – Technological bar higher than in the early 2000s – Different economic model, i.e. monetization and operational costs
  31. PUPs & Adware: the new First Stage?
  32. Actionable Intelligent System ● Ingests observations from the “known world” ● Produces detection rules – Human-readable! – Immediately applicable – High detection rate, low error rate
  33. PART ● Partial Decision Trees ● Uses security-related features ● Pruning and optimization
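PART (partial decision trees, as in Weka's rule learner) combines decision-tree induction with separate-and-conquer rule learning. As a rough illustration of the covering idea behind it, here is a minimal pure-Python sketch that greedily extracts error-free single-condition rules from labeled download events; the feature names and toy data are assumptions for illustration, not the actual dataset or algorithm of the talk:

```python
def learn_rules(examples, max_rules=10):
    """Greedy separate-and-conquer: repeatedly pick the error-free
    single-condition rule covering the most malicious samples, then
    remove the samples it covers."""
    rules, remaining = [], list(examples)
    while remaining and len(rules) < max_rules:
        best = None  # (precision, coverage, feature, value)
        for feat in sorted(remaining[0].keys() - {"label"}):
            for val in sorted({ex[feat] for ex in remaining}):
                covered = [ex for ex in remaining if ex[feat] == val]
                hits = sum(1 for ex in covered if ex["label"] == "malicious")
                if hits == 0:
                    continue
                cand = (hits / len(covered), len(covered), feat, val)
                if best is None or cand[:2] > best[:2]:
                    best = cand
        if best is None or best[0] < 1.0:  # keep only error-free rules
            break
        _, _, feat, val = best
        rules.append((feat, val))
        remaining = [ex for ex in remaining if ex[feat] != val]
    return rules

# Toy training set; signer/packer values are hypothetical.
data = [
    {"file_signer": "Somoto ltd.", "packer": "custom", "label": "malicious"},
    {"file_signer": "Somoto ltd.", "packer": "upx",    "label": "malicious"},
    {"file_signer": "Microsoft",   "packer": "none",   "label": "benign"},
    {"file_signer": "none",        "packer": "custom", "label": "malicious"},
    {"file_signer": "Microsoft",   "packer": "upx",    "label": "benign"},
]

for feat, val in learn_rules(data):
    print(f'IF {feat} = "{val}" -> MALICIOUS')
```

Real PART builds multi-condition rules from partial trees; the sketch keeps only the core property the slides highlight: each emitted rule is human-readable and error-free on the data it was learned from.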
  34. Features by category. Downloaded File: Signer Name, CA Name, Packer Name; Downloading Process: Signer Name, CA Name, Packer Name, Category; Downloading Domain: Popularity (Alexa)
  35. Example rule: IF File Signer = “Apps Installer S.L.” AND File CA = “thawte code signing ca g2” AND Process Signer = “Microsoft Windows” → MALICIOUS (features as in slide 34)
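Rules of this shape stay human-readable while being trivially machine-applicable. A minimal sketch of a matcher for the slide's example rule, with lower-cased field names chosen as assumptions:

```python
# The slide's example rule expressed as data: a conjunction of
# (feature, value) equality conditions. Field names are assumptions.
RULE = {
    "file_signer":    "Apps Installer S.L.",
    "file_ca":        "thawte code signing ca g2",
    "process_signer": "Microsoft Windows",
}

def matches(rule, event):
    """A rule fires when every condition equals the event's value."""
    return all(event.get(feat) == val for feat, val in rule.items())

event = {
    "file_signer":    "Apps Installer S.L.",
    "file_ca":        "thawte code signing ca g2",
    "process_signer": "Microsoft Windows",
    "domain_alexa":   250_000,          # extra features are ignored
}
print("MALICIOUS" if matches(RULE, event) else "no match")  # prints MALICIOUS
```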
  36. Pipeline: Training Set (Month X) + Configuration (Features + Parameters) → PART
  37. PART outputs ~1,500 rules
  38. Pruning (τ=0) reduces them to a subset of ~1,000 rules
  39. The subset is applied to the Testing Set (Month X+1) for TP/FP evaluation
  40. Evaluation yields the Operational Rules
  41. The Operational Rules are applied to the Unknown Set
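The monthly build-up can be sketched end to end: learn candidate rules on month X, drop any rule that produces a false positive on the training data (τ = 0), evaluate TP/FP on month X+1, and ship the survivors as operational rules. The rule format and datasets below are illustrative assumptions:

```python
# Hedged sketch of the slides' pipeline: rules learned on month X are
# pruned at tau = 0, evaluated on month X+1, and the survivors become
# the operational rule set applied to unknown samples.

def fires(rule, sample):
    return all(sample.get(f) == v for f, v in rule.items())

def prune(rules, train, tau=0):
    """Keep rules with at most tau false positives on the training set."""
    return [r for r in rules
            if sum(1 for s in train
                   if fires(r, s) and s["label"] == "benign") <= tau]

def evaluate(rules, test):
    """Count true/false positives of the rule set on a labeled set."""
    tp = sum(1 for s in test
             if s["label"] == "malicious" and any(fires(r, s) for r in rules))
    fp = sum(1 for s in test
             if s["label"] == "benign" and any(fires(r, s) for r in rules))
    return tp, fp

train = [  # month X (toy data)
    {"file_signer": "Somoto ltd.", "label": "malicious"},
    {"file_signer": "Somoto ltd.", "label": "malicious"},
    {"file_signer": "Microsoft",   "label": "benign"},
]
candidate_rules = [
    {"file_signer": "Somoto ltd."},  # error-free on the training set
    {"file_signer": "Microsoft"},    # fires on a benign sample -> pruned
]
operational = prune(candidate_rules, train, tau=0)

test = [  # month X+1 (toy data)
    {"file_signer": "Somoto ltd.", "label": "malicious"},
    {"file_signer": "Microsoft",   "label": "benign"},
]
print(operational, evaluate(operational, test))  # tp=1, fp=0
```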
  42. KNOWN = +30%
  43. Examples ● Process = “Acrobat Reader” → Malicious
  44. File Signer = “Somoto ltd.” → Malicious
  45. File Signer = None AND Domain = unpopular [*] AND Process Signer = “Microsoft Windows” AND Process = Benign → Malicious [*] Alexa rank beyond position 100,000
  46. Adversarial Machine Learning ● Machine learning is prone to evasion ● Two research directions – Detect attacks – Design robust algorithms ● https://evademl.org
  47. Discussion ● Our approach can be evaded, but at what cost? ● Evasion would require changing the signature and/or packer for each polymorphic variant ● Signature: – Acquiring valid certificates is non-trivial ● Packer: – Attackers can switch to benign packers (instead of custom ones) → code analysis becomes trivial!
  48. Thanks! http://www.madlab.it @embyte
