Measuring the Actual Security that Vendors Provide to Customers

“There is a desperate need for new standards for today’s anti-virus products. The dominant paradigm, scanning directories of files, is focused on old and known threats, and reveals little about product efficacy in the wild.”
Williamson & Gorelik (2007)



Transcript

  • 1. Measuring the Actual Security that Vendors Provide to Customers: the need for AV product testing reform. An executive session with Trend Micro CTO Raimund Genes and industry guests.
  • 2. Four ways to stop malware
    • Block malware from arriving at the endpoint.
        • e.g., web filtering; web reputation services
    • Stop malware files from executing on the endpoint.
        • e.g., signature-based scanning of files
    • Interrupt malware doing bad things on execution.
        • e.g., behavior monitoring
    • Protect vulnerabilities from being exploited.
        • e.g., disable access to known vulnerabilities until patched
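The four layers above can be viewed as an ordered pipeline: a sample that slips past one layer may still be stopped by the next. The sketch below is purely illustrative — the names (`Sample`, `BAD_SOURCES`, `protect`) are hypothetical, not any vendor's API.

```python
# Hypothetical sketch of the four protection layers as an ordered pipeline.
from dataclasses import dataclass

@dataclass
class Sample:
    source_url: str            # where the file came from
    signature_known: bool      # matches a signature-database entry
    behaves_badly: bool        # exhibits malicious behavior when run
    exploits_known_vuln: bool  # targets a known, shieldable vulnerability

BAD_SOURCES = {"malware.example.com"}  # illustrative reputation list

def protect(sample: Sample) -> str:
    """Return the first layer that stops the sample, or 'not stopped'."""
    if sample.source_url in BAD_SOURCES:
        return "blocked at arrival (web reputation)"
    if sample.signature_known:
        return "blocked before execution (signature scan)"
    if sample.behaves_badly:
        return "interrupted at runtime (behavior monitoring)"
    if sample.exploits_known_vuln:
        return "exploit blocked (vulnerability shielding)"
    return "not stopped"
```

Note that a sample from a blacklisted source never reaches the signature scan at all — which is exactly the protection that detection-only testing fails to credit.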
  • 3. But traditional testing only counts one
    • Block malware from arriving at the endpoint.
        • e.g., web filtering; web reputation services
    • Stop malware files from executing on the endpoint.
        • e.g., signature-based scanning of files
    • Interrupt malware doing bad things on execution.
        • e.g., behavior monitoring
    • Protect vulnerabilities from being exploited.
        • e.g., disable access to known vulnerabilities until patched
    Traditional AV product testing only measures detection
  • 4. As a result… “There is a desperate need for new standards for today’s anti-virus products. The dominant paradigm, scanning directories of files, is focused on old and known threats, and reveals little about product efficacy in the wild.” Williamson & Gorelik (2007)
  • 5. Test Labs are responding
    • Independent testing labs have introduced new testing methods
    • Many new metrics attempt to better measure actual security
    • But the Labs have trouble keeping up with changes:
      • New cybercriminal techniques
      • Anti-malware solution innovations
    • As a result, there is now chaos in AV testing metrics & results
  • 6. AV Testing Metrics Chaos
    • Names of Testing Metrics
    • Anti-malware detection
    • Caught initially on download
    • Caught on first exposure
    • Caught subsequently on execution
    • Caught with repeated exposure
    • Drive-by-download protection
    • Dynamic detection
    • End-to-end web threat protection
    • Exposure layer web threat protection
    • Infection layer web threat protection
    • Internet-connected detection
    • Malware blocking
    • Malware detection
    • Overall web threat protection
    • Proactive detection
    • Web security blocking
    • Web threat blocking effectiveness
    • Whole product dynamic test
    • Zero-day protection
  • 7. AV Testing Metrics Chaos
    • No consistency of testing method
    • No consistency of applied threat stimuli
    • No consistency of metrics definition
    • No consistency of results
    • Consequently: little value to buyers of security products
  • 8. Blocking in the cloud before arrival
    • 92% of malware arrives over the Internet.
    • The source is often easier to identify than the malware files.
    • Blocking files from a bad source does not require file detection.
    • Traditional test methods do not credit blocking by source URL.
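A test harness that credits blocking at any layer would score a product quite differently from one that only counts file detection. The sketch below is a hypothetical illustration: each threat is recorded as a list of (stage, blocked) observations.

```python
# Hypothetical scoring sketch: each threat is a list of (stage, blocked)
# observations, e.g. [("url", True), ("execution", False)].

def score_protection(threats):
    """Credit a threat as stopped if it was blocked at ANY stage."""
    stopped = sum(1 for stages in threats
                  if any(blocked for _, blocked in stages))
    return stopped / len(threats)

def score_detection_only(threats):
    """Traditional scoring: only file detection at execution counts."""
    detected = sum(1 for stages in threats
                   if any(blocked for stage, blocked in stages
                          if stage == "execution"))
    return detected / len(threats)

threats = [
    [("url", True), ("execution", False)],   # stopped by source reputation
    [("url", False), ("execution", True)],   # stopped by file detection
    [("url", False), ("execution", False)],  # missed entirely
]
```

On this toy data the product protected against two of three threats, yet a detection-only score would credit it with just one — understating the protection a customer actually received.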
  • 9. A new threat every 1.5 sec.
    • Thousands of new threats per day overwhelm test methods.
    • Stored threats become irrelevant before a test is completed.
    • Speed of response to new threats is more important than detection of old threats.
    • Many threats are “old” in hours to days – not weeks.
  • 10. How long to respond to a new threat? … a metric that shows real differences among vendors.
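One way to compute such a metric — a hypothetical sketch with illustrative timestamps, not real measurements — is the gap between a threat's first appearance in the wild and the vendor's first successful block:

```python
# Hypothetical "time-to-protect" computation.
from datetime import datetime, timedelta
from statistics import median

def time_to_protect(first_seen: datetime, first_blocked: datetime) -> timedelta:
    """Gap between a threat's first sighting and the vendor's first block."""
    return first_blocked - first_seen

# Illustrative timestamps, not real measurements.
observations = [
    (datetime(2010, 5, 1, 9, 0), datetime(2010, 5, 1, 11, 30)),  # 2h30m
    (datetime(2010, 5, 2, 8, 0), datetime(2010, 5, 2, 8, 45)),   # 45m
]
gaps = [time_to_protect(seen, blocked) for seen, blocked in observations]
median_ttp = median(gaps)  # the median resists a few slow outliers
```

Reporting a median (or percentile) rather than a mean keeps a handful of very slow responses from dominating the comparison between vendors.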
  • 11. Key principles for AV product testing
    • Credit for “protection” instead of “detection”
    • “Real-time” or “dynamic” testing
    • Reproducibility: Statistical not deterministic
    • Broad and diverse relevant threat samples
    • Measuring the vendor response
    • e.g., “time-to-protect”
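“Statistical, not deterministic” reproducibility means reporting a protection rate with an uncertainty bound rather than a single pass/fail count. One standard way to do that is a Wilson score confidence interval; the function below is an illustrative sketch, not part of any lab's methodology.

```python
# Sketch: 95% Wilson score confidence interval for a protection rate.
import math

def wilson_interval(blocked, total, z=1.96):
    """Confidence interval for the true protection rate given a sample."""
    p = blocked / total
    denom = 1 + z * z / total
    center = (p + z * z / (2 * total)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / total
                                   + z * z / (4 * total * total))
    return center - half, center + half

# Blocking 90 of 100 samples does not mean exactly "90% protection":
low, high = wilson_interval(90, 100)  # roughly 0.83 to 0.94
```

Two products scoring 90/100 and 93/100 on the same sample set may be statistically indistinguishable — an honest test report should say so rather than rank them.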
  • 12. Comments from Industry Guests
    • Gerhard Eschelbeck
    • CTO & SVP Engineering at Webroot
    • Vik Phatak
    • Chairman & CTO at NSS Labs
    • Andreas Marx
    • CEO at AV-Test
    • Anil Somayaji
    • Director, Computer Security Lab, Carleton University