Transcript

  • 1. Phinding Phish: An Evaluation of Anti-Phishing Toolbars
    Yue Zhang, Serge Egelman, Lorrie Cranor, and Jason Hong
  • 2. Anti-Phishing Tools
    - 84 listed on download.com (Sept. '06)
    - Included in many browsers
    - Poor usability
      - Many users don't see indicators
      - Many choose to ignore them
      - But usability is being addressed
    - Are they accurate?
  • 3. Tools Tested
    - CallingID
    - Cloudmark
    - EarthLink
  • 4. Tools Tested
    - eBay
    - Firefox
  • 5. Tools Tested
    - IE7
  • 6. Tools Tested
    - Netcraft
    - Netscape
  • 7. Tools Tested
    - SpoofGuard
    - TrustWatch
  • 8. Source of Phish
    - High volume of fresh phish needed
      - Sites taken down after a day on average
      - Fresh phish yield blacklist update information
    - Can't use toolbar blacklists
    - We experimented with several sources
      - APWG: high volume, but many duplicates and legitimate URLs included
      - PhishTank.org: lower volume, but easier to extract phish
      - Assorted other phish archives: often low volume or not fresh enough
  • 9. Phishing Feeds
    - Anti-Phishing Working Group
      - [email_address].org
      - ISPs, individuals, etc.
      - >2,000 messages/day
      - Filtering out URLs from messages
    - PhishTank
      - http://www.phishtank.org/
      - Submitted by the public
      - ~48 messages/day
      - Manually verify URLs
  • 10. Testbed for Anti-Phishing Toolbars
    - Automated testing
    - Aggregate performance statistics
    - Key design issue:
      - Different browsers
      - Different toolbars
      - Different indicator types
    - Solution: image analysis
      - Compare screenshots with known states
  • 11. Image-Based Comparisons
    - Two examples: TrustWatch and Google
    - TrustWatch: screenshot of the indicator matched against reference states ("Verified" / "Not verified")
    - Google: screenshot of the indicator matched against reference states ("Phish!!" / "Warning!!")
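A minimal sketch of the screenshot-matching idea from slides 10 and 11, assuming Pillow is available; the file names, crop box, and set of reference states are illustrative placeholders, not the authors' actual testbed code.

    # Crop the toolbar indicator region from a full browser screenshot and
    # compare it pixel-for-pixel against reference screenshots of known states.
    from PIL import Image, ImageChops

    # Reference screenshots (full browser windows) in each known state;
    # the paths and state names here are hypothetical.
    REFERENCE_STATES = {
        "warning":      "refs/trustwatch_warning.png",
        "verified":     "refs/trustwatch_verified.png",
        "not_verified": "refs/trustwatch_not_verified.png",
    }

    # Bounding box of the toolbar indicator (left, upper, right, lower);
    # the real coordinates depend on the browser and toolbar.
    INDICATOR_BOX = (0, 80, 200, 110)

    def classify_screenshot(path):
        """Return the first reference state whose indicator region matches."""
        region = Image.open(path).convert("RGB").crop(INDICATOR_BOX)
        for state, ref_path in REFERENCE_STATES.items():
            ref_region = Image.open(ref_path).convert("RGB").crop(INDICATOR_BOX)
            # getbbox() is None when the two regions are identical.
            if ImageChops.difference(region, ref_region).getbbox() is None:
                return state
        return "unknown"

    print(classify_screenshot("screenshots/worker1_trustwatch.png"))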
  • 12. Testbed System Architecture
  • 13. Testbed System Architecture: Retrieve Potential Phishing Sites
  • 14. Testbed System Architecture: Send URL to Workers
  • 15. Testbed System Architecture: Worker Evaluates Potential Phishing Site
  • 16. Testbed System Architecture: Task Manager Aggregates Results
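A rough sketch of the control flow the architecture slides (12-16) outline: a task manager retrieves candidate phishing URLs, sends each to a worker, and aggregates the workers' verdicts. The worker below is a stub; in the real testbed each worker drives a browser with a toolbar installed and classifies a screenshot. All names are illustrative assumptions.

    from collections import Counter
    from concurrent.futures import ThreadPoolExecutor

    def fetch_candidate_urls():
        # Stand-in for "Retrieve Potential Phishing Sites" from a feed.
        return ["http://phish1.example/", "http://phish2.example/"]

    def worker_evaluate(url):
        # Stand-in for "Worker Evaluates Potential Phishing Site": load the
        # URL in an instrumented browser, screenshot the toolbar, and
        # classify its indicator state.
        return {"url": url, "verdict": "warning"}

    def task_manager(num_workers=2):
        urls = fetch_candidate_urls()                            # retrieve sites
        with ThreadPoolExecutor(max_workers=num_workers) as pool:
            results = list(pool.map(worker_evaluate, urls))      # send URLs to workers
        # "Task Manager Aggregates Results" into summary statistics.
        return Counter(r["verdict"] for r in results)

    print(task_manager())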
  • 17. Experiment Methodology
    - Catch rate: given a set of phishing URLs, what percentage of them are correctly labeled as phish by the tool
      - Count "block" and "warning" verdicts only
      - Taken-down sites removed
    - False positives: given a set of legitimate URLs, what percentage of them are incorrectly labeled as phish by the tool
      - Count "block" and "warning" verdicts only
      - Taken-down sites removed
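Reading the two definitions above as simple ratios; the variable names below are mine, not the paper's.

    def catch_rate(phish_flagged, phish_total, phish_taken_down):
        # Only "block" and "warning" verdicts count as catches; sites taken
        # down before a test run are removed from the denominator.
        return phish_flagged / (phish_total - phish_taken_down)

    def false_positive_rate(legit_flagged, legit_total):
        # Fraction of legitimate URLs incorrectly labeled as phish.
        return legit_flagged / legit_total

    # Example using the SpoofGuard figure reported on the false-positive
    # slide: 218 of 514 legitimate URLs flagged, roughly 42%.
    print(false_positive_rate(218, 514))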
  • 18. Experiment 1
    - PhishTank feed used
    - Equipment:
      - 1 notebook as task manager
      - 2 notebooks as workers
    - 10 tools examined:
      - Cloudmark
      - EarthLink
      - eBay
      - IE7
      - Google/Firefox
      - McAfee
      - Netcraft
      - Netscape
      - SpoofGuard
      - TrustWatch
  • 19. Experiment 1
    - 100 phishing URLs
      - PhishTank feed
      - Manually verified
      - Re-examined at 1, 2, 12, and 24 hour intervals
      - Examined blacklist update rate (except with SpoofGuard)
      - Examined take-down rate
    - 514 legitimate URLs
      - 416 from the 3Sharp report
      - 35 from bank log-in pages
      - 35 from top pages by Alexa
      - 30 random pages
  • 20. Experiment 2
    - APWG phishing feed
    - 9 of the same toolbars tested, plus CallingID
    - Same testing environment
  • 21. Results of Experiment 1
  • 22. Results of Experiment 2
  • 23. False Positives
    Not a big problem for most of the toolbars:
    Toolbar      False Positives
    EarthLink    5 (1%)
    Cloudmark    5 (1%)
    CallingID    10 (2%)
    SpoofGuard   218 (42%)
  • 24. Overall findings
    - No toolbar caught 100%
    - Good performers:
      - SpoofGuard (>90%), though with 42% false positives
      - IE7 (70%-80%)
      - Netcraft (60%-80%)
      - Firefox (50%-80%)
    - Most performed poorly:
      - Netscape (10%-30%)
      - CallingID (20%-40%)
  • 25. More findings
    - Performance varied with feed
      - Better with PhishTank: Cloudmark, EarthLink, Firefox, Netcraft
      - Better with APWG: eBay, IE7, Netscape
      - Almost the same: SpoofGuard, TrustWatch
    - Different increases over time
      - More increases on APWG
      - Reflects the "freshness" of URLs
  • 26. CDN Attack
    - Many tools use blacklists
    - Many examine IP addresses (location, etc.)
    - Proxies distort URLs
      - Used Coral CDN
      - Append .nyud.net:8090 to URLs
      - Uses PlanetLab
    - Works on:
      - Cloudmark
      - Google
      - TrustWatch
      - Netcraft
      - Netscape
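A sketch of the Coral CDN rewriting the slide refers to: the suffix .nyud.net:8090 is appended to the hostname, so blacklist lookups and IP-based heuristics see a different host than the original phishing site. The example URL is hypothetical.

    from urllib.parse import urlsplit, urlunsplit

    def coralize(url):
        # Rewrite the URL so it is fetched through the Coral CDN proxy.
        parts = urlsplit(url)
        coral_netloc = f"{parts.hostname}.nyud.net:8090"
        return urlunsplit((parts.scheme, coral_netloc, parts.path,
                           parts.query, parts.fragment))

    print(coralize("http://phish.example.com/login.html"))
    # -> http://phish.example.com.nyud.net:8090/login.html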
  • 27. Page Load Attack
    - Some tools wait for the page to be fully loaded
      - SpoofGuard
      - eBay
    - Insert a web bug that takes an infinite time to load
      - 5 lines of PHP
      - 1x1 GIF
      - Infinite loop spitting out data very slowly
    - Tool stays in its previous state
    - Unable to indicate anything
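The slide describes a roughly 5-line PHP script serving a 1x1 GIF that never finishes loading; below is a rough Python stand-in for the same trick (not the authors' script), using only the standard library: the response body trickles out forever, so a toolbar that waits for the page to finish loading never updates its indicator.

    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class SlowWebBug(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.end_headers()
            self.wfile.write(b"GIF89a")     # start of a (never-completed) GIF
            while True:                     # never terminate the response
                self.wfile.write(b"\x00")
                self.wfile.flush()
                time.sleep(10)              # one byte every 10 seconds

    # The phishing page would embed something like
    # <img src="http://attacker.example:8000/bug.gif"> (hypothetical host).
    HTTPServer(("", 8000), SlowWebBug).serve_forever()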
  • 28. Conclusion
    - Tool performance
      - No toolbar is perfect
      - No single toolbar outperforms all others
      - Heuristics have false positives
        - Whitelists?
        - Hybrid approach?
    - Testing methodology
      - Get fresher URLs
      - Test settings other than the defaults
    - User interfaces
      - Usability is important
        - Traffic light?
        - Pop-up message?
        - Redirect page?
  • 29. CMU Usable Privacy and Security Laboratory
    http://cups.cs.cmu.edu/
