4. Extracted Data from Apple AppStore
• 3 Industries: Finance, eCommerce, Social Media
• 50 Applications:
• Finance: Akbank, Bloomberg, Chase, ING, PayPal…
• eCommerce: AliBaba, Amazon, eBay, Kliksa, n11…
• Social Media: Ask.fm, BIP, Facebook, Instagram,
Tango…
• 1051 Revisions
5. Revision Age of Mobile Apps
[Bar chart: share of revisions by revision age in days, binned as 0-3, 4-7, 8-15, 16-31, 32-63, 64-127, 128-255, 256-511, 512-…, broken down by industry (eCommerce, Finance, Social Media); y-axis: 0% to 35%. Source: Apple, n=1051]
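The revision-age bins on the chart double in width at each step (0-3, 4-7, 8-15, …), i.e. they are power-of-two buckets. A minimal sketch of how a revision age in days maps to its bucket (the function name and labels are illustrative, not from the source):

```python
import math

# Power-of-two day buckets as used on the chart:
# 0-3, 4-7, 8-15, 16-31, 32-63, 64-127, 128-255, 256-511, ...
def age_bucket(days: int) -> str:
    if days <= 3:
        return "0-3"
    lo = 1 << int(math.log2(days))   # largest power of two <= days
    return f"{lo}-{2 * lo - 1}"
```

For example, a revision that is 10 days old falls into the "8-15" bucket.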
6. Average App Revision Count = 21.02
[Bar chart: average total number of versions per app (y-axis: 0 to 30) by first version release year, 2009-2015. Source: Apple, n=50 apps (1051 revisions)]
8. The Situation
• Mobile app development
  • Agile is the de facto standard
  • New app releases are under pressure
    • Competition
    • OS updates & new devices
  • A new version every … 16-31 days (& decreasing)
9. The Requirement
• Detailed manual penetration testing for each
version (before release)
• If possible, source code review
10. The Situation for Mobile App
Security
• Testing (mobile) applications is
  • Expensive
  • Time- and resource-consuming
  • Mostly performed periodically
    • Once a year
    • 4 times a year
11. Problem
• Detailed penetration testing is NOT possible (each time)
  • Not enough time
  • Not enough resources
  • "Not needed!": most of the app is the same (is it?)
• Each new version is subject to
  • Developer mistakes
  • Security vulnerabilities (including re-appearing ones)
• Time pressure
  • Less user acceptance testing
  • Less developer & unit testing
  • Less code review
12. BEST-Mobile
• Best Effort Security Testing for Mobile
Applications
• Continue periodical, detailed pen-testing...
• For each new version: Perform mobile application
security tests with best effort and best efficiency
• Use: Automated tools &
Security Testing as a Service
13. Question
• Can we trust automated testing?
• Is there enough investment in automated
mobile vulnerability testing?
15. Data From Mobile App Testers
• Source
• OWASP – WhiteHat, Pure Hacking, Lostar,
BugCrowd, SecureNetwork, Metaintelli
• 2014 – Detected and reported issues, n=1065
• Overall 54% found by automated tools
• How about risk level?
16. Top 10 Mobile Risks
#  Risk                                     Exploitability  Prevalence  Detectability  Impact    BEST Capability*
1  Weak Server Side Controls                Easy            Common      Average        Severe    61%
2  Insecure Data Storage                    Easy            Common      Average        Severe    64%
3  Insufficient Transport Layer Protection  Difficult       Common      Easy           Moderate  91%
4  Unintended Data Leakage                  Easy            Common      Easy           Severe    24%
5  Poor Authorization and Authentication    Easy            Common      Easy           Severe    14%
17. Top 10 Mobile Risks
#   Risk                                     Exploitability  Prevalence  Detectability  Impact    BEST Capability*
6   Broken Cryptography                      Easy            Common      Easy           Severe    90%
7   Client Side Injection                    Easy            Common      Easy           Moderate  No data
8   Security Decisions Via Untrusted Inputs  Easy            Common      Easy           Severe    No data
9   Improper Session Handling                Easy            Common      Easy           Severe    16%
10  Lack of Binary Protections               Medium          Common      Easy           Severe    82%
18. Verdict
• Automated testing can NOT replace manual
mobile penetration testing and code review.
• It can, however, be used to cover the basics
(i.e. useful in the sense of the Pareto rule)
20. Proposed Testing Process
2. Use Classification for Pre-Release Security Testing

Class   Major Release                         Minor Release                         Regular Reviews
High    Pre: PenTest + Code Review            Pre: PenTest (Opt/Post: Code Review)  External (PenTest + Code Review)
Medium  Pre: PenTest (Opt/Post: Code Review)  Pre: BEST (Opt/Post: PenTest)         PenTest + Code Review
Low     Pre: BEST (Post: PenTest)             Pre/Post: BEST                        PenTest
21. Proposed Testing Process
3. Decision Matrix
App Class  Major Issue                       Minor Issue
High       Postpone to fix                   Postpone for Risk Advisory Board
Medium     Postpone for Risk Advisory Board  Postpone to verify and decide
Low        Issue verification                Deploy, then fix
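The decision matrix is a simple lookup on app class and worst issue severity. A minimal sketch, assuming the release decision depends only on those two inputs (the function and dictionary names are illustrative):

```python
# Release-decision matrix from the slide: (app class, issue severity) -> action.
DECISION_MATRIX = {
    ("High", "Major"):   "Postpone to fix",
    ("High", "Minor"):   "Postpone for Risk Advisory Board",
    ("Medium", "Major"): "Postpone for Risk Advisory Board",
    ("Medium", "Minor"): "Postpone to verify and decide",
    ("Low", "Major"):    "Issue verification",
    ("Low", "Minor"):    "Deploy, then fix",
}

def release_decision(app_class: str, issue_severity: str) -> str:
    """Look up the action for a given app class and issue severity."""
    return DECISION_MATRIX[(app_class, issue_severity)]
```

For example, a major issue in a high-class app postpones the release until it is fixed.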
23. Next Step
• Put BEST into mobile SDLC
• Strategies
  • Send the beta product to the BEST tool
  • Integrate the BEST tool with the Bug Tracking System
    • Create automated bug/issue entries for each finding
• When
  • At the end of each sprint
  • Part of the testing process
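The bug-tracker integration step above amounts to mapping each tool finding to an issue entry. A minimal sketch of that mapping; the finding format, issue field names, and labels are assumptions for illustration, not any real tracker's API (actual submission would go through the tracker's own REST interface):

```python
# Hypothetical: convert a BEST-tool finding into a bug-tracker issue payload.
def finding_to_issue(finding: dict) -> dict:
    return {
        # Prefix makes automated entries easy to filter in the tracker.
        "title": f"[BEST] {finding['risk']} in {finding['app']}",
        "severity": finding.get("severity", "Medium"),
        "labels": ["security", "best-mobile", "automated"],
        "description": finding.get("detail", ""),
    }
```

Creating one issue per finding at the end of each sprint keeps security results in the same workflow as functional bugs.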