Attack-based Exploratory Testing and
The Mobile Tester
(with bonus round on ISO 29119)
Jon D. Hagar, Consultant, Grand Software Testing
embedded@ecentral.com
Author: Software Test Attacks to Break
Mobile and Embedded Devices
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC –
“Software Test Attacks to Break Mobile and Embedded Devices”
1
18th June 2015
http://www.gallop.net/meetups
 Gaming Testing Story
 It only takes a few minutes using an App before users like or hate it
 Worse than that. . .
 Many users will post a social media review of the app
 You don’t want to be a BAD
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
2
The Mobile Opportunity
 Depth
 Passion
 Speed
What Does it Take to be a Great
Mobile App Tester?
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC –
Software Test Attacks to Break Mobile and Embedded Devices
3
 As the names imply, these are devices—small, held in the hand, connected
to communication networks, including
 Cell and smart phones – apps
 Tablets
 Medical devices
 Typically have:
 Many of the problems of classic embedded systems
 The power of PCs/IT
 More user interface (UI) than classic embedded systems
 Fast and frequent updates
 However, mobile devices are “evolving” with more power, resources, apps,
etc.
 Mobile is the “hot” area of computers/software
 Testing rules and concepts are still evolving
 Now starting to include IoT
You know what they are, right?
Mobile and Handheld?
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
 Definitions
 Industry Error Trends Taxonomy
 Developer Attacks
 Basic Attacks for the Tester
 The Big “Scary” Security Attacks
 ISO 29119
 Summary
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
5
Agenda
 Test – the act of conducting experiments on something to
determine the quality and to provide information to stakeholders
 Many methods, techniques, approaches, levels, context
 Considerations: input, environment, output, instrumentation
 Quality(ies) – Value to someone (that they will pay for)
 Functions
 Non-functional
 It “works”
 Does no harm
 Are there (critical) bugs?
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
6
Basic Definitions
 From Wikipedia:
Taxonomy is the practice and science of classification. The word finds its
roots in the Greek τάξις, taxis (meaning 'order', 'arrangement') and νόμος,
nomos ('law' or 'science'). Taxonomy uses taxonomic units, known as taxa
(singular taxon). In addition, the word is also used as a count noun: a
taxonomy, or taxonomic scheme, is a particular classification ("the
taxonomy of ..."), arranged in a hierarchical structure.
 Helping to “understand and know”
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
7
Seeing the Eyes of the Enemy
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
8
Taxonomy (researched)
Super Category (values by domain: Aero-Space, Med sys, Mobile, General)
 Time: 3 2 3
 Interrupted - Saturation (over time): 5.5
 Time Boundary – failure resulting from incompatible system time formats or values: 0.5 1
 Time - Race Conditions: 3 1
 Time - Long run usages: 4 1 20
 Interrupt - timing or priority inversions: 0.7 3
 Date(s) wrong/cause problem: 0.5 1
 Clocks: 4 2
 Computation - Flow: 6 23 19
 Computation - on data: 4 1 3 1
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
9
Taxonomy part 2
Super Category (values by domain: Aero-Space, Med sys, Mobile, General)
 Data (wrong data loaded or used): 4 5.00 2
 Initialization: 6 2.00 3 5
 Pointers: 8 2.00 18 10
 Logic and/or control law ordering: 8 43 3 30
 Loop control – Recursion: 1
 Decision point (if test structure): 0.5 1 1
 Logically impossible & dead code: 0.7
 Operating system (lack of fault tolerance, interface to OS, other): 1.5 2 6
 Software - Hardware interfaces: 16 13
 Software - Software interface: 5 2.00 3
 Software - Bad command / problem on server: 3 5
 UI - User/operator interface: 4 5.00 20 10
 UI - Bad alarm: 0.5 3
 UI - Training – system fault resulting from improper training: 3
 Other: 10.6 9.00 5 5
Note: one report on C/C++ indicated 70% of errors found involved pointers
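To make the pointer category concrete, here is a minimal, hypothetical C fragment (the function name and values are invented for illustration) showing the kind of defect that row counts:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical illustration only: a classic C pointer defect of the kind
 * the taxonomy's "Pointers" row counts. malloc() is never checked, so on
 * an allocation failure (not rare on memory-constrained mobile/embedded
 * targets) strcpy() dereferences a NULL pointer and the program crashes. */
static char *copy_label(const char *label)
{
    char *buf = malloc(strlen(label) + 1);  /* may return NULL */
    strcpy(buf, label);                     /* NULL dereference if it did */
    return buf;                             /* caller must free() */
}

int main(void)
{
    char *p = copy_label("sensor-7");
    free(p);
    return 0;
}
```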
 A pattern (of testing) based on a common mode of failure
seen over and over
 Part of Exploratory Testing
 May be seen as a negative, when it really is a positive
 Goes after the “bugs” that may be in the software
 May include or use classic test techniques and test concepts
 Lee Copeland’s book on test design
 Many other good books
 A pattern (more than a process) that must be tailored to the context at hand to do the testing
 Testers learn mental attack patterns by working over the years in a specific domain
Attack-based Testing
What is an attack?
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
Attacks
(from Software Test Attacks to Break Mobile and Embedded Devices)
 Attack 1: Static Code Analysis
 Attack 2: Finding White–Box Data Computation Bugs
 Attack 3: White–Box Structural Logic Flow Coverage
 Attack 4: Finding Hardware–System Unhandled Uses in Software
 Attack 5: Hw-Sw and Sw-Hw signal Interface Bugs
 Attack 6: Long Duration Control Attack Runs
 Attack 7: Breaking Software Logic and/or Control Laws
 Attack 8: Forcing the Unusual Bug Cases
 Attack 9: Breaking Software with Hardware and System
Operations
 9.1 Sub–Attack: Breaking Battery Power
 Attack 10: Finding Bugs in Hardware–Software Communications
 Attack 11: Breaking Software Error Recovery
 Attack 12: Interface and Integration Testing
 12.1 Sub–Attack: Configuration Integration Evaluation
 Attack 13: Finding Problems in Software–System Fault Tolerance
 Attack 14: Breaking Digital Software Communications
 Attack 15: Finding Bugs in the Data
 Attack 16: Bugs in System–Software Computation
 Attack 17: Using Simulation and Stimulation to Drive Software
Attacks
 Attack 18: Bugs in Timing Interrupts and Priority Inversion
 Attack 19: Finding Time Related Bugs
 Attack 20: Time Related Scenarios, Stories and Tours
 Attack 21: Performance Testing Introduction
 Attack 22: Finding Supporting (User) Documentation
Problems
 Sub–Attack 22.1: Confirming Install–ability
 Attack 23: Finding Missing or Wrong Alarms
 Attack 24: Finding Bugs in Help Files
 Attack 25: Finding Bugs in Apps
 Attack 26: Testing Mobile and Embedded Games
 Attack 27: Attacking App–Cloud Dependencies
 Attack 28: Penetration Attack Test
 Attack 28.1 Penetration Sub–Attacks: Authentication —
Password Attack
 Attack 28.2 Sub–Attack Fuzz Test
 Attack 29: Information Theft—Stealing Device Data
 Attack 29.1 Sub–Attack: Identity Social Engineering
 Attack 30: Spoofing Attacks
 Attack 30.1 Location and/or User Profile Spoof Sub–Attack
 Attack 30.2 GPS Spoof Sub–Attack
 Attack 31: Attacking Viruses on the Run in Factories or PLCs
 Attack 32: Using Combinatorial Tests
 Attack 33: Attacking Functional Bugs
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
1: Developer Attacks for
Mobile and IoT
Three of many
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – Software
Test Attacks to Break Mobile and Embedded Devices
12
Attack 1: Static Code Analysis (testing)
 When to apply this attack?
 After/during coding
 What faults make this attack
successful?
 Many
 Example: Issues with pointers (a hedged sketch follows this slide)
 Who conducts this attack?
 Developer, tester, independent party
 Where is this attack conducted?
 Tool/test lab
 How to determine if the attack
exposes failures?
 Review warning messages and find
true bugs
 How to conduct this attack
 Obtain and run tool
 Find and eliminate false positives
 Identify and address real bugs
 Repeat as code evolves
 Single unit/object
 Class/Group
 Component
 Full system
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC –
Software Test Attacks to Break Mobile and Embedded Devices
13
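A hedged sketch of the "issues with pointers" example for this attack: the function below returns the address of a stack variable, a dangling-pointer defect that compilers and static analysis tools commonly report without running the code; reviewing the resulting warnings to separate true bugs from false positives is exactly the work the attack describes. The function name and values are hypothetical.

```c
#include <stdio.h>

/* Hypothetical unit under attack: formats a sensor reading but returns a
 * pointer to a local array that goes out of scope, leaving the caller with
 * a dangling pointer. Static analysis flags this without executing it. */
static const char *format_reading(int millivolts)
{
    char text[16];
    snprintf(text, sizeof text, "%d mV", millivolts);
    return text;   /* BUG: 'text' lives on this function's stack frame */
}

int main(void)
{
    const char *s = format_reading(3300);
    (void)s;       /* dereferencing s here would read dead stack memory */
    return 0;
}
```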
Attack 2: Finding White–Box Data
Computation Bugs
 When to apply this attack?
 After/during coding
 What faults make this attack
successful?
 Mistakes associated with data
 Example: Wrong value of Pi (a hedged test sketch follows this slide)
 Who conducts this attack?
 Developer, tester, independent party
 Where is this attack conducted?
 Development Tool/test lab
 How to determine if the attack
exposes failures?
 Structural-data test success criteria
not met
 How to conduct this attack
 Obtain tool
 Determine criteria and coverage
 Create test automation with
specific values (really a
programming problem)
 NOT NICE NUMBERS
 Run automated test cases
 Resolve failures
 Peer check test cases
 Repeat as code evolves
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC –
Software Test Attacks to Break Mobile and Embedded Devices
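A minimal test-automation sketch for the "wrong value of Pi" example, assuming a hypothetical disc_area() unit under test: the attack drives it with specific, deliberately not-nice input values and compares against an independent reference constant with a tight tolerance, which is what exposes a truncated constant.

```c
#include <assert.h>
#include <math.h>
#include <stddef.h>
#include <stdio.h>

#define PI 3.1415   /* BUG under attack: constant truncated too early */

/* Hypothetical unit under test. */
static double disc_area(double radius) { return PI * radius * radius; }

int main(void)
{
    /* Independent reference value and deliberately "not nice" radii. */
    const double PI_REF  = 3.14159265358979323846;
    const double radii[] = { 0.001, 2.5000001, 1234.56789 };

    for (size_t i = 0; i < sizeof radii / sizeof radii[0]; i++) {
        double expected = PI_REF * radii[i] * radii[i];
        double actual   = disc_area(radii[i]);
        double rel_err  = fabs(actual - expected) / expected;
        printf("r=%.7f relative error=%.2e\n", radii[i], rel_err);
        assert(rel_err < 1e-6);   /* fails: truncated PI gives ~3e-5 relative error */
    }
    return 0;
}
```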
2: Tester Basic Attacks
What is Missing, Usability, Alarms
Sampling of where to start Exploratory Testing
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – Software
Test Attacks to Break Mobile and Embedded Devices
15
Attack 4: Finding Hardware–
System Unhandled User Cases
 When to apply this attack?
 Starting at system-software analysis
 What faults make this attack
successful?
 Lack of understanding of the world
 Example: Car braking on ice
 Who conducts this attack?
 Developer, tester, analyst
 Where is this attack conducted?
 Environments, simulations, field
 How to determine if the attack exposes
failures?
 An unhandled condition exists (a hypothetical sketch follows this slide)
 Note: data explosion problem
 How to conduct this attack
 Knowledge
 Out-of-box thinking
 Operation Concepts
 Analysis
 Modeling
 Lab testing
 Field testing
 Feedback
 Repeat
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC –
Software Test Attacks to Break Mobile and Embedded Devices
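A loose, hypothetical sketch of "an unhandled condition exists" for the braking-on-ice example: the brake_mode() stub, the enum values, and the output are all invented to show the shape of the attack (enumerate the states the real world can present, from operational concepts and field data, then look for states that silently collapse into a default nobody analyzed); this is not real control logic.

```c
#include <stdio.h>

typedef enum { SURFACE_DRY, SURFACE_WET, SURFACE_ICE, SURFACE_GRAVEL } surface_t;

/* Hypothetical control stub: the design only considered dry and wet
 * roads, so ice and gravel silently fall through to full braking. */
static const char *brake_mode(surface_t s)
{
    switch (s) {
    case SURFACE_DRY: return "full";
    case SURFACE_WET: return "modulated";
    default:          return "full";   /* unanalyzed world states end up here */
    }
}

int main(void)
{
    /* The attack: enumerate the conditions the real world can present and
     * flag any that the software never explicitly handles. */
    const surface_t world[] = { SURFACE_DRY, SURFACE_WET, SURFACE_ICE, SURFACE_GRAVEL };
    for (int i = 0; i < 4; i++)
        printf("surface %d -> brake mode: %s\n", (int)world[i], brake_mode(world[i]));
    return 0;
}
```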
 When to apply this attack? …when your app/device has a user
 What faults make this attack successful? …devices are increasingly
complex
 Who conducts this attack? …see chart on Roles
 Where is this attack conducted? …throughout lifecycle and in user’s
environments
 How to determine if the attack exposes failures?
 Unhappy “users”
 Bugs found
 See sample checklist
Jean Ann Harrison Copyright 2013
Attack: Testing Usability
Mobile IoT Usability Tends to be “Poor”
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
 Refine checklist to context scope
 Define a role
 Watch what is happening with this role
 Define a usage (many different user roles)
 Guided explorations or ad hoc
 Stress, unusual cases, explore options
 Capture understanding, risk, observations, etc.
 Checklist (watch for confusion of the tester)
 Run Exploratory Attack(s)
 Learn
 Re-plan-design
 Watch for Bias
 Switch testers
 Repeat
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
Usability Attack Pattern
3: IoT and Mobile Security Attacks
And Now for Something Completely Different
Well, At Least A Very Scary (Not Silly) Walk
19
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
 Fraud – Identity
 Worms, virus, etc.
 Fault injection
 Processing on the run
 Hacks impact
 Power
 Memory
 CPU usage
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – Software
Test Attacks to Break Mobile and Embedded Devices
Mobile Security Concerns
• Eavesdropping – yes, everyone can hear you
• Hijacking
• Click-jacking
• Voice/Screen
• Physical Hacks
• File snooping
• Lost phone
 Mobile systems are highly integrated hardware–
software–system solutions which:
 Must be highly trustworthy since they handle sensitive data
 Often perform critical tasks
 Security holes and problems abound
 Coverity Scan 2010 Open Source Integrity Report - Android
 Static analysis test attack found 0.47 defects per 1,000 SLOC
 359 defects in total, 88 of which were considered “high risk” in the security domain (a quick size check follows this slide)
 Android OS hole demonstrated via an Angry Birds app
 Researchers Jon Oberheide and Zach Lanier
 Robots and Drones rumored to be attacked
 Cars and medical devices being hacked
 Stuxnet Virus and its family
The Current Security Situation
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
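A quick back-of-the-envelope check on the Coverity figures above: 359 defects at 0.47 defects per 1,000 SLOC implies roughly 359 / 0.47 ≈ 764 KSLOC (about three-quarters of a million lines) of Android code scanned, which puts the defect count in context.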
 Apply when the device is mobile and has
 Account numbers
 User-ids and passwords
 Location tags
 Restricted data
 Current authentication approaches in use on mobile
devices
 Server-based
 Registry (user/password)
 Location or device-based
 Profile-based
Security Attacks
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
 Attack 28: Penetration Attack Test
 Attack 28.1 Penetration Sub–Attacks: Authentication — Password
 Attack 28.2 Sub–Attack: Fuzz Test (a minimal sandbox sketch follows the warnings slide)
 Attack 29: Information Theft—Stealing Device Data
 Attack 29.1 Sub–Attack: Identity Social Engineering
 Attack 30: Spoofing Attacks
 Attack 30.1 Location and/or User Profile Spoof Sub–Attack
 Attack 30.2 GPS Spoof Sub–Attack
Security Attacks
(only a starting point checklist of things to do)
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
 Security attacks must be done with the knowledge and approval of
owners of the system and software
 Severe legal implications exist in this area
 Many of these attacks must be done in a lab (sandbox)
 In these attacks, I tell you conceptually how to “drive a car very fast (150 miles an hour)”; there are places where you can do that legally (a race track) and places where you will get a ticket (most public streets)
 Be forewarned – do not attack your favorite app on your phone, or any connected server, without the right permissions; the legal implications are real
Warnings when Conducting Security
Attacks
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
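For the fuzz sub-attack (Attack 28.2) referenced in the earlier list, here is a minimal sketch that respects the warnings above: the parse_message() function, its message format, and the seed are hypothetical, and everything runs in-process against your own code in a sandbox, never against a live app or server.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical unit under attack: parses a tiny length-prefixed message.
 * The missing bounds check is the kind of defect fuzzing shakes out. */
static int parse_message(const uint8_t *buf, size_t len)
{
    uint8_t out[16];
    if (len < 1)
        return -1;
    size_t payload = buf[0];            /* claimed payload length */
    memcpy(out, buf + 1, payload);      /* BUG: never checked against len or sizeof out */
    return (int)payload;
}

int main(void)
{
    srand(12345);                       /* fixed seed so any failure reproduces */
    uint8_t msg[32];

    for (int iter = 0; iter < 100000; iter++) {
        size_t len = (size_t)(rand() % (int)sizeof msg);
        for (size_t i = 0; i < len; i++)
            msg[i] = (uint8_t)(rand() & 0xFF);
        parse_message(msg, len);        /* run under ASan or valgrind to catch the overflow */
    }
    puts("fuzz run complete");
    return 0;
}
```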
Attacks
(from Software Test Attacks to Break Mobile and Embedded Devices)
 Attack 1: Static Code Analysis
 Attack 2: Finding White–Box Data Computation Bugs
 Attack 3: White–Box Structural Logic Flow Coverage
 Attack 4: Finding Hardware–System Unhandled Uses in Software
 Attack 5: Hw-Sw and Sw-Hw signal Interface Bugs
 Attack 6: Long Duration Control Attack Runs
 Attack 7: Breaking Software Logic and/or Control Laws
 Attack 8: Forcing the Unusual Bug Cases
 Attack 9: Breaking Software with Hardware and System
Operations
 9.1 Sub–Attack: Breaking Battery Power
 Attack 10: Finding Bugs in Hardware–Software Communications
 Attack 11: Breaking Software Error Recovery
 Attack 12: Interface and Integration Testing
 12.1 Sub–Attack: Configuration Integration Evaluation
 Attack 13: Finding Problems in Software–System Fault Tolerance
 Attack 14: Breaking Digital Software Communications
 Attack 15: Finding Bugs in the Data
 Attack 16: Bugs in System–Software Computation
 Attack 17: Using Simulation and Stimulation to Drive Software
Attacks
 Attack 18: Bugs in Timing Interrupts and Priority Inversion
 Attack 19: Finding Time Related Bugs
 Attack 20: Time Related Scenarios, Stories and Tours
 Attack 21: Performance Testing Introduction
 Attack 22: Finding Supporting (User) Documentation
Problems
 Sub–Attack 22.1: Confirming Install–ability
 Attack 23: Finding Missing or Wrong Alarms
 Attack 24: Finding Bugs in Help Files
 Attack 25: Finding Bugs in Apps
 Attack 26: Testing Mobile and Embedded Games
 Attack 27: Attacking App–Cloud Dependencies
 Attack 28: Penetration Attack Test
 Attack 28.1 Penetration Sub–Attacks: Authentication —
Password Attack
 Attack 28.2 Sub–Attack Fuzz Test
 Attack 29: Information Theft—Stealing Device Data
 Attack 29.1 Sub–Attack: Identity Social Engineering
 Attack 30: Spoofing Attacks
 Attack 30.1 Location and/or User Profile Spoof Sub–Attack
 Attack 30.2 GPS Spoof Sub–Attack
 Attack 31: Attacking Viruses on the Run in Factories or PLCs
 Attack 32: Using Combinatorial Tests
 Attack 33: Attacking Functional Bugs
Copyright 2015, Jon D. Hagar Mobile-Embedded Taxonomies from “Software Test Attacks to Break Mobile and Embedded Devices”
Bonus Round
ISO 29119
(and how it may impact mobile and testing)
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC –
Software Test Attacks to Break Mobile and Embedded Devices
26
Motivation for ISO 29119
 Conflicts in definitions, processes and procedures
 “One ring to rule them all” — standards to be replaced by one
 e.g., IEEE 829, IEEE 1008, BS7925-1/-2, IEEE 1028
 Users do not know which standard to follow
 Lacking in current standards or incomplete:
 Organizational areas
 e.g., Test Policy and Organizational Test Strategy
 Project Test Management
 BS7925 only covers unit testing
 General processes
 Common functional techniques missing
 Coverage of non-functional testing
ISO/IEC 29119 – Structure and Flow
 Part 1: Concepts & Vocabulary
 Part 2: Processes
 Part 3: Documentation
 Part 4: Testing Techniques
 Part 5: Keyword-Driven Testing
 Draws on / replaces: BS7925-1, BS7925-2, IEEE 829, IEEE 1008
 Related standards: ISO/IEC 33063 (process assessment), ISO 12207, ISO 15288, ISO/IEC Directives
Thanks to Stuart Reid
Part 1: Concepts & Vocabulary
SOFTWARE TESTING CONCEPTS
Scope, Conformance, Normative References
TESTING IN DIFFERENT LIFE CYCLE MODELS
ROLES AND RESPONSIBILITIES IN TESTING
ANNEXES – Metrics, Examples, Bibliography
DEFINITIONS
Test: Approach, Basis, Methods – Risk-Based Testing
Part 2: Testing Processes
TEST MANAGEMENT PROCESSES
ORGANIZATIONAL TEST PROCESS
DYNAMIC TEST PROCESSES
TEST MANAGEMENT PROCESSES
ORGANIZATIONAL TEST PROCESS
DYNAMIC TEST PROCESSES
Instantiating Testing Processes
Ref: S. Reid
Test Planning Processes (activity flow with main work products)
 Understand Context → Scope
 Organize Test Plan Development
 Identify & Estimate Risks → Analyzed Risks
 Identify Risk Treatment Approaches → Treatment Approaches
 Design Test Strategy → Test Strategy
 Determine Staffing and Scheduling → Schedule, Staffing Profile
 Document Test Plan → Draft Test Plan
 Gain Consensus on Test Plan → Approved Test Plan
 Publish Test Plan → Test Plan
Part 3 – Test Documentation
TEST DOCUMENTATION
ANNEXES - EXAMPLES
Scope, Conformance,
Normative References
Select a subset of docs
Part 3: Test Documentation
 Organizational test documentation
 Test policy
 Test strategy
 Project test documentation
 Project test plan
 Test project completion report
 Test Level documentation
 Test plan
 Test specification
 Test results
 Anomaly reports
 Level test status report
 Test environment report
 Test level completion report
 Appendices
 Examples of documents at each level of testing
Part 4 – Test Techniques
TEST COVERAGE MEASUREMENT
Scope, Conformance, Normative References
ANNEXE – TESTING OF QUALITY CHARACTERISTICS
ANNEXE – SELECTION OF TECHNIQUES
ANNEXE – TEST TECHNIQUE EFFECTIVENESS
TEST DESIGN TECHNIQUES – Functional, Structural
Part 5 – Keyword-Driven Testing
 Part 5 will address:
 Concept
 Applicability
 Interfaces
 Approach
 Part 5 WD was sent out in May and next draft due in Nov 2013
Impact of 29119 on Mobile/Smart
Maybe not much, EXCEPT
 Some domains are more regulated such as:
 safety-related
 telecoms
 financial – banks, stock markets, etc.
 Impact to business
 International contracting
 Assessment
 Legal
 Process improvement
 Large company to company (trading language)
Do Testers Need Standards? –
Yes, maybe, but
 Standards support common communication within the topic
 Common reference points
 Starting point for research, usage (pro & con), critique
 Maturity is an issue, but a baseline serves as a sounding board and common reference point for “scientific” methods
 An international benchmark
 Thinkers and researchers can prove/disprove benchmark(s)
 Part of being in a profession (but only part)
 These attacks are presented at a summary level only
 Much more detail and effort are needed
 Understanding your local context and error patterns
is important
(one size does NOT fit all)
 Attacks are patterns…you still must THINK and tailor
 ISO may help, but only in a few contexts
Wrap Up of this Session
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
 James Whittaker (attacks)
 Elisabeth Hendrickson (simulations)
 Lee Copeland (techniques)
 Brian Marick (testing)
 James Bach (exploratory and tours)
 Cem Kaner (test thinking)
 Jean Ann Harrison (her thinking and help)
 ISO 29119 standards and working group 26
 Many teachers
 Generations past and future
 Books, references, and so on
Notes: Thank You
(ideas used from)
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
 “Software Test Attacks to Break Mobile and Embedded Devices”
– Jon Hagar
 “How to Break Software” James Whittaker, 2003
 And his other “How To Break…” books
 “A Practitioner’s Guide to Software Test Design” Copeland, 2004
 “A Practitioner’s Handbook for Real-Time Analysis” Klein et al., 1993
 “Computer Related Risks”, Neumann, 1995
 “Safeware: System Safety and Computers”, Leveson, 1995
 Honorable mentions:
 “Systems Testing with an Attitude” Petschenik 2005
 “Software System Testing and Quality Assurance” Beizer, 1987
 “Testing Computer Software” Kaner et al., 1988
 “Systematic Software Testing” Craig & Jaskiel, 2001
 “Managing the Testing Process” Black, 2002
Book Notes List (my favorites)
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – Software Test Attacks to Break Mobile and Embedded Devices
• www.stickyminds.com – Collection of test info
• www.embedded.com – info on attacks
 www.sqaforums.com - Mobile Devices, Mobile Apps -
Embedded Systems Testing forum
• Association of Software Testing
– BBST Classes http://www.testingeducation.org/BBST/
• Your favorite search engine
• Our web sites and blogs (listed on front page)
More Resources
Copyright 2015, Jon D. Hagar Grand Software Testing, LLC – Software Test Attacks to Break Mobile and Embedded Devices