Jon Hagar, author of "Software Test Attacks to Break Mobile and Embedded Devices," presents software testing concepts for mobile and embedded devices in this webinar hosted by XBOSoft.
2. XBOSoft info
• Founded in 2006
• Dedicated to software quality
• Software QA consulting
• Software testing services
• Offices in San Francisco, Beijing, Oslo and
Amsterdam
4. Housekeeping
• Everyone except the speakers is muted
• Questions via the GoToWebinar control on the right side of your
screen
• Questions can be asked throughout the webinar; we’ll try to fit
them in when appropriate
• General Q & A at the end of the webinar
• You will receive info on recording after the webinar
5. About Jon Hagar
Jon….
• has over 30 years in software testing/verification and
validation
• is the owner of Grand Software Testing, a company
specializing in software test consulting and training for
mobile and embedded systems.
• author of “Software Test Attacks to Break Mobile and
Embedded Devices”
6. Jon Hagar, Copyright 2013, Software Test Attacks to Break Mobile and Embedded Devices
SOFTWARE TESTING CONCEPTS
FOR MOBILE AND EMBEDDED
DEVICES
Jon Hagar
embedded@ecentral.com
jon.d.hagar@gmail.com
breakingembeddedsoftware.wordpress.com
7. AGENDA FOR TODAY
• Introductions
• Definitions
• Risk based concepts
• Exploratory approaches
• Attacking the scenario(s)
• Mobile and embedded tester skills
• Wrap up and references
8. MOBILE, SMART, AND HANDHELD
• As the names imply, these are devices—small, held in the hand, often
connected to communication networks, including
• Cell and smart phones – apps
• Tablets
• Medical devices
• Typically have:
• Many of the problems of classic “embedded” systems
• The power of PCs/IT
• More user interface (UI) than classic embedded systems
• Fast updates
• Are getting more power, memory, and features (software e.g., apps)
• The “hot” area of computers/software
• Testing rules are “evolving”
9. CLOSE COUSINS:
Embedded Software Systems . . .
• Interacts with unique hardware/systems to solve specialized problems in
the “real world”
• IT software runs with largely “generic” hardware
• Users are barely aware the device uses or has software
• Usually have significant hardware interface issues and concerns
• Initialization, noise, power up/down, timers, sensors, etc.
• Often are resource constrained
• RAM, ROM, stack, power, speed, time, etc.
• Typically have a restricted human/computer interface (HCI) or none at
all, though this is evolving rapidly
• Often no way (or only a risky way) to update and/or change the software
• Involves risks, hazards, safety, and/or some specialized domain knowledge
and logic/algorithms usually controlling hardware
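Sensor, noise, and initialization concerns like the ones above can often be probed in simulation before real hardware exists. A minimal Python sketch of that idea (the sensor model, seed, and tolerance are illustrative assumptions, not from the talk):

```python
import random

def read_sensor(true_value, noise_sd, rng):
    """Simulate a noisy analog sensor (hypothetical stand-in for hardware)."""
    return true_value + rng.gauss(0.0, noise_sd)

def filtered_reading(true_value, noise_sd, samples=64, seed=42):
    """Average repeated reads to suppress noise, as embedded code often must."""
    rng = random.Random(seed)  # seeded so the test run is repeatable
    readings = [read_sensor(true_value, noise_sd, rng) for _ in range(samples)]
    return sum(readings) / len(readings)

# Attack idea: drive the simulated sensor with heavy noise and check that
# the filtering still keeps the estimate within tolerance of the true value.
estimate = filtered_reading(true_value=5.0, noise_sd=0.5)
assert abs(estimate - 5.0) < 0.3
```

The same harness can then be pointed at the real sensor interface once hardware arrives, turning the simulation run into a baseline for comparison.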
10. WHAT DO THESE LOOK LIKE?
Examples
– Avionics systems: planes, cars, rockets, military,…..
– Telecom: switch, routers, phones, cell devices,….
– Transportation: traffic control, railroad, trucking, ….
– Industrial control: lighting, machines, HVAC, nuclear/power,…
– Medical: pacemaker, dispensers, …….
– Home and office systems: control, entertainment (TV box), …
– And the list goes on
• Now starting to include PDAs and other items that “blur” the lines
11. TYPES OF MOBILE-EMBEDDED APPS
• Native Applications
• Local to device
• Hybrid Applications
• Local to device but interacts with the internet
• Web Applications
• Not local to device; all interactions over the internet
(Credit: JeanAnn Harrison)
12. THE “WORLD” OF MOBILE-SMART/EMBEDDED SOFTWARE
• Inputs and outputs involve hardware, software, and humans
• Time dependent
• NOTE: most software has “time” (performance) issues but here things are
often “hard real time”
• In Mobile/Embedded real-time may be a requirement
[Diagram: the software, together with its hardware, receives stimulus inputs, both expected and unexpected, and produces response outputs, both wanted and unwanted.]
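Where "hard real time" is a requirement, a time attack can measure one stimulus-to-response cycle against its deadline. A hedged Python sketch (the 10 ms budget and the control-step body are invented for illustration; a real embedded test would time the actual control loop):

```python
import time

HARD_DEADLINE_S = 0.010  # hypothetical 10 ms hard real-time budget

def control_step():
    """Stand-in for one cycle of an embedded control loop."""
    total = 0
    for i in range(1000):
        total += i
    return total

# Time attack: measure a single stimulus -> response cycle and compare it
# to the hard deadline; a miss is a failure even if the output is correct.
start = time.perf_counter()
control_step()
elapsed = time.perf_counter() - start
assert elapsed < HARD_DEADLINE_S, f"deadline miss: {elapsed * 1e3:.2f} ms"
```

A single passing measurement proves little for hard real time; repeating the attack under load, interrupts, and worst-case data paths is what exposes the misses.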
13. MOBILE AND EMBEDDED SOFTWARE TESTING: THE SAME BUT DIFFERENT
• Same patterns in programming errors =>
same test patterns to be applied
• But there are Differences
• The development lifecycle
• Mobile and embedded error patterns
• “Embedded” unique sensor (input and output)
• Mobility
• Network connections
• Resources
14. EXAMPLE HIGH-LEVEL MOBILE-EMBEDDED LIFECYCLE DIFFERENCE
[Timeline diagram: system creation starts one long hardware build that runs in parallel with a series of software builds; hardware issues, hardware changes, new products, and feature requests each trigger another software build, on scales of months, days, or hours. Result: software is “late.”]
15. THE ERROR (BUG) PATTERN: SIMILAR AND DIFFERENT
• Handheld/Embedded software has
similar defects to other software
• Requirements & Design
• Logic & Math
• Control Flow
• Data
• Initialization & Mode changes
• Interfaces
• Security
• Gaming
• etc. . .
Mobile/Embedded additions
• Software and hardware
development cycles done in
parallel, where aspects of the
hardware may be unknown to the
software development effort
• Limited resources
• Control/use of unique hardware
(sensors)
• Hardware problems which are often
fixed with software late in the
project
• (a big one) Very tight real-time
(often in milli- or micro-second
ranges) or load performance issues
16. MORE SAMPLE MOBILE-EMBEDDED PRODUCT ERRORS TESTERS SHOULD CONSIDER
• Mobility -
• Environment and device susceptible to outside influence: noise, weather, heat, cold
• Network connection: Wi-Fi, data, changing signal, etc.
• Battery factors: heat, charge level, charge duration, device conflicts
• Output —outputs to devices and electronics that are susceptible to noise
influences, calibrations, changes to hardware, and time
• Inputs – Sensors, speed, timing, and interrupts all can impact
devices/software
• Complexity—the size of the system or some aspect of the system makes
missed bugs likely
• Device Resources -
• Screen size
• Memory
• CPU speed
17. THUS TESTERS SHOULD THINK: RISKS
• You cannot test everything
• Risk-based testing helps bound the test scope problem
• Testing is about providing information and understanding
• Test exploration gets you started with whatever you have (or
don’t have)
18. IN MOBILE AND EMBEDDED: EXPLORATORY TESTING
Yes, we must test requirements (necessary, but not sufficient),
so also include exploratory risk-based testing for:
• Rapid feedback
• Learning
• Upfront rapid learning
• Attacking
• Addressing risk
• Independent assessment
• Targeting a defect
• Prototyping
• Needed information
• Testing beyond the requirements
19. MUCH INFORMATION EXISTS ON EXPLORATORY RISK-BASED TESTING
• Exploratory Testing: based on the scientific method, where we
plan and design, execute, learn, and change the test effort with
limited “scripting”
• Check out works by:
• Kaner
• Bach
• Whittaker
• ISO 29119
• I like to recommend and do exploratory attack-based testing
20. WHY ATTACK?
• Attacking your software is, in part, the process of attempting to
demonstrate that a system (hardware, firmware, software, and operations)
does not meet its requirements and its functional and non-functional
objectives.
• Mobile/embedded software testing must include "the system"
(hardware, software, operations, users)
• Attack common modes of failure, especially where the application is
engaged with and visible to the user.
Attack testing draws on approaches including: tools, levels, patterns,
and techniques
21. SO, WHAT IS AN ATTACK?
• Based on a common mode of failure seen over and over
• May be seen as a negative, when it is really a positive
• Goes after the “bugs” that may be in the software
• Uses one or more classic test techniques and concepts
• A Pattern (more than a process) which must be modified for
the context at hand to do the testing
• Many testers learn these for a domain after years and form
a mental “pattern” model (most good testers attack)
22. KINDS OF ATTACKS
• Whittaker offers a good starting point for software
attacks in general that can be applied to embedded:
• User Interface Attacks
• Data and Computation
• File System Interface
• Software/OS Interface
• Whittaker’s “How to Break Software” lists 23 attacks
• “Software Test Attacks to Break Mobile and Embedded
Devices” lists 33 attacks and 8 sub attacks
23. MOBILE AND EMBEDDED ATTACK CLASSIFICATION
• Developer Attacks (unit/code testing)
• Control System Attacks
• Hardware-Software Attacks
• Mobile and Embedded Software Domain Attacks
• Time Attacks (Performance)
• Human User Interface Attacks
• Smart and/or Mobile Phone Functional App Attacks
• Mobile/Embedded Security Attacks
• Generic Attacks
• Functional, mind mapping, and combinatorial tests
24. A DETAILED LIST OF ATTACKS
• Attack 1: Static Code Analysis
• Attack 2: Finding White–Box Data Computation Bugs
• Attack 3: White–Box Structural Logic Flow Coverage
• Attack 4: Finding Hardware–System Unhandled Uses in Software
• Attack 5: Hw–Sw and Sw–Hw Signal Interface Bugs
• Attack 6: Long Duration Control Attack Runs
• Attack 7: Breaking Software Logic and/or Control Laws
• Attack 8: Forcing the Unusual Bug Cases
• Attack 9: Breaking Software with Hardware and System Operations
• 9.1 Sub–Attack: Breaking Battery Power
• Attack 10: Finding Bugs in Hardware–Software Communications
• Attack 11: Breaking Software Error Recovery
• Attack 12: Interface and Integration Testing
• 12.1 Sub–Attack: Configuration Integration Evaluation
• Attack 13: Finding Problems in Software–System Fault Tolerance
• Attack 14: Breaking Digital Software Communications
• Attack 15: Finding Bugs in the Data
• Attack 16: Bugs in System–Software Computation
• Attack 17: Using Simulation and Stimulation to Drive Software Attacks
• Attack 18: Bugs in Timing Interrupts and Priority Inversion
• Attack 19: Finding Time Related Bugs
• Attack 20: Time Related Scenarios, Stories and Tours
• Attack 21: Performance Testing Introduction
• Attack 22: Finding Supporting (User) Documentation Problems
• Sub–Attack 22.1: Confirming Install–ability
• Attack 23: Finding Missing or Wrong Alarms
• Attack 24: Finding Bugs in Help Files
• Attack 25: Finding Bugs in Apps
• Attack 26: Testing Mobile and Embedded Games
• Attack 27: Attacking App–Cloud Dependencies
• Attack 28: Penetration Attack Test
• Attack 28.1 Penetration Sub–Attack: Authentication – Password Attack
• Attack 28.2 Sub–Attack: Fuzz Test
• Attack 29: Information Theft – Stealing Device Data
• Attack 29.1 Sub–Attack: Identity Social Engineering
• Attack 30: Spoofing Attacks
• Attack 30.1 Location and/or User Profile Spoof Sub–Attack
• Attack 30.2 GPS Spoof Sub–Attack
• Attack 31: Attacking Viruses on the Run in Factories or PLCs
• Attack 32: Using Combinatorial Tests
• Attack 33: Attacking Functional Bugs
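Attack 32, using combinatorial tests, can be sketched as a greedy pairwise (2-way) generator: cover every pair of parameter values with far fewer runs than the full cross product. The configuration matrix below is hypothetical, and the greedy algorithm is a common textbook approach, not the book's specific tooling:

```python
from itertools import combinations, product

# Hypothetical mobile test-configuration matrix (names are illustrative).
PARAMS = {
    "os":      ["Android", "iOS"],
    "network": ["wifi", "4g", "offline"],
    "battery": ["full", "low"],
}

def pairs_of(config):
    """All parameter-value pairs exercised by one concrete configuration."""
    return {((a, config[a]), (b, config[b]))
            for a, b in combinations(sorted(config), 2)}

def pairwise_suite(params):
    """Greedy pairwise selection: repeatedly pick the configuration that
    covers the most still-uncovered value pairs until all pairs are hit."""
    names = list(params)
    candidates = [dict(zip(names, vals)) for vals in product(*params.values())]
    uncovered = set().union(*(pairs_of(c) for c in candidates))
    tests = []
    while uncovered:
        best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
        tests.append(best)
        uncovered -= pairs_of(best)
    return tests

suite = pairwise_suite(PARAMS)
# Every value pair is covered with fewer runs than all 2*3*2 = 12 combos.
assert len(suite) < 12
```

The payoff grows quickly: with many parameters the full cross product explodes, while the pairwise suite stays small, which is exactly why combinatorial attacks suit resource-limited mobile/embedded test labs.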
25. AN EXAMPLE DETAILED TAXONOMY: A2D AND D2A BUG POSSIBILITIES
Type | Situation | Impact | Notes
A2D | Representation information is lost because the measurement is not precise | Software computation is based on incorrect data | The number of bits used to store the converted analog data is not large enough, or the sampling rate is not correct.
A2D | Information is contaminated with noise | Software computation uses noise when it should not | The noise term may not be known, may not be accounted for, or may be misrepresented.
A2D | Information is calculated incorrectly | Computation has unknown error | Sources of error include: calibrations used on variables, variables lacking initialization, or calculations not done with enough accuracy (single- versus double-precision floating point).
D2A | Conversion loses "least significant bits" (LSB) that are, in fact, important because computer word sizes are too small | Output to the analog device is wrong | The number of bits carried from the digital world to the analog world does not have enough precision, so the analog data is incorrect.
D2A | Information does not account for noise of the real world | Software computation does not include a factor for noise | The analog values are not correct given the noise of the real world (output data may be lost in the noise).
D2A | Information is calculated incorrectly because of internal factors | Computation has unknown error | Sources of error include: calibrations used on variables, variables lacking initialization, or calculations not done with enough accuracy (single- versus double-precision floating point).
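The first row of the taxonomy, information lost because too few bits store the converted value, can be demonstrated numerically. A small sketch modeling an ideal n-bit ADC (the 0-5 V range and the bit widths are illustrative assumptions):

```python
def adc_quantize(analog, bits, v_min=0.0, v_max=5.0):
    """Model an ideal A2D conversion: an n-bit ADC represents only
    2**bits discrete levels across its input range."""
    levels = 2 ** bits
    step = (v_max - v_min) / levels
    code = min(int((analog - v_min) / step), levels - 1)  # clamp full scale
    return code, v_min + code * step  # digital code and reconstructed volts

# Too few bits loses information: the reconstructed value drifts from the
# true analog input, and downstream computation runs on incorrect data.
_, coarse = adc_quantize(3.14159, bits=4)   # 16 levels -> ~0.31 V steps
_, fine = adc_quantize(3.14159, bits=12)    # 4096 levels -> ~1.2 mV steps
assert abs(coarse - 3.14159) > abs(fine - 3.14159)
```

An attack based on this row feeds boundary and mid-step analog values through the real conversion chain and checks whether the software's tolerance assumptions survive the quantization error.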
26. FINALLY, IN ADDITION TO TESTER SKILLS: SKILLS FOR THE MOBILE/EMBEDDED TESTER
• Tester skills: planning, design, execution, reporting, exploration, techniques,
tools, etc.
• Exposure or knowledge about products from the domain in which you are
testing: aerospace, medical, automobile manufacturing, smart apps,
airplanes, factory systems, robotics, regulated environments, etc.
• Knowledge of hard sciences: math, physics, electronics, hardware,
engineering, etc. for logical thought processes
• Knowledge of soft sciences: psychology, philosophy, sociology, human
factors (human-machine interface), gaming, arts, etc. for creative & conceptual
thought processes
• Passion – follow your bliss
27. FOR THE SOFTWARE YOU TEST, ASK YOURSELF
• Do you know how it fails?
• Do you test for success or failure?
• Both?
• Where can you improve?
28. SUMMARY: THANK YOU (IDEAS USED FROM)
• James Whittaker (attacks)
• Elisabeth Hendrickson (simulations)
• Lee Copeland (techniques)
• Brian Merrick (testing)
• James Bach (exploratory & tours)
• Cem Kaner (test thinking)
• JeanAnn Harrison
• Many teachers
• Generations past and future
• Books, references, etc.
29. BOOK LIST (MY FAVORITES)
• “Software Test Attacks to Break Mobile and Embedded Devices” Jon Hagar, 2013
• “How to Break Software” James Whittaker, 2003
• And his other “How To Break…” books
• “Testing Embedded Software” Broeckman and Notenboom, 2003
• “A Practitioner’s Guide to Software Test Design” Copeland, 2004
• “A Practitioner’s Handbook for Real-Time Analysis” Klein et al., 1993
• “Computer Related Risks”, Neumann, 1995
• “Safeware: System Safety and Computers”, Leveson, 1995
• Honorable mentions:
• “Embedded System and Software Validation” Roychoudhury, 2009
• “Systems Testing with an Attitude” Petschenik 2005
• “Software System Testing and Quality Assurance” Beizer, 1987
• “Testing Computer Software” Kaner et al., 1988
• “Systematic Software Testing” Craig & Jaskiel, 2001
• “Managing the Testing Process” Black, 2002