State of DevOps Report, 2018 - DORA
Insights into the CI Pipeline
• Risk/Focus Area Mapping
• Summary Report
• List
• Single Test Report
• Visual Validations
• Noise reduction through error/failure classification
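Noise reduction of this kind usually means bucketing failures by their error signature so that lab or scripting issues are not counted as application defects. A minimal, hypothetical Java sketch; the category names and keyword rules are illustrative assumptions, not any product's actual logic:

```java
import java.util.List;

// Hypothetical failure buckets used to keep lab/script noise out of the defect count.
enum FailureCategory { APP_DEFECT, TEST_SCRIPT_ISSUE, LAB_OR_DEVICE_ISSUE, ORCHESTRATION_ISSUE }

public class FailureClassifier {

    // Classify one failure message with simple keyword rules (illustrative only).
    static FailureCategory classify(String errorMessage) {
        String msg = errorMessage.toLowerCase();
        if (msg.contains("device not found") || msg.contains("session timeout")) {
            return FailureCategory.LAB_OR_DEVICE_ISSUE;
        }
        if (msg.contains("no such element") || msg.contains("stale element")) {
            return FailureCategory.TEST_SCRIPT_ISSUE;
        }
        if (msg.contains("build not available") || msg.contains("queue full")) {
            return FailureCategory.ORCHESTRATION_ISSUE;
        }
        return FailureCategory.APP_DEFECT; // everything else is triaged as a potential real defect
    }

    public static void main(String[] args) {
        List<String> failures = List.of(
                "NoSuchElementException: no such element: login button",
                "Device not found in lab",
                "Assertion failed: expected balance 100 but was 90");
        failures.forEach(f -> System.out.println(classify(f) + "  <-  " + f));
    }
}
```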
Mobile Testing Landscape
Criteria           | Appium           | Espresso         | XCUITests
Language           | Any              | Java             | Swift/Objective-C
By                 | Open source      | Google           | Apple
App supported      | APK and IPA      | APK              | IPA
Code required      | No               | Yes              | Yes
Test type          | Black box        | White box        | White box
Speed              | 8t               | t                | 2t
Setup              | Hard             | Easy             | Medium
CI                 | Medium           | Easy             | Hard
Flakiness of test  | Very             | Low              | Low
Object Locators    | XPath (external) | Id (from R file) | Id
Used by            | QA               | Android dev*     | iOS dev*
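To make the "Code required" and "Object Locators" rows concrete, here is what a white-box Espresso check looks like: it addresses views by their ids from the app's R file, from inside the app's own code base. The activity and view ids below are hypothetical; a black-box Appium version of the same step would locate the element externally, for example via an XPath on the resource-id.

```java
import static androidx.test.espresso.Espresso.onView;
import static androidx.test.espresso.action.ViewActions.click;
import static androidx.test.espresso.assertion.ViewAssertions.matches;
import static androidx.test.espresso.matcher.ViewMatchers.isDisplayed;
import static androidx.test.espresso.matcher.ViewMatchers.withId;

import androidx.test.ext.junit.rules.ActivityScenarioRule;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(AndroidJUnit4.class)
public class LoginScreenTest {

    // LoginActivity and R.id.login_button are hypothetical app classes/ids (white box: the test lives with the code).
    @Rule
    public ActivityScenarioRule<LoginActivity> activityRule =
            new ActivityScenarioRule<>(LoginActivity.class);

    @Test
    public void loginButtonIsVisibleAndClickable() {
        // Espresso locates the view by its id from the R file...
        onView(withId(R.id.login_button))
                .check(matches(isDisplayed()))
                .perform(click());
        // ...whereas black-box Appium would reach the same element externally, e.g.:
        // driver.findElement(By.xpath("//*[@resource-id='com.example.app:id/login_button']")).click();
    }
}
```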
Responsive Web Design (RWD/PWA) – Tools Stack
1. What's the test engineer's gut feeling 😊
2. Risk – calculated as probability to occur and impact to customers
3. Value – does the test provide new information and, if failed, how much time to fix?
4. Cost efficiency to develop – how long does it take to develop and how easy is it to script?
5. History of test – volume of historical failures in related areas and frequency of breaks
Source: Angie Jones
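These criteria lend themselves to a simple weighted score per automation candidate. A hypothetical sketch, assuming a 1–5 rating per factor and weights chosen purely for illustration (the gut-feeling criterion is left as a tiebreaker rather than a scored input):

```java
public class AutomationScore {

    // Each factor is rated 1 (low) to 5 (high) by the team; the weights are illustrative assumptions.
    static double score(int risk, int value, int costEfficiency, int historyOfBreaks) {
        return 0.35 * risk             // probability to occur x impact to customers
             + 0.30 * value            // new information provided; time to fix when it fails
             + 0.15 * costEfficiency   // how quick and easy the test is to script
             + 0.20 * historyOfBreaks; // volume and frequency of past failures in the area
    }

    public static void main(String[] args) {
        System.out.printf("Checkout flow : %.2f%n", score(5, 5, 3, 4)); // high risk/value -> automate first
        System.out.printf("About screen  : %.2f%n", score(1, 2, 5, 1)); // cheap to script but low value
    }
}
```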
• Pairing / Coaching
• Use the right object identification strategy
• Use the right test framework to work with
• Measure test efficiency within the CI
• Risk-based approach to test automation
• Continuous test data analysis and improvement
• How fast are testing activities moving, and what is slowing them down?
  • Test flakiness
  • Test duration
  • % of automated vs. manual tests
• Application quality measurements
  • # of escaped defects and in which areas
  • MTTD – mean time to detect a defect
  • Build quality
• Pipeline efficiency measurements
  • # of user stories implemented per iteration
  • Test automation as part of DoD across iterations
  • Broken builds, with categories
  • CI length trending
  • Lab availability and utilization
• Quality cost measurements
  • Operational costs, lab availability issues
  • Cost of hardware/software
  • Cost of defects by severity and stage
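As one example of turning these bullets into numbers, "test flakiness" can be measured as the share of tests that both passed and failed within the same build, with no code change in between. A minimal sketch with made-up run data and a deliberately simple definition:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class FlakinessRate {

    // A single execution of a test within one CI build (illustrative model).
    record TestRun(String testName, boolean passed) {}

    // A test counts as flaky if, within the same build, it has both a pass and a fail.
    static double flakinessRatePercent(List<TestRun> runsOfOneBuild) {
        Map<String, List<TestRun>> byName =
                runsOfOneBuild.stream().collect(Collectors.groupingBy(TestRun::testName));
        long flaky = byName.values().stream()
                .filter(runs -> runs.stream().anyMatch(TestRun::passed)
                        && runs.stream().anyMatch(r -> !r.passed()))
                .count();
        return byName.isEmpty() ? 0.0 : 100.0 * flaky / byName.size();
    }

    public static void main(String[] args) {
        List<TestRun> runs = List.of(
                new TestRun("login", true), new TestRun("login", false),    // flaky: pass and fail in one build
                new TestRun("search", true), new TestRun("search", true));  // stable
        System.out.printf("Flakiness rate: %.1f%%%n", flakinessRatePercent(runs));
    }
}
```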