3. Roles QA Plays
• Product Coverage
• Platform Coverage
• Stability and Performance
• Community
4. Roles QA Plays (cont.)
• Assist development with automation and tree maintenance
• Active bug triage
• Device flashing and maintenance
• Partner and user support
• Internal testing programs (Beta)
5. Product Coverage
• QA representative per functional team
• Building and running test cases on features based on user stories
• Daily coverage
• Bug analysis across branches (triage, qawanted, regression hunting)
• Automated and manual smoke tests on device
• Hitting acceptance criteria
6. Device Automation
• Setting up 30 flame devices for lab automation
• Executing various tests: Gaia functional, power measurement, B2G performance (fps and cold launch), and Marketplace
• Reporting to a Jenkins server short term; long term to TaskCluster by way of Treeherder
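A minimal sketch of how per-device test dispatch in a lab like this might look. It assumes `adb` is on the PATH; the `run-b2g-suite` runner command is hypothetical, a stand-in for the actual lab harness, not the real tooling named on this slide.

```python
import subprocess

SUITES = ["gaia-functional", "power", "b2g-perf", "marketplace"]

def parse_devices(adb_output):
    """Parse `adb devices` output into serial numbers.

    Skips the header line and keeps only entries in the 'device' state.
    """
    lines = adb_output.strip().splitlines()[1:]
    return [line.split()[0] for line in lines if line.strip().endswith("device")]

def run_all(dry_run=True):
    """Dispatch every suite to every attached device.

    `run-b2g-suite` is a hypothetical runner command; with dry_run=True
    this only prints the plan instead of executing it.
    """
    out = subprocess.run(["adb", "devices"], capture_output=True,
                         text=True, check=True).stdout
    for serial in parse_devices(out):
        for suite in SUITES:
            cmd = ["run-b2g-suite", "--serial", serial, suite]
            if dry_run:
                print(" ".join(cmd))
            else:
                subprocess.run(cmd, check=False)
```

Keeping the parsing separate from the dispatch makes the device-listing logic testable without hardware attached.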
7. User story planning
• Acceptance defined
• UX defined
• Test case creation
• Regression tests cleanup
• Partial feature test run
• Regression TR1
• Bug bash
• Feature review
• FL acceptance
• Regression TR2
• L10n TR1
• MTBF / features exploratory
• Regression TR3
• Bug bash
• FC acceptance
• Daily smoke tests / ongoing bug work
• L10n TR2
• Partner CS / IOT support
• Internal certifications
• CC acceptance
9. Testrun Metrics (cont.)
• 1.3 Exploratory
• # of blockers in exploratory run [search whiteboard = dogfood1.3]
◦ 41 bugs (http://goo.gl/3dNUfl)
• # of man hours on exploratory
◦ 2 weeks of 1/27, ~11 testers
◦ 40 hour week
• Man hours per blocker = (# of testers * # of hours) / total # of blockers
◦ (11 testers * 80 hours) / 41 blockers ≈ 21.5 man hours per blocker
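The effort-per-blocker arithmetic above can be checked in a few lines, using the figures from the slide (11 testers, two 40-hour weeks, 41 blockers):

```python
def man_hours_per_blocker(testers, hours_each, blockers):
    """Total tester effort divided by the number of blocker bugs found."""
    return testers * hours_each / blockers

# 11 testers, 80 hours each (two 40-hour weeks), 41 blockers found
rate = man_hours_per_blocker(11, 80, 41)
print(round(rate, 1))  # roughly 21.5 man hours per blocker
```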
10. Acceptance Criteria
• Product coverage criteria listed for FL, FC, and CC milestones
• Platform and automation milestones are ongoing
• Tracked in: https://wiki.mozilla.org/Release_Management/FirefoxOS/Release_Milestones
11. Platform Coverage
• Working with different platform teams on a backlog of test coverage (e.g. WebRTC, graphics)
• Building out existing Mochitest coverage for Gfx / JS / DOM / Layout / Web API
• Building test apps on device for Product QA to execute
12. Stability (MTBF)
• Q2 goal: 72 hours uptime per device, no crashes / no hangs
• Running Hamachi (1.4) and Flame (2.0) [10 Hamachis in parallel]
• Latest results (April 29th, 2014):
◦ Buri: 12 hours uptime per device
◦ Flame: 50 hours uptime per device
• Executed once a week, starting after FL
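MTBF itself is just total observed uptime divided by the number of failures (crashes or hangs). A minimal sketch; the device count, run length, and failure total below are hypothetical, chosen only to illustrate the calculation:

```python
def mtbf(total_uptime_hours, failures):
    """Mean time between failures: observed uptime / failure count."""
    if failures == 0:
        return float("inf")  # no failures observed in the window
    return total_uptime_hours / failures

# Hypothetical: 10 devices run 120 hours each, 100 crashes/hangs total
print(mtbf(10 * 120, 100))  # 12.0 hours uptime per failure
```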
13. Perf and Security
• Perf metrics: establish an in-house baseline, analyze results, and build more automated tests
• Working draft: https://wiki.mozilla.org/FirefoxOS/Performance/Release_Acceptance
• Results through the Datazilla and Eideticker tools
• Security testing? Seccomp builds? B2G fuzzing? Working on these in the future
14. Community
• With Flame, working with Foxtrot and other teams on focus areas
• Posting entry projects for device and automation tasks on public pages (oneanddone, badges)
• Bug bashes, local meetups
• Mentoring / code review process
• More transparency with tests, reports, bugs, and testing opportunities
15. Find us!
• In your meetings, workweeks, offices, bugs, right behind you
• https://wiki.mozilla.org/B2G/QA
• IRC: #fxosqa