LCE13: Test and Validation Summit: Evolution of Testing in Linaro (II)

Resource: LCE13
Name: Test and Validation Summit: Evolution of Testing in Linaro (II)
Date: 09-07-2013
Speaker:
Video: http://youtu.be/59i6Zblr6cg



  1. Linaro Test and Validation Summit. Linaro Engineering Teams. LCE13 - Dublin, July 2013. How Do We Better Test our Engineering?
  2. PART 1: Linaro Platform Overview
     ● LAVA: Citius, Altius, Fortius ("Faster, Higher, Stronger" and easier to use)
     ● Builds & CI: build your code when you are not looking
     ● QA Services: cover all bases
     PART 2: Linaro Engineering
     ● Kernel Developers & Maintainers, Landing Teams, Linaro Groups (LEG/LNG)
     ● How do they validate/verify their output? What/how do they develop? Do they use CI? Manual or automated testing?
  3. Agenda (Tuesday, 9am-1pm):
     9:00  Introduction (Dev, Test, Loop) - Alan Bennett
     9:15  Overview of the CI loop (25 min) - Fathi Boudra
     9:40  QA services (20 min) - Milosz Wasilewski
     10:00 Recent LAVA updates (45 min) - Antonio Terceiro
     10:45 BREAK
     11:00 LNG - Mike Holmes
     11:30 Landing Teams - Scott Bambrough
     12:00 KWG PM - Kevin Hilman
     12:30 LEG - Grant Likely
     (Platform updates and technical details)
  4. Preface: Why is the Quality of Linaro Engineering so important?
  5. Preface: Why is the Quality of Linaro Engineering so important?
  6. Preface: Continuous Integration. Highlights of Continuous Integration:
     ● applying continuous quality control
     ● frequent integration of small pieces of software
     ● rapid feedback
     ● Extreme Programming (XP): minimize integration problems
     ● shared code repositories
     ● daily commits
     ● automated build systems
     ● extensive unit tests
     ● testing in cloned production environments
  7. CI Loop (diagram): Development feeds changes into the Source Control System, an automated build is made, the build is tested, and test reports/feedback flow back to Development.
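The loop in the diagram can be sketched as a driver that repeats build, test, and feedback for each change. The stage functions below are injected stand-ins for illustration only, not any real Linaro tooling (in practice Jenkins does the build and LAVA does the test).

```python
def ci_loop(changes, build, test, report):
    """One pass of the CI loop per change: build it, test it, feed results back.

    build/test/report are stand-ins for the real systems
    (e.g. Jenkins for build, LAVA for test).
    """
    feedback = []
    for change in changes:
        artifact = build(change)                  # automated build
        results = test(artifact)                  # automated testing
        feedback.append(report(change, results))  # test report / feedback
    return feedback

# Stub stages to show the flow:
fb = ci_loop(
    ["commit-1"],
    build=lambda c: f"{c}.img",
    test=lambda a: "pass",
    report=lambda c, r: (c, r),
)
```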
  8. Builds and Baselines. Fathi Boudra. Linaro Test and Validation Summit, LCE13 - Dublin, July 2013. How Do We Better Test our Engineering?
  9. Overview
     ● CI Present: anatomy of the CI loop
     ● CI Future: what is on the CI roadmap
  10. Anatomy of the CI loop
      ● Get the source
        ○ Source code is under SCM
          ■ Git (git.linaro.org)
          ■ Bazaar (bazaar.launchpad.net)
      ● Build the code
        ○ Use a build system
          ■ Jenkins (ci.linaro.org and android-build.linaro.org)
          ■ LAVA (yes, LAVA can be used!)
      ● Publish the build results
        ○ Build artifacts are available (snapshots.linaro.org)
  11. Anatomy of the CI loop (continued)
      ● Submit the results for testing
        ○ LAVA (validation.linaro.org)
      ● Get the test results
        ○ E-mail notifications with filters (validation.linaro.org/lava-server/dashboard/filters)
        ○ LAVA dashboard (validation.linaro.org/lava-server/dashboard)
  12. Build jobs in depth
      ● Different types of jobs
        ○ Kernel CI
        ○ Engineering builds
        ○ Components
      ● Build triggers
        ○ manual, periodic, URL trigger, post-commit
      ● Do the build
        ○ shell script(s)
          ■ can be maintained under SCM (linux-preempt-rt)
        ○ Groovy script(s)
      ● Publish
        ○ to snapshots.linaro.org
        ○ to package repositories (PPA, other)
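The trigger, build, publish flow above can be sketched as a minimal job driver. The step commands and publish destination here are hypothetical placeholders, not Linaro's actual job definitions.

```python
import subprocess

def run_build_job(steps, publish_dir):
    """Run each build step as a shell command; stop on the first failure.

    Returns (success, log) so a trigger (manual, periodic, post-commit)
    can decide whether the artifacts get published.
    """
    log = []
    for step in steps:
        result = subprocess.run(step, shell=True, capture_output=True, text=True)
        log.append((step, result.returncode))
        if result.returncode != 0:
            return False, log  # failed build: nothing is published
    # In a real job this would upload artifacts, e.g. to snapshots.linaro.org.
    log.append((f"publish to {publish_dir}", 0))
    return True, log

ok, log = run_build_job(["true", "echo artifact"], "snapshots.example.org")
```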
  13. Build jobs in detail
      ● Submit to LAVA
        ○ Generate a LAVA job file (JSON)
        ○ test definitions are pulled from SCM (git.linaro.org/gitweb?p=qa/test-definitions.git)
      ● Misc
        ○ Jenkins can run unit tests (e.g. the qemu-ltp job)
          ■ JUnit
          ■ xUnit
        ○ CI helpers
          ■ post-build-lava
          ■ post-build-ppa
          ■ Linaro CI build tools
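A generated job file of the kind described above might look like the following sketch. The field names follow the LAVA JSON job format of that era as best I recall it; the device type, image URL, test definition path, and result stream are made-up examples, not real job values.

```python
import json

# Sketch of generating a LAVA job file; all concrete values are placeholders.
job = {
    "job_name": "example-kernel-ci",
    "device_type": "panda",  # example device type
    "timeout": 18000,
    "actions": [
        {"command": "deploy_linaro_image",
         "parameters": {"image": "http://snapshots.linaro.org/example/image.img.gz"}},
        {"command": "lava_test_shell",
         "parameters": {"testdef_repos": [
             {"git-repo": "git://git.linaro.org/qa/test-definitions.git",
              "testdef": "ubuntu/smoke-tests-basic.yaml"}]}},  # example path
        {"command": "submit_results",
         "parameters": {"server": "http://validation.linaro.org/RPC2/",
                        "stream": "/anonymous/example/"}},
    ],
}
job_json = json.dumps(job, indent=2)  # this string would be submitted to LAVA
```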
  14. CI Future
      ● LAVA CI Runtime
        ○ LAVA as a build system
      ● LAVA Publishing API
        ○ LAVA ability to publish artifacts on a remote host
      ● Build time optimization
        ○ persistent slaves
        ○ mirrors and caching
      ● Better documentation
  15. Q&A: Any questions?
  16. QA Services. Milosz Wasilewski. Linaro Test and Validation Summit, LCE13 - Dublin, July 2013. How Do We Better Test our Engineering?
  17. QA Services. Tasks:
      ● manual testing
      ● dashboard monitoring
      ● reporting
      ● porting tests to LAVA
  18. Manual Testing. Current approach:
      ● test results are not very detailed
      ● no connection between the test case description and the result sheet
      ● results stored in a Google spreadsheet
      ● bug linking done manually (makes it hard to extract the list of 'known issues')
  19. Manual Testing. Future:
      ● store test cases somewhere better suited than the wiki
      ● preserve test case change history
      ● store manual test results alongside automatic ones (in LAVA)
      ● be able to link bugs from various tracking systems to failed cases (in LAVA)
      ● generate reports easily (known issues, fixed problems, etc.)
        ○ might be done using LAVA if there is an easy way to extract testing results (for example, a REST API)
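If results could be pulled as structured data (the extraction API the slide wishes for), report generation reduces to a small transformation. The payload shape below is invented purely for illustration; no such LAVA API existed at the time.

```python
# Hypothetical result payload, as it might come back from a results API.
results = [
    {"test_case": "boot", "result": "pass", "bug": None},
    {"test_case": "usb-hotplug", "result": "fail", "bug": "LP#123456"},
    {"test_case": "sd-mux", "result": "fail", "bug": None},
]

def known_issues(results):
    """Failed cases already linked to a bug: the 'known issues' report."""
    return [r for r in results if r["result"] == "fail" and r["bug"]]

def unlinked_failures(results):
    """Failed cases with no bug linked yet: these still need triage."""
    return [r for r in results if r["result"] == "fail" and not r["bug"]]
```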
  20. Dashboards
      ● Monitoring the dashboard
        ○ adding bugs
        ○ debugging failed runs
      ● Creating custom dashboards
        ○ dashboard from a filter
        ○ no need to edit Python code to create/edit a dashboard
        ○ private/public dashboards
        ○ dashboard e-mail notification (fits the filter-as-dashboard approach)
  21. Release workflow
      ● Use only binaries that were already automatically tested
      ● Don't repeat automated tests in the manual run (we have to be confident that automated results are reliable)
  22. LAVA: Faster, Higher, Stronger (and easier to use). Antonio Terceiro. Test and Validation Summit, LCE13 - Dublin, July 2013.
  23. Overview
      ● Improvements
      ● New testing capabilities
      ● Engineering progress overview
      ● What are we missing?
        ○ Open discussion
        ○ We want to hear from you
  24. Context (0): the size of LAVA today
      ● ~90 ARM devices
      ● ~300 ARM CPUs
      ● ~150 jobs submitted per day
      ● ~99% reliability
  25. Context (1)
      ● LAVA started as an in-house solution
      ● Open source since day 1
      ● Other organizations (incl. Linaro members) are interested in running their own LAVA lab
      We need to go from an in-house service to a solid product.
  26. Context (2)
      ● No bootloader testing
      ● Tests only involve single devices
      We need to provide features to support new demands in test and validation.
  27. Improvements
  28. Monitoring
      ● Queue size monitored with Munin
      ● Nagios monitoring all sorts of things (e.g. temperature on Calxeda Highbank nodes)
      ● Health check failures
  29. Packaging enhancements. Easing LAVA installation:
      ● Effort on proper upstream packaging so that packages for any (reasonable) OS can be easily made
      ● Work in progress on Debian and Fedora packaging
      $ apt-get install lava-server
      $ yum install lava-server
  30. Documentation overhaul. Easing LAVA learning:
      ● Documentation is
        ○ scattered
        ○ outdated
        ○ confusing
      A documentation overhaul is on the LAVA roadmap.
  31. LAVA test suite helper tool. Easing LAVA usage. At the moment a lava-test-shell job requires:
      ● 1 JSON job file
      ● 1 YAML test definition file
      ● plus the test code itself
      $ sudo apt-get install lava-tool
      $ lava script submit mytestscript.sh
      $ lava job list
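A helper like the one above would generate the YAML test definition from the submitted script. The sketch below emits a minimal definition; the field names follow the lava-test-shell "Lava-Test Test Definition 1.0" format as commonly documented, and the test name and script are placeholders.

```python
def make_test_definition(name, script):
    """Emit a minimal lava-test-shell YAML test definition as a string.

    Sketch only: a real helper would also fill in maintainer, supported
    OS/devices, and result-parsing patterns.
    """
    return (
        "metadata:\n"
        "  format: Lava-Test Test Definition 1.0\n"
        f"  name: {name}\n"
        "run:\n"
        "  steps:\n"
        f"    - ./{script}\n"
    )

yaml_text = make_test_definition("mytest", "mytestscript.sh")
```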
  32. Getting more out of LAVA data
      ● Improvements in test results visualization in the LAVA dashboard
  33. Developer-friendliness
      ● LAVA is too hard to develop
        ○ too many separate components (also a mess for bug/project management)
        ○ requires almost a full deployment for development
      ● Consolidated client components (3 to 1)
      ● Will consolidate server components (3+ to 1)
  34. New testing capabilities
  35. LMP
      ● LAVA Multi-purpose Probe
      ● 1 base design, 5 boards now
      ● USB serial connection(s) to the host
      ● management of other connections to/from devices under test
  36. LMP (2)
      ● prototype sets manufactured and under test
      ● Use cases: Ethernet hotplug, SATA hotplug, HDMI hotplug and EDID faking, USB OTG testing, USB mux (sort of), lsgpio, audio hotplug, SD-Mux for bootloader testing
  37. LMP (3) - how it works (e.g. SD-MUX) [diagram: DUT (SDC1), LMP, and Host, with USB serial and USB MSD connections]
  38. Multi-node testing (1)
      ● Schedule jobs across multiple target devices
        ○ client-server, peer-to-peer and other scenarios
      ● Combine multiple results into a single result
      ● LAVA will provide a generic interface; test writers can program any tests they need
        ○ (special hardware setups are possible but need to be handled case-by-case)
      Other sessions:
      ● LAVA multi-node testing on Thursday
      ● LNG multi-node use-cases on Friday
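"Combine multiple results into a single result" amounts to an aggregation rule over per-node outcomes. The rule below (the job passes only if every node passes) is one plausible choice for illustration, not LAVA's specified behaviour.

```python
def combine_results(node_results):
    """Aggregate per-node outcomes of one multi-node job into a single verdict.

    node_results: mapping of node name -> "pass" / "fail".
    Assumption: the combined job passes only if every node passed.
    """
    verdict = "pass" if all(r == "pass" for r in node_results.values()) else "fail"
    return {"nodes": dict(node_results), "combined": verdict}

# E.g. a client-server scenario where one client failed:
summary = combine_results({"server": "pass", "client-1": "pass", "client-2": "fail"})
```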
  39. Multi-node testing (2)
      ● Logistics challenge!
      ● We might end up needing 20 of every device type in the lab
      ● Need to manage the needed growth in the lab in a sensible way
  40. Other projects
      ● Lightweight interface for kernel developers
      ● Boot from test UEFI on all Versatile Express boards
      ● Support for new member boards
  41. Overview of Engineering Progress
  42. In Progress vs Planned
      In progress:
      ● LAVA LMP
      ● Multi-node testing
      ● Helper tool
      ● Test result visualization improvements
      ● Lightweight interface for kernel devs
      ● UEFI on Versatile Express
      ● Support for new member boards
      Planned (for soon):
      ● Server components consolidation
      ● QA improvements
      ● Documentation overhaul
  43. Open Discussion
  44. Seed Questions
      ● What is your experience getting started with LAVA?
      ● What would have made your experience easier?
      ● Any suggestions for the LAVA team? Let us know!
      ● Feedback about the image reports revamp?
