LCE13: Test and Validation Summit: The future of testing at Linaro

Resource: LCE13
Name: Test and Validation Summit: The future of testing at Linaro
Date: 09-07-2013
Speaker:
Transcript of "LCE13: Test and Validation Summit: The future of testing at Linaro"

  1. Linaro Test and Validation Summit
     Linaro Engineering Teams
     LCE13 - Dublin, July 2013
     How Do We Better Test our Engineering (2nd Half)
  2. PART 1: Linaro Platform
     ● Overview
     ● LAVA: Citius, Altius, Fortius ("Faster, Higher, Stronger" & easier to use)
     ● Builds & CI: Build Your Code When You are Not Looking
     ● QA Services: Cover all bases
     PART 2: Linaro Engineering
     ● Kernel Developers & Maintainers
     ● Landing Teams
     ● Linaro Groups: LEG/LNG
     * How do they validate/verify their output?
     * What/how do they develop? (Do they use CI? Manual or automated testing?)
  3. Linaro Test and Validation Summit
     Mike Holmes
     LCE13 - Dublin, July 2013
     LNG Engineering
  4. LNG outputs
     ● Create: two Linux kernels (with and without RT) and a Yocto filesystem.
     ● Benchmark: RT + KVM + HugePage + Dataplane APIs. Required to test kernel
       and userspace performance; some tests may be run in both spaces.
     ● Platforms: Arndale, AM335x Starter Kit? (LSI & TI boards in future?)
       QEMU - Versatile Express?
  5. LNG outputs are verified by
     ● Our code is validated using CI, and performance trends are monitored.
     ● Our output is verified on one general-purpose ARM platform and against
       two SoC vendor platforms, via a configurable switch to allow for
       dedicated links between nodes under test.
     ● Using open source software: one realistic network application, a
       general-purpose benchmark and five feature-specific test suites.
  6. LNG uses these tools
     Automated testing is done using custom scripts run via Jenkins & LAVA
     (one such wrapper is sketched below), executing:
     ○ (RT) LTP (Real Time Test Tree)
     ○ (RT) Cyclictest
     ○ (RT) Hackbench
     ○ (KVM) virt-test
     ○ (Hugepage) sysbench OLTP
     ○ (KVM, Hugepage, RT) openvswitch (kernel and userspace)
     ○ (KVM, Hugepage, RT) netperf
     ○ Traffic test cases via pcap files and tcpreplay
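A minimal sketch of what one of those custom wrapper scripts could look like, here for Cyclictest; the flags, the latency threshold, and the pass/fail output line are illustrative assumptions, not the actual LNG scripts:

```python
#!/usr/bin/env python
# Hypothetical wrapper: run cyclictest and emit one result line that a
# lava-test-shell parse pattern could pick up. Threshold is an assumption.
import re
import subprocess
import sys

MAX_LATENCY_US = 100  # assumed acceptance threshold, tune per platform

def run_cyclictest():
    # -q: quiet until done, -m: lock memory, -p: RT priority, -l: loop count
    out = subprocess.check_output(
        ["cyclictest", "-q", "-m", "-p", "99", "-l", "100000"],
        universal_newlines=True)
    # Summary lines look like:
    # "T: 0 ( 1234) P:99 I:1000 C: 100000 Min: 2 Act: 5 Avg: 4 Max: 57"
    return max(int(m) for m in re.findall(r"Max:\s*(\d+)", out))

if __name__ == "__main__":
    worst = run_cyclictest()
    result = "pass" if worst <= MAX_LATENCY_US else "fail"
    # One result line per test case, in a form a parse pattern can match.
    print("cyclictest-max-latency: %d us: %s" % (worst, result))
    sys.exit(0 if result == "pass" else 1)
```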
  7. LNG Kernel branches / configuration
     ● We test against three branches:
     ○ linux-lng-tip (development)
     ○ linux-lng-lsk (bug fixes to stable)
     ○ linux-lng-lsk-RT (bug fixes to stable RT variant)
     ● LNG-specific CFG fragments:
     ○ KVM (or will this be in an LSK kernel by default?)
     ○ PREEMPT_RT
     ○ NO_HZ_FULL (or will this be in an LSK kernel by default?)
     ○ HUGEPAGE (is that a CFG option?)
  8. LNG unique challenges
     ● Some of the SoC vendors' hardware has up to 16 x 10Gb links;
       generating this much traffic is non-trivial.
     ● Test equipment such as IXIA traffic generators is expensive.
     ● Test equipment needs to be switched remotely between the different
       hardware under test in an automated way.
     ● Scheduling test runs that take days and require specific equipment
       to be dedicated to the task.
  9. LNG unique challenges
     ● Multiple nodes may be needed to test traffic interoperability.
     ● It is not feasible to replicate the test environment at every
       developer's desk.
     ● The applied RT patch, even when disabled, alters the execution paths.
     ● Some tests run for 24 hours or more.
 10. LNG Q&A
     Questions
     ○ LAVA is(isn't) working for us
     ■ Interactive shells in the LAVA environment would speed debugging,
       given that testing can only be performed with the test equipment
       in the lab.
     ■ Multinode testing, with the reservation and configuration of
       network switches, is required.
     ■ Long-term trends in performance data need to be analysed and
       compared for regression analysis, triggering alerts for deviations
       (sketched below).
     ○ Further thoughts on Friday
     ○ https://lce-13.zerista.com/event/member/79674
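The trend-alert request above could look something like this minimal sketch: flag a run whose result deviates from the recent rolling mean by more than a few standard deviations. The window size, the 3-sigma threshold, and the data layout are assumptions for illustration:

```python
# Hypothetical trend checker: alert when the latest benchmark result
# deviates from recent history by more than SIGMA standard deviations.
from statistics import mean, stdev

WINDOW = 30   # assumed number of historical runs to compare against
SIGMA = 3.0   # assumed alert threshold

def check_trend(history, latest):
    """history: list of past metric values (e.g. netperf throughput);
    latest: the new run's value. Returns an alert string or None."""
    window = history[-WINDOW:]
    if len(window) < 5:
        return None  # not enough data to judge a deviation
    mu, sd = mean(window), stdev(window)
    if sd == 0:
        return None
    deviation = abs(latest - mu) / sd
    if deviation > SIGMA:
        return ("ALERT: latest=%.2f vs mean=%.2f (%.1f sigma over %d runs)"
                % (latest, mu, deviation, len(window)))
    return None

# Example: throughput suddenly drops well below the recent trend.
past = [940.0, 945.0, 938.0, 942.0, 939.0, 941.0, 944.0, 940.0]
print(check_trend(past, 610.0))
```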
 11. Linaro Test and Validation Summit
     Scott Bambrough
     LCE13 - Dublin, July 2013
     Landing Team Engineering
 12. LT Outputs
     ● Bootloaders
     ● Linux kernels based on mainline or current RCs
     ● Linux kernels based on LSK (expected)
     ● Ubuntu member builds
     ● Android member builds
     ● ALIP member build
     Some outputs are public, others confidential.
 13. Verification of LT outputs
     ● Kernel code is validated using CI in the Linaro LAVA Lab, on various
       member hardware devices and ARM fast models.
     ● Our kernel code is also validated in member LAVA labs on both current
       and next-gen hardware.
     ● Our builds at present are sanity tested by the LTs, but most testing
       is done by piggybacking on QA or automated testing set up by the
       platform team.
 14. LT and kernel tests
     ● Currently run only basic compile/boot test + default CI tests
       (LTP, powermgmt).
     ● This needs to change; we want/need to do more.
     ● We need more SoC-level tests; having LTs aware of how to produce
       tests to run in LAVA will become more important.
 15. LT & Member Services Needs
     1. Much better LAVA documentation
     2. Document the tests themselves
     3. Infrastructure for testing
     4. Infrastructure for better analysis of results
 16. Better Documentation
     ● Deployment Guide
     ○ what are the hardware requirements for a lab
     ○ what are the infrastructure requirements for a lab
     ○ hardware setup, software installation instructions
     ● Administrator's Guide
     ○ basically how Dave Piggot does his job
     ○ after initial setup, day-to-day ops and maintenance
 17. Better Documentation
     ● Test Developer's Guide
     ○ how to integrate tests to be run in lava-test-shell (lava glue)
     ○ recommendations on how best to write tests for lava-test-shell
     ● User's Guide for lava-test-shell
     ○ for developers to use lava-test-shell
     ○ section devoted to using lava-test-shell in the workflow of a
       kernel developer?
 18. Document the tests
     ● Impossible to answer the question: what tests are available in LAVA?
     ● http://lava-test.readthedocs.org/en/latest/index.html
     ○ not sufficient, not up to date
     ○ the problem isn't the LAVA team; Linaro needs an acceptance policy
       on what a test must provide before being used in LAVA
     ● would like to see metadata in test documentation that can be used in
       test reports
     ○ in a format that can be used in report generation
 19. Infrastructure for Testing
     ● Buddy systems
     ○ TI LT developed tests that require access to reference material for
       comparison
     ■ video frame captures
     ■ audio files
     ○ TI LT audio/video tests required an external box to capture
       HDMI/audio output
     ○ Need to do more of this type of automated testing to verify that
       lower-level functions work correctly at BSP level
     ○ GStreamer insanity test suite requires access to multimedia content
 20. Infrastructure for Analysis
     ● Web dashboard won't cut it
     ● need to separate analysis from display
     ○ rather do an analysis, then decide how to display it
     ● why infrastructure?
     ○ think there should be a level of reuse for components used to do
       analysis
     ○ think these should be separate from LAVA
     ○ think of this as more of a data mining operation
 21. Infrastructure for Analysis
     example:
     ● generate test report as PDF (sketched below)
     ○ perform tests, generate a report
     ○ include metadata regarding tests
     ■ metadata from test documentation?
     example:
     ● test report comparing:
     ○ current member BSP kernel
     ○ current LT kernel based on mainline
     ● evidence of quality/stability of LT/mainline kernel
     ● could be used to convince product teams
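One possible shape for the PDF-report example, sketched with the third-party reportlab package; the result tuples and metadata strings are invented for illustration:

```python
# Hypothetical report generator: results + per-test metadata -> PDF.
# Requires the third-party reportlab package (pip install reportlab).
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas

results = [  # invented example data: (test id, result, metadata)
    ("ltp-syscalls", "pass", "LTP syscall regression suite"),
    ("cyclictest", "fail", "max wakeup latency under load"),
]

def write_report(path, results):
    c = canvas.Canvas(path, pagesize=A4)
    width, height = A4
    y = height - 72  # start one inch from the top of the page
    c.setFont("Helvetica-Bold", 14)
    c.drawString(72, y, "Kernel test report")
    c.setFont("Helvetica", 10)
    for test_id, result, meta in results:
        y -= 18
        # Metadata pulled from the test documentation would land here.
        c.drawString(72, y, "%-20s %-6s %s" % (test_id, result, meta))
    c.save()

write_report("test-report.pdf", results)
```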
 22. Infrastructure for Analysis
     example:
     ● regression analysis of kernel changes (sketched below)
     ○ perform tests one day, make changes, test the next
     ○ did any test results change?
     ■ yes: send report of changes via email
     example:
     ● generate test report as PDF
     ○ perform tests, generate a report
     ○ include metadata regarding tests
     ■ metadata from test documentation?
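A sketch of the regression-analysis example above: diff two days' results and mail a report only when something changed. The result layout, the addresses, and the local SMTP relay are assumptions:

```python
# Hypothetical regression check: compare yesterday's results with today's
# and email a report if anything changed. Addresses/SMTP host are invented.
import smtplib
from email.message import EmailMessage

def diff_results(old, new):
    """old/new: dicts mapping test id -> 'pass'/'fail'."""
    changes = []
    for test in sorted(set(old) | set(new)):
        before, after = old.get(test, "missing"), new.get(test, "missing")
        if before != after:
            changes.append("%s: %s -> %s" % (test, before, after))
    return changes

yesterday = {"ltp-syscalls": "pass", "hackbench": "pass", "netperf": "pass"}
today = {"ltp-syscalls": "pass", "hackbench": "fail", "netperf": "pass"}

changes = diff_results(yesterday, today)
if changes:
    msg = EmailMessage()
    msg["Subject"] = "Kernel test regressions: %d change(s)" % len(changes)
    msg["From"] = "ci@example.org"
    msg["To"] = "kernel-team@example.org"
    msg.set_content("\n".join(changes))
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)
```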
 23. Infrastructure for Analysis
     example:
     ● test report comparing:
     ○ current member BSP kernel
     ○ current LT kernel based on mainline
     ● evidence of quality/stability of LT/mainline kernel
     ● could be used to convince product teams
 24. Linaro Test and Validation Summit
     Kevin Hilman
     LCE13 - Dublin, July 2013
     Kernel Developer/Maintainer
 25. Current workflow: development
     Most kernel development is done with little or no automation
     ● build: local, custom build scripts
     ● boot: manual boot testing on local hardware
     ● debug: custom unit-test scripts, manual verification of results
     ● publish: to public mailing lists
     ● merged: into maintainer trees, linux-next
     ● test: manual test of maintainer trees, linux-next
     ○ but many (most?) developers don't do this
 26. Current workflow: validation
     ● Code review on mailing list
     ● build/boot testing by maintainers
     ● build testing in linux-next (manual)
     ○ several developers do manual build tests of their pet platforms in
       linux-next and report failures
     ● Intel's 0-day tester (automated, but closed)
     ○ regular, automatic build tests
     ○ multi-arch build tests
     ○ boot tests (x86)
     ○ automatic git bisect for failures
     ○ very fast results
     ○ detailed email reports
     ○ extremely useful
 27. Current workflow: "good enough"
     This model is "good enough" for most developers and maintainers, so...
     Why should we use Jenkins/LAVA?
     Linaro test/validation will have to be
     ● at least as easy to use (locally and remotely)
     ● output/results more useful
     ● faster
     ○ build time
     ○ diagnostic time
 28. Potential usage models
     ● Local testing: aid in the build, boot, test cycle
     ○ local LAVA install, using local boards
     ○ reduce duplication of custom scripts/setup
     ○ encourage writing LAVA-ready tests
     ○ easy to switch between local and remote LAVA lab
     ● Remote CI: broader coverage
     ○ "I'm about ready to push this, I wonder if I broke any other
       platforms..."
     ○ automatic, fast(ish) response
 29. Local testing: LAVA
     ● Has to be easy to install
     ○ packaged (deb, rpm)
     ○ or git repo for development (bzr is ......)
     ● Has to fit into existing developer workflow
     ○ LAVA does not exclusively own hardware
     ○ developers have non-Linaro platforms
     ○ command-line driven
     ○ must co-exist with existing interactive use of boards
     ■ existing Apache setup
     ■ existing TFTP setup
     ■ existing, customized bootloaders
     ■ ...
 30. Remote CI
     ● Broad testing
     ● multi-arch (not just ARM)
     ● ARM: all defconfigs (not just Linaro boards)
     ○ also: allnoconfig, allmodconfig, randconfig, ...
     ● Continuous builds
     ○ Linus' tree, linux-next, arm-soc/for-next, ...
     ○ developers can submit their own branches
     ● On-demand builds
     ○ register a tree/branch
     ○ push triggers a build (see the sketch below)
     ● fast, automatic reporting of failures
     ○ without manual monitoring/clicking through Jenkins
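The "push triggers a build" idea could be wired up through Jenkins' remote build-trigger endpoint, for example from a git post-receive hook; the server URL, job name, and token below are placeholders:

```python
# Hypothetical post-receive hook body: ask Jenkins to build the branch
# that was just pushed. Server URL, job name and token are placeholders.
import sys
import requests  # third-party: pip install requests

JENKINS = "https://jenkins.example.org"
JOB = "kernel-ci"              # hypothetical parameterized Jenkins job
TOKEN = "secret-trigger-token" # the job's remote-trigger token

def trigger_build(git_url, branch):
    # Jenkins remote trigger endpoint for parameterized jobs.
    resp = requests.post(
        "%s/job/%s/buildWithParameters" % (JENKINS, JOB),
        params={"token": TOKEN, "GIT_URL": git_url, "BRANCH": branch},
        timeout=30)
    resp.raise_for_status()
    print("queued build of %s@%s" % (git_url, branch))

if __name__ == "__main__":
    trigger_build("git://git.example.org/linux.git", sys.argv[1])
```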
 31. Useful output: build testing
     Tracking build breakage in upstream trees
     ● when did the build start breaking
     ● what are the exact build error messages (without a Jenkins click fest)
     ● which commit (probably) broke the build
     ○ automated bisect (sketched below)
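The automated-bisect bullet maps naturally onto "git bisect run", which repeatedly invokes a script and interprets its exit status (0 = good, 1 = bad, 125 = skip this commit). A minimal sketch of such a build-test script, with the defconfig and job count as assumptions:

```python
#!/usr/bin/env python
# Hypothetical build-test script for `git bisect run`: exit 0 if the
# current commit builds, 1 if the build fails, 125 to skip commits that
# cannot even be configured.
# Usage: git bisect start <bad> <good> && git bisect run ./bisect-build.py
import subprocess
import sys

DEFCONFIG = "omap2plus_defconfig"  # assumed config under test
JOBS = "8"                         # assumed parallelism

def sh(*cmd):
    # Run a command in the kernel tree, returning its exit code.
    return subprocess.call(list(cmd))

if sh("make", DEFCONFIG) != 0:
    sys.exit(125)  # config step broken: skip this commit entirely
if sh("make", "-j" + JOBS) != 0:
    sys.exit(1)    # build broke at (or before) this commit: mark it bad
sys.exit(0)        # clean build: mark commit good
```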
 32. Issues: Big picture
     Where is the line between Jenkins and LAVA?
     ● Jenkins == build, LAVA == test?
     ● when a LAVA test fails, how do I know...
     ○ was this a new/updated test?
     ○ was this a new/updated kernel?
     ○ if so, can I get to the Jenkins build? In less than 10 clicks?
  33. 33. ● "Master image" is not useful ○ LAVA assumes are powered on and running master image (or will reboot into master image) ○ assumptions about SD card existence, partitioning... ○ assumptions about shell prompts linaro-test [rc=0] # ○ etc. etc. ● Goal: LAVA directly controls bootloader ○ netboot: get kernel + DTB + initrd via TFTP ○ extension via board-specific bootloader scripting Tyler's new "bootloader" device support in LAVA appears to have mostly solved this !! Issues: LAVA design
 34. Issues: LAVA usability
     ● Terminology learning curve
     ○ dispatcher, scheduler, dashboard
     ○ device, device-type
     ○ What is a bundle?
     ○ WTF is a bundle stream?
     ○ Documentation... not helpful (enough said)
     ● Navigation
     ○ click intensive
     ○ how to get from a log to the test results? or...
     ○ from a test back to the boot log?
     ○ what about the build log (Jenkins?)
     ○ can I navigate from a Jenkins log to the LAVA test?
 35. Issues: Jenkins performance
     Kernel + modules: omap2plus_defconfig
     ● 1 minute: hackbox.linaro.org (-j48: 12 x 3.5GHz Xeon, 24G)
     ● 1.5 minutes: khilman local (-j24: 6 x 3.3GHz i7, 16G RAM)
     ● 8 minutes: Macbook Air (-j8: 2 x 1.8GHz i7, 4G)
     ● 14 minutes: Thinkpad T61 (-j4: 2 x Core2Duo, 4G RAM)
     ● 16 minutes: Linaro Jenkins (-j8: EC2 node, built in tmpfs)
     ● 17 minutes: ARM Chromebook (-j4: 2 x 1.7GHz A15, 2G RAM)
 36. Linaro Test and Validation Summit
     Grant Likely
     LCE13 - Dublin, July 2013
     LEG Engineering