Effective performance evaluation as part of a CI approach
Andy Still, Technical Director & Founder, Intechnica
Mark Smith, Online QA Manager, Channel 4
Mission Impossible?
Agenda
Performance in CI: Background and challenges
Andy Still
An example implementation
Mark Smith
Intechnica: Introduction
 Specialists in IT application performance
 Vendor independent
 Promote performance by design
 Enable performance best practice
Digital performance experts
Background
Performance in a modern development process?
[Chart: relative importance of performance control over the software lifecycle timeline]
Performance with Agile
Background
What is CI?
If you are going to fail…..
Fail Early and Fail Often
Key features
 Automated process, run as often as possible
 Validate as many things as possible
• Pull dependencies
• Build
• Unit tests
• Syntax checks
• Automation tests
• Integration tests
• ??Performance??
• Deployment
What is Continuous Integration?
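For illustration, a minimal sketch of the fail-early idea above: run each validation stage in order and break the build at the first failure. The stage names and commands here are hypothetical, not any particular project's pipeline; a real setup would normally express these stages in the CI tool itself.

```python
"""Minimal fail-fast CI step runner (illustrative only; stage names and
commands are hypothetical, not a real project's pipeline)."""
import subprocess
import sys

# The validation stages listed on the slide, each as a shell command.
STAGES = [
    ("pull dependencies", "mvn dependency:resolve"),
    ("build",             "mvn compile"),
    ("unit tests",        "mvn test"),
    ("performance gate",  "python check_perf_kpis.py"),  # hypothetical gate script
]

def run_pipeline() -> None:
    for name, cmd in STAGES:
        print(f"== {name} ==")
        result = subprocess.run(cmd, shell=True)
        if result.returncode != 0:
            # Fail early and fail often: stop at the first broken stage.
            print(f"BUILD FAILED at stage: {name}")
            sys.exit(result.returncode)
    print("BUILD PASSED")

if __name__ == "__main__":
    run_pipeline()
```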
Should ….. Can we build performance into our CI Process?
YES!!!
Why?
 Performance is a first class citizen
• As important as functional issues
• Harder to fix – can be architectural changes
 The earlier you know about problems the better.
If something is hard to do…. Do it early and do it often!
But….
 Don’t be over ambitious
• Test at a level appropriate to stage of development / environment
 Avoid false positives
• Set realistic pass / fail settings
• Tests have to run headless – can’t require any human interpretation
 Remember - performance is linear not binary
• The checkin that broke the build may not be the one that caused the problem.
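As a concrete illustration of the "realistic pass/fail settings" and "headless" points above, here is a minimal sketch of a gate that compares measured metrics against a stored baseline with a tolerance band, so run-to-run noise does not fail the build. The file names, metric names and the 15% tolerance are all assumptions.

```python
"""Headless pass/fail gate with a tolerance band (all names and numbers are
illustrative assumptions)."""
import json
import sys

TOLERANCE = 0.15  # allow a 15% regression before failing; tune per project/stage

def gate(baseline_file: str, results_file: str) -> int:
    with open(baseline_file) as f:
        baseline = json.load(f)        # e.g. {"homepage_p95_ms": 850}
    with open(results_file) as f:
        results = json.load(f)         # same keys, measured in this build
    failures = []
    for metric, base_value in baseline.items():
        measured = results.get(metric)
        if measured is None:
            continue                   # metric not collected this run; don't fail on a gap
        if measured > base_value * (1 + TOLERANCE):
            failures.append(f"{metric}: {measured} vs baseline {base_value}")
    for failure in failures:
        print("FAIL", failure)
    return 1 if failures else 0        # non-zero exit code breaks the CI build

if __name__ == "__main__":
    sys.exit(gate("perf_baseline.json", "perf_results.json"))
```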
Types of performance testing in CI
Types of tests that could be used
 Micro benchmarking
• Set KPIs against specific unit tests / integration tests
• Need to consider
• Datasets v mocking
• Environments
• 3rd party integrations
 Client side
• Set KPIs for areas such as page weight etc.
 Full test under load
• Deploy to realistic environments / datasets
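A hedged sketch of the micro-benchmarking idea: attach a KPI to an ordinary unit test by timing the unit over a fixed number of iterations against a fixed dataset and asserting on the 95th percentile. The unit under test and the 50 ms budget are stand-ins, not real values.

```python
"""Micro-benchmark KPI inside a normal unit test (the unit under test and the
KPI value are stand-ins)."""
import statistics
import time
import unittest

def build_index(records: int) -> list:
    """Stand-in for the real unit under test, so the sketch is runnable."""
    return sorted(range(records, 0, -1))

class BuildIndexPerfTest(unittest.TestCase):
    KPI_P95_MS = 50     # agreed KPI for this unit in this environment (assumed)
    ITERATIONS = 30     # enough samples for a reasonably stable percentile

    def test_build_index_within_kpi(self):
        samples_ms = []
        for _ in range(self.ITERATIONS):
            start = time.perf_counter()
            build_index(records=1000)            # fixed dataset, not live data
            samples_ms.append((time.perf_counter() - start) * 1000)
        p95 = statistics.quantiles(samples_ms, n=20)[-1]   # ~95th percentile
        self.assertLessEqual(
            p95, self.KPI_P95_MS,
            f"p95 {p95:.1f} ms exceeds KPI of {self.KPI_P95_MS} ms")

if __name__ == "__main__":
    unittest.main()
```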
Fitting Performance into CI
Shortening the feedback loop
Performance and CI – Development - APM
Performance and CI - Deployment and CI Testing
Debate
What are the real challenges to successful implementation
of performance controls within CI:
Process vs. Tooling
Challenges to successful delivery of performance
controls within CI
 Where do we start?
 Because performance is not binary, people struggle to know how to baseline and how to test.
 No frameworks currently exist to guide the approach.
 Performance testing is always done at the end – right?
 Tests are brittle if delivered at the UI layer.
 How can you script and set KPIs against functionality that is not yet completed?
Process is the inhibiting factor
Challenges to successful delivery of performance
controls within CI
 CI cannot be delivered effectively without tooling, however…
 The integration of tooling and process is frequently seen as a blocker to successful implementation.
• Nothing available off the shelf provides an integrated CI capability
• Generally requires a bespoke integration approach based on available APIs
 Tooling should:
• Empower developers to performance test
• Automate deployment and performance testing of code drops
• Provide automation of APM analysis
• Automate benchmark comparison and reporting – pass or fail? (a sketch of this kind of glue follows after this slide)
Tooling
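To make the "bespoke integration based on available APIs" point concrete, a skeleton of that glue might look like the following: deploy the code drop, kick off the performance test through the tool's API, compare against a stored benchmark and reduce the result to a pass/fail exit code. Every endpoint, field name and threshold here is an assumption.

```python
"""Skeleton of bespoke CI/tooling glue: deploy, test, compare, report.
Every endpoint, field and threshold is an assumption."""
import sys
import requests

DEPLOY_API = "http://deploy.example.local/api/releases"    # assumed endpoint
LOADTEST_API = "http://perf.example.local/api/testruns"    # assumed endpoint

def run(build_number: str) -> int:
    # 1. Deploy the code drop to the performance environment.
    requests.post(DEPLOY_API, json={"build": build_number}).raise_for_status()

    # 2. Kick off the automated performance test (polling for completion omitted).
    run_info = requests.post(LOADTEST_API, json={"build": build_number}).json()
    summary = requests.get(f"{LOADTEST_API}/{run_info['id']}/summary").json()

    # 3. Compare against the stored benchmark and report pass or fail.
    regressions = [m for m in summary["metrics"]
                   if m["value"] > m["benchmark"] * 1.1]    # 10% headroom, an assumption
    for m in regressions:
        print(f"FAIL {m['name']}: {m['value']} vs benchmark {m['benchmark']}")
    return 1 if regressions else 0

if __name__ == "__main__":
    sys.exit(run(sys.argv[1]))
```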
Web: www.intechnica.co.uk
Email: andy.still@intechnica.co.uk
Twitter: @andy_still / @intechnica
Blog: http://internetperformanceexpert.wordpress.com
Tel: 0845 680 9679
Address: Fourways House, 57 Hilton Street, Manchester, M1 2EJ
Andy Still
Technical Director & Co-founder - Intechnica
Questions…..
Then over to Mark to see how they are solving these problems at Channel 4
Mark Smith
 25 years' experience in IT development and testing
 Online QA Manager at Channel 4
• BDD and ATDD automation framework
 QA Manager at Asos
• Full scale implementation of automated functional testing
 Fair Isaac
• Pioneered automated testing with reusable keyword driven frameworks
• Built web service testing framework using VUGen and Excel macros
 Razorfish
• Early exposure to Web development and eCommerce
Introduction
 Performance is at least as important as functionality on many sites
 Performance breakages can be more challenging to fix than functional ones
 Build failures and short feedback loops can prevent these breakages from making it into the code base
Why Continuous performance testing?
• May be too disruptive during the early stages of a project
• Might meet resistance to the concept of failing builds for non-functional reasons
• But on stable projects, early identification of performance issues can only be of benefit
• So yes, you do, and as soon as is practical
But do I also want pure CI performance testing?
Challenges
• Each project needs its own instance of tooling
• License costs are high if using enterprise tools
• Technically more challenging if using open source tools
Continuous Integration Testing
Requirements
 Integration with CI tool
 Ability to fail build on percentile response time, error rates and other metrics at transaction level
 Front end instrumentation
 Post run data collation
Full Volume Testing also needs:
 Integration with CI tool
 Ability to fail build on percentile response time, error rates and other metrics at transaction level (see the gate sketch after this list)
 Front end instrumentation
 Post run data collation
 Scalability to enterprise loads
 Viable injection infrastructure
 Real time monitoring of back end
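A sketch of the "fail the build at transaction level" requirement, assuming JMeter writes its results as a CSV JTL with a header row: aggregate per transaction label, compute the 95th percentile and error rate, and exit non-zero if any limit is breached. The KPI limits themselves are illustrative.

```python
"""Transaction-level gate over a JMeter CSV results file (JTL). Assumes the
JTL was written as CSV with a header row; the KPI limits are illustrative."""
import csv
import statistics
import sys
from collections import defaultdict

LIMITS = {   # per-transaction KPIs (assumed values)
    "Homepage": {"p95_ms": 1500, "error_rate": 0.01},
    "Search":   {"p95_ms": 2000, "error_rate": 0.01},
}

def gate(jtl_path: str) -> int:
    elapsed = defaultdict(list)
    errors = defaultdict(int)
    with open(jtl_path, newline="") as f:
        for row in csv.DictReader(f):             # default columns include label, elapsed, success
            label = row["label"]
            elapsed[label].append(int(row["elapsed"]))
            if row["success"].strip().lower() != "true":
                errors[label] += 1
    failed = False
    for label, limits in LIMITS.items():
        samples = elapsed.get(label)
        if not samples:
            continue                              # transaction not exercised in this run
        p95 = statistics.quantiles(samples, n=20)[-1]
        error_rate = errors[label] / len(samples)
        if p95 > limits["p95_ms"] or error_rate > limits["error_rate"]:
            failed = True
            print(f"FAIL {label}: p95={p95:.0f} ms, errors={error_rate:.2%}")
    return 1 if failed else 0                     # non-zero exit fails the Jenkins build

if __name__ == "__main__":
    sys.exit(gate(sys.argv[1]))                   # e.g. results.jtl from the JMeter run
```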
Selected Tools
Channel 4 chose the open source route
• Jenkins
– CI build management
• JMeter
– Load Testing
• webPageTest
– Front End Instrumentation
Front end instrumentation
webPageTest can run full workflows during load tests
 used to monitor metrics such as:
 Time to first byte
 DOM Content Ready End
 Load Event End time
 Doc Complete Time (ms)
 Fully loaded
 Time to title
 Bytes in (Doc)
 Num requests (fully loaded)
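A sketch of pulling those metrics from a webPageTest instance over its HTTP API. The host, the polling logic and the exact JSON field names are assumptions to verify against your own webPageTest version.

```python
"""Fetch front-end metrics from a webPageTest instance via its HTTP API.
Host, API parameters and JSON field names are assumptions to verify."""
import time
import requests

WPT_HOST = "http://wpt.example.local"     # assumed private webPageTest instance

def run_wpt(url: str) -> dict:
    # Submit the test and get a test id back.
    submit = requests.get(f"{WPT_HOST}/runtest.php",
                          params={"url": url, "f": "json"}).json()
    test_id = submit["data"]["testId"]

    # Poll until the test completes (statusCode 200 on this version of the API).
    while True:
        result = requests.get(f"{WPT_HOST}/jsonResult.php",
                              params={"test": test_id}).json()
        if result.get("statusCode") == 200:
            break
        time.sleep(10)

    first_view = result["data"]["runs"]["1"]["firstView"]
    # Field names as seen in webPageTest result JSON; check them on your instance.
    return {
        "time_to_first_byte_ms": first_view.get("TTFB"),
        "dom_content_ready_end": first_view.get("domContentLoadedEventEnd"),
        "load_event_end":        first_view.get("loadEventEnd"),
        "doc_complete_ms":       first_view.get("docTime"),
        "fully_loaded_ms":       first_view.get("fullyLoaded"),
        "time_to_title_ms":      first_view.get("titleTime"),
        "bytes_in_doc":          first_view.get("bytesInDoc"),
        "requests_fully_loaded": first_view.get("requestsFull"),
    }

if __name__ == "__main__":
    print(run_wpt("http://www.example.com/"))
```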
Jenkins controls the process
Jenkins calls JMeter…
which in turn calls webPageTest…
and results are analysed using the 'raw page data' link from the previous page
Can measure time to first byte, document complete and fully loaded time…
and also more specific metrics, such as cached and uncached page responses
Details output to Jenkins project workspace, with links to the full reports
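A sketch of that last step: fetch the CSV behind the 'raw page data' link, pick out the headline timings, and write a small summary with a link to the full report into the Jenkins workspace. The CSV URL and its column headers vary by webPageTest version, so both are treated as assumptions here.

```python
"""Summarise webPageTest 'raw page data' into the Jenkins workspace.
The CSV URL and its column headers depend on the webPageTest version, so
both are treated as assumptions here."""
import csv
import io
import os
import sys
import requests

# Column headers we hope to find in the raw page data export (assumed names).
KEEP = ["Time to First Byte (ms)", "Doc Complete Time (ms)", "Fully Loaded (ms)"]

def summarise(raw_page_data_url: str, result_page_url: str) -> None:
    rows = list(csv.DictReader(io.StringIO(requests.get(raw_page_data_url).text)))
    first_view = rows[0]                           # first run / first view row

    workspace = os.environ.get("WORKSPACE", ".")   # Jenkins sets WORKSPACE per build
    with open(os.path.join(workspace, "frontend_perf_summary.txt"), "w") as out:
        for column in KEEP:
            if column in first_view:
                out.write(f"{column}: {first_view[column]}\n")
        out.write(f"Full report: {result_page_url}\n")

if __name__ == "__main__":
    summarise(sys.argv[1], sys.argv[2])
```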
Benefits of using webPageTest with JMeter:
 Final results analysis aimed at CI build pass/fail. Results can trigger build failure, for example.
 Immediately raises awareness in the team of front end performance issues. The builds break until the front end hits its targets.
 Graphical representation of browser cached and uncached performance, build by build
 Click through from Jenkins workspace to log of test run results, step by step
 Click through to detailed XML data from webPageTest
 Click through to detailed test analysis presented in the webPageTest front end, including Google PageSpeed-style analysis, for example.
 Highly configurable test runs, using the webPageTest scripting engine to suit many and varied projects
 Ability to set up CI to test in different browsers
 Ability to set up different CI test scenarios, using webPageTest scripts – for example, testing with / without 3rd party content (see the sketch after this list)
 Ability to set up CI front end testing on different client hardware – perhaps best case / worst case.
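A sketch of the scenario idea in those last bullets: submit the same page twice from CI, once as-is and once with a webPageTest script that blocks assumed 3rd-party hosts, so both sets of numbers are tracked build by build. The host, blocked domains and the script parameter name should be verified against your instance.

```python
"""Submit the same page to webPageTest twice from CI: once as-is, once with a
webPageTest script that blocks assumed 3rd-party hosts. Host, domains and the
'script' parameter name should be verified against your instance."""
import requests

WPT_HOST = "http://wpt.example.local"            # assumed private instance
PAGE_URL = "http://www.example.com/"             # page under test (placeholder)

# webPageTest script: block assumed 3rd-party hosts, then load the page.
NO_THIRD_PARTY_SCRIPT = """\
block ads.example.net analytics.example.com
navigate http://www.example.com/
"""

def submit(label, script=None):
    params = {"url": PAGE_URL, "f": "json", "label": label}
    if script is not None:
        params["script"] = script                # scripted scenario variant
    data = requests.get(f"{WPT_HOST}/runtest.php", params=params).json()
    return data["data"]["testId"]

if __name__ == "__main__":
    print("with 3rd parties:   ", submit("full-page"))
    print("without 3rd parties:", submit("no-3rd-party", NO_THIRD_PARTY_SCRIPT))
```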
So what do we get?
CI testing gives us
 Very short feedback loops
 Ability to fail build by many metrics
 Code base protected from performance degradations
 Performance trending data
Iterative full volume testing adds
 Early and regular view of real world performance
Credits
Oliver Lloyd – www.github.com/oliverlloyd/jmeter-ec2
 JMeter on EC2 Setup Script
Nick Godfrey - webwob.com
 Jenkins Integration
 webPageTest Integration
