
CCDC’s (ongoing) Journey to Continuous Delivery - London Continuous Delivery - Dec 2015


The video of this talk is available online.

A Christmas lightning talk about the Cambridge Crystallographic Data Centre's adoption of Continuous Integration, and the unexpected benefits it gave us.


  1. CCDC’s (ongoing) Journey to Continuous Delivery. Clare Macrae, Cambridge Crystallographic Data Centre, 1 December 2015. @ClareMacraeUK
  2. The Cambridge Crystallographic Data Centre
     • Established in 1965: UK Registered Charity; financially self-supporting, not-for-profit; University Partner Institute.
     • International data repository: archive of crystal structure data; a high-quality scientific database.
     • Scientific software provider: search/analysis/visualisation tools; scientific applications.
     • Collaborative research organisation: new methodologies; fundamental research.
     • Employer of around 60 permanent staff (scientific editors, software developers, applications scientists) in Cambridge, UK and at Rutgers University, US.
  3. Scientific Context: from experimental data (e.g. C10H16N+,Cl-) to structural knowledge. [Slide images: Jeff Dahl, CC-BY-SA; Radspunk, CC-BY-SA]
  4. Development Context
     • Several million lines of code: C++, Python, Fortran (decades old) and more; no high-level support for CI, due to 2- to 3-day builds!
     • Legacy build & release tools & scripts: we had build and release automation in the ’80s; many more manual layers added since then; couldn’t remember the last time we had a green build.
     • One release per year! Around four months elapsed time in preparation; too much manual effort for release and testing.
     • Belief: we cannot release more often, due to the effort currently required in each release.
  5. Initial Concerns – why did we wait for CI?
     • “Our builds are too slow”
     • “Our builds never succeed”
     • “Too many warnings from our code”
  6. C++ CI Journey…
     • C# team started using TeamCity.
     • Jan 2014: started a background project for our TeamCity C++ builds.
     • Sep 2014: big push to move C++ builds to TeamCity.
     • Oct 2014: C++ TeamCity builds turned on.
  7. Getting to Green
     • Several people in a concerted effort to get any green builds.
     • Before TeamCity, we hadn’t had any green builds in years!
  8. Speeding up tests
     • Incredible value of powerful tools for seeing where time goes.
  9. Speeding up build machines
     • TeamCity showed evidence of the need for new build machines…
     • … and the benefit of the new machines.
  10. Patterns become visible
  11. Teams auto-deploying (internally)
  12. Teams (and individuals) taking initiative
      • This is right outside the Director’s office!
  13. Finally treating warnings as errors
  14. Where are we now?
      • Much happier developers!
      • Team agreements needed for handling build errors…
      • More reliable infrastructure required.
      • Discussions have started on releasing more often!
  15. Initial Concerns – what really happened?
      • “Our builds are too slow”: the CI system identified bottlenecks and speeded up builds and tests; it also got us new build hardware. Easy to identify the slow tests – either speed them up, or run them less often.
      • “Our builds never succeed”: the CI system massively simplified fixing build errors; seeing test history across all builds exposes flicker tests. Per-push feedback within 15–30 minutes during the day, instead of the next morning.
      • “Too many warnings from our code”: fast feedback made it viable to treat warnings as errors.
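Slide 8 credits "powerful tools for seeing where time goes" in the test suite. The talk doesn't name those tools, so as a minimal illustrative sketch (all function names hypothetical, not CCDC's code), here is a Python helper that times a set of test callables and ranks them slowest-first, the kind of report that makes slow tests easy to spot:

```python
import time

def time_tests(tests):
    """Run each (name, callable) pair and return timings, slowest first."""
    timings = []
    for name, fn in tests:
        start = time.perf_counter()
        fn()
        timings.append((name, time.perf_counter() - start))
    return sorted(timings, key=lambda t: t[1], reverse=True)

# Hypothetical stand-ins for real test functions.
def test_fast():
    sum(range(1_000))

def test_slow():
    time.sleep(0.05)  # simulates an expensive test

for name, seconds in time_tests([("test_fast", test_fast),
                                 ("test_slow", test_slow)]):
    print(f"{name}: {seconds:.3f}s")
```

In practice a test runner can do this for you (for example, pytest's `--durations=N` flag prints the N slowest tests), but the principle is the same: measure first, then decide whether to speed a test up or run it less often.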
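Slide 13's "treating warnings as errors" is, for a C++ codebase like the one described, typically achieved with a compiler flag such as g++'s `-Werror`: every warning becomes a hard build failure, so CI goes red instead of letting warnings accumulate. A minimal sketch of the effect (assuming g++ is available; this is not CCDC's actual build configuration):

```shell
# A throwaway C++ file with an unused variable, which -Wall
# reports as a warning.
cat > demo.cpp <<'EOF'
int main() {
    int unused = 42;  // triggers -Wunused-variable under -Wall
    return 0;
}
EOF

# Without -Werror this compiles (with a warning); with -Werror the
# warning is promoted to an error and the build fails outright.
if g++ -Wall -Wextra -Werror -c demo.cpp 2>/dev/null; then
    echo "build ok"
else
    echo "build failed"
fi
```

This is only viable with fast feedback, which is the point the slide makes: once builds report within minutes rather than days, a hard failure on every warning becomes a help rather than a blocker.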