Managing Software Debt Workshop at Intel
Presentation Transcript

    • Managing Software Debt Workshop
    • First Things First... Follow on Twitter: @csterwa. Hashtag: #swdebt
    • Chris Sterling: Co-founder & CTO of Agile Advantage (www.AgileAdvantage.com); author of the book "Managing Software Debt: Building for Inevitable Change"; consults on software technology, Agile technical practices, Scrum, and effective management techniques; Innovation Games® Trained Facilitator; Certified Scrum Trainer; open source developer. Email: chris@agileadvantage.com. Web: http://www.agileadvantage.com. Blog: http://www.gettingagile.com. Follow me on Twitter: @csterwa. Hashtag for presentation: #swdebt
    • Agenda: Managing Software Debt (An Overview); Types of Software Debt (Technical, Quality, Configuration Management, Design, Platform Experience); Asserting Quality & Design (Refactoring, Test Automation, Definition of Done, Continuous Integration, Quality Dashboards); Release Management (The Power of 2 Scripts: Deploy and Rollback, Continuous Integration, Automated Promotion, Turn On/Off Features); Wrap Up (Software Debt Management Strategy, The "No Defect" Mindset)
    • Managing Software Debt: An Overview
    • THE DISECONOMIES OF SCALE IN SOFTWARE DEVELOPMENT: "Project size is easily the most significant determinant of effort, cost and schedule [for a software project]." (* "Software Estimation: Demystifying the Black Art" – Steve McConnell)
    • Big Ball of Mud: "A Big Ball of Mud is a haphazardly structured, sprawling, sloppy, duct-tape-and-baling-wire, spaghetti-code jungle. These systems show unmistakable signs of unregulated growth, and repeated, expedient repair. Information is shared promiscuously among distant elements of the system, often to the point where nearly all the important information becomes global or duplicated. The overall structure of the system may never have been well defined. If it was, it may have eroded beyond recognition. Programmers with a shred of architectural sensibility shun these quagmires. Only those who are unconcerned about architecture, and, perhaps, are comfortable with the inertia of the day-to-day chore of patching the holes in these failing dikes, are content to work on such systems." (* Brian Foote and Joseph Yoder, Big Ball of Mud. Fourth Conference on Pattern Languages of Programs (PLoP '97/EuroPLoP '97), Monticello, Illinois, September 1997)
    • Lack of emphasis on software quality attributes contributes to decay
    • Types of Software Debt: Technical, Quality, Configuration Management, Design, and Platform Experience
    • Why not just call it all "Technical Debt"? Technical debt tended to focus on the programming aspects of software delivery and left out the full software development life cycle. Each type of software debt can be managed and monitored using different tools and approaches. Focusing on managing each type of software debt simplifies the creation of an overall strategy that promotes a holistic perspective.
    • Types of Software Debt. Technical Debt: the activities that a team or team members take shortcuts on now that will impede future development if left as is. Quality Debt: a diminishing ability to verify the functional and technical quality of the software; the "break/fix" mentality. Configuration Management Debt: integration and release management become more risky, complex, and error-prone. Design Debt: the cost of adding features increases toward the point where it is more than the cost of writing from scratch. Platform Experience Debt: the availability and alignment of people to business objectives that involve software changes becomes more limited or cost-prohibitive.
    • Exercise: Discuss the 5 Types of Software Debt and Examples You've Seen in the Real World
    • Principle: No matter what, the cost of addressing software debt increases with time.
    • Asserting Quality & Design: Teams must focus on asserting sustainable quality and design to support future customer needs
    • Principles of Executable Design: The way we design can always be improved. We'll get it "right" around the third time. We most likely won't get it "right" the first time. Design and construct for change rather than longevity. Lower the threshold of pain. If we are not enhancing the design, then we are just writing a bunch of tests.
    • Merciless Refactoring. Refactoring: a disciplined technique for restructuring an existing body of code, altering its internal structure without changing its external behavior.* Merciless: having or showing no [mercy - showing great kindness toward the distressed]. Relieve your distressed code through kindness and disciplined restructuring. (* From http://www.refactoring.com/)
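      As a minimal sketch of the kind of behavior-preserving restructuring described above (the invoice classes and the discount rule are hypothetical, not from the workshop), an "extract method" refactoring keeps external behavior identical while giving an embedded rule a name:

      // Before: one method mixes the price calculation with formatting.
      class InvoiceLineBefore {
          String summary(String name, int quantity, double unitPrice) {
              double total = quantity * unitPrice;
              double discounted = quantity >= 10 ? total * 0.9 : total; // bulk discount buried in-line
              return name + ": $" + String.format("%.2f", discounted);
          }
      }

      // After: the discount rule is extracted and named; callers see exactly the same output.
      class InvoiceLineAfter {
          String summary(String name, int quantity, double unitPrice) {
              return name + ": $" + String.format("%.2f", discountedTotal(quantity, unitPrice));
          }

          private double discountedTotal(int quantity, double unitPrice) {
              double total = quantity * unitPrice;
              return quantity >= 10 ? total * 0.9 : total; // 10% bulk discount for orders of 10 or more
          }
      }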
    • Where to Start Refactoring? Does this change directly affect the feature I am working on? Would the change add clarity for the feature implementation? Will the change add automated tests where there are none? If "yes" to any question above, ask the following question to decide if you should work on it now: at first glance, does the refactoring look like a large endeavor involving significant portions of the software's components?
    • When to Stop Refactoring? Am I refactoring code not directly affected by the feature? Is there other code directly affected by the feature I am working on that has not been refactored sufficiently? If the refactoring is exploding the feature estimate given to the Customer, then bring it up to the Team to decide how to progress. If the Team decides the refactoring can be absorbed into the current iteration without affecting delivery on our commitments, then continue refactoring. If the refactoring affects commitments, then bring it to the Customer to discuss how to proceed.
    • Automate Testing to Support Refactoring. Refactoring cannot be done effectively without automated tests surrounding the code. Start by creating an automated test which fails. If that is difficult to do at the unit level, look at automated acceptance tests from a functional perspective. Over time, look for ways to create automated unit tests.
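      For example (a sketch only; JUnit 4 is assumed, and ShippingCalculator with its ground rate is a hypothetical class, not something from the workshop), the first automated test is written before the behavior it describes exists, so it fails until that behavior is implemented and then acts as a safety net for later refactoring:

      import static org.junit.Assert.assertEquals;

      import org.junit.Test;

      public class ShippingCalculatorTest {

          // Written first: fails until ShippingCalculator.groundRate() behaves as
          // described, then protects that behavior while the internals are refactored.
          @Test
          public void groundRateForFivePoundPackage() {
              ShippingCalculator calculator = new ShippingCalculator();
              assertEquals(12.50, calculator.groundRate(5.0), 0.001);
          }
      }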
    • Case Study: Test Automation Reduces Cost of Change
    • Manual Regression Testing: Testing was taking 75 person-hours during 2 full test runs consisting of comprehensive manual regression testing plus data conversion and validation. The cost of testing was $17,000 each iteration.
    • Introducing Fit into the Testing Process: After 8 iterations the team had introduced a healthy amount of Fit fixtures and automated tests. This reduced the 70+ hour test runtime down to 6 hours, which now included Fit automated regression testing plus data conversion and validation automated with Fit fixtures. It reduced the cost of testing each iteration from $17,000 to $7,000.
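      To give a flavor of what such fixtures look like (a sketch assuming the Java Fit library's fit.ColumnFixture; the DiscountCalculator class and the column names are hypothetical rather than taken from the case study), a column fixture binds a table of inputs and expected outputs to production code:

      import fit.ColumnFixture;

      // Each row of the Fit table sets the "orderTotal" input column, then Fit calls
      // discount() and compares the result against the expected cell value.
      public class DiscountRuleFixture extends ColumnFixture {
          public double orderTotal;   // input column

          public double discount() {  // calculated column ("discount()" header in the table)
              return new DiscountCalculator().discountFor(orderTotal);
          }
      }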
    • The Agile Regression Testing Triangle*: Smoke++ tests (automated and exploratory) at the top; risk-based, automated UI & API integration tests in the middle; automated unit tests at the base, which make up the largest portion of the regression tests and are developed by programmers. (* The Agile Triangle has been modified from Mike Cohn's original version)
    • Test-Driven Design (TDD): Let's Walk Through a Scenario Using TDD to Implement a Solution
    • TDD - Basic "Flow": Write Failing Test → Make Test Pass → Refactor to Acceptable Design (and repeat)
    • Jitter - Example TDD Session: A fake micro-blogging tool named "Jitter" is made by a Seattle-based fictitious company that focuses on enabling coffee-injected folks to write short messages and have common online messaging shorthand expanded for easy reading. The user story we are working on is: "So it is easier to read their kid's messages, Mothers want to automatically expand common shorthand notation." The acceptance criteria for this user story are: LOL, AFAIK, and TTYL are expandable; expand lower and upper case versions of shorthand.
    • Expand LOL to "laughing out loud"

      import static org.hamcrest.CoreMatchers.equalTo;
      import static org.junit.Assert.assertThat;
      import static org.mockito.Mockito.mock;
      import static org.mockito.Mockito.when;

      import org.junit.Test;

      public class WhenMotherWantsToExpandMessagesThatContainShorthandTest {
          @Test
          public void shouldExpandLOLToLaughingOutLoud() {
              JitterSession session = mock(JitterSession.class);
              when(session.getNextMessage()).thenReturn("Expand LOL please");
              MessageExpander expander = new MessageExpander(session);
              assertThat(expander.getNextMessage(), equalTo("Expand laughing out loud please"));
          }
      }

      public class MessageExpander {
          private final JitterSession session;

          public MessageExpander(JitterSession session) {
              this.session = session;
          }

          public String getNextMessage() {
              String msg = session.getNextMessage();
              return msg.replaceAll("LOL", "laughing out loud");
          }
      }
    • But wait… what if? What if LOL is written in lower case? What if it is written as "Lol"? Should it be expanded? What if some variation of LOL is inside a word? What if the characters surrounding LOL are symbols, not letters? Write these down as upcoming programmer tests, as comments, so I don't forget them.

      // shouldExpandLOLIfLowerCase
      // shouldNotExpandLOLIfMixedCase
      // shouldNotExpandLOLIfInsideWord
      // shouldExpandIfSurroundingCharactersAreNotLetters
    • Expand LOL If Lower Case

      @Test
      public void shouldExpandLOLIfLowerCase() {
          when(session.getNextMessage()).thenReturn("Expand lol please");
          MessageExpander expander = new MessageExpander(session);
          assertThat(expander.getNextMessage(), equalTo("Expand laughing out loud please"));
      }

      This forced use of java.util.regex.Pattern to handle case insensitivity.

      public String getNextMessage() {
          String msg = session.getNextMessage();
          return Pattern.compile("LOL", Pattern.CASE_INSENSITIVE)
                  .matcher(msg).replaceAll("laughing out loud");
      }
    • Don't Expand "Lol" - Mixed Case

      @Test
      public void shouldNotExpandLOLIfMixedCase() {
          String msg = "Do not expand Lol please";
          when(session.getNextMessage()).thenReturn(msg);
          MessageExpander expander = new MessageExpander(session);
          assertThat(expander.getNextMessage(), equalTo(msg));
      }

      This forced me to stop using the Pattern.CASE_INSENSITIVE flag in the pattern compilation. Only "LOL" or "lol" is expanded for now.

      public String getNextMessage() {
          String msg = session.getNextMessage();
          return Pattern.compile("LOL|lol").matcher(msg)
                  .replaceAll("laughing out loud");
      }
    • Don't Expand "LOL" If Inside a Word

      @Test
      public void shouldNotExpandLOLIfInsideWord() {
          String msg = "Do not expand PLOL or LOLP or PLOLP please";
          when(session.getNextMessage()).thenReturn(msg);
          MessageExpander expander = new MessageExpander(session);
          assertThat(expander.getNextMessage(), equalTo(msg));
      }

      The pattern matching is now modified to require whitespace (\s) around each variation of the valid LOL shorthand.

      return Pattern.compile("\\sLOL\\s|\\slol\\s").matcher(msg)
              .replaceAll("laughing out loud");
    • Expand "LOL" If Not Inside a Word

      @Test
      public void shouldExpandIfSurroundingCharactersAreNotLetters() {
          when(session.getNextMessage()).thenReturn("Expand .lol! please");
          MessageExpander expander = new MessageExpander(session);
          assertThat(expander.getNextMessage(), equalTo("Expand .laughing out loud! please"));
      }

      Final implementation of the pattern matching code, using \b word boundaries so that shorthand surrounded by punctuation still expands while shorthand embedded inside a word does not:

      return Pattern.compile("\\bLOL\\b|\\blol\\b").matcher(msg)
              .replaceAll("laughing out loud");
    • Quality Debt: "Promises make debt, and debt makes promises." - Dutch proverb
    • Effect of Project Constraints on Quality
    • Ken Schwaber: "For every [dollar] of competitive advantage gained by cutting quality, it costs $4 to restore it; and software is an organizational asset and decisions to cut quality must be made by executive management and reflected in the financial statements." http://www.infoq.com/presentations/agile-quality-canary-coalmine
    • Acceptance Test-Driven Development
    • Definition of Done - Assert Quality: acceptance criteria defined for each user story; unit tests written and passed; code compiles with no errors and no warnings; new code doesn't break existing code; test case review (Dev to review the test case written); acceptance criteria verified complete; all P1-P3 bugs for the story are closed; Test approves the user story; story demonstrated to the product owner and accepted on the target platform; code checked in with a reference to the US#/Task#; tested on FE; integration test written and passing; test code reviewed; environment requirements documented; interface document updated/added and checked in to SVN; architectural impact assessed and artifacts updated if necessary; comments in code; error codes added; code reviewed by a peer
    • Release Definition of Done: Every release should have clear quality criteria. With a "Release Definition of Done" you can understand targets better. Measure the gap between the teams' Definition of Done and the Release Definition of Done; this gap is a source of quality issues and represents significant risk to the schedule.
    • Exercise: What's in Your Definition of Done?
    • Advanced Quality Assertions Using Automated Tools and Dashboards
    • Continuous Integration
    • Quality Dashboard - Sonar (screenshots)
    • Early Warning Signs. Early warnings: broken builds; broken automated tests; broken custom thresholds
    • Early Warnings on the Quality Dashboard: design debt in duplication (DRY); technical debt in code complexity; quality debt in the bug DB (break/fix); other custom thresholds
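      As a sketch of what a custom threshold can look like in practice (nothing here is from the workshop; the metrics file, property keys, and limits are hypothetical), a small check run at the end of a CI build can turn drifting duplication or complexity numbers into a broken build:

      import java.io.FileInputStream;
      import java.io.IOException;
      import java.util.Properties;

      // Reads a metrics summary written earlier in the build (hypothetical file and keys)
      // and breaks the build when a software-debt threshold is exceeded.
      public class DebtThresholdCheck {
          public static void main(String[] args) throws IOException {
              Properties metrics = new Properties();
              try (FileInputStream in = new FileInputStream("build/metrics.properties")) {
                  metrics.load(in);
              }

              double duplication = Double.parseDouble(metrics.getProperty("duplicated.lines.percent", "0"));
              double complexity = Double.parseDouble(metrics.getProperty("average.cyclomatic.complexity", "0"));

              StringBuilder failures = new StringBuilder();
              if (duplication > 5.0) {
                  failures.append("Duplication ").append(duplication).append("% exceeds the 5% budget\n");
              }
              if (complexity > 10.0) {
                  failures.append("Average complexity ").append(complexity).append(" exceeds the budget of 10\n");
              }

              if (failures.length() > 0) {
                  System.err.print(failures);
                  System.exit(1); // non-zero exit breaks the CI build, raising the early warning
              }
              System.out.println("Software debt thresholds OK");
          }
      }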
    • Release Management: "If releases are like giving birth, then you must be doing something wrong." - Robert Benefield
    • Case Study: Enterprise Agile Adoption. A 180+ person "Web 2.0" product organization with a waterfall SDLC that development used to deliver in 6-month release cycles. They wanted to use Agile methods to be more responsive to users and keep up with other "Web 2.0" companies. They transitioned 15 teams to Agile methods in 3 months: changed the release management strategy, added XP technical practices, and implemented the Scrum product development framework for scaled coordination. Within 4 months they were able to release to users every week, using a streamlined deployment environment process to validate product changes daily with Continuous Integration and automated promotions.
    • The Power of 2 Scripts: Deploy & Rollback
    • Traditional Source Control Management: version branches (Version 1, Version 2) are cut from the main branch at "code complete", and each branch then goes through an integration death march before release. Debt accrues quickly within stabilization periods.
    • Flexible Source Control Management: Version 1 and Version 2 are released directly from the main branch. Not easy! You must have the proper infrastructure to do this.
    • Scaling Continuous Integration: component validation feeds integrated component validation, which feeds end-to-end and load/stress validation.
    • Automated Promotion to Environments
    • Principle: The value of technical aspects in an application or its surrounding infrastructure is the cost of not addressing them.
    • Describe as Abuse User Stories. Instead of "Implement security for user information": "As a Malicious Hacker, I want to steal credit card information so that I can make fraudulent charges." (* From "User Stories Applied" presented by Mike Cohn, Agile 2006)
    • Some Potential Abusers: malicious hacker; mass of users; SQL injector; disgruntled employee; naïve API user; impatient clicker; denial-of-service (DoS) attacker; sleazy user
    • Exercise: Abuse Story Writing
    • Software Quality Attributes Defined
    • Software Quality Attributes Rating Tool
    • Exercise: Focusing on Software Quality Attributes
    • Platform Experience Debt: "As in Nature, if an organization is too inflexible or stands still too long it will get eaten." - James Burke (author and historian)
    • Principle: Rather than creating teams to work on projects, let's find ways to give projects to cross-functional teams.
    • Component Team Configuration: the "Component Team" structure uses separate Product Backlogs; managing dependencies is often serialized; problematic integration issues are typically faced if multiple components are required to release; an "Integration Team" is used to pull components together; this causes more rework than a "Feature Team" structure.
    • Feature Team Configuration: the "Feature Team" structure uses a common Product Backlog; integration is done in parallel; it requires high levels of communication across teams to resolve integration issues; it forces Product Owners to be more coordinated; Sprints should be synchronized; cross-team fertilization is a requirement to successfully deliver in parallel.
    • Exercise: Creating a Software Debt Management Strategy
    • The "No Defect" Mindset: "What he needs is some way to pay back. Not some way to borrow more." - Will Rogers
    • Case Study: Field Support Application. 2,000+ users access the application each day. The application supports multiple perspectives and workflows, from Field Support Operations to Customer Service. A team of 5 people delivered features on the existing ColdFusion platform implementation while migrating the architecture to Spring/Hibernate in slices and still delivering valuable features. 36 two-week Sprints, 33 production releases, and only 1 defect found in production. So, what was the defect, you say? Let me tell you...
    • Can We Afford a "No Defect" Policy? This team worked on a legacy codebase inherited from another vendor. The other vendor had been slowing down month after month, and the cost of development was increasing. In its first iteration this team was able to deliver more than the other vendor had in the previous 2 months. After 24 iterations this team was delivering 10 times faster than in its 1st iteration. Acceptance Test-Driven Development and Continuous Integration were the greatest technical factors supporting the team in these results. Can you afford not to have a "No Defect" policy?
    • Acceptance Test-Driven Development
    • The Power of 2 Scripts: Deploy & Rollback