Assessing Your Agility: Introducing the Comparative Agility Assessment

Transcript

  • 1. Assessing Your Agility: Introducing the Comparative Agility Assessment. Kenny Rubin and Mike Cohn, Agile Development Practices, November 13, 2008.
  • 2. Who We Are. Kenny Rubin: former managing director of Scrum Alliance; early-market thought leader in Object Technology / Smalltalk. Mike Cohn: author of two popular books on agile development; co-founder of Agile Alliance and Scrum Alliance.
  • 3. Scenario. Our company has decided to use agile. We get training and maybe some coaching. After six months, management wants to know: “How are we doing at adopting agile?”
  • 4. Are we where we should be? In which areas do we need to improve? In which areas are we excelling? How are we doing relative to others? How are we doing relative to our competitors?
  • 5. We need an assessment framework: an instrument for “measuring” agility. Desirable attributes: it must evaluate multiple dimensions of agility, and it must lead to actionable recommendations.
  • 6. Agenda: The Assessment Framework; Assessment Process; Preliminary Industry Results; Sample Company Results.
  • 7. Assessment framework: Dimensions (7 total), each made up of Characteristics (3–6 per dimension), each probed by Questions (~125 total).
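The hierarchy on this slide (seven dimensions, three to six characteristics each, roughly 125 questions in all) maps naturally onto a small data model. A minimal sketch in Python: the class names are my own, the dimension and characteristic names come from later slides, and only one sample question per characteristic is shown.

```python
from dataclasses import dataclass, field

@dataclass
class Characteristic:
    name: str
    questions: list  # survey statements rated on a 5-point agreement scale

@dataclass
class Dimension:
    name: str
    characteristics: list = field(default_factory=list)

# One dimension fleshed out with its four characteristics (slides 10-14);
# the question texts are quoted from those slides.
requirements = Dimension("Requirements", [
    Characteristic("Communication focus",
                   ["Written requirements are augmented with discussion."]),
    Characteristic("Level of detail",
                   ["Teams are able to start projects with incomplete requirements."]),
    Characteristic("Emergence",
                   ["Product owners can change requirements without a lot of fuss."]),
    Characteristic("Technical design",
                   ["Technical design occurs iteratively throughout a project."]),
])

# The seven dimensions named in the deck.
framework = [Dimension(n) for n in
             ["Teamwork", "Planning", "Technical practices",
              "Quality", "Culture", "Knowledge creation"]] + [requirements]

print(len(framework))  # 7
```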
  • 8. Seven assessment dimensions: Teamwork, Requirements, Planning, Technical practices, Quality, Culture, and Knowledge creation.
  • 9. An example: the Planning dimension. Its characteristics are Planning levels, Critical variables, Progress tracking, Source, and When (all upfront vs. spread throughout). The dimension contrasts all-encompassing, task-oriented plans created upfront, with reluctance to update plans and little buy-in to dates from the team, against plans created at multiple levels of detail by the team with full buy-in. Sample questions: “We do the right amount of upfront planning; helpful without being excessive.” “Effort spent on planning is spread approximately evenly throughout the project.” Responses: True / More true than false / Neither true nor false / More false than true / False.
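The five response options above are typically mapped onto a 1–5 numeric scale, and slide 27 later shows exactly that anchoring (True = 5, False = 1). A sketch of how a characteristic score could then be computed as the mean of its question responses; the function name and the use of a simple mean are assumptions, not something the deck specifies.

```python
# Map each response option to a numeric value (True = 5 ... False = 1,
# matching the scale shown on slide 27).
SCALE = {
    "True": 5,
    "More true than false": 4,
    "Neither true nor false": 3,
    "More false than true": 2,
    "False": 1,
}

def characteristic_score(responses):
    """Average the numeric values of all answers given for one characteristic."""
    values = [SCALE[r] for r in responses]
    return sum(values) / len(values)

# Two respondents each answering the two Planning questions from this slide:
answers = ["More true than false", "Neither true nor false",
           "True", "More false than true"]
print(characteristic_score(answers))  # 3.5
```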
  • 10. The Requirements dimension contrasts document-centric requirements collected upfront with little acknowledgement of emergence against requirements collected at different levels of detail, progressively focused and augmented with documentation. Four characteristics: Communication focus, Level of detail, Emergence, and Technical design.
  • 11. Communication focus. Written requirements are augmented with discussion … just-in-time discussions. Our product owner is available to discuss features during the iteration. We acknowledge that not all details can be …
  • 12. Level of detail. Teams are able to start projects with incomplete requirements … some features are negotiable. Requirements are represented at different levels of detail based on how soon we expect to implement them.
  • 13. Emergence. Change is a natural part of our business; we accept it and embrace it at reasonable times. Product owners can change requirements without a lot of fuss. Development teams can request and negotiate requirements changes with product owners. Product owners acknowledge that sometimes features turn out to be bigger than anyone thought.
  • 14. Technical design. Projects begin with a big, distinct technical design phase. Technical design occurs iteratively throughout a project. Technical design is a team activity rather than something performed by individuals working alone.
  • 15. Agenda: ✓ The Assessment Framework; Assessment Process; Preliminary Industry Results; Sample Company Results.
  • 16. Assessment approaches. Consultative: administered to a team of people by a consultant, with responses collected during interviews. Self-administered: individuals working on projects complete either a paper or an online version of the survey. The online version is at www.ComparativeAgility.com.
  • 17. Assessment philosophy. We are not trying to determine maturity levels. Organizations do not need to be perfect, only better than their competitors. This led to the idea of a Comparative Agility Assessment: “How am I doing compared to my competition?”
  • 18. Sample from the online survey (screenshot).
  • 19. Agenda: ✓ The Assessment Framework; ✓ Assessment Process; Preliminary Industry Results; Sample Company Results.
  • 20. [Charts] “As you respond to this survey, will you be thinking mostly about your:” Categories: Team, Department, Division, Organization; shares: 17%, 4%, 16%, 63%. “How long had this group been doing agile development prior to starting this project?” Categories: 0–6 months, 7–12 months, 1 year, 2 years, longer; shares: 7%, 11%, 15%, 13%, 54%.
  • 21. [Charts] “Which best characterizes this project?” Categories: Commercial Software, Web Development, Internal Software, Contract Development, Other; shares: 3%, 14%, 25%, 26%, 33%. “About how many people were or are on the project being assessed?” Categories: 1–10, 11–25, 26–50, 51–100, more than 100; shares: 9%, 11%, 12%, 31%, 38%.
  • 22. [Chart] Seven Dimensions, all data, scored 0–5 with bands at ±1 and ±2 standard deviations: Teamwork, Planning, Technical Practices, Quality, Culture, Knowledge Creation.
  • 23. [Chart] Technical Practices, all data, scored 0–5 with bands at ±1 and ±2 standard deviations: Continuous Integration, Refactoring, Test-driven development, Pair programming, Coding Standard, Collective Ownership.
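The charts on slides 22 and 23 place each score against the sample mean with bands at one and two standard deviations; that positioning is what makes the assessment “comparative”. A sketch of the calculation using Python’s standard library; the function name and the sample numbers are invented for illustration.

```python
import statistics

def deviations_from_sample(company_score, industry_scores):
    """How many sample standard deviations the company sits above or below the mean."""
    mean = statistics.mean(industry_scores)
    spread = statistics.stdev(industry_scores)
    return (company_score - mean) / spread

# Hypothetical industry scores for one characteristic on the 0-5 scale:
sample = [3.0, 3.4, 2.6, 3.8, 3.2]
z = deviations_from_sample(2.2, sample)
print(round(z, 2))  # -2.24, i.e. below the -2 std dev band
```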
  • 24. [Chart] Quality: Timing, scored against ±1 and ±2 standard deviation bands. Items include: “There is no big handoff between programmers and testers either during or at the end of an iteration.” “At the end of each iteration there is little …” “All types of testing (performance, integration, scalability, etc.) are performed in each iteration.” “Testers are productive right from the start of each iteration.” “… iteration in which they are found.”
  • 25. [Chart] Interesting results: length of agile experience. Teams with 2+ years of agile experience are compared against teams with 6 months or less across Knowledge creation, Culture, Quality, Technical practices, Planning, Requirements, and Teamwork (scale −1 to +1).
  • 26. Agile web development. Compared to the overall sample, web projects: are more likely to contain duplicated code, less likely to have a coding standard, and do less refactoring (are these things less important on web projects?); are less likely to be built automatically once a day; are more likely to have collocated product owners, and more likely to have product owners who respond in a timely manner; and are more likely to be done in mini-waterfalls.
  • 27. What do you think the average response was? (Scale: True = 5 to False = 1.) Teams know their velocity. Product owners provide acceptance criteria for each feature. We don’t cancel training, holiday, and vacation time when behind schedule. Testers are productive right from the start of each iteration.
  • 28. Agenda: ✓ The Assessment Framework; ✓ Assessment Process; ✓ Preliminary Industry Results; Sample Company Results.
  • 29. How does a company use this data? Stock its improvement backlog with items for teams (including non-delivery teams) to work on. Identify Big Hairy Audacious Goals (BHAGs) to ask teams to meet. Identify leading and lagging indicators of success to gauge and measure progress.
  • 30. Dimensions of an example company (scored −1 to +1 between two anchors). Teamwork: from “directed; individuals work in silos; multiple locations; multiple projects” to “self-organizing, cross-functional teams; dedicated team members; collocated.” Requirements: from “document-centric; collected upfront; little acknowledgement of emergence” to “collected at different levels of detail; conversation-focused, augmented with documentation.” Planning: from “all-encompassing, task-oriented plans created upfront; reluctance to update plans; little buy-in to dates from team” to “created at multiple levels of detail; created by team with full buy-in.”
  • 31. Quality: from “quality is tested in after development; little emphasis on or effective use of automation” to “quality is built into the product during each iteration; automated unit and acceptance tests.” Culture: from “… deadlines through heroic effort; command-and-control” to “trusting, collaborative, and adaptive.” Knowledge Creating: from “all work performed in strictly …; inconsistent use of iterations” to … Technical Practices: from “code written by programmers working alone; little emphasis on testing; code becomes harder to …; … integration and system builds” to “code written in pairs using test-driven development; code not allowed to degrade over time; … and tested at least once per day.”
  • 32. “Hmm, those Technical Practices and Quality scores look low compared to other companies. Let’s dig deeper.” Technical Practices: from “code written by programmers working alone; little emphasis on testing; code becomes harder to …; … integration and system builds” to “code written in pairs using test-driven development; code not allowed to degrade over time; … and tested at least once per day.”
  • 33. [Chart] Technical Practices characteristics (scored −1 to +1): Test-driven development, Pair programming, Refactoring, Continuous integration, Coding standards, Collective ownership.
  • 34. [Chart] Quality characteristics (scored −1 to +1): Automated unit testing, Timing, Customer acceptance testing.
  • 35. Management Style (scored −1 to +1). If your company just received this assessment, what might you do? Items: Teams feel an appropriate amount of pressure to meet deadlines. Product owners are willing to consider delivering less than 100% of a solution … in release planning. Product owners understand that sometimes solving 20% of the problem delivers 80% of the value. We don’t cancel training, holiday, and vacation time when behind schedule. We maintain a high rate of productivity without being overworked. Management allows team members to make the decisions that should be theirs to make.
  • 36. How you can participate. Take the survey; it’s free! Get a report summarizing your answers. We’re working on making comparative reporting available; the timeline is somewhat dependent on how much more data we get and how fast. You can opt in to … touch with new reporting features. Visit the website for details: www.ComparativeAgility.com.
  • 37. Contact information. Kenny Rubin, krubin@innolution.com, (303) 827-3333. Mike Cohn, mike@mountaingoatsoftware.com, (720) 890-6110. www.ComparativeAgility.com