Agile in a nutshell


This is a presentation I put together for a conference in 2011. It gives a fast, high level view of where Agile Software Development came from, its core values and principles, and its core practices. It is structured as 7 PechaKucha decks in a row, with short breaks in between, which requires high energy, intensity, and a sense of humor. :)

Slide notes
  • The waterfall model is a sequential design process, often used in software development, in which progress is seen as flowing steadily downwards (like a waterfall) through the phases of Conception, Initiation, Analysis, Design, Construction, Testing and Maintenance. The waterfall development model originates in the manufacturing and construction industries: highly structured physical environments in which after-the-fact changes are prohibitively costly, if not impossible. Since no formal software development methodologies existed at the time, this hardware-oriented model was simply adapted for software development. The first formal description of the waterfall model is often cited as a 1970 article by Winston W. Royce, though Royce did not use the term "waterfall" in that article; he presented the model as an example of a flawed, non-working process (Royce 1970). That, in fact, is how the term is generally used in writing about software development: to describe a critical view of a commonly used software practice.
  • Over the years I have come to describe Test Driven Development in terms of three simple rules:
    1. You are not allowed to write any production code unless it is to make a failing unit test pass.
    2. You are not allowed to write any more of a unit test than is sufficient to fail; and compilation failures are failures.
    3. You are not allowed to write any more production code than is sufficient to pass the one failing unit test.
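As a hedged illustration of the three rules, here is a minimal Python sketch of one red/green cycle; the `Cart` class and its tests are invented for this example, not taken from the presentation.

```python
import unittest

# Rule 2: write only enough of a test to fail (a missing Cart class
# or method counts as a failure, like a compilation failure would).
class CartTest(unittest.TestCase):
    def test_new_cart_is_empty(self):
        self.assertEqual(Cart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = Cart()
        cart.add(price=5)
        cart.add(price=7)
        self.assertEqual(cart.total(), 12)

# Rules 1 and 3: write only enough production code to make the one
# failing test pass, then go back to writing the next test.
class Cart:
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)
```

Run with `python -m unittest`; in a real TDD session the test class and the production class would grow in tiny alternating steps rather than appearing fully formed.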
  • Behaviour-Driven Development (BDD) is an evolution in the thinking behind TestDrivenDevelopment and AcceptanceTestDrivenPlanning. It brings together strands from TestDrivenDevelopment and DomainDrivenDesign into an integrated whole, making the relationship between these two powerful approaches to software development more evident. It aims to help focus development on the delivery of prioritised, verifiable business value by providing a common vocabulary (also referred to as a UbiquitousLanguage) that spans the divide between Business and Technology. It presents a framework of activity based on three core principles:
    1. Business and Technology should refer to the same system in the same way (ItsAllBehaviour).
    2. Any system should have an identified, verifiable value to the business (WheresTheBusinessValue).
    3. Up-front analysis, design and planning all have a diminishing return (EnoughIsEnough).
    BDD relies on the use of a very specific (and small) vocabulary to minimise miscommunication and to ensure that everyone (the business, developers, testers, analysts and managers) is not only on the same page but using the same words.
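To make the shared-vocabulary idea concrete, here is a small hypothetical BDD-style test in plain Python (pytest naming conventions); the `Account` class and its behaviour are illustrative assumptions, not part of the BDD definition above.

```python
# The domain class, named in the business's own vocabulary.
class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

# The test name reads as a behaviour the business can verify,
# and the body follows the Given / When / Then shape.
def test_account_holder_can_withdraw_within_the_balance():
    # Given an account holding 100
    account = Account(balance=100)
    # When the holder withdraws 30
    account.withdraw(30)
    # Then the balance is 70
    assert account.balance == 70
```

The point is less the tooling than the language: the test name and comments use the same words the business would use to describe the behaviour.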
  • Feature Driven Development (FDD) is an iterative and incremental software development process. It is one of a number of Agile methods for developing software and forms part of the Agile Alliance. FDD blends a number of industry-recognized best practices into a cohesive whole. These practices are all driven from a client-valued functionality (feature) perspective. Its main purpose is to deliver tangible, working software repeatedly in a timely manner.
    FDD was initially devised by Jeff De Luca to meet the specific needs of a 15-month, 50-person software development project at a large Singapore bank in 1997. Jeff De Luca delivered a set of five processes that covered the development of an overall model and the listing, planning, design and building of features. The first process is heavily influenced by Peter Coad's approach to object modeling. The second process incorporates Peter Coad's ideas of using a feature list to manage functional requirements and development tasks. The other processes and the blending of the processes into a cohesive whole are a result of Jeff De Luca's experience. Since its successful use on the Singapore project there have been several implementations of FDD.
    The description of FDD was first introduced in Chapter 6 of the book Java Modeling in Color with UML by Peter Coad, Eric Lefebvre and Jeff De Luca in 1999. In Stephen Palmer and Mac Felsing's book A Practical Guide to Feature-Driven Development (published in 2002), a more general description of FDD, decoupled from Java modeling in color, is given.
    The original and latest FDD processes can be found on Jeff De Luca's website under the 'Article' area. There is also a community website where people can learn more about FDD, ask questions, and discuss experiences and the processes themselves.
  • Acceptance Test Driven Development (ATDD) is a practice in which the whole team collaboratively discusses acceptance criteria, with examples, and then distills them into a set of concrete acceptance tests before development begins. It’s the best way I know to ensure that we all have the same shared understanding of what it is we’re actually building. It’s also the best way I know to ensure we have a shared definition of Done.
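One distilled acceptance test might look like the following Python sketch; the wishlist feature, the `Wishlist` class, and the criterion text are all hypothetical, chosen only to show the shape of a concrete, executable acceptance test agreed before coding starts.

```python
# Acceptance criterion (agreed by the whole team before development):
#   Given a user with an empty wishlist
#   When she adds a book to the wishlist
#   Then the wishlist contains that book

class Wishlist:
    def __init__(self):
        self._items = []

    def add(self, item):
        self._items.append(item)

    def contains(self, item):
        return item in self._items

def test_user_can_add_a_book_to_her_wishlist():
    wishlist = Wishlist()                                   # Given
    wishlist.add("Agile Software Development")              # When
    assert wishlist.contains("Agile Software Development")  # Then
```

When such a test passes, the team has an executable, shared definition of Done for that story.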
  • Origins in XP
  • Negotiable... and Negotiated. A good story is negotiable. It is not an explicit contract for features; rather, details will be co-created by the customer and programmer during development. A good story captures the essence, not the details. Over time, the card may acquire notes, test ideas, and so on, but we don't need these to prioritize or schedule stories.
  • Valuable. A story needs to be valuable. We don't care about value to just anybody; it needs to be valuable to the customer. Developers may have (legitimate) concerns, but these should be framed in a way that makes the customer perceive them as important. This is especially an issue when splitting stories. Think of a whole story as a multi-layer cake, e.g., a network layer, a persistence layer, a logic layer, and a presentation layer. When we split a story, we're serving up only part of that cake. We want to give the customer the essence of the whole cake, and the best way is to slice vertically through the layers. Developers often have an inclination to work on only one layer at a time (and get it "right"); but a full database layer (for example) has little value to the customer if there's no presentation layer. Making each slice valuable to the customer supports XP's pay-as-you-go attitude toward infrastructure.
  • Estimable. A good story can be estimated. We don't need an exact estimate, but just enough to help the customer rank and schedule the story's implementation. Being estimable is partly a function of being negotiated, as it's hard to estimate a story we don't understand. It is also a function of size: bigger stories are harder to estimate. Finally, it's a function of the team: what's easy to estimate will vary depending on the team's experience. (Sometimes a team may have to split a story into a time-boxed "spike" that will give the team enough information to make a decent estimate, and the rest of the story that will actually implement the desired feature.)
  • Small. Good stories tend to be small. Stories typically represent at most a few person-weeks worth of work. (Some teams restrict them to a few person-days of work.) Above this size, it seems to be too hard to know what's in the story's scope. Saying "it would take me more than a month" often implicitly adds "as I don't understand what-all it would entail." Smaller stories tend to get more accurate estimates. Story descriptions can be small too (and putting them on an index card helps make that happen). Alistair Cockburn described the cards as tokens promising a future conversation. Remember, the details can be elaborated through conversations with the customer.
  • Testable. A good story is testable. Writing a story card carries an implicit promise: "I understand what I want well enough that I could write a test for it." Several teams have reported that by requiring customer tests before implementing a story, the team is more productive. "Testability" has always been a characteristic of good requirements; actually writing the tests early helps us know whether this goal is met. If a customer doesn't know how to test something, this may indicate that the story isn't clear enough, or that it doesn't reflect something valuable to them, or that the customer just needs help in testing. A team can treat non-functional requirements (such as performance and usability) as things that need to be tested. Figuring out how to operationalize these tests will help the team learn the true needs.
  • Transcript

    • 1. Agile in a Nutshell Steven “Doc” List
    • 2. Agile in a Nutshell Steven “Doc” List
    • 3. What’s coming? Seven sections, Pecha Kucha / Ignite style: 20 slides, 15 seconds each = 5:00; 7 * 5:00 (300 seconds) = 2100 seconds = 35:00. Plus short breaks (I’ve GOT to breathe!) Q&A (or get a break)
    • 4. What’s coming? Seven sections, Pecha Kucha / Ignite style: 20 slides, 15 seconds each = 5:00; 7 * 5:00 (300 seconds) = 2100 seconds = 35:00. Plus short breaks (I’ve GOT to breathe!) Q&A (or get a break)
    • 5. The Seven Sections
    • 6. The Seven Sections. Memorize this because there WILL be a test! 1. History, 2. Principles, 3. Players, 4. Lifecycle, 5. Roles & People, 6. Practices, 7. User Stories and more
    • 7. Fasten your seatbelts!
    • 8. 1. History
    • 9. What is Waterfall? 1:
    • 10. What is Waterfall? Dr. Winston Royce, 1970 1:
    • 11. Project Plans 1:
    • 12. BDUFBig Design Up Front 1:
    • 13. Silos 1:
    • 14. Isolation 1:
    • 15. Reaction, Revolution 1:
    • 16. Reaction, Revolution 1:
    • 17. Lightweight 1:
    • 18. Lightweight 1:
    • 19. Collab oration 1:
    • 20. Transparency 1:
    • 21. Time toValue 1:
    • 22. Schools of Thought 1:
    • 23. Scrum 1:
    • 24. ExtremeProgramming (XP) 1:
    • 25. Snowbird Feb 2001 1:
    • 26. 1:
    • 27. 1:
    • 28. 1:
    • 29. 1:
    • 30. 1:
    • 31. Is this enough? 1:
    • 32. Breathe! Take a drink of water.
    • 33. 2. Principles
    • 34. 2:
    • 35. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software. 2:
    • 36. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage. 2:
    • 37. Responding to Change over Following a Plan. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage. 2:
    • 38. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale. 2:
    • 39. Working Software over Comprehensive Documentation. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale. 2:
    • 40. Iterative Development. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale. 2:
    • 41. Business people and developers must work together daily throughout the project. 2:
    • 42. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done. 2:
    • 43. Individuals and Interactions over Processes and Tools. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done. 2:
    • 44. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation. 2:
    • 45. Co-location, Daily Stand-Up, Retrospectives. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation. 2:
    • 46. Working software is the primary measure of progress. 2:
    • 47. Working software is the primary measure of progress. 2:
    • 48. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely. 2:
    • 49. Sustainable Pace. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely. 2:
    • 50. Simple Design. Continuous attention to technical excellence and good design enhances agility. 2:
    • 51. Simple Design. Simplicity--the art of maximizing the amount of work not done--is essential. 2:
    • 52. The best architectures, requirements, and designs emerge from self-organizing teams. 2:
    • 53. Emergent Design, Evolutionary Architecture. The best architectures, requirements, and designs emerge from self-organizing teams. 2:
    • 54. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly. 2:
    • 55. Retrospectives. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly. 2:
    • 56. Breathe! Take a drink of water.
    • 57. 3. Players
    • 58. Ken Schwaber Scrum 3: Players
    • 59. Ken Schwaber Scrum Jeff Sutherland 3: Players
    • 60. Kent Beck Extreme Programming 3: Players
    • 61. Kent Beck Extreme Programming Ron Jeffries 3: Players
    • 62. Kent Beck Ron Jeffries Extreme Programming Ward Cunningham 3: Players
    • 63. Kent Beck Ron Jeffries Extreme Programming Martin Fowler Ward Cunningham 3: Players
    • 64. Agile Alliance “Agile Retrospectives” Diana Larsen 3: Players
    • 65. Agile Alliance “Agile Retrospectives” Esther Derby Diana Larsen 3: Players
    • 66. Alistair Cockburn Crystal Methodology“Agile Software Development” 3: Players
    • 67. Robert C. “Uncle Bob” Martin Software Craftsmanship “Clean Code” 3: Players
    • 68. James Grenning Planning Poker 3: Players
    • 69. James Grenning Planning Poker 3: Players
    • 70. Elisabeth Hendrickson Test Obsessed 3: Players
    • 71. Adaptive Leadership “Agile Project Management” Jim Highsmith 3: Players
    • 72. Lisa Crispin Agile Testing “Agile Testing” 3: Players
    • 73. Mary Poppendieck Lean Software Development “Lean Software Development” 3: Players
    • 74. Take a breath.
    • 75. 4. Lifecycle: Inception, Iteration 0, Build, Design & Operations / BAU
    • 76. Let’s talk about the iterative model 4:
    • 77. Let’s talk about the iterative model Lather, Rinse, Repeat 4:
    • 78. Inception, Iteration 0, Build, Design & Operations / BAU 4:
    • 79. Inception: Begin at the beginning 4:
    • 80. Inception: Who, What, Why, Value 4:
    • 81. Inception: Workshops, User Story (Epic) Writing, Team Norms, Plan Release 1, Plan Iteration 1 4:
    • 82. Iteration 0: Laying the foundation 4:
    • 83. Iteration 0: Software, Hardware, Furniture, Network, Infrastructure 4:
    • 84. Build: Iterate...ate...ate 4:
    • 85. Build: Project 4:
    • 86. Build: Project = Release, Release, Release 4:
    • 87. Build: Project = Release, Release, Release; each Release = Features, MVPs, Other 4:
    • 88. Build: Release 4:
    • 89. Build: Release = Plan, Iterations, Retrospective 4:
    • 90. Build: Iteration = Plan 4:
    • 91. Build: Iteration = Plan, Analyze, Design, Code, Test, Deploy, Showcase, Retrospective 4:
    • 92. Planning: Relative 4:
    • 93. Planning: Relative, Collaborative 4:
    • 94. Planning: Relative, Collaborative, Iterative 4:
    • 95. Planning: Innovation Games, Planning Poker, Vertical Slice, Story Mapping 4:
    • 96. Planning: Innovation Games, Planning Poker, Vertical Slice, Story Mapping 4:
    • 97. Planning: Innovation Games (Luke Hohmann) 4:
    • 98. Planning: Innovation Games, Planning Poker (James Grenning) 4:
    • 99. Planning: Innovation Games, Planning Poker, Story Mapping (Jeff Patton) 4:
    • 100. Take a sip of water.
    • 101. 5. Roles & People
    • 102. Product Owner /Product Manager 5: Roles &
    • 103. Product Owner /Product Manager 5: Roles &
    • 104. Project Manager 5: Roles &
    • 105. Project Manager 5: Roles &
    • 106. Iteration Manager / ScrumMaster 5: Roles &
    • 107. Iteration Manager / ScrumMaster 5: Roles &
    • 108. Business Analyst 5: Roles &
    • 109. User Experience Analyst / Designer 5: Roles &
    • 110. Developer 5: Roles &
    • 111. Developer 5: Roles &
    • 112. Architect / Technical Lead 5: Roles &
    • 113. Tester / Quality Analyst 5: Roles &
    • 114. QA / Test Lead 5: Roles &
    • 115. BA Lead 5: Roles &
    • 116. Scaling Projects 5: Roles &
    • 117. Scaling Projects 5: Roles &
    • 118. Scrum of Scrums 5: Roles &
    • 119. Huddles 5: Roles &
    • 120. Inclusivity 5: Roles &
    • 121. Shared... Responsibility,Accountability, Success 5: Roles &
    • 122. Shared... Responsibility,Accountability, Success 5: Roles &
    • 123. Shared... Responsibility,Accountability, Success 5: Roles &
    • 124. Shared... Responsibility,Accountability, Success 5: Roles &
    • 125. Trust 5: Roles &
    • 126. Trust 5: Roles &
    • 127. Courage 5: Roles &
    • 128. Drink, Breathe, Pause
    • 129. 6. Practices
    • 130. Co-location 6: Practices
    • 131. Co-location 6: Practices
    • 132. Pairing 6: Practices
    • 133. Showcase / Demo 6: Practices
    • 134. Showcase / Demo 6: Practices
    • 135. Refactoring 6: Practices
    • 136. Card Wall 6: Practices
    • 137. Ready In Test Done Card Wall 6: Practices
    • 138. Daily Stand-Up 6: Practices
    • 139. Daily Stand-Up 6: Practices
    • 140. Retrospective 6: Practices
    • 141. Retrospective 6: Practices
    • 142. Big Visible Charts 6: Practices
    • 143. User Story: As a... I need... So that... 6: Practices
    • 144. Acceptance Criteria: I will know this is complete when... Given... When... Then... 6: Practices
    • 145. Test-First 6: Practices
    • 146. TDD 6: Practices
    • 147. BDD 6: Practices
    • 148. FDD 6: Practices
    • 149. DDD (Eric Evans) 6: Practices
    • 150. DDDD 6: Practices
    • 151. ATDD 6: Practices
    • 152. Collaborative Estimation 6: Practices
    • 153. Collaborative Planning 6: Practices
    • 154. Iterations / Flow 6: Practices
    • 155. Drink, Breathe, Pause
    • 156. Drink, Breathe, Pause
    • 157. 7: User Stories
    • 158. Co-location 7: User Stories
    • 159. Co-location 7: User Stories
    • 160. The User Story Template Role: As a... Goal: I want/need... Value: So that... 7: User Stories
    • 161. Role: As a... 7: User Stories
    • 162. Goal: I want/need... 7: User Stories
    • 163. Value: So that... 7: User Stories
    • 164. Acceptance Criteria. High Level: I will know this is complete when... Detailed... 7: User Stories
    • 165. Given...When...Then... Given: The context and setup 7: User Stories
    • 166. Given...When...Then... When: The action 7: User Stories
    • 167. Given...When...Then... Then: Expected results 7: User Stories
    • 168. Hierarchy of Detail. Themes: Planning 7: User Stories
    • 169. Hierarchy of Detail. Epics: BIG Stories 7: User Stories
    • 170. Hierarchy of Detail. Stories: INVEST and the Three C’s 7: User Stories
    • 171. INVEST. I: Independent 7: User Stories
    • 172. INVEST. N: Negotiable 7: User Stories
    • 173. INVEST. V: Valuable 7: User Stories
    • 174. INVEST. E: Estimable 7: User Stories
    • 175. INVEST. S: Small 7: User Stories
    • 176. INVEST. T: Testable 7: User Stories
    • 177. The Three C’s: Card, Conversation, Confirmation 7: User Stories
    • 178. So much more! Planning, Releases & Iterations, Big Visible Charts, Information Radiators ... 7: User Stories
    • 179. I’m done!
    • 180. I’m done!
    • 181. Steven “Doc” List Agile CoachDoc@AnotherThought.com www.StevenList.com