Agile Experience


Rama Krishna Pulivendla's presentation at the Agile Goa 2007 conference (http://agileindia.org/agilegoa07/index). It highlights Tech Mahindra's experience with Agile.


Transcript

  • 1. Agile Experiences: Case Study by Rama Krishna Pulivendla
  • 2. Agenda
    • Introduction
    • Why we had to go the Agile way?
    • How the issues were addressed?
    • Team background
    • Agile Practices followed
    • Initial Issues
    • How did we overcome them?
    • Current state
    • Release planning
    • Iterative development
    • Daily Stand-ups
    • Retrospectives
    • User Stories
    • Pair Programming
    • Test Driven Development
    • Continuous Integration
  • 3. Why we had to go the Agile way?
    • The customer started seeing changing scenarios, wanted the development team to adapt to the changing requirements, and was not willing to renegotiate the schedule
      • This resulted in ramping up the team at short notice and making the existing/senior team members spend long hours in the office to complete the work
    • Clarity of requirements
      • Initially the customer may not be able to explain his requirements clearly and needs quick prototyping to get the look and feel
        • The final decision may take its own time
          • This affects the schedule and increases cost for the customer, as he has to pay for the delays caused from his end.
    • Time line for delivery was long
    • Sometimes requirements would be overtaken by events
    • Communication flow from the end user to developer
      • Mismatch between user expectations and what has been delivered.
    • Development of other interfacing modules going in parallel
      • The customer may get what he asked for, but it is not useful until all the modules are complete. This adds to the cost of keeping the team intact
    • Post Delivery Defects
    • Initial estimates going wrong and effort increasing
      • Explain the issues to the customer and try to renegotiate the contract
        • Renegotiation may not always be successful
  • 4. How the issues were addressed?
    • Constant communication between the end user and the developer to minimize the communication gaps
      • Clarity of requirements as explained by the customer
      • Requirements as understood by the developer
    • Customer can see the early drops and can suggest enhancements, if required
      • Customer is happy as working software is available
      • Easy to negotiate the contract for the enhancements
    • The priority of functionality is decided by the customer
      • In line with the progress of the other modules
        • If there are issues in other modules, the customer can re-prioritize the work stack.
    • Continuous integration helps in early detection of faults and reduces the post delivery defect density
  • 5. Team Background
    • The PM and developers came from the waterfall model and had never worked with Agile methodologies
    • In house training on Agile methodologies
    • Team was ready to adapt to Agile
    • Support from Senior Management
    • Willing customer
  • 6. Agile Practices followed…
    • Release planning/Hot house
    • Iterative Development
    • Daily stand-ups
    • Retrospectives
    • User Stories
    • Burn charts
    • Customer Collaboration
    • Automated testing
    • Continuous integration
    • Test Driven Development
    • Pair Programming
    • Product and Sprint backlog
    • Value Stream Analysis
  • 7. Initial Issues
    • Releases once in two weeks (six iterations/release)
      • Quickly getting into the waterfall model
    • Communication with the customer
      • Updating progress and what’s coming up in the next delivery; communication between the PM and the developer about assessing progress and the risk of not delivering.
        • Tendency to stay late and complete the work, but this could not be sustained for long.
      • Requirement clarifications – a tendency to assume the requirements were clear and there was no need to talk further with the customer
    • Reluctance for pair programming
      • “I can work better when I am alone”
      • “Actual efforts” more than the “planned efforts”
    • Missing the delivery date because the functionality is not complete
    • Code quality and rework on delivered functionality
    • Not recognizing the importance of Acceptance Test results
  • 8. How did we overcome them?
    • Informal review of work at the end of the day, in addition to the scrum meeting every morning, which is very focused on the activities.
      • The purpose is to encourage team members to do better revised estimates so that the progress is shown correctly on the burn chart.
    • Pair programming between “experts” and “novices” for ramping up the new team members quickly.
      • Recognizing the importance of it and why it is required
      • After the initial resistance, the team realized its importance
    • Constant discussions with the customer (even if the requirements are clear)
      • As soon as a developer picks up a user story, he calls the customer to verify the acceptance criteria
      • Discuss the possible changes to the screen
    • Early demonstrations to ensure that the implementation is in sync with what the customer is expecting.
    • Understanding the importance of AT results
      • A failed AT is given the highest priority to fix, ahead of the functionality currently under development
  • 9. Current State
    • Project rated excellent in all the agile practices
    • Zero post delivery defect density
    • Satisfied customer
    • Helping customer in planning and prioritizing
  • 10. Release Planning
    • Duration: 2 days, every quarter
    • Initiator/Driver: Project Manager
    • Participants:
      • Customer Representative(s)
      • System Owner
      • Project Manager
      • Development team
      • Support representative
      • Testing team representative
    • Inputs:
      • High-level business requirements
      • Current product backlog
      • Velocity from the previous release
    • Activities:
      • Retrospective of the previous release (Day 1)
      • Current status of the various functionality (Day 1)
      • Discussion of new requirements (Day 1)
      • Preparation of index cards for new requirements, with estimates (Day 1)
      • Determining velocity, based on the previous release (Day 2; a small arithmetic sketch follows this slide)
      • Prioritizing the user stories (Day 2)
    • Outputs:
      • Release Plan
      • Prioritized requirements
      • New Product Backlog
      • Updated Wiki page
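The "determining velocity" activity is not spelled out on the slide. Read as a simple average of what the team actually completed per iteration in the previous release, the arithmetic might look like the following sketch (class and parameter names are illustrative, not from the presentation):

    // Hypothetical helper: velocity taken as the average amount of estimated
    // work (hours or points) actually completed per iteration last release.
    public class VelocityCalculator {
        public static double fromPreviousRelease(int completedEstimateTotal,
                                                 int iterationsInRelease) {
            return (double) completedEstimateTotal / iterationsInRelease;
        }
    }

    // Example: 360 estimated hours completed over 6 iterations gives a
    // velocity of 60 hours per iteration to plan the next release against.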
  • 11. Release Planning Game (diagram: Backlog → Prioritizing the Work Stack → Final Work Stack)
  • 12. Iterative development
    • Implementation
      • Developers pick the user stories
        • Only one user story per developer at any point of time
      • Daily stand-up meetings
      • ATs run every night automatically.
      • Test Driven Development (TDD), Pair programming, Continuous testing by testers on the latest build
      • Communication with the customer about progress and requirements
      • 2 weeks duration. Release candidate on second Thursday
      • Testing and sign-off by testing team on Friday
      • Deployment on Monday!
    Iteration Planning Meeting:
    • Duration: 30 minutes, telephone call
    • Initiator/Driver: Project Manager (Tech Mahindra)
    • Participants:
      • Customer Representative
      • Project Manager
    • Inputs:
      • Release Plan
      • Product Backlog
      • Velocity from the previous iteration
    • Outputs:
      • Updated wiki page
      • Velocity set for this iteration
      • User stories identified and prioritized for the iteration
  • 13. Daily Stand-ups
    • Every day at 10AM. Duration 10-15 mins
    • Attended by all developers (unless on leave) and the project manager/scrum master
    • What is discussed?
      • What was done yesterday?
        • Which user story is being worked on, and paired with whom?
        • Acceptance Tests/JUnit tests written for the functionality before coding starts – to be mentioned by both Primary and Secondary
      • What is planned today?
        • Functionality to cover that day
        • Discussions required with Customer (this is to plan the discussions)
      • Issues faced
        • Anything that stopped the developer from completing the previous day’s activities
        • Possible dependencies like clarifications required from customer etc.
      • Pair Programming
        • Who is going to pair with whom?
  • 14. Retrospectives
    • Iteration Retrospective
      • After the completion of iteration
      • Direct participation
        • Developers
        • Testing team
      • Indirect participation (through mail)
        • Customer
    • Release Retrospective
      • During the Release planning meeting
      • Comments mainly from the customer
    • What is discussed?
      • What went well?
      • What could be better?
      • What we learnt?
      • What still puzzles us?
  • 15. User Stories
    • A promise for a conversation with the user/customer
    • Index cards are used to write the user stories. Each card contains the following:
      • User story description
      • Acceptance criteria
      • Estimate for implementing the user story
    • Index cards are stuck on the whiteboard during the iteration with the following information:
      • Developer’s initials
      • Estimated hours of work
      • Hours of work spent so far
      • Remaining (estimated) hours of work – Information used for marking the burn down chart
    • User stories are also mentioned on the WIKI page
      • Status update by developers/testing team is done here.
        • Developer
          • Under development (developer name)
          • Ready for Test (build xxx)
        • Tester
          • Tested (build xxx)
          • Failed, if the acceptance criteria are not met
        • Visible to customer.
    • Burn charts are maintained both for the iteration and at the release level.
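The burn-down value mentioned above can be read as the sum of the remaining (estimated) hours across the cards on the board, recorded each day. A minimal sketch under that assumption (class names are hypothetical):

    import java.util.List;

    // One index card on the board: the developer revises remainingHours daily.
    class StoryCard {
        final String title;
        int remainingHours;

        StoryCard(String title, int remainingHours) {
            this.title = title;
            this.remainingHours = remainingHours;
        }
    }

    class BurnDownChart {
        // The point plotted for today: total remaining estimated hours
        // across all user stories in the iteration (or release).
        static int remainingHours(List<StoryCard> cards) {
            int total = 0;
            for (StoryCard card : cards) {
                total += card.remainingHours;
            }
            return total;
        }
    }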
  • 16. User Stories (Continued…)
    • Splitting the major functionality into smaller user stories
      • STATS Functionality
        • Requirement was to provide various stats for the logs created by the user.
        • The user wants stats to be mailed on the 1st of the month and also stored in the application so that he can refer to them
        • Scheduler and Email stats developed as per the user priority – Iteration 2
        • Headline figures for Roaming stats, Details of stats developed – Iteration 3
        • STATS viewer added on the main page so that the user can check the stats anytime – Iteration 4
        • Further enhancements/changes to the display requested by the user after reviewing the functionality (priority as set by the user – Iteration 5 onwards)
  • 17. Pair Programming
    • Implementation of user stories is done in pairs.
    • No production code is checked in without a pair
    • Primary and Secondary
      • The Primary is the owner of a particular user story
      • The Secondary provides guidance and support and helps with the implementation
      • The Primary for a user story is constant; the Secondary is rotated
    • Two pairing sessions per day
      • The first session’s Primary and Secondary swap roles for the second session:
      • User story 1 – X is primary and Y is secondary
      • User Story 2 – Y is primary and X is secondary
    • A pairing matrix is maintained to see how many times the same pairs have worked together (a small sketch follows this slide)
    • No separate code review process
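The slide does not say how the pairing matrix is kept; one simple possibility is a tally keyed on the pair, so rotation of the Secondary can be steered toward less frequent pairs. A sketch under that assumption (names are illustrative):

    import java.util.HashMap;
    import java.util.Map;

    // Counts how many pairing sessions each pair of developers has shared.
    public class PairingMatrix {
        private final Map<String, Integer> sessions = new HashMap<String, Integer>();

        public void recordSession(String devA, String devB) {
            Integer count = sessions.get(key(devA, devB));
            sessions.put(key(devA, devB), count == null ? 1 : count + 1);
        }

        public int sessionsTogether(String devA, String devB) {
            Integer count = sessions.get(key(devA, devB));
            return count == null ? 0 : count;
        }

        // Order-independent key, so (X, Y) and (Y, X) count as the same pair.
        private String key(String devA, String devB) {
            return devA.compareTo(devB) < 0 ? devA + "/" + devB : devB + "/" + devA;
        }
    }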
  • 18. Charts
  • 19. Test Driven Development
    • The first activity once the developer picks up the user story is writing the Acceptance test and checking in the AT.
    • This is followed by the JUnit tests being written; the code is then written to pass the JUnit tests first and then the Acceptance Tests
    • The Secondary programmer ensures that these are done before the code is developed.
    • This is verified by the “failed” test case appearing in the next day’s automated test run.
    • This is mentioned in the next day’s stand-up meeting by both Primary and Secondary
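As an illustration of the order described on this slide, a minimal JUnit 3 style sketch (class and method names are hypothetical, loosely based on the STATS functionality mentioned on slide 16; the presentation itself shows no code): the test is written and checked in first, fails on the next automated run, and only then is the production code written to make it pass.

    // File: RoamingStatsCalculatorTest.java -- written and checked in first.
    import junit.framework.TestCase;

    public class RoamingStatsCalculatorTest extends TestCase {
        public void testHeadlineFigureCountsOnlyRoamingEntries() {
            RoamingStatsCalculator calc = new RoamingStatsCalculator();
            calc.addLogEntry("ROAMING");
            calc.addLogEntry("LOCAL");
            calc.addLogEntry("ROAMING");
            assertEquals(2, calc.headlineRoamingCount());
        }
    }

    // File: RoamingStatsCalculator.java -- written afterwards, only to make
    // the failing test (and then the acceptance test) pass.
    public class RoamingStatsCalculator {
        private int roamingCount;

        public void addLogEntry(String type) {
            if ("ROAMING".equals(type)) {
                roamingCount++;
            }
        }

        public int headlineRoamingCount() {
            return roamingCount;
        }
    }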
  • 20. Continuous Integration
    • Code is checked in after the functionality identified by the user story has been implemented and after synchronizing with the latest code.
    • Merging is done after manually identifying the code changes to merge.
    • Once checked in, a new build is made and all the JUnit tests are executed to ensure that nothing in the existing functionality is broken
    • One person checks in the code at a time and waits for the build to be made. Others wait till the build is complete before checking in their code.
    • The acceptance test suite is run the next morning to ensure that earlier delivered functionality is not broken.
    • Fixing any failed acceptance test is given the highest priority (over any other activity)
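The slide does not name the build tooling used. As one hedged sketch of how “all the JUnit tests are executed” on every build, a JUnit 3 style aggregate suite that the post-check-in build (and the nightly acceptance run) can invoke in one go (class names are hypothetical, continuing the earlier sketch):

    import junit.framework.Test;
    import junit.framework.TestSuite;

    // Aggregates the project's unit tests so the build made after every
    // check-in can run them all and flag any breakage in existing code.
    public class AllUnitTests {
        public static Test suite() {
            TestSuite suite = new TestSuite("All unit tests");
            suite.addTestSuite(RoamingStatsCalculatorTest.class);
            // further TestCase classes are added as user stories are implemented
            return suite;
        }
    }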
  • 21.
    • Questions?
  • 22.
    • Thank You