
Sept 16, 2011 Webcast: Moneyball for Software Teams, Jonathan Alexander


Moneyball for Software Teams?

Date: This event took place live on September 16, 2011

Presented by: Jonathan Alexander

Duration: Approximately 60 minutes.

Cost: Free

The new movie "Moneyball" starring Brad Pitt is about to be released. Based on the bestselling book of the same name, the movie explores the use of "sabermetrics" to build winning baseball teams. In this webcast, Jonathan Alexander, author of Codermetrics, suggests that these same ideas can be applied to software teams. Jonathan will discuss how you can apply similar ideas to improve your software teams, giving examples of specific metrics and techniques to help you identify, analyze, and discuss the successes and failures of your software engineers, and to help make the team more cohesive and productive. If you manage or lead teams of software developers and engineers, and want to make sure that your focus is on success, you can't afford to miss this entertaining and instructive event.

  1. Moneyball for Software Teams?  Presented by Jonathan Alexander  VP Engineering at Vocalocity  Author of Codermetrics (O'Reilly 2011)
  2. The Popularity of Moneyball  Advanced stats used to analyze baseball players and teams  Bill James, the father of sabermetrics, author and consultant  Michael Lewis, author of Moneyball, published 2003  Moneyball starring Brad Pitt, being released Sept. 23rd, 2011  O'Reilly Strata Summit: The Business of Data (NYC Sept. 20-21) has Paul DePodesta, VP of the NY Mets, as a featured speaker about Moneyball
  3. Metrics Have Changed the Game(s)  Scouting  Drafting  Trades  Coaching  Player Development  Salary Arbitration
  4. Principles of Moneyball: Identify Undervalued Assets  Analyze the skills correlated with winning  Discover important skills that the market undervalues  Leverage knowledge to build winning teams
  5. Principles of Moneyball: Challenge Assumptions  Gather situational statistics  Objectively check assumptions against facts  Adjust strategy based on analysis and results
  6. Principles of Moneyball: Study and Learn from Outliers
      anomaly (noun): an incongruity or inconsistency, a deviation from the norm
      outlier (noun): a person or thing that lies outside, a point widely separated from the main cluster

                       Games  At Bats  Hits   Doubles  Triples  HRs  RBI    Avg.  OPS   All Stars
      Piazza '92-'07   1,912  6,911    2,127  344      8        427  1,335  .308  .922  12
      Pudge  '91-'10   2,499  9,468    2,817  565      51       309  1,313  .298  .800  14
  7. Techniques Used For Moneyball  Leverage basic performance statistics: Hits, Runs Batted In (RBI), Earned Run Average (ERA)  Add "situational" statistics gathered by "spotters": Errors, out-of-zone fielding, pressure situations  Develop "advanced" statistics through combinations and formulas: OPS (on-base plus slugging), FIP (fielding independent pitching), BABIP (batting average on balls in play), WAR (wins above replacement)  Analyze statistics to find the best predictors of individual and team success
  8. Moneyball for Software Teams?  Implement new techniques to gather metrics on a wide variety of contributions  Find ways to measure "wins" and "losses"  Analyze how individual contributions and team "chemistry" correlate to wins and losses (examine assumptions, discover patterns)  Use metrics to create focus and help identify opportunities to change, adjust, and improve
  9. The Magic Triangle Challenge  The oft-discussed "triangle": Features-Time-Quality  Is it true? You can't add more work unless you lengthen time or reduce quality

                  Avg. Complexity  Total Complexity  Quality Problems  Release Quality %
      Release 1   1.2              272               86                68%
      Release 2   1.6              248               77                69%
      Release 3   1.5              274               109               60%
      Release 4   2.8              318               69                78%
      Release 5   2.4              347               88                75%
      Release 6   1.4              261               92                65%

      Release Quality % = 100 × (1 - Quality Problems / Total Complexity)
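The Release Quality % figures on slide 9 can be reproduced directly from the table's Total Complexity and Quality Problems columns. A minimal sketch (the function and variable names are illustrative, not from the talk):

```python
# Release Quality % = 100 * (1 - Quality Problems / Total Complexity)
# (name, total complexity, quality problems) taken from the slide's table
releases = [
    ("Release 1", 272, 86),
    ("Release 2", 248, 77),
    ("Release 3", 274, 109),
    ("Release 4", 318, 69),
    ("Release 5", 347, 88),
    ("Release 6", 261, 92),
]

def release_quality(total_complexity, quality_problems):
    """Percentage of delivered complexity not offset by quality problems."""
    return 100 * (1 - quality_problems / total_complexity)

for name, complexity, problems in releases:
    print(f"{name}: {release_quality(complexity, problems):.0f}%")
```

Running this reproduces the slide's column (68%, 69%, 60%, 78%, 75%, 65%), which is the point of the table: quality does not simply fall as complexity rises.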
  10. Questions To Answer  How well do team members handle their core responsibilities? (Examples: Design, Code, Test)  In what ways do team members contribute beyond their core responsibilities? (Examples: Innovate, Take Initiative, Handle Adversity)  How much do team members help others? (Examples: Assist, Mentor, Motivate)  Is the software team succeeding or failing? (Examples: New Users, Production Bugs, Efficiency)
  11. What Are The Roles On Your Teams?  Playmakers and Scorers  Defensive Stoppers  Utility and Role Players  Backups  Motivators  Veterans and Rookies
  12. Example: Skill Metrics
      Points: measures the overall productivity of each coder on assigned tasks. Points = Sum of complexity ratings for all completed tasks
      Utility: measures how many assigned tasks each coder completes. Utility = Number of tasks completed
      Assists: measures the amount of coder interruptions and how much a coder helps others. Assists = Count of times that a coder helps others
      Saves: measures how often a coder helps fix urgent production issues. Saves = Number of severe product issues a coder helps fix
      Tackles: measures how many potential issues a coder handles proactively. Tackles = Number of times a coder takes initiative or innovates
      Turnovers: measures the complexity of assigned tasks that a coder fails to complete. Turnovers = Sum of complexity for all tasks that are not completed
      Errors: measures the magnitude of production issues found in areas of coder responsibility. Errors = Sum of bug severity factored by population affected
      Range: measures how many areas of software a coder works on. Range = Number of areas worked on by a coder
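Several of these skill metrics fall out of ordinary task-tracker data. A minimal sketch computing Points, Utility, Turnovers, and Range from per-task records; the `Task` record shape and field names are assumptions for illustration, not the book's schema:

```python
from collections import namedtuple

# Hypothetical task records, e.g. exported from a project-tracking system.
Task = namedtuple("Task", "coder area complexity completed")

tasks = [
    Task("alice", "billing", 3, True),
    Task("alice", "search",  2, True),
    Task("alice", "search",  5, False),   # assigned but not completed
    Task("bob",   "billing", 1, True),
]

def skill_metrics(tasks, coder):
    done = [t for t in tasks if t.coder == coder and t.completed]
    missed = [t for t in tasks if t.coder == coder and not t.completed]
    return {
        "Points": sum(t.complexity for t in done),       # completed complexity
        "Utility": len(done),                            # tasks completed
        "Turnovers": sum(t.complexity for t in missed),  # uncompleted complexity
        "Range": len({t.area for t in tasks if t.coder == coder}),
    }

print(skill_metrics(tasks, "alice"))
# → {'Points': 5, 'Utility': 2, 'Turnovers': 5, 'Range': 2}
```

Assists, Saves, and Tackles would come from the "spotter" or self-reported counts described later in the talk rather than from the task tracker.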
  13. Example: Response Metrics
      Wins: measures the number of active users added. Wins = Sum (User Activations)
      Losses: measures the number of active users lost. Losses = Sum (User Deactivations)
      Win Rate: determines the average amount of time it takes to get a "win" (new user). Win Rate = Time elapsed / Number of new users
      Loss Rate: determines the average amount of time it takes to accumulate each "loss" (lost user). Loss Rate = Time elapsed / Number of lost users
      Win Percentage: measures the percentage of trials that successfully convert to active users. Win Percentage = (Successful Trials / Trials Completed) × 100
      Gain: measures the number of Wins minus the missed opportunities and Losses. Gain = Wins - ((Trials Completed - Successful Trials) + Losses)
      Penalties Per Win: measures the overall urgency of customer support issues relative to the number of new users. Penalties Per Win = Penalties / Wins
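The response metrics are simple arithmetic over activation, trial, and support counts. A minimal sketch, with illustrative input names and sample numbers (not from the talk):

```python
def response_metrics(activations, deactivations, trials_completed,
                     successful_trials, penalties, days_elapsed):
    """Response metrics as defined on the slide; inputs are period totals."""
    wins, losses = activations, deactivations
    return {
        "Wins": wins,
        "Losses": losses,
        "Win Rate": days_elapsed / wins,        # avg days per new user
        "Loss Rate": days_elapsed / losses,     # avg days per lost user
        "Win Percentage": 100 * successful_trials / trials_completed,
        "Gain": wins - ((trials_completed - successful_trials) + losses),
        "Penalties Per Win": penalties / wins,
    }

# Example month: 40 activations, 5 deactivations, 30 of 50 trials converted.
m = response_metrics(activations=40, deactivations=5, trials_completed=50,
                     successful_trials=30, penalties=8, days_elapsed=30)
print(m["Win Percentage"], m["Gain"])  # → 60.0 15
```

Note that Gain penalizes both lost users and failed trials, so it can be negative even in a period with positive Wins.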
  14. Example: "Advanced" Metrics
      Power: measures the average complexity of the tasks that a coder completes. Power = Points / Utility
      Temperature: measures how "hot" or "cold" a coder is at any given time (start with Temp. 72). Temperature = Previous Temp. × (Current Points / Previous Points)
      O-Impact: "Offensive Impact," summarizing how a coder helps move projects along. O-Impact = Points + Utility + Assists
      D-Impact: "Defensive Impact," summarizing how a coder helps solve issues or avoid larger problems. D-Impact = (Saves + Tackles) × Range
      Plus-Minus: measures the amount of positive contributions versus negative issues for each coder. Plus-Minus = Points - Turnovers - Errors
      Teamwork: establishes a relative rating for team-oriented contributions. Teamwork = Assists + Saves + Range - Turnovers
      Fielding: establishes a relative rating for the range and breadth of work successfully handled. Fielding = (Utility + Range) - (Turnovers + Errors)
      Intensity: establishes a relative rating for heightened productivity and dealing with hot issues. Intensity = Saves + Tackles + (Avg. Temp. - Start Temp.)
      Win Shares: assigns a relative level of credit to each coder for new users. Win Shares = Wins × Influence × Efficiency
      Loss Shares: assigns a relative level of responsibility to each coder for lost users. Loss Shares = Losses × (1.0 - Efficiency)
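Most of the "advanced" metrics are combinations of the skill metrics from the previous slide. A minimal sketch of the directly computable ones; Win Shares, Loss Shares, and Intensity are omitted because they need extra inputs (Influence and Efficiency ratings, Temperature history) that the slide does not define numerically:

```python
def advanced_metrics(points, utility, assists, saves, tackles,
                     turnovers, errors, range_):
    """Combine a coder's skill metrics into the slide's 'advanced' ratings."""
    return {
        "Power": points / utility,                 # avg complexity per completed task
        "O-Impact": points + utility + assists,
        "D-Impact": (saves + tackles) * range_,
        "Plus-Minus": points - turnovers - errors,
        "Teamwork": assists + saves + range_ - turnovers,
        "Fielding": (utility + range_) - (turnovers + errors),
    }

def temperature(prev_temp, current_points, prev_points):
    """Temperature = Previous Temp. × (Current Points / Previous Points).

    Every coder starts at 72; scoring more Points than last period heats
    the rating up, scoring fewer cools it down.
    """
    return prev_temp * (current_points / prev_points)

print(temperature(72, 30, 24))  # a coder heating up → 90.0
```

The multiplicative Temperature update means a streak compounds: two periods of 25% Point growth move a coder from 72 to 112.5.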
  15. Techniques to Gather and Track Metrics  Get data from existing systems: project tracking, bug tracking, customer support  Instrument your software for usage data: new users, lost users, feature usage, measured benefits  Self-reporting or "spotters" for situational data  Create documents or a database for metric storage and tracking
  16. Increase Team Awareness of Skills
  17. Create Focus on Team Results
  18. Identify Key Goals and Accomplishments
      Boost: measures the amount of additional user benefits delivered. Boost = Sum of the percentage of users receiving each benefit
      Acceleration: measures the ratio of user benefits delivered to urgent user issues created. Acceleration = (Boost / Number of Urgent User Issues) × 100
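As defined on the slide, Boost sums the audience reached by each delivered benefit, and Acceleration scales that by the urgent issues created along the way. A minimal sketch with made-up sample numbers:

```python
def boost(benefit_user_percentages):
    """Boost = sum of the percentage of users receiving each delivered benefit."""
    return sum(benefit_user_percentages)

def acceleration(boost_value, urgent_user_issues):
    """Acceleration = (Boost / Number of Urgent User Issues) × 100."""
    return (boost_value / urgent_user_issues) * 100

# Three shipped benefits reaching 40%, 25%, and 10% of users,
# at the cost of 30 urgent user issues in the same period.
b = boost([40, 25, 10])
print(b, acceleration(b, 30))  # → 75 250.0
```

A rising Acceleration means the team is delivering more user benefit per urgent issue it creates, which is the "key goal" framing the slide suggests.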
  19. Steps for Adopting Metrics  1. Find a Sponsor  2. Create a Focus Group  3. Conduct a Trial (restart or stop if the trial fails)  4. Introduce Metrics to the Team  5. Create a Metrics Storage System  6. Establish a Forum for Discourse  7. Expand Metrics Used and Analysis
  20. Places and Times to Use Metrics  Regular Team Meetings (sprint retrospectives)  Project Post-Mortems  Mentoring  Establishing Goals and Rewards  Performance Reviews (validation)  Promotion Consideration
  21. Moneyball Strategies for Building Better Software Teams  Recruit for "Comps": profile your team, identify the roles you need, then recruit  Improve the Farm System: use interns, contract-to-perm, promote from within  Make Trades: re-organize teams internally to fill roles and balance skills  Coach the Skills You Need: focus on those with aptitude, use target metrics
  22. Recruiting Comps

                    Defensive Stopper  Candidate A  Candidate B  Candidate C
      Avg. Points   Medium             High         Medium       Medium
      Avg. Utility  Medium             Medium       Medium       Medium
      Avg. Assists  Medium             Low          High         Medium
      Avg. Errors   Low                Medium       Low          Medium
      Avg. Saves    High               Low          High         Medium
      Avg. Tackles  High               Low          Medium       Low
      Avg. Range    Medium             Low          Medium       Medium

      Defensive Stopper = target profile; Candidate B = best candidate
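The comps comparison can be automated by scoring each candidate's distance from the target profile. The ratings below are from the slide; the scoring rule (counting steps between Low/Medium/High levels) is an assumption for illustration:

```python
LEVELS = {"Low": 0, "Medium": 1, "High": 2}

# Target profile for a "Defensive Stopper" role, from the slide.
target = {"Points": "Medium", "Utility": "Medium", "Assists": "Medium",
          "Errors": "Low", "Saves": "High", "Tackles": "High", "Range": "Medium"}

candidates = {
    "A": {"Points": "High", "Utility": "Medium", "Assists": "Low",
          "Errors": "Medium", "Saves": "Low", "Tackles": "Low", "Range": "Low"},
    "B": {"Points": "Medium", "Utility": "Medium", "Assists": "High",
          "Errors": "Low", "Saves": "High", "Tackles": "Medium", "Range": "Medium"},
    "C": {"Points": "Medium", "Utility": "Medium", "Assists": "Medium",
          "Errors": "Medium", "Saves": "Medium", "Tackles": "Low", "Range": "Medium"},
}

def distance(profile, target):
    """Total level-steps between a candidate's ratings and the target's."""
    return sum(abs(LEVELS[profile[k]] - LEVELS[target[k]]) for k in target)

best = min(candidates, key=lambda name: distance(candidates[name], target))
print(best)  # → B
```

Under this rule Candidate B scores closest to the target, matching the slide's "best candidate" callout; a real comps analysis would likely weight role-critical metrics (Saves, Tackles) more heavily.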
  23. Potentially Undervalued Assets  Defensive Stoppers  Utility Players  Backups
  24. Resources for Further Exploration  Codermetrics: Analytics for Improving Software Teams, 262 pages, released August 2011  In bookstores, Safari Online, or at Codermetrics.Org, the community website: post ideas or stories, share resources (spreadsheets, analysis), ask questions, post events  Follow on Twitter @codermetrics
  25. Special Offer  Visit to purchase your copy of Codermetrics and enter special code 4CAST to save 40% off the print book and 50% off the ebook  Visit to view upcoming webcasts and online events