Software Measurement. Software Economics 2010, Lecture 3: Metrics in Organizations
Mark Kofman Co-founder at  PROGRAMETER Metrics tracking kit for software development Interests:  software quality, metrics, startups, entrepreneurship, distributed and multi-culture teams Contact details: Email:  [email_address] Skype: kofman Twitter: @markkofman
Agenda Applications of software metrics Why is introducing metrics so hard? Lesson from the battlefield Common mistakes Steps to introduce metrics in an organization
Purpose of Metrics in Software Development Quality assurance Code quality, defects, application performance Performance measurement How efficient is the organization, team or individual member Process improvement React faster to changes in the context
Quality Assurance Quality cannot be measured  directly -> derived from other metrics
Quality Assurance One can only fix any two -> you cannot pursue many different qualities without affecting both budget and speed Time Cost Functionality
Quality Assurance What happens if we  overemphasize Quality of code Performance of the software User satisfaction Focus on  one ultimately important quality  and  minimize drawbacks  for others Start-ups : low-cost or development speed Airplanes : bug free Netbooks : small and light
QA : User Satisfaction Surveys User is able to  operate  software  without assistance User  understands  what software does User  likes  software Software works and does the right thing Number of defects  per iteration Number of delivered features
QA : Application Performance Application  average response time  under particular load Throughput  – number of concurrent requests Memory and CPU usage  under particular load Scalability How metrics change if  hardware is improved How metrics change if  software is distributed  over several systems
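A minimal sketch of how these numbers could be collected, assuming a hypothetical HTTP endpoint and a fixed concurrency level (the URL and request counts below are illustrative, not from the lecture):

```python
# Minimal sketch: average response time and throughput for a hypothetical
# endpoint under a fixed level of concurrency.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/health"   # hypothetical endpoint
CONCURRENCY = 20
REQUESTS = 200

def timed_request(_):
    start = time.perf_counter()
    with urllib.request.urlopen(URL) as resp:
        resp.read()
    return time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    latencies = list(pool.map(timed_request, range(REQUESTS)))
elapsed = time.perf_counter() - start

print(f"average response time: {sum(latencies) / len(latencies):.3f} s")
print(f"throughput: {REQUESTS / elapsed:.1f} requests/s")
```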
QA : Code Quality Well documented Comment density  – ratio of comment lines to all lines Well designed Distance from Main Sequence  -> abstractness and instability Depth of inheritance tree, number of children, coupling, complexity,  ... Code duplication number of  very similar code blocks
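A minimal sketch of one of these metrics, comment density, assuming C/Java-style "//" comments (the line classification is deliberately simplistic):

```python
# Minimal sketch: comment density = comment lines / all non-blank lines,
# assuming Java/C-like source with "//" and "/* ... */" comments.
from pathlib import Path

def comment_density(src_dir: str, suffix: str = ".java") -> float:
    comment_lines = 0
    code_lines = 0
    for path in Path(src_dir).rglob(f"*{suffix}"):
        for line in path.read_text(errors="ignore").splitlines():
            stripped = line.strip()
            if not stripped:
                continue
            if stripped.startswith(("//", "/*", "*")):
                comment_lines += 1
            else:
                code_lines += 1
    total = comment_lines + code_lines
    return comment_lines / total if total else 0.0

# Example: print(comment_density("src/main/java"))
```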
QA:  Defect Statistics How many defects have we found? How many defects escaped QA? How many are open or assigned? What is the ratio of defects to tasks? Defect-fixing effort estimation Toolkit: Jira, Mantis, Trac, Bugzilla, Programeter, ...
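A minimal sketch of how such ratios could be computed from an exported issue list; the field names and values below are assumptions for illustration, not a real Jira/Mantis/Trac schema:

```python
# Minimal sketch over a hypothetical issue export: ratio of defects to tasks,
# open defects, and the share of defects that escaped QA (found in production).
issues = [
    {"type": "Bug", "status": "Open", "found_in": "production"},
    {"type": "Bug", "status": "Assigned", "found_in": "qa"},
    {"type": "Task", "status": "Closed", "found_in": None},
    {"type": "Task", "status": "Open", "found_in": None},
]

defects = [i for i in issues if i["type"] == "Bug"]
tasks = [i for i in issues if i["type"] == "Task"]
escaped = [d for d in defects if d["found_in"] == "production"]

print("defect/task ratio:", len(defects) / max(len(tasks), 1))
print("open defects:", sum(d["status"] == "Open" for d in defects))
print("escaped defects:", len(escaped), "of", len(defects))
```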
QA:  Code and Design Improvement Code compliance to standards and conventions Identify design smells Rigidity, fragility, immobility, viscosity Design compliance to principles Open-closed principle, dependency inversion principle, ... Toolkit: Checkstyle, FindBugs, PMD, ckjm, Eclipse Metrics, ...
QA:  Performance Tuning Memory and CPU  usage Performance Average response time, number of concurrent users Analyze  user behavioral patterns Most used methods, execution time Toolkit: YourKit, JProfiler, JMeter, JAMon, ...
QA:  Value Estimation   How much value  does particular feature bring? Is it  worth developing/buying ? Which features bring  the most value ?
Performance Measurement Performance management is a set of activities to ensure that goals are consistently being met in an effective and efficient manner. Focus employees' attention on what matters most to success. “If you don’t measure results, you can’t tell success from failure, and thus you can’t claim or reward success or avoid unintentionally rewarding failure.”
PM:  Focusing on Important Things Use metrics to find out components that Change the most  (e.g. churned  files) Are used the most  (e.g. logs) Drive optimization effort there Design for  easier change Improve  performance  where it is  crucial Toolkit: Programeter, JAMon, FishEye, SvnPlot
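A minimal sketch of finding the most frequently churned files, assuming a Git repository and using only standard `git log` options:

```python
# Minimal sketch (assumes a Git working copy): count how often each file
# changed in the last 90 days to find the most frequently churned components.
import subprocess
from collections import Counter

out = subprocess.run(
    ["git", "log", "--since=90 days ago", "--name-only", "--pretty=format:"],
    capture_output=True, text=True, check=True,
).stdout

churn = Counter(line for line in out.splitlines() if line.strip())
for path, changes in churn.most_common(10):
    print(f"{changes:4d}  {path}")
```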
PM:  Time Control How do I (or my personnel) spend days ? How much time is spent on coding, communication and procrastination? How much is “billable” time? Are there any trends? Toolkit: RescueTime, Toggl
PM:  Burn-down Chart How much work remains Will we hit the deadline? (Chart: points over iteration time – planned, actual, remaining)
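A minimal sketch of the burn-down idea with illustrative numbers: remaining points per day and a naive check against the deadline:

```python
# Minimal sketch with illustrative numbers: remaining points per sprint day
# and a naive projection of whether the team will hit the deadline.
planned_points = 40
completed_per_day = [0, 5, 8, 4, 6]   # hypothetical daily completions
sprint_length_days = 10

remaining = planned_points
burn_down = []
for done in completed_per_day:
    remaining -= done
    burn_down.append(remaining)

days_elapsed = len(completed_per_day)
avg_rate = (planned_points - remaining) / days_elapsed
days_needed = remaining / avg_rate if avg_rate else float("inf")

print("remaining per day:", burn_down)
print("on track:", days_elapsed + days_needed <= sprint_length_days)
```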
PM:  Burn-up Chart How much work is done Scope Deadline estimation (Chart: points over iteration time – completed, remaining)
PM: Team Performance  Developer's performance Velocity  – number of completed features Churned lines of code  – number of added, modified or deleted lines of code Coding efficiency  – ratio of developer's code to churned lines of code Code maturity  – how much of developer's code gets rewritten by peers Toolkit:  Programeter, Scrum Sprint Monitor
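A minimal sketch of the two ratio metrics above, under one possible interpretation (surviving lines of the developer's code over churned lines; peer rewrites reduce maturity) and with placeholder inputs:

```python
# Minimal sketch with placeholder inputs: coding efficiency and code maturity
# as ratios over churned lines of code (one interpretation of the slide).
developer = {
    "churned_loc": 1200,           # lines added, modified or deleted
    "surviving_loc": 800,          # developer's lines still in the codebase
    "rewritten_by_peers_loc": 150, # developer's lines later rewritten by others
}

coding_efficiency = developer["surviving_loc"] / developer["churned_loc"]
code_maturity = 1 - developer["rewritten_by_peers_loc"] / developer["surviving_loc"]

print(f"coding efficiency: {coding_efficiency:.2f}")
print(f"code maturity:     {code_maturity:.2f}")
```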
PM:  Task and Feature Tracking Project estimation Size -> number of specialists, time and budget estimates Project tracking How much is completed, how much is left Are we on time? When we will deliver? Toolkit: MS Excel, MS Project, Gantt charts, Jira, Pivotal Tracker, Trac, Mingle, ...
PM:  Developer Benchmarking
PM:  Expertise Management What tools, technologies and libraries developers use Programming languages Library dependencies  (Ant+Ivy, Maven, ...) Code analysis: which  external components  developer uses in his code Better team formation and expertise sharing Toolkit: ...
Process Improvement Series of actions taken to identify, analyze and improve existing processes within an organization to meet new goals and objectives. Metrics are used to identify bottlenecks
PI:  Velocity How much work can a team complete per iteration (Chart: completed points per iteration)
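A minimal sketch with illustrative numbers: average velocity over recent iterations and a simple forecast for the remaining backlog:

```python
# Minimal sketch with illustrative numbers: velocity as completed points per
# iteration, plus a naive forecast of iterations needed for the remaining backlog.
completed_points = [18, 22, 17, 25, 21]   # last five iterations (hypothetical)
backlog_points = 120

velocity = sum(completed_points) / len(completed_points)
iterations_left = backlog_points / velocity

print(f"average velocity: {velocity:.1f} points/iteration")
print(f"estimated iterations to finish backlog: {iterations_left:.1f}")
```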
PI:  Knowledge Sharing Risky knowledge Percentage of components where developers possess only unique knowledge Pair programming, developer rotation Team friendliness of a developer Ratio of shared knowledge to total knowledge 1 -> all knowledge is shared, 0 -> only unique Indicator of a culture: “my code – don't touch it” Toolkit: Programeter
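A minimal sketch of the friendliness ratio over a hypothetical file-to-authors map (all names and files below are made up):

```python
# Minimal sketch: the share of a developer's components that at least one
# other person also knows (1.0 = everything shared, 0.0 = only unique knowledge).
file_authors = {
    "billing.py": {"anna", "mark"},
    "invoice.py": {"anna"},
    "report.py":  {"anna", "jaan"},
    "export.py":  {"mark"},
}

def friendliness(developer: str) -> float:
    owned = [authors for authors in file_authors.values() if developer in authors]
    if not owned:
        return 0.0
    shared = sum(len(authors) > 1 for authors in owned)
    return shared / len(owned)

print("anna:", friendliness("anna"))   # 2 of 3 files shared
print("mark:", friendliness("mark"))   # 1 of 2 files shared
```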
PI:  Return on Investment (ROI) Attractiveness of an investment, calculated from the financial performance of the invested money: ROI = (V_final − V_init) / V_init, where V_init is the initial value of the investment and V_final its final value Example: bank savings account -> 4% Payback Period – length of time until the investment is returned
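A minimal sketch of these two formulas, with the 4% savings-account example and an illustrative payback calculation:

```python
# Minimal sketch of the formulas on this slide: ROI = (V_final - V_init) / V_init,
# and payback period as the first month in which cumulative returns cover the investment.
def roi(v_init, v_final):
    return (v_final - v_init) / v_init

def payback_period(investment, monthly_returns):
    cumulative = 0.0
    for month, value in enumerate(monthly_returns, start=1):
        cumulative += value
        if cumulative >= investment:
            return month
    return None  # not paid back within the given horizon

print(roi(1000, 1040))                   # 0.04 -> the 4% savings-account example
print(payback_period(3000, [500] * 12))  # 6 months (illustrative numbers)
```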
PI:  Total Cost of Ownership Total cost of procuring, using, managing and disposing of an asset over its useful life Example : TCO of a car much higher than its price! Price Gasoline Technical reviews Insurance …
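A minimal sketch of the car example with illustrative numbers, summing purchase price and recurring costs over an assumed useful life:

```python
# Minimal sketch with illustrative numbers: TCO as purchase price plus all
# recurring costs over the asset's useful life (the car example from this slide).
YEARS = 5
costs = {
    "price": 15000,
    "gasoline_per_year": 1800,
    "technical_reviews_per_year": 200,
    "insurance_per_year": 600,
}

tco = costs["price"] + YEARS * (
    costs["gasoline_per_year"]
    + costs["technical_reviews_per_year"]
    + costs["insurance_per_year"]
)
print(f"5-year TCO: {tco} EUR vs. purchase price {costs['price']} EUR")
```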
Example: E-Print Feature A : support of doc, rtf, docx, txt, odt, pdf documents Development cost:  2.35K EUR Infrastructure cost:  40 EUR per month Feature B : support of pdf documents Development cost:  350 EUR Infrastructure cost:  No additional costs Application is not validated yet -> cut the costs -> implement feature B
Example : E-Print Initial application 3 year revenue:  10K EUR Additional development costing  3K EUR  would increase 3 year revenue by  4K EUR ROI = (4K – 3K) / 3K = 33%
TCO vs ROI  ROI  -> Making go or not go decisions Project drives increase in revenue “ Automation of customer relations” TCO  -> Evaluating possible decisions Build or buy, in-house or outsource, ... Swapping one system with another “ Migrating from one CRM system to another”
One Metric – Different Benefits Code complexity Developer  ←  code quality Complex code is not comprehensible Project manager  ←  testing effort Complex code requires more testing effort Executive  ←  maintenance costs Complex code leads to higher maintenance costs
Metrics at Specialist Level: LOC, WMC, DIT, NOC, RFC, Churned LOC, LCOM, Ce, Ca, Instability, Abstractness, Churn Count, Points, Knowledge Friendliness, Velocity, # Requirements, # Bugs, # Defects, Response Time, Scalability
Specialist Level -> Project Level: specialist-level metrics aggregate into project-level ones, e.g. Project Size from LOC, Points and # Requirements; Design Quality from DIT, RFC, LCOM and Abstractness
Metrics at Project Level: Productivity, Project Size, Design Quality, Architecture Quality, Customer Satisfaction, Knowledge Sharing, Application Performance, Effort, Team Efficiency, Correctness
Project Level -> Organization Level: project-level metrics such as Productivity, Project Size, Effort, Team Efficiency and Design Quality roll up into ROI, TCO and Personnel Qualities
Metrics at Organization Level: ROI, TCO, Personnel Qualities, Information Productivity, Knowledge Capital
Agenda Applications of software metrics Why is introducing metrics so hard? Lesson from the battlefield Common mistakes Steps to introduce metrics in an organization
We don't like to be measured People are afraid that data will be used against them Software developers are no different
Prima donna effect Great developers are a scarce resource In most organizations developers are treated as prima donnas, allowed to dictate development processes
Too Much Attention to Code Metrics In the 19XXs code metrics got too much attention from researchers and university teachers Code metrics are cool, but not enough to give you the full picture of your software project or organization Lines of Code is the most popular metric When a manager hears “software metrics” she often feels that this is a developers' playground
There are no industry standards for software measurement The software industry is still young Historically metrics had a “bad smell” A new trendy process appears every 10 years It was RUP in the 90s, then extreme programming, now Scrum and agile processes If the process changes every 10 years it is hard to talk about standard metrics Open initiative: sdlcmetrics.org, you are welcome to contribute
Agenda Applications of software metrics Why is introducing metrics so hard? Lesson from the battlefield Common mistakes Steps to introduce metrics in an organization
How we helped introduce metrics at “Brown Solutions” Brown Solutions – not a real name An international company, about 1,000 employees, a young company, young managers, success highly dependent on software product quality. Metrics team role Introduce metrics into the organization, automate collection of the metrics, assure the needed data is gathered and is accurate
“Brown Solutions”, part 1 I got a call: “Mark, we need metrics. The CEO has set us a goal to have them ready by the end of this year. Let's meet tomorrow.” We meet. “Do you know which metrics you want to collect?” “No. But we have a list of 30 metrics somebody has gathered.” “Why do you need metrics?” “Mmmm... you know, metrics are important.” “Who will be using the metrics?” “Mmm... I guess everybody.”
“Brown Solutions”, part 2 Now I am also in the “Mmmm” state. What should we do? Where should we start? We start building a prototype based on the list of metrics provided, to gather feedback. A prototype implementing 5 metrics is ready. We go for feedback. “Mmm... I have no time this week. Let's talk next month.” “Wow. It looks cool. Can you also implement this metric for me?”
“Brown Solutions”, part 3 We are still in the “Mmmm” state. What should we do? How should we proceed? We decide to limit the number of metrics to only the high-priority ones. For that purpose we need a clear metrics framework that allows us to prioritize metrics before implementing them. After a few workshops, we have a metrics framework that groups metrics into different categories and asks each metric's author to provide context information for it.
“Brown Solutions”, part 4 It's getting better. Or at least I thought so... The person responsible for metrics implementation at Brown Solutions resigns. The new person has new ideas about which metrics are needed and how they should be introduced. We have to iterate over the metrics framework again. Again the scope changes.
“Brown Solutions”, part 5 Finally we start thinking about automating metrics gathering. We dig into the company's development tools: bug tracking, development planning, code repositories. We need to change fields in the defect tracking system: introduce a new field “found in version” and make a few other fields mandatory. Again resistance. Luckily we have strong support from the management team and we get things done fast.
“Brown Solutions”, part 6 Development teams start building scripts to collect data from the different tools. We face some issues because different teams use the development tools in different ways. Another round of discussions and changes to the development tool usage guidelines. Finally, the automation solution is rolled out and employees can use metrics in their daily work.
“Brown Solutions”, part 7 How to spread the metrics to all employees? We build a wiki page describing each metric in detail, so users can find metric definitions and usage examples at any time. We also set up a Skype chat where users can ask the metrics specialist any metric-related question. From the first days of usage we again receive a new set of metrics people would like to use... so we start iterating.
What did we do wrong?
Agenda Applications of software metrics Why is introducing metrics so hard? Lesson from the battlefield Common mistakes Steps to introduce metrics in an organization
“Great! Let's collect data on everything and then we'll find correlations, meaning and information!”
Start Small! Define a  concrete goal  and start moving towards it using  metrics as indicators Use metrics frameworks to assist you Goal – Question – Metric (GQM) Practical Software Measurement (PSM) ...
Measurement Culture is Missing Don't use  metrics to  punish or reward Don't ignore the data Make decisions and follow through with actions
Hope that there is a single super metric for everything I don't know of one
Forgetting about context There is a difference in what “1 bug” means in different organizations. Not a huge issue for a 10-developer web service team? But what about the manager of a NASA project that failed because of this 1 bug?
Factors Influencing People Productivity People factors: size and expertise of the development organization Problem factors: complexity and number of changes in design constraints and requirements Process factors: analysis and design techniques, languages and CASE tools Product factors: product requirements – reliability, performance, etc. Resource factors: availability of hardware and software resources
“ A  good guideline is that measures of individual productivity give you questions to ask but they don’t give you the answers” Steve McConnell
Agenda Applications of software metrics Why is introducing metrics so hard? Lesson from the battlefield Common mistakes Steps to introduce metrics in an organization
9 steps Identify user roles for metrics Target goals Define questions Select metrics Standardize definitions Requirements for life-cycle tools Automate metrics collection Decision criteria for collected metrics Communication plan
1. Identify user roles for metrics Don't start if you don't know who will be using the metrics Users will be evaluating your work when you deliver the results
2: Target Goals, 3: Define Questions, 4: Select Metrics Goal-Question-Metric (GQM), Victor Basili
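A minimal sketch of a GQM breakdown expressed as plain data; the goal, questions and metrics below are illustrative, not prescribed by GQM itself:

```python
# Minimal sketch of a Goal-Question-Metric breakdown as plain data
# (the goal, questions and metrics are made-up examples).
gqm = {
    "goal": "Reduce the number of defects that escape to production",
    "questions": {
        "How many defects are found after release?": [
            "escaped defects per release",
            "ratio of escaped to total defects",
        ],
        "Where are escaped defects introduced?": [
            "defects per component",
            "churned LOC per component",
        ],
    },
}

for question, metrics in gqm["questions"].items():
    print(question)
    for metric in metrics:
        print("  -", metric)
```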
5. Standardize definitions You are dealing with numbers -> be precise in definitions Misinterpretation can lead to Wrong strategic decisions Somebody getting fired
6. Requirements for Life-Cycle Management Tools Collecting metrics in most cases influences how your company uses application life-cycle management tools Bug tracking, project management, test management, requirements management, build and release management, code repositories Be ready to make significant changes Try to secure the support of the development tools team
7. Automate collection of metrics People don't like routine work – manual measurement is a burden Measurement should become a habit Use tools that simplify or automate data collection Programeter, Cast Software, ... RescueTime, Toggl, ... Pivotal Tracker, Jira, ... Hackystat, Sonar, FishEye, … Memory and CPU usage profilers Automatic load testing tools
8. Decision criteria Answer “what if...” questions … escaped defects per week increased 20% … 0 lines of code were developed … velocity increased 2 times Integrate this into your knowledge base
9. Communication Plan Numbers, if used incorrectly, can damage your organization Be sure to clearly and loudly communicate metrics in your organization Be honest and straightforward Share the information Show the data, tell how it is used, interpret data together, encourage people to use the data Involve people in metrics definition Feeling of ownership
9. Communication Plan Don't use  metrics to  punish or reward Articulate  organizational  goals Let people know your intentions
9. Communication Plan Don't ignore the data Make decisions and follow through with actions Define data items and procedures A shared understanding of what metrics express, and when and how they are collected and reported Understand trends Trends are more important than single data points
Go Back To Step 1 Don't expect to get it right on the first try Start small, but be ready to iterate
Additional Reading Encyclopedia of Software Development Life-Cycle Metrics http://www.sdlcmetrics.org L. Westfall, 12 Steps to Useful Software Metrics http://www.westfallteam.com/Papers/12_steps_paper.pdf SDLC Metrics blog by Programeter http://blog.programeter.com
Call to Action Encyclopedia of Software Development Life-Cycle Metrics  www.sdlcmetrics.org We are looking for contributors Internship position
E-mail:   [email_address] Skype:  kofman
Thank you for your time and attention!
