An Example of Using Key Performance Indicators for Software Development Process Efficiency Evaluation

Ž. Antolić
R&D Center, Ericsson Nikola Tesla d.d.
Krapinska 45, Zagreb, HR-10000, Croatia
Phone: +385 1 365 4584, Fax: +385 1 365 4082
E-mail: zeljko.antolic@ericsson.com

Abstract - This paper gives an overview of possible Key Performance Indicators (KPI) that can be used for software process efficiency evaluation. The overview is based on currently used KPIs in software development projects on the CPP platform. The most important KPIs are analyzed, and their usage in process efficiency evaluation is discussed. The outcome of the measurement is used to initiate further process adjustments and improvements. In addition, it is possible to perform benchmarking between different development projects and, based on the collected data, to search more easily for best practices in the projects that can be broadly implemented. Some proposals and future directions in the area of process measurement are given.

I. INTRODUCTION

All successful software organizations implement measurement as part of their day-to-day management and technical activities. Measurement provides the objective information they need to make informed decisions that positively impact their business and engineering performance. In successful software organizations, measurement-derived information is treated as an important resource and is made available to decision makers throughout all levels of management.

The way measurement is actually implemented and used in a software organization determines how much value is realized in terms of business and engineering performance. Measurement is most effective when implemented in support of an organization's business and technical objectives, and when integrated with the existing technical and management activities that define a software project. Measurement works best when it provides objective information related to the risks and problems that may impact a project's defined objectives. In other words, measurement works best when it is considered a significant, integral part of project management [1].

Top-performing organizations design their technical and management processes to make use of objective measurement data. Measurement data and associated analysis results support both short and long-term decision making. A mature software development organization typically uses measurement to help plan and evaluate a proposed software project, to objectively track actual performance against planned objectives, to guide software process improvement decisions and investments, and to help assess overall business and technical performance against market-driven requirements. A top-performing organization uses measurement across the entire life cycle of a software project, from inception to retirement. Measurement is implemented as a proactive discipline, and measurement-derived information is considered to be a strategic resource.

Measurement is most important at the project level. Software measurement helps the project manager do a better job. It helps to define and implement more realistic plans, to properly allocate scarce resources to put those plans into place, and to accurately monitor progress and performance against those plans. Software measurement provides the information required to make key project decisions and to take appropriate action. Measurement helps to relate and integrate the information derived from other project and technical management disciplines. In effect, it allows the software project manager to make decisions using objective information.

In this article, an overview of process measurement in software development projects on the CPP platform is given, and some Key Performance Indicators (KPIs) are discussed. One example of project benchmarking is also presented. At the end, some improvement proposals and directions for further work are given.

II. ISO/IEC 15939 Software Measurement Process

The International Standard ISO/IEC 15939 identifies the activities and tasks that are necessary to successfully identify, define, select, apply, and improve software measurement within an overall project or organizational measurement structure. It also provides definitions for measurement terms commonly used within the software industry [2]. The software measurement process itself is shown in Fig. 1.

Fig. 1. ISO/IEC 15939 Software Measurement Process (diagram: Establish and Sustain Commitment, Plan Measurement, Perform Measurement, and Evaluate Measurement activities, connected by measurement requirements, measurement plans, analysis results, user feedback, new issues, and improvement actions, within the scope of the technical and management processes)
III. CMMI Process Area Measurement and Analysis

According to Capability Maturity Model Integration (CMMI), the purpose of Measurement and Analysis is to develop and sustain a measurement capability that is used to support management information needs [3]. The Measurement and Analysis process area involves the following:
  • Specifying the objectives of measurement and analysis such that they are aligned with identified information needs and objectives;
  • Specifying the measures, data collection and storage mechanisms, analysis techniques, and reporting and feedback mechanisms;
  • Implementing the collection, storage, analysis, and reporting of the data;
  • Providing objective results that can be used in making informed decisions and taking appropriate corrective actions.

The integration of measurement and analysis activities into the processes of the project supports the following:
  • Objective planning and estimating;
  • Tracking actual performance against established plans and objectives;
  • Identifying and resolving process-related issues;
  • Providing a basis for incorporating measurement into additional processes in the future.

The initial focus of measurement activities is at the project level. However, a measurement capability may prove useful for addressing organization-wide information needs. Projects may choose to store project-specific data and results in a project-specific repository. When data are shared more widely across projects, the data may reside in the organization's measurement repository. The Measurement and Analysis context according to the CMMI model is shown in Fig. 2.

Fig. 2. Measurement and Analysis Context (diagram: measurement objectives are aligned with information needs; measures, data collection, and storage procedures are specified; data are collected, stored, and analyzed; measurement results are communicated and stored in the measurement repository)

IV. Data Collection Model

All projects have specific objectives that are typically defined in terms of system capability, resource budgets, milestones, quality, and business or system performance targets. Project success depends largely on how well these objectives are achieved. Project issues are areas of concern that may impact the achievement of a project objective: risks, problems, and lack of information, for example.

Most information needs in a project can be grouped into general areas, called information categories. We can identify seven information categories, which represent key areas of concern for the project manager [4]:
  • Schedule and Progress;
  • Resources and Cost;
  • Product Size and Stability;
  • Product Quality;
  • Process Performance;
  • Technology Effectiveness;
  • Customer Satisfaction.

The information about project performance is collected in cycles. The typical data collection cycle is four weeks. Based on the achieved results, the product supplier (software development project) performs analysis and defines an operational excellence action plan within a two-week time frame. When actions are established, the measurement results and operational excellence action plans are ready for presentation at the Operating Steering Group (OSG) for the project. Results from all projects and OSG meetings are input for the R&D Center Steering Group meeting, organized each quarter [5]. The data collection process is shown in Fig. 3.

Fig. 3. Overview of data collection process (timeline within a quarter: 4 weeks of supplier KPI data collection over a 12-month data window, scorecards created at the data collection cut-off date, 2 weeks for supplier OE action planning, supplier OSG meetings, and the quarterly R&D Supplier Steering Group meeting)

The measurement of project performance gives the organization increased opportunities to improve and share good practices, and increases the possibility of reaching the wanted operational efficiency. Responsibility for the measurement activities is at the corporate R&D level. This responsibility covers forums, processes, tools, definitions of metrics, collection, analysis, and reporting at the corporate level.
V. KPI Definitions for CPP Development Projects

In order to follow the performance of CPP software development projects, we have defined a set of KPIs. Some KPIs are applicable to early project development phases, some to the complete life cycle, and some only to product maintenance. The set of KPIs, their desired behavior, and their applicability are shown in Fig. 4 [6].

For each KPI we have defined:
  • Description;
  • Result format;
  • Formula;
  • Frequency.

A. Schedule Adherence

Definition:
Measures timeliness and 'quality' of deliveries relative to the baseline schedule and acceptance criteria. Based on the percentage deviation between planned and actual lead times [7].

Result format:
Reported as a percentage; 100% is the highest result.

Formula:
[1 - ABS(ALT - PLT) / PLT] x 100
PLT = Planned Finish Date - Planned Start Date
ALT = Actual Finish Date - Planned Start Date

If no planned start date is specified for intermediate or parallel deliverables, the earliest planned start date (e.g. TG2 or assignment start date) may be used. Planned Start/Finish Dates are replaced with revised dates in the case of Ericsson-caused/mandated CRs.

B. Assignment Content Adherence

Definition:
Measures the supplier's ability to deliver the full assignment scope by the end of the assignment. It is based on the percentage of completed functionality/requirements [7].

Result format:
Reported as a percentage; 100% is the highest result.

Formula:
(No. of Completed Requirements / No. of Committed Requirements) x 100

Requirements are the smallest measurable 'packages' of functionality, e.g. features, documents, or items in a Statement of Compliance, Requirement Specification, Requirement Management Tool, or Implementation Proposal. The Number of Completed Requirements counts the packages of functionality delivered during the entire assignment. The Total Number of Committed Requirements counts the packages of functionality originally planned for the assignment; it may be revised based on Change Request guidelines.

Frequency:
Measured and reported at the end of an assignment.

KPI measurement has to be based on requirements, which are the smallest objects of measurement and easily measurable. For example, content adherence for an assignment with two major deliveries should not be measured at the 'delivery level' but rather at the level of the core functionalities/requirements within each delivery. Assignments where the scope is not frozen at TG2 (Project GO decision) need to handle scope additions through the CR handling guidelines.

Fig. 4. KPI Definitions for CPP projects (matrix mapping each metric to its KPI and desired behaviour: Time / Schedule Adherence, Content / Content Adherence, Cost / Cost Adherence, Quality / Fault Slip Through, Service Levels / TR Closure Rate, Cost of Quality / Cost per TR; each KPI is marked as applicable, not applicable, or decided case by case for the assignment phases Pre-TG2 and RXI, Design and Development (TG2 to PRA), Maintenance Follow-up (PRA to GA), and Maintenance (Post GA))
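As an illustration, the Schedule Adherence and Content Adherence formulas can be sketched in Python. This is a minimal sketch, not code from the paper; the function names and sample dates are invented for the example.

```python
from datetime import date


def schedule_adherence(planned_start: date, planned_finish: date,
                       actual_finish: date) -> float:
    """Schedule Adherence = [1 - ABS(ALT - PLT) / PLT] x 100.

    PLT = planned lead time in days (planned finish - planned start).
    ALT = actual lead time in days (actual finish - planned start).
    """
    plt_days = (planned_finish - planned_start).days
    alt_days = (actual_finish - planned_start).days
    return (1 - abs(alt_days - plt_days) / plt_days) * 100


def content_adherence(completed_reqs: int, committed_reqs: int) -> float:
    """Content Adherence = (completed / committed requirements) x 100."""
    return completed_reqs / committed_reqs * 100
```

For instance, an assignment planned from 2008-01-01 to 2008-07-01 (182-day lead time) that actually finishes on 2008-07-15 deviates by 14 days, giving a Schedule Adherence of roughly 92%; delivering 45 of 50 committed requirements gives a Content Adherence of 90%.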
C. Cost Adherence

Definition:
Measures the supplier's ability to deliver the assignment scope within the agreed/committed cost, including man-hour, lab, and travel costs. Based on the deviation between committed (baseline) and expected (actual + forecast) costs at assignment/deliverable level [7].

Result format:
Reported as a percentage; 100% is the highest result.

Formula:
[1 - (ECost - CCost) / CCost] x 100%

Committed Cost (CCost) is the baseline at assignment start. The contingency value (buffer) should be specified separately, if known. Expected Cost to Complete (ECost) is (actual + forecast) each month:
  • Actual costs incurred so far;
  • Forecast of all remaining Costs to Complete;
  • Forecast of contingency sums (optional).

Delivering an assignment under the committed cost has a neutral impact on the KPI; the aim is to discourage unnecessary use of budgeted hours.

Frequency:
Measured monthly at assignment level, or at the end of each major deliverable.

Costs have to be defined at assignment level (mandatory), and optionally (if possible) at deliverable level, to enable precise change control.

D. Fault Slip Through

Definition:
Measures the supplier's ability to capture faults before making deliveries to I&V:
  • Assuming that the supplier conducts Function Testing (FT);
  • The supplier or an external organization may conduct I&V (Integration and Verification) Testing.

Based on Trouble Report (TR) slippage between the FT and I&V test phases:
  • Assuming that TRs are analyzed to identify 'true' slipped TRs;
  • If TRs are not analyzed, then 0% may not be the expected best result, due to the different scope of FT and I&V testing [7].

Result format:
Reported as a percentage; 0% (no faults slipping through to I&V) is the best result.

Formula:
[1 - FT Faults / All Faults] x 100%

Faults are classified as FT or I&V based on the testing phase, not on who does the testing. All parties conducting the testing need to capture the Function Test and I&V faults, based on the assignment TR Handling guidelines/tools.

Frequency:
Monthly from start to end of I&V (cumulative data collection), or at each 'drop' on completion of the respective I&V.

TRs that do not relate to 'genuine' faults (i.e. cancelled, postponed, duplicated, and rejected TRs) are to be excluded. All 'minor' faults, i.e. faults that do not affect the main operation of the system, are to be excluded.

E. Trouble Report Closure Rate

Definition:
Measures the supplier's ability to answer TRs within the specified goals. It is based on the deviation between the actual TR answering times and the TR goals set by the Assignment Owner [7].

Result format:
Reported as lost days, averaged across TR priority. The lowest result is 0, indicating that the TRs are answered within the goals.

Formula:
NLD / (OTR + NTR)
NLD = number of lost days within the time increment for all open and new TRs
OTR = number of open TRs at the beginning of the time increment
NTR = number of new TRs during the time increment

The TR handling time starts at the point at which the TR enters the supplier organization, and ends at the point at which the TR is answered. The time increment is typically the 12 months preceding the reporting date.

Frequency:
Measurement is done on a monthly basis.

F. Cost per Trouble Report

Definition:
Measures the supplier's efficiency in fixing TRs (answer plus solution), i.e. maintenance costs relative to TRs resolved, in man-hours [7].

Result format:
Reported as man-hours.

Formula:
Cost of Maintenance / Number of TRs Resolved

Cost of Maintenance is the total hours spent on TR handling activities. The Number of TRs Resolved counts TRs that include a fix/solution. The result is expressed as a rolling average over the past 12 months from the current reporting date, across all product areas in maintenance.

Frequency:
Measurement is done on a monthly basis.
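The four remaining KPI formulas can likewise be sketched in Python. This is a minimal sketch under stated assumptions: the function names and sample values are invented, and capping Cost Adherence at 100% is our reading of the rule that under-spending has a neutral impact on the KPI.

```python
def cost_adherence(expected_cost: float, committed_cost: float) -> float:
    """Cost Adherence = [1 - (ECost - CCost) / CCost] x 100%.

    Capped at 100%: delivering under the committed cost has a
    neutral impact on the KPI (our interpretation of the rule).
    """
    result = (1 - (expected_cost - committed_cost) / committed_cost) * 100
    return min(result, 100.0)


def fault_slip_through(ft_faults: int, all_faults: int) -> float:
    """FST = [1 - FT Faults / All Faults] x 100%: the share of
    faults that slipped past Function Test into I&V (0% is best)."""
    return (1 - ft_faults / all_faults) * 100


def tr_closure_rate(lost_days: int, open_trs: int, new_trs: int) -> float:
    """TR Closure Rate = NLD / (OTR + NTR), in lost days per TR."""
    return lost_days / (open_trs + new_trs)


def cost_per_tr(maintenance_hours: float, trs_resolved: int) -> float:
    """Cost per TR = maintenance man-hours / number of TRs resolved."""
    return maintenance_hours / trs_resolved
```

For example, an expected cost of 110 against a committed 100 yields a Cost Adherence of 90%, while finding 68 of 90 faults in FT yields a Fault Slip Through of about 24%.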
VI. Project Benchmarking

The Benchmark Office measures and analyzes development unit performance in order to improve operational efficiency, for example by sharing good practices across the organization. The measurements are focused on the project perspectives of development unit performance, and on making external or internal benchmarking possible. The Benchmark Office supports the development unit steering in analyzing operational and process efficiency and improvements, and is responsible for the process, definitions, and tools, as well as for performing analysis at the corporate level.

One example of CPP project benchmarking is shown in Fig. 5. It can be seen from the figure that most of the KPIs are at the commitment or stretched level, which means the projects have fulfilled their goals.

The project marked D1 is the oldest measured project, and it has the lowest achieved results. The successor projects have performed much better. That was achieved by performing root cause analysis of the KPIs; the analysis resulted in corrective and preventive actions in the next projects, and the positive effect is visible.

The project marked D2 had problems with its budget (visible from the Cost Adherence KPI). Detailed analysis showed that the initial estimations were too optimistic, and the 3rd-party supplier part of the project spent much more than was planned.

The intention of benchmarking is not merely to initiate competition between projects and organizations. The full benefit is achieved when results are deeply analyzed, and preventive and corrective actions are set for ongoing and future projects (learning from experience).

VII. Improvements and Future Directions

The set of KPIs described in this article is the basic set, established 18 months ago. We are today in the position where we have enough measurement results to perform precise analysis. However, this is obviously not a complete list of the KPIs that can be measured in a software development project; many other interesting data can be collected.

Two product life cycle phases are the most important for new, more advanced KPIs and measurements in the future: the verification phase and the maintenance phase.

In the verification phase of the project we have to measure how efficient our verification activities are. It is not enough to measure the number of executed test cases and the pass rate; these measurements do not tell us much about expected product quality. The idea is to establish a fault rate measurement. The fault rate measures how many faults we discover in a certain time interval (typically one week). If the fault rate decreases with time, product quality is improving as the test activities are performed. Additionally, we can set a lower fault rate limit in order to plan how long testing should continue, and to decide when we can stop testing on the assumption that the product has reached the expected quality level.

In the maintenance phase of the life cycle we would like to measure product maintenance cost compared with development effort. At the moment we know the average cost to remove a fault, but based on this measurement alone it is difficult to compare the quality level of two different products. The new KPI can measure the total product maintenance cost in the first year of operation and compare it with the total development cost. The result can be expressed as a percentage of the product development cost. With this measurement we will be able to compare the quality level of different products in the maintenance phase of the life cycle.

Fig. 5. CPP Project Benchmarking (supplier assignment KPI results versus sponsor expectations: development assignments D1 to D6 plotted against performance ranges for Schedule Adherence, Cost Adherence, Fault Slip Through, and Content Adherence; maintenance assignments DFU1 and DFU2 plotted against TR Closure Rate and Cost per TR)
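The two proposed future KPIs can be sketched as follows. This is a minimal illustration with invented sample data; the consecutive-weeks window and the specific threshold values are our assumptions, not definitions from the paper.

```python
def can_stop_testing(weekly_fault_counts: list[int],
                     rate_limit: int, stable_weeks: int = 3) -> bool:
    """Sketch of the proposed stop-test rule: testing may stop once
    the weekly fault rate has stayed at or below the lower fault
    rate limit for several consecutive weeks (window is assumed)."""
    if len(weekly_fault_counts) < stable_weeks:
        return False
    return all(n <= rate_limit for n in weekly_fault_counts[-stable_weeks:])


def maintenance_cost_ratio(first_year_maintenance_cost: float,
                           total_development_cost: float) -> float:
    """Sketch of the proposed maintenance KPI: first-year maintenance
    cost expressed as a percentage of total development cost."""
    return first_year_maintenance_cost / total_development_cost * 100
```

A declining series such as 12, 9, 6, 3, 2, 1 faults per week with a limit of 3 would satisfy the stop rule, while a product with a first-year maintenance cost of 150 man-weeks against 1000 man-weeks of development would score 15%.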
VIII. Conclusion

The measurement method and KPIs described in this article come from the CPP software development projects at Ericsson. We started this process as a measurement program, and have implemented it in all development projects since the end of 2006. The data are collected and analyzed on a monthly basis, and used as input for further improvement activities in the development projects.

The measurement process should be an integral part of the way business is conducted. Data must be provided early enough to allow management to take action. Results must be communicated throughout the organization in a timely manner. Decisions should not wait for perfect data, but should be based on accurate data, supported by risk management and root cause analysis.

Both the measurement process and the specific KPIs should be periodically evaluated and improved. Measurement is an iterative process; the KPIs are refined as information needs change and as the organization implements improvement actions.

In the future, we can expect more demands on product quality, reduced project lead times, and reduced project budgets. The possible answer to these demands is to always have accurate data about project and product performance, together with fast improvement programs and preventive and corrective actions based on the analysis of key performance indicators in the project.

References

[1] J. McGarry, "Measurement Key Concepts and Practices," Practical Software & System Measurement, USA, 2003.
[2] ***, "Systems and Software Engineering - Measurement Process", International Organization for Standardization, Geneva, 2002.
[3] M. B. Chrissis, M. Konrad, S. Shrum, "Capability Maturity Model Integration - Guidelines for Process Integration and Product Improvement", Pearson Education Inc., Boston, 2004.
[4] D. Ishigaki, C. Jones, "Practical Measurement in the Rational Unified Process", Rational Software, USA.
[5] ***, "Data Collection Process 2007 - R&D Consultancy Supplier Benchmark", Internal Ericsson documentation, Stockholm, Sweden, 2007.
[6] Z. Antolic, "CPP KPI Measurements 2008", Internal Ericsson Documentation, Zagreb, Croatia, 2004.
[7] C. Braf, "KPI Definitions and Guidelines", Internal Ericsson Documentation, Stockholm, Sweden.