Identifying Your Organization's Best Software Practices

The term software development best practice usually refers to a development process or technique that has gained consensus over time as a value-added practice. But who is to say whether something is a best practice or not? This presentation sheds new light on how we should define software development best practices and, more importantly, on how an organization can identify its own best practices and what impact those practices have on improving performance and reducing defects. Learn how to assess your current development processes and techniques and gain insight into your specific best software practices. A best practice is only a "best practice" if it is yielding positive results.

Find out more here: http://www.davidconsultinggroup.com/blogs/harris/?p=770

Transcript

  • 1. Identifying Your Organization's Best Software Practices
    David Herron
    d.herron@davidconsultinggroup.com
  • 2. Topics for Discussion
    – Common best practices
    – What defines a best practice
    – How to identify a best practice
    – Identifying your best practices
  • 3. Are These Best Practices?
    – What do we consider a best practice?
      • Frameworks? CMMI, ITIL, PMI
      • Methodologies? Agile, Lean, Six Sigma
      • Processes? XP, TDD
      • Practices? Estimating, R&I
  • 4. What Defines a Best Practice?
    – Common best practices
    – Criteria for identifying a best practice
    – Characteristics of a best practice
      • Documented
      • Repeatable
      • Transferable
      • Proven performance
      • Provides value
  • 5. Quality Best Practices
    – A quality best practice is a process that achieves the definition of quality.
    – Dr. Tom DeMarco says "a product's quality is a function of how much it changes the world for the better." (DeMarco, T., Management Can Make Quality (Im)possible, Cutter IT Summit, Boston, April 1999)
    – Another definition, coined by Gerald Weinberg: "Quality is value to some person." (Quality Software Management: Systems Thinking)
  • 6. Measures of Quality
    – Defect Removal Efficiency: used to evaluate the effectiveness of development quality activities
    – Defect Density: used to evaluate the overall quality of the developed software
    – Delivered Defects: used to evaluate the quality of the delivered software
    – Test Cases Passed First Time: used to determine the quality of software being tested
    – Inspection Rate by Document: used to determine if inspections positively impact quality
    – Volatility: used to monitor trends in the number of changes per month
  • 7. Typically There is a Measure Missing
    Project        Cost (000's)   Quality (Defects Released)
    PO Special     $500           12
    Vendor Mods    $760           18
    Pricing Adj.   $80            5
    Store Sys.     $990           22
    Which project had the most effective quality practices?
  • 8. Tracking Quality with Size
    Project        Size (Functional Value)   Cost (000's)   Rate ($/FP)   Quality (Defects Released)   Density
    PO Special     250                       $500           $2,000        12                           .048
    Vendor Mods    765                       $760           $993          18                           .023
    Pricing Adj.   100                       $80            $800          5                            .050
    Store Sys.     1498                      $990           $660          22                           .014
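The point of the table above is that cost and defect counts only become comparable once they are normalized by functional size. A minimal sketch in Python of that normalization, using the figures from the slide (the dictionary field names are illustrative):

    # Sketch: normalize cost and defect counts by functional size (function points).
    # Figures are taken from the slide; field names are illustrative.
    projects = [
        {"name": "PO Special",   "fp": 250,  "cost": 500_000, "defects": 12},
        {"name": "Vendor Mods",  "fp": 765,  "cost": 760_000, "defects": 18},
        {"name": "Pricing Adj.", "fp": 100,  "cost": 80_000,  "defects": 5},
        {"name": "Store Sys.",   "fp": 1498, "cost": 990_000, "defects": 22},
    ]

    for p in projects:
        p["cost_per_fp"] = p["cost"] / p["fp"]        # delivery rate, $ per FP
        p["defect_density"] = p["defects"] / p["fp"]  # defects per FP

    best = min(projects, key=lambda p: p["defect_density"])
    print(f"Lowest defect density: {best['name']} ({best['defect_density']:.3f} defects/FP)")

Run against the slide's numbers, Store Sys. comes out best despite having the highest raw cost and defect count, which is exactly the argument the slide is making.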
  • 9. Characteristics of Effective Sizing
    – Meaningful to developer and user
    – Defined (industry recognized)
    – Consistent (methodology)
    – Easy to learn and apply
    – Accurate, statistically based
    – Available when needed (early)
    – Addresses project level information needs
  • 10. Why Function Points?
    – Function Point Analysis is a standardized method for measuring the functionality delivered to an end user.
      • Consistent method
      • Easy to learn
      • Available in the lifecycle when needed
      • Acceptable level of accuracy
      • Meaningful internally and externally
    – Function Point counts have replaced Lines of Code counts as a sizing metric that can be used consistently and with a high degree of accuracy.
  • 11. The Function Point Methodology
    – The software deliverable is sized based upon the functionality delivered.
    – Five key components are identified based on the logical user view:
      • Inputs
      • Outputs
      • Inquiries
      • Data Stores
      • Interface Files
    [Diagram: an application boundary crossed by inputs, outputs, and inquiries, with internal data stores and an interface file shared with another application]
  • 12. Identifying the Functionality
    [Diagram: Purchase Order System example — user inputs (add/change invoices, payments), an interface file of purchase order information from the vendor, a user inquiry on payment status, data stores for payments, invoices, and accounts payable, and an output of paid invoices to the user]
  • 13. Determine the Functional Size
    The FP Lite™ process:
    1. Identify components
    2. Assess complexity
    3. Apply weightings
    4. Compute function points
    Components     Low        Average    High       Total
    Data Stores    ___ x7     3 x10      ___ x15    30
    Interfaces     ___ x5     1 x7       ___ x10    7
    Inputs         ___ x3     3 x4       ___ x6     12
    Outputs        ___ x4     1 x5       ___ x7     5
    Inquiries      ___ x3     1 x4       ___ x6     4
    Function Point Size: 58
    [Diagram: the Purchase Order System example from the previous slide]
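The weighting step is simple arithmetic: the count of components at each complexity level is multiplied by its weight and summed. A minimal sketch in Python, assuming the weights shown on the slide; the example counts reproduce the 58-point total:

    # Sketch of the FP Lite weighting step: component counts at each complexity
    # level multiplied by the slide's weights, then summed.
    WEIGHTS = {  # component: (low, average, high) weights, as shown on the slide
        "Data Stores":     (7, 10, 15),
        "Interface Files": (5, 7, 10),
        "Inputs":          (3, 4, 6),
        "Outputs":         (4, 5, 7),
        "Inquiries":       (3, 4, 6),
    }

    def function_point_size(counts):
        """counts maps component name -> (low_count, avg_count, high_count)."""
        total = 0
        for component, (low, avg, high) in counts.items():
            w_low, w_avg, w_high = WEIGHTS[component]
            total += low * w_low + avg * w_avg + high * w_high
        return total

    # Example from the slide: every component assessed as average complexity.
    example = {
        "Data Stores":     (0, 3, 0),
        "Interface Files": (0, 1, 0),
        "Inputs":          (0, 3, 0),
        "Outputs":         (0, 1, 0),
        "Inquiries":       (0, 1, 0),
    }
    print(function_point_size(example))  # 58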
  • 14. Function Point Quality Measures
    – Defect Density: measures the number of defects identified across one or more phases of the development project lifecycle and compares that value to the total size of the application.
        Number of defects (by phase or in total) / Total number of function points
    – Test Case Coverage: measures the number of test cases that are necessary to adequately support thorough testing of a development project.
        Number of test cases / Number of function points
  • 15. Function Point Quality Measures
    – Reliability: a measure of the number of failures an application experiences relative to its functional size.
        Number of production failures / Total application function points
    – Rate of Growth: growth of an application's functionality over a specified period of time.
        Current number of function points / Original number of function points
    – Stability: used to monitor how effectively an application or enhancement has met the expectations of the user.
        Number of changes / Number of application function points
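Slides 14 and 15 define each measure as a ratio over function points, so they reduce to one-line calculations. A minimal sketch in Python; the function names and example figures below are illustrative, not part of the original material:

    # Sketch: function-point-based quality ratios from slides 14 and 15.
    # All figures in the example calls are hypothetical, for illustration only.
    def defect_density(defects, project_fp):
        return defects / project_fp

    def test_case_coverage(test_cases, project_fp):
        return test_cases / project_fp

    def reliability(production_failures, application_fp):
        return production_failures / application_fp

    def rate_of_growth(current_fp, original_fp):
        return current_fp / original_fp

    def stability(changes, application_fp):
        return changes / application_fp

    print(defect_density(defects=12, project_fp=250))         # 0.048 defects per FP
    print(rate_of_growth(current_fp=1650, original_fp=1500))  # 1.10, i.e. 10% growth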
  • 16. Non-FP Quality Measures
    – Defect Removal Efficiency: tracks the number of defects removed by lifecycle phase.
                            |------ Peer Reviews ------|  |--------- Testing ---------|
                            Reqs.   Design  Code    Unit Test  Sys. Test  UAT     Prod    Total
      Insertion Rate        21      30      35      17         11         3               117
      Defects Found         5       16      27      31         24         12      2       117
      Removal Efficiency    4.3%    13.7%   23.1%   26.5%      20.5%      10.3%   1.7%
      Review Effectiveness: 41.0%           Test Effectiveness: 57.3%
    – Customer Satisfaction: gather information relating to delivery performance, communication, management, solutions, etc., and the level of importance of each.
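Given counts of defects found per phase, the percentages on the slide follow directly. A minimal sketch in Python using the slide's figures; the grouping of phases into review and test follows the slide's Peer Reviews / Testing headings:

    # Sketch: defect removal efficiency by lifecycle phase (figures from the slide).
    phases = ["Reqs.", "Design", "Code", "Unit Test", "Sys. Test", "UAT", "Prod"]
    found  = [5, 16, 27, 31, 24, 12, 2]   # defects found in each phase
    total  = sum(found)                    # 117

    removal_efficiency = [f / total for f in found]
    review_phases = {"Reqs.", "Design", "Code"}
    test_phases   = {"Unit Test", "Sys. Test", "UAT"}

    review_effectiveness = sum(f for p, f in zip(phases, found) if p in review_phases) / total
    test_effectiveness   = sum(f for p, f in zip(phases, found) if p in test_phases) / total

    for p, r in zip(phases, removal_efficiency):
        print(f"{p:10s} {r:6.1%}")
    print(f"Review effectiveness: {review_effectiveness:.1%}")  # 41.0%
    print(f"Test effectiveness:   {test_effectiveness:.1%}")    # 57.3%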
  • 17. Improve Decision Making
    – The organization must put a "stake in the ground" for its current performance level in order to identify opportunities to improve development practices.
    – Performance measures enable the organization to establish reasonable service level measures.
    – The organization must have a baseline to serve as the basis for ongoing improvement strategies focused on optimizing software delivery.
  • 18. A Measurement Baseline Model
    – Quantitative measures (size, effort, duration, cost, quality) measure how you are doing; measured performance yields a baseline of performance.
    – Qualitative measures (process, methods, skills, tools, management) identify what you are doing; capability maturity yields a standard of performance.
  • 19. The Baseline Process
    – Identify the data set (typically project oriented)
    – Collect the data
      • Project measures (e.g., effort, size, cost, duration, defects)
      • Project attributes (e.g., skill levels, tools, process, etc.)
    – Analyze the data
      • Performance comparisons (identification of process strengths and weaknesses)
      • Industry averages and best practices
      • Performance modeling (identify high impact areas)
    – Report the results
  • 20. All Baseline Projects
    Collect quantitative data (size, effort, duration, cost, quality) to obtain measured performance data:
    Project        Start Date   Completion Date   Duration   Effort Hrs   Function Points   Hrs/FP
    A Project A1   4/1/2007     7/8/2007          3.3        1873         90.0              19.3
    B Project B2   3/1/2007     1/31/2008         11.2       3810.24      139.0             27.4
    C Project C3   5/7/2007     7/9/2007          2.1        92.61        28.0              3.3
    D Project D4   2/1/2007     9/17/2007         7.6        2041.2       200.0             10.2
    E Project E5   3/5/2007     11/1/2007         8.0        2041.2       188.0             10.9
    F Project F6   2/12/2007    9/26/2007         7.5        630          132.0             4.8
    G Project G7   2/1/2007     9/13/2007         7.5        6773.76      276.0             24.5
    H Project H8   2/15/2007    7/26/2007         5.4        2402.4       115.0             20.9
    I Project I9   1/1/2007     3/30/2007         2.9        273          58.0              4.7
    J Project J10  7/2/2007     9/27/2007         2.9        235.2        70.0              3.4
    K Project K11  7/1/2007     11/1/2007         4.1        5913.6       202.0             29.3
  • 21. Reporting and Analyzing the Results
    [Chart: software size in function points (0–2200) plotted against rate of delivery in function points per person month (0–36), showing the organizational baseline]
    – Size is expressed in terms of functionality delivered to the user
    – Rate of delivery is a measure of productivity
    – A representative selection of projects is measured
  • 22. Quantitative Performance Evaluation
    Quantitative assessment:
    – Perform functional sizing on all selected projects.
    – Collect data on project level of effort, cost, duration and quality.
    – Calculate productivity rates for each project, including functional size delivered per staff month, cost per functional size, time to market, and defects delivered.
    Results — Baseline Productivity:
      Average Project Size                133
      Average FP/SM                       10.7
      Average Time-To-Market (Months)     6.9
      Average Cost/FP                     $939
      Delivered Defects/FP                0.0301
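A minimal sketch in Python of how the baseline productivity figures could be derived from per-project records; the two project records and the hours-per-staff-month conversion factor below are assumptions for illustration, not the slide's data:

    # Sketch: derive baseline productivity figures from per-project records.
    # The records and the staff-month conversion factor are hypothetical.
    projects = [
        {"fp": 90,  "effort_hours": 1873, "cost": 120_000, "duration_months": 3.3,  "defects": 3},
        {"fp": 139, "effort_hours": 3810, "cost": 260_000, "duration_months": 11.2, "defects": 5},
    ]

    HOURS_PER_STAFF_MONTH = 130  # assumed conversion factor

    def baseline(projects):
        n = len(projects)
        fp_per_sm = [p["fp"] / (p["effort_hours"] / HOURS_PER_STAFF_MONTH) for p in projects]
        return {
            "avg_project_size_fp": sum(p["fp"] for p in projects) / n,
            "avg_fp_per_staff_month": sum(fp_per_sm) / n,
            "avg_time_to_market_months": sum(p["duration_months"] for p in projects) / n,
            "avg_cost_per_fp": sum(p["cost"] / p["fp"] for p in projects) / n,
            "avg_delivered_defects_per_fp": sum(p["defects"] / p["fp"] for p in projects) / n,
        }

    print(baseline(projects))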
  • 23. Project Attributes – Qualitative Data
    – Management: team dynamics, high morale, project tracking, project planning, automation, management skills
    – Definition: clearly stated requirements, formal process, customer involvement, experience levels, business impact
    – Design: formal process, rigorous reviews, design reuse, customer involvement, experienced development staff, automation
    – Build: code reviews, source code tracking, code reuse, data administration, computer availability, experienced staff, automation
    – Test: formal testing methods, test plans, staff testing experience, effective test tools, customer involvement
    – Environment: new technology, automated process, adequate training, organizational dynamics, certification
  • 24. Profile of Current Practices
    [Chart: capability maturity profile of current practices across Management, Requirements, Design, Build, Test, and Environment, comparing the current state against the industry average and best practice]
    – The process modeling tool realigns the collected evaluation data into six categories: Management, Requirements, Design, Build, Test, and Environment.
  • 25. Qualitative Performance Evaluation
    Qualitative assessment:
    – Conduct interviews with members of each project team
    – Collect project profile information
    – Develop performance profiles to display strengths and weaknesses among the selected projects
    Results:
    Project Name           Profile Score  Management  Definition  Design  Build  Test   Environment
    Accounts Payable       55.3           47.73       82.05       50.00   46.15  43.75  50.00
    Priotity One           27.6           50.00       48.72       11.36   38.46  0.00   42.31
    HR Enhancements        32.3           29.55       48.72       0.00    42.31  37.50  42.31
    Client Accounts        29.5           31.82       43.59       0.00    30.77  37.50  42.31
    ABC Release            44.1           31.82       53.85       34.09   38.46  53.13  42.31
    Screen Redesign        17.0           22.73       43.59       0.00    15.38  0.00   30.77
    Customer Web           40.2           45.45       23.08       38.64   53.85  50.00  34.62
    Whole Life             29.2           56.82       28.21       22.73   26.92  18.75  53.85
    Regional - East        22.7           36.36       43.59       0.00    30.77  9.38   30.77
    Regional - West        17.6           43.18       23.08       0.00    26.92  9.38   26.92
    Cashflow               40.6           56.82       71.79       0.00    38.46  43.75  38.46
    Credit Automation      23.5           29.55       48.72       0.00    38.46  6.25   26.92
    NISE                   49.0           38.64       56.41       52.27   30.77  53.13  53.85
    Help Desk Automation   49.3           54.55       74.36       20.45   53.85  50.00  38.46
    Formula One Upgrade    22.8           31.82       38.46       0.00    11.54  25.00  46.15
  • 26. Industry Data Reveals Best Practices
    Research:
    – Measures (software size, level of effort, time to market, delivered defects, cost) yield performance levels
    – Characteristics (skill levels, automation, process, management, user involvement, environment) yield profiles
    Analysis and results:
    – Correlate performance levels to characteristics
    – Substantiate the impact of characteristics
    – Identify best practices
  • 27. Modeled Improvements
    Baseline project profiles (as on slide 25), with baseline productivity:
      Average Project Size                133
      Average FP/SM                       10.7
      Average Time-To-Market (Months)     6.9
      Average Cost/FP                     $939
      Delivered Defects/FP                0.0301
    Process improvements modeled:
    – Peer reviews
    – Requirements management
    – Configuration management
    Performance improvements:
    – Productivity ~ +131%
    – Time to market ~ -49%
    – Defect ratio ~ -75%
    Modeled project profiles after improvement:
    Project Name           Profile Score  Management  Definition  Design  Build  Test   Environment
    Accounts Payable       75.3           61.73       82.05       60.00   60.15  53.75  50.00
    Priotity One           57.6           57.00       55.72       18.36   45.46  22.00  49.31
    HR Enhancements        52.3           32.55       51.72       23.00   42.31  57.50  49.31
    Client Accounts        69.5           53.82       65.59       12.00   50.77  67.50  49.31
    ABC Release            74.1           55.82       69.85       49.09   52.46  63.13  49.31
    Screen Redesign        67.0           43.73       63.59       21.00   36.38  20.00  51.77
    Customer Web           59.2           49.45       27.08       58.64   53.85  54.00  49.62
    Whole Life             50.2           49.82       32.21       27.73   31.92  24.75  53.85
    Regional - East        57.7           59.36       49.59       0.00    30.77  9.38   50.77
    Regional - West        52.6           55.18       30.08       0.00    33.92  19.38  26.92
    Cashflow               67.6           66.82       71.79       0.00    49.46  53.75  49.46
    Credit Automation      60.5           41.55       78.72       0.00    50.46  26.25  46.92
    NISE                   79.0           68.64       76.41       62.27   65.77  53.13  53.85
    Help Desk Automation   79.3           64.55       74.36       47.45   63.85  54.00  58.46
    Formula One Upgrade    52.8           49.82       52.46       0.00    31.54  25.00  56.15
    Productivity after improvement:
      Average Project Size                133
      Average FP/SM                       24.8
      Average Time-To-Market (Months)     3.5
      Average Cost/FP                     $467
      Delivered Defects/FP                0.0075
  • 28. Quantitative and Qualitative Performance Measurement
    – Collection: quantitative data (size, effort, duration, cost, quality) and qualitative data (process, methods, skills, tools, management)
    – Analysis: measured performance and capability maturity
    – Results: performance baseline and opportunities for improvement
    – Action: best practices
  • 29. Summary
    – A best practice delivers measurable value
    – Measurement is key to identifying and ensuring best practice results
    – Use size as a normalizing factor
    – Organizational best practices are identifiable
  • 30. Contact Us
    Email: d.herron@davidconsultinggroup.com
    Phone: 1-610-644-2856, ext. 21
    Web: http://www.davidconsultinggroup.com
    @DavidConsultGrp
    /DavidConsultGrp
    /company/David-Consulting-Group