New IDC Research on Software Analysis & Measurement

Watch this webinar with Melinda Ballou, a leading analyst with IDC, as she reviews the newly defined market category of Software Quality Analysis and Measurement (SQAM). Hear Melinda discuss the drivers behind increased SQAM spending: competitive pressures that require rapid adaptability while avoiding software failure; complex sourcing environments that include onshore, offshore, and open source options; and economic impacts that drive efficiency and accountability in development.

To view the webinar, visit http://www.castsoftware.com/news-events/event/idc-software-analysis-measurement?gad=ss

  1. Driving Business Adaptability with SQAM: An Emerging Market
  2. Driving Business Adaptability with Software Quality Analysis & Measurement: An Emerging Market
     Melinda Ballou, Program Director, IDC Application Life-Cycle Management & Executive Strategies Service
  3. Agenda
     • Define and Understand SQAM & Trends Driving Adoption
     • Evaluate SQAM Survey Results
     • Key Strategies Moving into 2012/13
     • Questions?
  4. Industry Highlights: Disruptive Trends Driving SQAM Adoption
     • Diverse deployment demands for mobile, cloud, and embedded drive corporate need for architectural impact analysis across the application portfolio; business dynamism is enabled by software quality analysis
     • Organizations re-invest, seeking to do more with fewer resources under financial and staffing constraints, leveraging efficient approaches to restore and sustain high-performing, timely, business-critical software
     • Complex sourcing/off-shoring plus use of open source need strong teaming, effective code management, testing, and metrics enabled by SQAM; services-driven environment (SaaS/cloud, DevOps emergence)
     • Global economic competition and local compliance across geographies demand quality, change and portfolio management, adaptability, and rigor
     • Flexible development paradigms with services creation increasingly drive technology and business collaboration; strong agile emergence
     • Emerging security issues (as driver) and virtualization/cloud (as enabling technology) for SQAM adoption; ad hoc approaches unsustainable
     • End-user experience and business impact challenges of rich Internet, mobile, and embedded, with social media collaboration/community opportunities
  5. SQAM Definition: Evolving Beyond Traditional ASQ
     • Software Quality Analysis and Measurement: software tools that enable organizations to observe, measure, and evaluate software complexity, size, productivity, and risk (including technical & structural quality, non-functional testing)
     • Architectural assessment of design consequences (on software performance, stability, adaptability, and maintainability)
     • Static analysis and dynamic analysis
     • Quality metrics for complexity, size, risk, and productivity to establish baselines and to help judge project progress and resource capabilities (a minimal illustrative sketch of one such metric appears after the transcript)
     • Application portfolio evaluation through understanding the impact of architectural flaws and dependencies
     • In-phase prevention of additional software problems not easily observable through typical ASQ tools
  6. SQAM Share and Forecast Summary
     • Total revenue for SQAM in 2010 = $356.3M
     • 13.6% growth projected for 2011, to reach $406M
     • 2008 = $279.1M; 2009 = $309.5M
     • Expected growth to $714M by 2015
     • CAGR for the forecast period ('11-'15) is 14.9% (a quick arithmetic check of these figures appears after the transcript)
     • Top five vendor 2010 revenues (narrow range): CAST & Coverity @ around $39M; HP @ $38M; Parasoft @ $37M; IBM @ $36.5M
     • SQAM numbers for 2011 currently in process
  7. SQAM Forecast: Comparing the 2007 and 2011 Models
     [Chart: SQAM revenue (millions of $, left axis) and year-over-year growth (%, right axis), 2006-2015, plotting the 2011-2015 forecast and 2011-2015 growth]
  8. “Quality Gap”: High Cost of Failure
     Poor Quality = Increased Business Risk ($$$$$)
     • Lost Revenue
     • Lost Customers
     • Increased Costs
     • Damaged Brand
     • Lost Productivity
     • Lower Profits
  9. IDC SQAM Survey Demographics
     • 200 companies (NA/SA 35.5%, EMEA 37%, Asia 16.5%, CEMA 11%)
     • Majority very large organizations: 5,000-9,999 employees (45.5%); 10,000-29,999 employees (32%); 30,000+ employees (21.5%)
     • IT employees: 100-299 (63.6%); 250-499 (17%); 500+ (19.5%)
     • Revenue: $2B-$3.9B (48%); $4B-$9.9B (29.5%); $10B-$19.9B (9%); $20B+ (12%); around 95% currently using SQAM solutions
     • IT management 29%; IT ops 21%; app owner 20%; software dev 10%
     • Major industries: manufacturing, financial services, etc.
     • Self-described majority directors and managers (76%)
     • Key drivers: complex sourcing, business velocity, compliance, budget
  10. Demographics for User Role
      QS1. Which of the following statements best describes your involvement with software quality analysis and metrics tools used in your organisation?
      • I use software quality analysis and metrics tools: 26.0%
      • I am responsible for business, cost, and vendor management issues related to application failures and IT time to market: 35.5%
      • I influence or am involved in the purchasing process (recommending or sign-off) of software quality analysis and metrics tools: 23.5%
      • I both use and am involved in purchasing of software quality analysis and metrics tools: 15.0%
      • None of the above (neither use nor purchase): 0.0%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  11. Team Distribution Broad Across Areas
      QA7. Mean summary table (including 0): How are your organization's software developers distributed among the following teams? (mean percentages)
      • Architecture: 14.2
      • Requirements: 11.7
      • Development/Engineering/Modeling: 21.5
      • Code analysis and assessment: 7.3
      • Quality Assurance (QA): 12.7
      • Security: 7.1
      • Software Change and Configuration Management: 6.9
      • Release Provisioning and Operations: 5.3
      • Maintenance: 11.5
      • Other: 1.9
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  12. Complexity, Cost & Agility Drive Adoption
      QC1. Mean summary table: How important to your organization are the following factors as drivers in the adoption of software quality analysis tools?
      • Business consequences of poor quality code design (impact of production problems): 2.1
      • Increased costs due to constant application failures: 2.2
      • Improvement in software development decision and planning process: 2.3
      • Lowering of maintenance and performance costs and resource impact (detection and MTTR): 2.3
      • Internal and external customer satisfaction: 2.0
      • Fit to existing systems and standards: 2.3
      • Compliance initiatives (SOX, JSOX, Basel II): 2.5
      • Offshoring/outsourcing oversight and management: 2.9
      • Resource constraints (efficiency, productivity improvement and resource reallocation to innovation): 2.3
      • Security concerns: 2.0
      • Business agility/speed of competitive response/compressed delivery cycle: 2.2
      • Architectural complexity and increased resulting risk: 2.3
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  13. Resource Constraints Amid Complexity Create Challenges
      QC3. Which of the following is the most significant challenge to the quality of your organization's software development today?
      • Complexity: 18.5%
      • Outsourcing: 5.5%
      • Virtualization management: 8.0%
      • Multi-threaded software: 6.5%
      • Internal staffing/resources: 8.5%
      • Financial resources/budget: 19.0%
      • Time to implement/pace of change: 12.0%
      • Project prioritization: 11.5%
      • Poor architecture: 2.0%
      • None (no hurdle): 8.5%
      • Other: 0.0%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  14. Majority Plan SQAM Spending Increase
      Combined totals across questions re: spending plans
      • Increase: 79.5%
      • Decrease: 20.5%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  15. Balanced SQAM Budget Split across Areas
      QC7B. Mean summary table (including 0): How will your organization's budget for software quality analysis and measurement tools be distributed across the following functional areas in 2011? (mean percentages)
      • Architectural Analysis & Risk Evaluation: 20.2
      • Quality Metrics/Measurement: 18.7
      • Application Portfolio Management/Application Portfolio Analysis/Software Dependencies: 20.0
      • Code Analysis (Static & Dynamic)/Transactions/Sizing: 18.9
      • Security: 21.4
      • Other: 0.9
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  16. Context for SQAM Adoption
      QC9. Which of the following tools or approaches does your organization currently employ in reviewing code and uncovering code problems for software as it is designed and developed?
      • Manual review/peer code review: 52.0%
      • Static analysis tools: 29.5%
      • Dynamic analysis tools: 37.0%
      • Application portfolio management tools: 32.5%
      • Architectural design tools: 34.5%
      • Unit testing: 50.5%
      • Functional testing: 54.0%
      • Virtualization for test labs and deployment: 28.5%
      • Other: 1.5%
      • None of the above: 5.0%
      N = 200; multiple responses allowed; does not sum to 100%. Source: Custom Survey for CAST, IDC, December 2010
  17. Positive Perspective on Defects
      QC10. On average, how many architectural and other code problems requiring patches are discovered in the 12-month period following release of the software into production?
      • None: 5.5%
      • 1 to 10: 19.0%
      • 11 to 25: 19.0%
      • 26 to 50: 16.5%
      • 51 to 150: 17.5%
      • 151 to 500: 3.0%
      • More than 500: 1.5%
      • Don't know: 18.0%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  18. Coupled with Increase in Challenges…
      QC11. Over the past 2 years, has the amount of time an average developer in your organization spent doing code analysis increased, decreased, or stayed the same?
      • Significantly increased: 4.0%
      • Increased: 30.5%
      • Remained the same: 49.0%
      • Decreased: 16.0%
      • Significantly decreased: 0.5%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  19. … And High Optimism
      QC12. How confident are you that your organization's current code review process identifies all potentially serious problems?
      • Not at all confident (1): 1.5%
      • 2: 9.5%
      • 3: 34.5%
      • 4: 31.0%
      • Very confident (5): 15.0%
      • Don't know: 8.5%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  20. Confidence Balance
      QC14. How often does your organization's quality analysis and measurement team find problems, complexity and risks that were not found during code review?
      • Never (1): 3.5%
      • 2: 19.5%
      • 3: 42.0%
      • 4: 15.5%
      • All the time (5): 5.5%
      • Don't know: 14.0%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  21. Decrease in Effort
      QC15A. Over the past 2 years, has the amount of time it takes to identify code problems, fix them, rework, and roll out new releases increased, decreased, or stayed the same?
      • Significantly increased: 1.5%
      • Increased: 20.0%
      • Remained the same: 45.0%
      • Decreased: 21.0%
      • Significantly decreased: 5.0%
      • Don't know: 7.5%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  22. More Decreases Expected
      QC15B. In the next 2 years, will the amount of time it takes to identify code problems, fix them, rework, and roll out new releases increase, decrease, or stay the same?
      • Significantly increased: 2.5%
      • Increased: 14.5%
      • Remained the same: 44.5%
      • Decreased: 25.5%
      • Significantly decreased: 6.0%
      • Don't know: 7.0%
      N = 200. Source: Custom Survey for CAST, IDC, December 2010
  23. Context for Purchasing
      QC16. Which of the following individuals in your organization are involved with introducing and investing in software quality analysis and measurement tools and processes?
      • Senior IT executive (CIO, CSO, CTO): 45.5%
      • VPs: 18.0%
      • Director of IT: 60.0%
      • Development Manager: 46.0%
      • Network Manager: 22.0%
      • Security Manager: 26.0%
      • Desktop Manager: 12.0%
      • Architect: 22.5%
      • Other IT Manager: 28.5%
      • Senior non-IT Executives (i.e. CEO, …): 11.0%
      • Line of Business Managers: 12.5%
      • Procurement or Purchasing…: 12.5%
      • Other: 2.0%
      N = 200; multiple responses allowed; does not sum to 100%. Source: Custom Survey for CAST, IDC, December 2010
  24. Survey Summary
      • Survey provides context for current adoption patterns
      • Challenges exposed: complexity, agility, security, and financial constraints all play a role
      • Optimism has increased around defect problems (overly exuberant, or more efficient?)
      • Future plans and overall purchase increases laid out
      • Survey supports findings for SQAM market growth
  25. IDC Survey Calls to Action
      • The challenges of increased complexity and high-end development across diverse platforms increase code problems, raise costs, and drive debilitating consequences from defects pre- and post-deployment
      • Companies must become better educated about the business consequences and labor costs of poor software design, since optimism masks the need for change
      • Organizations should evaluate SQAM tools to supplement traditional ASQ, along with appropriate process and organizational approaches
      • Across industries, poorly designed and problematic software damages brand perception above and beyond individual problems; this demands a response
  26. Goals of Effective IT/Business Alignment
      • Innovation: Maximize Upside Through Technology-Enabled Business Processes (New Business Value)
      • Compliance: Minimize Downside Through Risk Management (Reduced Exposure)
  27. IT and Business Challenges: Silos, Gaps
      • Today's applications are high-visibility and carry a high cost of failure: customer self-serve, supplier/channel, and key internal business applications
      • "Network effect": failure in one leads to other failures
      • The need for SQAM as part of the quality life-cycle is key, since G2000 organizations are split across groups:
        – Business/user stakeholders
        – Architects, designers, and developers
        – QA professionals
        – Operational staff
      • The quality life-cycle must extend across geographies, life-cycle phases, and groups
  28. Summary
      • Coordinate a quality life-cycle approach that targets pragmatic approaches to SQAM from design through deployment to obtain benefits
      • Evaluate your organization's current strategies for design, application portfolio review, effective quality processes, and automated tools adoption
      • Address schisms between business, architects, development, testers, and operations: IT groups and the business must build a common language, common metrics, and common tools and practices that include SQAM
      • Drive toward an effective quality strategy to help cut costs, increase efficiency and business agility, sustain brand, and address competitive challenges
  29. Upcoming Webinar
      Align Vendor SLAs with Long Term Value, with Steve Hall, author of "Managing Global Development Risk" and Partner at ISG (formerly TPI), a leading research, consulting and advisory services firm
      Thursday, February 16th, 11am-12pm EST (9:30pm IST, 5pm CET, 4pm UK/GMT, 8am PST)
      Steve Hall will discuss the challenge of aligning vendor SLAs with long-term business value. He will detail how you can build healthier, more transparent vendor relationships by incorporating application structural quality measurement and practical, meaningful metrics to mitigate risk and maintain value from those relationships. You'll learn how to avoid vendor lock-in, improve production support activities, and align metrics between vendors and project managers.
  30. To view the entire webinar, including Q&A, click here: New IDC Research on Software Analysis and Measurement
      To learn more about CAST, contact Pete Pizzutillo: p.pizzutillo@castsoftware.com
      www.castsoftware.com | blog.castsoftware.com | slideshare.net/castsoftware | Twitter: @OnQuality
  31. Learn more about CAST
      www.castsoftware.com
      blog.castsoftware.com
      www.facebook.com/castonquality
      www.slideshare.net/castsoftware
      www.twitter.com/OnQuality
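
A note on slide 5's "quality metrics for complexity": the sketch below is a generic illustration of the simplest kind of such metric, an approximate cyclomatic-complexity count per function. It is a reader's sketch, not CAST's, IDC's, or any vendor's implementation; every name in it (BRANCH_NODES, complexity, report) is invented for the example, and it uses only Python's standard ast module. Real SQAM tools layer architectural dependency analysis, sizing, and risk scoring on top of simple counts like this.

    # complexity_sketch.py -- illustrative only, not a CAST or IDC tool.
    # Approximates cyclomatic complexity per function in a Python file:
    # 1 (the straight-line path) plus one per branching construct found.
    import ast
    import sys

    # Node types treated as adding a control-flow branch. This is a rough,
    # assumed set; production metrics tools define it far more carefully.
    BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.With,
                    ast.BoolOp, ast.ExceptHandler)

    def complexity(func):
        """1 plus the number of branching constructs inside the function."""
        return 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(func))

    def report(path):
        """Print an approximate complexity score for each function in a file."""
        with open(path, encoding="utf-8") as f:
            tree = ast.parse(f.read())
        for node in ast.walk(tree):
            if isinstance(node, ast.FunctionDef):
                print(f"{node.name}: complexity {complexity(node)}")

    if __name__ == "__main__":
        report(sys.argv[1])  # usage: python complexity_sketch.py some_module.py

A score above a team-chosen threshold (say, 10) would flag a function for review; as the deck argues, the value comes from tracking such baselines over time, not from any single reading.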
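
And on slide 6's forecast figures, a quick arithmetic check (assuming, as a reader's reconstruction, that the 14.9% CAGR compounds from the 2010 base of $356.3M across the five forecast years 2011-2015):

    $356.3M x 1.136     = $404.8M  (the ~$406M projected for 2011)
    $356.3M x (1.149)^5 = $713.5M  (the ~$714M expected by 2015)

Equivalently, CAGR = (714 / 356.3)^(1/5) - 1 = 14.9%, so the slide's growth, forecast, and CAGR figures are mutually consistent under that reading.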
