  1. Managing Quality in your ERP Project: 12 Mistakes to Avoid & Best Practices to Adopt
     Dan Downing, VP Testing Services, Mentora Group (Atlanta • Boston • DC)
     Testing, hosting and managing business applications. August 8, 2006
  2. Objectives
     - Shine the spotlight on key quality mistakes that ERP implementations should avoid
     - Things you will learn:
       - The 12 mistakes to avoid
       - Their tell-tale signs and risks
       - Strategies for mitigating their impact
       - Tools to enable repeatability
  3. ERP: Definition
     - Strict: Enterprise Resource Planning
       - Financials, Manufacturing, HR, Supply Chain, CRM...
       - e.g., Oracle 11i, SAP, PeopleSoft...
     - Looser: any business-critical packaged application on which you run a substantial part of your business
       - e.g., integrated healthcare, student administration, hospital management, customer service, brokerage & trading, staff recruitment & placement...
  4. ERP Challenges
     Four dimensions: Enterprise Scope, Application Configurability, Technology Stack, Vendor Dependency
     - High-intrusion
     - Expensive to implement
     - Unique workflows, interfaces
     - Often coupled with BPR (business process re-engineering)
     - Don't control quality
     - Conflicting maintenance cycles
     - Complex middleware
     - Challenging to tune
  5. Competing Stakeholders & Key Goals
     - Stakeholders: Management, Business, IT, QA, ERP Vendor, ERP Consultant, Hosting Provider
     - Competing goals: competitive advantage, cost-savings, time & money, budget, personal goals
  6. 12 Quality Risks
     - Functional correctness
     - System performance
     - Testing infrastructure
     - Test cases
     - Test data
     - Who does testing?
     - How much time for testing?
     - Enough testing?
     - Measuring quality
     - What tools?
     - Automated vs. manual
     - Sponsorship
  7. Mistake #1: Confusing product quality with implementation quality
     Tell-tale signs
     - Industry analysts still bearish on ERP software quality
     - Software is configured to your workflows, interfaced to your surrounding systems, with your converted data
     Risk
     - Configuration decisions trigger software conflicts that yield incorrect results
     Best practices to mitigate
     - End-to-end workflow functional testing
     - Upgrades: run the same tests on both releases
     Resources
     - Testing team, test case inventory, automated functional testing tools
  8. Mistake #2: Trusting your hosting provider's environment assertions
     Tell-tale signs
     - Hosting provider hosts no/few similar environments (hardware, patch level, disk subsystems)
     - System configuration based on standard vendor recommendations vs. hard statistics
     Risk
     - Under-configured environment
     - Poor peak-load performance (e.g., DB server CPU at 100%)
     Best practices to mitigate
     - Performance-test peak transaction volumes
     Resources
     - Cross-functional performance team, load testing tool, performance expert
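Performance-testing peak transaction volumes starts with sizing the load. A minimal sketch of that arithmetic, using Little's Law (concurrent users = throughput × (response time + think time)); the transaction rate and timings below are hypothetical examples, not figures from the deck:

```python
import math

def required_virtual_users(tx_per_hour: float, avg_response_s: float,
                           think_time_s: float) -> int:
    """Little's Law: N = X * (R + Z), where X is throughput in tx/sec,
    R is average response time, and Z is user think time."""
    tx_per_sec = tx_per_hour / 3600.0
    return math.ceil(tx_per_sec * (avg_response_s + think_time_s))

# Hypothetical peak: 7,200 order entries/hour, 3 s responses, 12 s think time
peak_users = required_virtual_users(7200, 3, 12)
```

Under these assumed numbers, a load test needs roughly 30 concurrent virtual users; a load testing tool then replays scripted transactions at that concurrency against the hosted environment.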
  9. Mistake #3: Failure to secure strong sponsorship for testing
     Tell-tale signs
     - Quality not a corporate priority
     - No quality representation on the leadership team
     - No quality owner
     - No separate testing budget
     Risk
     - Rocky go-live at best, major failure at worst
     Best practices to mitigate
     - Make quality an integral KPI for each major activity of the implementation
     Resources
     - Enroll the VP of Development or Director of QA to develop strategy
  10. Mistake #4: Confusing conference room pilot with test case development
     Tell-tale signs
     - The only test cases are those generated by the conference room pilot team
     - Test cases incomplete: coverage, input data, expected results
     Risk
     - Uneven quality due to inadequate test coverage; testing the process, not the load
     Best practices to mitigate
     - Make complete test cases a deliverable of each functional group
     Resources
     - QA, business users, test management tool or Excel
  11. Mistake #5: Believing that testing can be done by the implementation team
     Tell-tale signs
     - ERP consultant's focus is on the software supporting the business, not on the infrastructure supporting the load
     - No separate testing plan
     - Business configuration team under-resourced, over-stretched
     Risk
     - Uncertain functional quality, poor performance, no quality metrics for decision-making
     Best practices to mitigate
     - Plan and staff testing as a separate subproject
     Resources
     - QA team, business users, testing consultant
  12. Mistake #6: Failure to allocate enough time for testing
     Tell-tale signs
     - Project plan shows testing not starting until after configuration and development are complete
     - Testing resourced with the same people as other activities
     Risk
     - Uncertain quality due to compressed testing time, incomplete coverage, no quality metrics
     Best practices to mitigate
     - Start test planning, tester training, and testing setup early in the project
     Resources
     - QA team, business users, testing consultant
  13. Mistake #7: Failure to make quality visible early on
     Tell-tale signs
     - Quality not a regular leadership topic
     - No/weak quality metrics for each phase
     - No compelling visuals of quality KPIs
     Risk
     - No basis for management decisions grounded in quality; an uninformed go-live decision
     Best practices to mitigate
     - Identify key quality metrics by phase; use graphs to summarize supporting detail (early: risk areas, functional gaps, enhancements; later: test coverage, testing progress, defects)
     Resources
     - Test management tool, Excel
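The per-phase metrics above can be computed straight from a test management tool's export. A minimal sketch of turning raw cycle results into two graphable KPIs (pass rate and open defects by severity); the record shapes here are an assumption for illustration, not any specific tool's format:

```python
from collections import Counter

def quality_snapshot(test_results, defects):
    """Summarize one test cycle for a leadership dashboard.

    test_results: iterable of (test_id, status), status in
                  {'pass', 'fail', 'blocked'}
    defects:      iterable of (defect_id, severity, state), state in
                  {'open', 'closed'}
    """
    results = list(test_results)
    passed = sum(1 for _, status in results if status == "pass")
    open_by_severity = Counter(sev for _, sev, state in defects
                               if state == "open")
    return {
        "executed": len(results),
        "pass_rate": passed / len(results) if results else 0.0,
        "open_defects": dict(open_by_severity),
    }
```

A snapshot like this, taken per phase and charted over time, gives leadership the trend lines the slide calls for instead of raw defect lists.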
  14. Mistake #8: Failure to use the right tools to support testing
     Tell-tale signs
     - Test plan assumes testing is manual
     - No budget for test tools or a testing consultant
     - Few defined test cycles
     - Testing function but not load
     Risk
     - Low test repeatability; error-prone; fewer test cycles; incomplete test coverage
     Best practices to mitigate
     - A strategy that combines automated and manual testing, and a budget to support it
     Resources
     - Functional, load, and test management tools; expert testing consultant to accelerate
     - Pre-defined scripts from value-add vendors
  15. Mistake #9: Believing that automated testing replaces manual testing
     Tell-tale signs
     - Test plan assumes testing is all automated
     - Test plan compresses test time on the expectation that automated testing is faster
     Risk
     - Ineffective and incomplete testing; a test-automation black eye from failed expectations; uncertain quality
     Best practices to mitigate
     - A plan that blends selective automated testing with manual testing, with automated coverage growing over time
     Resources
     - Functional testing tools; expert testing consultant to implement & transfer knowledge
  16. Mistake #10: Failure to secure enough testing infrastructure
     Tell-tale signs
     - Infrastructure plan has no/limited dedicated testing environments
     - Insufficient database instances, disk storage
     - Volatile, non-production test data
     Risk
     - Frustrated test team, low testing productivity; incomplete testing; uncertain quality
     Best practices to mitigate
     - Plan for at least these environments: pilot, user acceptance, QA-1 (current release), QA-2 (new release), training (2)
     Resources
     - QA leadership, infrastructure team, ERP/DB vendor
  17. Mistake #11: Underestimating the effort to create reusable test data
     Tell-tale signs
     - No plan for populating / copying / refreshing test databases
     - Test cases not planned with 'cascading data dependencies' understood
     Risk
     - Frustration; low testing productivity; failure to meet the timeline
     Best practices to mitigate
     - Plan for disk-to-disk test DB backups and refreshes; sequence test cases along macro workflows (e.g., Procure to Pay) and identify data dependencies
     Resources
     - QA leadership, data expert, test management tool
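Sequencing test cases around cascading data dependencies is a topological-sort problem: a test that consumes data must run after the test that produces it. A minimal sketch using Python's standard-library graphlib; the Procure-to-Pay step names are illustrative, not taken from the deck:

```python
from graphlib import TopologicalSorter

def sequence_test_cases(depends_on):
    """Order test cases so every data producer runs before its consumers.

    depends_on maps a test case to the set of test cases whose output
    data it needs. Raises CycleError on circular dependencies.
    """
    return list(TopologicalSorter(depends_on).static_order())

# Illustrative Procure-to-Pay chain: each step consumes the prior step's data
procure_to_pay = {
    "create_purchase_order": {"create_supplier"},
    "receive_goods":         {"create_purchase_order"},
    "match_invoice":         {"receive_goods"},
    "pay_invoice":           {"match_invoice"},
}
run_order = sequence_test_cases(procure_to_pay)
```

Ordering the test inventory this way, combined with disk-to-disk DB refreshes between cycles, is what makes a test data set reusable run after run.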
  18. Mistake #12: Believing that you're done with testing once you go live
     Tell-tale signs
     - No operational maintenance plan
     - No/low consideration for test reusability
     - Not a focus for any one stakeholder
     - "Not today's concern"
     Risk
     - Inability to keep up with patches and upgrades; risk of introducing problems without proper testing
     Best practices to mitigate
     - A growing, reusable automated test inventory
     Resources
     - QA leadership, functional and load testing tools
  19. Summary
     - ERP quality is different
       - Scope, complexity, stakeholders, cost, competing goals
       - Quality is embedded in many activities
       - Don't trust a stakeholder's quality assertions
     - Make it a leadership topic
       - Define quality KPIs for each phase
       - Graph them!
     - Use tools that will enhance testing productivity
       - Planning, environments, and disk are key
       - Automation is not a silver bullet
       - Repeatability yields ROI; you'll leverage this after go-live
       - There is no sure way to test system performance without tools
     - Watch for the tell-tale signs of these common mistakes
       - Leverage best practices, enroll experts to mitigate!
  20. Questions?