Ensuring a Quality Order Management Implementation
 


Software testing is a process, or a series of processes, designed to make sure computer code does what it was designed to do and that it does not do anything unintended.
- The program should be predictable and consistent, offering no surprises to users.
- Testing should start with the assumption that the program contains defects; the goal is to find as many defects as possible.

Agenda
- Introduction to Quality and Testing
- Why quality is important to the Sterling Order Management System
- QA in OMS
- Journey to a QA Center of Excellence
- Conclusion

Presentation Transcript

• Ensuring a Quality Order Management Implementation
  Subroto Majumdar, OMS Director, subroto.majumdar@perficient.com
• Agenda
  - Introduction to Quality and Testing
  - Why quality is important to Sterling OMS
  - QA in OMS
  - Journey to a QA Center of Excellence
  - Conclusion
• Introduction
  - Software testing is a process, or a series of processes, designed to make sure computer code does what it was designed to do and that it does not do anything unintended.
  - The program should be predictable and consistent, offering no surprises to users.
  - Testing should start with the assumption that the program contains defects; the goal is to find as many defects as possible.
• Why Is Quality Important for Sterling OMS?
  - OMS is a major enterprise initiative.
  - QA: airplanes vs. an order management implementation
    - Hopefully, no one gets hurt.
    - Poor quality impacts project timelines.
    - Companies lose money and delay their ROI.
    - It can mean the difference between the business accepting or rejecting the solution.
    - It hurts the product and vendor reputation.
  - Bill Lear, an engineer and pioneer in executive jets, grounded all customer planes until he could triage and resolve a mysterious issue with his jets.
    - Quality costs money, but it is the difference between success and failure.
  - OMS is a headless application: its quality cannot always be seen, but it is felt.
• Importance of Testing and Validation
  - Testing should be introduced early in the Software Development Life Cycle (SDLC).
    - The cost of fixing a defect is far higher when testing is not done early and defects are found in later, critical stages.
  - Retesting and regression testing are also important.
    - Ensure that no previously working functions have failed as a result of a fix.
    - Ensure that new features have not created problems with previous software versions.
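As a minimal illustration of the regression point above, here is a sketch in Python (pytest style). The calculate_shipping_charge function and its free-shipping rule are hypothetical stand-ins, not part of the deck or of Sterling OMS; the point is simply that previously passing checks are re-run after every fix.

```python
# Regression-test sketch (pytest style). The pricing rule below is an invented
# example of "previously working behavior" that must keep passing after a fix.

def calculate_shipping_charge(order_total: float) -> float:
    """Hypothetical rule: orders of $50 or more ship free, otherwise $5.99."""
    return 0.0 if order_total >= 50.0 else 5.99

def test_free_shipping_threshold_still_holds():
    # Previously working behavior, re-verified in every regression run.
    assert calculate_shipping_charge(50.00) == 0.0
    assert calculate_shipping_charge(49.99) == 5.99

def test_charge_is_never_negative():
    # Extra check added alongside a fix and kept in the regression suite.
    assert calculate_shipping_charge(120.00) >= 0.0
```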
• Typical Testing Phases in OMS Testing Projects
  UT → CIT → FT → E2E → UAT/Performance
  - UT: Unit Testing
  - CIT: Component Integration Testing
  - FT: Functional Testing
  - E2E: End-to-End or System Testing
  - UAT: User Acceptance Testing
  - Performance Testing
• OMS: What Can Be Tested?
  - Order flow across different statuses
  - Holds resolution
  - Credit card authorization
  - Fraud check validation
  - Sourcing and scheduling
  - Release
  - Ship confirmation, ship cancellation, status tracking
  - Settlement and invoicing
  - Data purge
  - Order load (data conversion)
  - Inventory load, ATP and sync
  - Item load and sync
  - Customer data management
  - Alerts, emails and exceptions
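As a small sketch of the first item (order flow across statuses), the Python test below walks a stubbed order through an assumed status sequence. Both the OrderStub class and the EXPECTED_FLOW list are illustrative assumptions; a real test would drive the implementation's own order transactions and read the resulting statuses back.

```python
# Order-status flow sketch (pytest style) with an invented status sequence.

EXPECTED_FLOW = ["Created", "Scheduled", "Released", "Shipped", "Invoiced"]

class OrderStub:
    """Stands in for an order moving through the pipeline under test."""
    def __init__(self):
        self.history = ["Created"]

    def advance(self, status: str):
        # In a real test this would trigger the corresponding OMS transaction.
        self.history.append(status)

def test_order_moves_through_expected_statuses():
    order = OrderStub()
    for status in EXPECTED_FLOW[1:]:
        order.advance(status)
    # Fail if any status was skipped or an unexpected one appeared.
    assert order.history == EXPECTED_FLOW
```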
• OMS: What Can Be Tested? (continued)
  - UI validations and database validations each have their own importance.
  - UI validations are recommended when the tester needs to validate the complete life cycle of a single order.
    - Example: validating ship confirmation from the fulfillment center. The tester validates order creation in OMS, order processing through release, and, once the shipment confirmation arrives, that the order status is updated to Shipped.
  - Database validations are beneficial when the tester has to validate across a range of data.
    - Example: checking whether any order has $0 tax or a shipping discount of more than $5 can be done with a simple query (a sketch follows below). Other DB validations include data purge, checking when an order reached a specific status, confirming whether an order was picked up by an agent, and verifying the bulk inventory and item feeds coming from other systems into OMS.
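Here is a sketch of that database validation in Python against an in-memory SQLite table. The table and column names (orders, tax_amount, shipping_discount) are invented for the example; an actual validation would run an equivalent query against the OMS database schema.

```python
# DB-validation sketch: flag orders with $0 tax or a shipping discount > $5.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE orders (
                    order_no TEXT, tax_amount REAL, shipping_discount REAL)""")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("O-1001", 4.25, 0.00),   # normal order
     ("O-1002", 0.00, 2.50),   # zero tax: should be flagged
     ("O-1003", 3.10, 7.99)],  # discount over $5: should be flagged
)

suspect = conn.execute(
    """SELECT order_no, tax_amount, shipping_discount
       FROM orders
       WHERE tax_amount = 0 OR shipping_discount > 5"""
).fetchall()

print(suspect)  # [('O-1002', 0.0, 2.5), ('O-1003', 3.1, 7.99)]
```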
• Performance Testing Scope in OMS
  - Performance testing is very important, since OMS always deals with large volumes of order, item and inventory data.
  - At times it also processes bulk orders in a single batch, so performance testing plays a key role in any OMS implementation.
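A rough feel for this kind of check is sketched below in Python: a batch of concurrent order submissions is timed and a 95th-percentile latency is reported. The submit_order stub, the batch size of 500 and the 20 workers are all arbitrary illustrative choices; real OMS performance testing would use a dedicated load tool, production-like data volumes and agreed response-time targets.

```python
# Toy load-test sketch: time concurrent "order submissions" and report p95.
import time
from concurrent.futures import ThreadPoolExecutor

def submit_order(order_no: int) -> float:
    start = time.perf_counter()
    time.sleep(0.01)              # stand-in for the real order-create call
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=20) as pool:
    latencies = sorted(pool.map(submit_order, range(500)))

p95 = latencies[int(len(latencies) * 0.95)]
print(f"orders: {len(latencies)}, p95 latency: {p95 * 1000:.1f} ms")
```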
• Automation Testing Scope in OMS
  Some of the many scenarios that can be automated:
  - Order flow across various statuses
  - Shipment charges
  - Tax calculation
  - Discounts and promotional charges
  - Payment tenders and tokens
  - Item validation
  - Inventory reservation pre/post order creation
  - Shipment and settlement process
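One of those scenarios, tax calculation, could be automated along the lines of the parametrized pytest sketch below. The flat 8% rate and the calculate_tax helper are assumptions for illustration; an automated suite would call the implementation's own pricing or tax service and compare against expected values.

```python
# Automation sketch for tax calculation, driven by a table of cases.
import pytest

TAX_RATE = 0.08  # assumed flat rate for the example

def calculate_tax(taxable_amount: float) -> float:
    return round(taxable_amount * TAX_RATE, 2)

@pytest.mark.parametrize(
    "amount, expected",
    [(0.00, 0.00), (10.00, 0.80), (99.99, 8.00), (250.00, 20.00)],
)
def test_tax_calculation(amount, expected):
    assert calculate_tax(amount) == expected
```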
• Sample Waterfall OMS Testing Methodology
  A calendar view of nine phases delivered through a mix of onsite and onsite-offshore work: Requirement Analysis, Strategy Formulation, Test Planning, Test Case Creation, Scripting, Test Execution, Test Summary Report, Release and Post Deployment Evaluation.
• Typical Delivery Models
  Testing activities span test strategy and planning, test case development, environment preparation, test execution, analysis, management reporting and sign-off. Three delivery models are typical:
  - Full ownership: The vendor assumes full responsibility for all software testing activities for a specific project, release or enhancement, and manages delivery with guidance and oversight from client testing/SQA leadership.
  - Combined ownership: The client team creates the test plan and manages overall testing delivery; the vendor executes the full set of testing activities defined within the client test plan under the client's direct management.
  - Testing execution only: The client manages overall testing delivery and creates all testing artifacts; the vendor is responsible only for executing test cases provided by the client, logging defects and providing the necessary reporting.
• Typical QA Team Responsibilities
  - Testing strategy
  - Test planning
  - Test case creation
  - Test execution
  - Defect verification
  - Reporting
  - Automation
  - Positive sign-off
  - Successful go-live
• Sample QA Artifacts
  Detailed scorecards integrated into an executive dashboard for partners and stakeholders. The examples shown in the deck include a risk, issue and dependency log; a severity-1 defect tracking log; an integrated application testing dashboard with connectivity, systems and launch-milestone status; a test case execution tracking schedule; and a defect tracking trend chart (defects opened, closed and currently open per day).
• Sterling OMS: Driver for a QA CoE
  - We have covered the importance of quality and QA in the context of Sterling OMS.
  - The journey continues beyond OMS QA to a QA Center of Excellence.
• Ensuring a Quality OM Implementation
  - Who is responsible for quality? The QA team?
    - Quality is everyone's responsibility, and it is a continuous effort.
  - Why and where do defects exist?
    - For many reasons and in many places:
      - Requirements validation
      - Scope validation
      - Design validation
      - Planning validation
      - Experience of the team (have you done this before?)
      - Reporting
      - Communication
      - Change management
  Quality is a journey, not something that happens toward the end.
• Components of a Quality Center of Excellence
  1. Engagement, or the methodology of QA
     - Create engagement and a framework that shows the value of QA: if I spend x on QA, I can expect y in savings (a back-of-the-envelope sketch follows this list).
     - Establish credibility and a cost model that focuses on quality, value and stability.
  2. Create a metrics- or KPI-driven QA function
     - Show how QA improves quality, drives down time to market and increases stability.
  3. Create value and ownership much like that of development teams
     - QA owns the applications it supports as much as development does.
     - QA owns the strategy, test plan, automation and regression scripts.
     - Show this alignment with development.
  4. Continuous improvement
     - Find innovative ways to drive down the cost of QA while delivering higher quality.
     - Always look for ways to push QA further upstream in the SDLC.
     - Show that the overall quality of an application or project is not the sum of QA alone but the sum of all project parts (requirements, design, development, etc.).
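To make the "spend x, expect y" framing concrete, here is a minimal cost-of-quality calculation in Python. Every figure (QA spend, defect counts, fix costs, the share of defects caught before production) is a hypothetical placeholder, not a number from the presentation; substitute your own project data.

```python
# Back-of-the-envelope QA ROI sketch; all inputs are illustrative assumptions.
qa_investment = 120_000        # annual spend on the QA function ($)
expected_defects = 400         # defects expected over the same period
caught_early_share = 0.70      # fraction QA finds before production
cost_to_fix_early = 500        # avg cost of a defect found in test ($)
cost_to_fix_in_prod = 5_000    # avg cost of a defect found in production ($)

caught_early = expected_defects * caught_early_share
avoided_cost = caught_early * (cost_to_fix_in_prod - cost_to_fix_early)
roi = (avoided_cost - qa_investment) / qa_investment

print(f"avoided cost: ${avoided_cost:,.0f}, return on QA spend: {roi:.1f}x")
# With these assumptions: avoided cost $1,260,000, roughly 9.5x the QA spend.
```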
• What Does a QA CoE Do?
  - A QA Center of Excellence is a team of people who promote collaboration and the use of QA best practices to drive business results.
  - Responsibilities:
    - Support: CoEs should offer support to the business lines, whether through services or by providing subject-matter experts.
    - Guidance: Standards, methodologies, tools and knowledge repositories are typical ways to fill this need.
    - Shared learning: Training and certifications, skill assessments, team building and formalized roles.
    - Measurements: CoEs should be able to demonstrate, through output metrics, that they are delivering the valued results that justified their creation.
    - Governance: Allocating limited resources (money, people, etc.) across all their possible uses is an important function of CoEs. They should ensure the organization invests in the most valuable projects and create economies of scale for their service offerings.
• What to Measure?
  A few of the measurements that should be considered (a small computation sketch follows this list):
  - Defect density (defects per line of code)
  - Defects by application area
  - Defects by project phase (requirements, design, development)
  - Defect re-open rate, and why (design issue vs. development issue)
  - Performance metrics
    - Number of critical issues resolved through performance testing
    - Business value surrounding resolution of critical issues
  - Automation metrics
    - Efficiencies gained through automation (regression cycle time)
    - Improvements in quality as a percentage of regression
  - Trending
    - Number of open critical and priority defects by area
    - Number of critical or priority defects by phase (design/development)
  - Production P1-P4 counts by release
    - These should drop as quality increases and problems are found earlier.
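As an example, two of these measures, defect density and re-open rate, could be computed from a defect log roughly as sketched below. The log entries and the 50 KLOC figure are invented; a real CoE would pull this data from its defect tracker and source repository.

```python
# Metrics sketch: defect density, re-open rate and a per-phase breakdown.
defects = [
    {"id": "D-1", "phase": "design",       "reopened": False},
    {"id": "D-2", "phase": "development",  "reopened": True},
    {"id": "D-3", "phase": "development",  "reopened": False},
    {"id": "D-4", "phase": "requirements", "reopened": False},
]
kloc = 50  # thousand lines of code in the release (assumed)

density = len(defects) / kloc
reopen_rate = sum(d["reopened"] for d in defects) / len(defects)

by_phase = {}
for d in defects:
    by_phase[d["phase"]] = by_phase.get(d["phase"], 0) + 1

print(f"defect density: {density:.2f} defects/KLOC")
print(f"re-open rate:   {reopen_rate:.0%}")
print(f"defects by phase: {by_phase}")
```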
• Trends and Patterns Derived
  - With the right metrics, an IT organization can make the various SDLC phases more quality-focused based on patterns found in the QA cycle.
  - Application areas that struggle with design quality or development code quality can be pinpointed and given assistance to become more effective.
  - Processes within the development and design lifecycles can be brought to the forefront based on how many defects they generate.
  - Innovations in the design, development and QA lifecycles can emerge through continuous-improvement discussions tied to QA metrics.
  - A true sense of the overall maturity of a Sterling OMS ecosystem, or any development domain, can be established, benchmarked and then monitored for improvement.
  - Look for mature deployments to production that yield low production issue counts after launch; compare release over release and improve, and also compare against other application areas.
• Conclusion
  - Quality is very important.
  - Think total quality of the process, not just software quality.
  - Quality is an investment with a favorable ROI.
  - Choose the right OMS QA model for your company.
  - Everyone is responsible for quality.
  - A metrics-based QA organization will create positive momentum and spread beyond one or two applications.
  - A Quality Center of Excellence: QA is something that can spread beyond Sterling OMS.
• Thank You!
  For more information, listen to our webinar: Learn How to Create a Seamless Omni-Channel Retail Experience.
  Subroto Majumdar, OMS Director, subroto.majumdar@perficient.com